Dorothea Winter: ‘The conditio humana persists even in the digital age.’

In an era in which algorithms steer the decisions of states and artificial intelligence redraws the contours of our society, the decisive tension lies in the difference between human and machine. Machines do not act autonomously; they follow programmed rules, incapable of grasping reasons or bearing responsibility – a principle sharply outlined by the philosopher Julian Nida-Rümelin, former German Minister of State for Culture (2001–2002), in his book ‘Digital Humanism’ (2018).


The Humanistic University of Berlin was founded in 2022. Nida-Rümelin’s research associate Dorothea Winter is responsible for its Master’s programme in Applied Ethics, a four-semester course designed for working professionals from medicine, business, and administration who want to explore the ethical fault lines of digitalisation. Anchored in Nida-Rümelin’s demand for technology that strengthens human authorship, the programme aims to be an arena in which ‘the conditio humana is defended against the cold logic of machines.’ This interview confronts urgent questions: racial profiling by AI, the fate of human creativity, and how digital power can be shaped. Will our creativity be replaced by machines? Do humans risk becoming slaves to technology? Does the conditio humana – the capacity for moral judgement and responsibility – persist in the digital age?

————-

19 October 2025


IN FOCUS/ATF

Name: Dorothea Winter

Master's programme in Applied Ethics, Humanistic University Berlin

‘They face ethical dilemmas in their everyday lives, which are exacerbated by these crises.’ 


Ms Winter, the Humanistic University and with it the Applied Ethics degree programme were founded in 2022 – what has the response been like in these first years?


The response has been enormous – and this is strongly linked to the social situation. We live in a time of multiple crises: climate change, digitalisation, geopolitical conflicts, growing social divisions, loss of trust in democratic institutions. People feel that purely technocratic responses are no longer sufficient. Our students are all working professionals and come from very different fields of practice – medicine, social work, education, business, administration, culture. And yet they are united by one experience: in their everyday lives, they face ethical dilemmas that are exacerbated by these crises. They bring different perspectives to the table, but they are motivated by the same fundamental questions: How can we take responsibility without overburdening ourselves? How can we ensure humanity in organisations and institutions? How can we shape technology, politics and economics so that they work for human life rather than against it? It is precisely this shared need that makes the resonance so strong.


What do you hope to achieve in society with the ‘Applied Ethics’ degree programme?


We are not interested in proclaiming ‘wisdom from above’, but in opening up spaces for maturity.

Digital transformation, climate change, crises of democracy – these are all questions that we can only solve collectively and reflectively. I hope that we can provide something like a translation service: between philosophy and practice, between theory and everyday life. When graduates go out into the world and not only know what Kant and Conradi write, but can say in a company, an NGO or an administration:


‘Wait a minute, we need to think more broadly about responsibility here’ – then we have achieved something.


‘It is precisely this difference – the human condition – that remains even in the digital age. It would be a category error to attribute authorship or even responsibility to them.’


Prof. Nida-Rümelin's 2018 book ‘Digital Humanism’ advocates for technology that strengthens human authorship. Can you explain the lessons of ‘digital humanism’ to our readers? How have you implemented his ideas in the degree programme? 


Nida-Rümelin makes it very clear in Digital Humanism: machines do not act autonomously. They follow programmed rules, while humans are able to understand reasons, weigh them up and take responsibility on that basis. It is precisely this difference – the human condition – that remains in the digital age. He also emphasises that digital systems have no moral judgement. It would be a category error to attribute authorship or even responsibility to them. AI is a tool, not a subject. 


‘And he expressly warns against an ideology that exaggerates the importance of machines and glorifies them as substitutes for humans – that is misleading.’


In this respect, he also clearly differs from transhumanist fantasies that suggest we could or should ‘overcome humanity’. For him, digital humanism therefore means that we actively and normatively shape digitalisation. Technology must not shape people, but must be developed and regulated in such a way that it strengthens freedom, democracy, cultural diversity and participation. And he expressly warns against an ideology that exaggerates the importance of machines and glorifies them as substitutes for people – this is misleading. 


We take up these guidelines in our studies by talking not only about technical possibilities, but also about institutional and political framework conditions. Our part-time students bring with them experience from medicine, administration, education and business. They learn to view digital systems not only as tools, but also to examine them critically: does this system promote autonomy and responsibility – or does it undermine them? This transforms ‘digital humanism’ from a theory into a practical compass for everyday professional life. 


‘When AI creates risk profiles based on location, socio-economic background or physical appearance, people are treated preventively as potential offenders (...) This contradicts the presumption of innocence, one of the central principles of the rule of law.’ 


I saw your interview on Arriba Media. Can you explain the dangers of racial profiling by AI again? How else can security needs and, above all, the investigation of crimes be guaranteed?


The dangers lie primarily in the fact that AI is not ‘neutral’ but reinforces existing prejudices. If, for example, the police have carried out particularly intensive checks in certain neighbourhoods in the past, this surveillance data will later turn up in the training data of an AI system. The system ‘learns’ from this that crime is more prevalent in these areas. As a result, these very neighbourhoods are subject to increased surveillance again. This creates a feedback loop that says nothing about actual crime rates but places certain groups under permanent suspicion.


Added to this is the danger of a de facto presumption of guilt. Digital systems recognise patterns, but without context. If, for example, an AI creates risk profiles based on location, socio-economic background or physical appearance, people are treated preventively as potential offenders – without having done anything at all. This contradicts the presumption of innocence, one of the central principles of the rule of law. Another problem is the lack of transparency. Many of these systems are black boxes: even experts often cannot understand exactly why a person has been flagged.


And in an emergency, the question remains: who is responsible? The developer? The police? The state? This is precisely where dangerous grey areas arise. Of course, people have a legitimate need for security. But security must not be organised at the expense of fundamental rights. Efficiency is not an end in itself. Better intelligence is not achieved through general suspicion, but through targeted investigative work – controlled by the rule of law, with transparency, judicial oversight and opportunities for correction. 


‘To put it bluntly, a society does not become safer by monitoring certain groups more and more closely.’ 


To put it bluntly, a society does not become safer by monitoring certain groups more and more closely. It becomes safer when trust, the rule of law and fundamental rights also apply in the digital age. AI can certainly help here – but only if it is regulated, transparent and democratically embedded.


‘So if we “submit”, it is not to AI itself, but to the economic and political structures that determine its use.’


In your interview with Telekom, you predict that people will not become ‘slaves to AI’ and that AI will not take anything away from us, especially in the area of creativity. Can you explain this to our readers? 


The image of a ‘servant’ is actually misleading. Machines don't have their own desires or intentions. They don't crave anything, they don't want anything. So when we ‘submit,’ we are not submitting to AI itself, but to the economic and political structures that determine its use.


The machine is a tool – how we use it is a question of design. This is particularly evident in the field of creativity. AI systems such as GPT or Midjourney can generate texts, images or music that are astonishing. But they do so by imitating patterns in existing data. What is missing is biographical resonance. When a poem by Rilke or a painting by Käthe Kollwitz moves us, it is because these works convey existential experience, suffering, hope and a connection to the world. A machine cannot do that. It can imitate – but it cannot exist.

However, that does not mean that creativity is disappearing. It is changing. An example: in the past, writing an article or essay was considered a genuinely human task. Today, AI can provide raw material. The creative act then lies more in curating, evaluating and thinking ahead. It is similar in music: AI can generate melodies, but the decision about what really carries a song – what breaks, emotions or cultural references it contains – remains human. I even think that AI can open up creative freedom for us. When routine tasks are automated, we have more time to deal with the question: What do we actually want to say?


What makes us unique as authors? In this sense, AI can be a productive irritant – it forces us to rethink the core of our creativity. The real danger is not that machines will ‘take over’, but that markets and platforms will exploit the creative process purely for economic gain. When AI images displace artists, it is not because the machine paints better, but because business models devalue human labour. This is a social and political issue – not a technical one.


‘That's why I say: we will not become servants of AI as long as we as a society do not make ourselves so. Creativity remains deeply human because it is linked to experience, physicality and responsibility.’


That's why I say: we will not become servants of AI as long as we as a society do not make ourselves so. Creativity remains deeply human because it is linked to experience, physicality and responsibility. AI can help, inspire and challenge us – but it cannot replace the creative moment. 


‘This leads to a crucial question: Which activities do we want to consciously keep in human hands, even if a machine could technically take them over?’


As an ethicist, how do you argue with people who are losing their jobs to AI – such as cashiers in supermarkets or journalists in editorial offices? How can digitalisation still be humanistic? 


First of all, the concerns are real. But the forecasts differ: some paint a picture of mass unemployment, while others see more selective shifts. What is becoming apparent is that fewer entire jobs will disappear; rather, tasks within professions will change. An example: in journalism, AI may write stock market reports in seconds. But investigative research, the classification of political events or conversations with sources – that remains human work.


‘Humanistic does not mean reflexively demonising technology. It means putting it at the service of human autonomy and dignity.’


Or take the field of law: AI can review standard contracts or sort sample cases. But weighing up individual cases and legal arguments in court – that cannot simply be automated.

The opportunities in medicine are particularly exciting. AI can make skin cancer screenings much more accurate because it analyses thousands upon thousands of images of moles and detects the smallest deviations that the human eye would overlook. In psychotherapy, first approaches are being made with VR glasses and AI-supported simulations that open up new avenues for people with traumatic experiences in protected, controlled environments. In other words, AI can relieve the burden on doctors and therapists and give patients new opportunities – but it cannot replace the relationship of trust, empathy and social interaction.

This leads to a crucial question: which activities do we want to consciously keep in human hands, even if a machine could technically take them over? One example is nursing care. Of course, robots could take over many tasks in the future – but do we want machines to read to people? Probably not. On the other hand, do we want nursing staff to be supported by robotic lifting arms to move patients in a way that is easy on their backs? Absolutely yes. This is an example of how technology does not replace human work, but rather makes it easier and better.

So being humanistic does not mean reflexively demonising technology. It means putting technology at the service of human autonomy and dignity. We need social debates about where AI can provide meaningful support and where we do not want to forego human interaction, judgement and creativity. Digitalisation is not inevitable; it can be shaped – and that is precisely where the responsibility of our time lies.

———-


Dorothea Winter ©Private
