Project: Can software be "responsible"?

Basic data

Can software be "responsible"?
4/1/2019 to 3/31/2020
Abstract / short description:
In artificial intelligence, computer programs make "decisions on actions",
for example in the control of autonomous cars. Such programs need to be able
to make decisions in situations that were not foreseen by the programmer. As
such programs increasingly enter daily life, the question arises of where the
responsibility for the decisions made by a computer program can be located.
In the project, the following questions will be addressed:
• How can responsibility for decisions made by artificial intelligence
programs be located within the hierarchy of programmer, IT company, product
vendor (OEM), and individual product user?
• Which forms of "attribution of responsibility" are needed to ensure
society's acceptance of artificial intelligence in daily life?
These questions will be approached, on the one hand, from the engineering
side, by identifying where "responsibility" resides in software engineering;
on the other hand, we will draw on case studies of how society's acceptance of
technical innovations involving questions of responsibility has evolved, in
order to better identify the specific problems posed by artificial intelligence.
Keywords: Responsibility; Artificial Intelligence

Involved staff


Carl Friedrich von Weizsäcker Center
Other institutions
Faculty of Science
University of Tübingen
Wilhelm Schickard Institute of Computer Science (WSI)
Department of Informatics, Faculty of Science

Contact persons

Institute of Applied Physics (IAP)
Department of Physics, Faculty of Science

Local organizational units

Wilhelm Schickard Institute of Computer Science (WSI)
Department of Informatics
Faculty of Science


Hannover, Niedersachsen, Germany
