Are humans still at the center of innovation? This question is at the heart of the debate on roboethics, a topic that SIRI identifies as fundamental to the future development of robotics. "Humans have always created tools to improve their work," explains SIRI's president Domenico Appendino, tracing the history of robotics, "but also to reduce risk and fatigue and to protect their health. For this reason, tools have been deeply linked to the social and ethical dimensions of humanity throughout history. Looking at the history of robotics, we can see that the theme of roboethics has also been addressed in literature, from Asimov's famous Laws of Robotics to the play 'R.U.R., Rossum's Universal Robots'. In today's manufacturing scenario these themes remain important, and it is therefore essential to analyze them."
Paolo Benanti, professor of Moral Theology and Ethics of Technologies at the Pontifical University, stresses that the future challenge will not be between humans and robots, but between humans with robots and humans without robots. In this future, which is perhaps already our present, Benanti points to another essential dichotomy: the one between algoethics and algocracy. "Algoethics," explains Benanti, "was born in response to what is called algocracy, the 'dominion of algorithms': a society based on the massive application of algorithms. It becomes necessary to study the ethical problems and social implications (but also the political, economic and organizational ones) arising from the increasing use of information technologies."
For Andrea Bertolini, researcher and director of the EURA Center of Excellence on the regulation of robotics and AI, the utmost attention must be paid to defining the rules that will govern the development and application of artificial intelligence systems. "There is a European regulatory interest," Bertolini notes, "the so-called 'Brussels effect': whoever formulates the rules first defines them for all players, including other countries. This is a very important game and should be seen as an opportunity for innovation. This regulatory activity will be fundamental for the development of European robotics."
When talking about robotics, it is also important to consider social robots, i.e. robots whose purpose is to welcome and assist people. "The goal," explains Antonio Sgorbissa, professor at DIBRIS, University of Genoa, "is to create robots that also have cultural competences, meaning that they are able to adapt their behavior to the person in front of them. Social robotics highlights issues worth considering in other contexts as well: establishing the operator's trust in the machine, making algorithms capable of explaining their choices and how they work and, finally, developing autonomous machines capable of recognizing different cultures and acting accordingly."
"It is important to stress," Appendino concludes, "that industrial robotics is still very far from many of these experiences, which instead concern bots, i.e. artificial intelligences running on computers. However, progress is so fast that these themes could arrive tomorrow, almost without our realizing it. In the meantime, we see the very idea of company and society changing, with the worker becoming a machine operator, more and more similar to an employee. I conclude, however, by recalling that a knife can serve a mother to prepare a meal for her child, or a murderer to kill someone. The difference always lies in how a tool is used. The same goes for robots which, let us remember, are neither friends nor workmates, but tools used by those who manage them and which, for a long time to come, will not be able to self-determine."
Dr. Susanne Bieller
IFR General Secretary
Phone: +49 69-6603-1502
E-Mail: secretariat(at)ifr.org
Silke Lampe
Communication Manager
Phone: +49 69-6603-1697
E-Mail: secretariat(at)ifr.org
Sibylle Friess
Membership Management
Phone: +49 69-6603-1124
E-Mail: secretariat(at)ifr.org
© IFR 2025