Don't You Agree with My Ethics? Let's Negotiate!

Scoccia, Gian Luca; Inverardi, Paola
2023-01-01

Abstract

The rapid growth of autonomous technology has made it possible to develop intelligent systems that can think and act like humans and can self-govern. Such intelligent systems can make ethical decisions on behalf of humans by learning their ethical preferences. When incorporating ethics into the decision-making process of autonomous systems that represent humans, the main challenge is reaching agreement on ethical principles, as each human has their own ethical beliefs. To address this challenge, we propose a hybrid approach that combines human ethical principles with automated negotiation to resolve conflicts between autonomous systems and reach an agreement that satisfies the ethical beliefs of all parties involved.
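To illustrate the general idea of combining learned ethical preferences with automated negotiation, the following minimal Python sketch shows two agents, each holding an ethical-preference profile on behalf of a human, exchanging alternating offers until they find an outcome acceptable to both. This is not the authors' implementation; all names (EthicalAgent, negotiate, the example outcomes and weights) are hypothetical illustrations of the concept described in the abstract.

from __future__ import annotations

from dataclasses import dataclass


@dataclass
class EthicalAgent:
    name: str
    # Weight per ethical principle, e.g. learned from the human the agent represents.
    weights: dict[str, float]
    # Minimum acceptable utility: offers below this threshold are rejected.
    reservation: float

    def utility(self, outcome: dict[str, float]) -> float:
        """Weighted satisfaction of the agent's ethical principles by an outcome."""
        return sum(self.weights.get(p, 0.0) * v for p, v in outcome.items())

    def accepts(self, outcome: dict[str, float]) -> bool:
        return self.utility(outcome) >= self.reservation


def negotiate(a: EthicalAgent, b: EthicalAgent,
              candidates: list[dict[str, float]]) -> dict[str, float] | None:
    """Alternating offers: each agent proposes its best remaining candidate;
    a deal is struck when both ethical thresholds are met."""
    proposer, responder = a, b
    remaining = list(candidates)
    while remaining:
        # Proposer offers the candidate it values most.
        offer = max(remaining, key=proposer.utility)
        remaining.remove(offer)
        if proposer.accepts(offer) and responder.accepts(offer):
            return offer
        proposer, responder = responder, proposer  # swap roles next round
    return None  # no outcome satisfied both ethical profiles


if __name__ == "__main__":
    # Hypothetical outcomes scored on two ethical dimensions.
    outcomes = [
        {"privacy": 0.9, "transparency": 0.2},
        {"privacy": 0.6, "transparency": 0.7},
        {"privacy": 0.1, "transparency": 0.95},
    ]
    alice = EthicalAgent("alice", {"privacy": 0.8, "transparency": 0.2}, reservation=0.5)
    bob = EthicalAgent("bob", {"privacy": 0.3, "transparency": 0.7}, reservation=0.5)
    print(negotiate(alice, bob, outcomes))  # settles on the middle, mutually acceptable outcome

In this toy run each agent first proposes the outcome that best matches its own ethical profile, the other rejects it, and the process converges on the compromise outcome that clears both reservation thresholds.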
Year: 2023
ISBN: 9781643683942
Keywords: Autonomous Systems, Machine Ethics, Automated Negotiation, Ethical Decision Making
Files in this product:
2023_FAIA_368_Memon.pdf

Open access

Type: Published Version (PDF)
License: Creative Commons
Size: 203.69 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12571/30415
Citations
  • PMC: n/a
  • Scopus: 2
  • Web of Science: 0