What does accountability actually mean, and how does it apply to AI ethics? We'll also discuss what moral agency and responsibility mean, and the difficulty of assigning blame.

II. What is accountability?

Accountability is the state of being responsible or answerable for a system, its behavior, and its potential impacts. It is an acknowledgement of responsibility for actions, decisions, and products.

Responsibility can be legal or moral (ethical). Legally, an actor is responsible for an event when a legal system is liable to penalise that actor for that event. Morally, an actor is responsible for an act if they can be blamed for it. Moral and legal responsibility are different things, and they do not always coincide: an agent can be legally responsible even if they were not morally responsible, and vice versa. In this course, we'll focus only on the moral aspects of responsibility.

In AI ethics, there are three different senses or dimensions of accountability, each pointing to a different aspect of it:

  • The question of determining responsibility: which individuals (or groups) are accountable for the impacts of algorithms or AI? Who is responsible for what effect within the overall socio-technical system?
  • A feature of the societal system that develops, produces, and uses AI
  • A feature of the AI system itself
