What does AI ethics mean and what role do values and norms play? We’ll also look at the principles of AI ethics that we will follow in this course.

III. Values and norms

Values and norms are the basic elements of ethics. The concept “value” means, roughly, the degree of importance of a thing or an action. Values provide ideals and standards with which to evaluate things, choices, actions, and events. In ethics, the focus is primarily on moral values, although other types of values – economic, aesthetic, epistemic (or knowledge-related) – are sometimes morally relevant. For example, economic factors may play a morally significant role if economic decisions have morally significant consequences for people.

Intrinsic and extrinsic values

Values can be divided into extrinsic (also called “instrumental”) and intrinsic values. For example, money has extrinsic or instrumental value. Money is valuable only because one can use it for other things, such as providing better medical care for people. These things, in turn, may be good for what they lead to: for example, better health. And those things, in turn, may be good only for what they lead to – for example, a better quality of life. Intrinsically valuable things are typically “big moral values” – happiness, freedom, wellbeing. These are things that are good in themselves. For some, they also explain the “goodness to be found in all the other things” (cf. Aristotle, Nicomachean Ethics, 1094a).

Norms

Norms are value-based principles, commands, and imperatives – such as sets of AI guidelines. They tell us what one should do, or what is expected of someone. Norms may be prescriptive (encouraging positive behavior; for example, “be fair”) or proscriptive (discouraging negative behavior; for example, “do not discriminate”).

There are several types of norms:

  • Some norms are merely statistical regularities: one notices that many computer scientists tend to wear black T-shirts.
  • Some norms are social norms; they tell what people in a group believe to be appropriate action in that group.
  • Moral norms are prescriptive or proscriptive rules with obligatory force beyond that of social or statistical expectations. For example, “Do not use AI for behavior manipulation” is a moral norm.
  • Norms may also be legal norms. Importantly, a legal norm may not be a moral norm, and vice versa. Simply, the fact that “X is a law” does not make it a moral principle. Instead, one can always ask: “Is this law morally acceptable or not?”
