
ADVANCES IN COMPUTER SCIENCES (ISSN:2517-5718)

Do Computers Care?

Ian Pyle*

Department of Computer Science, The University of York, England, United Kingdom

Citation

Pyle I. Do Computers Care? Adv Comput Sci. 2020 Jan;3(1):119.

Abstract

A computer carefully executes its program, with the right operations all in the right order. But it appears to be careless about what the program does, or what its consequences are. Where, if anywhere, is the care? We discuss this apparent paradox.

Introduction

We must be careful with our language when discussing computers, particularly with Artificial Intelligence. It is tempting to infer other human characteristics such as purpose, responsibility, awareness, and intention. In this note, we explore the notion of care and its relevance to computer systems, particularly robots.

The subject is particularly relevant in the light of the government announcement that “Care robots could revolutionize UK care system and provide extra staff support” [1].

Language

Human languages have developed over tens (perhaps hundreds) of millennia, for humans to inform one another about human experiences and relationships, and to explain them. Human language is powerful and extensive, being able to express what might be as well as what is, what has been and what will be, often using stories (e.g. Greek myths about thunder and volcanoes). It is also necessarily limited, being constrained by human experience and imagination: by the patterns we recognize, identify and name, in particular systems and their properties.

Mathematics

During the last few centuries, another way of explaining has emerged, using the language of mathematics. This is more limited than stories in its scope, but more powerful within that scope. Newton explained gravity using the inverse square law; Maxwell explained electromagnetism using differential equations. Computer programming languages are now based on mathematics, but with additional features taken from ordinary language, e.g. FORTRAN (“Formula Translation”) [2], Algol60 (“Algorithmic Language of 1960”) [3].
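As an illustration (ours, not taken from the cited reports), a mathematical formula translates almost symbol-for-symbol into program text. The sketch below uses Python rather than FORTRAN, with Newton's inverse square law as the formula:

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2.
# A hypothetical illustration of "formula translation": the mathematical
# formula carries over almost symbol-for-symbol into program text.

G = 6.674e-11  # gravitational constant, in N m^2 / kg^2

def gravitational_force(m1: float, m2: float, r: float) -> float:
    """Force in newtons between masses m1, m2 (kg) separated by r metres."""
    return G * m1 * m2 / r ** 2

# Example: the force between the Earth and the Moon.
print(gravitational_force(5.972e24, 7.348e22, 3.844e8))
```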

Computers

The words we use to describe what a computer does are almost always chosen because of their analogies with human activities. Thus we refer to a computer's memory, to activities of reading and writing, even to computing! A computer system may appear to decide its actions, to have a purpose, to be intelligent, to have a mind-set, and (sometimes) a will of its own. Of these, only for intelligence is it deemed necessary to add the qualifier “artificial”; a similar qualification is recognized in “machine learning”, which has some similarities with human learning but also possible differences. Because of this tendency, we have to be careful in discussing where responsibility may lie. So what about care? We must be careful and avoid being careless.

Affective Computing

Since the groundbreaking work of Picard [4], much has been done to find algorithms that can recognize, interpret, process, and simulate human affects such as emotions and empathy.

Intermediate human characteristics, such as awareness, intention and purpose, that take account of causes and effects may also be included.

Affective computing is valuable in medical informatics [5], particularly in relation to communication, but it is not directed towards care. The ability to sense and respond to human emotions is reported by Brigham [6], again without any display of care.
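To make this concrete, consider a deliberately naive sketch (entirely hypothetical, not taken from any system cited here) in which a program “recognizes” an emotion by keyword matching and emits a canned sympathetic reply. It manipulates labels without any concern for the speaker's well-being:

```python
# A deliberately naive, hypothetical sketch of affect "recognition":
# the program matches keywords to emotion labels and picks a canned reply.
# It manipulates symbols only; no concern for the speaker is involved.

EMOTION_KEYWORDS = {
    "sad": ["unhappy", "miserable", "crying"],
    "angry": ["furious", "annoyed", "outraged"],
    "happy": ["glad", "delighted", "pleased"],
}

CANNED_REPLIES = {
    "sad": "I am sorry to hear that.",
    "angry": "That sounds frustrating.",
    "happy": "That is good news!",
}

def recognize_emotion(utterance: str) -> str:
    """Return the first emotion label whose keywords appear in the text."""
    words = utterance.lower().split()
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if any(k in words for k in keywords):
            return emotion
    return "neutral"

emotion = recognize_emotion("I have been crying all day")
print(CANNED_REPLIES.get(emotion, "I see."))  # -> "I am sorry to hear that."
```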

The difficulty of discovering suitable algorithms indicates that something additional (non-algorithmic) is needed. Successes have been achieved [7] using non-algorithmic cues, such as embodied computer agents and gender.

A key discriminant is the difference between the ways computers and brains work [8]. Computers act in accordance with their programs: the algorithms and data structures they hold in their memories. Brains work in ways that have evolved over many millennia, based on recognizing causes and effects, and on having purposes and the will to seek to achieve them.

Care

The English Concise Dictionary gives definitions of 'care' used as a noun or as a verb, with a range of meanings from anxiety to being responsible for maintenance. The central core seems to be concern for the well-being of a particular subject, usually (but not always) a person, such as care for a child, or care for the environment.

Interpreting this in the context of a computer system, we meet two problems: (a) the need to identify the subjects influenced by its actions (which relies on causality), and (b) the need to identify what action or inaction would affect the well-being of the subject (which relies on empathy).

Considering a computer in an environment, the activity of the computer may depend on the situation in the environment; and there is a complementary relationship: the environment may be affected by the activity of the computer.

Duty of Care

In law, there is a moral or legal obligation to ensure the safety or well-being of others. According to Wikipedia [9], in English law the duty of care is a legal obligation imposed on an individual, requiring adherence to a standard of reasonable care while performing any acts that could foreseeably harm others. For example, employers have a duty of care to their employees. Thus, while care is primarily a duty of an individual, it may also apply to an organization. Can it apply to a computer?

The duty implies that the individual concerned has the ability to foresee possible harm that its actions can cause others. Computers fail this test on two counts: (a) they do not in any way foresee the consequences of their actions; and (b) they are not aware of any impact (benefit or harm) that their actions have on others. Thus, if a guided missile or military robot is programmed to kill someone, it does that without any care for the harm it causes.

In the case of a safety-related system, if a computer is responsible for the actions, the enclosing system must ensure that no potential victim is in a region in which the computer can cause harm.

Care for Oneself

While it is difficult to identify others that a computer may care for, there are some situations in which a computer system cares for itself. For example, in the Linesman Radar Data Processing System [10], the Foundation layer included extensive facilities for identifying faults within the computer system and (when possible) rectifying them: sometimes by changing the hardware configuration to use alternative elements, sometimes by reloading software and reconstituting data. As explained in that paper, this behavior represents a kind of self-awareness, unusual in computer systems of that time. In a limited sense, the Linesman RDPS was aware of itself: it contained a model of itself, physical and logical, and it had the ability to make (limited) changes to its own configuration. However, it had no representation of the significance of any of these.
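In outline, such self-care can be sketched as follows; this is a minimal, hypothetical reconstruction (not the actual Linesman design) of a system that holds a model of its own configuration, detects a faulty element, and switches to an alternative:

```python
# A minimal, hypothetical sketch (not the actual Linesman design) of a
# system that holds a model of its own configuration and, on detecting a
# faulty element, reconfigures itself to use an alternative.

class SelfAwareSystem:
    def __init__(self, primary: str, spares: list[str]):
        # The system's model of itself: the active element and its spares.
        self.active = primary
        self.spares = list(spares)

    def element_healthy(self, element: str) -> bool:
        """Stand-in for a real diagnostic check on a hardware element."""
        return element != "cpu-1"  # pretend cpu-1 has failed

    def check_and_repair(self) -> None:
        """Fault detection and (when possible) rectification."""
        if self.element_healthy(self.active):
            return
        if self.spares:
            failed, self.active = self.active, self.spares.pop(0)
            print(f"Fault in {failed}: switched to {self.active}")
        else:
            print(f"Fault in {self.active}: no spare available")

system = SelfAwareSystem(primary="cpu-1", spares=["cpu-2"])
system.check_and_repair()  # -> Fault in cpu-1: switched to cpu-2
```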

To a lesser extent, HUW [11] was also self-aware: measurement of availability was found to expose computer failures, which were addressed by writing extensive self-repair facilities.

Unfortunately, this produced a difficult software problem [12,13], which was found to be caused by a mistake in HUW caring for itself.

Ability to Care

A system may have some properties not possessed by any of its components, as well as those properties it inherits from its components. The new properties are said to “emerge” from the interactions between its components and the coherence between them. With this distinction in mind, we can see that the ability to care is an emergent property of a human (treated as a system), which requires other human abilities to enable it: awareness of causality (to recognize what effects one's actions might have on the other) and empathy (to recognize what the other needs for its well-being).

These abilities are far beyond what a computer can do. They cannot be expressed by algorithms.

Computers that Claim to Care

In a review of “computers that care”, Brave, Nass and Hutchinson [7] report a variety of caring functions carried out by computers, but none that would fulfill the duty of care mentioned above. As pointed out in the section on the duty of care, such care depends on the ability to foresee possible harmful effects on others. Without this comprehensive consideration of consequential effects, the caring functions that such computers carry out do not constitute real care.

Category Error

Ordinary language encourages us to extrapolate the properties of one thing and apply them to another, as a metaphor. However, trying to analyze the meaning of a metaphor leads to a philosophical problem called a “category error”: a semantic or ontological error in which things belonging to a particular category are presented as if they belong to a different category [14] or, alternatively, a property is ascribed to a thing that could not possibly have that property. An example is the metaphor “time flies”, which, if taken literally, is not just false but a category mistake [14]. To show that a category mistake has been committed, one must typically show that once the phenomenon in question is properly understood, it becomes clear that the claim being made about it could not possibly be true.

Thus the question posed in the title of this paper can be considered neither true nor false, but a category mistake. To avoid misunderstanding, however, we disregard this problem by treating the question literally, not as a metaphor.

Conclusion

Computers do not care (in the sense of “duty of care”), but not in the way that is usually implied. Computers are not careless (willfully neglecting or rejecting relationships with others), because they do not have the capacity to recognize or respond to such relationships.

The paradox arises because of the mistaken (language-motivated) application of human characteristics to a machine. Computers do not take account of the consequences of their actions, so the care with which they execute instructions is not of the kind that the word means in normal usage. Caring is a real (human) skill. 

This is not to exclude computers from being used for specific purposes associated with caring, but it is wrong to claim that “care robots” really do care about their subjects, or to expect them to carry any responsibility in that relationship.

Acknowledgement

I wish to thank the anonymous referee who drew my attention to the work on affective computing mentioned above, reassuring me that others have already recognized the problem and are seeking to address it.

References

  1. “Care robots could revolutionize UK care system and provide extra staff support”. Press release, Department for Business, Energy & Industrial Strategy.
  2. Backus JW, Beeber RJ, Best S, Goldberg R, Haibt LM, et al. “The FORTRAN Automatic Coding System”. Western Joint Computer Conference. 1957 Feb;188–198.
  3. Backus JW, Bauer FL, Green J, Katz C, McCarthy J, et al. “Revised Report on the Algorithmic Language Algol 60”. Edited by Peter Naur. 1960.
  4. Picard RW. “Affective Computing”. Media Laboratory Perceptual Computing Section Technical Report No. 321; and Affective Computing. MIT Press, Cambridge, Massachusetts.
  5. Luneski A, Konstantinidis E, Bamidis PD. “Affective medicine: a review of affective computing efforts in medical informatics”. Methods Inf Med. 2010 Apr;49(3):207–218.
  6. Brigham TJ. “Merging Technology and Emotions: Introduction to Affective Computing”. Med Ref Serv Q. 2017 Dec;36(4):399–407.
  7. Brave S, Nass C, Hutchinson K. “Computers that care: investigating the effects of orientation of emotion exhibited by an embodied computer agent”. International Journal of Human-Computer Studies. 2005 Feb;62(2):161–178.
  8. Harvey RJ. “Can computers think? Differences and similarities between computers and brains”. Prog Neurobiol. 1995 Feb;45(2):99–127.
  9. Wikipedia: Duty of care in English law. https://en.wikipedia.org/wiki/Duty_of_care_in_English_law
  10. Pyle IC. “Software for the Linesman Radar Data Processing System”. The Computer Journal. 2019 Jun;62(6):806–819.
  11. McLatchie RCF. “HUW, an interactive computer system on IBM System 360/65”. SEAS XIV Conference, Grenoble. 1969.
  12. Pyle I. “A British Multi-Access System on an IBM System/360 Computer”. Current Trends in Computer Sciences & Applications. 2019 Nov;1(4).
  13. Pyle IC, McLatchie RCF, Grandage B. “A Second-Order Bug with Delayed Effect”. Softw Pract Exp. 1971;1:231–233.
  14. Wikipedia: Category mistake. https://en.wikipedia.org/wiki/Category_mistake