Answer Writing Practice for UPSC IAS Mains Exam: Paper - III (General Studies – II) - 11 April 2019




UPSC Syllabus:

  • Paper - III : General Studies - II : Governance, Constitution, Polity, Social Justice and International Relations

Q. Rapid interaction with social machines may lead to long-term psychological and social effects. Suggest measures to overcome these challenges. (250 words)

Model Answer:

  • Introduction
  • Machines are learning about us
  • Using Social Machines
  • Case study
  • Associated challenges
  • Conclusion

Introduction:

Over the past few years, we have seen a rapid proliferation of smart home devices. These gadgets understand the sentences we direct at them well enough to respond with actions or conversation, with a level of intelligence that has been growing steadily better and more human with each passing year.

Machines are learning about us:

  • The more we use them, the more they learn about us and the world each of us uniquely inhabits, until eventually they will respond so convincingly to everything we tell them that they become indistinguishable from human companions.
  • Considering the rate at which they are progressing, we can already imagine a future in which touch-based input devices seem as quaint and charmingly old-fashioned as handwritten letters. For an entire generation, typing will be a skill they never needed to master. That future is not as far away as it might seem.

Using Social Machines:

  • As we have grown more trusting of our smart devices, and as they have begun to understand exactly what we mean when we speak to them, the uses to which they can be put have started to exceed even our wildest imaginings.
  • Elderly people, particularly those with failing mental faculties, have begun to lean on these devices for answers, knowing that even if they ask the same question for the 100th time, they will receive the same patient response, something no human caregiver could be expected to provide.
  • As these conversational devices become the hub for all the connected devices in a home, they will be able to actively monitor the well-being of its inhabitants and guide them verbally on what to do in case of an emergency.
  • We have already seen the ease with which children interact with devices. We have all been guilty of encouraging them to hold long conversations with the smart assistants on our phones when we want to keep them occupied because we are busy with something else.
  • Toy manufacturers such as Mattel have seized this opportunity to produce interactive toys that can actively engage with their young owners. The Hello Barbie doll uses cloud-based Artificial Intelligence (AI) to converse with children on topics as diverse as music, fashion and careers, as well as abstract emotional topics such as how the child is feeling.

As good as these devices are as caregivers, no one has studied the long-term psychological effects of interaction with social machines, particularly on the very old or the very young. Before we allow these devices to ensconce themselves firmly in our world, we would do well to evaluate how our increased reliance on these seemingly intelligent devices is going to affect us.

Case study:

  • A couple of years ago, police officers investigating a murder in Bentonville, Arkansas, issued a warrant requesting all electronic data in the form of audio recordings, transcribed records or other text records from a smart home device at the scene of the crime.
  • Though they knew that the device was not always listening and recording, they were hoping that the device might have been intentionally woken up to play a song at an opportune moment and they were hoping that an analysis of the background audio might offer evidence of an argument or fight.
  • As much as requests like these raise questions of privacy and freedom of speech, it is likely that courts will allow investigators access to this information if it could help solve a crime.

Associated challenges:

  • As we use these smart devices for a wider range of purposes, we are going to face a number of legal and ethical questions about how to handle what these devices are told.
  • For instance, what should one do when a child tells his smart home device that his uncle is touching him in an unwelcome way? Does the manufacturer of the device have a moral obligation to report this information to law enforcement?
  • Does that obligation assume greater urgency if there is credible risk that the child might be harmed?
  • If an entity that had information about such a threat consciously chose not to report it, will it be liable if the child in question is harmed or, worse, dies?
  • How does all of this square with the general and overarching obligation to be particularly sensitive to a child’s right to privacy?

Conclusion:

These are questions that manufacturers of smart devices already have to answer as they build an ever-increasing library of responses to the conversations that their smart devices are having. As users have become more and more comfortable with confiding in their smart assistants, they have already begun to ask their artificial companions for help in dealing with suicidal feelings and clinical depression.

In such situations, the right response from the smart device can mean the difference between life and death. Conversational AI programmers have had to collaborate with psychologists to figure out what those responses should be, but they will inevitably make mistakes as they grapple with increasingly complex situations.
