2018–2019 Academic Year, PEP (People's Education Press) Elective 7, Unit 1 Living Well, Reading: Lesson Assignment (8)

Section 1  Reading Comprehension

  Every day, we are moving closer to some kind of artificial intelligence(人工智能). Progress in big data, machine learning and robotics is going to give us a world where computers are effectively intelligent in terms of how we deal with them. Should you be scared by this? Absolutely, but not in the usual "robot overlords" (机器人帝国) kind of way. Instead, the real fear should be about getting human beings wrong, not getting AI right.

  The key to the technology is the ability of computers to recognize human emotions based on the "activation" of muscles in the face. A computer can identify the positions of facial muscles and use them to infer the emotional state of its user. Then the machine responds in ways that take that emotional state into account.

  One potential application is to provide "emotional robots" for the elderly. Having a machine that could speak in a kind way would comfort a lonely older person. That is a good thing, right? But that shouldn't relieve us from asking how we ended up in a society that hands the elderly over to machines because we don't know what else to do with them. Can't we find more humane solutions than robots?

  "Emotion data" aren't the same thing as the real and vivid emotional experiences we human beings have. Our emotions are more than our faces or voices. How can they be pulled out like a thread, one by one, from the fabric(组织) of our being?

  Research programs can come with much philosophical(哲学的) concern, too. From the computers' point of view, what the technology captures are emotions, but at its root is a reduction of human experience to whatever outward expressions can be captured algorithmically (计算上). As the technology is used in the world, it can reframe the world in ways that are hard to escape from.

The technology will clearly have useful applications, but once it treats emotions as data, we may find that data is the only aspect of emotion we come to recognize or value. Once billions of dollars flood into this field, we will find ourselves trapped in a technology that is reducing our lives. Even worse, our "emotion data" will be used against us to make money for someone else.