On
a recent family outing, my mother and sister got into a shouting match.
But they weren't mad at each other — they were yelling at the iPhone's
turn-by-turn navigation system. I interrupted to say that the phone
didn't understand — or care — that they were upset. "Honey, we know," my
mom replied.
"But it should!" She had a point.
After all, computers and technology are only becoming smarter, faster
and more intuitive. Artificial intelligence is creeping into our lives
at a steady pace. Devices and apps can anticipate what we need,
sometimes even before we realize it ourselves. So why shouldn't they
understand our feelings?
If emotional reactions
were measured, they could be valuable data points for better design and
development. Emotional artificial intelligence, also called affective
computing, may be on its way. But should it be? After all, we're already
struggling to cope with the always-on nature of the devices in our
lives. Yes, those gadgets would be more efficient if they could respond
when we are frustrated, bored or too busy to be interrupted, yet they
would also be intrusive in ways we can't even fathom today.
It
sounds like a science-fiction movie, and in some ways it is. Much of
this technology is still in its early stages, but it's inching closer to
reality. Companies like Affectiva, a start-up spun out of the MIT Media
Lab, are working on software that trains computers to recognize human emotions based on facial expressions and physiological responses. A
company called Beyond Verbal, which has just raised close to $3 million
in venture financing, is working on a software tool that can analyze
speech and, based on the tone of a person's voice, determine whether it
indicates qualities like arrogance or annoyance, or both.
Microsoft
recently revealed the Xbox One, the next-generation version of its
flagship game console, which includes an update of Kinect, its
motion-tracking device that lets people control games by moving their
hands and bodies. The new Kinect, which goes on sale later this year,
can be controlled by voice but is not programmed with software to detect
emotions in those interactions. But it does include a higher-definition
camera capable of tracking fine skeletal and muscular changes in the
body and face.
The machine can already detect
the physics behind bodily movements, and calculate the force behind a
punch or the height of a jump. In addition, one of the Kinect's new
sensors uses infrared technology to track a player's heartbeats. That
could eventually help the company detect when a player's pulse is racing, whether from exertion during a fitness contest or from excitement after winning a game.
For avid gamers like me, the possibilities for more immersive,
interactive play are mind-boggling.
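Some of what the Kinect computes is plain kinematics. As a rough illustration, here is a minimal Python sketch of two of the calculations mentioned above: deriving jump height from hang time, and flagging a racing pulse. The function names, the 40 percent margin, and the sample numbers are all assumptions for illustration, not anything Microsoft has published.

```python
# Illustrative only: these names and thresholds are assumptions,
# not Microsoft's Kinect API.

G = 9.81  # gravitational acceleration, m/s^2


def jump_height(hang_time_s: float) -> float:
    """Estimate jump height from total hang time.

    The rising and falling halves of a jump are symmetric, so the
    player falls for hang_time/2 seconds:
        h = (1/2) * G * (t/2)^2 = G * t^2 / 8
    """
    return G * hang_time_s ** 2 / 8.0


def pulse_is_racing(recent_bpm: list[float], resting_bpm: float) -> bool:
    """Call a pulse "racing" if recent heart-rate samples average
    more than 40 percent above resting rate (an arbitrary margin)."""
    average = sum(recent_bpm) / len(recent_bpm)
    return average > 1.4 * resting_bpm


print(f"{jump_height(0.6):.2f} m")                       # ~0.44 m
print(pulse_is_racing([112, 118, 121], resting_bpm=70))  # True
```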
Albert
Penello, a senior director of product planning at Microsoft, says the
company intends to use that data to give designers insight into how
people feel when playing its games — a kind of feedback loop that can
help shape future offerings and experiences. He says Microsoft takes
privacy very seriously and will require game developers to receive
explicit permission from Xbox One owners before using the data.
Microsoft
says games could even adapt in real time to players' physical responses,
amping up the action if they aren't stimulated enough, or tamping it
down if it's too scary. "We are trying to open up game designers to the
mind of the players," Mr. Penello said. "Are you scared or are you
laughing? Are you paying attention and when are you not?" Eventually, he
said, the technology embedded in the Kinect camera could be used for a
broader range of applications, including tracking reactions while
someone is looking at ads or shopping online, in the hope of
understanding what is or isn't capturing the person's interest.
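To make that feedback loop concrete, here is a minimal sketch of real-time difficulty adaptation of the kind Mr. Penello describes, assuming the game receives a heart-rate reading every few seconds. The thresholds, step size, and function name are invented for illustration; nothing here reflects an actual Xbox API.

```python
# Illustrative only: thresholds and step size are invented, not
# anything Microsoft has described for the Xbox One.

LOW_AROUSAL_BPM = 85     # below this, assume the player is under-stimulated
HIGH_AROUSAL_BPM = 140   # above this, assume the game is too intense
STEP = 0.1               # adjustment per heart-rate reading


def adjust_intensity(intensity: float, heart_rate_bpm: float) -> float:
    """Nudge game intensity (0.0 to 1.0) toward the player's state."""
    if heart_rate_bpm < LOW_AROUSAL_BPM:
        intensity += STEP   # amp up the action
    elif heart_rate_bpm > HIGH_AROUSAL_BPM:
        intensity -= STEP   # tamp it down
    return min(1.0, max(0.0, intensity))


# A calm player gets a steadily more intense game, then a breather.
intensity = 0.5
for bpm in (78, 80, 82, 130, 150):
    intensity = adjust_intensity(intensity, bpm)
    print(f"heart rate {bpm:>3} bpm -> intensity {intensity:.1f}")
```

A real system would smooth the signal and try to distinguish exertion from fear, but the shape of the control loop would be similar.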
But he said those broader tracking applications were not a top priority for the company.
Some companies have experimented with technologies like eye-tracking
software to see what parts of commercials draw the most attention from
viewers.