Artificial Intelligence Is Trying to Learn Human Intent

Artificial Intelligence has mastered the art of understanding and responding to simple statements such as "Siri, what day is it today?" or "Alexa, play Love Me Like You Do". But these assistants are machines, not humans, which is why they cannot yet understand and respond to more complicated statements.

Yejin Choi, a natural language processing researcher at the University of Washington in Seattle, agrees that artificial intelligence still needs a great deal more work.

Artificial intelligence usually fails when it has to pick up on subtleties such as tone or idioms. Moments like these remind scientists how much still needs to be built into machines with artificial intelligence.

1. What are researchers doing to achieve a more human-like machine?

Researchers and scientists are working day and night to enable Artificial Intelligence to understand and respond to words and phrases whose meaning goes beyond their strict dictionary definitions.

At a recent Artificial Intelligence conference, one group presented a system that analyses what a person is actually saying, while another presented an Artificial Intelligence that can distinguish literal from figurative phrases in written text.

Louis-Philippe Morency, an artificial intelligence researcher at Carnegie Mellon University in Pittsburgh, believes that a person's facial expressions or tone of speech can easily change the meaning of a sentence.

He explains that when a person watches a bad movie and then calls it "sick" with a particular pitch and a frown, the word does not mean what "sick" usually indicates.

That is why Louis-Philippe Morency and his team came up with the idea of teaching Artificial Intelligence this difference simply by having it watch YouTube videos.

The hope is that a machine can then tell apart words that are the same but carry more than one meaning, using the speaker's facial expression and the pitch of their voice to decide which meaning is intended.
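As a rough sketch of this idea (and not Morency's actual system), the Python snippet below assumes each video clip has already been reduced to three feature vectors: one for the words, one for the audio pitch and prosody, and one for the facial expression. The feature dimensions, network sizes, and random toy inputs are illustrative assumptions, not details taken from the research.

# Minimal sketch of multimodal sentiment classification: three small encoders,
# one per modality, whose outputs are concatenated and mapped to a
# negative/positive sentiment score. All sizes here are assumptions.
import torch
import torch.nn as nn

class MultimodalSentimentNet(nn.Module):
    def __init__(self, text_dim=300, audio_dim=74, visual_dim=35, hidden=64):
        super().__init__()
        # One encoder per modality: word embeddings, pitch/prosody statistics,
        # and facial-expression features.
        self.text_enc = nn.Sequential(nn.Linear(text_dim, hidden), nn.ReLU())
        self.audio_enc = nn.Sequential(nn.Linear(audio_dim, hidden), nn.ReLU())
        self.visual_enc = nn.Sequential(nn.Linear(visual_dim, hidden), nn.ReLU())
        # Fusion layer maps the concatenated encodings to two classes.
        self.classifier = nn.Linear(hidden * 3, 2)  # negative vs. positive

    def forward(self, text, audio, visual):
        fused = torch.cat(
            [self.text_enc(text), self.audio_enc(audio), self.visual_enc(visual)],
            dim=-1,
        )
        return self.classifier(fused)

# Toy forward pass on random clip-level features for a batch of 4 videos.
model = MultimodalSentimentNet()
logits = model(torch.randn(4, 300), torch.randn(4, 74), torch.randn(4, 35))
print(logits.softmax(dim=-1))  # per-clip probabilities of negative/positive

Training a model like this on labelled video clips is what would let facial and vocal cues override the dictionary meaning of a word like "sick".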

2. What outcome did they achieve?

The method worked. The Artificial Intelligence correctly identified the positive and negative emotions a video contained 78 percent of the time, Louis-Philippe Morency and his team reported on January 31. The researchers also measured how well it told individual emotions and expressions apart, and some emotions proved easier than others. For example, it recognised happiness and sadness with roughly 83.4 to 87.3 percent accuracy, while neutral expressions were identified only about 69.7 percent of the time.

Louis-Philippe Morency is now working on ways for Artificial Intelligence to recognise when someone is being sarcastic. That would be a commendable achievement, and an important one, as Artificial Intelligence becomes more and more involved in our daily lives.