The Year of AI: 2014 Brings Impressive Advances and a Glimpse Into the Future of Artificial Intelligence

From speakers you can text to sales associate robots making their way through Lowe’s, the last year brought a variety of impressive technological advances that demonstrated the potential of artificial intelligence (AI).

While we are still generations away from software that can fully mimic human intelligence, we are now in an era where humans can have conversations and other interactions with technology in ways that weren’t possible in the past.

Technology Review pointed to Microsoft’s Cortana as a standout example of how technology can now recognize, process and respond to human speech. The virtual assistant is essentially taking applications like Apple’s Siri to the next level.

Cortana is built into the mobile version of Windows and actually learns about the person using the software over time.

While earlier voice recognition software let users launch apps or get quick answers to simple questions, such as a celebrity’s age or the day’s weather, Cortana can actually control certain apps, connect reminders to specific people and accurately follow a series of questions and commands.

“After asking for ‘The best Mexican restaurants in Palo Alto,’ he could narrow down the candidates Cortana listed by asking ‘Which ones take reservations?’ and then ‘Call the second one,’ after making a decision,” Technology Review reported of Joe Belfiore, vice president of Microsoft’s operating systems group, who presented the new software back in April.
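
What makes that exchange work is context: the assistant holds on to its previous answer so that follow-up requests like “the second one” have something to refer back to. The sketch below illustrates the idea in Python; the restaurant data, class and method names are invented for illustration and are not Cortana’s actual implementation.

```python
# Invented data standing in for a search result; Cortana's real pipeline is not public.
restaurants = [
    {"name": "Casa Uno", "takes_reservations": False, "phone": "555-0101"},
    {"name": "El Segundo", "takes_reservations": True, "phone": "555-0102"},
    {"name": "Taqueria Tres", "takes_reservations": True, "phone": "555-0103"},
]

class Assistant:
    """Keeps the last list of results so follow-up requests can refer back to it."""

    def __init__(self):
        self.context = []

    def search(self, results):
        self.context = results
        return [r["name"] for r in results]

    def filter_reservations(self):
        self.context = [r for r in self.context if r["takes_reservations"]]
        return [r["name"] for r in self.context]

    def call(self, position):
        choice = self.context[position - 1]   # "the second one" -> index 1
        return f"Calling {choice['name']} at {choice['phone']}"

assistant = Assistant()
assistant.search(restaurants)        # "best Mexican restaurants in Palo Alto"
assistant.filter_reservations()      # "Which ones take reservations?"
print(assistant.call(2))             # "Call the second one"
# Calling Taqueria Tres at 555-0103
```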

The year also revealed the possibilities of using artificial intelligence to monitor homes through thermostats, security systems and more.

Technologies like the Nest thermostat, which recently entered its second generation of devices, actually learn the user’s preferred temperatures, daily routine and energy costs.

After compiling enough information about its new owner, Nest adjusts accordingly without consumers having to think twice about touching the thermostat.
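
Nest has not published its algorithm, but the behavior described above, learning a preferred setpoint from the owner’s manual adjustments and then applying it automatically, can be sketched in a few lines of Python. Everything here (the class name, the three-observation threshold, the simple averaging) is an illustrative assumption, not Nest’s real logic.

```python
from collections import defaultdict

class LearningThermostat:
    """Minimal sketch of a schedule-learning thermostat (not Nest's real algorithm)."""

    def __init__(self, default_temp=68.0):
        self.default_temp = default_temp
        # hour of day -> temperatures the owner manually chose at that hour
        self.history = defaultdict(list)

    def record_manual_adjustment(self, hour, temperature):
        """Remember what the owner set the thermostat to at this hour."""
        self.history[hour].append(temperature)

    def target_for(self, hour):
        """After enough observations, predict the owner's preference for this hour."""
        observations = self.history[hour]
        if len(observations) < 3:        # not enough data yet: fall back to default
            return self.default_temp
        return sum(observations) / len(observations)   # average of past choices

# Example: after a week of 6 a.m. adjustments, the thermostat anticipates the owner.
stat = LearningThermostat()
for temp in (70, 71, 70, 72, 70):
    stat.record_manual_adjustment(hour=6, temperature=temp)
print(stat.target_for(6))   # ~70.6, applied without the owner touching the dial
```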

Then there was the introduction of Canary, the smart home security device that can defend a consumer’s home from intruders without setting off unnecessary false alarms.

The security system is another artificially intelligent device that learns a family’s schedule and routine and adjusts accordingly.

For example, if the kids always come home from school around 3 p.m., the device knows to expect their arrival and won’t sound an alarm the moment they walk through the door.

If the device detects any sort of unusual activity inside the home, it will send a push notification to the owner’s mobile device alerting them to the behavior.
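
Canary has not disclosed how its detection works either, but “learn the routine, alert on deviations” is essentially simple anomaly detection. The toy monitor below, with invented names and a made-up tolerance window, suppresses alerts for events that fall near learned routine times.

```python
from datetime import time

class RoutineMonitor:
    """Toy anomaly detector: learn expected entry times, flag everything else.
    Illustrative only; Canary's actual logic is not public."""

    def __init__(self, tolerance_minutes=45):
        self.tolerance = tolerance_minutes
        self.expected_minutes = []   # minutes-since-midnight of routine entries

    def learn(self, entry_time):
        """Record a routine entry observed during a training period."""
        self.expected_minutes.append(entry_time.hour * 60 + entry_time.minute)

    def is_unusual(self, entry_time):
        """An entry is unusual if it is far from every learned routine time."""
        minute = entry_time.hour * 60 + entry_time.minute
        return all(abs(minute - m) > self.tolerance for m in self.expected_minutes)

monitor = RoutineMonitor()
monitor.learn(time(15, 0))               # kids come home around 3 p.m.
print(monitor.is_unusual(time(15, 10)))  # False: expected, no alarm
print(monitor.is_unusual(time(2, 30)))   # True: would trigger a push notification
```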

Other new technology from researchers at Facebook can scan different photos and tell whether the same person appears in each one.
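
This is likely a reference to Facebook’s DeepFace research, which turns each face into a numeric “embedding” and then compares the embeddings to decide whether two photos show the same person. The comparison step can be illustrated as follows; the embedding vectors are made up, since the real ones come out of a neural network not reproduced here.

```python
import math

def cosine_similarity(a, b):
    """How closely two face embeddings point in the same direction (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_person(embedding_a, embedding_b, threshold=0.8):
    """Declare a match when the embeddings are similar enough; the threshold is illustrative."""
    return cosine_similarity(embedding_a, embedding_b) >= threshold

# In a real system a neural network would produce these vectors from the photos;
# the numbers below are made up to show only the comparison step.
photo1 = [0.12, 0.87, 0.33, 0.05]
photo2 = [0.10, 0.90, 0.30, 0.07]   # slightly different photo of the same person
print(same_person(photo1, photo2))  # True
```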

Google also revealed a system that can describe images with short, simple sentences, essentially moving technology closer to having the ability to actually “see” as opposed to just “sense.”

Three main factors contribute to technology being able to function more and more like a human: parallel computation, big data and better algorithms.

The more efficiently technology can “think” in parallel, collect and store big data and run better algorithms, the more human-like it becomes.
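
Of those three factors, parallel computation is the most concrete to demonstrate: the same workload finishes sooner when it is split across processor cores. A minimal Python sketch using the standard multiprocessing module, where the workload itself is an arbitrary stand-in:

```python
import time
from multiprocessing import Pool

def score(n):
    """Stand-in for any expensive, independent computation (e.g., scoring one image)."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 8

    start = time.time()
    serial = [score(n) for n in jobs]              # one core, one job at a time
    print(f"serial:   {time.time() - start:.2f}s")

    start = time.time()
    with Pool(processes=4) as pool:                # same jobs spread across 4 workers
        parallel = pool.map(score, jobs)
    print(f"parallel: {time.time() - start:.2f}s")

    assert serial == parallel                      # same answers, less wall-clock time
```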

These are the reasons 2014 introduced the world to more accurate body trackers that let players’ real-world motions drive their video games, cars that could drive and park themselves, and even Facebook news feeds and search engine results so accurate and personalized that many people grew uncomfortable with how smart technology could really be.

Perhaps the greatest AI superstar, however, is still a famous computer by the name of Watson.

Watson famously won a game of Jeopardy in 2011, and this year it came remarkably close to breaking barriers in cancer treatment.

IBM announced that a version of its futuristic supercomputer is close to being able to use genomic data to choose personalized treatment plans for cancer patients.

The quickly advancing technology also has some experts predicting that we aren’t too far from seeing some of our wildest sci-fi dreams come to life.

According to Ray Kurzweil, the author of five books on AI and a co-founder of the futurist organization Singularity University, we could start seeing more human-like machine intelligence integrated into our daily lives by 2029.

Could IBM’s Watson Get to the Bottom of the Issues in Ferguson?

IBM’s “cognitive supercomputer” is starting to get involved in law enforcement, and now there is speculation that the data-crunching device could get to the bottom of issues with law enforcement in Ferguson, Missouri, and other Black communities across the nation.

IBM’s supercomputer, better known as Watson, garnered a lot of attention after it soared to victory on the Jeopardy game show more than three years ago.

Since then, the computer has been used in matters pertaining to food science, customer service and helping veterans prepare for life after the military.

Now, authorities are hoping Watson can get more involved with police investigations like the shooting of Michael Brown, the unarmed teen who was fatally shot Aug. 9 by Ferguson police officer Darren Wilson.

Police investigations very quickly lead to thousands upon thousands of pages of reports, statements and lab results that investigators have to sort through.

While this can be a lengthy process for a human and leave tons of room for someone to miss a connection, Watson has the potential to complete years of work in a matter of seconds.

“There may be something in lead No. 25 that doesn’t make sense until you get to lead No. 2,050,” Tucson, Arizona, chief of police Roberto Villasenor told Mashable.com. “How is a human going to tie those things together? Cognitive computing can.”
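
Watson’s internals are proprietary, but the capability Villasenor describes, tying lead No. 25 to lead No. 2,050, comes down to cross-referencing shared details across far more documents than any one reader can hold in mind. A toy version of that cross-referencing, with invented lead data:

```python
from collections import defaultdict
from itertools import combinations

# Invented leads: in practice these details would be extracted from thousands of
# pages of reports, witness statements and lab results.
leads = {
    25:   {"blue sedan", "plate ABC-123"},
    140:  {"pawn shop", "silver watch"},
    2050: {"plate ABC-123", "silver watch"},
}

# Index every detail back to the leads that mention it.
detail_index = defaultdict(set)
for lead_id, details in leads.items():
    for detail in details:
        detail_index[detail].add(lead_id)

# Any detail shared by two or more leads links those leads together.
links = defaultdict(set)
for detail, lead_ids in detail_index.items():
    for a, b in combinations(sorted(lead_ids), 2):
        links[(a, b)].add(detail)

for (a, b), shared in links.items():
    print(f"lead {a} <-> lead {b}: shared {sorted(shared)}")
# lead 25 <-> lead 2050: shared ['plate ABC-123']
# lead 140 <-> lead 2050: shared ['silver watch']
```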

While authorities hope to get Watson involved in their investigations, Villasenor made it clear that humans will still need to be very involved in following up on leads and verifying Watson’s results.

“It cannot be a computer or a human analysis,” he said. “It has to be an ‘and.’ We say, ‘Watson said this – let’s go check it out.’ ”

The analytical power behind the supercomputer could allow it to develop a deeper understanding of the policing issues currently being debated in situations like Ferguson.

Many influential figures are launching national discussions to figure out how to address the law enforcement issues in urban communities that have led to Black men being killed and aggressive police tactics being used.

This is where Villasenor believes Watson could help sort through the chaos in Ferguson.

“There are a lot of theories being thrown out in the news media,” he said. “Being able to trudge through all the information and data, and put out accurate information, as opposed to speculation or analysis based on speculation and supposed truth that’s being put out through third-party hearsay … You need to filter through that.”

As Mashable writer Pete Pachal pointed out, it isn’t fair to say that all the comments regarding Ferguson have been “third-party hearsay.”

“Most of the commentators on Ferguson cite some statistics or studies to support their point of view,” Pachal wrote.

The difference, however, is that Watson would be able to digest more data, link related information and sort through more files than humans can alone.

With police militarization prominent in the media, Watson could sort through the records that reveal what type of equipment was sold to which police agencies and departments, and how that equipment has been used so far.

By analyzing this information, Watson could help identify whether more aggressive tactics are being used disproportionately in Black, urban communities.
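
Stripped of the technology, the analysis described here is aggregation: group equipment-transfer and incident records by department, compute rates, and then compare those rates against the demographics of the communities each department serves. A minimal sketch with made-up records, not real data:

```python
from collections import Counter

# Made-up records for illustration only; a real analysis would draw on actual
# equipment-transfer logs, incident reports and community demographic data.
records = [
    {"department": "Dept A", "equipment": "armored vehicle", "deployed": True},
    {"department": "Dept A", "equipment": "riot gear",       "deployed": True},
    {"department": "Dept B", "equipment": "riot gear",       "deployed": False},
]

received = Counter(r["department"] for r in records)
deployed = Counter(r["department"] for r in records if r["deployed"])

for department in received:
    print(f"{department}: received {received[department]} items, "
          f"deployed {deployed[department]} so far")

# Joining these per-department figures with demographic data about the communities
# each department serves is what would let an analyst test the disparity question.
```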

“There are mounds of information out there that we’re going to need help sorting through to help us not necessarily answer the question, but at least define the problem,” Villasenor added. “We need to get the data-driven information, and not go with anecdotal information because there’s a lot of emotion behind it. We need to try and get past the emotion and find the truth. It may be bad, but we need to find out what it is so we can adjust.”