IBM’s Annual “5 in 5” List Senses Artificial Intelligence On The Horizon

Every year, IBM releases a short list of technological innovations it believes will change our lives in the next five years. They call it “5 in 5.” It may sound impossible, but this longstanding leader in the tech industry has been pretty accurate in the past. Many of the past 5 in 5 technologies are working their way into the real world (or have already arrived).

This year, the annual list of predictions took on an unusual theme. We usually think of technology as something inherently “un-human,” but IBM believes that boundary may blur even further in the future. This year’s “5 in 5” is organized into five sensory categories, highlighting innovations that will touch our lives and see us into the future.


Image via IBM

We’re no longer living in a time when machines can only operate after receiving commands from a human. We’ve already entered the era of cognitive computing, where systems learn instead of passively relying on programming. According to IBM, emerging technologies will continue to push the boundaries of human limitations to enhance and augment our senses with machine learning, artificial intelligence (AI), advanced speech recognition and more.

Touch (above) – We already interact with touchscreens on phones and tablets, but this technology will likely become far more sophisticated in the next five years. IBM predicts that infrared and haptic technologies will enable a smartphone’s touchscreen and vibration capabilities to simulate the physical sensation of touching an object. Soon, shopping online could include actually feeling the softness of a sweater or the coolness of a leather jacket.

Sight


Image via IBM

Rather than relying on metadata or hidden keywords to return relevant images during a web search, IBM says that future computers will actually be able to “see” the meaning of certain pictures. In the future, computer vision might save a life by analyzing patterns to make sense of visuals in the context of big data. These systems could gather information and detect anomalies—such as spotting a tiny area of diseased tissue in an MRI and weighing it against the patient’s medical history—for faster, more accurate diagnosis and treatment.

Hearing – In the future, computers could be equipped with ultra-sensitive sound monitoring equipment. With these artificial ears, machines will be able to pick up on and interpret sounds in their environment, from the creaking of a tree during a storm to a baby’s cries. IBM inventors predict that soon a computer might be able to tell you if that tree is about to break and fall into the road, or if the baby is sick rather than just hungry.

Smell and Taste

We associate aroma, both good and bad, with our noses, but it’s really our brain’s ability to process chemicals that lets us know the pleasure of baking bread or the stench of spoiled milk. “In five years, computers will be able to construct never-before-heard-of recipes to delight palates – even those with health or dietary constraints – using foods’ molecular structure,” writes Dr. Lav Varshney, an IBM research scientist. That means even if your diet calls for rice cakes, they could be made to taste as amazing as a steak dinner.

Beth Buczynski is a freelance writer and editor currently living in the Rocky Mountain West. Her articles appear on Care2, Ecosalon and Inhabitat, just to name a few. So far, Beth has lived in or near three major U.S. mountain ranges, and is passionate about protecting the important ecosystems they represent. Follow Beth on Twitter as @ecosphericblog