IBM's research across multiple disciplines is hard to ignore. Many other scientists and research groups have dedicated thousands of hours to developing the latest devices, but IBM has been a leader in the field for at least the past decade. IBM Research, the R&D division of IBM, is the largest industrial research organization in the world and has helped create innovations including the ATM, magnetic storage, Fortran, silicon-on-insulator technology, DRAM, and the Watson computing system.
So with all these achievements and some of the brightest researchers in the world on its team, has IBM earned the right to make predictions about the future? Can IBM accurately predict which technological revolutions will change the world and become commonplace?
IBM believes so, and has released its annual 5 in 5: five technological innovations that it predicts will come to fruition in the next five years.
1. AI and Mental Health
First up, IBM predicts that within five years, artificial intelligence systems will use what we write and say not only to determine our mental state but also to diagnose potential mental conditions. Watson, IBM's highly advanced computing system, is already being used in medical diagnosis.
One area where medical professionals tend to fall short is mental health. Systems like Watson, and any AI developed from them, could analyze a patient's mental state based on direct interaction or even their written communications. Based on how individuals articulate themselves and the words they choose, IBM thinks it's possible for a cognitive computer to determine whether someone is becoming depressed, increasingly forgetful, or even more imaginative. Combined with data gathered from wearable tech, this could paint a very clear picture of a patient's mental well-being.
Computers could have a serious advantage here: unlike humans, they can quickly and efficiently sift through large amounts of data to find minute details that aid in diagnosis and treatment.
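As a toy illustration (not IBM's actual method), the kind of linguistic signals such a system might extract from written text can be sketched in a few lines. The feature names and word lists below are purely hypothetical examples:

```python
import re

# Hypothetical features a mental-health screening model might track:
# self-focus (first-person pronouns), negative word choice, sentence length.
NEGATIVE_WORDS = {"sad", "tired", "hopeless", "alone", "worthless"}
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def text_features(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1
    return {
        "first_person_ratio": sum(w in FIRST_PERSON for w in words) / total,
        "negative_ratio": sum(w in NEGATIVE_WORDS for w in words) / total,
        "avg_sentence_length": total / max(len(sentences), 1),
    }

sample = "I feel tired all the time. I just want to be alone."
print(text_features(sample))
```

A real cognitive system would of course learn far subtler patterns from large clinical datasets, but the underlying idea is the same: word choice and phrasing, quantified.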
Using AI to identify mental health issues could be a big step. Image courtesy of IBM Research
2. Hyperimaging and AI
Within the next five years, IBM speculates that hyperimaging will become affordable, portable, and maybe even commonplace. IBM refers to this as "superhero vision".
The human eye sees only a very narrow band of the electromagnetic spectrum but, thanks to hyperimaging and AI, other parts of the spectrum may be brought to light. One example of how an expanded view of the spectrum could help is in cars, where millimeter-wavelength photons are already used: autonomous vehicles that rely on sensors could certainly use the ability to better perceive environmental conditions and objects.
Longer-wavelength light (such as infrared) penetrates fog and mist better than visible light, which is one of the reasons why old street lights were orange. This allows for better detection of obstacles, cars, and pedestrians in dangerous conditions. The data from such a long-wave camera could then be fed into an artificial intelligence system that identifies risks and takes precautions if needed.
Hyperimaging may also find its way into the medical field, where doctors could view a patient in the infrared spectrum to find hot spots that may indicate injury or internal bleeding. They could also use shorter wavelengths to look for potential skin damage.
IBM's hyperimaging prototype. Image courtesy of IBM Research
3. Macroscope Data Organization
Understanding individual datasets is considerably easier than trying to understand the bigger picture. With the massive amounts of data gathered by sensors and online systems, however, it becomes increasingly difficult for human researchers to discern relevant information.
IBM has predicted that machine learning algorithms will be used to gather quantities of information that are otherwise beyond our understanding (e.g., try to comprehend the number of stars in the universe; you just can't). The key step is that the algorithms could then condense that information so that it conveys comprehensible meaning to humans. IBM calls these systems "macroscopes": a way to make sense of vast amounts of data.
Considering how integrated the world is becoming with electronics and the internet, there is too much information for scientists to study effectively. Instead of spending valuable time understanding data, researchers spend considerable time cleansing and checking the data to be analyzed. By using AI and advanced algorithms to handle this data sanitization, scientists will finally be able to look at large systems and be presented with data they can actually comprehend.
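To make the sanitization step concrete, here is a minimal sketch (an illustrative example, not IBM's pipeline) of the kind of cleaning a macroscope-style system would automate: dropping missing sensor readings and rejecting obvious glitches before handing a human a summary. The tolerance value is an arbitrary assumption:

```python
from statistics import mean, median

def sanitize(readings, tolerance=5.0):
    """Drop missing values, then reject readings far from the median."""
    valid = [r for r in readings if r is not None]
    m = median(valid)
    return [r for r in valid if abs(r - m) <= tolerance]

def summarize(readings):
    """Condense raw sensor data into a human-readable summary."""
    clean = sanitize(readings)
    return {"count": len(clean), "mean": round(mean(clean), 2)}

# Raw temperature readings with dropouts (None) and a sensor glitch (999.0)
raw = [21.1, 20.8, None, 21.3, 999.0, 20.9, None, 21.0]
print(summarize(raw))
```

The point is scale: a researcher can eyeball eight readings, but not eight billion, which is where automated cleaning becomes indispensable.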
Researchers using macroimaging techniques. Image courtesy of IBM Research
4. Medical "Lab-on-a-Chip” Tech
IBM has predicted that nanotechnology and silicon devices will help catch diseases earlier than ever before. Implanted devices could constantly take measurements of samples such as blood or urine (AKA "liquid biopsies") and immediately alert you to see a doctor if any readings look suspicious. The idea is to take a biochemistry lab and essentially shrink it down to a single silicon chip, a so-called "lab-on-a-chip".
Lab-on-a-chip technology has been in development for some time, including in cancer detection, but IBM's prediction is that it will achieve widespread use. This could have major repercussions for the medical field. One major advantage of such technology is that it could prevent diseases from becoming terminal by catching problems in their infancy.
“At IBM Research, scientists are developing lab-on-a-chip nanotechnology that can separate and isolate bioparticles down to 20 nanometers in diameter, a scale that gives access to DNA, viruses, and exosomes." —IBM Research
An implanted medical device could potentially save lives if it could identify the presence of viral or bacterial infections early on. For example, rabies is treatable so long as the vaccine is administered as soon as possible after exposure to the virus, yet diagnosing rabies can be difficult in the early stages because it is commonly mistaken for other diseases. If the technology works, there is no doubt that it would completely change the way diseases, infectious or otherwise, are identified.
Researchers demonstrating a prototype of a medical sensor. Image courtesy of IBM Research
5. Smart Sensors and the Environment
IBM expects smart sensors not only to become cheaper but also to be built into environmental measuring systems. One potential application is the installation of sensors near gas and oil fields that can measure very small amounts of leakage. Such measurements could help scientists understand the true impact of mining and mineral extraction on the environment and thus develop systems to mitigate environmental damage (or even hazardous conditions for workers). Sensor readings of pollution could also help pinpoint leaks so that repairs could be done in minutes instead of weeks.
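As a crude illustration of pinpointing a leak from a sensor grid (real source-localization models also account for wind and gas dispersion, so treat this as a toy example), one can estimate the source as the concentration-weighted centroid of the readings:

```python
# Toy leak localization: weight each sensor's position by the methane
# concentration it reports and take the weighted average position.
def estimate_leak(sensors):
    """sensors: list of (x, y, concentration_ppm) tuples; returns (x, y)."""
    total = sum(c for _, _, c in sensors)
    x = sum(sx * c for sx, _, c in sensors) / total
    y = sum(sy * c for _, sy, c in sensors) / total
    return (x, y)

# Four sensors at the corners of a 100 m square; one reads far higher.
readings = [(0, 0, 1.0), (100, 0, 8.0), (0, 100, 1.0), (100, 100, 2.0)]
print(estimate_leak(readings))
```

The estimate is pulled toward the sensor reporting 8 ppm, which is the intuition behind using dense, cheap sensor networks: more readings mean a tighter fix on the source.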
IBM Research is working with natural gas producers to develop intelligent methane monitoring systems, as well as researching silicon photonics, which transfers data by light and thus allows for faster computation. The idea is to embed these sensors in the ground, in infrastructure, and even on drones for maximum coverage.
Researchers with the methane sensor in development. Image courtesy of IBM Research
Prediction or Fantasy?
Tech predictions often fail miserably, and any of IBM's 5 in 5 could easily become another unfulfilled prophecy (say, how nuclear fusion has been 20 years away since the 1960s).
Are IBM's predictions accurate or just plain silly? An industry giant like IBM has an incentive to sensationalize future technologies, but these predictions are most likely grounded in its current technological capabilities.