Why Artificial Intelligence Will Be Analog

December 6, 2016

For many people, the word “digital” is synonymous with modern, technologically advanced programs and devices that can perform complex processes in a fraction of the time a manual or analog system would need. “Analog,” on the other hand, is generally thought of as old-fashioned, something that needs to be converted to digital to keep pace with modern technology. Yet much of the technology we take for granted actually runs on analog components.

In fact, one of the most transformative trends in technology today relies heavily on analog technology. When you think of artificial intelligence, you probably picture robots, or at least high-powered computers like IBM’s Watson, the epitome of modern technology. You probably don’t think of analog, which captures real-time data by measuring changes in the signals produced by physical phenomena.

The Executive Office of the President recently said that advances in AI will make it easier than ever to search public records and streamline healthcare. According to the White House report, this technology has already benefited patients at Johns Hopkins University and other medical institutions.

Analog is relatively simple and, in a sense, more exact than digital, which only approximates the analog signal with a set of discrete values. But analog also lacks flexibility: the more complex the signal, the less reliably it can be carried (especially over longer distances). As technology advanced and precision became an even greater priority, digital saw a rise in use.
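To see what that digital “representation of the values” looks like, here is a minimal sketch of quantization, assuming a hypothetical 8-bit converter. The sine wave stands in for any continuous analog signal; every number here is illustrative, not a real device’s spec:

```python
import math

BITS = 8                       # resolution of the hypothetical converter
LEVELS = 2 ** BITS             # number of discrete output values
V_REF = 1.0                    # full-scale voltage of the converter

def quantize(voltage):
    """Map a continuous voltage in [0, V_REF] to a discrete code."""
    code = int(voltage / V_REF * LEVELS)
    return max(0, min(LEVELS - 1, code))   # clamp to the valid range

# Sample a continuous "analog" sine wave 16 times over one cycle.
samples = [0.5 + 0.5 * math.sin(2 * math.pi * t / 16) for t in range(16)]
codes = [quantize(v) for v in samples]
print(codes)   # discrete approximations of the continuous signal
```

The discrete codes are easy to store, copy, and transmit without degradation, which is exactly the flexibility the continuous signal lacks.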

So what does all of this have to do with artificial intelligence?

What’s a Robot, Anyway?

When most of us think of robots, we think of Rosie from “The Jetsons,” the droid C-3PO from “Star Wars,” or something similar. In these representations, the robots take a humanoid form and act like humans in many ways, typically lacking only emotion. However, the idea of a robot has expanded considerably in recent years. Many of us carry devices with built-in “personal assistants” like Siri or use a device like the Amazon Echo in our homes. While these devices might not be able to walk the dog or launch an attack on an enemy starship, they do “learn” about us and our preferences in order to become more useful.

The problem, though, is that as these robots become more intelligent, they consume more power and need to remain connected at all times in order to work. Whether it’s a personal assistant like Amazon’s Alexa or a smart thermostat, the device has to run on a computer that is small and low-power, which is where analog comes in.

In essence, analog is similar to the human brain. Consider the human body in comparison to a fitness monitor like a Fitbit. These devices are both analog and digital. Analog sensors collect the data: the number of steps you take, your heart rate, and so on. That data is then run through an analog-to-digital converter so the readings can be processed by the algorithms in the device’s microprocessor, putting the raw data into a form we can use. Over time, the device “learns” your patterns, so it can make recommendations about how many steps you should take, how much sleep you need, and more.
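As a rough illustration of that processing stage, here is a sketch of how digitized sensor readings might be turned into a step count. The threshold-crossing logic and every number in it are invented for illustration; this is a stand-in for whatever algorithm a real tracker uses, not Fitbit’s actual method:

```python
THRESHOLD = 600  # hypothetical digitized acceleration level marking a step

def count_steps(readings):
    """Count upward crossings of THRESHOLD in digitized sensor data."""
    steps = 0
    above = False
    for value in readings:
        if value > THRESHOLD and not above:
            steps += 1               # rising edge: one step detected
        above = value > THRESHOLD
    return steps

# Digitized samples from a hypothetical 10-bit converter (0..1023).
readings = [512, 650, 400, 700, 380, 690, 500]
print(count_steps(readings))         # -> 3
```

The analog sensor does the measuring; the digital side only interprets the codes the converter hands it.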

This is a form of artificial intelligence, and the same concepts apply to other AI devices as well. As science historian George Dyson points out, the brain itself is essentially an analog computer. Our senses take in information in analog form, whether sight, sound, or smell, which is then processed by the neural pathways of the brain. Those pathways are essentially algorithms, and just as an algorithm has nothing to compute without input, your brain cannot process information without analog (sensory) input.

In other words, artificial intelligence is never going to work if all we do is write instructions for how it should process information. For something to be truly intelligent, it needs to be built so that it can naturally adapt and evolve based on inputs, much as your fitness tracker cannot set a sensible daily step goal until it has collected data on how much you walk each day (see the sketch below).
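A toy sketch of that idea: the suggest_goal function below is a hypothetical helper, not any real tracker’s code, and the 10% bump is an invented heuristic. The point is simply that with no inputs, it has nothing to recommend:

```python
def suggest_goal(daily_steps):
    """Suggest tomorrow's step goal from the history collected so far."""
    if not daily_steps:
        return None                      # no data yet: nothing to learn from
    average = sum(daily_steps) / len(daily_steps)
    return int(average * 1.10)           # nudge slightly above the average

print(suggest_goal([]))                  # None -- no inputs, no intelligence
print(suggest_goal([6200, 7400, 6800]))  # -> 7480
```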

The possibilities for artificial intelligence and how it can change our lives are almost unimaginable. However, it’s important to realize that as advanced as the concept of AI may be, it’s actually heavily reliant on what many consider “old” technology — in fact, it won’t be able to work at all without analog inputs.
