Analytics / Business Intelligence

Why Artificial Intelligence Will Be Analog

Sean Mallon
6 Min Read

For many people, the word “digital” is synonymous with modern, technologically advanced programs or devices capable of performing complex processes in a fraction of the time that a manual or analog system would need to do the same thing. Analog, on the other hand, is generally thought of as old-fashioned, something that needs to be converted to digital to keep pace with modern technology, even though much of what we take for granted in technology actually runs on analog components.

In fact, one of the most transformative trends in technology today relies heavily on analog technology. When you think of artificial intelligence, you probably think of robots, or at least high-powered computers like IBM’s Watson, the epitome of modern technology. You probably don’t think of analog, with its reliance on capturing real-time data and measuring the changes in signals put out by physical phenomena.

The Executive Office of the President recently said that advances in AI will make it easier than ever to search public records and streamline healthcare. According to the White House report, this technology has already benefited patients at Johns Hopkins University and other medical institutions.

While analog is relatively simple and more accurate than digital, which converts the analog signal into a discrete representation of its values, it also suffers from a lack of flexibility. The more complex the analog signal, the less reliable it becomes (especially over longer distances), and as technology advanced and precision became an even greater priority, digital saw a rise in use.
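To make that conversion concrete, here is a minimal sketch in plain Python of what a simple analog-to-digital converter does: a continuous value is mapped onto a small number of discrete levels, and the leftover quantization error is the accuracy given up in exchange for digital's flexibility. The 3-bit resolution and the sine-wave "signal" are illustrative assumptions, not any particular device.

```python
import math

def quantize(value, bits=3, v_min=-1.0, v_max=1.0):
    """Map a continuous (analog) value onto one of 2**bits discrete levels,
    roughly the way a simple analog-to-digital converter would."""
    levels = 2 ** bits
    step = (v_max - v_min) / (levels - 1)
    code = round((value - v_min) / step)   # the integer the digital side stores
    return v_min + code * step             # the value the digital side "sees"

# Sample an analog sine wave and compare it with its 3-bit digital version.
for i in range(5):
    t = i / 10
    analog = math.sin(2 * math.pi * t)
    digital = quantize(analog)
    print(f"t={t:.1f}  analog={analog:+.3f}  digital={digital:+.3f}  error={analog - digital:+.3f}")
```

With only eight levels the error is visible on every sample; adding bits shrinks it, which is exactly the precision-versus-simplicity trade-off described above.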

So what does all of this have to do with artificial intelligence?

What’s a Robot, Anyway?

When most of us think of robots, we think of Rosie from “The Jetsons,” the droid C-3PO from “Star Wars,” or something similar. In these representations, the robots take on a humanoid form and act like humans in many ways, typically lacking only emotion. However, the idea of robots has expanded considerably in recent years. Many of us carry devices with “personal assistants” like Siri built in, or use a device like the Amazon Echo in our homes. While these devices might not be able to walk the dog or launch an attack on an enemy starship, they do “learn” about us and our preferences in order to become more useful.

The problem, though, is that as these robots become more intelligent, they can consume a great deal of power and need to remain connected at all times in order to work. Whether it’s a personal assistant device like Amazon’s Alexa or a smart thermostat, the device is going to run on a computer that must be small and low-power, which is where analog comes in.

In essence, analog is similar to the human brain. Consider the human body in comparison to a fitness monitor like a FitBit. These devices are both analog and digital. The analog sensors are what collect the data about the number of steps you take, your heart rate, and so on. That data is then converted using an analog-to-digital converter so that the readings can be processed by the algorithms in the device’s microprocessor, putting the raw data into a form that we can use. Over time, the device “learns” your patterns, so it can make recommendations about the number of steps you should take, how much sleep you need, and so on.
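A minimal sketch of that sensor-to-insight pipeline, assuming a hypothetical accelerometer whose analog readings we simply simulate as random voltages: the samples are digitized by a toy ADC, and a simple threshold algorithm counts steps from the digitized stream. None of the numbers reflect a real FitBit; they just show the stages.

```python
import random

def read_accelerometer():
    """Stand-in for an analog accelerometer: a noisy continuous voltage."""
    return random.uniform(0.0, 3.3)

def adc(voltage, bits=10, v_ref=3.3):
    """Convert the analog voltage to a 10-bit integer, like a microcontroller ADC."""
    return round(voltage / v_ref * (2 ** bits - 1))

def count_steps(samples, threshold=700):
    """Toy step detector: count upward crossings of a threshold in the digitized signal."""
    steps = 0
    for prev, cur in zip(samples, samples[1:]):
        if prev < threshold <= cur:
            steps += 1
    return steps

raw = [adc(read_accelerometer()) for _ in range(1000)]   # one "day" of digitized samples
print("steps detected today:", count_steps(raw))
```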

This is a form of artificial intelligence, and the same concepts apply to other AI devices as well. Essentially, as science historian George Dyson points out, the brain itself is an analog computer. Our senses take in information in analog form, whether a sight, sound, or smell, which is then processed by the neural pathways of the brain. Those pathways are essentially algorithms, and much as an algorithm cannot process information without input, your brain cannot process information without analog (sensory) input.

In other words, artificial intelligence is never going to work if all we do is write instructions for how it should process information. For something to be truly intelligent, it needs to be built in such a way that it can naturally develop and evolve based on inputs, much like your fitness tracker cannot set your daily step goals until it collects data on how much you walk each day.
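Continuing the hypothetical tracker example, “learning” a step goal can be as simple as deriving tomorrow’s target from the history of observed inputs rather than from a hard-coded number; the sketch below is one illustrative way to do it, with the bump factor and floor chosen arbitrarily.

```python
def suggest_goal(daily_steps, bump=1.1, floor=5000):
    """Suggest tomorrow's step goal from what the wearer actually walked:
    a bit more than their recent average, never below a minimum."""
    if not daily_steps:                 # nothing collected yet, so nothing to learn from
        return floor
    average = sum(daily_steps) / len(daily_steps)
    return max(floor, round(average * bump))

history = [6200, 7400, 5800, 8100]      # steps collected over the first few days
print("suggested goal:", suggest_goal(history))
```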

The possibilities for artificial intelligence and how it can change our lives are almost unimaginable. However, it’s important to realize that as advanced as the concept of AI may be, it’s actually heavily reliant on what many consider “old” technology — in fact, it won’t be able to work at all without analog inputs.

By Sean Mallon
Sean is a freelance writer and big data expert with a passion for exploring the depths of information that can be extracted from massive datasets. With years of experience in the field, he has developed a deep understanding of how data can be harnessed to drive insights and make informed decisions.
