Analytics, Business Intelligence

Why Artificial Intelligence Will Be Analog

Sean Mallon
6 Min Read

For many people, the word “digital” is synonymous with modern, technologically advanced programs or devices capable of performing complex processes in a fraction of the time a manual or analog system would need. Analog, on the other hand, is generally thought of as old-fashioned, something that must be converted to digital to keep pace with modern technology, even though much of the technology we take for granted actually runs on analog components.

In fact, one of the most transformative trends in technology today relies heavily on analog technology. When you think of artificial intelligence, you probably picture robots, or at least high-powered computers like IBM’s Watson, the epitome of modern technology. You probably don’t think of analog, with its reliance on capturing real-time data and measuring changes in the signals put out by physical phenomena.

The Executive Office of the President recently said that advances in AI will make it easier than ever to search public records and streamline healthcare. According to the White House report, this technology has already benefited patients at Johns Hopkins University and other medical institutions.

Analog is relatively simple and, compared to digital (which converts the analog signal into a numeric representation of its values), often more accurate, but it suffers from a lack of flexibility. The more complex the analog signal, the less reliable it becomes, especially over longer distances, so as precision became a greater priority, digital rose in use.
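That “representation of the values” is what an analog-to-digital converter produces through quantization. A minimal Python sketch of the idea (the 8-bit resolution, ±1.0 input range, and sine-wave input are illustrative assumptions, not the behavior of any particular converter):

```python
import math

def quantize(value, bits=8, vmin=-1.0, vmax=1.0):
    """Map a continuous (analog) value onto one of 2**bits discrete levels,
    roughly the way an analog-to-digital converter does."""
    levels = 2 ** bits - 1
    # Clamp to the converter's input range, then round to the nearest level.
    clamped = max(vmin, min(vmax, value))
    code = round((clamped - vmin) / (vmax - vmin) * levels)
    # Reconstruct the value the digital side will actually "see".
    return vmin + code / levels * (vmax - vmin)

# A smooth analog waveform and its 8-bit digital approximation:
analog = [math.sin(2 * math.pi * t / 20) for t in range(20)]
digital = [quantize(v) for v in analog]
error = max(abs(a - d) for a, d in zip(analog, digital))
print(f"worst-case quantization error: {error:.5f}")
```

The digital copy is never exactly the analog signal; it is a stepped approximation whose error shrinks as you spend more bits, which is the flexibility-for-fidelity trade the paragraph describes.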


So what does all of this have to do with artificial intelligence?

What’s a Robot, Anyway?

When most of us think of robots, we think of Rosie from “The Jetsons,” the droid C-3PO from “Star Wars,” or something similar. In these representations, the robots take on a humanoid form and act like humans in many ways, typically lacking only emotion. However, the idea of robots has expanded considerably in recent years. Many of us carry devices with built-in “personal assistants” like Siri or use a device like the Amazon Echo in our homes. While these devices might not be able to walk the dog or launch an attack on an enemy starship, they do “learn” about us and our preferences in order to become more useful.

The problem, though, is that as these robots become more intelligent, they can consume a great deal of power and need to remain connected at all times in order to work. Whether it’s a personal assistant device like Amazon’s Alexa or a smart thermostat, the device runs on a computer that must be small and low-power, which is where analog comes in.

In essence, analog is similar to the human brain. Consider the human body in comparison to a fitness monitor, like a Fitbit. These devices are both analog and digital. Analog sensors collect the data about the number of steps you take, your heart rate, and so on. That data is then run through an analog-to-digital converter so the readings can be processed by the algorithms in the device’s microprocessor, putting the raw data into a form we can use. Over time, the device “learns” your patterns, so it can make recommendations about the number of steps you should take, how much sleep you need, and so forth.
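As a rough sketch of that pipeline, here is how a tracker’s firmware might turn already-digitized accelerometer readings into a step count. The threshold value and the sample data are made up for illustration; real trackers use far more sophisticated filtering:

```python
def count_steps(accel_samples, threshold=1.2):
    """Count steps in ADC-converted accelerometer magnitudes (in g).
    Each rising edge past the threshold is treated as one step, a
    deliberate simplification of real step-detection firmware."""
    steps = 0
    above = False
    for g in accel_samples:
        if g > threshold and not above:
            steps += 1       # signal just crossed the threshold upward
            above = True
        elif g <= threshold:
            above = False    # re-arm for the next stride
    return steps

# Simulated digitized readings for a few strides:
samples = [1.0, 1.4, 1.0, 0.9, 1.5, 1.1, 1.0, 1.3, 1.0]
print(count_steps(samples))  # 3 rising edges cross the threshold, so 3 steps
```

The analog part of the job, sensing the motion itself, happens before this code ever runs; the microprocessor only sees the converted numbers.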

This is a form of artificial intelligence, and the same concepts apply to other AI devices as well. Essentially, as science historian George Dyson points out, the brain itself is an analog computer. Our senses take in information in analog format, whether a sight, sound, smell, etc., which is then processed by the neural pathways of the brain. Those pathways are essentially algorithms, and much like an algorithm cannot process information on its own, your brain cannot process information without analog (sensory) input.

In other words, artificial intelligence is never going to work if all we do is create instructions on how it should process information. For something to be truly intelligent, it needs to be developed in such a way that it can naturally develop and evolve based on inputs, much as your fitness tracker cannot set your daily step goals until it collects data on how much you walk each day.
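A toy version of that idea: derive the goal from observed data rather than hard-coding it. The 10% stretch factor and the sample history are arbitrary assumptions for the sketch:

```python
def suggest_step_goal(daily_steps, stretch=1.10):
    """Derive tomorrow's step goal from observed history rather than from
    a fixed instruction: average what the wearer actually did, then nudge
    the target upward by a modest stretch factor."""
    if not daily_steps:
        return None  # no sensor data yet, so nothing to learn from
    baseline = sum(daily_steps) / len(daily_steps)
    return round(baseline * stretch)

history = [5200, 6100, 5800, 6400]  # steps recorded on four days
print(suggest_step_goal(history))
```

Swap in a different wearer’s history and the suggestion changes with it; the “intelligence,” such as it is, lives in the data, not in the rule.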

The possibilities for artificial intelligence and how it can change our lives are almost unimaginable. However, it’s important to realize that as advanced as the concept of AI may be, it is heavily reliant on what many consider “old” technology; in fact, it won’t work at all without analog inputs.

By Sean Mallon
Sean is a freelance writer and big data expert with a passion for exploring the depths of information that can be extracted from massive datasets. With years of experience in the field, he has developed a deep understanding of how data can be harnessed to drive insights and make informed decisions.

© 2008-25 SmartData Collective. All Rights Reserved.