Watson, you know my methods… IBM moves towards HAL

Barry Devlin

I’ve previously written about IBM Watson, its success in “Jeopardy!” and some of the future applications that its developers envisaged for it. IBM has moved the technology towards the mainstream in a number of presentations at the Information on Demand (IOD) Conference in Las Vegas last week. While Watson works well beyond the normal bounds of BI, analyzing and reasoning in soft (unstructured) information, the underlying computer hardware is very much the same (albeit faster and bigger) as we have used since the beginnings of the computer era.

But I was intrigued by an announcement that IBM made last August, which I came across only a few weeks ago:
“18 Aug 2011: Today, IBM researchers unveiled a new generation of experimental computer chips designed to emulate the brain’s abilities for perception, action and cognition. The technology could yield many orders of magnitude less power consumption and space than used in today’s computers.

“In a sharp departure from traditional concepts in designing and building computers, IBM’s first neurosynaptic computing chips recreate the phenomena between spiking neurons and synapses in biological systems, such as the brain, through advanced algorithms and silicon circuitry. Its first two prototype chips have already been fabricated and are currently undergoing testing.

“Called cognitive computers, systems built with these chips won’t be programmed the same way traditional computers are today. Rather, cognitive computers are expected to learn through experiences, find correlations, create hypotheses, and remember – and learn from – the outcomes, mimicking the brain’s structural and synaptic plasticity… The goal of SyNAPSE is to create a system that not only analyzes complex information from multiple sensory modalities at once, but also dynamically rewires itself as it interacts with its environment – all while rivaling the brain’s compact size and low power usage.”

Please excuse the long quote, but, for once :-), the press release says it as well as I could!  For further details and links to some fascinating videos, see here.
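
To make the “spiking neurons and synapses” idea a little more concrete, here is a minimal sketch of a leaky integrate-and-fire neuron whose input synapses strengthen or weaken depending on whether they contributed to a spike. This is purely my own illustration – the class and parameter names are invented, and real neurosynaptic hardware such as IBM’s SyNAPSE chips implements this behavior in silicon, not in software like this:

```python
import random

# Toy leaky integrate-and-fire neuron with a crude Hebbian-style
# plasticity rule. Illustrative only; not IBM's actual design.

THRESHOLD = 1.0    # membrane potential at which the neuron fires
LEAK = 0.9         # fraction of potential retained each time step
LEARN_RATE = 0.05  # how strongly synapses adjust after a spike

class SpikingNeuron:
    def __init__(self, n_inputs):
        self.weights = [random.uniform(0.1, 0.5) for _ in range(n_inputs)]
        self.potential = 0.0

    def step(self, spikes):
        """spikes: list of 0/1 input spikes, one per synapse."""
        self.potential = self.potential * LEAK + sum(
            w * s for w, s in zip(self.weights, spikes)
        )
        if self.potential >= THRESHOLD:
            self.potential = 0.0  # reset after firing
            # Hebbian-style update: synapses active when the neuron
            # fired get stronger; silent ones decay slightly.
            self.weights = [
                min(1.0, w + LEARN_RATE) if s else max(0.0, w - LEARN_RATE)
                for w, s in zip(self.weights, spikes)
            ]
            return 1
        return 0

neuron = SpikingNeuron(n_inputs=4)
for t in range(50):
    # Inputs 0 and 1 fire often, inputs 2 and 3 rarely: over time the
    # neuron "rewires" to respond mainly to the first two.
    neuron.step([int(random.random() < p) for p in (0.8, 0.8, 0.1, 0.1)])

print([round(w, 2) for w in neuron.weights])
```

Even this toy version shows the press release’s key point: the “program” ends up in the weights, shaped by experience, rather than in explicit instructions.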

What reminded me of this development was another blog post from Jim Lee in Resilience Economics entitled “Why The Future Of Work Will Make Us More Human”. I really like the idea of this, but I’m struggling with it on two fronts.

Quoting David Autor, an economist at MIT, Jim argues that outsourcing and “othersourcing” of jobs – to other countries and to machines, respectively – are polarizing labor markets towards opposite ends of the skills spectrum: at one end, low-paying service-oriented jobs that require personal interaction and the manipulation of machinery in unpredictable environments; at the other, well-paid jobs that require creativity, a tolerance for ambiguity, and high levels of personal training and judgment. The center ground – a vast swathe of mundane, repetitive work that computers do much better than we do – will disappear. These are jobs involving middle-skilled cognitive and productive activities that follow clear, easily understood procedures and can reliably be transcribed into software instructions or subcontracted to overseas labor. This will leave two types of work for humans: “The job opportunities of the future require either high cognitive skills, or well-developed personal skills and common sense,” says Lee in summary.

My first concern is the either-or in the above approach; I believe that high cognitive skills are part and parcel of well-developed personal skills and common sense.  At which end of this polarization would you place teaching, for example?  Education (in the real meaning of the word – from the Latin “to draw out” – as opposed to hammering home) spans both ends of the spectrum.

From the point of view of technology, my second concern is that our understanding of where computing will take us, even in the next few years, has been blown wide open, first by Watson and now by neurosynaptic computing. What we’ve seen in Watson is a move from Boolean logic and numerically focused computing to a way of understanding and using soft information that is much closer to the way humans deal with it. Of course, it’s still far from human. But, with an attempt to “emulate the brain’s abilities for perception, action and cognition”, I suspect we’ll be in for some interesting developments in the next few years. Anyone else remember HAL from “2001: A Space Odyssey”?
