Watson, you know my methods… IBM moves towards HAL

I’ve previously written about IBM Watson, its success in “Jeopardy!” and some of the future applications that its developers envisaged for it.  IBM has moved the technology towards the mainstream in a number of presentations at the Information on Demand (IOD) Conference in Las Vegas last week.  While Watson works well beyond the normal bounds of BI, analyzing and reasoning in soft (unstructured) information, the underlying computer hardware is very much the same (albeit faster and bigger) as we have used since the beginnings of the computer era.

But I was intrigued by an announcement, made by IBM last August, that I came across a few weeks ago:
“18 Aug 2011: Today, IBM researchers unveiled a new generation of experimental computer chips designed to emulate the brain’s abilities for perception, action and cognition. The technology could yield many orders of magnitude less power consumption and space than used in today’s computers.

In a sharp departure from traditional concepts in designing and building computers, IBM’s first neurosynaptic computing chips recreate the phenomena between spiking neurons and synapses in biological systems, such as the brain, through advanced algorithms and silicon circuitry. Its first two prototype chips have already been fabricated and are currently undergoing testing.

“Called cognitive computers, systems built with these chips won’t be programmed the same way traditional computers are today. Rather, cognitive computers are expected to learn through experiences, find correlations, create hypotheses, and remember – and learn from – the outcomes, mimicking the brain’s structural and synaptic plasticity… The goal of SyNAPSE is to create a system that not only analyzes complex information from multiple sensory modalities at once, but also dynamically rewires itself as it interacts with its environment – all while rivaling the brain’s compact size and low power usage.”

Please excuse the long quote, but, for once :-), the press release says it as well as I could!  For further details and links to some fascinating videos, see here.
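For readers who, like me, wondered what “spiking neurons and synapses” might look like in practice, here is a toy sketch of a leaky integrate-and-fire neuron, one of the simplest spiking models. It is purely my own software illustration; IBM’s chips implement this kind of behavior directly in silicon, and the release doesn’t describe their actual design.

```python
# A minimal leaky integrate-and-fire neuron: the membrane potential integrates
# incoming current, leaks a little each time step, and emits a "spike" when it
# crosses a threshold. A toy illustration only, not IBM's chip design.

def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = leak * potential + current   # leaky integration of the input
        if potential >= threshold:               # threshold crossed: fire a spike
            spikes.append(1)
            potential = reset                    # and reset the membrane potential
        else:
            spikes.append(0)
    return spikes

# A steady, weak input accumulates until the neuron fires, then the cycle repeats.
print(simulate_lif([0.3] * 10))   # [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

The press release’s point is that systems built from vast numbers of such units learn by rewiring the connections (synapses) between them as they interact with their environment, rather than by executing a stored program.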

What reminded me of this development was another blog post, by Jim Lee at Resilience Economics, entitled “Why The Future Of Work Will Make Us More Human”. I really like the idea, but I’m struggling with it on two fronts.

Quoting David Autor, an economist at MIT, Jim argues that outsourcing and “othersourcing” of jobs – to other countries and to machines respectively – are polarizing labor markets towards opposite ends of the skills spectrum: at one end, low-paying service-oriented jobs that require personal interaction and the manipulation of machinery in unpredictable environments; at the other, well-paid jobs that require creativity, the handling of ambiguity, and high levels of personal training and judgment.  The center-ground – a vast swathe of mundane, repetitive work that computers do much better than we do – will disappear.  These are jobs involving middle-skilled cognitive and productive activities that follow clear and easily understood procedures and can reliably be transcribed into software instructions or subcontracted to overseas labor.  This will leave two types of work for humans: “The job opportunities of the future require either high cognitive skills, or well-developed personal skills and common sense,” says Lee in summary.

My first concern is the either-or in the above approach; I believe that high cognitive skills are part and parcel of well-developed personal skills and common sense.  At which end of this polarization would you place teaching, for example?  Education (in the real meaning of the word – from the Latin “to draw out” – as opposed to hammering home) spans both ends of the spectrum.

From the point of view of technology, my second concern is that our understanding of where computing will take us, even in the next few years, has been blown wide open, first by Watson and now by neurosynaptic computing.  What we’ve seen in Watson is a move from Boolean logic and numerically focused computing to a way of understanding and using soft information that is much closer to the way humans deal with it.  Of course, it’s still far from human.  But, with an attempt to “emulate the brain’s abilities for perception, action and cognition”, I suspect we’ll be in for some interesting developments in the next few years.  Anyone else remember HAL from “2001: A Space Odyssey”?
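To caricature the difference: traditional computing asks whether two pieces of information are identical, while “soft” approaches ask how alike they are. A trivial sketch in Python (my own illustration, and nothing like how Watson actually works under the hood):

```python
from difflib import SequenceMatcher

# "Hard", Boolean matching: the two strings either match exactly or they don't.
question = "Who was the first person to walk on the Moon?"
stored   = "first human to walk on the Moon"
print(question == stored)   # False -- no partial credit

# "Soft" matching: score how similar the two pieces of text are instead.
score = SequenceMatcher(None, question.lower(), stored.lower()).ratio()
print(score)                # a fractional similarity, not a yes/no answer
```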
