IBM’s TrueNorth launches the concept of bio-inspired computing

[Image: IBM TrueNorth chip]

Note: Although IBM’s TrueNorth announcement is not specific to social media, it represents a fundamentally different way to deal with networked data, using computing assets that simulate the structure of the human brain. It will provide the computing ability to analyze social media sentiment and intent thousands of times more efficiently than current computing hardware can. As such, TrueNorth is an interesting technology that should be on your radar over the next 5-10 years as it is commercially launched.

On August 7, 2014, IBM announced the development of the TrueNorth microprocessor through a paper published in the journal Science. This chip represents the first true proof of concept for bringing a neural computing approach to remote and low-power environments, and it does so while drawing only 70 mW, multiple orders of magnitude less than a comparable processor built under traditional computing constraints.

This neural approach is important because brains are fundamentally different from traditional computing environments. We react to our environment as sensory inputs travel through neurons and build the patterns we use to understand our surroundings. In contrast, a traditional microprocessor uses linear, sequential processing to solve problems through brute force. The neural approach also brings memory and processing together, so that each experiential event directly shapes how subsequent data is processed.

TrueNorth contains 1 million programmable neurons, 256 million programmable synapses, and 5.4 billion transistors, a transistor count comparable to the most advanced commercially available microprocessors. However, a traditional microprocessor with this transistor count, built on the von Neumann architecture that has defined computing for the last 70 years, typically requires over 100 watts to run. Running a microprocessor of this complexity at roughly 1/1,000th the power (100 W / 70 mW ≈ 1,400×) provides a unique opportunity to start bringing this level of cognitive intelligence to sensors, remote environments, and low-power environments that previously lacked the ability to cognitively analyze data.

The potential for TrueNorth is in providing high-performance sensory computing to environments that cannot support traditional computing or server infrastructure. Common household batteries can easily supply 70 mW for hours at a time. By combining this low power requirement with on-demand sensory inputs, a TrueNorth-based sensor could potentially make intelligent, pattern-based observations of its environment for months or even years in remote and challenging locations.
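To see how far 70 mW can stretch, here is a back-of-the-envelope estimate in Python. The battery capacity and duty cycle are illustrative assumptions, not TrueNorth specifications:

```python
# Back-of-the-envelope sensor runtime estimate.
# Battery capacity and duty cycle are illustrative assumptions.

CHIP_POWER_W = 0.070            # TrueNorth's reported draw: 70 mW
BATTERY_WH = 2 * 1.5 * 2.5      # two AA alkaline cells: ~1.5 V x ~2.5 Ah each

# Continuous operation: battery energy divided by power draw
continuous_hours = BATTERY_WH / CHIP_POWER_W
print(f"Continuous: {continuous_hours:.0f} hours (~{continuous_hours / 24:.0f} days)")

# On-demand sensing: assume the chip wakes 1% of the time and
# sleeps at negligible draw otherwise.
DUTY_CYCLE = 0.01
months = continuous_hours / DUTY_CYCLE / 24 / 30
print(f"At a {DUTY_CYCLE:.0%} duty cycle: ~{months:.0f} months")
```

Even under these rough assumptions, a pair of AA cells yields over a hundred hours of continuous operation, and duty-cycled sensing pushes the runtime into the months-or-years range described above.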

TrueNorth was initially tested as a cluster of 16 chips totaling 16 million neurons, which roughly matches the neuron count of a frog’s brain and spinal cord. This provides an immediate comparison for the potential of TrueNorth in robotics: independently supporting automation at the complexity of a frog’s reflexes, range of motion, and bodily functions.

By contrast, scientists using traditional computing approaches have struggled to simulate even a few hundred neurons, despite having over a billion transistors of computing power at their disposal. IBM’s approach represents an improvement of multiple orders of magnitude in developing an artificial mind.

Bringing TrueNorth to the real world

The potential for this chip will only continue to scale as TrueNorth eventually goes into production and chips can be combined into clusters that come closer to imitating the human brain. Even without this clustering, however, the near-immediate opportunity is in providing a new level of “intelligence” to any device or sensor that already draws over 1 W of power, or in providing on-demand sensory analysis in low-power environments. Although IBM’s lead researcher on this project, IBM Fellow Dr. Dharmendra Modha, estimates that it will take 6-7 years for this chip to be fully ready for mass production, the variety of use cases may force IBM’s hand in accelerating this timeframe.

Some of the more obvious use cases for this chip include deep-sea and space exploration, remote environmental analysis, plant and refinery sensors, and augmented analysis for any equipment that already draws energy. In comparison to the multi-watt power requirements that most modern appliances have, an additional 70 mW would be a trivial energy draw for the level of insight that could be gained.

But this chip could also bring about the smart lamp, the smart electrical socket, or a truly smart watch that can put the quantified self in the context of both the experiential and digital worlds. The Internet of Things could become a Neural Net of Things. Smart solar panels (which typically provide at least 75 to 100 watts of power) could provide insight into the true efficiency of solar conversion, based on weather and environmental factors, far more easily than current solar trackers. The pattern recognition associated with a neurosynaptic computing environment will also prove useful for processing synthetic visual, audio, and tactile inputs that could enhance our own senses through smart glasses, hearing aids, or gloves. (Think of how TrueNorth-enabled gloves, robotics, and Oculus Rift could work together to create touch-sensitive virtual environments.)

Recreating the human brain

TrueNorth is a second-generation chip that represents over a decade of research. It has been funded by DARPA since 2008 and was developed by IBM in collaboration with Cornell Tech and iniLabs. The goal of this research is to build a computing environment that can simulate the complexity and efficiency of the brain, which powers 100 billion neurons on roughly 20 watts. Sustained over a day, 20 watts amounts to about 480 watt-hours, which is a bit over 400 kcal, or about 2 fluid ounces of gasoline: roughly the sustained power draw of a small laptop.
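Those figures are easy to sanity-check with standard conversion factors (the gasoline energy density below is an approximate textbook value):

```python
# Sanity check on the brain's daily energy budget.

BRAIN_POWER_W = 20.0
energy_wh = BRAIN_POWER_W * 24                 # watt-hours per day
energy_j = energy_wh * 3600                    # joules per day
energy_kcal = energy_j / 4184                  # 1 kcal = 4184 J

GASOLINE_MJ_PER_L = 34.2                       # approximate energy density
gasoline_ml = energy_j / (GASOLINE_MJ_PER_L * 1e6) * 1000
gasoline_fl_oz = gasoline_ml / 29.57           # 1 US fl oz = 29.57 mL

print(f"{energy_wh:.0f} Wh/day = {energy_kcal:.0f} kcal/day "
      f"= ~{gasoline_fl_oz:.1f} fl oz of gasoline")
# -> 480 Wh/day = 413 kcal/day = ~1.7 fl oz of gasoline
```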

So, how close does TrueNorth actually get to matching the efficiency of the human brain? Here’s a quick comparison:

              Neurons        Wattage          Neurons per Watt
TrueNorth     1 million      70 milliwatts    ~14 million
Human Brain   100 billion    20 watts         5 billion

A rough estimate says that TrueNorth is currently about 1/350th as efficient as the brain. That may make it seem as if artificial intelligence is still quite far from matching biological efficiency. However, noted futurist Ray Kurzweil has often observed that technological progress happens not linearly but exponentially. This phenomenon is seen most obviously in Moore’s Law, which states that the transistor count on a chip doubles roughly every two years; combined with improvements in speed and parallelism, effective computational power ends up roughly doubling every year.
This is purely hypothetical, but consider what happens if IBM can sustain the same kind of progress with TrueNorth. In a Kurzweilian vision of the world, getting 1% of the way towards a technological goal means it is likely to happen within seven years, since progress doubles every year (2^7 = 128). By the same assumption, being only 1/350th of the way towards a goal today would mean reaching it in about nine years (2^9 = 512).
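That extrapolation is easy to verify. The sketch below assumes, with Kurzweil, a clean annual doubling of capability:

```python
# Efficiency gap and "years to parity" under an annual-doubling assumption.
import math

brain_npw = 100e9 / 20           # neurons per watt, human brain
chip_npw = 1e6 / 0.070           # neurons per watt, TrueNorth
gap = brain_npw / chip_npw       # ~350x

# Doubling yearly closes a gap of g in ceil(log2(g)) years.
years = math.ceil(math.log2(gap))
print(f"Gap: ~{gap:.0f}x -> ~{years} years to parity")
# -> Gap: ~350x -> ~9 years to parity
```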

Even if this timeframe is very optimistic, the ramifications of the creation of this chip are enormous. We now have a model for non-von Neumann computing that starts to imitate a more efficient and sensory model of processing. We can expect a chip that will match the processing efficiency of the human brain, if not the actual capacity, at some point in our lifetime. And this chip model changes both our assumptions of what can be computed in challenging environments and our assumptions on the ubiquity of cognitive computing.

The emergence of bio-inspired computing

This approach will lead to a fundamental change in how CIOs need to think about computing. Today we think in terms of sensors, applications, and CPUs, but as computing aligns more closely with biological structures, mechanisms, and concepts, we will need new measures. One potential example is rating computing power in neuronal or fractional-brain units rather than in megahertz or gigahertz. Imagine being able to assign 0.1% of a brain’s power (a milliBrain) to a specific observation, and judging the complexity of a problem by the amount of artificial brain power devoted to it. This type of computation does not fit our current expectations for programming, since the observation and pattern recognition capabilities of a fractional brain will not necessarily be standardized over time. As we start to equate computing challenges with the percentage of a human brain needed to process and analyze them, fundamental assumptions about application development and performance will change to match the reality of bio-inspired computing outputs.
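To make the unit concrete: the milliBrain is this article’s thought experiment rather than any IBM measure, but a quick conversion shows what it would mean in today’s hardware:

```python
# Hypothetical "milliBrain" unit: 0.1% of a human brain's neurons.
# A thought experiment from this article, not an IBM metric.

HUMAN_BRAIN_NEURONS = 100e9      # ~100 billion neurons
TRUENORTH_NEURONS = 1e6          # 1 million neurons per chip
TRUENORTH_POWER_W = 0.070        # 70 mW per chip

millibrain_neurons = HUMAN_BRAIN_NEURONS / 1000
chips = millibrain_neurons / TRUENORTH_NEURONS
watts = chips * TRUENORTH_POWER_W

print(f"1 milliBrain = {millibrain_neurons:.0e} neurons "
      f"= {chips:.0f} TrueNorth chips (~{watts:.0f} W)")
# -> 1 milliBrain = 1e+08 neurons = 100 TrueNorth chips (~7 W)
```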

The biggest gap in realizing this new paradigm is the current lack of expertise in creating neural applications. This gap is actually part of a larger technological trend in which biology, data, and programming are increasingly coming together. In 2012, Harvard scientists stored 700 terabytes of data on a single gram of DNA, effectively translating the bases of DNA into binary data stores. Biofeedback is poised to be the next big gaming feature. And with the second-generation TrueNorth processor, IBM now has a chip that can imitate neuronal activity. These bio-inspired technologies provide the potential to sense, store, and analyze data several orders of magnitude more efficiently than traditional computing approaches allow.

However, we as a culture lack the developers to truly take advantage of these new biological programming assets. To move forward, IBM will need to make Compass (its TrueNorth simulator) and Corelet (its programming environment) more readily available. Corelet has been available for a year, but until now the commercial viability of this technology was not readily apparent. As a starting point, IBM seems to be focused on opening this environment up for video, signal, and object recognition, recreating the cognitive activities associated with sight. With time, IBM will also be able to support audio and tactile inputs to create neural computing environments that do not simply collect data, but react both cognitively and reflexively, in real time, to changes in their environment.

Guidance for the bio-inspired paradigm shift

In short, the future is catching up with us faster than ever before. As amazing as all of this may sound, IBM has both the hardware and the beginnings of a new form of computing that will fundamentally change pattern and environmental recognition. Given that both the hardware and the programming environment already exist, highly competitive industries (including financial services, petrochemical, and retail) and early-adopter companies across all industries should at least take a look at this new developer environment and prepare for the near future. In today’s technology landscape, proofs of concept like this turn into fully commercialized technology within a couple of years and can become a standard in 5-10 years. Given the incredible pace of technological advancement, it is vital that forward-facing companies prepare for fundamental shifts in computing. Just as social networking and cloud computing fundamentally changed the foundations of application development, this neural approach will be a sea change in application development and functionality.
