By Bryan Glick
Computer chips powered directly by the sun and cooled by water. Data stored on a single electron. Self-learning cognitive systems. Chips with as many synapses and neurons as the human brain. A supercomputer that analyses in just one day more than double the world’s current internet traffic.
Such a list may seem like the realm of sci-fi to some, but these are all projects currently underway at IBM’s Zurich research labs, and are likely to produce commercially available products in the next 10-15 years.
“What would you do with a thousand times the capability [of today’s computers]?” asks Matthias Kaiserswerth, the director of the Zurich labs. “We are actively working to make this happen in the next 10 years.”
The humble datacentre has reached a turning point. It already costs more in electricity to operate and cool a datacentre than it does to build and run the computers it contains. The more processing power and storage space we demand, the more energy is used.
What’s more, the basic chip and storage technologies are close to the physical limits of current design and manufacturing techniques. If Moore’s Law is to continue, we need new paradigms for how computers are made.
There is only so far that current technology can scale, due to physical size, energy use and heat generation, says Kaiserswerth.
This is the starting point for the work carried out by IBM researchers in Zurich – a team that has won two Nobel prizes.
IBM likes to show off its computers. It has, over the years, famously developed the first computer to beat a grand master at chess, and more recently the first to beat a top competitor on the US quiz show Jeopardy. Watson, the game-show winner, is described as a “self-learning” system, using the very latest in statistical and analytical software to work out the most likely answer to a question.
But we humans still retain one great advantage, even in defeat. A system such as Watson draws about 200,000 watts of power – the human brain it defeated uses just 20 watts.
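The scale of that efficiency gap is worth spelling out. A back-of-the-envelope calculation, using only the two figures quoted above:

```python
# Rough power-efficiency comparison, using the figures quoted above.
watson_power_w = 200_000  # Watson: ~200,000 watts
brain_power_w = 20        # human brain: ~20 watts

ratio = watson_power_w / brain_power_w
print(f"Watson draws ~{ratio:,.0f}x the power of the brain it beat")
# → Watson draws ~10,000x the power of the brain it beat
```

In other words, the brain solved the same problem for roughly one ten-thousandth of the power budget.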
“In the brain, energy and cooling is delivered by the same fluid – blood. We want to replicate this for chips,” says Bruno Michel, one of IBM’s researchers.
IBM has already built its first "synapse chip", a processor with 262,144 programmable synapses, designed to mimic the way the brain processes information – although Kaiserswerth is quick to stress that it is not a "brain on a chip", and more about learning lessons from how the brain works and applying them to chip design.
The human brain, by comparison, has about 100 trillion synapses.
But one of the things that makes the brain so energy efficient is the fact that its key components – the synapses and neurons – are so close together. The conventional two-dimensional design of computer chips means comparatively big distances between components such as processors and memory – that slows down speeds, and requires more energy to bridge the gap.
So, IBM is working on a stacked, or 3D chip, where components are layered on top of each other, reducing the distances, increasing performance and reducing the electricity needed to power it. Michel predicts that 3D chips can theoretically improve system performance by a factor of 5,000 – although the ability to deliver this is about 15 years away.
Even then, new ways will be needed to provide enough energy to power a computer based on such advanced 3D chips – one that could deliver the power of today's largest supercomputer in a system the size of a desktop PC.
To address this, IBM is researching ways of powering the chip directly from the sun.
On the roof of the Zurich labs is a giant concave mirror – it looks more like a large satellite dish. The mirror focuses and concentrates the sunlight directly onto a single chip, which converts 43% of the solar energy to power the chip.
The light reflected from what is still a fairly low level of solar concentration would be enough to permanently damage your eyes if you looked at it without a filter. Ultimately, IBM needs to find a way to concentrate sunlight by a factor of 1,000 onto a specific point on a chip.
Even then, the chip will still need to be cooled – and it is likely that will be done with water.
“We know the future design of a chip with concentrated solar power and water cooling. We are aiming to get there through our research,” says Michel.
A prototype chip already exists, with tiny pipes on top feeding the coolant directly into the structure of the processor.
New storage technologies
Of course, a faster, more energy-efficient computer will require more storage capacity too.
One of the projects driving these requirements is IBM’s involvement in the Square Kilometre Array (SKA), an international consortium building the world’s largest and most sensitive radio telescope.
When completed in 2024, SKA will generate 10 exabytes of data every day – a sustained rate of more than 100 terabytes every second, and roughly double the world's current daily internet traffic, according to IBM.
Processing and storing that much data will require technologies that do not exist today – so-called exascale computing – with processing power estimated at 2.5 exaflops. One exaflop is 1,000,000,000,000,000,000 (10¹⁸) floating point operations per second.
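To put those SKA figures in perspective, a quick sketch of the arithmetic – 10 exabytes per day spread across the 86,400 seconds in a day:

```python
# SKA data-rate arithmetic, using the figures quoted above.
EXABYTE = 10**18                      # bytes
daily_bytes = 10 * EXABYTE            # 10 exabytes generated per day
seconds_per_day = 24 * 60 * 60        # 86,400 seconds

rate_tb_per_s = daily_bytes / seconds_per_day / 10**12
print(f"Sustained rate: ~{rate_tb_per_s:.0f} TB/s")
# → Sustained rate: ~116 TB/s

# One exaflop is 10**18 floating point operations per second;
# SKA's estimated compute requirement is 2.5 exaflops.
exaflops_needed = 2.5
print(f"Compute: {exaflops_needed * 10**18:.1e} FLOP/s")
```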
The supercomputers that will support SKA will also need to analyse all that information, in near real time, to remove unnecessary data and store only what is required for the project.
“You have to screen out data, reduce it by two to six orders of magnitude, and analyse in real time,” says IBM fellow Evangelos Eleftheriou.
SKA will need new storage technology, in what Eleftheriou calls the biggest change in IT architecture since IBM’s System 360 mainframe, launched in 1964. This will be a “data-centric model”, where data is retained in persistent memory, and is surrounded by many central processing units (CPUs) – unlike today’s model where the CPU sits at the centre and calls in data from different media as needed.
This will involve blurring the boundaries between what current paradigms see as memory and storage. “Memory/IO hierarchy will eventually disappear and be replaced by flat, globally addressable memory,” says Eleftheriou.
IBM is developing a technology called phase change memory (PCM), which overcomes the scaling problems of existing DRAM memory. PCM exploits the different electrical resistance of two distinct solid phases of a metal alloy – changing the physical properties of the metal to store a bit. The first commercial PCM chips are expected by 2016.
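PCM encodes a bit in the resistance contrast between the alloy's two solid phases – amorphous (high resistance) and crystalline (low resistance). A toy model of the read step makes the idea concrete; the resistance values and threshold below are illustrative, not IBM's actual cell parameters:

```python
# Toy model of a phase change memory (PCM) read: a cell stores a bit
# as one of two resistance states. All values here are illustrative.
AMORPHOUS_OHMS = 1_000_000    # high-resistance phase ("reset" state)
CRYSTALLINE_OHMS = 10_000     # low-resistance phase ("set" state)
READ_THRESHOLD_OHMS = 100_000 # sense threshold between the two states

def read_bit(cell_resistance_ohms):
    # The low-resistance (crystalline) phase conventionally reads as 1.
    return 1 if cell_resistance_ohms < READ_THRESHOLD_OHMS else 0

print(read_bit(CRYSTALLINE_OHMS))  # → 1
print(read_bit(AMORPHOUS_OHMS))    # → 0
```

Writing a bit means physically switching the alloy between phases, typically with a heat pulse – which is why the state persists without power, unlike DRAM.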
Even tape storage will continue to have a role to play, according to the supplier. Experts have been predicting the death of tape as a storage medium for years, but IBM researchers predict it will continue to be the best way to store archived data for a long time yet.
Eleftheriou has demonstrated a standard tape cartridge able to store 35TB of data – about 44 times the capacity of today’s IBM LTO Generation 4 cartridge. A capacity of 35TB is sufficient to store the text of 35 million books, which would require 248 miles of bookshelves.
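The "44 times" figure checks out against LTO-4's native (uncompressed) capacity of 800GB per cartridge – an assumption worth stating, since compressed capacity is higher:

```python
# Sanity check of the capacity comparison above.
# Assumes LTO-4 native (uncompressed) capacity of 800 GB = 0.8 TB.
demo_cartridge_tb = 35
lto4_native_tb = 0.8

print(f"~{demo_cartridge_tb / lto4_native_tb:.0f}x LTO-4 native capacity")
# 35 / 0.8 = 43.75, i.e. about 44x
```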
“The only drawback of tape is the slow access time,” he says. IBM is developing ways to use policies to move data between tape and disk or memory so that it is readily available when needed.
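The policy-based movement Eleftheriou describes can be sketched as a simple two-tier store that promotes frequently read data from tape to disk. This is a minimal illustration of the general idea only – the class and promotion rule below are hypothetical, not IBM's actual policy engine:

```python
# Minimal sketch of policy-driven tiering between a fast tier (disk)
# and a slow archive tier (tape). All names and the promotion policy
# are hypothetical illustrations of the idea described above.
class TieredStore:
    def __init__(self, promote_after_hits=2):
        self.disk = {}   # fast tier: key -> data
        self.tape = {}   # slow tier: key -> data
        self.hits = {}   # per-key access counts driving the policy
        self.promote_after_hits = promote_after_hits

    def write(self, key, data):
        self.tape[key] = data              # archive data lands on tape

    def read(self, key):
        if key in self.disk:               # fast path: already promoted
            return self.disk[key]
        data = self.tape[key]              # slow path: tape recall
        self.hits[key] = self.hits.get(key, 0) + 1
        if self.hits[key] >= self.promote_after_hits:
            self.disk[key] = data          # policy: hot data moves to disk
        return data

store = TieredStore()
store.write("scan-001", b"archived telescope data")
store.read("scan-001")                     # first read served from tape
store.read("scan-001")                     # second read triggers promotion
print("scan-001" in store.disk)            # → True
```

A production system would also demote cold data back to tape and account for recall latency, but the core idea – access patterns driving placement – is the same.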
The research at Zurich does not stop at the level of technologies such as chips and storage. Researchers are looking at the use of nanotechnology in chip design. Nanowires – connections a thousand times thinner than a human hair – can reduce the voltage used within an individual switch as it changes its state from binary zero to one.
Analysis at an atomic level takes things even further. “We have shown in principle that a single atom can be used to store a single bit,” says researcher Fabian Mohn.
And if that is not enough, IBM scientists have even proved that they can control the natural spin of electrons using magnetic forces. The discovery could lead to new ways of designing processor gates that require significantly less voltage to induce the change of state from one to zero.
Meanwhile, Watson – the self-learning system that won Jeopardy – is now finding practical uses in business. IBM is working with a leading US cancer hospital to develop a new version of Watson to assist oncologists with cancer diagnosis and treatment. The healthcare version of Watson would take patient data and look through the huge quantities of published literature and research on cancer to make recommendations on likely prognosis and possible treatments.
IBM predicts that the combination of Watson’s big data handling, with exascale computing, cognitive chips and nanotechnology, is the future of IT.
“IT for the back office has happened. Where it’s interesting is where it’s facing outwards,” says Kaiserswerth. “We are entering the cognitive systems era, with computers a thousand times more powerful than now.”