Games consoles reveal the supercomputer within

February 16, 2008

When Todd Martínez broke his son’s Sony PlayStation he didn’t realise this would change the course of his career as a theoretical chemist. Having dutifully bought a PlayStation 2 as a replacement, he was browsing through the games console’s technical specification when he realised it might have another use. “I noticed that the architecture looked a lot like high-performance supercomputers I had seen before,” he says. “That’s when I thought about getting one for myself.”

Six years on, Martínez has persuaded the supercomputing centre at the University of Illinois, Urbana-Champaign, to buy eight computers, each driven by two of the specialised chips that are at the heart of Sony’s PlayStation 3 console. Together with his student Benjamin Levine he is using them to simulate the interactions between the electrons in atoms. Scaled up over entire molecules, the results could pave the way to predicting how a protein will interact with a drug.

Martínez and Levine are not the only researchers who have turned to gaming hardware to do their number crunching. That’s because the kinds of calculations required to produce the mouth-wateringly realistic graphics now seen in high-end video games are similar to those used by chemists and physicists as they simulate the interactions between particles in systems ranging in scale from the molecular to the astronomical. Rotating, enlarging or reflecting an object from one frame to the next in a game, for example, requires a technique called matrix multiplication. Modelling the interactions between thousands of electrons in a molecule calls for similar techniques.
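
To see what that involves in code, here is a minimal C sketch of the operation: rotating a single 2D point by multiplying it with a rotation matrix. A game applies the same arithmetic to millions of vertices in every frame; the angle and point below are arbitrary choices.

    /* A minimal sketch of matrix multiplication as used in graphics:
       rotating a 2-D point p by an angle theta via q = R * p.
       Games apply this same arithmetic to millions of vertices per frame. */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        const double PI = 3.14159265358979323846;
        double theta = PI / 4.0;                  /* rotate by 45 degrees */
        double R[2][2] = {{cos(theta), -sin(theta)},
                          {sin(theta),  cos(theta)}};
        double p[2] = {1.0, 0.0};                 /* a point on the x-axis */

        /* matrix-vector product: q = R * p */
        double q[2] = {R[0][0] * p[0] + R[0][1] * p[1],
                       R[1][0] * p[0] + R[1][1] * p[1]};

        printf("(%.3f, %.3f) rotated 45 degrees -> (%.3f, %.3f)\n",
               p[0], p[1], q[0], q[1]);
        return 0;
    }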

Such simulations are usually carried out on a supercomputer, but time on these machines is expensive and in short supply. By comparison, games consoles are cheap and easily available, and they come with the added bonus of some innovative hardware. For example, the Wii, made by Nintendo, has a motion-tracking remote control unit that is far cheaper than a comparable device would be if researchers had to build it from scratch.

One key advance is the ease with which scientists can now program games consoles for their own purposes. Consoles do a great job of rendering images, but a game has no need to save the underlying data once a frame has been drawn. Scientists, by contrast, need to be able to store the results of the calculations they feed into their machines.

Things started to get easier in 2002, when demand from computer enthusiasts who wanted to use their PlayStations as fully fledged desktop machines prompted Sony to release software that allowed the PlayStation 2 to run the Linux operating system. That allowed scientists to reprogram the consoles to run their calculations. Then in 2006 came the big breakthrough, with the launch by IBM, Sony and Toshiba of the Cell chip that now drives Sony’s PlayStation 3 (see Timeline). With one central processor and eight “servant” processors (New Scientist, 19 February 2005, p 23), it is vastly more powerful than the PS2 chip, and was designed from day one to run Linux.

The release of the Cell has accelerated research into black holes by Gaurav Khanna, an astrophysicist at the University of Massachusetts, Dartmouth. He has strung together 16 PS3 consoles to calculate the properties of the gravitational waves that are expected to be produced when two black holes merge. Meanwhile, a collaboration between IBM and the Mayo Clinic in Rochester, Minnesota, is using the Cell’s ability to render high-resolution video graphics to build detailed images from data gathered by MRI and other medical scanning techniques. The aim is to make diagnosis easier and faster – by using the images to determine whether a tumour has grown or shrunk, for example.

Other researchers are pushing for even more speed. One of Martínez’s students, Ivan Ufimtsev, is experimenting with the NVIDIA GeForce 8800 GTX graphics processing unit (GPU) for PCs, which was released in November 2006. The GPU has 128 processors – compared with the Cell’s eight – and when slotted into a PC it helps turn the machine into a high-quality gaming engine. To start with, these cards were hard to program, just like the PS2 without the Linux add-on, but NVIDIA soon cottoned on to the sales opportunity that scientists like Martínez represented. In February 2007 it released the Compute Unified Device Architecture (CUDA), a software package that allows the C programming language to be used to program its GPUs.
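
As a flavour of what CUDA made possible – this is an illustrative sketch, not the group’s actual code – the kernel below, written in CUDA’s C dialect, has each GPU thread sum the Coulomb repulsion felt by one particle from all the others. The kernel name and the 256-thread block size are arbitrary choices.

    // Illustrative CUDA sketch (not the researchers' actual code): each
    // GPU thread computes the total Coulomb repulsion on one particle
    // from all the others, in arbitrary units. Compile with nvcc.
    __global__ void coulomb(const float *x, const float *y, const float *z,
                            float *energy, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        float e = 0.0f;
        for (int j = 0; j < n; ++j) {
            if (j == i) continue;
            float dx = x[i] - x[j];
            float dy = y[i] - y[j];
            float dz = z[i] - z[j];
            e += 1.0f / sqrtf(dx * dx + dy * dy + dz * dz);  // 1/r term
        }
        energy[i] = e;
    }

    // Host code launches one thread per particle, for example:
    //   coulomb<<<(n + 255) / 256, 256>>>(d_x, d_y, d_z, d_energy, n);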

The results were staggering. When Martínez used it to simulate the repulsion between two electrons in an atom, he found that the calculation ran 130 times faster than it did on an ordinary desktop computer (Journal of Chemical Theory and Computation, DOI: 10.1021/ct700268q). He is now calculating the energy of the electrons in 1000 atoms, which add up to the size of a small protein. “We can now do the things we were killing ourselves to do,” he says.

Martínez predicts that it will soon be possible to use the GPU to predict more accurately which drug molecules will most strongly interact with a protein and how they will react, which could revolutionise pharmaceutical research. Similarly, Koji Yasuda at Nagoya University in Japan reported in a paper published this month (Journal of Computational Chemistry, vol 29, p 334) that he used the same GPU to map the electron energies in two molecules: the anti-cancer drug paclitaxel and the cyclic peptide valinomycin.

Games hardware still isn’t perfect for science. The Cell’s processors and NVIDIA’s GPUs work in single-precision arithmetic, which stores numbers to only about seven significant decimal digits. As numbers are repeatedly multiplied together, these small rounding errors become magnified. In a game, the result might be nothing more serious than a car appearing slightly closer to a wall than it should, but in research such inaccuracies can be show-stoppers.
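
The effect is easy to demonstrate on any desktop machine. In this small C sketch, repeatedly adding 0.1 in single precision drifts visibly away from the true total, while double precision stays close:

    /* A small demonstration of accumulating single-precision round-off:
       adding 0.1 ten million times should give exactly 1,000,000. */
    #include <stdio.h>

    int main(void) {
        float  single_sum = 0.0f;
        double double_sum = 0.0;

        for (int i = 0; i < 10000000; ++i) {
            single_sum += 0.1f;   /* ~7 significant digits  */
            double_sum += 0.1;    /* ~16 significant digits */
        }
        printf("single precision: %.1f\n", single_sum);  /* visibly off  */
        printf("double precision: %.1f\n", double_sum);  /* ~1,000,000.0 */
        return 0;
    }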

It’s not just the chips that researchers can usefully borrow from gaming hardware. Take the Wii’s hand-held remote control, which contains an accelerometer that can sense in which direction it is being moved, and how vigorously. It transmits this information via a Bluetooth link to the console, where it is used to adjust the graphics to respond to the player’s movements in real time.
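
In software terms, what arrives over that Bluetooth link is just a stream of acceleration samples. Here is a hypothetical C sketch of the receiving side – the wiimote_poll function and the sample layout are invented stand-ins, not a real driver API:

    /* Hypothetical sketch of consuming Wii remote accelerometer data.
       A real program would read samples from a Bluetooth HID driver;
       here wiimote_poll() is a stub that replays canned values. */
    #include <stdio.h>

    struct acc_sample { float x, y, z; };   /* acceleration in g */

    /* Invented stub standing in for a driver: returns 0 while samples
       remain, -1 when the stream ends. */
    static int wiimote_poll(struct acc_sample *s) {
        static const struct acc_sample canned[] = {
            {0.0f, 1.0f, 0.1f}, {0.3f, 2.5f, 1.8f}, {0.1f, 0.9f, 0.0f}};
        static int i = 0;
        if (i >= 3) return -1;
        *s = canned[i++];
        return 0;
    }

    int main(void) {
        struct acc_sample s;
        while (wiimote_poll(&s) == 0) {
            /* A vigorous swing shows up as a large total acceleration. */
            float mag2 = s.x * s.x + s.y * s.y + s.z * s.z;
            printf("sample (%.1f, %.1f, %.1f)%s\n", s.x, s.y, s.z,
                   mag2 > 4.0f ? "  <- vigorous movement" : "");
        }
        return 0;
    }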

Monitoring Parkinson’s

The device recently grabbed attention as a tool for surgeons to improve their technique (New Scientist, 19 January, p 24). Meanwhile, neurologist Thomas Davis at the Vanderbilt Medical Center in Nashville, Tennessee, is using it to measure movement deficiencies in Parkinson’s patients. By attaching up to four Wii remotes to different limbs, Davis captures data for tremor, speed and smoothness of movement, and gait. This data is then sent via the Bluetooth link to a laptop running software that allows Davis to assess quantitatively how well a patient can move. Davis hopes this can be used in clinical trials for Parkinson’s drugs to replace the scoring scales now used, which are based on a doctor observing a patient’s condition.

Others are using the console to assess the progress of patients who have had a stroke or a head injury by monitoring their performance as they play Wii games. Johnny Chung Lee at Carnegie Mellon University in Pittsburgh, Pennsylvania, is using the Wii remote as a virtual reality research tool. As the wearer’s head moves, the Wii tracks it and displays images dependent on where the wearer is looking. Meanwhile, a team at the University of Valladolid in Spain hopes to use the Wii remote to rotate and manipulate ultrasound images more intuitively.

Computer gamers have always hankered after the latest console or PC hardware to run ever more realistic-looking games. Now scientists are lining up right beside them.

From issue 2643 of New Scientist magazine, 16 February 2008, pages 26-27

And not an Xbox 360 in sight…


Is the biofuel dream over?

December 14, 2007
From New Scientist Print Edition, 15 December 2007. By Fred Pearce and Peter Aldhous.

Can biofuels help save our planet from a climate catastrophe? Farmers and fuel companies certainly seem to think so, but fresh doubts have arisen about the wisdom of jumping wholesale onto the biofuels bandwagon…

About 12 million hectares, or around 1 per cent of the world’s fields, are currently devoted to growing biofuels. Sugar cane and maize, for example, are turned into bioethanol, a substitute for gasoline, while rapeseed and palm oil are made into biodiesel. That figure will grow because oil is so costly, and because biofuels supposedly emit fewer greenhouse gases than fossil fuels.

But a slew of new studies question the logic behind expanding biofuel production. For a start, there may not be enough land to grow the crops on, or water to irrigate them, given other demands on global agriculture. Worse, any cuts in carbon dioxide emissions gained by burning less fossil fuel may be wiped out by increased emissions of the greenhouse gas nitrous oxide from the fertilisers used on biofuel crops.

In parts of the world, a shortage of water is already putting a brake on agricultural productivity. According to Johan Rockström, executive director of the Stockholm Environment Institute in Sweden, replacing 50 per cent of the fossil fuels used for electricity generation and transport in 2050 with biofuels would take between 4000 and 12,000 extra cubic kilometres of water per year. To put that in perspective, the total annual flow down the world’s rivers is about 14,000 km3 – so biofuel crops could claim anywhere from roughly 30 to 85 per cent of it.

A more modest target of quadrupling world biofuel production to 140 billion litres a year by 2030 – enough to replace 7.5 per cent of current gasoline use – would require an extra 180 km3 of water to be extracted from rivers and underground reserves, calculates Charlotte de Fraiture at the International Water Management Institute, based near Colombo in Sri Lanka. That works out at around 1300 litres of irrigation water for every litre of fuel.

That target may be manageable across much of the globe. But in China and India, where water is in short supply and most crops require artificial irrigation, de Fraiture argues that there is not enough water even to meet existing government plans to expand biofuel production.

Another contentious issue is how much land is available to grow biofuels (New Scientist, 25 September 2006, p 36). And the answer appears to be not much, a point that Sten Nilsson, deputy director of the International Institute for Applied Systems Analysis in Laxenburg, Austria, makes using a “cartographic strip-tease” based on a new global mapping study.

Beginning with a world map showing land not yet built upon or cultivated, Nilsson progressively strips away forests, deserts and other non-vegetated areas, mountains, protected areas, land with an unsuitable climate, and pastures needed for grazing (see Maps). That leaves just 250 to 300 million hectares for growing biofuels, an area about the size of Argentina.

Even using a future generation of biofuel crops – woody plants with large amounts of cellulose that enable more biomass to be converted to fuel – Nilsson calculates that it will take 290 million hectares to meet a tenth of the world’s projected energy demands in 2030. But another 200 million hectares will be needed by then to feed an extra 2 to 3 billion people, with a further 25 million hectares absorbed by expanding timber and pulp industries.

So if biofuels expand as much as Nilsson anticipates, there will be no choice but to impinge upon land needed for growing food, or to destroy forests and other pristine areas like peat bogs. That would release carbon now stashed away in forests and peat soils (New Scientist, 1 December, p 50), turning biofuels into a major contributor to global warming.

De Fraiture is more optimistic. Her modest projection for a quadrupling of biofuel production assumes that maize production will be boosted by 20 per cent, sugar cane by 25 per cent and oil crops for biodiesel by 80 per cent. Assuming future improvements in crop yields, de Fraiture estimates that this might be done on just 30 million hectares of land – 2.5 times the area devoted to biofuel crops today.

Even today’s biofuel yields depend on generous applications of nitrogen-containing fertiliser. That contributes to global warming, as some of the added nitrogen gets converted into nitrous oxide, which is a potent greenhouse gas. Over 100 years it creates 300 times the warming effect of CO2, molecule for molecule. And now researchers led by Paul Crutzen of the Max Planck Institute for Chemistry in Mainz, Germany, who won a share of a Nobel prize for his work on the destruction of the ozone layer, claim that we have underestimated these emissions. Factor in their revised figures, and cuts in CO2 emissions as a result of replacing fossil fuels may be wiped out altogether.

The Intergovernmental Panel on Climate Change suggests that between 1 and 2 per cent of nitrogen added to fields gets converted to nitrous oxide, based on direct measurements of emissions from fertilised soils. But nitrogen from fertiliser also gets into water and moves around the environment, continuing to emit nitrous oxide as it goes. To estimate these “indirect” emissions, Crutzen and his colleagues calculated how much nitrous oxide has built up in the atmosphere since pre-industrial times, and estimated how much of this could be attributed to the use of fertilisers.

This suggested that between 3 and 5 per cent of the nitrogen added to the soil in fertilisers ends up in the atmosphere as nitrous oxide. Crucially, that would be enough to negate cuts in CO2 emissions made by replacing fossil fuels. Biodiesel from rapeseed came off worst – the warming caused by nitrous oxide emissions being 1 to 1.7 times as much as the cooling caused by replacing fossil fuels. For maize bioethanol, the range was 0.9 to 1.5. Only bioethanol from sugar cane came out with a net cooling effect, its nitrous oxide emissions causing between 0.5 and 0.9 times as much warming as the cooling due to fossil fuel replacement.
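
The logic behind those ranges can be restated in a few lines of C: a fuel whose ratio of nitrous oxide warming to fossil-fuel-replacement cooling exceeds 1 is a net warmer. The figures below are simply the published ranges.

    /* The arithmetic behind the ranges above: a fuel whose ratio of
       N2O warming to fossil-fuel-replacement cooling exceeds 1 warms
       the climate overall. Figures are the published ranges. */
    #include <stdio.h>

    int main(void) {
        const char  *fuel[] = {"rapeseed biodiesel", "maize bioethanol",
                               "sugar-cane bioethanol"};
        const double lo[]   = {1.0, 0.9, 0.5};   /* low end of ratio  */
        const double hi[]   = {1.7, 1.5, 0.9};   /* high end of ratio */

        for (int i = 0; i < 3; ++i)
            printf("%-22s %.1f-%.1f  %s\n", fuel[i], lo[i], hi[i],
                   hi[i] <= 1.0 ? "net cooling" :
                   lo[i] >= 1.0 ? "net warming" : "could go either way");
        return 0;
    }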

These simple calculations, which set increased nitrous oxide emissions against reductions in CO2 emissions caused by replacing gasoline or diesel with biofuels, do not account for all the greenhouse gas emissions associated with producing, processing and distributing the various fuels. Now Michael Wang of the Argonne National Laboratory in Illinois has taken Crutzen’s upper estimate for nitrous oxide emissions and plugged it into a sophisticated computer model which does just that. When he did so, bioethanol from maize went from giving about a 20 per cent cut in greenhouse gas emissions, compared to gasoline, to providing no advantage at all. Still, Wang suspects that Crutzen’s method may overestimate nitrous oxide emissions. “It is a very interesting approach,” he says. “But there may be systematic biases.”

Crutzen stresses that his paper is still being revised in response to comments he has received since August, when a preliminary version appeared online. “Here and there the numbers may change. But the principle doesn’t,” he says. “It’s really telling us about a general problem with our lack of knowledge about the nitrogen cycle.”

With governments and businesses backing biofuels as part of a “green” future, that represents a disturbing gap in our knowledge.

From issue 2634 of New Scientist magazine, 15 December 2007, pages 6-7

So the biofuel solution is running into problems. Well, it’s an emerging technology, and it isn’t the only option for humans looking to switch away from fossil fuels and other greenhouse gas contributors.

Onwards, always onwards 🙂