Games consoles reveal the supercomputer within

February 16, 2008

WHEN Todd Martínez broke his son’s Sony PlayStation, he didn’t realise it would change the course of his career as a theoretical chemist. Having dutifully bought a PlayStation 2 as a replacement, he was browsing through the games console’s technical specification when he realised it might have another use. “I noticed that the architecture looked a lot like high-performance supercomputers I had seen before,” he says. “That’s when I thought about getting one for myself.”

Six years on, Martínez has persuaded the supercomputing centre at the University of Illinois, Urbana-Champaign, to buy eight computers each driven by two of the specialised chips that are at the heart of Sony’s PlayStation 3 console. Together with his student Benjamin Levine he is using them to simulate the interactions between the electrons in atoms. Scaled up over entire molecules, the results could pave the way to predicting how a protein will interact with a drug.

Martínez and Levine are not the only researchers who have turned to gaming hardware to do their number crunching. That’s because the kinds of calculations required to produce the mouth-wateringly realistic graphics now seen in high-end video games are similar to those used by chemists and physicists as they simulate the interactions between particles in systems ranging in scale from the molecular to the astronomical. Rotating, enlarging or reflecting an object from one frame to the next in a game, for example, requires a technique called matrix multiplication. Modelling the interactions between thousands of electrons in a molecule calls for similar techniques.
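
To give a concrete, if much simplified, picture of that shared arithmetic, the C sketch below – purely illustrative, not taken from any of the research described here – rotates a single point by multiplying it by a 2x2 rotation matrix; a console does essentially this for huge numbers of vertices every frame:

[code]
/* Illustrative sketch only: rotating a 2D point with a matrix
   multiplication -- the same basic operation a graphics chip applies
   to huge numbers of vertices every frame. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double PI = 3.14159265358979;
    double theta = PI / 4.0;             /* rotate by 45 degrees   */
    double R[2][2] = {                   /* 2x2 rotation matrix    */
        { cos(theta), -sin(theta) },
        { sin(theta),  cos(theta) }
    };
    double p[2] = { 1.0, 0.0 };          /* point to rotate        */
    double q[2];

    /* matrix-vector multiplication: q = R * p */
    for (int i = 0; i < 2; i++)
        q[i] = R[i][0] * p[0] + R[i][1] * p[1];

    printf("rotated point: (%.3f, %.3f)\n", q[0], q[1]);
    return 0;
}
[/code]

Swap the rotation matrix for matrices describing physical interactions and the underlying multiply-and-add pattern is much the same, which is why the same silicon suits both jobs.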

Such simulations are usually carried out on a supercomputer, but time on these machines is expensive and in short supply. By comparison, games consoles are cheap and easily available, and they come with the added bonus of some innovative hardware. For example, the Wii, made by Nintendo, has a motion-tracking remote control unit that is far cheaper than a comparable device would be if researchers had to build it from scratch.

One key advance is the ease with which scientists can now program games consoles for their own purposes. Although consoles do a great job of rendering images, games software has no need to store data once it has been used to draw a frame. Scientists, by contrast, need to be able to save the results of the calculations they feed into their machines.

Things started to get easier in 2002, when demand from computer enthusiasts who wanted to use their PlayStations as fully fledged desktop machines prompted Sony to release software that allowed the PlayStation 2 to run the Linux operating system. That allowed scientists to reprogram the consoles to run their calculations. Then in 2006 came the big breakthrough, with the launch by IBM, Sony and Toshiba of the Cell chip that now drives Sony’s PlayStation 3. With one central processor and eight “servant” processors (New Scientist, 19 February 2005, p 23), it is vastly more powerful than the PS2 chip, and was designed from day one to run Linux.

The release of the Cell has accelerated research into black holes by Gaurav Khanna, an astrophysicist at the University of Massachusetts, Dartmouth. He has strung together 16 PS3 consoles to calculate the properties of the gravitational waves that are expected to be produced when two black holes merge. Meanwhile, a collaboration between IBM and the Mayo Clinic in Rochester, Minnesota, is using the Cell’s ability to render high-resolution video graphics to do the same with data gathered by MRI and other medical scanning techniques. The aim is to make diagnosis easier and faster – by using the images to determine whether a tumour has grown or shrunk, for example.

Other researchers are pushing for even more speed. One of Martínez’s students, Ivan Ufimtsev, is experimenting with the NVIDIA GeForce 8800 GTX graphics processing unit (GPU) for PCs, which was released in November 2006. The GPU has 128 processors – compared to the Cell’s eight – and when slotted into a PC, helps turn it into a high-quality gaming engine. To start with, these cards were hard to program, just like the PS2 without the Linux add-on, but NVIDIA soon cottoned on to the sales opportunities that scientists like Martínez could offer for its product. In February 2007 it released the Compute Unified Device Architecture (CUDA), a software package that allows the C programming language to be used to program the GPUs.
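
To give a flavour of what programming a GPU in C looks like, here is a minimal CUDA sketch – invented for illustration, and in no way Martínez’s actual code – in which each GPU thread sums a toy 1/r repulsion term over all the other particles in a small, made-up set; the kernel name, particle count and positions are all assumptions of the example:

[code]
/* Illustrative CUDA C sketch (compile with nvcc): one GPU thread per
   particle, each summing a toy 1/r repulsion term over all the others. */
#include <stdio.h>
#include <cuda_runtime.h>

__global__ void pair_repulsion(const float3 *pos, float *energy, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   /* this thread's particle */
    if (i >= n) return;

    float e = 0.0f;
    for (int j = 0; j < n; j++) {
        if (j == i) continue;
        float dx = pos[i].x - pos[j].x;
        float dy = pos[i].y - pos[j].y;
        float dz = pos[i].z - pos[j].z;
        e += 1.0f / sqrtf(dx*dx + dy*dy + dz*dz);    /* Coulomb-like 1/r term */
    }
    energy[i] = 0.5f * e;                            /* halve to avoid double counting */
}

int main(void)
{
    const int n = 1024;
    float3 *pos;
    float  *energy;
    cudaMallocManaged(&pos,    n * sizeof(float3));  /* unified memory, for brevity */
    cudaMallocManaged(&energy, n * sizeof(float));

    for (int i = 0; i < n; i++) {                    /* arbitrary test positions */
        pos[i].x = (float)i;
        pos[i].y = 0.0f;
        pos[i].z = 0.0f;
    }

    pair_repulsion<<<(n + 255) / 256, 256>>>(pos, energy, n);
    cudaDeviceSynchronize();

    printf("energy of particle 0: %f\n", energy[0]);
    cudaFree(pos);
    cudaFree(energy);
    return 0;
}
[/code]

Because each particle’s sum is independent of the others, the work spreads naturally across the card’s 128 processors – the same property that lets the chip shade many pixels at once.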

The results were staggering. When Martínez used it to simulate the repulsion between two electrons in an atom, he found that the calculation ran 130 times faster than it did on an ordinary desktop computer (Journal of Chemical Theory and Computation, DOI: 10.1021/ct700268q). He is now calculating the energy of the electrons in 1000 atoms, which add up to the size of a small protein. “We can now do the things we were killing ourselves to do,” he says.

Martínez predicts that it will soon be possible to use the GPU to predict more accurately which drug molecules will most strongly interact with a protein and how they will react, which could revolutionise pharmaceutical research. Similarly, Koji Yasuda at Nagoya University in Japan reported in a paper published this month (Journal of Computational Chemistry, vol 29, p 334) that he used the same GPU to map the electron energies in two molecules: the anti-cancer drug paclitaxel and the cyclic peptide valinomycin.

Games hardware still isn’t perfect for science. The Cell’s eight processors and the NVIDIA GPUs work mainly in single-precision arithmetic, which keeps only about seven significant figures. As numbers are repeatedly multiplied together, these small rounding errors become magnified. In a game, the result might be nothing more serious than a car appearing slightly closer to a wall than it should, but in research such inaccuracies can be show-stoppers.
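
A tiny C sketch illustrates the effect – repeatedly adding 0.1 in single precision drifts visibly away from the double-precision answer (the values here are illustrative; it is the growing gap that matters):

[code]
/* Illustrative sketch: round-off error building up in single precision.
   Floats keep roughly 7 significant digits, doubles roughly 16. */
#include <stdio.h>

int main(void)
{
    float  sum_f = 0.0f;
    double sum_d = 0.0;

    for (int i = 0; i < 10000000; i++) {
        sum_f += 0.1f;   /* single precision: error accumulates visibly */
        sum_d += 0.1;    /* double precision: drift is far smaller      */
    }

    /* the exact answer would be 1,000,000 */
    printf("float  sum: %.2f\n", sum_f);
    printf("double sum: %.2f\n", sum_d);
    return 0;
}
[/code]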

It’s not just the chips that researchers can usefully borrow from gaming hardware. Take the Wii’s hand-held remote control, which contains an accelerometer that can sense in which direction it is being moved, and how vigorously. It transmits this information via a Bluetooth link to the console, where it is used to adjust the graphics to respond to the player’s movements in real time.
Monitoring Parkinson’s

The device recently grabbed attention as a tool for surgeons to improve their technique (New Scientist, 19 January, p 24). Meanwhile, neurologist Thomas Davis at the Vanderbilt Medical Center in Nashville, Tennessee, is using it to measure movement deficiencies in Parkinson’s patients. By attaching up to four Wii remotes to different limbs, Davis captures data for tremor, speed and smoothness of movement, and gait. This data is then sent via the Bluetooth link to a laptop running software that allows Davis to assess quantitatively how well a patient can move. Davis hopes this can be used in clinical trials for Parkinson’s drugs to replace the scoring scales now used, which are based on a doctor observing a patient’s condition.
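
As a purely illustrative sketch of the kind of processing this involves – using made-up sample values rather than real Wii remote readings, and leaving out the Bluetooth plumbing entirely – the C snippet below reduces a short run of accelerometer samples to a mean acceleration and a variance, a crude stand-in for a “shakiness” score:

[code]
/* Illustrative only: turning raw accelerometer samples (x, y, z in g)
   into a simple movement metric.  A real setup would stream these
   samples from the controller over Bluetooth. */
#include <math.h>
#include <stdio.h>

#define N 8

int main(void)
{
    /* synthetic samples standing in for Wii remote readings */
    double ax[N] = { 0.02, -0.03, 0.05, -0.04, 0.03, -0.02, 0.04, -0.05 };
    double ay[N] = { 0.01,  0.00, 0.02, -0.01, 0.01,  0.00, 0.01, -0.02 };
    double az[N] = { 1.00,  1.01, 0.99,  1.02, 0.98,  1.00, 1.01,  0.99 };

    double mag[N], mean = 0.0, var = 0.0;

    for (int i = 0; i < N; i++) {                /* overall acceleration magnitude */
        mag[i] = sqrt(ax[i]*ax[i] + ay[i]*ay[i] + az[i]*az[i]);
        mean += mag[i] / N;
    }
    for (int i = 0; i < N; i++)                  /* variance ~ how shaky the limb is */
        var += (mag[i] - mean) * (mag[i] - mean) / N;

    printf("mean |a| = %.3f g, variance = %.5f\n", mean, var);
    return 0;
}
[/code]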

Others are using the console to assess the progress of patients who have had a stroke or a head injury by monitoring their performance as they play Wii games. Johnny Chung Lee at Carnegie Mellon University in Pittsburgh, Pennsylvania, is using the Wii remote as a virtual reality research tool. As the wearer’s head moves, the Wii tracks it and displays images dependent on where the wearer is looking. Meanwhile, a team at the University of Valladolid in Spain hopes to use the Wii remote to rotate and manipulate ultrasound images more intuitively.

Computer gamers have always hankered after the latest console or PC hardware to run ever more realistic-looking games. Now scientists are lining up right beside them.

From issue 2643 of New Scientist magazine, 16 February 2008, pages 26-27

And not an Xbox 360 in sight…


Some News…

December 15, 2007

Top 11 Warmest Years On Record Have All Been In Last 13 Years

ScienceDaily (Dec. 13, 2007) — The decade of 1998-2007 is the warmest on record, according to data sources obtained by the World Meteorological Organization (WMO). The global mean surface temperature for 2007 is currently estimated at 0.41°C/0.74°F above the 1961-1990 annual average of 14.00°C/57.20°F.

[spoiler]

The University of East Anglia and the Met Office’s Hadley Centre have released preliminary global temperature figures for 2007, which show the top 11 warmest years all occurring in the last 13 years. The provisional global figure for 2007, using data from January to November, currently places the year as the seventh warmest in records dating back to 1850.
Other remarkable global climatic events recorded so far in 2007 include record-low Arctic sea ice extent, which led to the first recorded opening of the Canadian Northwest Passage; the relatively small Antarctic ozone hole; development of La Niña in the central and eastern Equatorial Pacific; and devastating floods, drought and storms in many places around the world.
The preliminary information for 2007 is based on climate data up to the end of November from networks of land-based weather stations, ships and buoys, as well as satellites. The data are continually collected and disseminated by the National Meteorological and Hydrological Services (NMHS) of WMO’s 188 Members and several collaborating research institutions. Final updates and figures for 2007 will be published in March 2008 in the annual WMO brochure for the Statement on the Status of the Global Climate.
WMO’s global temperature analyses are based on two different sources. One is the combined dataset maintained by both the Hadley Centre of the UK Meteorological Office, and the Climatic Research Unit, University of East Anglia, UK, which at this stage ranked 2007 as the seventh warmest on record. The other dataset is maintained by the US Department of Commerce’s National Oceanic and Atmospheric Administration (NOAA), which indicated that 2007 is likely to be the fifth warmest on record.
Since the start of the 20th century, the global average surface temperature has risen by 0.74°C. But this rise has not been continuous. The linear warming trend over the last 50 years (0.13°C per decade) is nearly twice that for the last 100 years.
According to the Intergovernmental Panel on Climate Change’s 4th Assessment (Synthesis) Report, 2007, “warming of the climate system is unequivocal, as is now evident from observations of increases in global average air and ocean temperatures, widespread melting of snow and ice, and rising global average sea level.”
2007 global temperatures have been averaged separately for both hemispheres. Surface temperatures for the northern hemisphere are likely to be the second warmest on record, at 0.63°C above the 30-year mean (1961-90) of 14.6°C/58.3°F. The southern hemisphere temperature is 0.20°C higher than the 30-year average of 13.4°C/56.1°F, making it the ninth warmest in the instrumental record since 1850.
January 2007 was the warmest January in the global average temperature record at 12.7°C/54.9°F, compared to the 1961-1990 January long-term average of 12.1°C/53.8°F.
Regional temperature anomalies
2007 started with record breaking temperature anomalies throughout the world. In parts of Europe, winter and spring ranked amongst the warmest ever recorded, with anomalies of more than 4°C above the long-term monthly averages for January and April.
Extreme high temperatures occurred in much of Western Australia from early January to early March, with February temperatures more than 5°C above average.
Two extreme heat waves affected south-eastern Europe in June and July, breaking previous records with daily maximum temperatures exceeding 40°C/104°F in some locations, including up to 45°C/113°F in Bulgaria. Dozens of people died and fire-fighters battled blazes devastating thousands of hectares of land. A severe heat wave occurred across the southern United States of America during much of August, with more than 50 deaths attributed to excessive heat. August to September 2007 was extremely warm in parts of Japan, setting a new national record of absolute maximum temperature of 40.9°C/105.6°F on 16 August.
In contrast, Australia recorded its coldest ever June with the mean temperature dropping to 1.5°C below normal. South America experienced an unusually cold winter (June-August), bringing winds, blizzards and rare snowfall to various provinces with temperatures falling to -22°C/-7.6°F in Argentina and -18°C/-0.4°F in Chile in early July.
Prolonged drought
Across North America, severe to extreme drought was present across large parts of the western U.S. and Upper Midwest, including southern Ontario, Canada, for much of 2007. More than three-quarters of the Southeast U.S. was in drought from mid-summer into December, but heavy rainfall led to an end of drought in the southern Plains.
In Australia, while conditions were not as severely dry as in 2006, long term drought meant water resources remained extremely low in many areas. Below average rainfall over the densely populated and agricultural regions resulted in significant crop and stock losses, as well as water restrictions in most major cities.
China experienced its worst drought in a decade, affecting nearly 40 million hectares of farmland. Tens of millions of people suffered from water restrictions.
Flooding and intense storms
Flooding affected many African countries in 2007. In February, Mozambique experienced its worst flooding in six years, killing dozens, destroying thousands of homes and flooding 80,000 hectares of crops in the Zambezi valley.
In Sudan, torrential rains caused flash floods in many areas in June/July, affecting over 410,000 people, including 200,000 left homeless. The strong southwesterly monsoon resulted in one of the heaviest July-September rainfall periods, triggering widespread flash floods affecting several countries in West Africa, Central Africa and parts of the Greater Horn of Africa. Some 1.5 million people were affected and hundreds of thousands of homes destroyed.
In Bolivia, flooding in January-February affected nearly 200,000 people and 70,000 hectares of cropland. Strong storms brought heavy rain that caused extreme flooding in the littoral region of Argentina in late March/early April. In early May, Uruguay was hit by its worst flooding since 1959, with heavy rain producing floods that affected more than 110,000 people and severely damaged crops and buildings. Triggered by storms, massive flooding in Mexico in early November destroyed the homes of half a million people and seriously affected the country’s oil industry.
In Indonesia, massive flooding on Java in early February killed dozens and covered half of the city of Jakarta with up to 3.7 metres of water. Heavy rains in June ravaged areas across southern China, with flooding and landslides affecting over 13.5 million people and killing more than 120. Monsoon-related extreme rainfall events caused the worst flooding in years in parts of South Asia. About 25 million people were affected in the region, especially in India, Pakistan, Bangladesh and Nepal. Thousands lost their lives. However, rainfall for India during the summer monsoon season (June-September) was generally near normal (105% of the long-term average), though with marked differences in its distribution in space and time.
A powerful storm system, Kyrill, affected much of northern Europe during 17-18 January 2007 with torrential rains and winds gusting up to 170km/h. There were at least 47 deaths across the region, with disruptions in electric supply affecting tens of thousands during the storm.
England and Wales recorded their wettest May-July period since records began in 1766, receiving 406 mm of rain compared to the previous record of 349 mm in 1789. Extensive flooding in the region killed nine people and caused more than US$6 billion in damage.
Development of La Niña
The brief El Niño event of late 2006 quickly dissipated in January 2007, and La Niña conditions became well established across the central and eastern Equatorial Pacific in the latter half of 2007.
In addition to La Niña, unusual sea surface temperature patterns with cooler than normal values across the north of Australia to the Indian Ocean, and warmer than normal values in the Western Indian Ocean, were recorded. These are believed to have modified the usual La Niña impacts in certain regions around the world.
The current La Niña is expected to continue into the first quarter of 2008 at least.
Devastating tropical cyclones
Twenty-four named tropical storms developed in the North-West Pacific during 2007, below the annual average of 27. Fourteen storms were classified as typhoons, equalling the annual average. Tropical cyclones affected millions in south-east Asia, with typhoons Pabuk, Krosa, Lekima and tropical storms like Peipah among the severest.
During the 2007 Atlantic Hurricane season, 14 named storms occurred, compared to the annual average of 12, with 6 being classified as hurricanes, equalling the average. For the first time since 1886, two category 5 hurricanes (Dean and Felix) made landfall in the same season.
In February, tropical cyclone Gamède set a new worldwide rainfall record on the French island of La Réunion, with 3,929 mm measured within three days.
In June, cyclone Gonu made landfall in Oman, affecting more than 20,000 people and killing 50, before reaching the Islamic Republic of Iran. There is no record of a tropical cyclone hitting Iran since 1945.
On 15 November, tropical cyclone Sidr made landfall in Bangladesh, generating winds of up to 240 km/h and torrential rains. More than 8.5 million people were affected and over 3,000 died. Nearly 1.5 million houses were damaged or destroyed. Often hit by cyclones, Bangladesh has developed a network of cyclone shelters and a storm early-warning system, which significantly reduced casualties.
Australia’s 2006/2007 tropical season was unusually quiet, with only five tropical cyclones recorded, equalling the lowest number observed since at least 1943-44.
Relatively small Antarctic ozone hole
The 2007 Antarctic ozone hole was relatively small due to mild stratospheric winter temperatures. Since 1998, only the 2002 and 2004 ozone holes were smaller. In 2007, the ozone hole reached a maximum of 25 million square km in mid-September, compared to 29 million square km in the record years of 2000 and 2006. The ozone mass deficit reached 28 megatonnes on 23 September, compared to more than 40 megatonnes in the record year of 2006.
Record-low Arctic sea ice extent opened the Northwest Passage
Following the Arctic sea ice melt season, which ends annually in September at the end of the northern summer, the average sea ice extent was 4.28 million square km, the lowest on record. The sea ice extent in September 2007 was 39% below the long-term 1979-2000 average, and 23% below the previous record set just two years earlier in September 2005. For the first time in recorded history, the disappearance of ice across parts of the Arctic opened the Canadian Northwest Passage for about five weeks starting 11 August. Nearly 100 voyages in normally ice-blocked waters sailed without the threat of ice. The September rate of sea ice decline since 1979 is now approximately 10% per decade, or 72,000 square km per year.
Sea level rise continues
The sea level continued to rise at rates substantially above the average for the 20th century of about 1.7 mm per year. Measurements show that the 2007 global averaged sea level is about 20 cm higher than the 1870 estimate. Modern satellite measurements show that since 1993 global averaged sea level has been rising at about 3 mm per year.
Global 10 warmest years: mean global temperature anomaly (°C, with respect to 1961-1990)
1998 0.52
2005 0.48
2003 0.46
2002 0.46
2004 0.43
2006 0.42
2007(Jan-Nov) 0.41
2001 0.40
1997 0.36
1995 0.28
UK 10 warmest years: mean UK temperature anomaly (°C, with respect to 1971-2000)
2006 +1.15
2007 (Jan to 10 Dec) +1.10
2003 +0.92
2004 +0.89
2002 +0.89
2005 +0.87
1990 +0.83
1997 +0.82
1949 +0.80
1999 +0.78
Adapted from materials provided by World Meteorological Organization.

[/spoiler]

More fuel for the metaphorical fire.

PS3 one-ups Xbox 360 with its DivX support

The Xbox 360 may have beaten Sony to the punch with regard to supporting the DivX format, but it seems that the PS3 will have the last laugh on the matter. First of all, unlike the Xbox 360, the PS3 is DivX certified, meaning it will get full DivX functionality. This even allows developers to use the efficient compression format for in-game cut scenes.

quote:

Last month, DivX announced that the PS3 will soon support DivX, and, this month, Gizmodo met with the company, which shared some interesting details on the big move.

First of all, unlike the Xbox 360, the PS3 is DivX certified. While Microsoft’s console can only playback some DivX files, the PS3 will get full DivX functionality. This includes the ability for game developers to use the very efficient compression format for in-game cut-scenes.

This means DivX video cut scenes will reduce stress on the machine, theoretically allowing for better load times, less power consumption, and less heat output.

News Source: Blorge

Hurrah! Well, OK, I’m not a PS3 fanboy per se, but I do own one. So in the interests of keeping the inter-console wars fresh… Hurrah!