Windows 7 Beta on the Dell XPS M1330 Laptop continued…

January 15, 2009

Last night I took the plunge and downloaded the beta release of Windows 7 and installed it on my Dell XPS M1330. This is the story so far:

Regular readers of my blog will know that I like to keep abreast of software and hardware developments as and when they happen, and that is how I come to be writing this post from Windows 7. I am not a Windows fan, but I wanted to know what the new release is like and what Microsoft will eventually be asking customers to shell out more money for.

Ok, the install.

I went for an upgrade from Vista Home Premium at first, which took an amazingly long time to complete: about two and a half hours. It ran fine, but the internet refused to work, which I worked out was because ZoneAlarm was installed. So, rather than weed out any other conflicts, I decided to do a fresh install, which took about 45 minutes in total. And yes, the internet is now working. Wireless works out of the box too, no drivers needed. Dell’s Nvidia display drivers for the M1330 also work fine. If I have any problems with any other drivers I’ll be sure to post an update.

Windows 7 on the Dell XPS M1330

For the features Microsoft lists as new or improved in Windows 7, I will refer you to this Wikipedia article.

I will no doubt be referencing these in future posts on this beta release.

For the most part I notice only a few cosmetic differences, but that is only after an hour’s proper use.  Battery life does, however, seem to be improved when running on battery power.

I am going to set up a special static page on my blog for listing software that works and does not work with this beta.

That’s about it for now, check back for updates as and when they happen.


It’s Distro Time

June 22, 2008

OpenSUSE 11 Installing

The time has come around again for me to do some switching around with regards to the Operating System I use on my main computer, my Dell XPS M1330 laptop.

Up until yesterday I had been using a dual-boot system consisting of Ubuntu 8.04 Hardy Heron, which I have been using on and off since the alpha stage of its release.  Alongside Ubuntu, the other OS was Vista Home Premium.

So why did I feel the need to change this set up?

To be honest I have been more than happy with Ubuntu for a long time, but I wanted to switch from the dual-boot system, with Windows eating up half of my 250 GB hard drive, to a streamlined Linux-only option.  I don’t use Vista often enough to justify it being installed here.  I do, however, have Vista Ultimate on our desktop should I need it for whatever reason.

After deciding to get rid of Vista I could simply have deleted its partition, formatted it to ext3, added the space to the Ubuntu partition and edited the GRUB bootloader.  But I decided to take this opportunity to try out a few major new Linux distribution releases and then stay with whichever I prefer.

First off was Fedora 9, which, although very smart looking, failed me due to issues with the way the display is managed.  The problems getting the proprietary Nvidia drivers working for my laptop’s mobile graphics card are something I can’t live with.  Without them working properly the system runs hot and the display simply isn’t up to scratch.  Maybe Fedora will be worth another look when this issue is resolved properly through the provided repos.

Next up was Linux Mint, which is a perfectly fine distro.  It is basically a modified Ubuntu Hardy Heron that includes the restricted extras like media codecs.  The front end is very smart indeed, but I found that it was essentially a prettier Ubuntu, and the extras are things I already had working in Hardy Heron.

I am currently writing this on the GNOME version of openSUSE 11.  This is definitely a promising distro and one I have not used for any length of time before.  That is about to change, however, as it installed like a charm, detecting all the relevant hardware, including the wireless, out of the box.  I need to keep an eye on the battery life, as that is one thing I have discovered with different Linux distributions on laptops: they all seem to use up battery power at different rates by default, with Ubuntu being the easiest on power consumption as far as I can tell.

Next on my hit list is Debian, which I have used before and I know will take a bit more setting up on this laptop.

At the moment though, as I said, it’s time to give openSUSE a fair run out.  I’ll post my thoughts on it later.


Games consoles reveal the supercomputer within

February 16, 2008

When Todd Martínez broke his son’s Sony PlayStation he didn’t realise this would change the course of his career as a theoretical chemist. Having dutifully bought a PlayStation 2 as a replacement, he was browsing through the games console’s technical specification when he realised it might have another use. “I noticed that the architecture looked a lot like high-performance supercomputers I had seen before,” he says. “That’s when I thought about getting one for myself.”

Six years on and Martínez has persuaded the supercomputing centre at the University of Illinois, Urbana-Champaign, to buy eight computers each driven by two of the specialised chips that are at the heart of Sony’s PlayStation 3 console. Together with his student Benjamin Levine he is using them to simulate the interactions between the electrons in atoms. Scaled up over entire molecules, the results could pave the way to predicting how a protein will interact with a drug.

Martínez and Levine are not the only researchers who have turned to gaming hardware to do their number crunching. That’s because the kinds of calculations required to produce the mouth-wateringly realistic graphics now seen in high-end video games are similar to those used by chemists and physicists as they simulate the interactions between particles in systems ranging in scale from the molecular to the astronomical. Rotating, enlarging or reflecting an object from one frame to the next in a game, for example, requires a technique called matrix multiplication. Modelling the interactions between thousands of electrons in a molecule calls for similar techniques.
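The matrix multiplication the article mentions is easy to see in miniature. Here is a small sketch in Python with NumPy (purely illustrative; the consoles themselves obviously run nothing like this) showing a 2D rotation as a matrix–vector product:

```python
import numpy as np

# A 2x2 rotation matrix turns a point through an angle theta about
# the origin. Graphics hardware does the same job (in 3D, with 4x4
# matrices) for every vertex of every object, every frame.
theta = np.pi / 2  # rotate by 90 degrees

R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

point = np.array([1.0, 0.0])
rotated = R @ point  # matrix multiplication: (1, 0) -> (0, 1)
print(rotated)
```

Scaling and reflection are just different matrices, and because each point is transformed independently of the others, the work parallelises naturally across many simple processors, which is exactly the property that makes gaming chips attractive for particle simulations.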

Such simulations are usually carried out on a supercomputer, but time on these machines is expensive and in short supply. By comparison, games consoles are cheap and easily available, and they come with the added bonus of some innovative hardware. For example, the Wii, made by Nintendo, has a motion-tracking remote control unit that is far cheaper than a comparable device would be if researchers had to build it from scratch.

One key advance is the ease with which scientists can now program games consoles for their own purposes. Although consoles do a great job of rendering images, games programs don’t require software to save data once it has been used to render the image. Scientists, by contrast, need to be able to store the results of the calculations they have fed into their machines.

Things started to get easier in 2002, when demand from computer enthusiasts who wanted to use their PlayStations as fully fledged desktop machines prompted Sony to release software that allowed the PlayStation 2 to run the Linux operating system. That allowed scientists to reprogram the consoles to run their calculations. Then in 2006 came the big breakthrough, with the launch by IBM, Sony and Toshiba of the Cell chip that now drives Sony’s PlayStation 3 (see Timeline). With one central processor and eight “servant” processors (New Scientist, 19 February 2005, p 23), it is vastly more powerful than the PS2 chip, and was designed from day 1 to run Linux.

The release of the Cell has accelerated research into black holes by Gaurav Khanna, an astrophysicist at the University of Massachusetts, Dartmouth. He has strung together 16 PS3 consoles to calculate the properties of the gravity waves that are expected to be produced when two black holes merge. Meanwhile, a collaboration between IBM and the Mayo Clinic in Rochester, Minnesota, is using the Cell’s ability to render high-resolution video graphics to do the same with data gathered by MRI and other medical scanning techniques. The aim is to make diagnosis easier and faster – by using the images to determine whether a tumour has grown or shrunk, for example.

Other researchers are pushing for even more speed. One of Martínez’s students, Ivan Ufimtsev, is experimenting with the NVIDIA GeForce 8800 GTX graphical processing unit (GPU) for PCs, which was released in November 2006. The GPU has 128 processors – compared to the Cell’s eight – and when slotted into a PC, helps turn it into a high-quality gaming engine. To start with, these cards were hard to program, just like the PS2 without the Linux add-on, but NVIDIA soon cottoned on to the sales opportunities that scientists like Martínez could offer for its product. In February 2007 it released the Compute Unified Device Architecture, a software package that allows the C programming language to be used to program the GPUs.

The results were staggering. When Martínez used it to simulate the repulsion between two electrons in an atom, he found that the calculation ran 130 times faster than it did on an ordinary desktop computer (Journal of Chemical Theory and Computation, DOI: 10.1021/ct700268q). He is now calculating the energy of the electrons in 1000 atoms, which add up to the size of a small protein. “We can now do the things we were killing ourselves to do,” he says.

Martínez predicts that it will soon be possible to use the GPU to predict more accurately which drug molecules will most strongly interact with a protein and how they will react, which could revolutionise pharmaceutical research. Similarly, Koji Yasuda at Nagoya University in Japan reported in a paper published this month (Journal of Computational Chemistry, vol 29, p 334) that he used the same GPU to map the electron energies in two molecules: the anti-cancer drug paclitaxel and the cyclic peptide valinomycin.

Games hardware still isn’t perfect for science. The Cell’s eight processors and the NVIDIA GPUs work in single-precision arithmetic, which is accurate to only about seven significant decimal figures. As numbers are repeatedly multiplied together, this small rounding error becomes magnified. In a game, the result might be nothing more serious than a car appearing slightly closer to a wall than it should, but in research such inaccuracies can be show-stoppers.
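How quickly that limited precision bites is easy to demonstrate. This Python snippet uses NumPy’s float32 type as a stand-in for the consoles’ single-precision arithmetic, adding 0.1 a million times in single and in double precision:

```python
import numpy as np

# 0.1 has no exact binary representation, so every addition rounds
# slightly. In single precision (~7 significant digits) the error
# snowballs over a million operations; in double precision it stays
# negligible.
single = np.float32(0.0)
double = np.float64(0.0)
for _ in range(1_000_000):
    single += np.float32(0.1)
    double += np.float64(0.1)

print(single)  # drifts visibly away from 100000
print(double)  # correct to several decimal places
```

The single-precision total ends up off by hundreds, while the double-precision total is essentially exact, which is why accumulated rounding can wreck a long-running simulation even though it is invisible in a single frame of a game.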

It’s not just the chips that researchers can usefully borrow from gaming hardware. Take the Wii’s hand-held remote control, which contains an accelerometer that can sense in which direction it is being moved, and how vigorously. It transmits this information via a Bluetooth link to the console, where it is used to adjust the graphics to respond to the player’s movements in real time.

Monitoring Parkinson’s

The device recently grabbed attention as a tool for surgeons to improve their technique (New Scientist, 19 January, p 24). Meanwhile, neurologist Thomas Davis at the Vanderbilt Medical Center in Nashville, Tennessee, is using it to measure movement deficiencies in Parkinson’s patients. By attaching up to four Wii remotes to different limbs, Davis captures data for tremor, speed and smoothness of movement, and gait. This data is then sent via the Bluetooth link to a laptop running software that allows Davis to assess quantitatively how well a patient can move. Davis hopes this can be used in clinical trials for Parkinson’s drugs to replace the scoring scales now used, which are based on a doctor observing a patient’s condition.

Others are using the console to assess the progress of patients who have had a stroke or a head injury by monitoring their performance as they play Wii games. Johnny Chung Lee at Carnegie Mellon University in Pittsburgh, Pennsylvania, is using the Wii remote as a virtual reality research tool. As the wearer’s head moves, the Wii tracks it and displays images dependent on where the wearer is looking. Meanwhile, a team at the University of Valladolid in Spain hopes to use the Wii remote to rotate and manipulate ultrasound images more intuitively.

Computer gamers have always hankered after the latest console or PC hardware to run ever more realistic-looking games. Now scientists are lining up right beside them.

From issue 2643 of New Scientist magazine, 16 February 2008, pages 26-27

And not an Xbox 360 in sight…