Sunday, November 11, 2007

Games and Science

The great hype of the moment regarding PC games is a game called Crysis. It is a science fiction first-person shooter, and the story revolves around the usual alien invasion, this time in a fictional future, with you as part of a military force sent to fight the invasion. Just the usual shoot-all-the-aliens (or demons, or enemy soldiers, or whatever) game, a kind of story that is getting tired. Anyway... All this hype has nothing to do with its "revolutionary" story; it is due to the steep hardware requirements and the level of realism of the game.

The left side picture shows the real-life version, and the right side picture shows the in-game version.

As you can see in the picture above, the level of realism in the game is simply outstanding. If you want to see more, take a look at this small video made by the guys at Tom's Hardware Guide showing some of the things you can do while playing the game, and pay attention to the graphics. The physics realism in the video footage is also unbelievable. So are the system requirements, if you want to play this game. I am not going into the details of the minimum and recommended settings, but I will just say that even the best PC on the market will not run the game at maximum settings without glitches. If you are curious about the system requirements, take a look here. Amazing, to say the least.

The point I want to make is not exactly about the game itself. You can find more about the game in the links above, and all over the Internet. The main point of this post is to compare a game like this with current scientific simulations. How can these game companies make something that simulates our environment almost perfectly in real time, while in science we run models that are sometimes static, with code that takes hours and hours to finish? What is the problem here?

The first thing to notice is that even though the physics in a game seems flawless, the number of simplifications and approximations made to simulate it is enormous. The laws of physics there are far from exact, but that does not diminish the game; it shows that the only real thing limiting the game engine is today's computer hardware. There really are a lot of approximations, but the amount of detail you can see is also huge. So it is not just one physical process being calculated in real time, but a huge number of physical processes being calculated at the same time. That is what makes today's games so convincing, and in the future things will look more and more real.

In real science you have to model the physics in much more detail than in a game, and physics is not as simple as it may seem from looking at the game. Far from it, actually. Most problems in physics cannot be solved without approximations, and even the largest computer clusters cannot compute some complex physical processes in real time; they usually take hours or days until some results can be obtained.

Looking at this game, and knowing a little bit about scientific simulations, something came to my mind. Most of the computer simulations done these days are not as efficient as they should be. People still use a language called Fortran 77 in scientific simulations. A language older than me, and I am an old one! This old language is completely out of date, and is surely far from optimized for our current processors. I don't understand why people in science tend to be so conservative about programming languages. The CPU of almost every computer today is a multi-core CPU, so parallel programming is becoming a very useful skill, and these old languages were also not designed for writing code that runs on several processors at the same time. If you want to do something more efficient today, you have to evolve and forget about these old languages many scientists still insist on using. They are still good languages for learning how to code, but they are just that.

Another new concept that should be taken into consideration in future scientific simulations is that the CPU is not the only thing that can process data. Currently the processing capacity of modern GPUs (graphics processing units), or video boards, is larger than that of today's CPUs. Until some time ago, GPUs were only useful for processing graphical data and 3D calculations, but the new NVIDIA GPUs can be used to do any kind of calculation, using the C programming language, and this can take full advantage of the GPU's hundreds(!) of on-chip processors in an optimal way (this technology is called CUDA - Compute Unified Device Architecture).
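The programming model looks like this (a sketch in CUDA's C dialect, written from the standard CUDA runtime API; it needs an NVIDIA GPU and the nvcc compiler to actually run, and error checking is omitted for brevity). The trick is that instead of one loop running on one CPU core, every GPU thread handles a single array element, so the whole loop is spread over hundreds of processors at once:

```cuda
/* Kernel: runs on the GPU, one thread per array element.
 * Computes y[i] = a*x[i] + y[i], the classic "saxpy" operation. */
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  /* this thread's index */
    if (i < n)
        y[i] = a * x[i] + y[i];
}

/* Host side: copy data to the GPU, launch enough blocks of 256
 * threads to cover n elements, and copy the result back. */
void run_saxpy(int n, float a, float *x, float *y) {
    float *dx, *dy;
    cudaMalloc(&dx, n * sizeof(float));
    cudaMalloc(&dy, n * sizeof(float));
    cudaMemcpy(dx, x, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y, n * sizeof(float), cudaMemcpyHostToDevice);
    saxpy<<<(n + 255) / 256, 256>>>(n, a, dx, dy);
    cudaMemcpy(y, dy, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dx);
    cudaFree(dy);
}
```

Notice that apart from the `__global__` keyword and the `<<<...>>>` launch syntax, this is plain C, which is exactly why NVIDIA's approach lowers the barrier for scientists.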

The gaming industry is always running a little behind hardware development, and as far as I know, science is far behind the gaming industry. I know that learning science is not easy (far from it, actually), and that keeping pace with developments in any scientific field while also keeping pace with hardware and programming language development is a Herculean task, but a lot of improvement could be achieved with a small evolution in scientific coding. Some conservative researchers, from the old school, need to understand that a 30-year-old computer language is not exactly the best fit for our current computers. And even not-so-old languages like C++ must be used in new ways to improve processing efficiency. Computers today are very different from computers 10 years ago, and science must evolve as games do.

(Maybe I should not complain too much; the way things are going, Entropy tends to be maximized... :P)
