Spending a week in Berlin, after my last visit 23 years ago, made me think of the Commodore Amiga 500+. It was 1991 and I was an exchange student in Germany. The Commodore Amiga was pretty popular there, and I was able to try awesome games with my classmates; of course I ended up buying my favourite ones, like Lemmings. Years passed and my Commodore stopped working, but I still wanted to play some of those great games; unfortunately it was no longer possible to buy an Amiga (it was discontinued in 1992). Then I discovered something new: I could run Amiga games on a PC using something called an emulator. Although emulators and virtualisation are not the same, as the guys from Computer World explain here, for me it was the beginning of a journey into emulators and virtualisation.
Some years later a friend of mine had a piece of specialised software that was really hard to configure, and every time his PC or its hard disk crashed (which happened very often) he had to spend a lot of time and money configuring everything all over again. It was then that the curse which haunts all of us who study software engineering (I was still at university at the time), or anything related to information technology, descended upon me: "Hey! You're studying something about computers, solve my problem!" my friend said.
The problem was straightforward: he wanted to configure the operating system and his software one last time, and then move this "package" (his specialised software and operating system, already configured) to a new PC whenever the old one crashed, all in an easy and practical way. Searching the internet, I learned about virtual PCs (VPs). To use a virtual PC you need to install virtualisation software, and that's where the magic starts. You run the virtualisation software and, in a window inside your desktop, you see what looks like a new computer booting up; in this brand-new computer you install an operating system, software, etc., exactly as you would on a new physical computer.

So we installed my friend's software in a virtual PC, and he could now copy the VP (usually a huge folder) to a new PC every time the old one died. He could then start the virtual machine that contained his software and be ready to continue working! Sounds like problem solved, right? Well, almost: now he was complaining that his software ran slower. The solution was to buy more RAM for the PC, because the hardware was now running two operating systems: the base (host) operating system, which consumes a lot of memory, and the virtualisation software, which does not require much memory by itself, but contains another operating system, called the guest operating system, with memory requirements of its own.
This is the idea behind virtualisation: multiple virtual computers running on top of a single piece of hardware, all sharing and consuming the same physical resources, such as RAM, processor, etc.
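To see why my friend's PC suddenly needed more RAM, here is a rough back-of-the-envelope sketch in Python. All the memory figures are made-up illustrative assumptions, not measurements:

```python
# Rough memory accounting for full virtualisation.
# All figures below are illustrative assumptions, not real measurements.

HOST_OS_RAM_MB = 1024      # the base (host) operating system
HYPERVISOR_RAM_MB = 128    # the virtualisation software itself is lightweight
GUEST_OS_RAM_MB = 1024     # each guest OS needs roughly as much as the host OS
APP_RAM_MB = 512           # my friend's specialised software

def vm_total_ram_mb(num_vms: int) -> int:
    """Total RAM consumed when each app runs inside its own virtual machine."""
    per_vm = GUEST_OS_RAM_MB + APP_RAM_MB
    return HOST_OS_RAM_MB + HYPERVISOR_RAM_MB + num_vms * per_vm

print(vm_total_ram_mb(1))  # 1024 + 128 + (1024 + 512) = 2688 MB
```

Even with a single virtual machine, the guest operating system roughly doubles the OS memory bill, which is exactly why the fix was simply to buy more RAM.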
It is very practical to have virtual machines that are hardware agnostic, since they run on top of any hardware. It also allows you to make better use of that hardware by running multiple machines on it. For example, you can have one virtual server for your financial operations, which are heavy at the end of the month, and another virtual server for your logistics operations, which peak in the middle of the month; this way you take full advantage of your hardware during the whole month.
I recommend reading this article to understand all aspects of virtualisation.
So far I have learned that one of the strong points of virtualisation is that software runs independently of the underlying hardware; on the downside, every virtual machine needs its own instance of an operating system, which consumes resources. In clouds, where the number of virtual machines is really big, the resources consumed by all those guest operating systems become considerable.
Around 2006, Linux introduced a very interesting solution to this problem: containers.
The idea behind containers is this: on top of one physical host there is only one operating system (no more resources wasted on a separate operating system for each virtual PC), and it can run multiple instances of a program with a certain level of isolation, meaning that each instance of the program believes it is running on a different machine, even with its own network address. Recently an implementation of containers called Docker (http://www.docker.com/) has been in the spotlight, because companies like Google and Amazon are contributing to the project and support this container technology in their own clouds.
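The saving is easy to see with some back-of-the-envelope arithmetic in Python. With containers, the per-machine guest operating system disappears, because every instance shares the single operating system on the host. Again, the memory figures are illustrative assumptions, not measurements:

```python
# Compare total RAM for N isolated app instances: virtual machines vs containers.
# All figures below are illustrative assumptions, not real measurements.

HOST_OS_RAM_MB = 1024     # the single operating system on the physical host
GUEST_OS_RAM_MB = 1024    # extra OS instance required per virtual machine
APP_RAM_MB = 512          # the application itself

def vms_ram_mb(n: int) -> int:
    """Each virtual machine carries its own guest OS plus the app."""
    return HOST_OS_RAM_MB + n * (GUEST_OS_RAM_MB + APP_RAM_MB)

def containers_ram_mb(n: int) -> int:
    """Containers share the host operating system; only the app is duplicated."""
    return HOST_OS_RAM_MB + n * APP_RAM_MB

for n in (10, 100):
    print(n, vms_ram_mb(n), containers_ram_mb(n))
```

At cloud scale the difference dominates: with 100 instances, the virtual machines in this sketch spend about two thirds of their RAM just on duplicated operating systems.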
The switch from virtualisation to containers can save the world more energy than switching to electric cars, according to this article published by Wired.
Now that you have a glimpse of the difference between these two technologies, what is your opinion?
