Introduction
Over the course of the past couple of months, the AnandTech IT team has been putting a lot of time into mapping out and clarifying the inner workings of several interesting forms of virtualization. The purpose of these articles was not primarily to cover "news" (however current the topic might be), but to create a sort of knowledge base on the subject here at AnandTech, as we find that many people interested in learning more about virtualization are met with a large amount of misinformation. Secondly, we believe that better knowledge of the subject will empower the people in charge of their company's IT infrastructure and help them make the right decisions for the job.
There's no denying that virtualization is changing server rooms in companies all over the world. It promotes both innovation and the preservation of aged applications that would otherwise not survive a migration to modern hardware platforms. Virtualization is completely redefining the rules of what can and cannot be done in a server environment, adding versatility that increases with every new release. We believe that when making big changes to any existing system, the more information available to the people given that task, the better.
But what about desktop users?
While the above should make it clear why businesses are extremely interested in this technology, and why we have been digging so deep into its inner workings, we are also noticing an increased interest from the rest of our reader base, and have been flooded with questions about the how and why of all these different kinds of virtualization. Since we don't want to leave any interested people out in the cold, and the in-depth articles may seem a bit daunting to anyone looking to get an introduction, here is another article in our "virtualization series". In it, we will attempt to guide our readers through the different technologies and their actual uses, along with some interesting tidbits for regular desktop users.
"New" Virtualization vs. "Old" Virtualization
The recent buzz around the word "virtualization" may give anyone the impression that it is something relatively new. Nothing is further from the truth, however, since virtualization has been an integral part of server and personal computing almost from the very beginning. Using the single term "virtualization" for each of its countless branches and sprouted technologies does end up being quite confusing, so we'll try to shed some light on the most important ones.
How to Define Virtualization
To define it in a general sense, we could state that virtualization encompasses any technology - either software or hardware - that adds an extra layer of isolation or extra flexibility to a standard system. Virtualization typically increases the number of steps a job takes to complete, but the slowdown is made up for by increased simplicity or flexibility in the part of the system affected. In other words, overall system complexity increases, which in turn makes manipulating certain subsystems a lot easier. In many cases, virtualization has been implemented to make a software developer's job a lot less aggravating.
Most modern day software has become dependent on this, making use of virtual memory for vastly simplified memory management, virtual disks to allow for partitioning and RAID arrays, and sometimes even pre-installed "virtual machines" (think of Java and .NET) to allow for better software portability. In a sense, the entire point of an operating system is to give software foolproof access to the computer's hardware, taking control of almost every bit of communication with the actual machinery in an attempt to reduce complexity and increase stability for the software itself.
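The isolation that virtual memory provides can be seen directly on any Unix system: after a fork, parent and child share the same virtual memory image, yet changes in one are invisible to the other. The sketch below (our own illustration, using Python's `os.fork` on a Unix host; the names `data` and `fork_demo` are ours) shows the effect.

```python
import os

def fork_demo():
    """Fork a child; both processes start with the same virtual memory image."""
    data = [1, 2, 3]
    r, w = os.pipe()
    pid = os.fork()
    if pid == 0:
        # Child: copy-on-write gives it a private copy of `data`;
        # this append is invisible to the parent.
        data.append(99)
        os.write(w, str(len(data)).encode())
        os._exit(0)
    os.waitpid(pid, 0)
    child_len = int(os.read(r, 16).decode())
    return len(data), child_len

print(fork_demo())  # (3, 4): the child grew its copy, the parent's is untouched
```

Each process works with its own virtual address space, and the operating system quietly maps those addresses to separate physical memory as needed.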
So if this is the general gist behind virtualization (and we can tell you it has been around for almost 50 years), what is this recent surge in popularity all about?
Baby Steps Leading to World-Class Innovations
Many "little" problems have called for companies like VMware and Microsoft to develop software throughout the years. As technology progresses, several hardware types become defunct and are no longer manufactured or supported. This is true for all hardware classes, from server systems to those old yet glorious video game systems that are collecting dust in the attic. Even though a certain architecture is abandoned by its manufacturers, existing software may still be of great (or perhaps sentimental) value to its owners. For that reason alone, virtualization software is used to emulate the abandoned architecture on a completely different type of machine.
A fairly recent example of this (besides the obvious video game system emulators) is found integrated into Apple's OS X: Rosetta. Using a form of real-time binary translation, it is able to change the behavior of applications written for the PowerPC architecture to match that of an x86 application. This allows a large amount of software that would normally have to be recompiled to survive an otherwise impossible change of hardware platform, at the cost of some performance.
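Rosetta's internals are Apple's own, but the core idea behind dynamic binary translation can be sketched in a few lines: each guest instruction is translated into something the host can run natively the first time it is encountered, then cached so repeated execution pays the translation cost only once. The mini instruction set below is invented for illustration (not real PowerPC or x86 encodings), as are the names `translate` and `run`.

```python
# Toy "guest" program in an invented PowerPC-flavored mini ISA.
GUEST_PROGRAM = [
    ("li", "r1", 5),            # load immediate: r1 = 5
    ("li", "r2", 7),            # load immediate: r2 = 7
    ("add", "r0", "r1", "r2"),  # r0 = r1 + r2
]

def translate(instr):
    """Translate one guest instruction into a host-native callable."""
    op = instr[0]
    if op == "li":
        _, reg, val = instr
        return lambda regs: regs.__setitem__(reg, val)
    if op == "add":
        _, dst, a, b = instr
        return lambda regs: regs.__setitem__(dst, regs[a] + regs[b])
    raise ValueError(f"unknown opcode {op!r}")

def run(program):
    cache = {}  # translation cache: each instruction is translated only once
    regs = {}
    for i, instr in enumerate(program):
        if i not in cache:
            cache[i] = translate(instr)
        cache[i](regs)
    return regs

print(run(GUEST_PROGRAM)["r0"])  # 12
```

A real translator works on machine code rather than tuples and translates whole blocks at a time, but the translate-once, cache, and re-execute loop is the same.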
Hardware platforms have not been the only thing to change, however; changes in both desktop and server operating systems might force a company to run older versions of an OS (or even a completely different one) to keep using software that struggles with compatibility issues. Likewise, developers need securely isolated environments in which to test their software, without having to compromise their own systems.
The market met these demands with products like Microsoft's Virtual PC and VMware Workstation. Generally, these solutions offer no emulation of a defunct platform, but rather an isolated environment of the same architecture as the host system. However, exceptions do exist (Virtual PC for the Mac OS emulated the x86 architecture on a PowerPC CPU, allowing virtual machines to run Windows).
Putting the results of these methods together has led to a solution for a problem quietly growing in many a company's server room. While the development of faster and more reliable hardware was kicked up a notch, a lot of the actual server software lagged behind, unable to make proper use of the enormous quantity of resources suddenly available to it. Companies were left with irreplaceable but badly aging hardware, or brand new servers that suffered from very inefficient resource usage.
A new question emerged: Would it be possible to consolidate multiple servers onto a single powerful hardware system? The industry's collective answer: "Yes it is, and ours is the best way to do it."
If you're planning to buy shiny new hardware components to build an Intel HackMac, this article might be of interest to you.
If you have an AMD machine like most InfiniteMac members, you have probably already been thinking about your next computer upgrade. AMD works, but unfortunately there are several critical problems, such as the dual core / mouse movement bug, which forces many AMD X2 users to disable their second core. Whether you have already gained some experience or are completely new to OSx86, just read this small guide; by the end you'll know which components work well.
An Intel HackMac kit is a great choice for everybody who is open to new knowledge and enthusiastic about running Mac OS X Leopard on a PC. The only requirements are basic and easy-to-learn OSx86 skills. For instance, if a Mac OS X update becomes available, you should know exactly what to do, or at least where to search for help to make it work.
Chipset & Motherboard
The best choice for your future Hackintosh is Socket 775, Intel's latest desktop CPU socket.
By now it is well known which motherboards work well for OSx86 use. We are in love with a series of motherboards built by Gigabyte: the Gigabyte GA-P35 variants with the Intel® P35 Express chipset are our favourites, and an excellent choice for fast and very reliable HackMacs. There are several different variants on the market; the cheaper ones don't include as many features as the more expensive ones. For example, most low cost models don't include FireWire. If you want your new system to support the widest range of features, we gladly recommend the flagship of the GA-P35 series, the Gigabyte GA-P35-DS4. It supports Intel® Core™ 2 multi-core and 45nm processors and even DDR2 1066 memory for outstanding system performance, making it a great choice for a future-proof system. It also offers strong graphics performance with a dual PCI-E x16 interface, integrated SATA 3Gb/s with RAID, high speed Gigabit Ethernet, and IEEE 1394.
The Gigabyte GA-P35-DS4 works almost out of the box with all the latest OSx86 releases out there. The only part that needs patching is the audio, which is very easy to do.
The following variants of the GA-P35 and GA-EP35 series work very well. Make sure that you choose according to your requirements! If you aren't sure, the GA-P35-DS4 is the safest choice.
Gigabyte GA-P35-DS4
Gigabyte GA-P35-DS3
Gigabyte GA-P35C-DS3R
Gigabyte GA-P35-DS3R
Gigabyte GA-P35-DS3L
Gigabyte GA-P35-DS3P
Gigabyte GA-EP35-DS4
Gigabyte GA-EP35-DS3
Gigabyte GA-EP35C-DS3R
Gigabyte GA-EP35-DS3R
Gigabyte GA-EP35-DS3L
Gigabyte GA-EP35-DS3P
Processor
Intel's latest desktop processors, the Intel Core 2 Duo and Intel Core 2 Quad, are a great choice. They are very powerful, fast and reliable - just what we require for our new machine. Ask yourself how much performance you need. If you don't need maximum performance, you'll be fine with an Intel Core 2 Duo; it can handle almost everything, as long as you're not a power user. If you need extreme performance, or if you just want your new system to be future-proof, pick an Intel Core 2 Quad processor.
They are screamers.
Intel Core 2 Duo
Intel Core 2 Duo E6750 2.66 GHz
Intel Core 2 Duo E8400 3.0 GHz
Intel Core 2 Quad
Intel Core 2 Quad Q6600 2.40 GHz
Intel Core 2 Quad Q9300 2.50 GHz
Intel Core 2 Quad Q9450 2.66 GHz
Graphics Card
Whether you require top notch graphics performance or just a simple solution that provides Quartz Extreme & Core Image, many NVIDIA and ATI graphics cards work fine. The best solutions for OSx86 are currently provided by NVIDIA.
Some 512 MB variants don't seem to work without flashing the card's BIOS, so please read the OSx86Project HCL before buying!
The following graphics cards are highly recommended.
GeForce 8800 GTX
GeForce 8800 GTS
GeForce 8600 GT
GeForce 7600 GT
GeForce 7300 GT
If you are located in the United States, you can find the best deals for your graphics card at NewEgg and Buy.com. If you are from Germany, you'll find the best deals at Amazon.de.
RAM
There is no specific RAM vendor that you have to choose, but make sure that it's DDR2 RAM.
I personally recommend Crucial. You might consider a Crucial Ballistix 4GB DDR2 kit or a Crucial Ballistix 2GB DDR2 kit - top-notch RAM that will work fine with the Gigabyte motherboard.
Case & Power Supply
Choose your favourite vendor for your case and power supply, but don't buy cheap ones. The case and power supply are the most important parts for keeping your hardware cool and reliable, so make sure you invest accordingly.
Addendum
Have fun with assembling and using your new machine. Please don't hesitate to add a comment with your new hardware specifications. We love to see new HackMacs and their happy users!
Finally, don't forget to buy your copy of Mac OS X Leopard, if you haven't done so already! It comes in a nice box, and it's a great piece of computer history - you'll be glad to see it on your shelf several years from now.
All our thanks go out to Apple for their incredible Mac OS X operating system. It helps us enjoy our work every day, and we truly love it.