Tuesday, April 7, 2009

Sony PS3 Outsells Nintendo Wii in Japan

Sony Corp's PlayStation 3 outsold Nintendo Co Ltd's Wii in Japan in March for the first time in 16 months, thanks to hot new PS3 titles from Sega Sammy and Capcom, a game magazine publisher said.

Video game sales are closely watched for hints on how soon Sony can turn around its struggling game operations and how much growth momentum Nintendo has left.

Domestic sales of the PS3 came to 146,948 units in the five weeks through March 29, compared with 99,335 units of the Wii and 43,172 units of Microsoft Corp's Xbox 360, Enterbrain said on Monday.

The latest versions of popular action adventure series from Sega Sammy Holdings Inc and Capcom Co Ltd—"Ryu Ga Gotoku 3" and "Resident Evil 5," respectively—came in first and second in game software sales in the period, helping drive PS3 demand.

Mizuho Investors Securities analyst Etsuko Tamura said that despite its strong showing in March, the PS3 is unlikely to threaten the Wii's global dominance, as more software makers are focusing their development resources on the Wii, the console with the largest installed base among the current generation of hardware.


Wednesday, April 1, 2009

An Introduction to Virtualization

Introduction

Over the course of the past couple of months, the AnandTech IT team has been putting a lot of time into mapping out and clarifying the inner workings of several interesting forms of virtualization. The purpose of these articles was not primarily to cover "news" (however current the topic might be), but to build a knowledge base on the subject at AnandTech, as we find that many people interested in learning more about virtualization are met with a large amount of misinformation. Secondly, we believe that better knowledge of the subject will empower the people in charge of their company's IT infrastructure and help them make the right decisions for the job.

There's no denying that virtualization is changing companies' server rooms all over the world. It promotes both innovation and the preservation of aged applications that would otherwise not survive a migration to modern hardware platforms. Virtualization is completely redefining the rules of what can and cannot be done in a server environment, adding a versatility that grows with every new release. We believe that when making big changes to any existing system, the more information available to the people given that task, the better.

But what about desktop users?

While the above should make it clear why businesses are extremely interested in this technology, and why we have been digging so deep into its inner workings, we have also noticed increased interest from the rest of our reader base and have been flooded with questions about the how and why of all these different kinds of virtualization. Since we don't want to leave anyone interested out in the cold, and since the in-depth articles may seem a bit daunting to someone looking for an introduction, here is another article in our "virtualization series". In it, we will guide our readers through the different technologies and their actual uses, along with some interesting tidbits for regular desktop users.

"New" Virtualization vs. "Old" Virtualization

The recent buzz around the word "virtualization" may give the impression that it is something relatively new. Nothing could be further from the truth, however: virtualization has been an integral part of server and personal computing almost from the very beginning. Using the single term "virtualization" for each of its countless branches and offshoots quickly becomes confusing, so we'll try to shed some light on them.

How to Define Virtualization

To define it in a general sense, we could state that virtualization encompasses any technology - either software or hardware - that adds an extra layer of isolation or extra flexibility to a standard system. Typically, virtualization increases the number of steps a job takes to complete, but the slowdown is made up for by increased simplicity or flexibility in the affected part of the system. To clarify: the overall system becomes more complex, which in turn makes certain subsystems much easier to manipulate. In many cases, virtualization has been implemented to make a software developer's job a lot less aggravating.

Most modern-day software has become dependent on this, making use of virtual memory for vastly simplified memory management, virtual disks to allow for partitioning and RAID arrays, and sometimes even pre-installed "virtual machines" (think of Java and .NET) for better software portability. In a sense, the entire point of an operating system is to give software foolproof access to the computer's hardware, taking control of almost every bit of communication with the actual machinery in an attempt to reduce complexity and increase stability for the software itself.
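To make the first of those abstractions a bit more tangible, here is a minimal sketch (our own Linux-specific illustration, not part of the original article; the 1 GB size and the use of mmap are arbitrary choices) that reserves a large stretch of virtual address space. The operating system only backs pages with physical memory once they are actually touched, bookkeeping the application never has to worry about.

/* Minimal Linux sketch: reserve 1 GB of *virtual* address space.
 * Physical pages are only allocated when a page is first written,
 * so the process's resident memory stays small until we touch it. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/mman.h>

int main(void)
{
    size_t size = 1UL << 30;                      /* 1 GB of virtual space */
    unsigned char *region = mmap(NULL, size,
                                 PROT_READ | PROT_WRITE,
                                 MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (region == MAP_FAILED) {
        perror("mmap");
        return EXIT_FAILURE;
    }

    /* Touch one byte per 4 KB page in the first 16 MB only;
     * only those pages end up backed by physical memory. */
    for (size_t offset = 0; offset < (16UL << 20); offset += 4096)
        region[offset] = 1;

    printf("Reserved %zu bytes of virtual memory at %p\n", size, (void *)region);
    munmap(region, size);
    return EXIT_SUCCESS;
}

Watching the process in a tool like top while it runs shows the gap between its large virtual size and its much smaller resident size, which is exactly the point of the abstraction.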

So if this is the general gist behind virtualization (and we can tell you it has been around for almost 50 years), what is this recent surge in popularity all about?

Baby Steps Leading to World-Class Innovations

Many "little" problems have called for companies like VMware and Microsoft to develop software throughout the years. As technology progresses, several hardware types become defunct and are no longer manufactured or supported. This is true for all hardware classes, from server systems to those old yet glorious video game systems that are collecting dust in the attic. Even though a certain architecture is abandoned by its manufacturers, existing software may still be of great (or perhaps sentimental) value to its owners. For that reason alone, virtualization software is used to emulate the abandoned architecture on a completely different type of machine.

A fairly recent example of this (besides the obvious video game system emulators) is found integrated into Apple's OS X: Rosetta. Using a form of real-time binary translation, it allows applications written for the PowerPC architecture to behave like x86 applications. This lets a large amount of software that would otherwise have to be recompiled survive an otherwise impossible change in hardware platforms, at the cost of some performance.
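The idea is easier to see with a toy example. The sketch below is our own illustration and has nothing to do with Rosetta's internals: it walks through a made-up "guest" instruction format and performs the equivalent work on the host, which is plain interpretation, the simplest form of the technique. A real translator like Rosetta goes further and caches translated blocks of host code so that each guest instruction only has to be decoded once.

/* Toy illustration of running "foreign" code on a host machine:
 * decode instructions for an invented guest ISA and carry out the
 * equivalent work natively. This is interpretation; real binary
 * translators also cache the translated blocks for speed. */
#include <stdio.h>

enum { OP_LOAD_IMM, OP_ADD, OP_PRINT, OP_HALT };   /* invented guest opcodes */

int main(void)
{
    /* Guest "binary": r0 = 40; r1 = 2; r0 += r1; print r0; halt */
    int program[] = { OP_LOAD_IMM, 0, 40,
                      OP_LOAD_IMM, 1, 2,
                      OP_ADD, 0, 1,
                      OP_PRINT, 0,
                      OP_HALT };
    int regs[4] = { 0 };                           /* guest register file */
    int pc = 0;                                    /* guest program counter */

    for (;;) {
        switch (program[pc]) {
        case OP_LOAD_IMM: regs[program[pc + 1]] = program[pc + 2];        pc += 3; break;
        case OP_ADD:      regs[program[pc + 1]] += regs[program[pc + 2]]; pc += 3; break;
        case OP_PRINT:    printf("guest r%d = %d\n", program[pc + 1],
                                 regs[program[pc + 1]]);                  pc += 2; break;
        case OP_HALT:     return 0;
        }
    }
}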

Hardware platforms have not been the only things to change, however; changes in both desktop and server operating systems might force a company to run older versions of an OS (or even a completely different one) to keep using software plagued by compatibility issues. Likewise, developers need securely isolated environments in which to test their software without compromising their own systems.

The market met these demands with products like Microsoft's Virtual PC and VMware Workstation. Generally, these solutions offer no emulation of a defunct platform, but rather an isolated environment of the same architecture as the host system. Exceptions do exist, however: Virtual PC for Mac emulated the x86 architecture on a PowerPC CPU, allowing its virtual machines to run Windows.
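As a small practical aside for desktop readers (our own addition, not something covered in the article): modern same-architecture hypervisors can take advantage of the hardware virtualization extensions in recent CPUs, Intel VT-x and AMD-V. On Linux, a quick way to see whether your processor advertises them is to look for the "vmx" or "svm" flags in /proc/cpuinfo, as in the sketch below.

/* Quick Linux check: does the CPU advertise hardware virtualization
 * extensions? Intel CPUs report the "vmx" flag, AMD CPUs report "svm". */
#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *cpuinfo = fopen("/proc/cpuinfo", "r");
    if (!cpuinfo) {
        perror("/proc/cpuinfo");
        return 1;
    }

    char line[4096];
    int found = 0;
    while (fgets(line, sizeof line, cpuinfo)) {
        if (strncmp(line, "flags", 5) == 0 &&
            (strstr(line, " vmx") || strstr(line, " svm"))) {
            found = 1;
            break;
        }
    }
    fclose(cpuinfo);

    puts(found ? "Hardware virtualization extensions: present"
               : "Hardware virtualization extensions: not reported");
    return 0;
}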

Putting together the results of these methods has led to a solution for a problem quietly growing in many a company's server room. While the development of faster and more reliable hardware kicked into higher gear, a lot of the actual server software lagged behind, unable to make proper use of the enormous quantity of resources suddenly available to it. Companies were left with irreplaceable but badly aging hardware, or brand new servers suffering from very inefficient resource usage.

A new question emerged: Would it be possible to consolidate multiple servers onto a single powerful hardware system? The industry's collective answer: "Yes it is, and ours is the best way to do it."
