Virtual Nvidia: It's All Fun & Games Until Someone Loses a GPU
Cloud computing has been a buzzworthy trend for so long that it's now simply a movement in progress. Our files and apps are rapidly migrating from our hard drives to distributed servers, leaving behind only a few jealously guarded and resource-intensive tasks that must be kept local. Crucial, vital tasks... such as Skyrim, or StarCraft 2, or Diablo 3...
Look under the hood of half the machines running these games, and you'll find Nvidia. Well, "half" is inaccurate; Nvidia currently accounts for about 17% of the PC video market, plus every PS3, while rival AMD shows up in the Xbox 360 and the Nintendo Wii and claims about 25% of PCs. The big video winner is Intel, thanks to its integrated GPUs, but we'll get to that later.
Funny you should mention virtual, because that's Nvidia's big news for the week. At the GPU Technology Conference in San Jose, Nvidia unveiled a plan -- two plans, actually -- to create incredibly powerful and efficient servers with their GPUs. The bottom line is virtualization, the ability to use whatever device you have on hand to access a much more powerful computer somewhere else.
Virtualization first reared its head in the business world, with VMware and Citrix (which happens to be partnered with Nvidia on this project). Nevertheless, when you see Nvidia's name in a headline, you're going to wonder what the news will mean to your frames per second. So when the company leads with its VGX desktop virtualization platform before introducing the GeForce GRID, is it a shift in priorities for Nvidia, or simply a way to heighten the drama of the unveiling?
Probably a bit of both. You see, if you haven't been following the fortunes of the gaming market, the sad truth is that things aren't looking good for the "traditional" gaming platforms. Sony's year has been frankly disastrous -- a $6.4 billion loss, with further losses expected in 2012 -- and even plucky Nintendo posted its first-ever operating loss. The only winner is Microsoft, which continues to make up in Xbox Live what it loses on Xbox 360 manufacturing. Hmm, that would indicate that there's more money to be made in an online service than in hardware, wouldn't it?
Sony and Nintendo both came out with next-generation handhelds this year, which were warmly greeted, critically acclaimed, and largely ignored in favor of Angry Birds and Cut the Rope. You see, there really is no reason for casual gamers to buy an additional device to carry around with them if their smartphones and tablets can provide a comparable gaming experience. In fact, with titles such as Dead Space HD and Infinity Blade II, even hardcore gamers can appreciate picking up the new iPad instead of a Vita or 3DS.
And that's without even mentioning OnLive. You see, OnLive is a magical app that turns your iPad into anything you want it to be... or in other words, it's exactly what Nvidia is aiming for: a virtualization platform that allows you to play PC games on your iPad (or wherever). OnLive even tends to downplay the games in favor of its business applications, just like Nvidia is doing. A spate of "GeForce GRID vs. OnLive" articles followed Nvidia's announcement -- many neglecting to mention that the two are already partners, which suggests to me that Nvidia has at least one pre-groomed customer base to step into (and most likely impress) when the time is right.
That time will coincide with the death of the PC, which will be a very bad time for businesses that depend on sales of PC components, such as consumer graphics cards. One might assume that virtualization is a losing proposition for Nvidia -- after all, putting one big GPU in a server will take the place of countless video card sales to consumers and PC / console manufacturers. But Nvidia is obviously looking to the future, and the recent past, and seeing sales that will taper off to nothing no matter how good they make their GPUs.
Whereas graphics cards have long been a necessity even for non-gaming PC builds, the add-on card format is quickly giving way to onboard video in the budget and mainstream sphere. Once Intel's Z68 chipset hit the market, even home theater enthusiasts and gamers started thinking seriously about the virtual GPU approach.
The same thing happened to the once-ubiquitous Creative SoundBlaster series and other discrete soundcards. Onboard audio started as a less-than-ideal option for all-in-one solutions, but before long it had become the default for most PC builders and users. Serious audiophiles and music-makers can still splurge on more sophisticated soundcards and external audio devices, but the vast majority of the market has discovered that good enough is indeed good enough. Where once we had individual cards for everything from Ethernet to drive controllers, the integration, efficiency, and cost savings of an all-mobo solution win every time.
And if you can offload duties to the cloud, your hardware matters even less. Virtualization offers you more and more power with thinner and thinner clients -- in other words, ultimately negating the PC itself. Consoles have already matched the gaming capabilities of all but the most enthusiast-grade PCs, and they (along with mobile devices) offer a more convenient way for many people to get music, video, web browsing, social media... all of those things that 90% of computer users spend 90% of their time doing.
But why even have a console when your TV can do those things? The current generation of LG Smart TVs offers an intuitive and flexible app-based environment, along with the "Wii meets Siri" capabilities of the LG Magic Remote. Expect other manufacturers to follow suit; even Apple has been rumored to be working on a big-screen HDTV that contains their increasingly capable (and eminently marketable) AppleTV set-top box functionality.
To complement the move toward cloud-based apps and files, virtualized hardware and computing environments are a perfect fit for a TV with an Ethernet port and an "always-on" Internet connection, not to mention an app-driven interface that can already handle 90% of the current computing experience. Consumers who are any mix of PC users, gamers, and home theater enthusiasts (i.e., all of us) should jump at the chance to minimize the clutter and cabling, and more importantly to merge the budgets for their next TV, next PC, and next-generation gaming console. Especially if there's an Nvidia-powered virtualized GPU doing all the work, sending everything you need to play high-resolution games over a fast Internet connection.
And there's the gorilla in the room. No matter how powerful your cloud-based and virtualized system may be, you're still at the mercy of that ISP lifeline. Nvidia's testers (like those of OnLive before them) claimed a lovely lack of lag and latency, but I'm willing to bet that most of us will see real-world results that vary wildly. Even online gamers blessed with short runs of ultra high-speed fiber sometimes struggle with latency, and the rest of us copper-bound folks can't even count on consistent bandwidth, let alone the kind of minimal latency necessary to sustain a fast-paced, immersive 3D gaming experience. GPU engineering could enable a gazillion petaFLOPS, but that won't overcome basic network inefficiencies. Perhaps Nvidia and Google could partner up to bury fiber in everyone's yard? After all, Verizon and AT&T have all but given up...
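To see why the network, not the GPU, is the bottleneck, consider the frame budget: at 60 fps you get roughly 16.7 ms per frame, and every millisecond of round-trip network time eats into the window between pressing a button and seeing the result. Here's a back-of-the-envelope sketch in Python; all stage timings (encode, decode, render, and the sample RTTs) are hypothetical illustrations, not measured figures from Nvidia or OnLive.

```python
# Rough input-to-photon latency budget for cloud gaming.
# All stage timings below are assumed, illustrative numbers.

FPS = 60
frame_budget_ms = 1000 / FPS  # ~16.7 ms per frame at 60 fps

def total_latency_ms(network_rtt_ms, encode_ms=5, decode_ms=5, render_ms=8):
    """Sum the stages between a button press and the updated frame on screen.

    Real pipelines overlap these stages and vary widely by codec,
    GPU, and ISP; this is a simple serial approximation.
    """
    return network_rtt_ms + encode_ms + decode_ms + render_ms

# Sample round-trip times: short fiber run, decent cable, congested copper.
for rtt in (10, 30, 80):
    total = total_latency_ms(rtt)
    # A common rule of thumb: more than ~2 frames of delay feels sluggish.
    verdict = "feels responsive" if total <= 2 * frame_budget_ms else "feels laggy"
    print(f"RTT {rtt:3d} ms -> total ~{total:.0f} ms ({verdict})")
```

Even with generous assumptions, only the fiber-class RTT stays inside a two-frame window, which is the crux of the argument above: no amount of server-side GPU horsepower can buy back milliseconds lost on the wire.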
As I mentioned earlier, virtualization also doesn't bode well for the offline experience. I may be part of a dying breed, but I'm a bit suspicious of the trend to make all computing dependent on an always-connected state. My paranoia was inflamed by "Games for Windows Live", which began the extremely shady practice of requiring a network login even to play games whose resources were completely installed on my own hardware. I get it -- this is a combination of marketing and DRM that works very well for advertisers and rights holders (including Capcom, Epic Games, Playcast, THQ, and Ubitus, all of whom have already announced plans to put their games on the GeForce GRID). Not so much for people who simply want to play games, but thankfully we still have helpful hackers who are more than eager to find ways around these needless barriers. For the time being.
But the best hackers in the world can't overcome a dependency on remote hardware. Am I saying that there's a conspiracy between Nvidia, high-speed ISPs, and major game companies? Uh... no, I guess not. There doesn't really need to be a nefarious secret plan, just the usual business of scratching each other's backs. After all, they're simply following market trends... okay, maybe creating and encouraging them, whenever the natural momentum starts to stall. But that's no reason to be paranoid. This is not Assassin's Creed, after all...