Isn't the major point of this that it makes piracy impossible? Even if there's no benefit (or even a small detriment) to the game experience, I think the industry will still embrace it.
As I understand it, piracy is essentially the reason PC gaming is dead now.
from 2007:
"The biggest gaming platform last year was the PC. Why? Online revenue alone exceeded 7 billion USD in 2007. As well, DFC predicts total PC gaming revenue to top 19 billion by 2013"
Interesting. I know that there is lots of activity in certain sectors of PC gaming based on a subscription model where piracy is also basically impossible. (World of Warcraft, for example).
But then you will have other game companies release console based games that perform better and people will flock to those. I think this is a failure out of the gate; we'll see.
Could, but won't. I wouldn't want to subject myself to the lag inherent in sending commands from my controller to a remote server and then receiving live video back to my screen again. Multiplayer would simply not work.
Even if it worked, here in Canada most ISPs have a monthly bandwidth limit (usually around 20-30GB, which is ridiculously low), so it would be really expensive for the end user.
This would be great for a Myst-style game, or turn-based games, where latency isn't as much of a problem.
You use the cloud as your distributed renderer, instead of relying on a single desktop, and send finished frames to the client. Maybe you do some compositing on the client.
The graphics and AI for games designed around this approach's limitations could be incredible.
It might be more than a few years ahead of its time... it might be physically impossible.
At the theoretical speed of light, once around the equator is about 4 frames of lag for a 30 FPS game (which most console games are these days).
Back when the whole graphics card benchmark war was going on, driver writers were buffering commands for two or three extra frames to prevent pipeline stalls. This allowed games to achieve higher frame rates, but it introduced 30 ms or more of input lag, which hardcore players complained about.
Ignoring the fact that the worst case is halfway around the equator (though real-world routes are less direct), when you want to reduce latency you want data centers as near to your customers as possible. In that case, a 10 to 15 ms round-trip time to many customers is feasible with current tech and infrastructure. That's about one frame at 60 FPS.
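If anyone wants to check the arithmetic, here's the back-of-the-envelope version in Python (vacuum speed of light; fiber runs at roughly 2/3 c and real routes are less direct, so reality is worse):

    # Back-of-the-envelope: propagation delay vs. frame time.
    C_KM_S = 299792.0        # speed of light in vacuum, km/s
    EQUATOR_KM = 40075.0

    def one_way(distance_km, fps=30):
        delay_ms = distance_km / C_KM_S * 1000
        return delay_ms, delay_ms * fps / 1000.0   # (ms, frames of lag)

    for km, label in [(EQUATOR_KM, "once around the equator"),
                      (EQUATOR_KM / 2, "halfway around (the worst case)"),
                      (1500.0, "a nearby data center")]:
        ms, frames = one_way(km)
        print("%s: %.0f ms, %.1f frames at 30 FPS" % (label, ms, frames))

That gives ~134 ms (4 frames) once around, ~67 ms (2 frames) for the worst case, and ~5 ms for a regional data center.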
It's actually pretty impressive; they do show real demos (for example, high-detail Crysis multiplayer with one player on the set-top box and another using the browser plugin on a low-end notebook).
-----
About latency: I was also very skeptical, but in principle it could be doable if they also deployed servers locally (it can make economic sense for dense urban areas). Google already does this.
I tried pinging around to check the latency limits: even across the Atlantic I got round trips of about 33 ms (call it 30 FPS); for servers a few hundred kilometers away it was ~10 ms (~100 FPS), and to my ISP it's ~1 ms (~1000 FPS).
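For the curious, the conversion I'm doing is just RTT to the frame rate whose period it fits inside (the RTT figures below are the rough values I measured; yours will differ):

    # Express a round-trip time as the highest frame rate whose
    # frame period still covers it.
    for where, rtt_ms in [("across the Atlantic", 33),
                          ("a few hundred km away", 10),
                          ("my ISP", 1)]:
        print("%s: %d ms RTT ~ %.0f FPS equivalent"
              % (where, rtt_ms, 1000.0 / rtt_ms))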
While I think latency will still be an issue to some degree, it's probably not nearly as bad as people imagine from their current experience, because this is quite a different architecture.
With current gaming systems, everyone's computer/console renders a version of the game with all known info. As a player acts, that info is sent to the server and distributed to the other participants. If one user is experiencing (or intentionally causing) latency in this exchange of data, there can be what amounts to a "collision" in the state of the game. These problems are solved in various ways (e.g. characters jumping between locations) that affect game play.
In this system, there is one central computer that is managing all happenings in the game and rendering out a unique view for each player. There will be no "collisions" of conflicting information because there are not multiple entities creating and modifying game actions.
I imagine any lag would primarily be experienced as your character responding to your actions with a delay, or as a dropped video signal. Both would definitely inhibit game play, but they should ONLY be apparent to the user experiencing the lag. A lagged user would just be a sitting duck for the other players, but wouldn't change the game play for anyone else. THAT would be a huge improvement.
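To make the distinction concrete, here's the centralized model as a toy Python sketch (the names and structure are mine, not anything OnLive has published):

    # One authoritative simulation, one writer of game state, one
    # rendered view per player. Nothing to reconcile, by construction.
    class GameServer:
        def __init__(self, players):
            self.state = {p: 0.0 for p in players}   # player -> x position

        def apply_input(self, player, dx):
            self.state[player] += dx        # single writer: no collisions

        def render_view(self, player):
            # In OnLive this step would be a rendered, compressed video frame.
            return "view for %s: %s" % (player, self.state)

    server = GameServer(["alice", "bob"])
    server.apply_input("alice", 1.0)        # input arrived over the wire
    for p in ("alice", "bob"):
        print(server.render_view(p))        # every client sees one truth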
You've got it backwards with lag. With local rendering power, minor lag can be magicked away as the local machine applies prediction algorithms to make a decent guess at what the game state will be in the next .25 seconds. With this system, every lag blip longer than one frame (I'll be generous and assume 30 FPS, and thus 33 ms, rather than 60 FPS) will manifest as a total game freeze, because every blip is logically equivalent to walking up to your real-life XBox and disconnecting the video cable, assuming a TV that would then hold the previous frame instead of dropping the picture.
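For reference, here's the kind of prediction that local rendering buys you, in toy form (dead reckoning; my own sketch, not any particular engine's code):

    # Dead reckoning: extrapolate from the last authoritative state
    # to hide the round trip, then correct when the server answers.
    def predict(pos, vel, dt):
        return pos + vel * dt          # assume velocity holds for dt seconds

    last_pos, last_vel = 100.0, 5.0    # last state received from the server
    print("shown now:", predict(last_pos, last_vel, 0.25))   # hides 250 ms
    # With server-side rendering the client holds pixels, not game
    # state, so there is nothing for it to extrapolate from.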
There's no way in hell this thing is going to play FPSs. Think "Bejeweled". With no animations, or very choppy ones.
It's a terrible, terrible design, suited solely for extracting money from credulous investors. If you are a credulous investor doing due diligence on this product and you've encountered this message: if they're promising anything other than casual, graphically unimpressive games, this is your cue to run.
Right now, even with broadband internet connections, players consistently experience game-ruining latency, and that is with games sending orders of magnitude less information.
I seriously doubt that OnLive could work in the U.S.; the broadband infrastructure just isn't there yet. Korea and Japan, on the other hand? Sign me up!
Latency and bandwidth are fundamentally distinct. In particular, if all the players in a game are on the same OnLive server, then there's no synchronization issue between systems as there usually is, only a latency issue from the renderer to your screen.
The beauty of it is that you don't need a console, but it makes up for that in the bandwidth needed. If my internet goes out I can still use my computer and play the Wii; and chances are that if my internet is out, the weather's not too good outside either, so I don't really see myself wanting something like this.
I don't know about other people, but I really do like tangible objects like discs, or even things stored on a hard drive, rather than a thin-client experience. Another thing: if this actually becomes big, one wonders how much quality would degrade as the servers gain load and lose performance. Would play quality go out the window at peak periods of the day? That's something you don't see with the current console model, even in multiplayer games.
"OnLive will supply players with a small set-top box, not much bigger than a Nintendo DS, which will plug into your TV and your home broadband connection."
You mean like a console?...
"OnLive also includes some features you might associate more with your DVR than with a gaming console, including a Replay feature that lets you save the last ten seconds of your gameplay, and send it to your friends."
Only the last 10 seconds? That's nearly useless. First, 10 seconds is really short. Second, you might not want to interrupt your gameplay immediately after accomplishing some feat.
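A fixed window like that is presumably just a ring buffer of recent frames, which costs constant memory no matter how long you play. A toy version (my sketch, assuming 30 FPS):

    # Keep only the most recent 10 seconds of frames; old ones fall
    # off the front automatically. A longer window just needs a
    # bigger buffer (or spilling to disk).
    from collections import deque

    FPS, SECONDS = 30, 10
    replay = deque(maxlen=FPS * SECONDS)

    for n in range(100000):            # hours of play, constant memory
        replay.append("frame %d" % n)
    print(len(replay), "frames kept, oldest:", replay[0])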
My major problem with this console is that the price difference between "a box that can feed the TV with a screen off the network and cope with all the associated problems thereof" and "a traditional console" is not likely to be wide enough to make the inevitable disadvantages worth it.
Cloud mania appears to have obscured the fact that personal computing power continues to grow. We are currently in a momentary I/O bobble that will resolve itself in a year or two with solid state drives (which, judging by their fast pace of development, seem eager to catch up to the rest of the computer), leaving only multicore to deal with, and putting this sort of thing in the cloud hardly avoids the multicore problem either.
By the time this thing comes out, the box plus one year of service will almost certainly cost more than an XBox 360, and it will probably be going head-to-head with the "Playstation 4" or the "XBox 720", with what is now the current gen readily available for $50. The PS4/XBox 720 will probably be bragging about pushing 120Hz of 1080p with surround sound.
My minor complaint is that this just won't work on a consumer-grade network, full stop. It will take years to make the deals with Comcast and everyone else to get the necessary QoS. Nobody is streaming video out in real time over the internet today. "Real time" here doesn't mean "real time minus two seconds" or "streaming a video with a 5-minute pre-buffer"; it means a solid, uninterrupted stream of data with no more than a handful of frames of latency, where the slightest network bobble means your game freezes solid.
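To put numbers on "a handful of frames" (my arithmetic, not the article's):

    # How long a network stall a tiny jitter buffer can hide.
    # Conventional streaming buffers seconds or minutes; here the
    # whole budget is a few frames.
    FPS = 30
    frame_ms = 1000.0 / FPS
    for buffered in (1, 2, 5):
        print("%d buffered frame(s): any stall over %.0f ms freezes the game"
              % (buffered, buffered * frame_ms))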
I imagine that's probably a limitation of the technology (and not an arbitrary design choice). Like most storage issues, it will probably become exponentially less of a problem over time.
They are also offering a browser plugin for your PC or Mac so you don't need their box. The box is only required if you want to use your TV and not your computer.
One thought I had as mainly a non-gamer is on the costs. Since this is being done in the cloud, it'll cost a bit of money to keep a game running - a company can't simply sell you the game, it now has to pay a bit for each hour you play. As a game gets older, it becomes less attractive for a company to support it.
Of course, I assume the monthly subscription fees could cover it, or perhaps a pay-by-the-drink model, but I wonder what will happen to the up-front costs.
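Illustrative arithmetic only; every number below is invented:

    # Toy cost model: ongoing server-hours vs. a one-time sale.
    COST_PER_SERVER_HOUR = 0.50   # hypothetical: GPU box, power, bandwidth
    RETAIL_PRICE = 60.00          # hypothetical one-time sale
    hours_played = 100            # a dedicated player over a game's lifetime
    print("serving cost: $%.2f vs. sale: $%.2f"
          % (COST_PER_SERVER_HOUR * hours_played, RETAIL_PRICE))
    # Long-lived games eat the margin unless subscriptions or
    # per-hour fees keep covering the server time.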
Latency is the biggest issue when it comes to competitive gaming. For the casual market it's fine, but it won't be popular with hardcore competitive players. It's also impossible for an up-and-coming player to reach a high level of technical skill with even the slightest bit of latency.
Latency is mostly a solved problem when it comes to high-speed connections, though. When cable was first released in my area there were high burst rates but bad latency, so back then I went with a DSL connection that had lower bandwidth but better latency. These days, though, the latency on the cable connection in my area is low enough that I can play FPS games perfectly fine.
From the article it sounds like they were able to play an FPS like Crysis just fine. FPS seems to be the gold standard: if you can run an FPS, you can play any other type of game, because other genres don't demand the same split-second reaction times.
It definitely sounds like an interesting system to try. I can't see how they could possibly stream HD content fast enough to make an FPS playable - but maybe that only works on a fiber connection.
> These days, though, the latency on the cable connection in my area is low enough that I can play FPS games perfectly fine.
You're not talking about the same kind of latency problem that these guys face.
It seems that this service is sending the player's input to the server, and sending back compressed rendered frames of video. So, the soonest you can see any of your actions is the round-trip time.
Current FPSes do a tremendous amount of simulation on the client. Only state information is sent back and forth, and not everything is even in lock-step with the server. It's a completely different architecture.
Latency is only "solved" in modern games by hacking around things on both the client and server, so that visual indicators are predicted on the client, and conflicting interactions (e.g. two players firing and killing each other simultaneously) are resolved by rolling back game state and examining the timestamps. It's unclean.
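Here's that rollback trick stripped to its bones (my sketch; real engines keep richer history and interpolate between snapshots):

    # Lag compensation: the server keeps a short history of where
    # everyone was, and rewinds to the shooter's timestamp before
    # testing a hit.
    history = {100: 10.0, 133: 12.0, 166: 14.0}   # time (ms) -> target x

    def resolve_shot(shot_time_ms, aim_x, tolerance=0.5):
        nearest = min(history, key=lambda t: abs(t - shot_time_ms))
        return abs(history[nearest] - aim_x) < tolerance

    # Fired at t=105, aiming where the target appeared on the
    # shooter's (slightly stale) screen.
    print("hit:", resolve_shot(105, 10.2))        # True after rewinding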
They're common in all games. Even if the ISPs got their act together (no sign of that), the stuff on the consumer's end is still a problem. Lots of ISPs distribute modems with significant problems. Lots of companies (Linksys, Belkin, and D-Link come to mind as offenders) sell wifi boxes that are not only pathetically slow, but come with absurd defaults that make them even slower (think 150ms+ ping on the WLAN).
Not to mention the huge variety of brokenness that happens with PCs, which they're going to have to deal with (they have a client for Windows/OS X as well as the set-top box).
To be honest, I would be amazed if latency didn't kill this project. Or turn it into a puzzle game platform.
This reminds me of The Phantom game console created by Infinium Labs. It promised to deliver games via the internet and to be compatible with PC games. The company blew through $60 million and the product was largely vaporware.
It was most likely too early to be a significant competitor to consoles.
Could latency be worked around akin to how VNC handles your mouse movements? I.e., it moves your mouse on the screen immediately and then lets the actual mouse on the remote computer catch up. (I think)
That's how games work today, combined with a bunch of prediction and conflict resolution. Unfortunately, it sounds like their architecture fundamentally prevents this: all the rendering is going to happen server-side. I'd be interested to know if they have some sort of halfway solution (rendering extra scenes that you can pick from?) or if they just hope no one will notice the extra 100ms.
(I know that doesn't sound like much, but for games that involve fast action- from starcraft to quake- it's a lot...)
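For what it's worth, the VNC trick boiled down looks like this (a sketch; the point is that it needs something the client can draw locally):

    # Local echo: draw the pointer from local input immediately and
    # let the remote frame contents catch up a round trip later.
    local_cursor = (120, 45)          # known instantly from the mouse
    stale_frame_cursor = (118, 43)    # position baked into the last frame
    print("overlay cursor at", local_cursor,
          "on a frame that still shows", stale_frame_cursor)
    # OnLive's client receives only finished video, so beyond a
    # cursor overlay there is no local state to echo against.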