
For a networked multiplayer game, the total amount of network lag is effectively the same; it just moves from sitting between client and server to sitting between client and "display". Since the server and client are now a single instance, there is zero lag between them.


Considering that pretty much all multiplayer games in existence perform some sort of local prediction / state interpolation to hide the lag on the local machine, cloud-only multiplayer will be considerably worse. After all, you can no longer hide the lag locally, since you're not computing anything on the local machine, so the minimum perceived lag goes from 0 (for movement of your own player character) to the RTT between the datacenter and your PC :/


Not entirely true. Many networked games use client-side prediction and local collision detection (at least for collisions against terrain and walls; weapon hits are often computed on the server to curb cheating).

If you have a 100ms latency to a Stadia server, then there's a 200ms delay between pushing forward on the stick and seeing your character move. This is not the case in a networked game: the client starts moving your player immediately (even though there's a 100ms delay before the server sees those inputs).
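The mechanic described above can be sketched in a few lines. This is a toy model, not any real engine's code: all names (`Client`, `apply_input`, the fixed `SPEED`) are illustrative assumptions. The key ideas are that client and server share deterministic movement code, the client applies inputs immediately, and when an authoritative server state arrives it rewinds and replays unacknowledged inputs.

```python
# Toy client-side prediction with server reconciliation.
# All names and numbers here are illustrative, not from a real engine.
from dataclasses import dataclass, field

SPEED = 5.0  # movement per input, assumed units per tick


@dataclass
class Input:
    seq: int    # sequence number, so the server can ack inputs
    dx: float   # -1.0, 0.0, or 1.0


def apply_input(x: float, inp: Input) -> float:
    """Shared movement code: client and server both run this deterministically."""
    return x + inp.dx * SPEED


@dataclass
class Client:
    x: float = 0.0
    pending: list = field(default_factory=list)  # inputs not yet acked by server

    def press_forward(self, seq: int) -> None:
        inp = Input(seq, dx=1.0)
        self.pending.append(inp)
        # Predict immediately: the player sees movement with zero local lag,
        # even though the server won't see this input for ~RTT/2.
        self.x = apply_input(self.x, inp)

    def on_server_state(self, server_x: float, last_acked_seq: int) -> None:
        # Reconcile: adopt the authoritative position, drop acked inputs,
        # then replay every input the server hasn't processed yet.
        self.pending = [i for i in self.pending if i.seq > last_acked_seq]
        self.x = server_x
        for inp in self.pending:
            self.x = apply_input(self.x, inp)


client = Client()
client.press_forward(1)
client.press_forward(2)
print(client.x)  # 10.0 -- moved instantly, no round trip needed

# ~RTT later, the server acks input 1 with its authoritative position;
# the client rewinds to it and replays the still-pending input 2.
client.on_server_state(server_x=5.0, last_acked_seq=1)
print(client.x)  # 10.0 -- prediction and server authority agree
```

With pure cloud streaming there is no local simulation to run this loop on, so nothing can be predicted and the full round trip is always visible to the player.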


Except that's not actually the case, as you still have latency between servers.


This is only relevant if you're deliberately splitting the game across multiple servers (MORE work) so that people living in different cities can play against each other and have low latency to their own server. Other than that there's no reason to run one multiplayer game on more than one server.




