Why not 1080p with cloud computing?

OK, I’ve seen this question multiple times: on YouTube, on IGN, here, etc. If the cloud is so powerful, and is being advertised as more powerful than an Nvidia GTX Titan, why isn’t Warzone 1080p?

I’ll try to answer to the best of my ability. GPUs are used because, unlike CPUs, they are very good at computing trigonometric functions. For example: an FMUL (a floating-point multiplication) on Intel’s x87 architecture has a latency of 8, while FCOS (the cosine function) has a latency of 180–280. The FTAN (tangent) instruction is even worse at 240–300. A GPU, on the other hand, has a latency of 26 for a multiply (higher than the CPU), but only 48 for cosine (on the old Nvidia GT200, I know) and only 98 for tangent! All this is real data.
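If you want a feel for the gap yourself, here’s a rough timing sketch you can run on your own CPU (my own illustration, not the source of the numbers above; modern compilers usually call a library cosine rather than the x87 FCOS instruction, so expect different figures):

```cpp
#include <chrono>
#include <cmath>
#include <cstdio>

// Dependency-chain timing: each iteration feeds its result into the
// next operation, so the loop is bounded by the operation's latency.
template <typename Op>
double ns_per_op(Op op, int iters) {
    double x = 0.5;
    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < iters; ++i) x = op(x);
    auto t1 = std::chrono::steady_clock::now();
    std::printf("(checksum %f) ", x);  // keeps the compiler from deleting the loop
    return std::chrono::duration<double, std::nano>(t1 - t0).count() / iters;
}

int main() {
    const int n = 10000000;
    double mul = ns_per_op([](double x) { return x * 1.0000001; }, n);
    double cs  = ns_per_op([](double x) { return std::cos(x); }, n);
    std::printf("\nmul: %.2f ns/op, cos: %.2f ns/op\n", mul, cs);
}
```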
Also, while current CPUs have on average 4 cores with two threads each (so you can make 8 calculations at once), GPUs have hundreds of cores, though very specialized ones. That’s why GPU processing is called parallel processing.
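As a minimal sketch of what that looks like (CUDA, with made-up names, just to illustrate the idea): instead of looping over the data on a few CPU cores, you give every element its own GPU thread:

```cpp
#include <cuda_runtime.h>

// One GPU thread per element: the work is spread across all of the
// GPU's cores at once instead of looping on a handful of CPU cores.
__global__ void scale(float* data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

// Launch with one thread per element, e.g. for n floats in d:
//   scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
```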

So if you want to make a calculation that requires a sum, division, multiplication, or subtraction, you use the CPU for better performance. But if you want to make calculations that use trigonometric functions, you use the GPU. Trigonometric functions are used for lots of things: when you rotate your camera in a video game, you are applying trigonometric functions to the world, because you are changing the angle. Other things use SIN, COS, and TAN too. For example, they are used in physics to calculate rotations and collisions of objects. So GPUs are used not only for graphics, resolution, and framerate, but for physics as well.
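For example, rotating a point by an angle theta is just sines and cosines; here’s the standard 2D rotation, sketched in C++:

```cpp
#include <cmath>

struct Vec2 { float x, y; };

// Standard 2D rotation by theta radians around the origin; this is
// the kind of sin/cos math a camera or physics step runs constantly.
Vec2 rotate(Vec2 p, float theta) {
    float c = std::cos(theta), s = std::sin(theta);
    return { p.x * c - p.y * s,
             p.x * s + p.y * c };
}
```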

The cloud has a fundamental restriction. While its processing power is almost limitless (you can have several servers working for one Xbox), it has a high delay, a lag. So things that have to be calculated immediately need to be rendered on the Xbox. Only things that need a massive amount of processing power and don’t require low latency can be uploaded to the cloud to be processed. For example, physics and AI are no problem, but player movement and hit detection would be terrible because of the delay.

The cloud can indeed improve graphics, not by processing all of your game, but by offloading things that would normally be done on your GPU to an external GPU (same for the CPU). If the GPU of your console is not very good (the Xbox’s is mid-low range), you still won’t be able to get a constant 1080p even with the cloud. What you can get is massive worlds, very good AI, a very, very high number of AI in your game, incredible physics (like Crackdown showed us), very good draw distance, and effects like fog, etc. So the cloud is indeed like having a GTX Titan, only dedicated to physics and high-latency processes.
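If it helps, you can think of the rule like this hypothetical sketch (made-up names, not any real engine API): a job can go to the cloud only if its deadline is comfortably bigger than the round trip to the server:

```cpp
// Hypothetical rule of thumb: a job may be offloaded to the cloud only
// if its deadline comfortably exceeds the server round trip.
struct Job {
    double deadline_ms;  // how soon the result must be back
};

bool can_offload(const Job& job, double rtt_ms) {
    // Leave headroom for server compute time and network jitter.
    return job.deadline_ms > 2.0 * rtt_ms;
}

// Hit detection: deadline ~16 ms, RTT ~50 ms -> must stay on the console.
// Background physics or AI planning: deadline ~500 ms -> cloud is fine.
```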

Finally, Warzone uses the cloud for AI and physics, and the same goes for online co-op. So that’s all. Hope it helps clear up some of your doubts about the subject.

I don’t think it’s Warzone that’s not at 60 fps, it’s the campaign cinematics, and that’s for obvious reasons. For cloud computing you’ll need a constant connection to XBL, and for some people that’s stupid or really difficult.

> 2533274819075101;2:
> I don’t think it’s Warzone that’s not at 60 fps, it’s the campaign cinematics, and that’s for obvious reasons. For cloud computing you’ll need a constant connection to XBL, and for some people that’s stupid or really difficult.

The FPS is stable at 60, but apparently drops to 30 in cutscenes. But I never talked about FPS anyway XD

Could you explain why GPUs compute trigonometric functions faster? I suppose that would be true if the algorithms can be parallelized well, but I was always under the assumption that GPUs are used in geometry computations not because individual geometric transformations are faster, but because of the sheer mass of transformations that can be parallelized easily.

Also, I’m a bit confused by these latency numbers. What are the units? I suppose they’re clock cycles, since you’re referring to x86 (x87?). And where does this data come from? Do you have a source, or is this from your own testing?

At the end of the day, I’m a bit lost as to what computing trigonometric functions has to do with cloud computing, because, as you pointed out, cloud computing is only viable for computations that aren’t time sensitive. So graphics computations are out of the question not because of trigonometric functions, but because you want the result of your computations back for the next frame, which means round trip time + computation + overhead < 17 ms, which just isn’t viable.
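To put rough numbers on that (the round trip here is just an assumed, fairly optimistic figure):

```cpp
#include <cstdio>

int main() {
    const double frame_budget_ms = 1000.0 / 60.0;  // ~16.7 ms per frame at 60 fps
    const double rtt_ms = 40.0;                    // assumed, optimistic round trip
    const double server_ms = 2.0;                  // assumed server-side compute
    std::printf("frame budget: %.1f ms, cloud cost: %.1f ms\n",
                frame_budget_ms, rtt_ms + server_ms);
    // The round trip alone already exceeds the per-frame budget, so
    // per-frame rendering work can't leave the console.
}
```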

I’d rather not have to deal with cloud streaming for better resolution when I want to be able to play the game offline.