Phil and 343 reps have been hyping the Xbox One and Halo 5 on Twitter with statements like ‘we’re bringing innovation’, talk of scope and scale, and (not exactly Halo 5, but Phil said) “team has been testing ray tracing using the cloud. Really amazing visuals”. So I really wonder: if the leaks and rumors suggest that we can expect “200 Brutes with Wraiths, Banshees, Scarabs” and everything running around in real time at 60fps, why on earth can they still not achieve 1080p??
I KNOW that they have not confirmed a resolution for the final product and that the game could be 1080p, BUT if it were going to be 1080p they would have confidently said it’s going to be 1080p at launch. More than likely the game will be 900p or have some sort of dynamic scaling ‘up to 1080p’. What messes with my mind is this: if Halo 5 can have all of what I said above in real time at 60fps on the power of the almighty cloud, how on earth can they not get 1080p??
Since E3 2013, Phil has been saying on Twitter that they have teams working on ray-traced visuals and HUGE open-world sandboxes running in the cloud, with thousands of things being computed in the cloud and then streamed to your console in real time on Xbox One. BUT games are still not at 1080p???
My theory is that no matter how amazing the cloud computing really is, there is something in the Xbox One hardware that doesn’t quite allow for a full 1080p output buffer, no matter how many things you have computed in the cloud. I know games have come out in full 1080p BEFORE, BUT not without sacrificing something, e.g. Forza (a racing game) has repeated textures, and GTA V has last-gen textures with amazing Rockstar optimization. AND those games are both at 30fps.
Do you guys know what I mean, or am I not making it easy to follow?
> 2533274832290972;2:
> I really don’t care if the resolution is 1080p or 900p. As long as it’s solid 60 fps and has good textures and lighting I’m good.
Same as me but what I don’t get is how can the game not be 1080p yet still have this ‘amazing scope and scale and hundreds of enemies on screen at once with these amazing visuals’??
Pixel density does not always equal amazing visuals. Look at all the indie games that run in 1080p. Not to say they are worse or better, but with a more simplistic game and less individual items on screen it is much easier to get 1080p resolution. If you have more things on screen, better lighting effects, textures, etc. it may still look better at a lower resolution, but forcing the game to run at 1080p could require more power than is available, causing performance issues.
A lot of the time, devs would rather make what you see amazing as opposed to hitting some magic pixel number.
They could give you 1080p…or greater draw distances, larger environments, more objects on screen, better lighting.
You can sort of see it as a triangle: frame rate, resolution, or graphics engine power (draw distances, lighting, etc.). Focusing on one pulls away from the other two.
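That triangle can be sketched with toy numbers: treat the GPU as a fixed per-second budget that pixel count, frame rate, and per-pixel effects cost all have to share. Every number below is invented purely for illustration, not a real Xbox One benchmark:

```python
# Illustrative only: made-up numbers, not real hardware figures.
# Idea: pixels-per-frame * frames-per-second * per-pixel shading cost
# must fit inside a fixed GPU budget.

GPU_BUDGET = 1920 * 1080 * 60 * 1.0  # arbitrary units: exactly enough for 1080p60 at cost 1.0

def max_fps(width, height, shading_cost):
    """Frames per second the budget allows at a given resolution and per-pixel cost."""
    return GPU_BUDGET / (width * height * shading_cost)

print(max_fps(1920, 1080, 1.0))  # 1080p, baseline effects -> 60.0 fps
print(max_fps(1920, 1080, 1.5))  # 1080p, heavier lighting -> 40.0 fps
print(max_fps(1600, 900, 1.5))   # drop to 900p to claw frame rate back -> 57.6 fps
```

The third call is the whole "triangle" in one line: heavier effects ate a third of the frame rate, and lowering the resolution bought most of it back.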
> 2533274949956561;3:
> > 2533274832290972;2:
> > I really don’t care if the resolution is 1080p or 900p. As long as it’s solid 60 fps and has good textures and lighting I’m good.
>
>
> Same as me but what I don’t get is how can the game not be 1080p yet still have this ‘amazing scope and scale and hundreds of enemies on screen at once with these amazing visuals’??
Resolution and framerate are largely dependent on the Xbone’s GPU, which is pretty terrible by today’s standards. Having a ton of AI running around and huge open worlds is (AFAIK) less dependent on the GPU and more on the CPU and maybe system memory, particularly if they can somehow offload some of this processing onto the “cloud,” although I don’t know if that’s even possible.
Also keep in mind that building a game with all these bells and whistles involves trade-offs. So you say “how can there be hundreds of enemies and amazing scope and scale and not 1080p?” Well that would be because the developers are putting more hardware resources into more enemies instead of resolution. They could do 1080p if they wanted, but other aspects of the game would have to be dialed way back.
I will be surprised if Halo 5 isn’t 1080p / 60 FPS. They have been using those terms so much to push MCC that it would only be perceived as a step backwards to drop to 900p / 60 FPS. I am sure they are doing ridiculous optimization work at 343i, and if they have to tone down certain effects or textures or enemies to hit that resolution, I figure they probably will.
First things first, the odds of 343 Industries using cloud power are next to zero. After the always-online fiasco, only a multiplayer-only game like Titanfall could get away with cloud power. Halo 5 of all things would be trashed the second they made that announcement. Anyone who doesn’t have a good enough internet connection, doesn’t like playing online, or has some kind of restriction on internet usage would riot, because using cloud power means the game would have to be always online. And since that’s the case, the developers will have to rely solely on what’s inside the Xbone.
Secondly, it is entirely possible to hit 1080p60 on the Xbox One with a state-of-the-art game. The problem lies with the hardware Microsoft chose to go with. It’s that pesky ESRAM that is troubling developers. From what I read it isn’t very easy to use, but it may or may not solve the resolution/frame rate problem if utilized correctly. Supposedly DirectX 12 will help developers with using that ESRAM. I’m no tech expert, so take all that with a grain of salt, but that’s what I read.
Either way, I’m sure 343 will do all they can to reach 1080p60 by launch; if not, then 900p, which is perfectly fine. The beta ran at 720p and it was still gorgeous. Also, I wouldn’t use Halo 4 as a measuring stick. Halo 2 Anniversary, both the campaign and multiplayer, had a leg up graphically over Halo 4. Halo 4 is a beautiful game, and it does indeed look next-gen at 1080p, but it wasn’t built for next-gen systems and wasn’t upgraded in any other way. Halo 2 Anniversary was built for next-gen, so it makes sense that its graphics would be better than Halo 4’s. The multiplayer hit 1080p60 when playing alone, but add another player via split-screen and problems started to appear. The campaign, on the other hand, was 1328x1080 or something like that. It wasn’t full 1080p. Why? Because it was running two game engines simultaneously, which was required for the instant switch. If they had taken the instant switch out, 1080p60 would have been a reality.
It’s all about optimization at this point. Though going forward I am still a tad skeptical that games will maintain 1080p60, even on PS4. Ready at Dawn probably won’t admit it, but the 30fps was not a design choice. To get those graphics they probably had to sacrifice the frame rate.
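For what it’s worth, the raw pixel math behind the resolutions this thread keeps mentioning (including that reported 1328x1080 H2A campaign figure) is easy to check:

```python
# Raw pixel counts for the resolutions discussed in this thread.
def pixels(w, h):
    return w * h

full_hd = pixels(1920, 1080)  # 2,073,600 pixels
p900    = pixels(1600, 900)   # 1,440,000 pixels
p720    = pixels(1280, 720)   #   921,600 pixels
h2a     = pixels(1328, 1080)  # 1,434,240 pixels (the reported H2A campaign figure)

print(full_hd / p900)  # 1.44 -> 1080p pushes 44% more pixels than 900p
print(full_hd / p720)  # 2.25 -> and 2.25x as many as 720p
print(h2a / full_hd)   # ~0.69 -> the H2A campaign rendered ~69% of a full 1080p frame
```

So the jump from 900p to 1080p is a 44% increase in pixels per frame, which is roughly the extra GPU work people are arguing about here.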
I can’t believe that some people still expect 60fps AND native 1080p for H5 - comparing it to the MCC shows that those people don’t know what they’re talking about.
343i had problems getting all the games in the MCC to 1080p (see H2A) - and now you expect H5, which will be a much more taxing game (models, lighting, effects - most likely scale, enemies on screen), to be in 1080p?
Just a small reminder: The H5 Beta was 720p.
Mark my words: If you’re hoping for 1080p you’ll be disappointed for sure.
I’ll never know the difference between 1080p and 900p. If the game turns out to suck, you’re not going to say “well, at least it has a high resolution”. I’d rather they make an excellent game first and then worry about resolution.
As long as it’s 60fps, I wouldn’t really care if it were 900p or 1080p. I went back to the 360 to play some Halo on there and came to the conclusion that I just can’t handle 30fps anymore (sadly).
900p upscaled to 1080p would be perfectly fine
and 720p upscaled to 1080p would also be fine
I just want it to be 60fps with LOTS of graphical enhancements in the campaign - huge, insane battles with hundreds of AI at the same time. I guess they could make the multiplayer 1080p but the campaign 720p; I would be perfectly A-OK with that. I mean, the beta looked clean on my 24-inch Full HD monitor! I also played it on my 50" and it still looked GOOD!
> 2533274949956561;3:
> > 2533274832290972;2:
> > I really don’t care if the resolution is 1080p or 900p. As long as it’s solid 60 fps and has good textures and lighting I’m good.
>
>
> Same as me but what I don’t get is how can the game not be 1080p yet still have this ‘amazing scope and scale and hundreds of enemies on screen at once with these amazing visuals’??
Upscaling is a powerful tool. Besides, a game can be 720p and still have ‘amazing scope and scale with hundreds of enemies on screen at once’. I play on PS4, Xbox One, and PC, and I’m telling you honestly, there is only an ever so slight difference between 900p and 1080p. It’s nigh impossible to see.
> 2533274800197828;9:
> > 2533274949956561;3:
> > > 2533274832290972;2:
> > > I really don’t care if the resolution is 1080p or 900p. As long as it’s solid 60 fps and has good textures and lighting I’m good.
> >
> >
> > Same as me but what I don’t get is how can the game not be 1080p yet still have this ‘amazing scope and scale and hundreds of enemies on screen at once with these amazing visuals’??
>
>
> Resolution and framerate are largely dependent on the Xbone’s GPU, which is pretty terrible by today’s standards. Having a ton of AI running around and huge open worlds is (AFAIK) less dependent on the GPU and more on the CPU and maybe system memory, particularly if they can somehow offload some of this processing onto the “cloud,” although I don’t know if that’s even possible.
>
> Also keep in mind that building a game with all these bells and whistles involves trade-offs. So you say “how can there be hundreds of enemies and amazing scope and scale and not 1080p?” Well that would be because the developers are putting more hardware resources into more enemies instead of resolution. They could do 1080p if they wanted, but other aspects of the game would have to be dialed way back.
>
> I will be surprised if Halo 5 isn’t 1080p / 60 FPS. They have been using those terms so much to push MCC that it would only be perceived as a step backwards to drop to 900p / 60 FPS. I am sure they are doing ridiculous optimization work at 343i, and if they have to tone down certain effects or textures or enemies to hit that resolution, I figure they probably will.
It’s actually the CPU that’s the bottleneck. But that’s why they’re pushing for cloud computing: to offload work from the CPU, which is otherwise going to limit A.I. and scripts, among other things.
> 2533274813583846;19:
> It’s actually the CPU that’s the bottleneck.
Care to explain why that would be the case? My understanding of how the processing pipeline of a game works is (crudely) that the game state is first calculated on the CPU (e.g. object locations/states), after which the task is handed to the GPU to render based on the updated state. All the rendering, including drawing the final frame which is sent to the display, is done on the GPU. Then the maximum available resolution is purely a limitation of a) the speed at which the GPU can draw the frame, b) the size of whatever buffers the data is stored in before it’s sent to the display, and c) the available bandwidth between memory, GPU, and display.
I don’t see how the CPU could be a bottleneck for the resolution, unless there’s something in the rendering pipeline I haven’t understood.
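To make the CPU-vs-GPU question concrete, here is a toy model of that pipeline: CPU cost scales with the number of AI to update, GPU cost scales with the number of pixels to draw, and (since the two stages overlap) the slower one sets the frame time. All timings and scaling factors are invented for illustration; nothing here reflects real Xbox One hardware:

```python
# Crude model of the CPU/GPU split described above. All numbers are made up.

def frame_time_ms(num_ai, width, height):
    cpu_ms = 2.0 + 0.05 * num_ai            # state update scales with AI count
    gpu_ms = (width * height) / 200_000.0   # rendering scales with pixel count
    # CPU and GPU work overlap in a real pipeline; the slower stage sets the pace.
    return max(cpu_ms, gpu_ms)

budget_60fps = 1000.0 / 60.0  # ~16.7 ms per frame

# A handful of enemies at 1080p: GPU-bound, comfortably inside the 60fps budget.
print(frame_time_ms(20, 1920, 1080))   # ~10.4 ms
# Hundreds of AI: now the CPU sets the frame time instead...
print(frame_time_ms(400, 1920, 1080))  # 22.0 ms, over budget
# ...and dropping the resolution buys back nothing, because the GPU wasn't the limit.
print(frame_time_ms(400, 1600, 900))   # still 22.0 ms
```

That last pair of calls is the claim in a nutshell: once the CPU is the bottleneck, resolution changes stop mattering, which is consistent with both posts above - the GPU governs resolution, but a CPU-bound game can still miss 60fps at any resolution.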