File sizes and compression.

Do you think we’ll see better compression with Infinite? File sizes are getting crazy, and it’s gotten to the point where you want to hop onto Guardians because a playlist is back, only to find the game has vanished and needs reinstalling. I don’t expect much of a hard drive bump next gen, but 4K textures are auto-installed on the X.

If we look at Nintendo’s exclusives, they pack a lot of content into a tiny file, granted the assets are far less detailed for the most part. I think compression is an issue that gets overlooked this gen. Any thoughts or insights?

We’ll probably see Intelligent Delivery implemented deeply with Infinite. There are still plenty of people who play only campaign, only multiplayer, only Forge, or a combination of two of the three. That would definitely help cut down space used by a lot.

Regardless, it will be cool to see what new tech and features Xbox comes out with before Infinite’s release. They’ve been hitting home run after home run (with maybe a couple of missed swings) over the past year or two so there’s a lot to be excited for.

> 2533274824050480;2:
> We’ll probably see Intelligent Delivery implemented deeply with Infinite. There are still plenty of people who play only campaign, only multiplayer, only Forge, or a combination of two of the three. That would definitely help cut down space used by a lot.
>
> Regardless, it will be cool to see what new tech and features Xbox comes out with before Infinite’s release. They’ve been hitting home run after home run (with maybe a couple of missed swings) over the past year or two so there’s a lot to be excited for.

Agreed.

It would be great if there were some way for Microsoft to put pressure on the ISPs and push them to get rid of monthly data caps.
A lot of current-gen games are around 100 GB now, making a 1 TB monthly cap very limiting, never mind streaming video on top of that. Bits and bytes are not a finite resource and should not be treated as such.

I’m not sure how compression works, so I’m not gonna just say “make the game smaller, sing -Yoinks!-”. A lot of games are so huge this gen; I think RDR2 is over 100 GB? Absolutely nuts. Anyone care to give me a compression-for-dummies explanation of how it works?

> 2533274819029930;4:
> Would be great if there was some way for Microsoft to put pressure on the ISPs and push them to get rid of monthly data caps.
> A lot of current-gen games are around 100GB now, making a 1TB monthly cap very limiting, never mind streaming video on top of that. Bits and bytes are not a finite resource and should not be treated as such.

Not an issue for me but I hear ya.

> 2533275031939856;5:
> I’m not sure how compression works so I’m not gonna just say “make the game smaller, sing -Yoinks!-”. A lot of games are so huge this gen, I think RDR2 is over 100gb? Absolutely nuts. Anyone care to give me a compression for dummies explanation as to how it works?

This is fairly rudimentary, but since I’m unsure of your exposure to the terms and systems involved, it’s a good starting point: How Does File Compression Work?

It’s actually a really interesting topic. There’s an interview with Miyamoto that uses Pokémon Gold as a reference point that’s worth reading. I can’t recall where I read it, though; perhaps Shmuplations.

> 2535411919953126;7:
> > 2533275031939856;5:
> > I’m not sure how compression works so I’m not gonna just say “make the game smaller, sing -Yoinks!-”. A lot of games are so huge this gen, I think RDR2 is over 100gb? Absolutely nuts. Anyone care to give me a compression for dummies explanation as to how it works?
>
> This is fairly rudimentary but as I’m unsure of your exposure to terms and systems. How Does File Compression Work?
>
> It’s actually a real interesting topic there’s an interview with miyamoto that uses Pokémon gold as a reference point that’s worth reading. I can’t recall where I read it tho. Perhaps shumplations.

Thanks man. So it sounds like they’d really just need to put in the time to actually do it; not much magic going on. Hell, I feel like there could even be programs that do it for you.

> 2533275031939856;8:
> > 2535411919953126;7:
> > > 2533275031939856;5:
> > > I’m not sure how compression works so I’m not gonna just say “make the game smaller, sing -Yoinks!-”. A lot of games are so huge this gen, I think RDR2 is over 100gb? Absolutely nuts. Anyone care to give me a compression for dummies explanation as to how it works?
> >
> > This is fairly rudimentary but as I’m unsure of your exposure to terms and systems. How Does File Compression Work?
> >
> > It’s actually a real interesting topic there’s an interview with miyamoto that uses Pokémon gold as a reference point that’s worth reading. I can’t recall where I read it tho. Perhaps shumplations.
>
> Thanks man. So it sounds like they really would just need to put in the time to actually do it, not much magic going on. Hell I feel like there could be programs to even do it for you.

Ya, pretty much, although from what I’ve read (admittedly very little), Nintendo are currently using some wizardly proprietary software to condense their file sizes by insane amounts. I guess it was spurred by the high cost of the Switch’s cartridges and the limited space on the console/SD cards. But if you can claim that your console fits X more games in the same space as a PlayStation, then it might be worth Microsoft’s game studios investing in new compression software. Dunno, just my 2 cents.

This is a technical topic that’s difficult to make educated comments on without knowing the specifics of what sort of compression 343i is already using, or what kind of data constitutes the hundred or so gigabytes in modern games. The reality is that as games get bigger in scale and their assets get more detailed, file sizes will grow, even if we were able to use theoretically optimal compression. Unless Halo 5 had some huge, completely unnecessary assets taking up space, or unless Halo Infinite will be a significantly less ambitious game (both of which are highly unlikely), it will almost certainly have a larger file size than Halo 5.

> 2535411919953126;9:
> > 2533275031939856;8:
> > > 2535411919953126;7:
> > > > 2533275031939856;5:
> > > > I’m not sure how compression works so I’m not gonna just say “make the game smaller, sing -Yoinks!-”. A lot of games are so huge this gen, I think RDR2 is over 100gb? Absolutely nuts. Anyone care to give me a compression for dummies explanation as to how it works?
> > >
> > > This is fairly rudimentary but as I’m unsure of your exposure to terms and systems. How Does File Compression Work?
> > >
> > > It’s actually a real interesting topic there’s an interview with miyamoto that uses Pokémon gold as a reference point that’s worth reading. I can’t recall where I read it tho. Perhaps shumplations.
> >
> > Thanks man. So it sounds like they really would just need to put in the time to actually do it, not much magic going on. Hell I feel like there could be programs to even do it for you.
>
> Ya pretty much although from what I’ve read (admittedly very little) Nintendo are currently using some wizardly proprietary software to condense their file sizes by instant amounts. I guess it was spurred by the hight cost of the switches cartridges and the limited space on console/SDcards. But if you can claim that your console fits X amount more games on the same space as playstation then it might be worth Microsoft game studios investing in a new compression software. Dunno just my 2 cents.

For sure. I’d love not to have to go out and buy a $200 external drive just to keep my games installed, lol. With 4K becoming the norm they need to get on it; that, or the next consoles had better be huge. This gen, if you get a standard XB1 you can install like 5 big games and you’re capped. That’s mind-boggling and so limiting.

> 2533274825830455;10:
> This is a technical topic that’s difficult to make educated comments on without knowing the spcifics of what sort of compression 343i is already using, or what kind of data constitutes the hundred or so gigabytes in modern games. The reality is that as games get bigger in scale and their assets get more detailed, file sizes will grow, even if we were able to use theoretically optimal compression. Unless Halo 5 had some huge completely unnecessary assets taking spac, or unless Halo Infinite will be a significantly less ambitious game (both of which are highly unlikely), it will almost certainly have a larger file size than Halo 5.

Well, we can at least say that megatextures should be used more. We saw some larger textures in Halo 5, like in the Forge canvases, but they didn’t appear to be used much. Possibly it was an engine limitation; we might see better use of them in Slipspace. We also know that some of the worst offenders for large file sizes are the recent bumps in standard resolutions and the growing presence of online stores at retail. If you don’t have to pay for more discs to sell a physical game, you won’t care how much fits on one; hence the large day-one patches we get. These make compression less of a priority because it has no bearing on costs on the publishing side of the fence.

That said, id Tech games seem to be not so bad, presumably due to the engine’s megatexture-centric and mega-particle-centric design.

> 2535411919953126;12:
> > 2533274825830455;10:
> > This is a technical topic that’s difficult to make educated comments on without knowing the spcifics of what sort of compression 343i is already using, or what kind of data constitutes the hundred or so gigabytes in modern games. The reality is that as games get bigger in scale and their assets get more detailed, file sizes will grow, even if we were able to use theoretically optimal compression. Unless Halo 5 had some huge completely unnecessary assets taking spac, or unless Halo Infinite will be a significantly less ambitious game (both of which are highly unlikely), it will almost certainly have a larger file size than Halo 5.
>
> Well we can at least say that mega textures should be used more. We saw some larger textures in halo 5 like in the forge canvases but it didn’t appear to be used a lot. Possible it was an engine limitation we might see a better use in slipspace. We also know that one of the worst offenders for large file sizes is the recent bumps in standard s of resolution and the growth of online stores presence at retail. If you don’t have to pay for more discs to sell a physical game you won’t care how much fits. Hence the large day one patches we get. These make compression less of a priority because it has no bearing on the cost on publishing side of the fence.
>
> That said ID tech games seem to be not so bad and presumably due to the engines megatexture centric and mega particle centric design.

I don’t see how you think larger textures would help, since the more unique textures you have in the game, the more data you have. The MegaTexture technology in id Tech helps with the amount of data you need to have loaded into memory at any time, not the amount of total texture data you need to have. In fact, I remember the file size of Rage making some headlines back in the day, with John Carmack revealing that uncompressed Rage was about a terabyte in size. After all, the whole point of MegaTextures was that you’d be able to have these huge areas with unique textures without repetition, which will cost you in storage.

> 2533274825830455;13:
> > 2535411919953126;12:
> > > 2533274825830455;10:
> > > This is a technical topic that’s difficult to make educated comments on without knowing the spcifics of what sort of compression 343i is already using, or what kind of data constitutes the hundred or so gigabytes in modern games. The reality is that as games get bigger in scale and their assets get more detailed, file sizes will grow, even if we were able to use theoretically optimal compression. Unless Halo 5 had some huge completely unnecessary assets taking spac, or unless Halo Infinite will be a significantly less ambitious game (both of which are highly unlikely), it will almost certainly have a larger file size than Halo 5.
> >
> > Well we can at least say that mega textures should be used more. We saw some larger textures in halo 5 like in the forge canvases but it didn’t appear to be used a lot. Possible it was an engine limitation we might see a better use in slipspace. We also know that one of the worst offenders for large file sizes is the recent bumps in standard s of resolution and the growth of online stores presence at retail. If you don’t have to pay for more discs to sell a physical game you won’t care how much fits. Hence the large day one patches we get. These make compression less of a priority because it has no bearing on the cost on publishing side of the fence.
> >
> > That said ID tech games seem to be not so bad and presumably due to the engines megatexture centric and mega particle centric design.
>
> I don’t see how you think larger textures would help, since the more unique texture you have in the game, the more data you have. The MegaTexture technology in id Tech helps with the amount of data you need to have loaded to memory at any time, not the amount of total texture data you need to have. In fact, I remember the file size of Rage making some headlines back in the day with John Carmack revealing that uncompressed Rage was about a terabyte in size. After all, the whole point of MegaTextures was that you’d be able to have this huge areas with unique textures without repetition, which will cost you in storage.

You’re right, that was exactly the point. But if you consider an environment using one large texture plus a couple of staple small textures that are used frequently, you save a good chunk of memory compared to using large amounts of individually textured objects. It also has the benefit of making texture streaming less intensive, and any pop-in would happen simultaneously at the start of the session rather than occurring here and there, so there’s a definite benefit in keeping frames smooth. But if you have, say, 4-5 maps using a single megatexture, that’s going to go a long way toward helping compression while giving the impression the environment is organic and not repeating itself. This can be applied in creative ways to really get a lot out of one texture base. A good comparison is 4K Halo 5 to 4K Doom. Rage was big at the time, yes, but it also suffered from a troubled development and was using tech that was still in an early, almost experimental state. We might also be looking at larger environments for Infinite, where this will factor in.

> 2535411919953126;14:
> Your right that was exactly the point. But if you consider an environment using one large texture and then a couple staple small textures that are used frequently you will have a good chunk of memory saved as opposed to using large amounts of individually rendered objects. It also has the benefit of making texture streaming less intensive and any pop in would be simultaneous at the start of the session as opposed to a occuring hear and there. So there’s a definite benefit in keeping frames smooth. But if you have say 4-5 maps using a single megatexture that is gonna go a long way in helping the compression while giving the impression the environment is organic and not repeating itself.

I think you’re confused about what the MegaTexture technology (or virtual texturing in general) actually does. Its innovation is not the ability to use a single large texture. This is just a texture atlas. The point of virtual texturing is to get around the limitation on the amount of texture data that can be stored in video memory. It does nothing to the amount of texture data you have in the game overall. The total amount of textures you have in the game depends on how much surface you have that you need to texture, and how much you are willing to reuse textures for different surfaces. The latter is the space saving you’re probably thinking of, but this has nothing to do with virtual texturing.

EDIT: You can find some explanation and more sources on virtual texturing in this StackExchange thread.
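To make the residency point concrete, here’s a toy model of a virtual-texture page cache in Python. The tile size, cache capacity, and the tile-data placeholder are all made-up illustrative values (nothing here is id Tech’s actual implementation); the point is that the full texture lives in storage, and only the tiles actually sampled get streamed into a small cache standing in for video memory:

```python
from collections import OrderedDict

TILE = 128       # hypothetical tile size: 128x128 texels
CACHE_TILES = 4  # pretend "VRAM" only holds 4 tiles at once

class TilePageCache:
    """Toy least-recently-used cache of megatexture tiles."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.resident = OrderedDict()  # (tx, ty) -> tile data
        self.loads = 0                 # streaming requests hitting storage

    def _load_from_storage(self, tx: int, ty: int) -> str:
        self.loads += 1
        return f"tile({tx},{ty})"      # stand-in for real texel data

    def sample(self, u: int, v: int) -> str:
        key = (u // TILE, v // TILE)
        if key in self.resident:
            self.resident.move_to_end(key)         # mark recently used
        else:
            if len(self.resident) >= self.capacity:
                self.resident.popitem(last=False)  # evict the LRU tile
            self.resident[key] = self._load_from_storage(*key)
        return self.resident[key]

cache = TilePageCache(CACHE_TILES)
for u, v in [(0, 0), (5, 5), (200, 0), (0, 0)]:
    cache.sample(u, v)
print(cache.loads)  # 2: only two distinct tiles were ever streamed in
```

Note what this does and doesn’t buy you: repeated samples of nearby texels hit the cache instead of storage, but every unique tile in the megatexture still has to exist on disk, which is the storage-cost point made above.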

> 2533274825830455;15:
> > 2535411919953126;14:
> > Your right that was exactly the point. But if you consider an environment using one large texture and then a couple staple small textures that are used frequently you will have a good chunk of memory saved as opposed to using large amounts of individually rendered objects. It also has the benefit of making texture streaming less intensive and any pop in would be simultaneous at the start of the session as opposed to a occuring hear and there. So there’s a definite benefit in keeping frames smooth. But if you have say 4-5 maps using a single megatexture that is gonna go a long way in helping the compression while giving the impression the environment is organic and not repeating itself.
>
> I think you’re confused about what the MegaTexture technology (or virtual texturing in general) actually does. Its innovation is not the ability to use a single large texture. This is just a texture atlas. The point of virtual texturing is to get around the limitation on the amount of texture data that can be stored in video memory. It does nothing to the amount of texture data you have in the game overall. The total amount of textures you have in the game depends on how much surface you have that you need to texture, and how much you are willing to reuse textures for different surfaces. The latter is the space saving you’re probably thinking of, but this has nothing to do with virtual texturing.

Ah, my bad; that’s ringing bells somewhere in there. I was reading an article somewhere that mentioned using large-scale single-use texturing as a way of saving memory. I believe it was to do with large environments in MMOs. To be honest, I’m more versed in physics and audio and have little hands-on experience with texture mapping; I must have confused things along the way. I was aware of virtual texturing but had gotten the impression that megatextures included subbing out smaller fragments for a greater whole that could be compiled and scaled to reduce the overall storage required. My bad. So megatextures are essentially caching fragments in RAM? The whole thing relies on heuristic streaming of the textures?

> 2535411919953126;16:
> > 2533274825830455;15:
> > > 2535411919953126;14:
> > > Your right that was exactly the point. But if you consider an environment using one large texture and then a couple staple small textures that are used frequently you will have a good chunk of memory saved as opposed to using large amounts of individually rendered objects. It also has the benefit of making texture streaming less intensive and any pop in would be simultaneous at the start of the session as opposed to a occuring hear and there. So there’s a definite benefit in keeping frames smooth. But if you have say 4-5 maps using a single megatexture that is gonna go a long way in helping the compression while giving the impression the environment is organic and not repeating itself.
> >
> > I think you’re confused about what the MegaTexture technology (or virtual texturing in general) actually does. Its innovation is not the ability to use a single large texture. This is just a texture atlas. The point of virtual texturing is to get around the limitation on the amount of texture data that can be stored in video memory. It does nothing to the amount of texture data you have in the game overall. The total amount of textures you have in the game depends on how much surface you have that you need to texture, and how much you are willing to reuse textures for different surfaces. The latter is the space saving you’re probably thinking of, but this has nothing to do with virtual texturing.
>
> Ah my bad that’s ringing bells somewhere in there. I was reading an article somewhere that mentioned using large scale single use texturing as a way of saving memory. I believe it was to with large environment s in MMOs. To be honest I’m more versed in physics and audio, have little hands on experience with texture mapping I must of confused things along the way. I was aware of virtual texturing but had got the impression that mega textures included subbing out smaller fragments for a greater whole that could be complied and scaled to reduce the overall storage required. My bad. So megatextures is essentially caching fragments in ram? The whole thing relies on heuristic streaming of the textures?

That’s the main point of virtual textures: you have only the parts of the texture that need to be rendered immediately present in video memory, while the rest can be sitting in RAM or even in storage.

> 2535411919953126;1:
> Do you think we will see better compression with infinite. File sizes are getting crazy and it’s gotten to a point where you could want to hop onto guardians because a playlist is back only for it to have vanished once the games reinstalled. I don’t expect much of a hardrive bump next gen but 4k textures are auto installed on the x.
>
> If we look at Nintendo’s exclusives. They pack alot of content into a tiny file, granted the assets are far less detailed for the most part. I think compression is an issue that gets overlooked this gen. Any thoughts or insights?

Yeah, I think Nintendo’s art style lets them get away with murder when it comes to compressing data. Since a lot of textures/surfaces in Nintendo games rely on few colours and (generally) not super complicated placements of those colours, I could almost imagine them having super low-resolution textures that are scaled up, then, through processing (I would imagine basic blurs in most Nintendo games) and texture filtering, constructing the intended texture rather effortlessly. But the point here is that it’s totally down to the art style: if Halo Infinite were going to be greyscale, or the world were going to be constructed of flowing gradients, they could definitely have a much smaller game. Similarly, by having less detailed 3D models and storing the models as a series of parameterized instructions rather than a point cloud, size could once again be massively reduced. But the concessions that one would have to make to achieve this sort of result are massive for a game like Halo, with an established visual style.

So the other direction is actually improved, likely lossless, compression, and the problem with this is that decompressing data takes time and resources. There are a bunch of resources about this - especially for the *nix compression algorithms - and Microsoft even has a number of articles (blog posts) about it on MSDN. While not all of that is immediately applicable to a game - I doubt there’ll be a whole lot of compression done in Halo Infinite’s code - some of it is certainly relevant: there are benefits to having random access to segments of a compressed archive, and there are clearly benefits to performance from having compression that is fast to decompress.
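As a rough illustration of that speed-versus-ratio tradeoff, here’s zlib (DEFLATE, from Python’s standard library) run at a few compression levels over some invented repetitive data; the payload is made up for the demo, not real game data, and the exact numbers will vary by machine:

```python
import time
import zlib

# Synthetic payload: a repetitive chunk plus a run of zero bytes,
# loosely mimicking redundant asset data. Purely illustrative.
payload = (b"terrain_heightmap_row_" * 2000) + bytes(50_000)

for level in (1, 6, 9):
    start = time.perf_counter()
    packed = zlib.compress(payload, level)
    elapsed = time.perf_counter() - start
    # Lossless: decompression must restore the input byte-for-byte.
    assert zlib.decompress(packed) == payload
    print(f"level {level}: {len(packed):>6} bytes, {elapsed * 1e3:.2f} ms")
```

Level 1 trades ratio for speed and level 9 the reverse, which is exactly the knob a shipping game has to tune: a tighter archive saves disk and download size, but decompression (and especially compression at build time) costs CPU.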

Now, obviously, compared to Halo 4 (14 or so GB), Halo 5 was, even at launch, a monster: ~50 GB then, and now ~100 GB. Halo 5 is almost an order of magnitude larger than its predecessor. However, I would have thought that most of the upgrades in Halo Infinite are going to be code changes - such as AI improvements. While navmeshes for AI certainly aren’t small, those improvements are very unlikely to bring the game to 200 GB.

Also, given what 343 has said about how they “can’t really cram any more content into Halo 5” (if I recall, in response to people asking for new Christmas emblems in their Christmas live stream), I would also theorise that the 50 GB increase in size from launch to now wasn’t super deliberate, but was at least partially a result of poor planning for content delivery systems and the like.

> 2533274956613084;18:
> > 2535411919953126;1:
> > Do you think we will see better compression with infinite. File sizes are getting crazy and it’s gotten to a point where you could want to hop onto guardians because a playlist is back only for it to have vanished once the games reinstalled. I don’t expect much of a hardrive bump next gen but 4k textures are auto installed on the x.
> >
> > If we look at Nintendo’s exclusives. They pack alot of content into a tiny file, granted the assets are far less detailed for the most part. I think compression is an issue that gets overlooked this gen. Any thoughts or insights?
>
> Yeah I think Nintendo’s art style lets them get away with murder, when it comes to compressing data. Since a lot of textures / surfaces in Nintendo games rely on few colours, and (generally), not super complicated placements of these colours, I could almost imagine them having super low resolution textures that are scaled up, then through processing (I would imagine basic blurs in most Nintendo games) and texture filtering, constructing the intended texture rather effortlessly. But the point here is that it’s totally down to the art style: If Halo Infinite was going to be greyscale, or the world was going to be constructed of flowing gradients, they could definitely have a much smaller game. Similarly, by having less detailed 3D models, and stored the model as a series of parameterized instructions rather than a point cloud, size could once again be massively reduced. But the concessions that one would have to make to achieve this sort of result are massive for a game like Halo, with an established visual style.
>
> So the other direction is actually improved, likely lossless, compression, and the problem with this is, decompressing data takes time, and resources. There’s a bunch of resources about this - especially for the *nix compression algorithms - Microsoft even has a number of articles (blog posts) about it on MSDN. While not all of that is immediately applicable to a game - I doubt there’ll be a whole lot of compression done in Halo Infinite’s code - some of it is certainly relevant: there are benefits to having random access to segments of a compressed folder, and there are clearly benefits to performance of having compression that is fast to decompress.
>
> Now obviously compared to Halo 4 (14 or so GB), Halo 5 was, even at launch a monster - ~50GB then, and now ~100GB, Halo 5 is almost an order of magnitudes larger than it’s predecessor. However I would have thought, that most of the upgrades to Halo Infinite are going to be code changes - such as AI improvements. While navmeshes for AI certainly aren’t small, their improvements are very unlikely to bring the game to 200GB.
>
> Also given what 343 has said about how they “can’t really cram anymore content into Halo 5” (if I recall in response to people asking for new Christmas emblems in their Christmas live stream), I would also theorise that the 50GB increase in size from launch to now wasn’t super deliberate, but was at least partially a result of poor planning for content delivery systems and the like.

Ya, the big N don’t have the most detailed art styles, and the simple palettes, I’m sure, go a long way.

Lossless would have issues keeping things smooth. Still, we know files are only getting bigger, so I hope we can find some solutions to help the issue. Storage is becoming the biggest grievance with console gaming. I guess that’s why we’re getting the xCloud streaming platform.

> 2535411919953126;19:
> Lossless would have issues keeping things smooth. Still we know files only getting bigger so I hope we can find some solutions to help the issue. Storage is becoming the biggest grievance with console gaming. I guess that’s why we’re gettlge the X cloud streaming platform.

Yeah, I just very highly doubt that we’ll see as large a leap in size from this generation to the next, even if nothing is specifically done to avoid it. I’d have thought that the next Xbox will herald more improvements in the technical backends of games than in the graphics throughput it has (and thus the size of textures/bump maps/models). The only case in which I could imagine Halo Infinite being much larger than ~100 GB at launch is if the game’s levels are just significantly bigger, and that’s a thing I would welcome with open arms.