My question is simple:
Is it still 30hz in BTB and 60hz in 4v4 ?
Honestly it should be 60hz everywhere at least and 120hz in ranked.
Yes, it is still 30hz for BTB and 60hz for 4v4.
I also agree, ranked should be 120hz.
I thought they added the option to play 120hz 4v4?
Found this
https://www.ign.com/articles/halo-infinite-multiplayer-now-supports-120hz-on-series-s
If the server tick rate is 30 or whatever, it absolutely doesn’t matter if you play at 69394649399394 Hz.
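To make the point concrete, here is a back-of-envelope sketch of why client refresh rate can't outrun the server. The 30hz/120 numbers are the rates claimed in this thread, not confirmed values, and the function name is just for illustration:

```python
# Back-of-envelope: server tick interval vs client frame interval.
# Rates below are the ones claimed in this thread, not confirmed values.
def interval_ms(rate_hz: float) -> float:
    """Time between updates at a given rate, in milliseconds."""
    return 1000.0 / rate_hz

server_tick = 30    # claimed BTB tick rate
client_fps = 120    # client render rate

print(interval_ms(server_tick))  # ~33.3 ms between server updates
print(interval_ms(client_fps))   # ~8.3 ms between rendered frames

# At 120 fps the client draws ~4 frames for every server update, so the
# extra frames smooth the picture but cannot add server-side hit checks
# between ticks.
frames_per_tick = client_fps / server_tick
print(frames_per_tick)  # 4.0
```

In other words, everything the client renders between two ticks is interpolation; the server only ever sees the world ~33 ms at a time.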
I get that it doesn’t matter what monitor I have, but when you can suddenly change the option in game, you would think support for 120 has been added.
Although I haven’t tested it in any way.
Your monitor has nothing to do with this.
You’re talking about image on your screen.
I was asking how many updates (not images) the servers (not your computer) can process per second. It contributes a lot to the feeling of good hit detection.
https://win.gg/news/heres-what-tick-rate-means-and-which-games-have-the-highest/
How you see the game is very important no matter the speed of communication between you and the server.
Playing 30 fps compared to playing 120 fps is so different. One is horrible and the other one is smooth.
Tick rate is more about hit detection, as I said, and hit timing in 1v1.
Of course your local performance is very important, but it does not matter in a multiplayer game if the underlying networking architecture, whether by design or by circumstance, is total crap.
My 140Hz does not matter if the game has random packet loss. It’s also not exclusively hit detection that is affected.
Tick rate determines how fast the server updates information, and there is a lot more going on than shooting and hitting.
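That last point is worth spelling out: in a fixed-tick server loop, every piece of shared state advances once per tick, so the tick rate caps how often all of it updates, not just hit checks. A minimal sketch (illustrative only, not 343i's actual code; the 30hz figure is the rate claimed in this thread):

```python
# Minimal sketch of a fixed-tick server loop (illustrative, not 343i's code).
# Movement, projectiles, pickups, and hit registration would all advance
# inside simulate(), so a 30hz tick caps how often ALL of it updates.
TICK_RATE = 30                # ticks per second (claimed BTB rate)
TICK_DT = 1.0 / TICK_RATE     # seconds of game time per tick

def simulate(state: dict, dt: float) -> None:
    # Placeholder for movement, projectiles, hit registration, etc.
    state["time"] += dt
    state["ticks"] += 1

state = {"time": 0.0, "ticks": 0}
for _ in range(TICK_RATE):    # simulate one second of server time
    simulate(state, TICK_DT)

print(state["ticks"])  # 30 updates per simulated second
```

Whatever happens between two ticks (a grenade bounce, a melee trade, a flag grab) only gets resolved at the next tick boundary.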
Hard agree on that 30 FPS bs; that really needs to die.
I’m pretty sure 343, like a lot of other game devs, forces FPS limits like this just so older generations of hardware can still have a ‘fair’ chance against newer consoles/PCs, newer TV or monitor refresh rates, elite controllers, etc.
I would like to know if developers forcing older versions of their consoles to play with one another is causing these seemingly random network issues? For example, Microsoft forces Xbox 360/Xbox One to play with the more advanced Series X and PCs. Is there any research into this or is this not an issue to worry about?
There are so many variables that come into play with network issues, you cannot simply point your finger at one thing as the cause. I’ll even go as far as to say that 70%, possibly more, of the network issues are out of 343i’s hands.
So in other words, 343’s networking is so screwed up that it will probably never be fixed?
What I’m saying is, the networking issues are not always related to the server. Too many variables to solely point the finger at 343i/server.
For sure, like a player’s personal setup… For instance, a 2015 Xbox One playing online against a brand-new Series S/X and optimized PCs. Maybe these multiple tech mismatches are causing more internal issues than previously thought, or maybe not; this was what I was asking about in my first post. It’s not only 343; all game devs do this type of forced cross-generation console play, and I was wondering if this could potentially be causing any issues at all.
Not what I’m referring to, I’m not talking about hardware of choice.
It’s out of 343i’s hands if the user’s network is mediocre.
No. Regardless of hardware, it’s all speaking the same language.