Can any of you guys actually tell a difference between 30fps and 60fps?
I was curious and fired up Halo 3 on both MCC and my 360, aside from the nice anti-aliasing in MCC, I couldn’t feel any difference, am I in the minority here?
Thanks!
I’d say you’re blind and need some sort of medical attention. Not only does Halo 3 on MCC look infinitely smoother, but the controls are much more responsive.
After getting used to playing 60 fps games, switching to 30 fps feels a bit jarring at first. 30 fps feels slower and less responsive.
> 2533274888449416;1:
> Can any of you guys actually tell a difference between 30fps and 60fps?
> I was curious and fired up Halo 3 on both MCC and my 360, aside from the nice anti-aliasing in MCC, I couldn’t feel any difference, am I in the minority here?
> Thanks!
Well, it could simply be that you have a very mediocre or outright BAD TV set (it could also be the TV settings), ruining every game you play.
You should see the “30fps blur” here, if your computer/graphics card/monitor is up to it:
https://youtu.be/OLfLQ__We64?t=149
REMEMBER TO SWITCH TO 1080p60 quality or you’ll likely be getting a lower resolution/fps.
Don’t feel too bad about it. Only in this generation are movies starting to get into the 60 fps business, and some games like Fallout 4 still think it’s OK to run like that. Besides, if you compulsively had to worry about things like fps, you might be overly competitive and already having a stroke on a PC somewhere.
60 fps is beautiful. The 360 games do seem very jarring by comparison.
> 2533274888449416;1:
> Can any of you guys actually tell a difference between 30fps and 60fps?
> I was curious and fired up Halo 3 on both MCC and my 360, aside from the nice anti-aliasing in MCC, I couldn’t feel any difference, am I in the minority here?
> Thanks!
You may be playing on a television that can’t show you the difference, but if you have a TV that resolves at 1080p and refreshes at a minimum of 120 Hz, then you’ll feel the difference, especially when the image is in motion (i.e. when you’re using your right thumbstick a lot).
Besides graphics, I see no difference. Still laggy.
Yes… I can tell the difference between 30 FPS and 60 FPS… Our eyes can see up to 60 FPS
> 2533274873843883;6:
> You may be playing on a television that can’t show you the difference, but if you have a TV that resolves at 1080p and refreshes at a minimum of 120 Hz, then you’ll feel the difference, especially when the image is in motion (i.e. when you’re using your right thumbstick a lot).
All TVs will refresh at least at 60 Hz, and “120 Hz” is more often a marketing trick than the TV’s actual refresh rate, referring to backlight strobing (which does reduce motion blur, though).
> 2533274804945518;8:
> Our eyes can see up to 60 FPS
And beyond…
> 2533274825830455;9:
> > 2533274873843883;6:
> >
>
>
>
>
> > 2533274804945518;8:
> > Our eyes can see up to 60 FPS
>
>
> And beyond…
I want some sauce on that, because I’ve never heard anywhere that the human eye can detect past 60 FPS.
> 2533274795950354;4:
> > 2533274888449416;1:
> > Can any of you guys actually tell a difference between 30fps and 60fps?
> > I was curious and fired up Halo 3 on both MCC and my 360, aside from the nice anti-aliasing in MCC, I couldn’t feel any difference, am I in the minority here?
> > Thanks!
>
>
> Well, it could simply be that you have a very mediocre or outright BAD TV set (it could also be the TV settings), ruining every game you play.
>
> You should see the “30fps blur” here, if your computer/graphics card/monitor is up to it:
> Halo 5 | 60FPS Vs 30FPS | Comparison - YouTube
> REMEMBER TO SWITCH TO 1080p60 quality or you’ll likely be getting a lower resolution/fps.
>
> Don’t feel too bad about it. Only in this generation are movies starting to get into the 60 fps business, and some games like Fallout 4 still think it’s OK to run like that. Besides, if you compulsively had to worry about things like fps, you might be overly competitive and already having a stroke on a PC somewhere.
I do see the difference in the video (on my laptop screen). I’ve got a 1080p LG monitor that I picked up a few years back for the Xbox. I’ll check to see if it’s outputting in some weird “film” mode or something that’s inhibiting the refresh rate though. Thanks for the help!
It is very easy to tell the difference. If you haven’t had the chance, try playing a PC game on a 144 Hz monitor with a game that can reach that kind of framerate. I didn’t think I’d be able to tell the difference between 60 Hz and 144 Hz, but it was a bigger difference than 30 fps vs 60 fps; absolutely jarring.
I’m fine with 60 dog
> 2533274804945518;10:
> > 2533274825830455;9:
> > > 2533274873843883;6:
> > >
> >
> >
> >
> >
> > > 2533274804945518;8:
> > > Our eyes can see up to 60 FPS
> >
> >
> > And beyond…
>
>
> I want some sauce on that, because I’ve never heard anywhere that the human eye can detect past 60 FPS.
Go to your local electronics store and compare a 144 Hz and a 60 Hz screen. You can see the difference just as much as you can see the difference between 30 and 60. The eye is effectively able to resolve infinite frames per second; when you look at something normally, there is no refresh rate. You will pretty much always be able to see a difference until display technology either no longer flickers or flickers at such a speed that it is essentially a steady light.
The human eye can’t process anything over 26fps so I’m not sure why we need 60.
> 2533274804945518;10:
> > 2533274825830455;9:
> > > 2533274873843883;6:
> > >
> >
> >
> >
> >
> > > 2533274804945518;8:
> > > Our eyes can see up to 60 FPS
> >
> >
> > And beyond…
>
>
> I want some sauce on that, because I’ve never heard anywhere that the human eye can detect past 60 FPS.
On the contrary, what makes you think that 60 Hz is some magical limit? It would be quite the coincidence if the maximum perceivable frame rate coincided with the US AC frequency, don’t you think? (Which is the reason CRT TVs in the early days got standardized to 60 Hz.) No, the eye doesn’t see the world in frames; it’s much more complicated than that. There’s a frequency above which individual images can be perceived as motion, usually said to be above 12 Hz. Then there’s the flicker threshold, below which the flicker of lights becomes perceivable. For many people, flicker is still easily perceivable at 60 Hz. This was historically a problem with CRT monitors for many people, and anecdotally I can say that even with plasma displays I can still see the flicker, not head-on, but in my peripheral vision. So, even at 60 Hz, we can still see artifacts that shouldn’t be there. Here’s one study, which concluded that young adults can see gaps in visual stimuli as short as about 16 milliseconds: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2826883/. This article seems to report similar findings: http://vision.arc.nasa.gov/personnel/pavel/publications/TemporalSensitivity.pdf.
Beyond that though, it becomes kind of hard to define the difference between frame rates. There’s an often repeated factoid that fighter pilots in tests have been observed to recognize objects in images lasting as little as 1/225th of a second, so 225 Hz, effectively. I’ve tried to search for the source many times, but most people just conveniently repeat this without any source, so its reliability is dubious at best. However, I can give you at least one article which should give you reason to believe that 60 Hz is not some kind of magical limit: http://link.springer.com/article/10.3758%2Fs13414-013-0605-z. It’s unfortunately behind a paywall, but I quote:
> The main findings of Experiment 1 are that viewers can detect and retain information about named targets that they have never seen before at an RSVP duration as short as 13 ms, and that they can do so even when they have no information about the target until after presentation. Furthermore, no clear discontinuity in performance emerged as duration was decreased from 80 to 13 ms. […] A feedforward account of detection is more consistent with the results, suggesting that a presentation as short as 13 ms, even when masked by following pictures, is sufficient on some trials for feedforward activation to reach a conceptual level, without selective attention.
13 milliseconds would correspond to about 76 Hz. So, at the very least, effects that last only 13 milliseconds are visually perceivable, and worth pursuing. And just because this study went down to 13 milliseconds, that’s not necessarily the limit either; that was simply the shortest duration they tested.
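The duration-to-rate conversion above is just a reciprocal; a trivial sketch (the 13 ms and 16 ms figures are the ones quoted from the studies, not new data):

```python
def equivalent_hz(duration_ms):
    """Refresh rate whose frame time equals a stimulus of duration_ms milliseconds."""
    return 1000.0 / duration_ms

print(equivalent_hz(13))  # ≈ 76.9 Hz, the "about 76 Hz" figure above
print(equivalent_hz(16))  # ≈ 62.5 Hz, just above a 60 Hz frame time
```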
Moreover, the problem of motion in video is more complex than that. I can tell that a higher frame rate looks smoother, but I can’t tell why that is. However, I do know that 60 Hz isn’t as smooth as it could be. The prime example of this is reasonably slow, panning motion. Consider this demo: https://frames-per-second.appspot.com/, with an immobile background and two identical balls at 60 Hz; one immobile and one moving at 500 pixels/second, without motion blur. The problem you run into is that even if you try to follow the moving ball, it doesn’t appear as sharp as the stationary ball. But from real-world experience you know that if your eyes are following a slowly moving object, they should see it as sharp as a stationary one. This clearly isn’t the case, so 60 Hz isn’t enough to replicate motion in a way indistinguishable from real motion. I’ll also point out that at 200 pixels/second the blurring effect is substantially smaller, and practically imperceptible at 100 pixels/second if you ignore potential stuttering, suggesting that frame rate matters most for fast motion.
Finally, there are also non-visual benefits to high frame rates in video games, because games are interactive. Games typically have around four frames of input lag: http://www.gamasutra.com/view/feature/3725/measuring_responsiveness_in_video_.php?print=1, which at 30 Hz translates to around 133.3 milliseconds, at 60 Hz to around 66.7 milliseconds, and at 120 Hz to around 33.3 milliseconds. Even if you choose not to believe that human vision is capable of differentiating video content above 60 Hz, you should at the very least appreciate the improvement in responsiveness for games. There was an article pertaining to traffic safety, if my memory serves, that I’ve unfortunately lost, which cited studies showing lag as low as 20 milliseconds to be perceivable, which would mean that as long as we are above that threshold, lower latencies are worth pursuing.
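The input-lag arithmetic in the post above is easy to check yourself; here’s a minimal sketch (the four-frames-of-lag figure is the one cited from the Gamasutra article, not a measured constant, and real games vary):

```python
def frame_time_ms(hz):
    """Duration of one frame in milliseconds at a given refresh rate."""
    return 1000.0 / hz

def input_lag_ms(hz, frames_of_lag=4):
    """Approximate input lag assuming a fixed number of buffered frames."""
    return frames_of_lag * frame_time_ms(hz)

for hz in (30, 60, 120):
    print(f"{hz:>3} Hz: frame = {frame_time_ms(hz):5.1f} ms, "
          f"~4-frame lag = {input_lag_ms(hz):5.1f} ms")
# 30 Hz → ~133.3 ms, 60 Hz → ~66.7 ms, 120 Hz → ~33.3 ms, matching the figures above
```

Note that doubling the frame rate halves both numbers, which is why the responsiveness gain shows up even on a display that can’t show the extra frames.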
I keep getting lag in Arena and the opening scenes of Warzone; other than that everything is normally smooth… but I can’t tell the difference!
I’ll tell ya what, before I got the XB1 and MCC, I didn’t expect CEA, 3 or 4 to look that different from the originals. I was one of those grumpy people that heard the talk about 1080p and 60fps and thought to myself, “Oh, I don’t need all that crap, I’m sure it looks the same…” WOW, was I wrong. My eyes can definitely tell the difference. I booted up Halo 3 on the 360 one night just to compare back to back, and I was stunned how big a difference it made. OG Halo 3 looked every bit an 8-year-old game.
> 2533274825830455;16:
> > 2533274804945518;10:
> > > 2533274825830455;9:
> > > > 2533274873843883;6:
> > > >
> > >
> > >
> > >
> > >
> > > > 2533274804945518;8:
> > > > Our eyes can see up to 60 FPS
> > >
> > >
> > > And beyond…
> >
> >
> > I want some sauce on that, because I’ve never heard anywhere that the human eye can detect past 60 FPS.
>
>
> On the contrary, what makes you think that 60 Hz is some magical limit? It would be quite the coincidence if the maximum perceivable frame rate coincided with the US AC frequency, don’t you think? (Which is the reason CRT TVs in the early days got standardized to 60 Hz.) No, the eye doesn’t see the world in frames; it’s much more complicated than that. There’s a frequency above which individual images can be perceived as motion, usually said to be above 12 Hz. Then there’s the flicker threshold, below which the flicker of lights becomes perceivable. For many people, flicker is still easily perceivable at 60 Hz. This was historically a problem with CRT monitors for many people, and anecdotally I can say that even with plasma displays I can still see the flicker, not head-on, but in my peripheral vision. So, even at 60 Hz, we can still see artifacts that shouldn’t be there. Here’s one study, which concluded that young adults can see gaps in visual stimuli as short as about 16 milliseconds: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2826883/. This article seems to report similar findings: http://vision.arc.nasa.gov/personnel/pavel/publications/TemporalSensitivity.pdf.
>
> Beyond that though, it becomes kind of hard to define the difference between frame rates. There’s an often repeated factoid that fighter pilots in tests have been observed to recognize objects in images lasting as little as 1/225th of a second, so 225 Hz, effectively. I’ve tried to search for the source many times, but most people just conveniently repeat this without any source, so its reliability is dubious at best. However, I can give you at least one article which should give you reason to believe that 60 Hz is not some kind of magical limit: http://link.springer.com/article/10.3758%2Fs13414-013-0605-z. It’s unfortunately behind a paywall, but I quote:
>
>
> > The main findings of Experiment 1 are that viewers can detect and retain information about named targets that they have never seen before at an RSVP duration as short as 13 ms, and that they can do so even when they have no information about the target until after presentation. Furthermore, no clear discontinuity in performance emerged as duration was decreased from 80 to 13 ms. […] A feedforward account of detection is more consistent with the results, suggesting that a presentation as short as 13 ms, even when masked by following pictures, is sufficient on some trials for feedforward activation to reach a conceptual level, without selective attention.
>
>
> 13 milliseconds would correspond to about 76 Hz. So, at the very least, effects that last only 13 milliseconds are visually perceivable, and worth pursuing. And just because this study went down to 13 milliseconds, that’s not necessarily the limit either; that was simply the shortest duration they tested.
>
> Moreover, the problem of motion in video is more complex than that. I can tell that a higher frame rate looks smoother, but I can’t tell why that is. However, I do know that 60 Hz isn’t as smooth as it could be. The prime example of this is reasonably slow, panning motion. Consider this demo: https://frames-per-second.appspot.com/, with an immobile background and two identical balls at 60 Hz; one immobile and one moving at 500 pixels/second, without motion blur. The problem you run into is that even if you try to follow the moving ball, it doesn’t appear as sharp as the stationary ball. But from real-world experience you know that if your eyes are following a slowly moving object, they should see it as sharp as a stationary one. This clearly isn’t the case, so 60 Hz isn’t enough to replicate motion in a way indistinguishable from real motion. I’ll also point out that at 200 pixels/second the blurring effect is substantially smaller, and practically imperceptible at 100 pixels/second if you ignore potential stuttering, suggesting that frame rate matters most for fast motion.
>
> Finally, there are also non-visual benefits to high frame rates in video games, because games are interactive. Games typically have around four frames of input lag: http://www.gamasutra.com/view/feature/3725/measuring_responsiveness_in_video_.php?print=1, which at 30 Hz translates to around 133.3 milliseconds, at 60 Hz to around 66.7 milliseconds, and at 120 Hz to around 33.3 milliseconds. Even if you choose not to believe that human vision is capable of differentiating video content above 60 Hz, you should at the very least appreciate the improvement in responsiveness for games. There was an article pertaining to traffic safety, if my memory serves, that I’ve unfortunately lost, which cited studies showing lag as low as 20 milliseconds to be perceivable, which would mean that as long as we are above that threshold, lower latencies are worth pursuing.
You got me, but anything above 60 FPS is just a bonus in my opinion.
The higher you go in resolution and the bigger the screen, the more noticeable the difference at higher frame rates. In movies, the frames are blurred together for smoother motion. In video games, there is typically little or no motion blur unless they’re going for some sort of cinematic effect.