To play better online with the controller, is it better to use a USB cable or wireless?
> 2533274943558842;1:
> To play better online with the controller, is it better to use a USB cable or wireless?
I use wired connections for both my controller and my internet connection on the Xbox One. I think they are a lot more stable. I notice far fewer problems when I am using wired connections.
Playing on a wired connection is the best option. You’ll experience far fewer lag issues.
I don’t experience input lag when I play wireless, but just to be safe, go wired to minimize the chances of input lag. The same goes for your internet connection: wired over Wi-Fi.
<mark>This post has been edited by a moderator. Please do not post inappropriate content.</mark>
While I appreciate the old-school rule of thumb on using a wired controller, the Xbox One controller boils down to about 1/8 of a frame in terms of wireless input latency. The human brain cannot process anything (theoretically and on average) faster than 250 ms, and from a hardware perspective, when using HDMI at 60 Hz (the maximum refresh rate technically possible without upgrading to HDMI 2.0, which is only available on the Xbox One X), the difference is too negligible to accurately track. tl;dr: there is no detectable difference in performance between wired and wireless controllers.
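The frame math above works out like this. A minimal sketch, assuming a 60 Hz display; the 1/8-of-a-frame figure is the one claimed in this post, not an independently measured value:

```python
# Sanity-check the "1/8 of a frame" wireless latency claim at 60 Hz.
frame_time_ms = 1000 / 60                # one frame at 60 Hz is ~16.7 ms
wireless_penalty_ms = frame_time_ms / 8  # 1/8 of a frame

print(f"frame time: {frame_time_ms:.1f} ms")              # ~16.7 ms
print(f"wireless penalty: {wireless_penalty_ms:.1f} ms")  # ~2.1 ms
```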
However, what usually ends up being a problem is the TV settings. Not all TVs are created equal, and although some are listed as being 4K 60 Hz, what they really mean is 4K 60 Hz in game mode. Even then however, you may be looking at 12-15 ms of input lag (newer OLEDs are reaching down to 10ms), which is why pros often use gaming monitors instead. Display port, 144hz and 1ms input lag means negligible input lag. In fact, the majority of people complaining about “controller lag” are actually experiencing TV input lag, which can stretch up to 134 ms on a 45-inch 4K TV running at optimum visual quality in movie mode. The gist of this is that TVs have a game mode which sacrifices visual fidelity for performance, and a lot of gamers don’t realize this.
As far as your internet situation goes: the theoretical maximum for a Fast Ethernet (100BASE-TX) cable is 100 Mbps, and the theoretical max for wireless 802.11g is 54 Mbps. Now, unless you’re running FiOS, most internet speeds in the US range from 1-25 Mbps download. So provided the rest of your household isn’t eating up that wireless bandwidth with Netflix and <mark>[REDACTED]</mark>, you should be well beneath that threshold, meaning there would be no theoretical performance advantage to using a wired connection, unless you are downloading massive files.
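For a rough sense of the headroom being described, here is a sketch using the figures quoted above. These are theoretical link rates, not real-world throughput (802.11g in practice delivers well under its 54 Mbps signaling rate):

```python
# Compare claimed link capacities against a typical US download speed.
ethernet_mbps = 100    # Fast Ethernet theoretical maximum
wifi_g_mbps = 54       # 802.11g theoretical maximum
typical_isp_mbps = 25  # upper end of the 1-25 Mbps range mentioned above

for name, cap in [("ethernet", ethernet_mbps), ("802.11g", wifi_g_mbps)]:
    headroom = cap - typical_isp_mbps
    print(f"{name}: {headroom} Mbps of headroom")  # both comfortably positive
```

Either way, the ISP link, not the local link, is the bottleneck under these assumptions.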
That being said, wireless can be tricky, with interference from other routers nearby, and sometimes even things like your microwave or fricken Roomba can throw off the signal. This is why a lot of gamers use wired connections. I personally don’t live in an area with internet fast enough to exceed my router’s theoretical bandwidth, so I have never experienced a difference in performance from one to the other. The safest way to tell, however, is to use Speedtest on both the wireless and wired connections on your computer, to see whether you are losing performance wirelessly. From there you can decide if it’s worth experimenting with port forwarding or changing Wi-Fi channels to deconflict with other nearby networks.
> 2533274814390441;5:
> While I appreciate the old school rule of thumb on using a wired controller, the xbox one controller boils down to about 1/8 of a frame in terms of wireless input latency.
Really? You don’t happen to have a source? This article puts it at a 20% reduction from the wireless Xbox 360 controller, which is nowhere near the two milliseconds you’re suggesting.
> 2533274814390441;5:
> The human brain cannot process anything(theoretically and on average) faster than 250 ms
Again, I’d appreciate a source on this. I’ve tried to hunt down research articles, but the best I can find is this paper, which states that “With latency-only manipulation, even 50-ms latencies harmed performance”. However, this is quoting results from another study, which I didn’t manage to find.
I’m not really even sure what you mean here. As far as processing things that only persist for a short amount of time, 13 milliseconds appears entirely reasonable. Of course, this has nothing to do with lag, because how lag affects performance is entirely different from how fast humans process things.
> 2533274814390441;5:
> Even then however, you may be looking at 12-15 ms of input lag (newer OLEDs are reaching down to 10ms), which is why pros often use gaming monitors instead. Display port, 144hz and 1ms input lag means negligible input lag.
No LCD displays with 1 ms input lag exist. They’re coming close with the faster displays starting to go under five milliseconds, which is negligible, but still, we shouldn’t confuse input lag with the G2G response time that most manufacturers advertise on the box (which itself is often somewhat deceptive).
> 2533274825830455;6:
> > 2533274814390441;5:
> > While I appreciate the old school rule of thumb on using a wired controller, the xbox one controller boils down to about 1/8 of a frame in terms of wireless input latency.
>
> Really? You don’t happen to have a source? This article puts it at a 20% reduction from the wireless Xbox 360 controller, which is nowhere near the two milliseconds you’re suggesting.
>
>
>
>
> > 2533274814390441;5:
> > The human brain cannot process anything(theoretically and on average) faster than 250 ms
>
> Again, I’d appreciate a source on this. I’ve tried to hunt down research articles, but the best I can find is this paper, which states that “With latency-only manipulation, even 50-ms latencies harmed performance”. However, this is quoting results from another study, which I didn’t manage to find.
>
> I’m not really even sure what you mean here. As far as processing things that only persist for a short amount of time, 13 milliseconds appears entirely reasonable. Of course, this has nothing to do with lag, because how lag affects performance is entirely different from how fast humans process things.
>
>
>
>
> > 2533274814390441;5:
> > Even then however, you may be looking at 12-15 ms of input lag (newer OLEDs are reaching down to 10ms), which is why pros often use gaming monitors instead. Display port, 144hz and 1ms input lag means negligible input lag.
>
> No LCD displays with 1 ms input lag exist. They’re coming close with the faster displays starting to go under five milliseconds, which is negligible, but still, we shouldn’t confuse input lag with the G2G response time that most manufacturers advertise on the box (which itself is often somewhat deceptive).
Human Benchmark lists median and mean reaction times based on over 67 million participants, which factor in both display and input latency.
Secondly, my point about reaction times is regarding the overall perception of “latency”, which was stated to further my point that any controller latency with a wireless Xbox One controller is imperceptible to the human brain, and furthermore that if you are experiencing “input lag” it’s most likely a display issue.
Third, I never stated that any LCD exists with a 1 ms response time. I stated the average for well-documented OLED benchmarks, then stated that was why most professional gamers use LED (typically twisted nematic or IPS) gaming monitors. In fact, I’m not sure where you found “LCD” at all in that sentence. I wasn’t aware that anyone still even used LCD TVs.
Finally, this guy is one of the few people to actually release his test data on input latency, and he clearly found that the input lag measured was a direct result of software issues rather than hardware. Full video below.
> 2533274814390441;7:
> Human Benchmark Average and mean reaction times based on over 67 million participants, which factor in both display and input latency.
>
> Secondly, my point about reaction times is regarding the overall perception of “latency” which was stated to further my point that any controller latency with a wireless xbox one controller is imperceptible to the human brain, and furthermore if you are experiencing “input lag” it’s most likely a display issue.
Which is kind of irrelevant to latency considerations, because how latency impacts performance, and how people perceive latency, has nothing to do with reaction time. Reaction time is about how long it takes for the user to react to changes in the system. Latency is the converse of that: how long it takes for the system to react to inputs from the user. There is a priori no reason to expect these two things to be related.
> 2533274814390441;7:
> Third, I never stated that any LCD exists with a 1ms response time. I stated the average for well documented OLED benchmarks, then stated that was why most professional gamers use LED(typically twisted nematic or IPS) gaming monitors. In fact I’m not sure where you found LCD at all in that sentence. I wasn’t aware that anyone still even used LCD TVs.
> Best TVs & Monitors for Gaming: Input Lag Database
You said “Display port, 144hz and 1ms input lag means negligible input lag”, which I took to mean that you think some monitors have one millisecond of input lag. I decided to comment on it because—regardless of what you intended to say or what you meant—there already exists a great deal of confusion about input lag and the response time, which are two entirely different things. The response time itself is often presented in marketing material, and since the way in which manufacturers quote the response time measurements is already deceptive, they often quote figures like “1 ms” on the high-end gaming monitors, and “5 ms” even on the bog standard monitors that you get for cheap at any store. We don’t want gamers to be thinking that they are getting 5 ms of input lag, when they may be getting 30.
Also, this is completely beside the point, but just to set the terminology straight here:
- OLED refers to a relatively new display technology where each pixel of the display panel emits its own light.
- Every modern display out there, TV or monitor, that does not have an OLED panel is an LCD display. LCD displays consist of a backlight and the LCD panel itself. The backlight is of uniform, white color, and the pixels of the LCD panel merely control how much light from the backlight they let through.
- LED only refers to the type of backlight used to light the LCD panel. Every “LED TV” or “LED monitor” is an LCD display, and, by extension, almost all TVs used today are, in fact, LCD TVs.
- “Twisted nematic” (TN) and “in-plane switching” (IPS) are types of LCD panels. The naming merely refers to the mechanism by which the panel controls the passage of light.

With that out of the way, I did not find “LCD” in your post. I simply said “LCD displays”, because all gaming monitors are LCD displays, and the nonexistence of gaming monitors with 1 ms of input lag was what I was addressing.
> 2533274814390441;7:
> Finally, this guy is one of the few people to actually release his test data on input latency, and he clearly found that the input lag measured was a direct result of software issues rather than hardware. Full video below.
> Wired or wireless, is input latency affected? In-depth input times tested PC/XB1/PS4/Snes/Megadrive - YouTube
Correct me if I’m wrong, but he does not state in the video the methods by which he found those figures. Measuring input lag is notoriously difficult, because you need an accurate measurement of when the input occurred. The typical (and most cost-effective) way people measure input lag these days is to record the input and the display with a high-speed camera, but what most people have available tops out at 240 fps, which means you can’t resolve timing differences smaller than about 4 milliseconds. And that’s assuming you get everything else right.
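The camera-resolution point can be made concrete with a small sketch. The 240 fps figure is the one from the paragraph above; 1000 fps stands in for a true high-speed camera:

```python
# The timing resolution of a camera-based input-lag test is one frame
# interval: events closer together than this land in the same captured frame.
def timing_resolution_ms(camera_fps: float) -> float:
    return 1000 / camera_fps

print(timing_resolution_ms(240))   # ~4.17 ms with a 240 fps camera
print(timing_resolution_ms(1000))  # 1.0 ms with a true high-speed camera
```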
The results could be correct, but without understanding of how they were obtained, they’re difficult to trust.
> 2533274825830455;8:
> > 2533274814390441;7:
> >
>
> Which is kind of irrelevant to latency considerations, because how latency impacts performance, and how people perceive latency, has nothing to do with reaction time. Reaction time is about how long it takes for the user to react to changes in the system. Latency is the converse of that: how long it takes for the system to react to inputs from the user. There is a priori no reason to expect these two things to be related.
>
>
>
>
> > 2533274814390441;7:
> >
>
> You said “Display port, 144hz and 1ms input lag means negligible input lag”, which I took to mean that you think some monitors have one millisecond of input lag. I decided to comment on it because—regardless of what you intended to say or what you meant—there already exists a great deal of confusion about input lag and the response time, which are two entirely different things. The response time itself is often presented in marketing material, and since the way in which manufacturers quote the response time measurements is already deceptive, they often quote figures like “1 ms” on the high-end gaming monitors, and “5 ms” even on the bog standard monitors that you get for cheap at any store. We don’t want gamers to be thinking that they are getting 5 ms of input lag, when they may be getting 30.
>
> Also, this is completely besides the point, but just to set the terminology straight here:
> - OLED refers to a relatively new display technology where each pixel of the display panel emits its own light.
> - Every modern display out there, TV or monitor, that does not have an OLED panel is an LCD display. LCD displays consist of a backlight and the LCD panel itself. The backlight is of uniform, white color, and the pixels of the LCD panel merely control how much light from the backlight they let through.
> - LED only refers to the type of backlight used to light the LCD panel. Every “LED TV” or “LED monitor” is an LCD display, and, by extension, almost all TVs used today are, in fact, LCD TVs.
> - “Twisted nematic” (TN) and “in-plane switching” (IPS) are types of LCD panels. The naming merely refers to the mechanism by which the panel controls the passage of light.
>
> With that out of the way, I did not find “LCD” in your post. I simply said “LCD displays”, because all gaming monitors are LCD displays, and the nonexistence of gaming monitors with 1 ms of input lag was what I was addressing.
>
>
>
>
> > 2533274814390441;7:
> >
>
> Correct me if I’m wrong, but he does not state in the video the methods by which he found those figures. Measuring input lag is notoriously difficult, because you need an accurate measurement of when the input occurred. The typical (and most cost-effective) way people measure input lag these days is to record the input and the display with a high-speed camera, but what most people have available tops out at 240 fps, which means you can’t resolve timing differences smaller than about 4 milliseconds. And that’s assuming you get everything else right.
>
> The results could be correct, but without understanding of how they were obtained, they’re difficult to trust.
Riddle me this: if you as a human being cannot react to a visual stimulus faster than 0.25 seconds, or 250 ms, how exactly are you going to perceive input lag, which at its worst, according to the most reliable source I could find, was a total of 145 ms of combined input lag (in Halo 5, on average), meaning controller plus display plus software at 60 Hz? At 60 Hz, and thereby 60 fps, 145 ms is less than 1/5 of a second. The reason I pointed out reaction times, again, was to illustrate that the human brain cannot process information quickly enough to detect anything that fast, much less the difference in input lag between a wired and a wireless controller, which, according to the same source, is so negligible it can’t be charted. So back to my original point: there is no performance difference between using a wired and a wireless Xbox One controller.
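For what it’s worth, the 145 ms figure converts like this. A minimal sketch assuming 60 fps; note that 145 ms actually spans several 60 Hz frames, even though it stays well under the 250 ms reaction-time figure:

```python
# Convert 145 ms of total input lag into 60 Hz frames and compare
# against the ~250 ms average human reaction time cited above.
latency_ms = 145
frame_ms = 1000 / 60       # ~16.7 ms per frame at 60 fps

frames = latency_ms / frame_ms
print(f"{frames:.1f} frames of lag")  # 8.7 frames
print(latency_ms < 250)               # True: under the reaction-time figure
```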
And to clarify, that 1 ms response time figure specifically refers to grey-to-grey as opposed to black-to-black. You are correct.
> 2533274814390441;9:
> Riddle me this, if you as a human being cannot react to a visual stimulus faster than .25 seconds, or 250 ms, how exactly are you going to tell me that you are going to perceive input lag, which at its most according to the most reliable source I could find was a total of 145ms of combined total input lag(In Halo 5 on average), meaning controller, plus display, plus software at 60hz. At 60hz ,and thereby 60 fps, 145 ms is less than 1/5 of a second(or 1/5 of a frame). The reason I pointed out reaction times, again, was to illustrate that the human brain cannot process information quickly enough to detect anything that fast, much less the difference in input lag between a wired and wireless controller, which also according to the same source, are so negligible they can’t be charted. So back to my original point, there is no performance difference in using a wired vs wireless xbox one controller.
Unlike reaction time, input lag perception is not a reaction to an unexpected external stimulus. It’s a reaction to a stimulus triggered by you, the user. They’re two vastly different situations. In one you’re reacting to something unexpected, while in the other you are analyzing a chain of events and comparing it to your expectation. I see no reason to expect the times to be related.
Research on the topic is hard to come by. However, I have already linked you to one paper (Watson, et al., 1998), which states that according to prior research, input lag as low as 50 ms can affect user performance. When it comes to direct perception, while I didn’t manage to find anything peer reviewed, I did find someone’s thesis (Banatt, 2017), where the author experimented with input lag perception of expert gamers, and non-gamers. The results are shown in figure 2, from which we see that for non-gamers the mean threshold for detection was 114 ms, while for expert gamers it was as low as 48 ms.
Your reasoning here is problematic, because there are all kinds of timing thresholds in the human brain, such as the time it takes to identify an image. Assuming these are all bound by the reaction time is unfounded. Heck, if I take your reasoning that the human brain cannot process anything faster than 250 ms to its logical conclusion, then frame rates above 4 Hz are useless, because the frame times go below 250 ms, and thus any extra frames are useless because we can’t process them. You see the problem?
Also, I remembered that you shouldn’t really take Human Benchmark as the absolute authority, because, as they themselves point out, their test itself is affected by input lag. So, you should expect the true average reaction time to be less than whatever their statistics show.
> 2533274825830455;10:
> > 2533274814390441;9:
> > Riddle me this, if you as a human being cannot react to a visual stimulus faster than .25 seconds, or 250 ms, how exactly are you going to tell me that you are going to perceive input lag, which at its most according to the most reliable source I could find was a total of 145ms of combined total input lag(In Halo 5 on average), meaning controller, plus display, plus software at 60hz. At 60hz ,and thereby 60 fps, 145 ms is less than 1/5 of a second(or 1/5 of a frame). The reason I pointed out reaction times, again, was to illustrate that the human brain cannot process information quickly enough to detect anything that fast, much less the difference in input lag between a wired and wireless controller, which also according to the same source, are so negligible they can’t be charted. So back to my original point, there is no performance difference in using a wired vs wireless xbox one controller.
>
> Unlike reaction time, input lag perception is not a reaction to an unexpected external stimulus. It’s a reaction to a stimulus triggered by you, the user. They’re two vastly different situations. In one you’re reacting to something unexpected, while in the other you are analyzing a chain of events and comparing it to your expectation. I see no reason to expect the times to be related.
>
> Research on the topic is hard to come by. However, I have already linked you to one paper (Watson, et al., 1998), which states that according to prior research, input lag as low as 50 ms can affect user performance. When it comes to direct perception, while I didn’t manage to find anything peer reviewed, I did find someone’s thesis (Banatt, 2017), where the author experimented with input lag perception of expert gamers, and non-gamers. The results are shown in figure 2, from which we see that for non-gamers the mean threshold for detection was 114 ms, while for expert gamers it was as low as 48 ms.
>
> Your reasoning here is problematic, because there are all kinds of timing thresholds in the human brain such as the time it takes to identify an image. Assuming these all to be bound by the reaction time is unfounded. Heck, if I take your reasoning that the human brain cannot process anything faster than 250 ms to its logical conclusion, then frame rates above 4 Hz are useless, because the frame times go below 250 ms, and thus any extra frames are useless because we can’t process them. You see the problem?
>
> Also, I remembered that you shouldn’t really take Human Benchmark as the absolute authority, because, as they themselves point out, their test itself is affected by input lag. So, you should expect the true average reaction time to be less than whatever their statistics show.
I see what you’re saying, but reaction time was the only metric for which I could find enough data to prove my point. I still believe there is a reliable metric out there with a direct correlation between noticing input lag and the speed at which your brain processes information, but I will concede that it is not necessarily reaction time. The thesis is interesting, however, as it shows how people’s brains adapt when learning to play games. It also shows that there are other ways to test this. What I want to know is what Microsoft uses to test input lag, and why those numbers are so closely guarded. This information shouldn’t be this hard to come by, especially since companies regularly make claims about low response time and/or latency in their products. Without a reliable way to benchmark those claims, it all becomes a marketing gimmick.