I take a hard line on people spreading misinformation.
I’m sorry, what? You mean a 5% difference, right?
I don’t really care what you’re here for.
Go put in the work like the rest of us.
You are just wrong, and can’t see it.
It’s a 5% difference from 42% to 47%, while also being a 12% increase from 42% to 47%. I mean, literally read the name of the calculator you are linking.
The Halo stats are literally, in every sense of it, a record of total shots hit vs. total shots fired. And knowing the total number of shots is literally irrelevant, because the data (since we have a %) can obviously be normalized. So a 42% accuracy is 1:1, meaning 42 hits for every 100 shots fired. This would be true whether it was 1 trillion total shots fired for PC players, or 12,032,231,233,305,212,312,332. It still comes out to 42/100.
So yes, it’s an 11.9% increase from 42% to 47%. Or 42 shots hit to 47 shots hit.
But… what is 11.9% of 42? 5. Which is 1:1 to the 5% difference that we are discussing.
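(For anyone following along, here is a minimal Python sketch of the two numbers being argued over; the 42% and 47% figures are just the ones quoted in this thread.)

```python
# Percentage-point gap vs. relative increase, using the thread's 42% / 47% figures.
# Accuracy is assumed to be hits / shots, normalized to hits per 100 shots.
mnk_hits_per_100 = 42          # 42% accuracy -> 42 hits per 100 shots
ctrl_hits_per_100 = 47         # 47% accuracy -> 47 hits per 100 shots

point_gap = ctrl_hits_per_100 - mnk_hits_per_100        # 5 percentage points
relative_increase = point_gap / mnk_hits_per_100 * 100  # ~11.9% relative increase

print(point_gap)                    # 5
print(round(relative_increase, 1))  # 11.9
```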
JC, adding 5% to 42% IS NOT THE SAME as adding whole numbers, meaning it is not as simple as 42 + 5 = 47. Again, 42% + 5% = 44% (the formula is 42 × 1.05). 44% is 5% greater than 42%. 47% is 12% greater than 42%. I posted a link to a calculator that does the work for you.
The calculator isn’t designed for adding %… it’s designed to calculate the % increase or decrease… two completely different things.
If you have a pie, and you keep 42% of it, and you give me 5% of the remaining pie… and then I decide to give you the piece back… how much pie do you have?
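(Side note for readers: the posts above are using two different readings of “add 5%”. A rough sketch of both, assuming the 42% base from the thread; note that 42 × 1.05 actually comes out to about 44.1, not exactly 44.)

```python
# Two readings of "add 5% to 42%", which is where the thread keeps talking past itself.
base = 42.0

add_five_points = base + 5             # 47.0 -> adding 5 percentage points
grow_by_five_percent = base * 1.05     # 44.1 -> growing 42 by 5% of itself

print(add_five_points)                 # 47.0
print(round(grow_by_five_percent, 1))  # 44.1
```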
Right here. In other words, controllers are 12% more accurate on average. Not 5% more accurate.
Lol, just no. They are 5% more accurate. Which happens to be a 12% improvement for a player that hits 42% that wants to hit 47%.
But the controller player is only hitting 5% more of their shots than the PC player.
If controller was only 5% more accurate, then it would be 42% vs 44%, but then you would say they’re only 2% more accurate. Which would still be wrong.
Please, just take a second and think.
If I went to an archery range and competed with a pro, let’s say we both shot 100 arrows.
I land 1 arrow on target.
They land 99 arrows on target.
I have an accuracy of 1%, they have an accuracy of 99%.
They are 98% more accurate than me.
However, for me to improve, and shoot at a 99% accuracy, I would have to increase my current performance by 9800%.
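(The archery example, worked out the same way; the numbers are just the 1-in-100 and 99-in-100 figures above.)

```python
# Archery example: 1% vs. 99% accuracy over 100 arrows each.
my_hits, pro_hits, arrows = 1, 99, 100

point_gap = (pro_hits - my_hits) / arrows * 100              # 98 percentage points
relative_improvement = (pro_hits - my_hits) / my_hits * 100  # 9800% increase needed to match

print(point_gap)               # 98.0
print(relative_improvement)    # 9800.0
```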
How to find the increase or decrease between two percentage numbers (increase = greater than, decrease = less than). In this case, we want to figure out how much more accurate (increase/greater than) controller is than MnK…
Percent increase: you need to calculate the % increase from 42% to 47%.
Find the difference between the two percentages; in this case, it’s 47% - 42% = 5
Take that 5, and divide it by the 2nd percentage: 5/47 = 0.106
Now multiply this number by 100: 0.106*100 = 10.6
You’ve calculated the difference as a percentage, and the answer is a percentage increase of 10.6%.
A couple scenarios…
Controller is on average 10.6% more accurate than MnK
MnK would need to be 10.6% more accurate than they are now to be on par with controller
MnK is 10.6% less accurate than controller.
Now you’ve got it!
Finding the difference is not the same as finding the increase or decrease!
Correct, but in this case the “greater than” is what matters. For example, 5% and 15% is only a 10% difference, but 15% is 200% greater than 5%.
So when you say it’s only a 10% difference, the question is: how much greater is it?
So in this case, how much greater is controller over MnK?
You can’t say one matters and the other doesn’t. If people really care, then they need both. One number gives the raw data and the other provides the context. This is why statistics can be so misleading.
This I agree with 100%.
With this comparison, specifically. The end result is finding out how much more accurate one is over the other. In this case, “greater than” is the answer we’re looking for.
The fun part is re-reading the discussion and noticing we have been talking about difference the whole time…
The difference IS the “greater than.” 5% more accurate is 5% more accurate. 80% to 85%. 35% to 40%. 42% to 47%.
% increase from base ONLY matters if we want to know a relative amount of improvement. NOT the difference.
And as a side note, your example has an error. You don’t divide the difference by the 2nd value; you divide it by the first. It should be 5/42.
The controller player is still only 5% more accurate. That’s what the difference is.
The fact that the M&K player has to improve their relative performance by 12% is a completely different statistic.
If you look at sports stats, and you compare two free throw %'s between players, the DIFFERENCE is what matters. Nobody is looking at the % increase unless you are specifically talking about relative performance increase for person A to be the same as person B.
If Player A hits 42% of their free-throws, and Player B hits 47%, Player B is 5% more accurate on their free-throws. You wouldn’t say that they are 12% more accurate.
Now, if you wanted to know how much better Player A would have to be? Then yes, that is when the % increase is important. This is where you would say that Player A would have to see a % increase of 12% to be as accurate as Player B, to make up that 5% difference.
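(A quick sketch of why the two “relative” numbers in this thread differ, 10.6% vs. roughly 12%: it comes down to which value you divide the 5-point gap by, per the correction above.)

```python
# The 5-point gap divided by the 2nd value vs. the 1st value (see the correction above).
a, b = 42.0, 47.0
gap = b - a                        # 5 percentage points either way

divide_by_second = gap / b * 100   # ~10.6% -> the earlier post's result (dividing by 47)
divide_by_first = gap / a * 100    # ~11.9% -> dividing by the starting value, 42

print(round(divide_by_second, 1))  # 10.6
print(round(divide_by_first, 1))   # 11.9
```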
My god, the math you are using is simple. It’s just like having a .99 KDA and trying to make it a 1.0: all you’re determining is how many kills you need over deaths for at least one match, round, game, etc.
All you are arguing with this ridiculous percentile is how much better the average player would need to be, which is a statistic taken from every account on both platforms. It’s a rough number for the general population. It takes into account all skill levels, platforms, and inputs, but you’re over here spouting numbers the average player doesn’t care about. In fact, the percent of people that don’t care even a little bit is more than likely over 60%.
Take into account how many people of the entire player base don’t even use Waypoint, and how many of them don’t care to use it and don’t care about that percentage you keep spouting off about how much better they would need to be. Face it, lots of people own consoles; not everyone owns a computer, just like not everyone owns a console. But Halo has primarily been console for the last 21 years, while the PC player base has slowly been building.
When you take into account how long these PC players have been using M&K but never played an FPS with it, that would mean a lower skill level off the bat, plus the lack of aim assist for them. They’re gonna miss a lot of shots, just like a toddler using a controller on their first ever FPS. Heck, even my wife misses shots like that, but she’s an RPG player even on computer, so it’s expected when that’s something every rookie faces.
Then you’ve got the hardcore side of things racking up headshots like they’re using aimbot. But I know that’s how the precision of a mouse works. If it’s a BR vs BR fight, I could win or I could lose; it depends on whether I get the first shot off or not. But if I never see them, I will lose every time. A mouse click resets faster than a trigger pull. The only time I have an advantage is with automatic weapons, so the AR and Needler? Maybe the guns on vehicles and turrets can be counted here, but each of those guns reacts to bloom; if I want to be accurate, I have to burst-fire them, no different on PC.
P.S. If you don’t want misinformation spread, then fine, but it’s not misinformation when someone doesn’t care about that second percentage. The ones who do are the ones calculating their personal performance, and possibly how many games it would take them to increase that number with their current performance.
Honestly, a percent that every single player would collectively have to work on to increase isn’t a number anyone cares about. It’s all about the personal stats.
What I’m reading is pretty much all subjective.
This isn’t accurate. I mean, it can be, but that’s only IF that is the case. You know Shroud, right? Is he new? There are plenty more like him that haven’t touched the game since its release.
This sounds like a personal problem and/or a skill gap issue. With the data we have, it’s a clear indication of which input has the advantage over the other. If you don’t like the data, don’t agree with it, or just flat out think it’s a hoax, then that is your opinion, and you’re entitled to it, but who are you, or anybody else, to say it’s BS? The person who figured out the data explained how he did it to a tee. If you don’t agree with it, pull your own data; until then, it’s the best we have to go by. Until 343i comes out with their data, this is what we have, but I think we all know they’ll be hush hush about it. I have much respect for someone who took the time to do it themselves, and who’s to say someone else hasn’t done it? Maybe someone else has done it, come up with similar results, and just said “oh well, the data is already out there”.
Plenty of controllers out there lower the travel distance of a trigger significantly. Besides, each weapon has its own ROF, so you cannot spam a trigger/mouse button faster than it’s allowed to fire. You can most definitely pull a trigger faster than any gun in Halo will shoot. The same applies to a mouse button.
May I chime in? And how did this turn into a math debate anyhow?
42% + 5% = 42/100 + 5/100 = 47/100 = 47%
hmmm
That’s what I’m saying. I don’t know how this -Yoink- started.