Ugh, you exhaust me.. like legit.
I've edited the title of the video for you and people like you, but you can send a stream at 60fps without the gameplay being 60fps, which I think is our main disconnect.
Example: when you play a PC game that's outputting, say... 120fps, you are getting EVERY bit of that output (whether or not you notice it). Back to our earlier argument - when you capture and encode that same output at 60fps or at 30fps, you don't notice the difference. You notice the difference when the frames drop below, say, 24, which is why it's safer to send a 60fps STREAM: on average viewers are actually seeing around 30-50fps on playback. When you send 30, you have a higher probability of dropping below that 24fps threshold, and even then the 'degradation' in quality most people see is more closely tied to the bitrate drop than to the frame drop.
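Rough sketch of what I mean, in Python with totally made-up numbers (not tied to OBS or any specific encoder) - a 30fps game pushed out on a 60fps stream just repeats frames to fill the encoder's timeline:

```python
# Hypothetical numbers for illustration only.
GAME_FPS = 30      # what the game actually renders
STREAM_FPS = 60    # what the outgoing stream profile advertises

def frames_sent_per_second(game_fps: int, stream_fps: int) -> list:
    """Map each encoder tick (at stream_fps) to the newest game frame available."""
    return [int(tick * game_fps / stream_fps) for tick in range(stream_fps)]

if __name__ == "__main__":
    sent = frames_sent_per_second(GAME_FPS, STREAM_FPS)
    print(f"Frames sent per second:  {len(sent)}")       # 60 frames go out
    print(f"Distinct game frames:    {len(set(sent))}")  # but only 30 are new
```

The stream profile says 60, the encoder dutifully ships 60 frames a second, and the gameplay underneath is still 30. Both statements are true at the same time.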
A stream is different from an uploaded video, which has had the time to process and get as close to source quality as possible.
Let's say I have a webcam outputting 60fps in my PiP display.. but I'm playing a game that's in 30fps. Is the entire stream relegated to the quality of the console game? Maybe, since people are tuning in to see the gameplay - but that's the broadcaster's choice.
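Same kind of sketch for the PiP case (again, made-up framerates, no real capture APIs) - the canvas runs at the encoder's framerate and each source just contributes whatever frame it has ready:

```python
STREAM_FPS = 60   # the composited output
WEBCAM_FPS = 60   # webcam PiP source
GAME_FPS = 30     # console gameplay source

def latest_source_frame(source_fps: int, tick: int, stream_fps: int) -> int:
    """Index of the newest frame a source has produced by encoder tick `tick`."""
    return int(tick * source_fps / stream_fps)

if __name__ == "__main__":
    for tick in range(6):  # first six output frames (0.1s of the stream)
        cam = latest_source_frame(WEBCAM_FPS, tick, STREAM_FPS)
        game = latest_source_frame(GAME_FPS, tick, STREAM_FPS)
        print(f"output frame {tick}: webcam frame {cam}, game frame {game}")
```

The webcam advances on every output frame while the game only advances on every other one - yet the stream itself is still a genuine 60fps output.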
In another example - let's assume the gameplay, webcam, and videos served inline are all aligned on resolution and framerate prior to the encode. If you're sending 1080p 60fps, do you think people are watching a true 60fps every second? Should the title of the video be based on the average of what they are watching (+/- the dropped frames), or should it be based on the streaming profile sent out?
dude.. I.. I just can't anymore.
Madster, halp