I just want to say I am glad pro gaming took over. Back in the day it was only Quake players advocating for 120 FPS (for various reasons, including Q3 physics being somewhat broken), 125hz mice and stuff like that. I am talking 20 years ago.
The number of lost souls parroting the old "human eye can only see 30 fps" has gone down considerably over the years. The last 10 years were fantastic in that regard, despite the whole RGB craze.
Even CS servers have 100 Hz heartbeats these days. Of course, by the time we get 1 kHz displays I'll be too old to enjoy it myself, but it's still likely to put a bittersweet smile on my face.
There's definitely diminishing returns the higher we go with refresh rates. 60 Hz to 240 Hz, for example, is like playing a completely different game. But going from 240 Hz to 360 Hz, even in CS:GO, it's a lot harder to notice a difference.
Personally I believe the newly announced 300hz 27" 1440p monitors[0] are going to be the perfect sweet spot for the foreseeable future. I imagine it will be a long time before technology emerges that is a noticeable improvement to this.
There are diminishing returns, but 360 Hz is still too low to display sharp-looking motion without strobing. 360 Hz strobing is visible as the phantom array effect whenever you move your eyes. If you are sensitive to this artifact and instead want motion that looks like real life, you need more like 1000 fps/1000 Hz. There is no hardware capable of this, but at a high enough frame rate you could probably get away with interpolating frames along motion vectors, e.g. from 500 fps to 1000 fps, with very minor latency/artifacting.
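A rough sketch of what interpolating along motion vectors could look like (illustrative Python/numpy only, not anybody's shipping implementation; it assumes the per-pixel motion field comes straight from the renderer rather than being estimated from the images):

    import numpy as np

    def interpolate_midpoint(frame, mv):
        # frame: (H, W, 3) rendered frame N
        # mv:    (H, W, 2) per-pixel motion in pixels from frame N to N+1
        h, w = frame.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w]
        # Push every pixel halfway along its motion vector (forward splat).
        sx = np.clip(xs + 0.5 * mv[..., 0], 0, w - 1).astype(int)
        sy = np.clip(ys + 0.5 * mv[..., 1], 0, h - 1).astype(int)
        mid = np.zeros_like(frame)
        mid[sy, sx] = frame[ys, xs]
        return mid  # naive: real code would fill holes and handle occlusion

The advantage over the optical-flow interpolation TVs do is that the engine already knows the true motion vectors, which is why the latency and artifacting could plausibly stay small.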
I am not convinced we need to go nearly that high - 300 Hz puts a crapload more stress on your system performance-wise for not much gain. 90 Hz is in my opinion already such a massive improvement over 60 that I do not see mind-blowing results even going to 120 or 144. And many pro gamers were using 120hz monitors over 144hz at one point.
Realistically I think the two sweet spots are 120hz and 240hz - not necessarily because they are the best of the best but because they are each divisible by both 24 and 30 (the most common FPS of films and television) AND they offer two tiers of increased performance for different hardware requirements. You can run a much more taxing game at 120 and then if you want to spend the big bucks on the latest hardware move up to 240.
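A quick sanity check of the divisibility point (throwaway Python, rates picked just to illustrate):

    # How many display refreshes does each content frame get?
    for hz in (60, 120, 144, 240):
        for fps in (24, 30):
            per_frame = hz / fps
            even = (hz % fps == 0)
            print(f"{fps} fps on {hz:3d} Hz: {per_frame:5.2f} refreshes/frame"
                  f" ({'even' if even else 'judder'})")

24 fps on 60 Hz lands on 2.5, hence 3:2 pulldown judder; 144 Hz handles 24 fps (6.0) but not 30 fps (4.8); 120 and 240 Hz divide both evenly.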
As for resolution I completely agree with you - 1440p is really a sweet spot for 27" monitors. If display/DPI scaling improves across operating systems then I think 4K will eventually become the norm for 27" monitors, and it will show some improvement, but again with diminishing returns, like the difference between 120 and 240. That being said, as more film content moves to 4K I think we will also start to see 1440p become less popular, as people will want to view content at a resolution that doesn't need scaling.
All of this however is nothing compared to the improvement that a true HDR display brings - a high-end monitor that can show a large increase in dynamic range is such a game changer, and I do not think most people realize it yet. It brings us so much closer to how the human eye really sees that I think it is equivalent to the jump from laserdisc resolution to 4K. And on top of that, now that cameras are also shooting in such massive dynamic ranges, it is going to make older content look plain in comparison.
> And many pro gamers were using 120hz monitors over 144hz at one point.
This was done solely to enable strobing: 144 Hz panels at the time were too slow to support it, since strobing at 144 Hz requires scanout speeds equivalent to roughly 200 Hz.
And strobing is only used because sample-and-hold (S&H) displays have too much transition and motion blur at 1xx and 2xx Hz.
> Realistically I think the two sweet spots are 120hz and 240hz - not necessarily because they are the best of the best but because they are each divisible by both 24 and 30 (the most common FPS of films and television)
In principle I agree that 120/240 Hz are better suited to general-purpose use for this reason [1], but on the other hand this really has nothing to do with the hardware; it is purely a software limitation. What one would really like to see is video playback causing a variable-refresh-rate display to adjust to a multiple of the exact video frame rate, instead of doing janky ad-hoc frame-rate conversions.
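Something like this, say (hypothetical sketch; no driver exposes such a policy today, and the function and VRR bounds are made up for illustration):

    def pick_refresh(content_fps, vrr_min=48.0, vrr_max=240.0):
        # Highest refresh rate that is an exact integer multiple of the
        # content frame rate and still within the panel's VRR range.
        best = None
        n = 1
        while n * content_fps <= vrr_max:
            if n * content_fps >= vrr_min:
                best = n * content_fps
            n += 1
        return best

    print(pick_refresh(23.976))  # 239.76 -> each frame held exactly 10 refreshes
    print(pick_refresh(25.0))    # 225.0  -> 9 refreshes per frame
    print(pick_refresh(29.97))   # 239.76 -> 8 refreshes per frame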
This is a common theme; hardware is generally much more capable than what the software/drivers allow everywhere you look.
> All of this however is nothing compared to the improvement that a true HDR display brings
Many people would probably already be quite happy with something that doesn't turn shadows into a foggy, cloudy mess like all IPS panels do, and VA panels as well (but less so).
[1] though 120 Hz does not solve the 50p problem: content produced by broadcasters in 50 Hz countries, which is basically all of the world that isn't the US, cannot easily be converted at playback time. 25p can just be handled like 24p with 1:1 playback - the resulting speed change is only 25/24, about 4%, and basically nobody notices the slight speed-up/slow-down. That's how films have been shown on television in 50 Hz countries since always.
There are definitely diminishing returns as refresh rates increase. But your comparison is nonetheless unfair: you are comparing a quadrupling of the refresh rate (60 to 240) with a mere 50% increase (240 to 360); the latter is more like going from 60 to 90. And with the advent of VR, demand for higher-refresh panels will only grow, since the difference is much more noticeable when using a headset.
As for VR, I think that's an excellent illustration of my point - we know that many people don't do well at 60 FPS per eye in VR due to motion sickness. Move up to 90 FPS per eye, though, and there is a massive improvement that I have seen firsthand in others. By the time you get to 120 FPS the experience feels pretty damn smooth, and while I would of course like to see more frames, I am not convinced going much beyond 120 is really worth it performance-wise, considering you have to render everything twice and the extra compute could instead be spent on the new shiny like ray tracing.
Luckily, when I was working at a VR startup I was one of the few people who never seemed to get motion sickness, so I became the test dummy for everyone's work - they would throw me into something they hadn't optimized at all yet, running at only 40 fps on a system with dual Xeons and quad top-of-the-line Nvidia workstation cards, and while it felt a little weird it for some reason never bothered me :-D
It doesn't make any sense to invest much in displays over ~90 Hz versus just working on adaptive refresh rates.
Your eyes really do work at a pretty low speed. At some point it makes more sense to just track the eyeballs and put updated scenery in front of them at the exact instant the game engine produces it, rather than try to run at some insanely high speed generating frames that aren't actually having any effect on the player's brain
90, 144, 240 Hz, etc. all look better than 60 Hz because there's less random lag between when the game generates a frame and when it appears on the screen. You can't see a fixed 8 ms delay, but you CAN see a variable 0-10 ms delay that happens as the game engine and computer monitor drift in and out of sync again and again.
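A toy simulation of that drift (illustrative Python; frame completion times are just uniform random, which is roughly what an unsynchronized engine looks like to the display):

    import random

    def vsync_wait_ms(display_hz, n=100_000):
        # A frame finishes at a random point inside the refresh interval
        # and is shown at the next refresh; the wait is the variable lag.
        period = 1000.0 / display_hz
        waits = [random.uniform(0.0, period) for _ in range(n)]
        return max(waits), sum(waits) / n

    for hz in (60, 144, 240):
        worst, avg = vsync_wait_ms(hz)
        print(f"{hz:3d} Hz: up to {worst:5.2f} ms of extra lag, avg {avg:4.2f} ms")

At 60 Hz that's a 0-16.7 ms wobble on top of everything else; at 240 Hz it shrinks to 0-4.2 ms, which is a big part of why higher refresh feels better even when you "can't see" the individual frames.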
>I just want to say I am glad pro gaming took over.
Yes! I have been crying about latency for nearly 10 years [1]. Computing has always optimized for throughput, and before pro gaming there just wasn't a marketable incentive for companies to work on minimizing latency. Now we finally have one!
Even in the best-case scenario the lowest latency is still 25 ms, and in most cases we are still above 50 ms. I think the Microsoft Research work on input latency is worth posting [2]. It would be nice if we could get average end-to-end system latency down to the sub-10 ms level, which is what I hope work on VR will bring us.
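A back-of-the-envelope of where the milliseconds hide (the stage numbers below are my rough guesses for a typical desktop, not figures from the MSR paper):

    # Illustrative end-to-end latency budget, in milliseconds.
    stages = {
        "mouse polling (125 Hz, avg half interval)": 4.0,
        "OS input handling + game tick":             8.0,
        "render + GPU queue":                       10.0,
        "scanout + panel response":                  8.0,
    }
    for name, ms in stages.items():
        print(f"{name:42s} {ms:5.1f} ms")
    print(f"{'total':42s} {sum(stages.values()):5.1f} ms")

Even with fairly generous numbers you land around 30 ms, which is why 1000 Hz mice, higher tick rates, and faster scanout all matter: every stage has to shrink to get anywhere near 10 ms.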
I believe that nonsense was originally sent into the world by the movie industry, to have an argument against letting film roll sizes and weights grow to disproportionate levels. Not to mention that the earliest film stock was also highly flammable, giving even more incentive not to make the rolls too big or store too many.
Part of that however is also highly related to motion blur - many big directors have run tests in theaters showing "HFR" content (like 60 fps), and audiences on average distinctly said they did not like it. The D-Day scene from Saving Private Ryan is a good example - it was not shown HFR, but they intentionally made the shutter speed faster to give it that staccato, jerky, gritty sort of feel. While in photography we use all kinds of shutter speeds for different effects (think of using a super fast shutter speed to freeze the propellers of a plane, or a very long shutter speed in a landscape photo so that a river becomes a nice smooth blur), the movie industry mostly abides by the "180 degree shutter" rule, meaning your shutter speed is 1/(2 × fps). So for most cinema, shot at 24 fps, the shutter speed is 1/48 of a second.
The importance of this is that because you are not shooting still frames but rather a series of frames to be played back quickly, this adds a motion blur effect that smooths the transition between frames and creates a certain artistic look. There are technical limitations to this blurring (medium-fast pans across a scene are a great example - the whole thing becomes too blurred and hard to see). But any scene with slower-moving objects, such as people, gets a natural-feeling motion blur that many cinematographers believe is the artistically ideal choice.
Now, that being said, you do not need to abide by the 180 degree shutter with modern cameras (as Saving Private Ryan shows), and one can choose a variety of shutter speeds for different scenes regardless of what FPS one is shooting at. A fast pan could be shot at something like 1/120, and even at 24 FPS it will appear much sharper and easier to make out individual objects (although perhaps not quite as smooth in the panning motion). You are, however, limited on the slow end to a shutter duration approximately equal to your frame interval (otherwise the shutter would be open LONGER than the frame itself, defeating the purpose of shooting "frames" in the first place).
So theoretically we could move to 48 FPS content, still shoot at 1/48, and have the same amount of motion blur PER FRAME but double the number of frames, which would be a large improvement in a technical sense. I haven't seen any films shot this way, but I have experimented quite a bit with my own camera at these kinds of speeds and it works quite well. You can also shoot at 24 FPS and drag the shutter to 1/24 to gain a full stop (double the amount of light) vs normal 24 FPS footage if you are shooting in a very dark environment that is already pushing the limits of your camera's sensor. This of course introduces even more motion blur, but depending on the scene it may not be very noticeable, or may even introduce interesting artistic looks.
TL;DR: I think we should move to 48 FPS and shoot most content at a variety of shutter speeds, the most common being the already-standard 1/48 s shutter, increasing or decreasing that within reason depending on the nature of the scene and the desired artistic outcome.
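The shutter arithmetic, for reference (just the standard shutter-angle formula in Python):

    def shutter_seconds(fps, shutter_angle_deg=180.0):
        # Exposure per frame: t = (angle / 360) / fps
        return (shutter_angle_deg / 360.0) / fps

    print(1 / shutter_seconds(24))         # 48.0 -> the classic 1/48 s
    print(1 / shutter_seconds(48))         # 96.0 -> 1/96 s: sharper per frame
    print(1 / shutter_seconds(48, 360.0))  # 48.0 -> 1/48 s at 48 fps, i.e.
                                           # same per-frame blur, double the frames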
I think people were saying 60 fps was the limit, but still, I agree.
That being said, in the Quake days I don't think monitors could go over 60 Hz anyway, so even at 120 FPS you were not gaining an advantage similar to what we have today. From what I remember, however, there were other advantages to high FPS in games like Counter-Strike in terms of player movement - the monitor might have "smoothed" the motion back down to 60 fps, but it still resulted in a more accurate experience.
I forget how refresh rates worked on CRTs though - maybe those could go higher than 60?
And of course you can overclock an LCD monitor quite easily - most will not go much higher, but there are some that I got to 90 Hz, which (in my opinion) is a massive improvement over 60; that 30 Hz difference is a much, much larger jump than the next jump from 90 to 120 Hz.
> I forget how refresh rates worked on CRTs though - maybe those could go higher than 60?
Yes - even the standard VGA mode 13h (320x200x8) is 70 Hz, and many CRTs could do 85 Hz. By Quake 3's time, CRTs that could do 120 Hz and above were very common. Personally I have such a CRT, as well as another that can do 160 Hz.
Also FWIW the refresh rate is only part of the story - CRTs have practically instant "response time", so 120 Hz on a CRT vs 120 Hz on an LCD feels very different (in favor of the CRT). Supposedly OLED could be made to come close, but personally I haven't seen such a case (and people who have both OLEDs and CRTs still say that CRTs are better there). I have a 165 Hz LCD and it doesn't hold a candle to the CRTs I have around in terms of motion feel.
Nowadays you can find small-ish CRTs for dirt cheap on Facebook Marketplace, etc. (some people even give them away for free) - I recommend trying to find one that can do 120 Hz, if for no other reason than to experience the liquid-butter smoothness of FPS motion (and to join us in lamenting its loss in modern monitor tech :-P). It's also kind of amusing that when those were new, chances are the PCs they were used with couldn't reach high framerates (and low framerates do not feel as bad on a CRT as on an LCD, though I'm not sure if that is related).
A large reason why CRTs are rather excellent in regards to motion is that they're not sample-and-hold displays, resulting in very low duty cycles (dominated by the phosphor's fall time of somewhere between 200-1000 µs, as rise time is basically instant at <20 ns or so and video bandwidth is well above 100 MHz). That's the main reason why a 240 Hz LCD using black frame insertion (BFI) for strobing (D=0.5) can't compete with a 120 Hz CRT (D~0.05 or so).
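For the curious, the blur math behind those duty cycles (the usual rule of thumb: perceived blur while eye-tracking ≈ persistence × motion speed; the 960 px/s tracking speed below is just an assumption):

    def blur_px(duty_cycle, refresh_hz, speed_px_per_s=960.0):
        # Each frame stays lit for duty_cycle / refresh_hz seconds;
        # while the eye tracks motion, the image smears by that much.
        persistence_s = duty_cycle / refresh_hz
        return persistence_s * speed_px_per_s

    print(blur_px(0.05, 120))  # 120 Hz CRT, D~0.05: ~0.4 px -> razor sharp
    print(blur_px(0.50, 240))  # 240 Hz BFI, D=0.5:  ~2.0 px of smear
    print(blur_px(1.00, 120))  # 120 Hz sample-and-hold: ~8.0 px -> blurry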
Interesting - I had a nice high end 21" CRT at one point I got for free when some tech company went under and told the building maintenance to just trash all of the brand new equipment. Luckily my uncle was that maintenance guy and I got free pick of whatever parts I wanted before it went to the landfill.
I do remember being VERY good at the original Counter-Strike (pre-1.5), comparatively - I know CRTs have very low input lag - I wonder if I was playing at higher fps/Hz and didn't even realize it!
From what I remember, VGA could actually do some decent resolutions (SVGA and whatnot were more limited), and DVI was starting to rear its head around that time as well. I vaguely remember using a resolution like 1900 x 1200, which is about what modern 1080p HD does (slightly higher in fact).
Now this is making me wonder how my plasma TV actually compares - from what I remember plasmas do not have a "hertz" so to speak, but they also weren't really coveted for gaming (although burn-in may be to blame for that). Input lag on it seems decent, but I would guess that might be its big limitation. Surprisingly it does do 10-bit video, and while it won't accept an HDR signal I suspect the display itself is capable of showing more dynamic range than many of the cheaper "HDR" LCDs.
I'm pretty sure I was playing Counter-Strike at, as reported by the game, 99 fps on my Sony Trinitron back then (2000?). We'd use "low poly" mods (simplified 3D character models) to speed everything up and reach those speeds.
There are still messages on message boards from this era where several people mention reaching that same 99 fps.
It was two decades ago but I'm pretty sure it was 1024x768 @ 99 fps (19" Sony Trinitron). I may be mistaken on the monitor though.