
I'm not sure I follow. CRTs draw the image to the screen in a fundamentally different way than modern displays, because the electron beam sweeps sequentially left to right and top to bottom. This analog process, with horizontal scan rates around 15 or 25 kHz, is what gives authentic arcade machines their look and feel. Same for old computer terminals. My understanding is that to reproduce this effect on a modern display, you'd need an extremely high refresh rate. To properly replicate it, some pretty low-level aspects of the system have to be addressed. Hardware limitations are bound by the laws of physics, after all.

Beyond just the aesthetics, there are practical reasons why this is important, whether it be lightgun idiosyncrasies or how the game "feels," which can affect timing and such for competitive players. There's a lot more to preserving the look, feel, and compatibility of displays for old computer systems than most realize, and the rabbit hole goes quite deep on this one.



    there are practical reasons why [how the electron gun works is]
    important, whether it be lightgun idiosyncrasies or how the
    game "feels,"
This is always interesting to discuss because there are so many factors at play! To put it in less than a zillion words:

The way a game "feels" in this context is essentially a function of input latency. The old-style "chasing the beam" hardware, plus a CRT display, equals something very close to a true zero lag environment.

Here's a breakdown of the input lag in a modern environment, for contrast. These are all latencies that don't exist in something like, say, a Sega Genesis/Megadrive hooked up to a CRT: http://renderingpipeline.com/2013/09/measuring-input-latency...
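To make the contrast concrete, here's a sketch of how those latency sources add up. The individual figures are illustrative order-of-magnitude assumptions (not measurements from the linked article):

```python
# Illustrative (assumed) latency budget for a modern emulation stack,
# contrasted with original hardware scanning out directly to a CRT.
modern_stack_ms = {
    "usb_controller_polling": 8,     # e.g. 125 Hz polling, worst case
    "emulator_frame_buffering": 16,  # one emulated frame at 60 Hz
    "compositor_vsync_queue": 16,    # one display frame of buffering
    "display_processing": 10,        # scaler / panel input lag
}

total = sum(modern_stack_ms.values())
print(f"modern stack: ~{total} ms of added latency")
print("original hardware + CRT: ~0 ms beyond the scanout itself")
```

Even with generous assumptions, several frames' worth of delay accumulate before a button press shows up on screen, which is the "feel" difference being described.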

In an ideal emulation situation, you could theoretically recreate something close to a zero-lag analog environment (in terms of latency) without necessarily simulating the path of the electron beam itself.

Although, as the linked article implies, there are a lot of bits in the emulation stack that would need to be optimized for low latency. High refresh rate displays get you part of the way there "for free."


Not everything is a game that requires minimum latency, though. For, say, a terminal, or a CDC 6x00 console, some lag is perfectly acceptable.


Sure, and even many games don't particularly benefit from it. However, it's a really remarkable thing to play e.g. Mega Man or Smash Bros. in a true lag-free environment.


I wonder about that. Might it be that a specialized display controller on, say, an OLED display would be enough?

You could then have the controller artificially drive it line by line instead of refreshing the whole screen.


Perhaps. One issue I foresee is the way CRTs glow. The phosphor doesn't light up or dim immediately the way an LED does, so there's some amount of fade-in/fade-out that happens on a CRT as the beam moves across the screen. I imagine this could be difficult or impossible to reproduce with a traditional OLED screen. Some old games rely on this, along with the slow refresh rates, to create a sort of dithering/aliasing effect.


Phosphor decay is not terribly difficult to simulate to an acceptable degree. Doing it at the pixel level is pretty easy, doing it at the phosphor level is computationally harder but not much more complicated.

The larger issue w.r.t. this specific quirk of CRTs is that we're running out of human beings that are familiar with what this is "supposed" to look like, and actually care.

I care a lot, but I'm old.


I'm not aware of any cases where it's been emulated in an acceptable manner. I can't be bothered to do the math myself, but I imagine doing this well would be beyond the capabilities of modern displays (probably in the thousands of Hz of refresh rate). Maybe some special FPGA-based controller with an OLED, as was suggested above, could make it possible. I'm not sure.
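For what it's worth, the back-of-envelope math is short, assuming you want one display refresh per scanline of the original NTSC signal:

```python
# Rough calculation of the refresh rate a display would need in order
# to present a CRT's scanout one line at a time (NTSC timings).
lines_per_frame = 525              # total NTSC lines, including blanking
frames_per_second = 30000 / 1001   # ~29.97 Hz
line_rate_hz = lines_per_frame * frames_per_second
print(f"~{line_rate_hz:.0f} Hz")   # ~15734 Hz -- well beyond current panels
```

So a line-accurate rolling scan lands around 15.7 kHz, which supports the "thousands of Hz" intuition.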


Can you talk more about why you feel it would be infeasible? I'm a guy with a house full of CRTs so I am genuinely interested.

What sorts of things are advanced filters like CRT-Royale missing? https://www.google.com/search?q=crt-royale

Each individual phosphor dot on a CRT is not terribly tricky to emulate.

The brightness at any given moment is a fairly simple decay function based on how long it's been since you lit it up with the electron gun. On top of that, you would typically want to apply some level of bloom to simulate the way light is diffused by the glass. Sure, you've got a few million dots to simulate, but this is also an embarrassingly parallel problem.

Now of course, admittedly, you're only simulating that phosphor glow decay at the refresh rate of your monitor -- 60 Hz, 144 Hz, 240 Hz, whatever -- instead of the effectively infinite number of steps you'd get in real life. However, I don't think that is a practical issue.
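A minimal sketch of that per-dot decay model, treating it as exponential decay since the last beam excitation (the time constant is an assumption for illustration; real phosphors like P22 have their own curves, which aren't purely exponential):

```python
import math

# Phosphor brightness as exponential decay since excitation.
# tau_s is an assumed persistence constant, not a measured value.
def phosphor_brightness(peak, elapsed_s, tau_s=0.002):
    return peak * math.exp(-elapsed_s / tau_s)

# One dot hit by the beam at full brightness, then sampled at a few
# subsequent 240 Hz display refreshes:
for n in range(4):
    t = n * (1 / 240)
    print(f"{t * 1000:.2f} ms: {phosphor_brightness(1.0, t):.4f}")
```

Evaluating this independently for every dot on every refresh is exactly the embarrassingly parallel part: each dot depends only on its own last-excited time.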

You're clearly thinking of factors I'm not and I'm genuinely interested. To my mind, the visual aspects of CRTs are pretty easy to simulate, but not the near-zero lag.


The thing you can't emulate is the phosphor coating. It simply looks different because the light isn't coming from a backlight; the display surface itself is shining. And in vector graphics you don't have pixels at all; the light shines quite beautifully in a way I don't think is possible at all with backlit displays.


> The thing you can't emulate is the phosphor coating. It simply looks different because the light isn't coming from a backlight; the display surface itself is shining.

I did say OLED, not LCD, precisely because of that.


Are you sure it's the decay they're using, and not the natural blurriness and texturing?

And some phosphors have a much longer decay than others, but you could easily emulate those long-term effects on a normal screen.


That, and the fade follows a nonlinear curve. It's pretty cool, but it's quite a lot of math to match the physics going on.


It would need to redraw the whole screen to account for the phosphor decay. To do that with line resolution and an NTSC signal, you'd have to redraw it roughly 15,000 times per second (60 fields of about 250 visible lines each). You'd draw the current line at full brightness and "decay" the rest of the frame according to the phosphor persistence. Since there is some quantization, you could reduce the frequency of decay updates as a line gets older.
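A toy version of that rolling-scan scheme, using a single assumed per-line-period decay factor and a tiny frame for illustration:

```python
# Toy rolling-scan framebuffer: on each line-period tick, the current
# line is lit at full brightness and every other line decays.
DECAY = 0.7   # assumed retention per line period (illustrative only)
LINES = 8     # tiny frame; a real NTSC field has ~250 visible lines

frame = [0.0] * LINES
for beam_line in range(LINES):
    frame = [b * DECAY for b in frame]  # decay the whole frame
    frame[beam_line] = 1.0              # draw current line at full brightness

# After one pass, the most recently drawn lines are the brightest,
# reproducing the trailing glow behind the beam:
print([round(b, 3) for b in frame])
```

The quantization trick mentioned above would amount to updating old, dim lines less often, since their per-tick change quickly drops below one output quantization step.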



