True rasterdemo ("Raster Interrupts"-like) working on GeForce/Radeon! -- Comments?

category: general [ glöplog

There are also mid-screen input reads for same-frame response.

e.g. a mid-screen button read for bottom-of-screen pinball flippers, for fast snappy response and all of the original input latencies.
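Roughly like this, as a minimal C++ sketch; pollHostButtons(), emulateScanline() and the line numbers here are hypothetical stand-ins for an emulator's real input/video routines:

```cpp
#include <cstdint>

// Hypothetical stand-ins for the emulator's real input/video routines.
std::uint32_t pollHostButtons();                       // read host pad/keyboard now
void emulateScanline(int line, std::uint32_t buttons); // emulate one scanline

constexpr int kTotalScanlines  = 262; // e.g. one NTSC field
constexpr int kFlipperReadLine = 200; // line where the game samples flipper input

// Re-poll host input mid-screen, so a button pressed during the frame can
// still reach the bottom-of-screen flippers within that same frame.
void emulateOneFrame() {
    std::uint32_t buttons = pollHostButtons();  // frame-start read
    for (int line = 0; line < kTotalScanlines; ++line) {
        if (line == kFlipperReadLine)
            buttons = pollHostButtons();        // fresh mid-screen read
        emulateScanline(line, buttons);
    }
}
```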

I’m probably being ignorant, but are our reflexes faster than 1/50th of a second?

Scratch that... I think there are bobsledders who compete in the ±1/100 s league.

If you can anticipate the motion, then definitely. For a good drummer or comping rhythm player, inadvertently missing a beat by a 50th of a second is a mistake.

Elite sprinters react in just a little over 100 ms (under 100 ms is considered impossible and counted as a false start). But it's not about reflexes here (which is about reacting to an input), just noticing whether you got a response in 50 ms or 100 ms (which is about measuring reaction to your output), which is entirely possible.

Yeah, you can definitely tell if your drum sounds suddenly come 20 ms later. So yeah... I can kind of understand it. It might help in using ProTracker or Deluxe Paint by making the keyboard and mouse feel better. But for watching demos it shouldn't provide any improvement.

>"Elite sprinters react in just a little over 100 ms (under 100 ms is considered impossible and counted as a false start). But it's not about reflexes here (which is about reacting to an input), just noticing whether you got a response in 50 ms or 100 ms (which is about measuring reaction to your output), which is entirely possible."



There's the race-to-finish effect. The best eSports players in the tightest leagues are dependent on milliseconds, much like 100 meter sprinters may cross the finish line only milliseconds apart. Y'know, "see-react-shoot" at the same time after going around a corner is kind of like the sprint to the finish. In this situation, milliseconds can matter.



In a different situation, consider lag offsets in reaction expectations. For example, an archery game with a moving target: timing the vertical arrow shot as the target moves horizontally. If the archery target is moving 1000 pixels/second on your display (1 ms = 1 pixel), timing your shot 10 ms too early or 10 ms too late means the bullseye is 10 pixels to the left or to the right by the time the arrow hits the target. So when you meet a different system with a different lag (e.g. an emulator with 16 ms more lag than a different emulator), it interferes with your pre-trained archery aim.
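The arithmetic is trivial, but here it is as a tiny self-contained C++ check of the numbers in that example:

```cpp
#include <cstdio>

int main() {
    const double targetSpeedPxPerSec = 1000.0; // 1000 px/s, i.e. 1 px per ms
    const double timingErrorMs       = 10.0;   // shot released 10 ms early/late
    const double missPx = targetSpeedPxPerSec * timingErrorMs / 1000.0;
    std::printf("bullseye is %.0f px off when the arrow lands\n", missPx); // 10
    return 0;
}
```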



Etc.



Using beamracing preserves 100% of the original input lag mechanics, which means you get the same lagfeel as the original arcade/console/etc. machine. For some, that can be important.
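For the curious, here's roughly what frame-slice beamracing looks like as a C++ sketch. emulateSlice(), presentWithTearing(), waitForVBlankStart() and getHostScanline() are hypothetical helpers; on Windows the scanline query could be backed by the real D3DKMTGetScanLine call, and the present is a VSYNC OFF flip so the tearline lands wherever the beam currently is:

```cpp
// Hypothetical host-side helpers for this sketch.
int getHostScanline();       // current raster line of the real display
void emulateSlice(int s);    // emulate/render one horizontal band
void presentWithTearing();   // flip immediately; tearline lands at the beam
void waitForVBlankStart();   // sync to the top of the host refresh

constexpr int kSlices = 10;  // more slices = tighter chase, less added lag

// Frame-slice beamracing: keep the emulated raster roughly one slice ahead
// of the real one, so each present's tearline hides above the fresh pixels.
void beamRaceFrame(int hostLinesPerFrame) {
    const int sliceHeight = hostLinesPerFrame / kSlices;
    waitForVBlankStart();
    for (int s = 0; s < kSlices; ++s) {
        emulateSlice(s);        // render band s into the back buffer
        presentWithTearing();   // band s appears just below the beam
        // pace: don't start band s+1 until the beam has entered band s
        while (getHostScanline() < s * sliceHeight) { /* spin or yield */ }
    }
}
```

More slices means the emulated raster chases the real one more tightly, trading busy-wait CPU time for less added lag on top of the original machine's.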



Either way, everyone has different reasons for reducing emulator lag.

I have used "the UFO" when comparing my gaming screen to a newer one. I have posted my skepticism vis-à-vis blur reduction techniques and 2D as a measurement on your forum as HenrikErlandsson.



And from this I conclude it's incorrect to use your test for the new techniques, and I would like you to focus hard (for flat screens, that is) on extreme-brightness 200/240 Hz screens offering black frame insertion. This is the only thing that will give flatscreens a chance to outperform CRTs for real-time graphics. (As in, no, not "RTG", although it could be, some day. ;))
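The brightness cost is easy to see in a sketch. Assuming hypothetical presentVsynced()/kBlackFrame helpers, 60 fps content on a 240 Hz panel with black frame insertion spends only one refresh in four lit:

```cpp
struct Frame { /* pixel data elided */ };

// Hypothetical presentation helpers for this sketch.
void presentVsynced(const Frame& f);  // show f for exactly one refresh
extern const Frame kBlackFrame;       // an all-black frame

constexpr int kRefreshHz      = 240;
constexpr int kContentHz      = 60;
constexpr int kBlackRefreshes = kRefreshHz / kContentHz - 1; // 3

// Black frame insertion: one lit refresh followed by three black ones is a
// 25% duty cycle. Motion clarity becomes CRT-like, but brightness drops to
// a quarter, hence the call for extreme-brightness panels.
void presentWithBFI(const Frame& f) {
    presentVsynced(f);
    for (int i = 0; i < kBlackRefreshes; ++i)
        presentVsynced(kBlackFrame);
}
```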



~



I have shamelessly taken the opportunity to air my pet peeve (well, anyone's pet peeve, surely) with non-CRT displays off the back of you just posting a question. Sorry about that.



In answer to your question: raster is nothing to chase when you have the luxury of a graphics card taking care of far more demanding work. If you want it oldschool, just pick your favorite oldschool platform and a lovely CRT, and enjoy coding without blur :)