This article confirmed my own nagging suspicions:
It’s a bit absurd that a modern gaming machine running at 4,000x the speed of an Apple II, with a CPU that has 500,000x as many transistors (and a GPU that has 2,000,000x as many transistors), can maybe manage the same latency as an Apple II in very carefully coded applications if we have a monitor with nearly 3x the refresh rate.
I suspect my sensitivity to low latency will soon be the norm (if it’s not already) as computer-driven peripherals become ever more commonplace in our daily lives.
Not only is it too early to put Redox to the test on this, but I assume a high-speed camera isn’t always easy to come by. Still, I wonder whether “input latency < x on SomeDevice” could be a valuable design goal for Redox OS.
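If such a goal were ever adopted, a first approximation could be checked in software before reaching for a camera. The following is only a minimal sketch, assuming a hypothetical harness that timestamps an input event when the driver reads it and again when the compositor presents the resulting frame; the `LatencyProbe` type, the 20 ms budget, and the simulated pipeline are illustrative assumptions, not anything that exists in Redox today.

```rust
use std::time::{Duration, Instant};

/// Hypothetical latency probe: collect input-to-present durations so a
/// budget like "p99 < 20 ms" could be asserted in an automated test.
struct LatencyProbe {
    samples: Vec<Duration>,
}

impl LatencyProbe {
    fn new() -> Self {
        Self { samples: Vec::new() }
    }

    /// Record one sample: the Instant captured when the input event was
    /// read and the Instant captured when the resulting frame was presented.
    fn record(&mut self, event_received: Instant, frame_presented: Instant) {
        self.samples.push(frame_presented.duration_since(event_received));
    }

    /// Return the given percentile (0.0..=1.0) of the recorded latencies.
    fn percentile(&mut self, p: f64) -> Option<Duration> {
        if self.samples.is_empty() {
            return None;
        }
        self.samples.sort();
        let idx = ((self.samples.len() - 1) as f64 * p).round() as usize;
        Some(self.samples[idx])
    }
}

fn main() {
    let mut probe = LatencyProbe::new();
    // Fake event loop: in a real harness these Instants would come from the
    // input driver and the compositor's present callback respectively.
    for _ in 0..100 {
        let received = Instant::now();
        std::thread::sleep(Duration::from_millis(5)); // simulated pipeline work
        probe.record(received, Instant::now());
    }
    if let Some(p99) = probe.percentile(0.99) {
        println!("p99 input-to-present latency: {:?}", p99);
        assert!(p99 < Duration::from_millis(20), "latency budget exceeded");
    }
}
```

A software-only probe like this obviously can’t see the keyboard scan or the display’s own lag, which is exactly why the high-speed-camera measurements matter, but it would at least catch regressions in the part of the pipeline the OS controls.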