An input lag investigation
-
Hi!
First of all: great job!!! So basically, I guess those two extra frames of delay (OpenGL vs. Dispmanx, and Linux vs. Windows) are what I perceive and what bugs me :) Hmm, I'm so addicted to pixel shaders, it'll be difficult to switch to Dispmanx! If it were possible to use scanlines with Dispmanx, it would be a nice solution.
With the same display and joysticks, I compared the same NES / PC Engine games on my FPGA computer (MiST) and the Raspberry Pi, and the difference is significant (meaning: big enough to notice without taking measurements). Some fast PC Engine games like Star Soldier seem unplayable on the Pi (because the ship moves extremely fast), while I perceive no delay on the MiST. In some other games it's barely noticeable. It mainly depends on sprite speed, I guess!
Personally, I made the configuration tweaks everyone does (enable hard sync, disable multithreaded video, set a frame delay), but I have no way to measure their effect. They may make things better, but input delay is still noticeable. There are also options to poll input at different times.
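For reference, these are the retroarch.cfg lines I mean (my values, just a sketch to tune per system - on RetroPie they live in /opt/retropie/configs/all/retroarch.cfg):
video_hard_sync = "true"          # sync CPU to GPU (apparently may do nothing on the Pi)
video_threaded = "false"          # disable multithreaded video
video_frame_delay = "6"           # ms to wait after vsync before running the core
input_poll_type_behavior = "2"    # poll input late, if your build has this option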
It's pretty cool that some people are paying attention to this issue. I guess casual gamers don't mind, but it makes a big difference when playing fast arcade games seriously!
-
BTW, Brunnis, thanks a lot for your fix too! Will it get into the main tree? It would be great if the libretro devs checked the other cores :)
-
@dankcushions could you walk me through it?
I'm taking a guess at how to update... From the main menu, do I go: RetroPie > RetroPie-Setup > Manage packages > Update all installed packages, OR Manage main packages > lr-fceumm/lr-snes9x? And then do I choose update from binary OR update from source?
Thanks
-
They can be gotten by updating the individual lr-snes9x-next and lr-fceumm emulators by binary in the 'main' section of the packages menu of the RetroPie-Setup script.
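If you prefer the command line, the setup script can also be run directly (stock install path):
cd ~/RetroPie-Setup
sudo ./retropie_setup.sh
then it's Manage packages > Manage main packages > lr-snes9x-next (or lr-fceumm) > Update from binary.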
-
Well, I guess I followed it correctly, and I think I do see the difference, thank you. When I mash the sword in Zelda, for example, I feel like it's reacting differently than it used to.
What's the difference between updating by binary and by source?
-
First of all, I just want to say great work to the OP for doing such a thorough investigation, and thanks also to the devs for implementing the delay-reducing change into lr-snes9x-next and lr-fceumm.
I recently updated my RetroPie from my old 3.7 setup to the latest 3.8.1 just so I could install the updated emulators with these changes, and here are my findings so far:
- Running on the default OpenGL display driver with Floob's Video Manager overlays/shaders applied to most of my emulators (NES, Genesis, SNES, etc.), I didn't really notice a discernible reduction in input delay from before the update.
- Then I switched to the Dispmanx display driver, and I noticed a significant reduction in input delay. It was pretty freaking fantastic, as even tighter platformers like DKC and Super Mario World felt nearly native.
- BUT running on Dispmanx means that I can't use any of the shaders/overlays that I've enjoyed up until this point.
So now I'm faced with a tough decision. Do I want to enjoy the awesome CRT overlays/shaders that give all of my console emulators that authentic look and feel at the expense of some input delay, or do I want my gameplay to be as responsive as possible but with only a simple bilinear filter applied and no overlays?
I'm having a hard time choosing. Why is it that OpenGL inherently has more input lag than Dispmanx? Is there theoretically anything that can be done to achieve parity between the two?
-
@ScOULaris said in An input lag investigation:
I'm having a hard time choosing. Why is it that OpenGL inherently has more input lag than Dispmanx? Is there theoretically anything that can be done to achieve parity between the two?
The GL driver has to do lots of extra processing (i.e. running the shader) that the Dispmanx driver doesn't. Keep in mind the shader has to be run for every pixel on the screen - which for a 1080p screen is about 2 million times. You can see where the extra lag comes from.
Lag parity isn't possible because of this. With a really fast video card it could be reduced but that's not an option on the Pi.
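To put a rough number on it (back-of-the-envelope, assuming a 1920x1080 output at 60 Hz - you can check it in a shell):
echo $((1920 * 1080))       # 2073600 shader runs per frame, roughly 2 million
echo $((1920 * 1080 * 60))  # 124416000 shader runs per second at 60 Hz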
-
Question about the NES and SNES input lag fix. Should I remove the video hard sync and frame delay settings after applying the fix? I know those were previously known ways to reduce input lag slightly, but will they throw anything off with these new fixes in place?
-
Video hard sync does nothing on the RPi.
-
@BuZz Should this be removed from the Git page then?
https://github.com/retropie/retropie-setup/wiki/Overclocking#improving-input-lag-and-delay
-
@silentq yes. The wiki is edited by the community so mistakes do happen.
-
@davej said in An input lag investigation:
@ScOULaris said in An input lag investigation:
I'm having a hard time choosing. Why is it that OpenGL inherently has more input lag than Dispmanx? Is there theoretically anything that can be done to achieve parity between the two?
The GL driver has to do lots of extra processing (i.e. running the shader) that the Dispmanx driver doesn't. Keep in mind the shader has to be run for every pixel on the screen - which for a 1080p screen is about 2 million times. You can see where the extra lag comes from.
Lag parity isn't possible because of this. With a really fast video card it could be reduced but that's not an option on the Pi.
Hm. So I understand how a shader can add more input delay (increasing with the complexity of the shader). Would lowering the RetroArch render resolution to 720p instead of the default 1080p for SNES have any impact on input delay, since that's roughly half the pixels that need to be processed by the shader?
-
The default is not 1080p. The default is the video output resolution, so it depends on your screen. AFAIK it will affect it, as the final scale from render res to video output res is done via dispmanx. I tend to run my TV in 720p for RetroPie anyway, switching it in
/boot/config.txt
-
@BuZz said in An input lag investigation:
The default is not 1080p. The default is the video output resolution, so it depends on your screen. AFAIK it will affect it, as the final scale from render res to video output res is done via dispmanx. I tend to run my TV in 720p for RetroPie anyway, switching it in
/boot/config.txt
Yeah, for me the video output resolution is 1080p, so the default render resolution right now is 1080p for all of my emulators as well. Should lowering the render resolution to 720p theoretically reduce input lag because the shader overhead is lower, or would it actually increase input delay because it would add an extra scaling step to bring the render resolution up to the video output res?
-
It should reduce lag in theory, but instead of doing that, if your TV upscales from 720p well, you could just use a 720p video mode. It is possible that screens that have a delay due to post-processing could be quicker at a lower res too.
dispmanx also scales with a filter by default, so it might look better with a 720p video mode and the TV doing the scaling vs. dispmanx (I prefer it on my TV).
-
@ScOULaris said in An input lag investigation:
Hm. So I understand how a shader can add more input delay (increasing with the complexity of the shader). Would lowering the RetroArch render resolution to 720p instead of the default 1080p for SNES have any impact on input delay, since that's roughly half the pixels that need to be processed by the shader?
Yes, but probably not as much as you'd think. You'll also have poorer image quality to deal with. crt-pi's scan lines work best when scaling up 4x or more. If you are going to use a lower resolution than that, I'd suggest using integer scaling - and don't go below 3x.
Scaling in the display hardware is essentially free, but you are probably better off setting your Pi to a 720p screen mode and letting your TV do the upscaling. Try both and see which works best. If the scan lines are hard to see at such a low resolution, try setting MASK_TYPE to 0.
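If you want to try those: integer scaling is a single line in retroarch.cfg, and MASK_TYPE is a define near the top of the crt-pi shader source (the shader path varies by install, so take it as a sketch):
video_scale_integer = "true"
and in crt-pi.glsl:
#define MASK_TYPE 0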
-
Just to detail my TV/Kodi setup downstairs:
/boot/config.txt contains
hdmi_group=1
hdmi_mode=4
which is 720p @ 60 Hz
/opt/retropie/configs/all/autostart.sh contains
sudo mount -t cifs -o username=blah,password=blah //mynas/retropie /home/pi/RetroPie
kodi #auto
emulationstation #auto
Kodi is configured to display and output at 1080p (it can change the video mode itself),
then I exit Kodi, and I'm ready to game (in 720p).
The only other thing I have to do is switch the TV to game mode and back.
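A quick way to confirm which mode the Pi actually picked is tvservice (it ships with Raspbian) - it prints the current HDMI state and resolution:
tvservice -s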
-
How do you change the display driver to Dispmanx? I assume mine is running stock; I never knew how to change it. I've also yet to learn about shaders, so I'm not too worried about losing that ability.
-
you can do it via the configuration editor (in advanced mode)
https://github.com/retropie/retropie-setup/wiki/Configuration-Editor
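Under the hood it just sets the video driver line in /opt/retropie/configs/all/retroarch.cfg, so you can also flip it by hand:
video_driver = "dispmanx"
(set it back to "gl" for the default OpenGL driver)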
-
Thank you, I found it.
Unfortunately, I think I noticed WORSE play on dispmanx... I can tell that the video quality is worse, but mashing jump in Mario World created a worse experience than before with GL. Also, trying to do a "speed run" caused some jumps not to register, sending me plowing right into enemies...