Updated crt-pi shader
-
@aikon82
It sounds like the emulator and shader are fighting for memory bandwidth. Although the shader can run at 1080p@60Hz, it can only do so if other parts of the system don't do too much memory access at the same time. On the Pi2 (I don't have a Pi3 to test with), some SNES games are OK and some just push it over the limit and drop occasional frames. As the Pi3 has similar memory to the Pi2, I suspect the same thing is happening for you.
Things you can do to help:
- Overclock the memory as fast as it will go.
- Use a more efficient emulator if there is one. (Some need fewer memory accesses than others.)
- Change the audio resampler driver to 'nearest'.
You can also try making the following changes to the shader (there's a sketch of the corresponding switches at the end of this post):
- Don't enable CURVATURE or SHARPER.
- Try enabling FAKE_GAMMA.
- Disable GAMMA altogether.
- Set MASK_TYPE to 0.
- Disable MULTISAMPLE.
- Disable SCANLINES.
Although the shader changes don't actually reduce the amount of memory it needs to access, they do mean the GPU can better tolerate the CPU accessing memory when the GPU wants to.
Ironically, Pi1s and Pi Zeros have less of a problem with this issue because you can run their memory faster. They just have a problem running the emulators fast enough. ;)
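Assuming the options are exposed as #define switches near the top of crt-pi.glsl (check your copy for the exact layout and defaults - this is only a sketch), the changes above would look roughly like this:
//#define CURVATURE        // commented out = disabled
//#define SHARPER
#define GAMMA
#define FAKE_GAMMA          // cheaper gamma approximation; as a further step, comment out GAMMA too
#define MASK_TYPE 0
//#define MULTISAMPLE
//#define SCANLINES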
-
You could check if you have any custom aspect ratio set with "aspect_ratio_index" and make sure that "video_scale_integer" is set to "true". If I set either of these differently I also get the lag; otherwise, a constant 60fps.
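For reference, the relevant retroarch.cfg lines would look something like this (just a sketch - leave aspect_ratio_index at its default rather than forcing a custom value):
video_scale_integer = "true"
# aspect_ratio_index = ...   (only set this if you deliberately want a non-default aspect ratio)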
-
Thanks davej/zonitz for the info, I'll have a look when I'm back home later tonight. I don't think I've touched the aspect ratio at all, but the video scale integer is set to false if I remember correctly.
In raspi-config there's a memory split option, can you please advise if there are any settings I should change there?
-
Just to report back, I tried setting "video_scale_integer" and got 60fps or just under. But I imagine that's due to there being less screen to process (black borders top and bottom). By the way, I am using the following overclock settings:
gpu_mem=256
arm_freq=1400
over_voltage=6
sdram_freq=500
core_freq=500
gpu_freq=400
I did however notice a speed increase when using lr-genesis-plus-gx instead of lr-picodrive. Thanks guys for the info!
-
@aikon82 said in Updated crt-pi shader:
I did however notice a speed increase when using lr-genesis-plus-gx instead of lr-picodrive. Thanks guys for the info!
Hello, today I noticed the performance issue with lr-picodrive and the crt-pi shader. It was super apparent playing Sonic. Then I started looking closely at other Sega games and it was across the platform. I switched to lr-genesis-plus-gx and everything was smooth again. I don't know if this shader causes performance problems anywhere else, such as arcade games. I would have to experiment more.
Pi Model: 3 Model B
RetroPie Version Used: 4.0 RC1
Built From: SD Image
USB Devices connected: MS wired keyboard
Controller used: XinMo
-
@GreenHawk84 said in Updated crt-pi shader:
I don't know if this shader causes performance problems anywhere else, such as arcade games. I would have to experiment more.
If it is affecting arcade games, I don't really notice, or at least I am more excited about how it looks than how it plays. I have it running on over 100 arcade titles across lr-fba-next and lr-mame2003 and it plays nicely. I even have curvature enabled, Pi3, 1280x1024, usually fullscreen scaling.
-
@caver01, I run RetroPie on a 1920x1080 screen, and I haven't played with the resolution of anything. Is there a difference between enabling high resolutions vs. whatever RetroPie renders at by default? Do you recommend anything for a 1920x1080 screen? And if so, where do I go in RetroPie to start messing with resolutions?
-
@GreenHawk84 A while back, the default emulator resolution was changed in RetroPie so that the cores now render at the resolution you are running on your Pi. There was a lot of discussion about how this would affect performance, but in the long run, I think it was a good decision. You can always drop it down using runcommand. I recommend experimenting with that a bit so you know what effect it has on your setup.
For example, I run everything I can at 1280x1024, but it can be interesting to see what happens if I specify 640x480 using runcommand. This may improve performance, as many games might run in that resolution (or lower) anyway, but it forces my LCD to upscale the image to fill the screen. This is fine for some folks and ideal for anyone that is building portables around small screens, but it effectively hands the scaling process over to your display.
Those of us who run HD (or near-HD) displays and want to reproduce the CRT look will be unhappy with LCD scaling. It's better to run at the native resolution of the display so you can get the most out of shaders. Shaders can leverage the extra pixels to simulate RGB triads and shadow masks. That does come with a potential performance compromise, but if the magnification impact is minimal, and if the shaders can be built in a way that combines the least impact with the best effect, running at a higher resolution makes a lot of sense. @davej is the real expert here, as he built the crt-pi shader to perform well at 1080p, but as you can see from his description above, memory access constraints can affect performance.
Everyone has their own opinion about it, but I feel that at my resolution (1280x1024) the crt-pi shaders are amazing on the Pi3, and running this way is why I choose libretro cores whenever possible. Playability of the games is obviously important, but a big part of what puts the "retro" in RetroPie for me is the sense of nostalgia that comes from a realistic look and feel, which is why I even enable curvature on my setup. So, that means high-res rendering with CRT shaders enabled whenever possible.
The holy grail for me will be the ability to render vector games at a display's native resolution with gorgeous glow/bloom CRT effects--especially on monochrome (no scanlines) games.
-
@caver01 said in Updated crt-pi shader:
ideal for anyone that is building portables around small screens, but it effectively hands the scaling process over to your display.
crt-pi shaders are amazing on the Pi3, and running this way is why I choose libretro cores whenever possible.
Could not agree more!
-
Just to clarify a couple of things:
- Changing the screen mode - scaling done by screen
- Changing the render resolution (retroarch only) - scaling done by RPI hardware (no performance loss)
By default the render resolution is the same as the video output resolution. You can lower this for all retroarch emulators via the configuration editor, or per system. Via the runcommand launch menu you can further change this on a per emulator or per rom basis.
Video output resolution can be changed via runcommand on a per emulator or per rom basis.
To change the video output resolution for everything, you can do it via /boot/config.txt.
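For example, forcing 720p output for everything would be something like this (mode numbers assume an HDMI screen - tvservice -m CEA lists the modes your particular set supports):
hdmi_group=1
hdmi_mode=4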
I run my 1080p screen at 720p for RetroPie, with the crt-pi shader. It's a modern screen and does a better job of scaling than the RPI imho. I also run Kodi, but as Kodi can switch resolution itself, when Kodi loads it switches to 1080p and back to 720p on exit.
-
@GreenHawk84
tvservice -s
from a terminal will tell you what mode you are in.
-
@BuZz said in Updated crt-pi shader:
I run my 1080p screen at 720p for RetroPie
tvservice -s from a terminal will tell you what mode you are in
Can you guys clear up some confusion? If you set your TV's output resolution to 720p vs 1080p, does the Pi try to negotiate an output resolution equal to 720p, as that's what the TV will signal back as its highest possible resolution, even though it's capable of higher resolutions?
Then if your render resolution is lower than your TV (output resolution?), the TV up-scales (the TV decides its method of up-scaling?), or you can adjust your render resolution to something equal or lower? Would you improve performance by setting your render res low and offloading the upscale to the TV, or is this where you lose quality or uniqueness?
By default isn't the render resolution generally always lower than the output of modern TVs? So you would need to decide whether the render res (via Pi hardware) or the output res (via TV) does the scaling?
If your render resolution is greater than your output then does it clip or down-scale?
Where does the crt-pi shader "layer" in? After the render res but before the output res seems logical?
-
The resolution is set by the connecting device - if you mean if you set the video mode to 720p - that's the mode the RPI will output to the TV. The tv will upscale it if it needs to.
Render resolution is nothing to do with the TV - the RPI will upscale from render resolution to the video output resolution.
By default the render resolution will match whatever the current video mode is; they will be the same.
-
@BuZz said in Updated crt-pi shader:
The resolution is set by the connecting device - if you mean if you set the video mode to 720p - that's the mode the RPI will output to the TV. The tv will upscale it if it needs to.
Sorry Buzz if this is a completely dumb question, as I am still missing it. If the TV is set to 720p and the Pi receives a signal from the TV saying to output in this mode (this is the output res?), and they match, what is there to upscale?
Render resolution is nothing to do with the TV - the RPI will upscale from render resolution to the video output resolution.
So the render res is coming from the emulator, and the Pi is the middle layer, so to speak, that adjusts the render res to match the output res?
-
RPI looks at the EDID information the TV provides and chooses the mode the TV says is recommended. Some TVs give the wrong information (bad firmware etc). RPI will choose the resolution the TV wants in most cases (which would normally be the native resolution).
The render res is a retroarch feature. The retroarch code uses the dispmanx API to scale from render res to video output res.
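If you want to see exactly what the TV is reporting, you can dump and decode the EDID from a terminal, e.g. (the filename is just an example):
tvservice -d edid.dat
edidparser edid.dat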
-
I have a screen downstairs that has a native resolution of 1024x768. However, by default a 720p mode is used, as this is the preferred mode in the EDID info. It of course looks wrong. In this case I manually set the screen to 1024x768 @ 60Hz via /boot/config.txt and then it looks correct.
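For a screen like that, the relevant /boot/config.txt entries would be along these lines (DMT mode numbers - tvservice -m DMT lists what the screen actually supports):
hdmi_group=2
hdmi_mode=16
-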
Thanks Buzz, I think I have it straight; I just need to ponder it a little. Can I try to apply it to a real scenario with default settings? A game's "native" resolution, let's say for example, is 320x200. RetroArch would upscale it to 640x400 (the default res is 640x480, so it would fill it horizontally but vertically there would be 40 blank pixels at the top and bottom?). From there it would be further upscaled at/to the TV's res (for example 720p, which is 1280x720) by some fractional factor. I know I will need to keep reading and asking before it all comes full circle, but I don't want to completely muddy this thread.
-
@Riverstorm There is one more step that may apply--your TV's native pixels may not be 1280x720 but rather 1920x1080, so unless you see black borders when your TV is set to 720p, your TV is upscaling to its native resolution at the end.
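To put rough numbers on the whole chain (just a sketch using the defaults discussed above):
- core output: 320x200
- RetroArch render resolution: e.g. 640x480 - the image is scaled up inside that frame, with black bars wherever the aspect ratio demands
- video output mode: e.g. 1280x720 - the Pi's dispmanx scaler stretches the render frame up to this
- panel's native grid: e.g. 1920x1080 - the TV itself does the final upscale from 720p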
-
@caver01 said in Updated crt-pi shader:
@Riverstorm There is one more step that may apply--your TV's native pixels may not be 1280x720 but rather 1920x1080, so unless you see black borders when your TV is set to 720p, your TV is upscaling to its native resolution at the end.
Thanks Caver, I didn't think of that at all. A 720p setting on a native 1080p TV would leave a border all the way around unless the pixels stretch? I am guessing that if no scaling happened at all you would have a small 320x200 image in the middle of the screen. So many factors in there, including things like integer scaling, etc. I was hoping to grasp just the general steps a game goes through from start (emulator) to finish (TV), but I don't think I quite have a good understanding. Hopefully with time.