Updated crt-pi shader
-
Just to report back, I tried setting "video_scale_integer" and got 60fps or just under. But I imagine that's due to less screen to process (black borders top and bottom). By the way, I am using the following overclock settings:
gpu_mem=256
arm_freq=1400
over_voltage=6
sdram_freq=500
core_freq=500
gpu_freq=400
I did however notice a speed increase when using lr-genesis-plus-gx instead of lr-picodrive. Thanks guys for the info!
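(If anyone wants to confirm an overclock like this is actually taking effect, or check for thermal throttling while a game runs, the firmware's vcgencmd tool can report the live ARM clock and temperature; these commands should be available on standard Raspbian/RetroPie images:
vcgencmd measure_clock arm
vcgencmd measure_temp
The readings will of course depend on your board and cooling.)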
-
@aikon82 said in Updated crt-pi shader:
I did however notice a speed increase when using lr-genesis-plus-gx instead of lr-picodrive. Thanks guys for the info!
Hello, today I noticed the performance issue with lr-picodrive and the crt-pi shader. It was super apparent playing Sonic. Then I started watching other Sega games closely and it was across the platform. I switched to lr-genesis-plus-gx and everything was smooth again. I don't know if this shader causes performance issues anywhere else, such as arcade games. I would have to experiment more.
Pi Model: 3 Model B
RetroPie Version Used: 4.0 RC1
Built From: SD Image
USB Devices connected: MS wired keyboard
Controller used: XinMo
-
@GreenHawk84 said in Updated crt-pi shader:
I don't know if this shader causes performance issues anywhere else, such as arcade games. I would have to experiment more.
If it is affecting arcade games, I don't really notice it; or at least I am more excited about how it looks than how it plays. I have it running on over 100 arcade titles across lr-fba-next and lr-mame2003 and it plays nicely. I even have curvature enabled: Pi3, 1280x1024, usually fullscreen scaling.
-
@caver01, I run RetroPie on a 1920x1080 screen, and I haven't played with resolution on anything. Is there a difference enabling high resolutions vs. whatever RetroPie renders at by default? Do you recommend anything for a 1920x1080 screen? And if so, where do I go in RetroPie to start messing with resolutions?
-
@GreenHawk84 A while back, the default emulator resolution was changed in RetroPie such that the cores now render at the resolution you are running on your Pi. There was a lot of discussion on the topic related to how this would affect performance, but in the long run, I think it was a good decision. You can always drop it down using runcommand. I recommend experimenting with that a bit so you know what effect it has on your setup.
For example, I run everything I can at 1280x1024, but it can be interesting to see what happens if I specify 640x480 using runcommand. This may improve performance, as many games might run in that resolution (or lower) anyway, but it forces my LCD to upscale the image to fill the screen. This is fine for some folks and ideal for anyone that is building portables around small screens, but it effectively hands the scaling process over to your display.
Those of us who run HD (or near HD) displays and want to reproduce the CRT look will be unhappy with LCD scaling. It's better to run at the native resolution of the display so you can get the most out of shaders. Shaders can leverage the extra pixels to simulate RGB triads and shadow masks. That does come with a potential performance compromise, but if the magnification impact is minimal, and if the shaders can be built in a way that combines the least impact with the best effect, running at higher resolution makes a lot of sense. @davej is the real expert here, as he built the crt-pi shader to perform well at 1080p, but as you can see from his description above, memory access constraints can affect performance.
Everyone has their own opinion about it, but I feel that at my resolution (1280x1024) the crt-pi shaders are amazing on the Pi3, and running this way is why I choose libretro cores whenever possible. Playability of the games is obviously important, but a big part of what puts the "retro" in RetroPie for me is the nostalgia of seeing a realistic look and feel, which is why I even enable curvature on my setup. So, that means high-res rendering with CRT shaders enabled whenever possible.
The holy grail for me will be the ability to render vector games at a display's native resolution with gorgeous glow/bloom CRT effects--especially on monochrome (no scanlines) games.
-
@caver01 said in Updated crt-pi shader:
ideal for anyone that is building portables around small screens, but it effectively hands the scaling process over to your display.
crt-pi shaders are amazing on the Pi3, and running this way is why I choose libretro cores whenever possible.
Could not agree more!
-
Just to clarify a couple of things:
- Changing the screen mode - scaling done by screen
- Changing the render resolution (retroarch only) - scaling done by RPI hardware (no performance loss)
By default the render resolution is the same as the video output resolution. You can lower this for all retroarch emulators via the configuration editor, or per system. Via the runcommand launch menu you can further change this on a per emulator or per rom basis.
Video output resolution can be changed via runcommand on a per emulator or per rom basis.
To change video output resolution for everything, you can do it via /boot/config.txt.
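For example, forcing a 720p output mode for everything would look something like this in /boot/config.txt (hdmi_group=1 selects the CEA/TV mode table, and CEA mode 4 is 1280x720 @ 60Hz; check the Raspberry Pi video mode documentation against your own display before relying on these values):
hdmi_group=1
hdmi_mode=4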
I run my 1080p screen at 720p for RetroPie, with the crt-pi shader. It's a modern screen and does a better job of scaling than the RPI imho. I also run Kodi, but as Kodi can switch resolution itself, when Kodi loads it switches to 1080p and back to 720p on exit.
-
@GreenHawk84
tvservice -s
from a terminal will tell you what mode you are in.
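For reference, the output is a single line that looks something like this (the exact state flags and mode will vary with your display and settings):
state 0x12000a [HDMI CEA (16) RGB lim 16:9], 1920x1080 @ 60.00Hz, progressive
-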
@BuZz said in Updated crt-pi shader:
I run my 1080p screen at 720p for RetroPie
tvservice -s from a terminal will tell you what mode you are in
Can you guys clear up some confusion? If you set your TV's output resolution to 720p vs 1080p, does RetroPie try to negotiate an output resolution equal to 720p, as that's what the TV will signal back as its highest possible resolution, even though it's capable of higher resolutions?
Then if your render resolution is lower than your TV (output resolution?), the TV upscales (the TV decides its method of upscaling?), or you can adjust your render resolution to something equal or lower? Would you improve performance by setting your render res low and offloading the upscale to the TV, or is this where you lose quality or uniqueness?
By default, isn't the render resolution generally always lower than the output of modern TVs? So you would need to decide whether to scale via render res (Pi hardware) or output res (TV)?
If your render resolution is greater than your output then does it clip or down-scale?
Where does the crt-pi shader "layer" in? After the render res but before the output res seems logical?
-
The resolution is set by the connecting device - if you mean if you set the video mode to 720p - that's the mode the RPI will output to the TV. The tv will upscale it if it needs to.
Render resolution is nothing to do with the TV - the RPI will upscale from render resolution to the video output resolution.
By default the render resolution will match whatever the current video mode is; they will be the same.
-
@BuZz said in Updated crt-pi shader:
The resolution is set by the connecting device - if you mean if you set the video mode to 720p - that's the mode the RPI will output to the TV. The tv will upscale it if it needs to.
Sorry Buzz if this is a completely dumb question, as I am still missing it. If the TV is set to 720p and the Pi receives a signal from the TV saying to output in this mode (this is the output res?), and they match, what is there to upscale?
Render resolution is nothing to do with the TV - the RPI will upscale from render resolution to the video output resolution.
So render res is coming from the emulator, and the Pi is the middle layer, so to speak, that adjusts the render res to match the output res?
-
The RPI looks at the EDID information the TV provides and chooses the mode the TV says is recommended. Some TVs give the wrong information (bad firmware etc). The RPI will choose the resolution the TV wants in most cases (which would normally be the native resolution).
The render res is a retroarch feature; the retroarch code uses the dispmanx API to scale from render res to video output res.
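If you want to see exactly what the TV is advertising, the firmware tools that ship with Raspbian should let you dump and decode the EDID block, e.g.:
tvservice -d edid.dat
edidparser edid.dat
The parser output lists the modes the display claims to support and which one it prefers.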
-
I have a screen downstairs that has a native resolution of 1024x768. However by default a 720p mode is used, as this is the preferred mode in the EDID info. It of course looks wrong. In this case I manually via
/boot/config.txt
set the screen to 1024x768 @ 60Hz and then it looks correct.
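For a screen like that, the /boot/config.txt entries would be along these lines (hdmi_group=2 selects the DMT/monitor mode table, and DMT mode 16 is 1024x768 @ 60Hz):
hdmi_group=2
hdmi_mode=16
-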
Thanks Buzz, I think I have it straight; I just need to ponder on it a little. Can I try to apply it to a real scenario with default settings? A game's "native" resolution, let's say for example, is 320x200. Retroarch would upscale it to 640x400 (the default res is 640x480, so it would fill it horizontally, but vertically there would be 40 blank pixels at the top and bottom?). From there it would be further upscaled to the TV's res (for example 720p, which is 1280x720) by some fractional factor. I know I will need to keep reading and asking before it all comes full circle, but I don't want to completely muddy this thread.
-
@Riverstorm There is one more step that may apply--your TV's native pixels may not be 1280x720 but rather 1920x1080, so unless you see black borders when your TV is set to 720p, your TV is upscaling to its native resolution at the end.
-
@caver01 said in Updated crt-pi shader:
@Riverstorm There is one more step that may apply--your TV's native pixels may not be 1280x720 but rather 1920x1080, so unless you see black borders when your TV is set to 720p, your TV is upscaling to its native resolution at the end.
Thanks Caver, I didn't think of that at all. A 720p setting on a native 1080p TV would leave a border all the way around unless the pixels stretch? I am guessing if no scaling happened you would have a small 320x200 rectangle in the middle of the screen. There are so many factors in there, including things like integer scaling, etc. I was hoping to grasp just the general steps a game goes through from start (emulator) to finish (TV), but I don't think I quite have a good understanding. Hopefully with time.
-
@Riverstorm Actually, by your descriptions I think you do have a good understanding. I don't expect anyone who runs their TV in 720 mode has black borders. That was just to make a point that the TV mode is really just another level of scaling. As Buzz said earlier, he actually likes the way his TV scales from 720.
What helps me is to work backwards from the native pixels and decide if I like each step. Assuming I can correctly account for what is doing the scaling, it helps me recognize opportunities. For example, because I know the shader can handle it, I want to give retroarch a resolution as close as I can to native pixels so that these can be used by the shader to create better CRT effects. So, I run at my LCD's full resolution, letting the emulator fill this and giving the shader the most pixels for creating smooth scanlines and curvature effects. Buzz, on the other hand, sounds like he drops his down to 720. He's still getting good effects at that resolution, but he likes how his display upscales beyond that. In fact, anyone with UltraHD is in the same boat, as they are not likely running their Pi at 3840x2160, but something lower and letting their TV upscale.
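For anyone wanting to reproduce this kind of setup by hand rather than through the Configuration Editor or the RetroArch menu, a minimal sketch of the relevant retroarch.cfg lines would be the following, assuming the shader sits at the usual RetroPie path (adjust the path, and pick crt-pi.glslp instead if you do not want curvature):
video_shader_enable = "true"
video_shader = "/opt/retropie/configs/all/retroarch/shaders/crt-pi-curvature.glslp"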
-
@caver01 said in Updated crt-pi shader:
Buzz, on the other hand, sounds like he drops his down to 720
I appreciate you explaining it. I still feel like I am missing pieces. When you wrote the line above, where would he be dropping it down to 720p? You change the screen mode reported/sent to the Pi, and it thinks the display is 720p, which makes Retroarch render at 720p, but the TV itself will upscale to the native res of 1080p?
I want to give retroarch a resolution as close as I can to native pixels so that these can be used by the shader to create better CRT effects
I prefer this, but on the other hand, if you allow the TV to upscale, in theory it seems it should be a huge performance boost if the resolution is low enough, as the Pi doesn't need to do any additional "resolution rendering" work and the TV is taking the load and doing the upscaling? But when I think like that it contradicts what Buzz wrote, which means I am not understanding it.
Changing the render resolution (retroarch only) - scaling done by RPI hardware (no performance loss)
If I am too far off base you can let me know, and I am OK enjoying RetroPie without completely understanding the logic. :)
-
I appreciate you explaining it. I still feel like I am missing pieces. When you wrote the line above, where would he be dropping it down to 720p? You change the screen mode reported/sent to the Pi, and it thinks the display is 720p, which makes Retroarch render at 720p, but the TV itself will upscale to the native res of 1080p?
You don't change the screenmode on the TV. You change it on the RPI. In my case, the TV still says to the RPI - "I prefer 1080p". I then configure the RPI via
/boot/config.txt
to switch to 720p, and the TV will then deal with that signal and upscale it, etc.