Updated crt-pi shader
-
@davej said in Updated crt-pi shader:
@Riverstorm The Atari 2600 has a very low horizontal resolution. By default, crt-pi just uses linear filtering horizontally - which, in combination with the low resolution, is what is causing it to look blurry. You can make it sharper by uncommenting the SHARPER line so that it reads:
#define SHARPER
Perfect! That was the ticket! Is it best to duplicate the files to accommodate the Atari shader? I duplicated crt-pi.glslp and crt-pi.glsl, adding -atari to the names, then modified the shader0 line to point to the newly named glsl while uncommenting
#define SHARPER
or how are people doing per-emulator tweaks? Should I do anything with the mask or leave it as is?
-
@Riverstorm said in Updated crt-pi shader:
Perfect! That was the ticket! Is it best to duplicate the files to accommodate the Atari shader? I duplicated crt-pi.glslp and crt-pi.glsl, adding -atari to the names, then modified the shader0 line to point to the newly named glsl while uncommenting
#define SHARPER
or how are people doing per-emulator tweaks?
That sounds like a reasonable approach. I tend to use just one shader unless I'm working on them, so I'm not the best person to ask.
Should I do anything with the mask or leave it as is?
It's pretty much an aesthetic decision as to which mask to use, so go with whichever you think looks best.
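For reference, a duplicated preset along these lines might look something like this (the file names follow the post above; shaders and shader0 are standard RetroArch .glslp keys, and any other lines from the original crt-pi.glslp would be copied across unchanged), with SHARPER uncommented in the copied crt-pi-atari.glsl as described:
# crt-pi-atari.glslp
shaders = 1
shader0 = crt-pi-atari.glsl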
-
@davej said in Updated crt-pi shader:
That sounds like a reasonable approach. I tend to use just one shader unless I'm working on them, so I'm not the best person to ask.
Thanks Dave, I didn't want to mess up how MAME and other emulators looked, so I duplicated it with the tweak for Atari, but out of curiosity, would leaving
#define SHARPER
on affect how the other emulators look, or would they still be the same?
-
@Riverstorm said in Updated crt-pi shader:
Thanks Dave, I didn't want to mess up how MAME and other emulators looked, so I duplicated it with the tweak for Atari, but out of curiosity, would leaving
#define SHARPER
on affect how the other emulators look, or would they still be the same?
Images will look a bit more pixelated with SHARPER enabled. The reasons for having linear filtering as the default are a) it's needed to get 1080P @ 60Hz on a Pi1/Pi2 (I don't know about a Pi3) and b) I think it looks better for games that try to create smooth transitions with limited palettes.
Ultimately, it's down to personal preference.
The image below shows some example differences. Default on the left, sharper on the right.
-
@davej said in Updated crt-pi shader:
Ultimately, it's down to personal preference.
Thanks Dave for the side by side; I wouldn't be able to see the difference without it. I prefer the default (the exception being Atari ;). I agree on the limited-palette smooth transitions. I think consoles on old CRT TVs were able to take advantage of that, or I should say developers optimized their graphics to get some nice color blending as a side effect of the low resolution. Also, old CRTs didn't really have "true" blacks like TVs do now, more of a dark muddy gray, and they definitely had more of a blur. :)
-
Hi Dave, I have a Pi 3 I picked up over the weekend and have been playing about with RetroPie and reading up on a LOT of info.... However, there seems to be an issue with the crt-pi shaders running at 1080p on my TV. The frame rate seems to be a near-constant 52.9-53fps in Altered Beast for the Megadrive (I haven't changed the default core) and any other Megadrive/Genesis game for that matter. I then tried some Super NES games and the rate was nearly 60fps.
At first I thought it may be down to CPU use, so I went ahead and tested different overclock settings by editing the config.txt file; this made no change whatsoever whether I set the ARM to 1200, 1400, 800 or stock. I've even toggled vsync on and off, but that made no difference.
I then tried reducing the render resolution in the configuration editor to less than 1080p (e.g. 800x600) and I then get a full 60fps. Has anyone else had this issue?
-
@aikon82
It sounds like the emulator and shader are fighting for memory bandwidth. Although the shader can run at 1080P@60Hz, it can only do so if other parts of the system don't do too much memory access at the same time. On the Pi2 (I don't have a Pi3 to test with), some SNES games are OK and some just push it over the limit and drop occasional frames. As the Pi3 has similar memory to the Pi2, I suspect the same thing is happening for you.
Things you can do to help:
- Overclock the memory as fast as it will go.
- Use a more efficient emulator if there is one. (Some need less memory accesses than others.)
- Change the audio resampler driver to 'nearest'.
You can also try making the following changes to the shader:
- Don't enable CURVATURE or SHARPER.
- Try enabling FAKE_GAMMA.
- Disable GAMMA altogether.
- Set MASK_TYPE to 0.
- Disable MULTISAMPLE.
- Disable SCANLINES.
Although the shader changes don't actually reduce the amount of memory it needs to access, they do mean the GPU can better tolerate the CPU accessing memory when the GPU wants to.
Ironically, Pi1s and Pi Zeros have less of a problem with this issue because you can run their memory faster. They just have a problem running the emulators fast enough. ;)
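A rough sketch of those suggestions, assuming the usual RetroPie file locations (the #define names are the ones listed above; the exact frequencies depend on what your board tolerates):
# /boot/config.txt - push the memory clock as high as it runs reliably
sdram_freq=500
# /opt/retropie/configs/all/retroarch.cfg - cheaper audio resampling
audio_resampler = "nearest"
// crt-pi.glsl - the toggles mentioned above, tried one at a time
//#define CURVATURE      // leave disabled
//#define SHARPER        // leave disabled
#define FAKE_GAMMA       // try enabling this...
//#define GAMMA          // ...or disable gamma altogether
#define MASK_TYPE 0      // no mask
//#define MULTISAMPLE    // disabled
//#define SCANLINES      // disabled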
-
You could check if you have any custom aspect ratio set with "aspect_ratio_index" and make sure that "video_scale_integer" is set to "true". If I set any of these differently I also get the lag; if not, a constant 60fps.
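In retroarch.cfg those settings would look something like this (the path assumes the standard RetroPie layout):
# /opt/retropie/configs/all/retroarch.cfg
video_scale_integer = "true"
# aspect_ratio_index = ...   (leave at the default rather than a custom value)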
-
Thanks davej/zonitz for the info, I'll have a look when I'm back home later tonight. I don't think I've touched the aspect ratio at all, but the video scale integer is set to false if I remember correctly.
In raspi-config there's a memory split option; can you please advise if there are any settings I should change there?
-
Just to report back, I tried setting "video_scale_integer" and got 60fps or just under, but I imagine that's due to there being less screen to process (black borders top and bottom). By the way, I am using the following overclock settings:
gpu_mem=256
arm_freq=1400
over_voltage=6
sdram_freq=500
core_freq=500
gpu_freq=400
I did however notice a speed increase when using lr-genesis-plus-gx instead of lr-picodrive. Thanks guys for the info!
-
@aikon82 said in Updated crt-pi shader:
I did however notice a speed increase when using lr-genesis-plus-gx instead of lr-picodrive. Thanks guys for the info!
Hello, today I noticed the performance issue with lr-picodrive and the crt-pi shader. It was super apparent playing Sonic. Then I started watching other Sega games closely and it was the same across the platform. I switched to lr-genesis-plus-gx and everything was smooth again. I don't know if this shader causes performance issues anywhere else, such as arcade games. I would have to experiment more.
Pi Model: 3 Model B
RetroPie Version Used: 4.0 RC1
Built From: SD Image
USB Devices connected: MS wired keyboard
Controller used: XinMo
-
@GreenHawk84 said in Updated crt-pi shader:
I don't know if this shader causes performance issues anywhere else, such as arcade games. I would have to experiment more.
If it is affecting arcade games, I don't really notice, or at least I am more excited about how it looks than how it plays. I have it running on over 100 arcade titles across lr-fba-next and lr-mame2003 and it plays nicely. I even have curvature enabled (Pi3, 1280x1024, usually fullscreen scaling).
-
@caver01, I run RetroPie on a 1920x1080 screen and I haven't played with the resolution on anything. Is there a difference enabling higher resolutions vs. whatever RetroPie renders at by default? Do you recommend anything for a 1920x1080 screen? And if so, where do I go in RetroPie to start messing with resolutions?
-
@GreenHawk84 A while back, the default emulator resolution was changed in RetroPie such that the cores now render at the resolution you are running on your Pi. There was a lot of discussion on the topic related to how this would affect performance, but in the long run, I think it was a good decision. You can always drop it down using runcommand. I recommend experimenting with that a bit so you know what effect it has on your setup.
For example, I run everything I can at 1280x1024, but it can be interesting to see what happens if I specify 640x480 using runcommand. This may improve performance, as many games might run in that resolution (or lower) anyway, but it forces my LCD to upscale the image to fill the screen. This is fine for some folks and ideal for anyone that is building portables around small screens, but it effectively hands the scaling process over to your display.
Those of us who run HD (or near-HD) displays and want to reproduce the CRT look will be unhappy with LCD scaling. It's better to run at the native resolution of the display so you can get the most out of shaders. Shaders can leverage the extra pixels to simulate RGB triads and shadow masks. That does come with a potential performance compromise, but if the magnification impact is minimal, and if the shaders can be built in a way that combines the least impact with the best effect, running at a higher resolution makes a lot of sense. @davej is the real expert here, as he built the crt-pi shader to perform well at 1080P, but as you can see from his description above, memory access constraints can affect performance.
Everyone has their own opinion about it, but I feel that at my resolution (1280x1024) the crt-pi shaders are amazing on the Pi3, and running this way is why I choose libretro cores whenever possible. Playability of the games is obviously important, but a big part of what puts the "retro" in RetroPie for me is the nostalgia of seeing a realistic look and feel, which is why I even enable curvature on my setup. So, that means high-res rendering with CRT shaders enabled whenever possible.
The holy grail for me will be the ability to render vector games at a display's native resolution with gorgeous glow/bloom CRT effects, especially on monochrome (no scanlines) games.
-
@caver01 said in Updated crt-pi shader:
ideal for anyone that is building portables around small screens, but it effectively hands the scaling process over to your display.
crt-pi shaders are amazing on the Pi3, and running this way is why I choose libretro cores whenever possible.
Could not agree more!
-
Just to clarify a couple of things:
- Changing the screen mode - scaling done by screen
- Changing the render resolution (retroarch only) - scaling done by RPI hardware (no performance loss)
By default the render resolution is the same as the video output resolution. You can lower this for all retroarch emulators via the configuration editor, or per system. Via the runcommand launch menu you can further change this on a per-emulator or per-rom basis.
Video output resolution can be changed via runcommand on a per-emulator or per-rom basis.
To change the video output resolution for everything, you can do it via /boot/config.txt.
I run my 1080p screen at 720p for RetroPie, with the crt-pi shader. It's a modern screen and does a better job of scaling than the RPI imho. I also run Kodi, but as Kodi can switch resolution itself, when Kodi loads it switches to 1080p and back to 720p on exit.
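As a sketch, forcing 720p output for everything in /boot/config.txt looks something like this (hdmi_group=1 selects the CEA/TV mode table and hdmi_mode=4 is 1280x720 @ 60Hz; run tvservice -m CEA to see which modes your screen actually reports):
# /boot/config.txt
hdmi_group=1
hdmi_mode=4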
-
@GreenHawk84
tvservice -s
from a terminal will tell you what mode you are in.
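For illustration, the output looks something like this (the state flags and mode shown here are just an example; yours will differ):
state 0x12000a [HDMI CEA (16) RGB lim 16:9], 1920x1080 @ 60.00Hz, progressive
-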
@BuZz said in Updated crt-pi shader:
I run my 1080p screen at 720p for RetroPie
tvservice -s from a terminal will tell you what mode you are in
Can you guys clear up some confusion? If you set your TV's output resolution to 720p vs 1080p, does RetroPie try to negotiate an output resolution equal to 720p, as that's what the TV will signal back as its highest possible resolution even though it's capable of higher resolutions?
Then if your render resolution is lower than your TV's (output resolution?), the TV up-scales (the TV decides its method of up-scaling?), or you can adjust your render resolution to something equal or lower? Would you improve performance by setting your render res low and offloading the upscale to the TV, or is this where you lose quality or uniqueness?
By default, isn't the render resolution generally always lower than the output of modern TVs? So you would need to decide whether to scale via the render res (Pi hardware) or the output res (TV)?
If your render resolution is greater than your output, does it clip or down-scale?
Where does the crt-pi shader "layer" in? After the render res but before the output res seems logical?
-
The resolution is set by the connecting device. If you mean setting the video mode to 720p, that's the mode the RPI will output to the TV. The TV will upscale it if it needs to.
Render resolution has nothing to do with the TV; the RPI will upscale from the render resolution to the video output resolution.
By default the render resolution will match whatever the current video mode is; they will be the same.