crt-pi shader users - reduce scaling artifacts with these configs in lr-mame2003, lr-fbalpha, lr-nestopia (and more to come)
-
@dankcushions Is it possible to somehow convert this for a 640x480 (480p) VGA CRT monitor?
I use a PC monitor at that resolution, but the monitor's scanlines don't make a big difference at 480p. It does not support 240p, at which point I wouldn't need any shader at all. That's why I use some sort of shader here, and I really like yours, but it often looks very bad at this resolution.
Edit: Sorry, I saw the link with the script for creating the package myself. I will download it and try it myself first.
Edit 2: Oh, but I think this will not help me with the other consoles, right? I would need a shader made specifically for 480p.
-
@dankcushions Ok, done generating this. It does not work with 480p. I get the following content for all games:
# Auto-generated crt-pi-vertical.glslp .cfg
# Game Title : dkong , Width : 224, Height : 298, Aspect : 3:4, Scale Factor : 1.61073825503
# Screen Width : 640, Screen Height : 480
# Place in /opt/retropie/configs/all/retroarch/config/MAME 2003/
# Insufficient resolution for good quality shader
video_shader_enable = "false"
-
@thelostsoul didn't we already have this conversation? :) https://retropie.org.uk/forum/topic/4046/crt-pi-shader-users-reduce-scaling-artifacts-with-these-configs-in-lr-mame2003-lr-fbalpha-lr-nestopia-and-more-to-come/277
-
@dankcushions Yes, I remember. Until then, I am not happy with the Arcade games. If I get an old 240p CRT, then I can leave this shader stuff behind me. Currently I use the shader named "scanlines" for all Arcade games and it works, but it doesn't look as good as it would with a correct crt-pi shader.
@caver01 said:
"That's really interesting. I need to do a CRT build at some point."
Which is exactly what I am searching for, isn't it? Am I the only one asking for a shader at 480p?
-
@thelostsoul said in crt-pi shader users - reduce scaling artifacts with these configs in lr-mame2003, lr-fbalpha, lr-nestopia (and more to come):
Am I the only one asking for a shader at 480p?
I think so, because with that low res, you won't have many pixels to simulate the scanlines or shadow mask effects. And for folks with a real CRT, well, no shader needed because you have the CRT we are trying to simulate.
Of course, if the CRT is a high-resolution (very fine dot-pitch) multi-sync PC monitor, your real scanlines could lack the vintage look and feel at 480p that a TV would have shown. You might as well run it at a high resolution and enable shaders like the rest of us.
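To put rough numbers on the "not many pixels" point, here is a tiny sketch (purely illustrative - the function name and the reasoning in the comments are mine, not the script's actual check) of how many display lines each original scanline gets at different output resolutions:

def output_lines_per_source_row(game_height, screen_height):
    # How many display lines each original scanline can occupy.
    return screen_height // game_height

# A typical 240-line game:
print(output_lines_per_source_row(240, 480))   # 2 -> one line of picture plus one dark line, so half the image goes dark
print(output_lines_per_source_row(240, 1080))  # 4 -> the dark line is only about a quarter of each row, much closer to a real CRT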
-
@caver01 Hmm, ok, I understand now that it does not make sense to produce such a shader. Then I need a different CRT device. Thanks for your answers, guys. :-) Never mind then.
-
Could this script be used to generate cfgs using zfast shaders?
-
@cloudlink it's an outstanding issue: https://github.com/dankcushions/crt-pi-configs/issues/11
no ETA on it.
-
@cloudlink If you're so inclined, it would be pretty straightforward to modify the script to use the zfast shaders.
The shaders are referenced in two places in the script - lines 93-96 and lines 105-108. In each instance, they look a bit like this:
if curvature:
    shader = "crt-pi-curvature-vertical.glslp"
else:
    shader = "crt-pi-vertical.glslp"
Just change the shader name to the one you want to use and then run the script to generate the .cfg files.
The main reason (I assume) it's on @dankcushions' to-do list with no ETA is that the more 'official' way to do it would be to present the user with the option - via the command line - to use either shader, which would take more coding.
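For reference, after the swap that branch might end up looking something like this - the zfast filenames below are my assumption, so check the shader directory on your install and use whatever names are actually there:

if curvature:
    shader = "zfast_crt_curve_vertical.glslp"  # assumed filename - verify locally
else:
    shader = "zfast_crt_vertical.glslp"        # assumed filename - verify locally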
-
@andrewh for me it's more that i haven't had a chance to test the shaders - they're allegedly faster so if the image quality is equivalent, i would probably just wholesale replace crt-pi with them. we could have an option, i suppose, but i think anyone running the script could probably just as well edit it to suit whatever they wanted.
i say no eta as i've been meaning to test them for about 7 months now, and still haven't :)
-
@dankcushions I have been using zfast for a while now and I will say that moire patterns and rainbows are reduced. I have always used curvature so I have not really taken advantage of the configs, but with zfast, the artifacts may be harder to notice anyway. I am curious about the results @cloudlink might be able to share.
-
@andrewh said in crt-pi shader users - reduce scaling artifacts with these configs in lr-mame2003, lr-fbalpha, lr-nestopia (and more to come):
@cloudlink If you're so inclined, it would be pretty straightforward to modify the script to use the zfast shaders.
The shaders are referenced in two places in the script - lines 93-96 and lines 105-108. In each instance, they look a bit like this:
if curvature:
    shader = "crt-pi-curvature-vertical.glslp"
else:
    shader = "crt-pi-vertical.glslp"
Just change the shader name to the one you want to use and then run the script to generate the .cfg files.
The main reason (I assume) it's on @dankcushions' to-do list with no ETA is that the more 'official' way to do it would be to present the user with the option - via the command line - to use either shader, which would take more coding.
Thanks.
I modified the python script to test it. I should be able to test it in a few hours.
Here's the modification if anyone else wants to try it:
https://pastebin.com/F1zp5qcz
-
@dankcushions said in crt-pi shader users - reduce scaling artifacts with these configs in lr-mame2003, lr-fbalpha, lr-nestopia (and more to come):
@andrewh for me it's more that i haven't had a chance to test the shaders
Ah, ok - fair enough.
I swapped them in quite some time back - several months now - and haven't noticed anything that caused concern.
That said, I'm not necessarily the most discerning, so don't take this as any sort of suggestion that you don't need to test them yourself :-)
-
I tested the script generating both standard and curved zfast shader configs for MAME 2003 and FBA. It works well, and both the standard and curved shaders look and run great. I'm very impressed with the zfast shaders.
-
Are the scanlines guaranteed to always be aligned with the game's original pixels, regardless of my scaling settings? E.g. if the game is low-res and fullscreen, and the pixels are large, are the scanlines also thicker?
-
@rsn8887 said in crt-pi shader users - reduce scaling artifacts with these configs in lr-mame2003, lr-fbalpha, lr-nestopia (and more to come):
Are the scanlines guaranteed to always be aligned with the game's original pixels, regardless of my scaling settings? E.g. if the game is low-res and fullscreen, and the pixels are large, are the scanlines also thicker?
In my experience, the scanlines do line up with the game's original pixels, but they look more like CRT phosphor dots when you run at higher resolutions. If you run on a low-res display, you cannot get around the inherent stair-stepping that is visible at the display's native pixels. At higher resolutions you have more pixels available for the shader to use. Theoretically, with an extremely high resolution, a shader could create visually perfect scanlines, but of course there are performance concerns when the resolution gets too high. The current optimum is typically to run at as high a resolution as you can before performance becomes a problem. The shaders have been coded to run at HD resolution. Yet, even at HD you get some artifacts--hence the config files, which try to align at least one dimension with an integer scale factor of your display's native resolution.
You cannot just say "higher resolution is better" because of the performance impact, but higher does get you a better-looking effect in my opinion. Do some tests and decide what you like best.
-
@rsn8887 said in crt-pi shader users - reduce scaling artifacts with these configs in lr-mame2003, lr-fbalpha, lr-nestopia (and more to come):
Are the scanlines guaranteed to always be aligned with the game's original pixels, regardless of my scaling settings? E.g. if the game is low-res and fullscreen, and the pixels are large, are the scanlines also thicker?
there are two separate resolutions at play with scanline shaders:
1. the original game's resolution. this is typically 240 pixels high. a black scanline will be inserted between every vertical pixel. this rule always applies, no matter your render resolution.
2. your render resolution (by default in retropie this is your display's resolution). on an HDTV this would be 1080 pixels high. this determines the quality of the image, but it has no effect on the thickness (other than via scaling artefacts, which my script aims to eliminate) or number of scanlines.
whilst a game with a higher original resolution (eg, tekken 3 on the psx is 480 pixels high) will have more scanlines, you have to consider that this isn't a real world situation. CRTs have scanlines when running progressive scan, low resolution games. tekken 3 would be running in an interlaced (non-progressive) mode, so would NOT have scanlines. the shader isn't able to figure this out, so still injects scanlines regardless, but they are so thin they end up looking like a sort of 'haze' on the screen.
to answer your question specifically:
Are the scanlines guaranteed to always be aligned with the game's original pixels, regardless of my scaling settings?
the shader guarantees that there will be a scanline between every vertical pixel (1) of the original game. if you upscale this to a different resolution (2), you will get scaling artefacts unless the new resolution (2) is an integer multiple of the original resolution (1). this script aims to mitigate those artefacts - i explain how in the first post.
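as a rough sketch of that idea (not the actual script code - the names and the rounding choice here are just for illustration), you pick the largest whole-number scale that fits the display and size the viewport from it:

def integer_scaled_height(game_height, screen_height):
    # largest whole-number scale that still fits on screen
    scale = screen_height // game_height
    return game_height * scale

# e.g. a 224-line game on a 1080-line display:
# 1080 // 224 = 4, so the viewport becomes 224 * 4 = 896 lines,
# and every source row maps to exactly 4 output rows - no uneven scanlines.
print(integer_scaled_height(224, 1080))  # 896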
-
Interesting.
I wonder, then, why there are so many versions. There should be only one shader that adapts itself automatically and generates the masks etc. on the fly for whatever input and output resolutions are used.
For example why is there not a check inside the vertex shader to pass a variable to the fragment shader that disables scanlines if the game resolution is larger than 240p?
Also the vertical vs horizontal thing bothers me, this should be automatically checked within the vertex shader and the result passed to the fragment shader.
The vertex shader is only executed four or six times or so per frame for a 2d game. So the slowdown of these automatic checks should be negligible.
I read somewhere that making the shader more automatic would cause slowdown but that makes no sense to me, if these things are only done in the vertex shader.
-
@rsn8887 said in crt-pi shader users - reduce scaling artifacts with these configs in lr-mame2003, lr-fbalpha, lr-nestopia (and more to come):
Interesting.
I wonder, then, why there are so many versions. There should be only one shader that adapts itself automatically and generates the masks etc. on the fly for whatever input and output resolutions are used.
what do you mean many versions? there's not.
For example why is there not a check inside the vertex shader to pass a variable to the fragment shader that disables scanlines if the game resolution is larger than 240p?
Also the vertical vs horizontal thing bothers me, this should be automatically checked within the vertex shader and the result passed to the fragment shader.
The vertex shader is only executed four or six times or so per frame for a 2d game. So the slowdown of these automatic checks should be negligible.
I read somewhere that making the shader more automatic would cause slowdown but that makes no sense to me, if these things are only done in the vertex shader.
i mean, if you know better than the shader authors, why not do it yourself? :) i think zfast shader doesn't have a vertical variant, so maybe the issue is solved there, or maybe it doesn't have a shadow mask. i haven't tested it yet...
-
@dankcushions said in crt-pi shader users - reduce scaling artifacts with these configs in lr-mame2003, lr-fbalpha, lr-nestopia (and more to come):
i think zfast shader doesn't have a vertical variant,
No, there is a vertical variant of zfast too. In fact, there are four variants: standard, vertical, and both of those again with curvature.
@rsn8887 It has been stated by davej and possibly others that, although it is possible to do some detection on the front end to avoid the variants, this alone can have an adverse effect on performance. The point being, we all want these shaders to run as lean as possible and have as little impact on performance as possible while still achieving the visual results. The variants are really the same shader repeated, only with different configuration settings embedded, so that we as users don't have to dive into the shader files and change the settings. In other words, it is simply easier to make a duplicate with adjustments and cycle through shaders than it is to edit the files themselves. You can also specify which variant to use in ROM-specific configs.
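For example, a ROM-specific override that points at a particular variant could look roughly like this (placed in the per-core config directory mentioned earlier in the thread; the shader path is the usual RetroPie location, but verify it on your own system):

video_shader_enable = "true"
video_shader = "/opt/retropie/configs/all/retroarch/shaders/crt-pi-curvature-vertical.glslp"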