lr-duckstation - Experimental new PlayStation 1 emulator
-
@stoo said in lr-duckstation - Experimental new PlayStation 1 emulator:
Vulkan
AFAIK, Vulkan is not officially available in RetroPie yet, at least on RPi platforms.
[ERROR] Requesting Vulkan context, but RetroArch is not compiled against Vulkan. Cannot use HW context.
that's right - it's neither included in Raspberry Pi OS yet, nor does retropie compile retroarch for vulkan on pi platforms
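for reference, a quick way to check whether your RetroArch build includes Vulkan support (assuming retroarch is on your PATH) is its --features listing:
retroarch --features | grep -i vulkan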
-
@dankcushions Hi there.
See here: From ALL folder: https://pastebin.com/VAHGAUQk
From PSX folder: https://pastebin.com/yJqykEM0
Cheers,
Marnix
-
@stoo Hi there. Thanks a lot for your efforts, much appreciated. Please see here: https://pastebin.com/mNe4uETa
-
@mrpacman17 said in lr-duckstation - Experimental new PlayStation 1 emulator:
@dankcushions Hi there.
See here: From ALL folder: https://pastebin.com/VAHGAUQk
this is a non-default config, unfortunately. i would reset it by updating retroarch, which will generate a retroarch.cfg.rp-dist default file in the same folder. delete/rename your old retroarch.cfg and then rename retroarch.cfg.rp-dist to retroarch.cfg.
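for reference, a rough sketch of that from the command line, assuming the stock RetroPie config location:
cd /opt/retropie/configs/all
mv retroarch.cfg retroarch.cfg.old
mv retroarch.cfg.rp-dist retroarch.cfg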
that said, the line i was curious about is
video_driver = "gl"
try instead with
video_driver = "glcore"
From PSX folder: https://pastebin.com/yJqykEM0
looks fine
-
@dankcushions Thank you, will do as advised. Many thanks again for your hard work and time.
-
@dankcushions I did as requested. The file is at defaults now and I changed the video driver to glcore. But the issue persists; see the runcommand log: https://pastebin.com/YcmcpvLM
-
@mrpacman17
glcore is not supported on a Pi4. Use gl as the video driver. Also, RetroPie doesn't support running it in a desktop session on a Pi; you should run it outside the desktop environment.
-
@mitu I am running Xubuntu on an HP laptop (ProBook 6550B). I did as advised, but the same issue is still at play.
-
@mrpacman17 said in lr-duckstation - Experimental new PlayStation 1 emulator:
ProBook 6550B
I see, even this way glcore would not work. According to HP's spec page, the GPU included doesn't support OpenGL 3.3, which is required for glcore.
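If you want to double-check what the driver actually reports, glxinfo (from the mesa-utils package) prints the supported OpenGL version:
glxinfo | grep "OpenGL version"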
-
@mitu Okay, the video driver is set back to gl. Still no luck though. I am doing something wrong but I don't know what. Sorry for my basic level of understanding here.
-
@mrpacman17 Try to force the Software renderer in Duckstation by modifying retroarch-core-options.cfg and setting duckstation_GPU.Renderer = "Software"
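A minimal sketch of that edit, assuming the default RetroPie location for the core options file: open
nano /opt/retropie/configs/all/retroarch-core-options.cfg
and make sure it contains the line
duckstation_GPU.Renderer = "Software"
then save and relaunch the game so the option is picked up.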
-
@mitu we tried that :)
i believe even in software mode duckstation needs a valid gl context. methinks this hardware is getting nothing. perhaps a linux driver issue.
nah, it should fall back to software... https://github.com/stenzek/duckstation/pull/908 - weird.
@mrpacman17 i'm afraid i'm out of ideas. i don't have this hardware. if you know how to use gdb you could get a backtrace of the crash. you could also try the libretro and duckstation discord for advice.
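for example, a rough outline of getting a backtrace, assuming the usual RetroPie install paths for retroarch and the duckstation core (adjust to your setup):
gdb --args /opt/retropie/emulators/retroarch/bin/retroarch -L /opt/retropie/libretrocores/lr-duckstation/duckstation_libretro.so /path/to/game.cue
then type run, and after the crash type bt to print the backtrace.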
-
@dankcushions Hi all. Well, I am not a Linux specialist whatsoever, so I'll leave it at this. Thank you very much for your time and effort, I really appreciate it. I will run my PSX games with another emulator. Still, I appreciate all your hard work. Thanks to you I can play retro games, which is totally awesome!
-
@dankcushions and @mitu: It WORKS!!! Just as a last effort I added the line again as before. And now it WORKS!!! Totally happy. I guess all the previous steps combined with the software rendering did the trick. Woohooo!!! Thank you guys!
-
As well as the emulator sometimes failing to initialize OpenGL after changing resolution settings (even at low x values), I've noticed that after changing settings a few times (scaling, AA, PGXP, etc.) the emulator will become choppy and slow even when set to 1x native with all enhancements disabled. Restarting the emulator is the only way to get performance back. Not sure what's going on here.
I'll see if I can replicate it and if there are reproducible steps to cause it.
-
@stoo While I haven't been able to replicate the weird performance problem I was getting before, I see why OpenGL is failing intermittently.
Seemingly at random when switching AA values, a shader fails to compile. It doesn't happen every time, but maybe 50% of the time?
[libretro INFO] [LibretroHostInterface] Hardware context reset, type = 5
[INFO] [Environ]: GET_SAVE_DIRECTORY.
[libretro INFO] [LibretroHostInterface] Shader cache directory: '/home/pi/RetroPie/roms/psx//duckstation_cache/'
[libretro INFO] [GPUBackend] GPU thread stopped.
[libretro INFO] [GPU_HW_OpenGL] Max texture size: 4096x4096
[libretro INFO] [GPU_HW_OpenGL] Per-sample shading: not supported
[libretro INFO] [GPU_HW_OpenGL] Max multisamples: 4
[libretro INFO] [GPU_HW_OpenGL] Uniform buffer offset alignment: 256
[libretro WARN] [SetCapabilities] GL_EXT/OES_copy_image missing, this may affect performance.
[libretro INFO] [GPU_HW_OpenGL] Max fragment shader storage blocks: 16
[libretro INFO] [GPU_HW_OpenGL] Max shader storage buffer size: 134217728
[libretro INFO] [GPU_HW_OpenGL] Using shader storage buffers for VRAM writes.
[libretro INFO] [GPUBackend] GPU thread started.
[libretro INFO] [GPU_HW] Resolution Scale: 1 (1024x512), maximum 4
[libretro INFO] [GPU_HW] Multisampling: 4x
[libretro INFO] [GPU_HW] Dithering: Disabled
[libretro INFO] [GPU_HW] Texture Filtering: Nearest-Neighbor
[libretro INFO] [GPU_HW] Dual-source blending: Not supported
[libretro INFO] [GPU_HW] Using UV limits: NO
[libretro INFO] [GPU_HW] Depth buffer: NO
[libretro INFO] [GPU_HW] Downsampling: Disabled
[libretro INFO] [GPU_HW] Using software renderer for readbacks: YES
[INFO] [Environ]: GET_SAVE_DIRECTORY.
[libretro INFO] [LibretroHostInterface] Shader cache directory: '/home/pi/RetroPie/roms/psx//duckstation_cache/'
[libretro INFO] [GL::ShaderCache] 0 program binary formats supported by driver
[libretro WARN] [Open] Your GL driver does not support program binaries. Hopefully it has a built-in cache, otherwise startup will be slow due to compiling shaders.
[libretro INFO] [HostInterface] Loading: Compiling Shaders 81 of 0-156
[libretro ERROR] [CompileShader] Shader failed to compile:
0:105(68): error: `gl_SampleID' undeclared
0:105(63): error: cannot construct `uint' from a non-numeric data type
0:105(59): error: cannot construct `int' from a non-numeric data type
0:105(17): error: no matching function for call to `texelFetch(sampler2DMS, ivec2, error)'; candidates are:
0:105(17): error: vec4 texelFetch(sampler1D, int, int)
0:105(17): error: ivec4 texelFetch(isampler1D, int, int)
0:105(17): error: uvec4 texelFetch(usampler1D, int, int)
0:105(17): error: vec4 texelFetch(sampler2D, ivec2, int)
0:105(17): error: ivec4 texelFetch(isampler2D, ivec2, int)
0:105(17): error: uvec4 texelFetch(usampler2D, ivec2, int)
0:105(17): error: vec4 texelFetch(sampler3D, ivec3, int)
0:105(17): error: ivec4 texelFetch(isampler3D, ivec3, int)
0:105(17): error: uvec4 texelFetch(usampler3D, ivec3, int)
0:105(17): error: vec4 texelFetch(sampler2DRect, ivec2)
0:105(17): error: ivec4 texelFetch(isampler2DRect, ivec2)
0:105(17): error: uvec4 texelFetch(usampler2DRect, ivec2)
0:105(17): error: vec4 texelFetch(sampler1DArray, ivec2, int)
0:105(17): error: ivec4 texelFetch(isampler1DArray, ivec2, int)
0:105(17): error: uvec4 texelFetch(usampler1DArray, ivec2, int)
0:105(17): error: vec4 texelFetch(sampler2DArray, ivec3, int)
0:105(17): error: ivec4 texelFetch(isampler2DArray, ivec3, int)
0:105(17): error: uvec4 texelFetch(usampler2DArray, ivec3, int)
0:105(17): error: vec4 texelFetch(sampler2DMS, ivec2, int)
0:105(17): error: ivec4 texelFetch(isampler2DMS, ivec2, int)
0:105(17): error: uvec4 texelFetch(usampler2DMS, ivec2, int)
0:105(17): error: type mismatch
[libretro ERROR] [Initialize] Failed to compile programs
[libretro ERROR] [CreateGPU] Failed to initialize OpenGL renderer, falling back to software renderer
[INFO] [Environ]: SET_MESSAGE: Failed to initialize OpenGL renderer, falling back to software renderer.
[libretro INFO] [GPUBackend] GPU thread stopped.
[libretro INFO] [GPUBackend] GPU thread started.
When this happens, it says it's falling back to Software rendering, but a black screen appears instead. The game is still running, audio is playing but no display. You can switch internal resolutions up or down to force it to initialize the OpenGL renderer again and it usually comes back.
No idea if this is a bug in DuckStation, RetroArch or the GL driver. I'm using glcore.
-
@stoo said in lr-duckstation - Experimental new PlayStation 1 emulator:
No idea if this is a bug in duckstation, Retroarch or a GL driver bug. I'm using glcore.
why glcore? aren't you on pi4? should be gl i think
i would also remove any overclocks as they could have an effect on gl stability.
-
Switched to "gl" driver. Problem persists.
Dropped clocks to default. Problem persists.
Simply start switching IR and AA options (e.g. enable 2xIR and 2xMSAA, switch to 3xIR, disable AA, switch to 1xIR) and it'll fail very quickly. Fallback to software renderer never works either.
I don't think I can get it to occur without changing AA settings. Simply changing IR alone doesn't seem to trigger it. It's almost certainly a software bug.
Doesn't matter if I'm using PGXP or not, or using software readbacks or not, or disabling interlacing or not.
-
@stoo yup, i can recreate it also. can i ask that you please log it here?
for others, please don't log issues on this github without confirming them here first - as with most things in emulation, this is a volunteer project and we don't want to spam the (lone) developer with work without doing some due diligence first :)
-
@dankcushions Yep, that's why I waited until someone else confirmed it :)
Issue added to duckstation GH.