720p or 1080p
-
Recently I have been on some other forums where some users claim that running emulators at 720p would be better for the Pi, since it would be less stressful on it and the difference in picture quality is minuscule. Now, I have mostly used 1080p for my ROMs and generally haven't had any FPS or heat issues, unless I activate a shader that's too much for the Raspberry Pi 3.
Should 720p be used mostly?
-
@RedBatman shaders will definitely run better at 720p if they weren't already running at 60hz at 1080p, but most shaders need 1080p or more to look good, so...
for everything else on rpi3 there's not a benefit i'm aware of. things like n64 run at 320x240 and a simple upscale operation is used to scale them to 1080p. this sort of operation is simple on an rpi3. i don't think any performance improvements would be measurable.
heat... i'm not so sure. if you don't overheat already i wouldn't worry :)
-
@dankcushions Well, I won't have to worry about overheating much, since I recently bought a new case that came with a small fan that cools my Pi while it's on. But my main worry is frames per second; I'm one of those types of people. While playing FBA I will enable the 2xsal shader to give the games a more HD feel, and I always enable the frame rate display to see if it stays at 60fps. Most of the time it works very well, with it dipping to 59.1fps every so often. But that usually happens on CPS3 games like Street Fighter 3.
-
@RedBatman if fps is a concern, have you tried just using a scanlines overlay instead of a shader? They're less CPU intensive than shaders, though of course the effect you're trying to achieve may not be easy to replicate with a simple overlay.
I personally only use overlays, and I set up game-specific configuration files for vertical games or for certain resolutions, in case the scanlines aren't exactly aligned to every pixel and it annoys me.
Just thought I'd share. But you can test by playing the same games without a shader and seeing if they effectively stay at 60 or if they sometimes drop as well. If they still drop without the shader, then you can conclude that the shader isn't the thing adding a lot of processing load to the CPU.
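If you want to go the per-game route, here's a rough sketch of a per-ROM override (as I understand it, the RetroPie convention is a .cfg file named after the ROM, sitting in the same folder; the ROM name and overlay path below are just placeholders for illustration):

    # contents of /home/pi/RetroPie/roms/fba/sfiii.zip.cfg (example file name/path)
    video_shader_enable = "false"
    input_overlay = "/opt/retropie/configs/all/retroarch/overlay/my-scanlines.cfg"
    input_overlay_opacity = "0.30"
    fps_show = "true"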
-
@pjft I tried overlays before and I didn't like them. They were okay, but the effect just made the games look not as good, in my opinion, though I only tried a few. If you can recommend a good one I might be willing to try it. But like I said, shaders for the most part maintain around 60fps, with only CPS3 arcade games having a minuscule dip. CPS2 games like X-Men vs. Street Fighter or NEO-GEO games usually stay at 60fps.
-
I always struggle with that question on the Pi. I have no idea how the Pi shares its resources between the CPU and GPU - whether it's a rob-Peter-to-pay-Paul scenario on that single little chip.
I know you can render and then upscale an image (which a lot of developers do with modern games to target a specific frame rate, as it saves some resources), but a 1080p image contains 2.25 times as many pixels as a 720p image. So it would seem it should be notably harder for a game to render a 1080p image than a 720p one.
Resolution is mostly under the control of the GPU. The CPU tells the GPU what to draw and at what resolution. In that scenario it would seem the GPU has to render and signal more than twice the pixels: 2,073,600 vs 921,600.
Running at a 60Hz refresh, the GPU needs to deliver 60 frames (pictures) per second for things to work smoothly. Slower and you might see stuttering, whereas faster may give you tearing, as the GPU is delivering frames faster than the monitor can display them and you see pieces of multiple frames at the same time - hence v-sync.
The CPU has 1/60 of a second to prepare each frame, and at 1080p the GPU is lighting up about 2 million pixels per frame, or roughly 125 million every second, versus less than half that at 720p.
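The raw numbers, for reference (plain shell arithmetic, nothing Pi-specific):

    echo $((1920*1080))                    # 2073600 pixels per 1080p frame
    echo $((1280*720))                     # 921600 pixels per 720p frame
    echo $((2073600*60))                   # 124416000 pixels per second at 60Hz (~125 million)
    echo "scale=2; 2073600/921600" | bc    # 2.25 - the 1080p/720p pixel ratio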
When it comes to upping the frame rate, though, it falls back on the CPU.
I am probably oversimplifying it, but I think frame rate and resolution are interconnected; it just depends on where the bottleneck is, whether that's the CPU or the GPU.
It's really a gray area where a programmer would come in handy, as there are so many tricks and shortcuts to render images quicker and more efficiently. They probably understand the relationship between frame rate and resolution well enough to give a more definitive answer.
Here's another thread we were having a conversation on refresh rates:
https://retropie.org.uk/forum/topic/2515/does-refresh-rate-help-hurt
-
Yeah, I mean, objectively 720p will be less taxing on the CPU than 1080p. As you said, that's pure maths. The main question is whether any of those quantities are effectively sizeable enough to pose a challenge to the CPU/GPU or not.
As @dankcushions has mentioned on quite a few occasions in the forums, when it comes to tracking down bottlenecks on the Pi, the way to find the bottleneck is to run a monitoring tool - say, top - while the emulator is in use and see what's peaking. If CPU usage is peaking and performance suffers, then the CPU calculations are the bottleneck. If it's not peaking but performance is slow, then the GPU is likely the bottleneck.
I'm under the impression that most emulators don't use GPU hardware acceleration, but rather do most of the complex work on the CPU. As such, if the CPU is peaking in usage and slowing things down, removing the shaders can alleviate the burden slightly, as shaders will sometimes involve per-pixel transformations and computations. An overlay is just a static image being rendered, which makes it less demanding.
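For reference, a minimal way to check that over SSH while a game is running (standard Linux / Raspberry Pi tools, nothing RetroPie-specific):

    top                          # watch the emulator process; a core pinned near 100% suggests a CPU bottleneck
    vcgencmd measure_temp        # SoC temperature
    vcgencmd measure_clock arm   # current ARM clock - a drop below the configured speed indicates throttling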
@RedBatman if you're happy with the performance with the shaders, then I wouldn't encourage you to go for overlays. I'd certainly suggest trying out the CPS3 games without a shader and seeing whether the FPS still dips without it. If it doesn't, I'd suggest trying out an overlay and seeing if the performance holds. If they do dip without a shader as well, then you're probably better off sticking with the shader anyway, as the overlay won't solve it for you.
-
the upscale operations are using the GPU, so 1080p vs 720p uses the GPU more, but a simple upscale is very easy for the pi3's GPU to do especially given that it's typically not doing anything else.
shaders are where the GPU can start to suffer at higher resolutions.
when it gets slightly greyer is when you have a shader (eg, crt-pi) that seems to work fine at 60hz in one game, but slows down another, despite neither game using the GPU (eg, genesis games). this is likely due to these games having differing CPU usage, with the slower game hammering the bus just enough to cause GPU throughput to suffer. i've seen this with lr-nestopia vs lr-fceumm - put crt-pi on at 1080p, and watch the clouds jerk when mario runs...
but yeah, i'm still talking about shaders, which are running a somewhat complex shader program for EVERY pixel on screen. i'm not aware of a scenario where a simple upscale would be impacted by CPU usage, although i think it can happen on RP1/0, which has a far weaker CPU, GPU, memory speed, etc.
-
I agree - one game runs great and the next is terrible. It might be beneficial to have a peek at the MAME drivers to see how they are achieving a particular result. There have been hundreds of contributors to MAME over the years, and I would assume as many programming styles.
Here at work we have everything from the sloppy to the divine. I've worked with some of them for over 20 years now and you start to see specific styles. Some of that sloppy code is in our payroll system and it works week to week just fine, printing checks and doing direct deposit, while our "golden boy" is an absolute whiz kid in technique, style and just thinking outside the box. I remember years ago when John Romero & John Carmack had a godly status for the work they did with Doom.
Even though these games originally used only small amounts of RAM, measured in the KBs, that's not necessarily the case with emulation. I would imagine emulation uses vastly more RAM to achieve the same results as that old platform and arcade hardware.
Not to mention, how did the original MAME programmer decide to go about the task - is it CPU- or GPU-centric in rendering and buffering? Maybe they are only doing a partial screen refresh, or clearing and populating RAM (basically swapping data in and out); because RAM back then was so small, unlike today, they had to be pretty creative. Are they clearing variables and releasing RAM? Is it good, clean, tight code? I'm a firm believer that every programmer has signature code where you can almost identify who wrote it. I remember back in the day you had to write tight code because you only had 16, 32 or 64K of RAM, like in the Commodore VIC, PET & 64.
There was this RPG I used to play on the C64 (Phantasie) that would put a monster in your path that would start eating your inventory items until you killed it. It was devastating to have unique items destroyed. I later read an article that it came down to a limitation of RAM on the C64: to keep the items you were carrying within a specific, limited amount of RAM, the developers introduced the Devourer.
When 4K monitors started going mainstream the video card industry wasn't quite prepared. 4K is 4 times the resolution of 1080p and 9 times that of 720p. You need pretty beefy cards to run games at 4K, and without them games were crippled to unusable frame rates. If cards were rendering a true 1080p frame you would think it would be a simple operation of upscaling to 4x the pixels to match the resolution, but it wasn't that easy.
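Quick check on those ratios:

    echo $((3840*2160))                     # 8294400 pixels per 4K frame
    echo "scale=2; 8294400/2073600" | bc    # 4.00 - 4x a 1080p frame
    echo "scale=2; 8294400/921600" | bc     # 9.00 - 9x a 720p frame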
4K is only a resolution upgrade (more pixels--that's it) but it requires monster cards and, even more importantly, huge amounts of onboard VRAM. When buying a card I think it's important to know what monitor you'll be using, to size the card appropriately without extraneous overkill. Buying a 4K video card to drive a 1080p monitor seems silly unless money is of no concern.
I mostly think in terms of x86 architecture with a separate CPU & GPU, as I am not quite sure how the ARM architecture works, but it's entirely realistic to have a bottlenecked CPU and still achieve a higher resolution (that's down to the GPU). Simply looking at one component for an answer isn't always enough; it might require looking at other components - GPU, RAM, etc. - to get the full picture. I don't think PC components are inversely proportional when it comes to troubleshooting.
Back to the OP's question. It would seem 720p would use fewer resources, but if running at 1080p falls within the specs of the Pi 3 then maybe all this is irrelevant, or at least less relevant than on the Pi 1 where RAM was a lot more restricted. Unused resources are wasted. ;) But we know for sure some games struggle to reach playable performance. I noticed several driving games like Cruis'n USA, etc. that just don't run well even though the Pi far exceeds the original hardware specs. Is it poor coding or some other combination of factors? It seems you almost need to do some testing on different monitors to decide which way to go to fit your needs.
As newer games that require more RAM are emulated on the Pi maybe that 720p monitor will give you a slight advantage in performance and/or allow a filter, overlay or shader to run that normally wouldn't on a 1080p monitor.
At work we push our VMs as close to 100% CPU time shares as possible, but it's almost always RAM that is the bottleneck in virtualization, not CPU cycles. Nowadays those blades come with a dozen or more DIMM slots alone.
Here are a few screenshots a guy did on the Pi. The top two are 1080p and the bottom two 720p. Original res is 384x224. Look at the middle two. They are 1080 vs 720 with smoothing. Notice the clean squares in the health bar. They are pretty close, or at least I don't think I would mind the minor degradation in visuals. Then again, I am running a 1080p TV, since those are pretty much mainstream and available in stores, versus a 720p set that usually needs to be ordered.
-
720p is better than 1080p just trust me
-
@N.A.R.E.K.96 said in 720p or 1080p:
720p is better than 1080p just trust me
I definitely waffle back and forth and honestly it's all theory and sharing thoughts on the forum so.....any chance you could explain that a bit further or at least link us to an explanation? I don't mind reading white papers...ok if I don't have to read one then all the better. ;)
-
@N.A.R.E.K.96 said in 720p or 1080p:
720p is better than 1080p just trust me
240p for me, thank you :P
-
I think Dank made a good point for 1080: you have more room for sharper scaling and less distortion when using something like DaveJ's shader (godly), which is essential for modern-day LCD displays.
By the way Dank. I've been thoroughly enjoying using the vertical shader configs you created, amazing stuff, brilliant work!
-
@Riverstorm said in 720p or 1080p:
By the way Dank. I've been thoroughly enjoying using the vertical shader configs you created, amazing stuff, brilliant work!
thanks! ironically i recently set up retropie for my brother and after seeing some games without the shader i suddenly decided i prefer the crisp pixelly image without the shader.. for this month, anyway :)
-
Some notes on memory usage.
The examples shown are for a 4:3 game displayed on a 1920x1080 screen. The game screen is upscaled to 1440x1080 to keep the aspect ratio the same. Game screens vary in resolution so I've used a rough average.
a) GPU upscales image: GPU reads game image, upscales it and sends it to the display. It can upscale (with linear or nearest filtering) to any supported resolution without extra memory accesses. Relative memory accesses = 1.
b) With overlay: GPU reads game image, combines it with overlay (which it also has to read) and sends it to the display. Needs memory from a) plus size of overlay (overlay is 30 times as big as image from game). Relative memory accesses = 31.
c) Using shader: GPU upscales game image using shader and writes it out to memory. GPU reads upscaled image and sends it to display. Needs memory from a) plus 2 * upscaled image size (upscaled image is 23 times as big as image from game). Relative memory accesses = 47.
d) Shader with overlay: As c) but also has to read the overlay whilst sending image to display. Relative memory accesses = 78.
For a 720 screen, the overlay and shader upscaled images are about half the size.
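To put rough numbers on those ratios (the ~320x216 "average" 4:3 game image here is just an assumption for illustration):

    GAME=$((320*216))                           # 69120 pixels in the game image
    echo "scale=1; $((1920*1080))/$GAME" | bc   # 30.0 - full-screen 1080p overlay vs game image
    echo "scale=1; $((1440*1080))/$GAME" | bc   # 22.5 - 4:3 image upscaled to 1080p height (~23)
    echo "scale=1; $((960*720))/$GAME" | bc     # 10.0 - the same upscale on a 720p screen (about half)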
All Pis have relatively slow memory and the CPU and GPU can end up fighting over access to it. My recommendation is to overclock your memory as fast as it will go whilst still remaining stable.
If you're using shaders, overclocking the GPU can help relieve pressure on memory - it doesn't use less, but the access pattern changes a bit, making it conflict less with the CPU. Note: crt-pi in its default configuration was designed to work on Pi1s and Pi2s with the default GPU clock of 250MHz; by overclocking I mean relative to that. By default Pi Zeros are clocked at 300MHz and Pi3s at 400MHz. If you get overheating problems with Pi3s running shaders you could try underclocking the GPU down from 400MHz - but it might cause problems for shaders other than crt-pi.
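For anyone wanting to try it, memory and GPU clocks are set in /boot/config.txt; a minimal sketch with example values only (they happen to match the 1300/500/500 cpu/gpu/mem figures mentioned further down this thread - stability varies from board to board, so test your own):

    # /boot/config.txt - example overclock values only, not a recommendation for every board
    arm_freq=1300
    gpu_freq=500
    sdram_freq=500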
-
@davej said in 720p or 1080p:
For a 720 screen, the overlay and shader upscaled images are about half the size.
Thanks Dave, that's some really helpful information. I didn't quite understand the relative memory accesses. Using a shader requires 23 times more memory(?) but 47 memory accesses - as in, while rendering it accesses memory 47 times to apply the shader? Is there a tangible amount used per rendered frame, like height x width x bit depth?
If it can be answered easily, can the Pi comfortably run most games at 1080p with a shader? Is there a gain to be had at 720p? It seems like the memory accesses would be equivalent between both but the memory usage would be half? Also, an image twice the size would require more work manipulating at a pixel level? Basically you're doing twice the work in the same clock cycle?
It doesn't look like there's much difference in visual quality for the most part?
-
@Riverstorm said in 720p or 1080p:
If it can be answered easily, can the Pi comfortably run most games at 1080p with a shader? Is there a gain to be had at 720p?
Yeah, I had no problems with games at 1080p with the crt-pi shader on my pi3 with cpu/gpu/mem at 1300/500/500. If a game ran slow, disabling the crt-pi shader and going to native res didn't help in the games I tried. I got a huge speed boost by just changing the emulator instead. I have to use four different arcade emulators because games will slow down on one and be faster on another. I used my pi2 to sort that out.
Bit off-topic, but I did have to disable the crt-pi shader on my pi2 on a 720p tv because it looked really bad. It looked like someone ran over the screen with crt-pi tire marks or something... I don't think the shader was made for 720p, or maybe it's a scaling issue, since the tv's hdmi ports only handle 720p/1080i but the vga port will do 1366x768. Forcing the output to either 720p or 1080i did not help.
@Riverstorm said in 720p or 1080p:
Buying a 4K video card to drive a 1080p monitor seems silly unless money is of no concern.
Not silly at all. The power is needed for downsampling the image and/or 144Hz gaming.
-
@davej But if you have a fan it should be okay, right? Not only that, but I tried overclocking once and it said that my Pi wasn't allowed to overclock.
-
@Riverstorm said in 720p or 1080p:
Thanks Dave, that's some really helpful information. I didn't quite understand the relative memory accesses. Using a shader requires 23 times more memory(?) but 47 memory accesses - as in, while rendering it accesses memory 47 times to apply the shader? Is there a tangible amount used per rendered frame, like height x width x bit depth?
The relative access figures were just a way of indicating how much extra work the Pi needs to do when using shaders and/or overlays. If a read or a write of a pixel is one memory access, reading or writing the whole of a large image will take more of them than for a small image.
If it can be answered easily, can the Pi comfortably run most games at 1080p with a shader?
Not really. It depends on the game and the shader. Most shaders are too complex for the Pi - I wrote crt-pi because there wasn't a decent CRT shader that ran well at 1080p. Even then, it pushes the Pi so hard that it only just manages it. Tweaking the crt-pi settings can slow it down. Running a complex game can slow it down. Running a more accurate (and so slower) emulator can slow it down.
Is there a gain to be had at 720p? It seems like the memory accesses would be equivalent between both but the memory usage would be half? Also, an image twice the size would require more work manipulating at a pixel level? Basically you're doing twice the work in the same clock cycle?
720p screens use half the memory that 1080p ones do and so are less susceptible to the slowdowns mentioned above. You might also be able to use more complex shaders.
It doesn't look like there's much difference in visual quality for the most part?
There is with crt-pi. If you are not scaling by integer amounts, crt-pi needs about 4 screen pixels to display one game pixel and still look reasonable. A 720p screen doesn't provide enough pixels for that.
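As a rough illustration (assuming a 224-line arcade image, which is just an example resolution):

    echo "scale=2; 1080/224" | bc    # 4.82 screen lines per game line - enough for crt-pi
    echo "scale=2; 720/224" | bc     # 3.21 - short of the roughly 4 that crt-pi wants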
-
@RedBatman said in 720p or 1080p:
@davej But if you have a fan it should be okay, right? Not only that, but I tried overclocking once and it said that my Pi wasn't allowed to overclock.
If it's running cool enough anyway, whether with a fan or not, you shouldn't need to underclock.