Tomcat I am of the opinion that there is a difference in requirements between users/players and developers/creators. For example, it is recommended to develop a game on a computer a tier above the recommended spec for that game's maximum quality settings. Well, that's how I was taught…

I would argue for the opposite: you define the minimum spec for the widest audience you could possibly capture and develop against that minimum spec. You should still test on higher-end systems, including monitors, especially now that we have ultrawides and super ultrawides and whatnot.

    All of this does bring up a question, tho. Can you change the res on a Godot game so the user won't have to go through what I did?

    The 2D and Keep stretch settings are great, but they might be limiting. Something like a free aspect ratio would be nice.

    Bimbam Most users are using 1080p or less, so accommodating that should be the highest priority

    But I don't argue with this statement. Yes, when developing a game on a large monitor, we can't forget that the game will sometimes be played at lower resolutions. But it is possible to emulate a small monitor on a large one, while the opposite is… difficult.

    Megalomaniak I would argue for the opposite: you define the minimum spec for the widest audience you could possibly capture and develop against that minimum spec.

    This point I don't really understand. Optimization usually happens at the end of the work. It is possible to estimate the desired level of minimum requirements, but only approximately. And it is necessary to test on all available configurations.

      Tomcat This point I don't really understand. Optimization usually happens at the end of the work. It is possible to estimate the desired level of minimum requirements, but only approximately. And it is necessary to test on all available configurations.

      Yes, I said develop against it, not necessarily on it. It's just that the minimum spec you've set yourself is the one you test the most.

        Tomcat But it is possible to emulate a small monitor on a large one, while the opposite is… difficult.

        Actually, you can go both ways. I wrote a script that allows you to choose the resolution while in game (the idea was to use it in an options menu), and I can support any resolution, high or low. For example, I used this to test 4K performance on my 1440p monitor. I wouldn't say it was easy; it took me a day or two to write the script, but that's not long in terms of how long it takes to make a game (and the code is generic, so it could be used in any Godot project). Maybe I should release the source code; it is pretty handy.
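
        The core of such a script is pretty small. Here is a minimal sketch of the idea against the Godot 3.x API (not the exact script described above; the design size and resolution list are just placeholders):

        ```gdscript
        # Minimal sketch of an in-game resolution switcher (Godot 3.x API).
        # The design size and resolution list are illustrative placeholders.
        extends Control

        const DESIGN_SIZE = Vector2(1920, 1080)  # resolution the game is authored at

        const RESOLUTIONS = [
            Vector2(1280, 720),
            Vector2(1920, 1080),
            Vector2(2560, 1440),
            Vector2(3840, 2160),  # larger than the monitor, handy for performance testing
        ]

        func apply_resolution(index: int) -> void:
            var res: Vector2 = RESOLUTIONS[index]
            # Resize the OS window to the requested resolution.
            OS.window_size = res
            # Re-apply the stretch settings so 2D content keeps scaling from the design size.
            get_tree().set_screen_stretch(SceneTree.STRETCH_MODE_2D,
                    SceneTree.STRETCH_ASPECT_KEEP, DESIGN_SIZE)
        ```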

        Megalomaniak Yes, I said develop against it, not necessarily on it. It's just that the minimum spec you've set yourself is the one you test the most.

        My computer is pretty good, but I have a $100 Intel mini-PC, as well as various laptops, that I use for testing. I would not want to develop on that $100 computer (I tried to, because I was going to make a video about it, and it was so painful). But it's important to test on more than your own machine: different specs, GPU brands, OSes, etc.

        I would imagine res is sort of like sprites? You can shrink them, but it looks bad if you stretch them.

        I do get the point of developing on an overpowered machine, as you want as much of your workflow as possible to be fast.

        I kinda wish there was some way to emulate/nerf performance realistically to match a given hardware target, but when I eventually do release a game I would just alter my target FPS according to https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1080-Ti-vs-Nvidia-GTX-1070/3918vs3609

        This is purely me eyeballing it, but the GTX 1070 looks to be roughly the median graphics card for Steam users at the moment.
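
        To make that concrete, the arithmetic is just a ratio: if the dev GPU is roughly N times faster than the target GPU, the game has to hold roughly N times the target frame rate on the dev machine. A rough sketch in GDScript (Godot 3.x), where the 1.4x figure for the 1080 Ti over the 1070 is only an eyeballed assumption:

        ```gdscript
        # Back-of-envelope check: does the frame rate on the dev GPU suggest the
        # target GPU would still hit the target frame rate?
        # The 1.4 speedup (GTX 1080 Ti vs GTX 1070) is an eyeballed figure, not measured.
        extends Node

        const TARGET_FPS = 60.0
        const DEV_GPU_SPEEDUP = 1.4

        func required_dev_fps() -> float:
            # ~84 FPS needed on the dev machine to expect ~60 FPS on the target GPU.
            return TARGET_FPS * DEV_GPU_SPEEDUP

        func _process(_delta: float) -> void:
            if Engine.get_frames_per_second() < required_dev_fps():
                print("Probably below %d FPS on the target GPU" % int(TARGET_FPS))
        ```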

          Bimbam I do get the point of developing on an overpowered machine, as you want as much of your workflow as possible to be fast.

          Yeah, that's exactly what I was going to say! And the big monitor helps speed things up.

          Bimbam I kinda wish there was some way to emulate/nerf performance realistically to match a given hardware target

          I still have the ancient Core i7-950/GTX780 for that.

          Bimbam I kinda wish there was some way to emulate/nerf performance realistically to match a given hardware target

          Me too. I have looked into it, and there is not an easy way to do this. Technically it could work, but it would have to be some sort of hardware switch: for example, disabling a portion of the shader cores, slowing the processor, reducing the VRAM allocation, etc. On Windows, you can sort of do this with apps like MSI Afterburner. At least you can reduce the clock rate to an arbitrary number and things like that. But you would also need to limit the VRAM, disable shader cores, and a few other things for an apples-to-apples comparison. Linux is a little more limited in terms of overclocking tools (there are some command-line apps, but they aren't as robust as the ones for Windows).

            cybereality that is one thing that is neat about the Steam Deck and SteamOS: you can adjust the system wattage and the GPU clock speed easily. It isn't quite like the Windows tools yet, but it is the closest I've seen so far from an ease-of-use perspective.

            On Linux, maybe the tuned daemon could be used to nerf settings. It also has a graphical front-end, I think.

            System drivers such as acpid also have tools to scale the CPU clock, I think (I've never used them). Care should be taken when adjusting system clock speeds from a root account, as it might interfere with the system's timestamps.

            And I believe both Nvidia and AMD expose some sort of power limiting for the GPU. Stealthy miners depend on them…

            I wrote a tool that could nerf networking. It injected itself into the Winsock sendto function and added random packet corruption, packet loss, reordering, etc., even when testing on a LAN or a single machine.
            It was for testing my students' network game projects (they had to implement their own UDP-based network protocols).
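
            The same idea can also be approximated in plain GDScript without any API hooking: wrap the send call and randomly drop, corrupt, or hold back packets. Just a sketch with made-up rates, not the original tool:

            ```gdscript
            # Sketch of an "unreliable send" wrapper for testing net code (illustrative rates).
            extends Node

            var peer = PacketPeerUDP.new()  # assumed to already have a destination set
            var _held_packet = null         # one buffered packet, used to reorder sends

            const DROP_RATE = 0.05
            const CORRUPT_RATE = 0.02
            const REORDER_RATE = 0.05

            func unreliable_send(packet: PoolByteArray) -> void:
                if randf() < DROP_RATE:
                    return  # simulate packet loss
                if randf() < CORRUPT_RATE:
                    packet[randi() % packet.size()] = randi() % 256  # corrupt one byte
                if randf() < REORDER_RATE and _held_packet == null:
                    _held_packet = packet  # hold this packet and send it after the next one
                    return
                peer.put_packet(packet)
                if _held_packet != null:
                    peer.put_packet(_held_packet)
                    _held_packet = null
            ```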

            My issue is not the resolution, but the physical size of the screen. I've got a 14" Walmart laptop, and my games are mostly aimed at Android, so that should be fine. I can run Firefox, vim, xterm, etc. with large text. Godot's 3.x IDE fits reasonably well -- I have to use mouseover to see some of the debug values -- but running the fonts at 24/26 causes the 4.x menus to expand to the point that they force the right sidebar completely off the screen. Godot's interface is built in Godot itself, and it suffers in some areas because of that, though I'd say it's an overall advantage.

            I imagine they'll fix it before I want to move up. But you can see why a plain text editor still appeals to me, although (ironically) the 2D/3D parts of 4.x are the ones most affected.

            cybereality

            Thanks! I think I want to actually turn this into a minigame, unlike previous projects that are largely just static scenarios.

            Something like Pokemon Snap, where you can explore a small area around the oasis to try and get money shots of animals without startling them. It will mean needing to find/make animated HQ Saharan animals though, but then any new 3D scenario I make just becomes a new 'level'.

            Someone mentioned being interested in game development, so I suggested Godot to them for its simplicity. Honestly, I hope more beginner-friendly tutorials start popping up soon.