I do get the point of developing on an overpowered machine, as you want as much of your workflow to be as fast as possible.

I kinda wish there was some way to emulate/nerf performance realistically to match a given hardware target, but when I eventually release a game I'd just alter my target FPS according to https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1080-Ti-vs-Nvidia-GTX-1070/3918vs3609

This is purely me eyeballing it, but the GTX 1070 looks to be roughly the median graphics card for Steam users atm.

    Bimbam I do get the point of developing on an overpowered machine, as you want as much of your workflow to be as fast as possible.

    Yeah, that's exactly what I was going to say! And the big monitor helps speed things up.

    Bimbam I kinda wish there was some way to emulate/nerf performance realistically to match a given hardware target

    I still have the ancient Core i7-950/GTX780 for that.

    Bimbam I kinda wish there was some way to emulate/nerf performance realistically to match a given hardware target

    Me too. I have looked into it, and there is no easy way to do this. Technically it could work, but it would have to be some sort of hardware switch. For example, disabling a portion of the shader cores, slowing the processor, reducing the VRAM allocation, etc. On Windows, you can sort of do this with apps like MSI Afterburner. At least you can reduce the clock rate to an arbitrary number and things like that. But you would also need to limit the VRAM, disable shader cores, and a few other things for an apples-to-apples comparison. Linux is a little more limited in this sense in terms of overclocking (though there are some command-line apps, they aren't as robust as the tools for Windows).
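For what it's worth, the clock-limiting part can be scripted on Linux. A minimal sketch, assuming an Nvidia card with `nvidia-smi` and the `cpupower` utility installed; the helper functions below only build the command lines (the names and target numbers are my own illustrative choices, not calibrated to any real GPU/CPU target), so the hardware-touching step stays explicit:

```python
# Sketch: build the commands one might run (as root) to nerf a machine
# toward a slower hardware target. This caps clocks only; it does NOT
# limit VRAM or disable shader cores, so it's not a full emulation.

def gpu_clock_lock_cmd(min_mhz: int, max_mhz: int) -> list[str]:
    """Pin the GPU core clock into a [min, max] MHz window."""
    return ["nvidia-smi", "--lock-gpu-clocks", f"{min_mhz},{max_mhz}"]

def gpu_clock_reset_cmd() -> list[str]:
    """Undo the lock and return to default clock management."""
    return ["nvidia-smi", "--reset-gpu-clocks"]

def cpu_freq_cap_cmd(max_ghz: float) -> list[str]:
    """Cap the maximum CPU frequency via cpupower."""
    return ["cpupower", "frequency-set", "-u", f"{max_ghz}GHz"]

if __name__ == "__main__":
    # Example target: a much slower GPU/CPU than the dev machine.
    for cmd in (gpu_clock_lock_cmd(300, 1100), cpu_freq_cap_cmd(2.0)):
        print(" ".join(cmd))  # swap print for subprocess.run(cmd) to apply
```

Printing first and applying by hand keeps it easy to audit before running anything as root.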

      cybereality that is one thing that is neat with the Steam Deck and Steam OS - you can adjust the system wattage and the GPU clock speed easily. It isn’t quite like the Windows tools yet, but it is the closest I’ve seen so far from an easy usability perspective.

      On Linux, maybe the tuned daemon could be used to nerf settings. It also has a graphical front-end, I think.

      System drivers such as acpid also have tools to scale the CPU clock, I think (never used them). Care should be taken when adjusting system clock speeds from a root account, as it might interfere with the system's timestamps.

      And I believe both Nvidia and AMD expose some sort of power limiting on the GPU. Stealthy miners depend on it ...

      I wrote a tool that could nerf networking. It injected itself into the Winsock sendto function and added random packet corruption, packet loss, reordering, etc., even when testing on a LAN or a single machine.
      It was for testing my students' network game projects (they had to implement their own UDP-based network protocols).
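The same idea can be sketched in user space without hooking Winsock: route every outgoing datagram through a mangling function that randomly drops, corrupts, or duplicates it. A minimal sketch (the function name and probabilities are mine, not from the original tool):

```python
import random

def mangle(packet: bytes, rng: random.Random,
           p_drop: float = 0.05, p_dup: float = 0.05,
           p_corrupt: float = 0.05) -> list[bytes]:
    """Return the datagrams to actually send for one outgoing packet:
    possibly none (loss), a bit-flipped copy (corruption), or two (duplication)."""
    if rng.random() < p_drop:
        return []                                    # simulated packet loss
    if rng.random() < p_corrupt and packet:
        i = rng.randrange(len(packet))               # flip one byte
        packet = packet[:i] + bytes([packet[i] ^ 0xFF]) + packet[i + 1:]
    if rng.random() < p_dup:
        return [packet, packet]                      # simulated duplication
    return [packet]
```

A real harness would wrap sendto on a UDP socket and pass each datagram through this; reordering falls out naturally if you queue the results and flush them after small random delays. Taking the RNG as a parameter makes test runs reproducible.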

      My issue is not the resolution, but the physical size of the screen. I've got a 14" Walmart laptop, and my games are mostly aimed at Android, so that should be fine. I can run Firefox, vim, xterm, etc. with large text. Godot's 3.x IDE fits reasonably well -- I have to use mouseover to see some of the debug values -- but running the fonts at 24/26 causes 4.x menus to expand to the point that they force the right sidebar completely off the screen. Godot's interface is built in Godot itself, and it suffers in some areas because of it, though I'd say it's an overall advantage.

      I imagine they'll fix it before I want to move up. But you can see why a plain text-editor still appeals to me, although (ironically) the 2D/3D parts of 4.x are the ones most affected.

      cybereality

      Thanks, I think I want to actually turn this into a minigame, unlike previous projects that were largely just static scenarios.

      Something like Pokemon Snap, where you can explore a small area around the oasis to try and get money shots of animals without startling them. It will mean needing to find/make animated HQ Saharan animals, but then any new 3D scenario I make just becomes a new 'level'.

      Someone mentioned being interested in game development, so I suggested Godot to them for its simplicity. Honestly I hope more beginner-friendly tuts start popping up soon.

      Yeah, I always recommend Godot. Like if you don't know what to choose, then you should be using Godot.

        cybereality I think it's the best general purpose engine around, especially for beginners. Low hardware requirements. Easy language. Very good editor.

        I have to say the adjustments to the forum site have been excellent. Feels as good as the old one, maybe a little better. The wider space between posts is a little so-so, but other than that it fits the bill.

        Honestly, I have been researching 3D engines and testing stuff for about 20 years. I have evaluated literally every 3D engine that has come out in that time (that was available to download) and most of them fell short. Though there were some good ones here and there, they never got enough attention and disappeared. So when I found Godot I was so happy. Finally a decent interface and an easy to use engine, that still looked pretty good, and was getting popular. This is rare.

        As for 3D, many complain about it. I haven't done anything with 3D yet, although I plan to. I also never tried making shaders; it sounds difficult.

          Bimbam Early WIP for next biome/scene (still in Blender atm):

          It would be a good demonstration of Godot's capabilities. 👍

          Nerdzmasterz I also never tried making shaders, it sounds difficult.

          The Book of Shaders.

          Nerdzmasterz
          I've never done 2D, but reading about the tilesets the other day I had the idea of a train driver running their train down a track and switching switches to choose the non-dead-end track. Speed increases; at some point you can't see whether a track is a dead end or not, so you have to rely on signals. Then, with increasing speed, tracks start to run parallel, with head-on traffic. Things may appear on the track: fallen trees, maybe a derailed derelict of another train, and so on, until the terminal station appears -> game won. Running into things or over a dead end -> game lost. I'd call it "Loco's Breath" :-)

          I have a little 3D experience with a naive render framework of my own, so the render pipeline is not a conceptual problem for me. But I don't know yet how Godot shaders fit in. They may have a serious performance impact, for instance if every shader triggers an internal pipeline switch and a draw call. But I really don't know yet. I'll get to that chapter later.

          From what I see there isn't much reason to actually complain. It is so far one of only two full-blown engines (the other being Unity) that run effortlessly on Linux, and the only one that ships its script editor, with help lookup, along with it. Godot's easy integration of lower-level languages is also cool, in case performance reaches its limits. Sure, it is not an engine with a billion-bucks budget ...

          One thing I'd say may ease the confusion that comes with 3D, if there is some and school was long ago :-), is a little recap of linear algebra: vectors and matrices.

          @Bimbam
          The scenery is nice!


          btw.: Is it pronounced Godot like in "En attendant Godot" or Godot like in cough "Go, dot!"?
          Questions upon questions ...

          Edit nevermind, found this: https://godotengine.org/qa/175/what-is-the-proper-way-to-say-godot
          So it is French for me :-)