Maybe it was a different thread, but this came up before.

Basically, dev on the fastest machine available to you. You want your workflow to be as fast as possible, and whatever you can justify paying for will make that process faster. Does the GPU matter? Well, for high-resolution 3D texture baking, for example, it can hugely speed things up, but I feel like a lot of Godot devs probably don't care about this so much, and ultimately a weak GPU won't 'stop' you, just slow you down.

Target deployment is trickier, as in an ideal world you would just have the hardware on hand to test on. Ultimately this is where early marketing to get alpha builds to willing testers comes in, but generally I would just eyeball the approximate difference in performance between your dev card and the target card and apply that to your FPS targets. For example, if your dev card benchmarks at roughly twice the target card, aim for around 120 FPS in development to leave headroom for 60 FPS on the target.
This won't pick up on things like OpenGL compatibility differences, though, but there's no perfect solution really.

My personal 'need' for a high-performance GPU is not as simple as 'because hobbies', as I also use my GPU for my day job (I'm an ML/data scientist for a startup/SMB, and believe me, running my own hardware is a lot cheaper than AWS at our size). As such, I'm happy to pay fairly hefty amounts, as it's tax deductible and the gains would be on the order of 20-30 hours saved per month, but I'm also cognisant that my current setup IS still sufficiently workable, so waiting for the new generation to drop to see how it affects price/performance ratios is fine by me.

I won't be waiting till January though lol.

    Bimbam You want your workflow to be as fast as possible and whatever you can justify paying for will make that process faster.

    If you spend 100x the time thinking about what to code than you do running the development tools, as I do, a fast computer doesn't necessarily make the process significantly faster. But when I got a new computer a few years ago, which I seem to do only about every ten years, I bought more power than I really needed (an Intel i9-9900K 3.6GHz CPU with 8 cores / 16 threads, 32GB RAM, and two 500GB SSDs) so it wouldn't become obsolete too quickly.

      I assume bit fiddlers who ponder over an algorithm probably spend more time just staring at the screen with funny symbols on it than a person who crunches a lot of data and spends more time waiting for the result (42) with a miner running in the background. All those cache misses, tststs ...

      I hate to say it, but a more responsive OS may also have an influence on one's personal perception of speed ;-) OTOH, even the lowest-end PCs today have so much power ... aww, scratch that ... they don't run Windows 😛

      DaveTheCoder If you spend 100x the time thinking about what to code than you do running the development tools, as I do, a fast computer doesn't necessarily make the process significantly faster.

      It depends on what you are doing, but this is generally true. Most of the time, when I am working, I am not even typing code. I just sit there and think. Sometimes I will lie down on the bed and take a nap; I've gotten good at programming and designing games while I sleep. Then after thinking for maybe an hour, I will write a few lines of code, and they usually work. One time I wrote an entire physics engine in a DX12 compute shader; the coding took about an hour (after a full day of research). I didn't compile it until it was done, and when I did, it worked perfectly with no errors.

        DaveTheCoder If you spend 100x the time thinking about what to code than you do running the development tools, as I do, a fast computer doesn't necessarily make the process significantly faster.

        Perhaps, but the time spent waiting is still time wasted if you are focused on the problem at hand and can't bring yourself to reorient your mind to another task while the computer's busy doing your thing for you. Which is to say, faster is always better, if you can afford it.

          Megalomaniak but the time spent waiting is still time wasted if you are focused on the problem at hand

          Right. I've had times where a compile took so long that when the app finally ran, I forgot what I was testing. 😭

            I just try to do other stuff as I wait, so no time is wasted. Grab a drink, freshen up, eat a meal... play with pets if you have any... Why stare at a loading screen?

            cybereality I forgot what I was testing

            Hehe :-) And these things happen more and more often the older one gets ...

            ... or maybe we just become more aware of it ?

              Maybe our brains are so filled with knowledge, it's hard to add any more.

              cybereality

              It's shocking how well the "don't feel like it right now" strategy works.

              I just tried Godot 4.x for a more professional project. I can't use Ctrl+Z to undo things, or Tab to go to the next area... It's a bit unusual, but I feel I could get used to it. Are there no shortcuts in 4.x?

              The undo and redo hotkeys have always been buggy for me in 3.4. But it's not just Godot, and it happens across machines, so I never questioned it. Do they work normally for you?

              It's alpha software. Lots of stuff is broken.

              It's working now, mostly. That was weird. Tab seems to be the only real issue atm.

              Oh wow this thread is so active.

              Do y'all know any libraries that help with networking stuff?
              Like adding chat to a game or a lobby system, dealing with latency, or anything like that?

              Is it feasible to work on a multiplayer game in Godot, or is it better suited to single-player?

                And, yeah, this is probably THE most active thread here, from what I can see.

                  Nerdzmasterz And, yeah, this is probably THE most active thread here, from what I can see.

                  2,500 posts!!! And the OP was a hit and run.

                    CLAYOoki Do y'all know any libraries that help with networking stuff?
                    Like adding chat to a game or a lobby system, dealing with latency, or anything like that?

                    If you are targeting only HTML5 and are okay with peer-to-peer, then I have found that using PeerJS as an embedded JavaScript library, together with the Godot JavaScript communication layer, works decently. You do have to do all of the data conversion/translation from PeerJS to Godot and back though, so it is not really friendly for beginner network programming. It also has some latency, but for games that are not twitch-speed real-time, it should work okay.
                    It does take a lot of work though - I spent 3 or 4 weekends getting the initial stuff set up…

                    However, if you are making a non-HTML5 game and/or want to use the more powerful Godot networking stuff, then the built-in high-level networking code is the place to go 🙂
                    Edit: the reason I didn't go with the high-level networking is just that I didn't want to host my own server for my game.
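
                    For reference, here is a minimal sketch of that built-in high-level API as it looks in Godot 3.x, assuming a placeholder port and a toy chat RPC (the names and values are illustrative, not from any real project):

                    extends Node

                    const PORT = 9080  # placeholder, pick your own
                    const MAX_CLIENTS = 8

                    func host_game():
                    	var peer = NetworkedMultiplayerENet.new()
                    	peer.create_server(PORT, MAX_CLIENTS)
                    	get_tree().network_peer = peer

                    func join_game(ip : String):
                    	var peer = NetworkedMultiplayerENet.new()
                    	peer.create_client(ip, PORT)
                    	get_tree().network_peer = peer

                    # Any peer can call rpc("receive_chat", text); "remote" makes the
                    # function callable over the network on the other peers.
                    remote func receive_chat(text : String):
                    	print("%d says: %s" % [get_tree().get_rpc_sender_id(), text])

                    Lobbies and latency handling are still on you, but the RPC layer itself is only a few lines.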

                    Nerdzmasterz

                    Ah I was hoping for some more plug-and-play resources.
                    It seems the networking will be a bit of its own project.

                    Edit: Does anyone know of any open-source projects that handle all the annoying cases like cheating, multiple servers, etc.?

                      CLAYOoki I recall a similar conversation about this. The overall answer I got was no. I can see from experience that pirates and hackers are dealt with in various ways.

                      You could try to keep all of the information on a server, or use a password, but neither of those methods are fool-proof. The server would probably be the best option, if one can afford it.

                        In my class Box I have this:

                        func setPopMessage(message : String = "Pop!"):
                        	_popMessage = message

                        but where I initialize my Box I'm not getting any auto-complete suggestions for the input:

                        	var box : Box = createBox(0,0,0)
                        	box.setPopMessage(NO SUGGESTIONS??? what am I doin wrong?)

                        Am I missing something?

                        There is nothing to suggest. message is a String; it can be anything you want to type in. The "Pop!" is just a default value that gets assigned when you call the method without an argument.
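
                        To illustrate, using the same hypothetical createBox helper from above - the default only kicks in when the argument is omitted:

                        var box : Box = createBox(0, 0, 0)
                        box.setPopMessage()        # _popMessage becomes "Pop!" (the default)
                        box.setPopMessage("Bang!") # _popMessage becomes "Bang!"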

                          I've never heard of that, and I doubt it is useful for anything (considering I've never heard of it). JPG and PNG are already more than fine. In any case, GPUs use their own texture compression, so the file format you use (like JPG) is only for getting the image into the editor. Once your game is running on your computer or on a phone, the GPU uses its own optimized format, so the source format doesn't actually matter.

                            I didn't know about it either. https://qoiformat.org/ Why do I think of fish? :-)

                            Use whatever format suits you best. I am a fan of PNG for simple images and height-map stuff, also because one can write an image in a few lines of code with libpng.

                            But of course it depends on what you need it for. One stumbles over all kinds of formats for all kinds of purposes.

                            JPEG XL is interesting. It will (most likely) be the next industry-standard image format (replacing JPEG, PNG, and GIF), so it would make more sense to support that, rather than niche formats that will never work with anything. JPEG XL supports high-quality lossy and lossless images (with much better quality per file size), transparency, animation, and a whole ton more. Once it is standard, there will likely be no need for JPG/PNG/GIF for a long while.

                            https://chromeunboxed.com/jxl-jpeg-xl-new-image-file-type-chrome/

                            Do you know if there is a compressed format on the CPU side where one can still address single pixels?

                            Edit: wow, JXL claims 30% better lossless compression than PNG, impressive. Does that hold for 16-bit grayscale too? I'll take a look. There's an implementation here: https://github.com/libjxl/libjxl, but it isn't in the Debian repositories yet.

                            Editedit: It's also rather heavyweight, an all-rounder, compared to PNG. But then again, if you want to store a lot of data and have a chance of 30% less storage space and traffic ...

                            I haven't been able to build the source yet. It's still not 100% final, so it's in constant flux. However, the release page on GitHub has an older .deb file that does work. It at least lets you view the images; I'm trying to convert some stuff now. It's from last year, but it's good enough to test the quality and file size. Support is still not ready (as I said, it's barely final), but there is a GIMP plugin and Chrome just added support.

                              cybereality I've never heard of that, and I doubt it is useful for anything (considering I've never heard of it). JPG and PNG are already more than fine.

                              I had not heard about this format either, so I was surprised. It's claimed to be a kind of "faster PNG".

                              cybereality I couldn't get the command-line tools working, but I used this website. https://jpegxl.io/

                              Your browser does not support JPEG XL.😞

                                Tomcat Your browser does not support JPEG XL.😞

                                Yes, like I said, it's very early. I don't expect support until later this year in a stable release. You can kind of get it working with beta software, like I did, but it's not totally there. But in a few months I expect it will start rolling out. It's honestly going to be pretty big. We get better quality at much lower file size (if you look at the test I did, it's like 25% of the size at the same visual quality). Plus high-quality, high-frame-rate animation that will be much better than GIF. And transparency with lossy compression, which is not really possible right now (especially important on the web). I'd say within a year, it will be the industry standard.

                                Krita doesn't work with the new format yet.

                                Though this release adds support for the new JPEG-XL format, which supports animation, exporting and importing animations in this format gives incorrect results. A fix is pending.

                                But XnViewMP and GIMP know about it.

                                I get artifacts when saving from .jpg:

                                Megalomaniak open standards, IMO are always preferable.

                                Yes, I agree, but with something like an image format, I think wide compatibility may win out. In any case, JPEG XL is BSD-licensed and royalty-free, so I don't see any issue there.

                                The DDS format can be a handy one. Not only does it store many different GPU pixel formats, but it can also hold block-compressed data (the various GPU compressed texture formats), cube maps, volume textures, and custom mip maps.

                                I might need to do some source digging, though. Godot's docs claim it handles loading custom mip maps from DDS files, but it fails on a DDS I've been using since the Ogre3D days.
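
                                If anyone wants to poke at what Godot actually parsed, here is a quick Godot 3.x sketch (the path is hypothetical; .dds files go through Godot's own loader rather than the importer):

                                # Hypothetical path to a DDS with custom mip maps.
                                var tex = load("res://textures/legacy_ogre.dds") as Texture
                                if tex:
                                	var img = tex.get_data()
                                	print("size: ", img.get_size(), " mipmaps: ", img.get_mipmap_count())
                                else:
                                	print("DDS failed to load")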

                                The interesting thing about QOI for me is the claimed 20-50x encoding speed vs. PNG. Being able to quickly encode a lossless image for things like frame capture could be useful.
                                Plus, the fact that the entire reference implementation is a single 669-line C header (a third of which is documentation) with no dependencies means it looks damn easy to integrate into small projects or rewrite directly in other languages.