Bimbam I'm betting Quadro loses.

Quadro was always bogus and a cash grab to exploit corporate clients. Back in the day, there was a soft-mod you could do that would turn like a $200 gamer card into a $4000 Quadro, and performance was the same. It's the same reason many cloud services have a free tier for individuals but charge you like $500/month if you work at a corporation. The product is the same, but they charge what they think people will pay. Or when the price says "Please Inquire" and they look up the market cap of your company and divide by 10 to tell you the cost. It's all just a game.

    Bimbam Quadro cards were just nerfed GTX cards with expensive firmware

    Is it possible to turn an RTX into a Quadro?

    Bimbam So my point stands.

    The point of view that it's better to work on a small monitor? Sorry, I don't share it. I suppose the text-overflow bug (which is a bug) will get fixed, but the suffering won't stop there.

    cybereality Back in the day, there was a soft-mod you could do that would turn like a $200 gamer card into a $4000 Quadro, and performance was the same.

    I would like to know more about it

      Tomcat I'll just leave it here

      I think, in 2022, requiring a 1080p monitor for production work is not unreasonable. Granted, there should be some provisions or options for smaller monitors, but it would be extremely difficult to create any sort of complex software or game today at 720p. Maybe if you are making a 240 x 360 pixel art game you could do it (like people do on Pico-8 or whatever), but a 3D game? Not likely.

      Tomcat Is it possible to turn an RTX into a Quadro?

      No, Nvidia blocked that trick probably 15 years ago. But there is a mod that enables vGPU, which might be useful for scientific work.

      https://wccftech.com/gpu-virtualization-functions-on-nvidia-geforce-cards-with-simple-mod/

      In any case, Quadro is not needed for game development. Most software works fine on consumer cards, and it's probably better anyway, since you're running on a stack similar to your users'. I guess there are some select industries that still need enterprise GPUs, but there is nothing special about those cards besides a firmware device ID that the software checks for. It's all just a scam.

        cybereality It's all just a scam.

        Well, considering the price ratio right now… I don't know how it is in the civilized world, but in our country, at the height of the crypto boom, you could get a Quadro for practically less than its gaming counterpart. 😃

        packrat Is it normal that I get logged out after roughly an hour, ever since the website went back up?

        Make sure to tick the "remember me" checkbox (or whatever it was) again when logging in.

        Tomcat The point of view that it's better to work on a small monitor?

        Not what I said. I said cater to your clients; don't expect them to have to cater to you.

        Most users are using 1080p or lower, so accommodating that should be the highest priority (minimum requirements), while promoting the benefits of higher resolutions (recommended requirements).

        My point was that when a user experiences a bug in Godot's UI on a 1080p screen, the solution is not 'get a bigger screen' 🤷

        Tomcat Is it possible to turn an RTX into a Quadro?

        No idea. I haven't been supporting CAD people for about 7 years, so my knowledge may be out of date, but I imagine firmware hacking is still a thing, as it was back then.

          I mean, 1080p is not exactly "low res", even in 2022. Yes, artists/designers and such will usually have larger, higher-resolution screens. But most indie developers are on a shoestring budget, with some crappy laptop, probably making a pixel art game for low-end devices. Also, since Godot is free and OpenGL has low system requirements, you end up attracting an audience that is younger, or just starting out, and has never made a game before. Not a professional audience like Unreal Engine's. So it's pretty important that the editor works for them.

          If it came to that, the user would probably just uninstall and leave a bad review. Not being mean; it's just what almost happened with a game that I bought.

          It required a certain resolution, and luckily I had a guru at the time who changed my resolution to fit the game. It eventually became a favorite.

          Tomcat I am of the opinion that there is a difference in requirements between users/players and developers/creators. For example, it is recommended to develop a game on a computer a level above what the game itself recommends for maximum quality. Well, that's how I was taught…

          I would argue the opposite: you define the minimum spec for the widest audience you could possibly capture, and develop against that minimum spec. You should still test higher-end systems, including monitors, especially now that we have ultrawides and super-ultrawides and whatnot.

            All of this does bring up a question, tho. Can you change the res on a Godot game so the user won't have to go through what I did?

            The 2D and Keep stretch settings are great, but they might be limiting. Sort of like a free aspect ratio.
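
            (By "2D and Keep" I mean the stretch options in the project settings; if I have the names right, in project.godot they look something like this:)

            ```
            [display]

            window/stretch/mode = "2d"
            window/stretch/aspect = "keep"
            ```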

            Bimbam Most users are using 1080p or lower, so accommodating that should be the highest priority

            I'm not arguing with that statement. Yes, when developing a game on a large monitor, we can't forget that the game will sometimes be played at lower resolutions. But it's possible to emulate a small monitor on a large one, while the opposite is… difficult.

            Megalomaniak I would argue the opposite: you define the minimum spec for the widest audience you could possibly capture, and develop against that minimum spec.

            This point I don't really understand. Optimization usually happens at the end of the work. You can estimate the desired minimum requirements, but only approximately. And you have to test on all available configurations.

              Tomcat This point I don't really understand. Optimization usually happens at the end of the work. You can estimate the desired minimum requirements, but only approximately. And you have to test on all available configurations.

              Yes, I said develop against it, not necessarily on it. It's just that the minimum spec you've set yourself is the one you test the most.

                Tomcat But it's possible to emulate a small monitor on a large one, while the opposite is… difficult.

                Actually, you can go both ways. I wrote a script that lets you choose the resolution while in-game (the idea was to use it in an options menu), and it can support any resolution, high or low. For example, I used it to test 4K performance on my 1440p monitor. I wouldn't say it was easy (it took me a day or two to write the script), but that's not long in terms of how long it takes to make a game, and the code is generic, so it could be used in any Godot project. Maybe I should release the source code; it's pretty handy.
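
                The rough idea is something like this (a minimal sketch of the same approach, assuming Godot 3.x and its SceneTree screen-stretch API, not the actual script):

                ```gdscript
                extends Node

                # With STRETCH_MODE_VIEWPORT the game actually renders at the given
                # base size and is then scaled to fit the window, so you can test a
                # 4K render on a 1440p monitor, or a tiny one on anything.
                func apply_resolution(res: Vector2) -> void:
                    get_tree().set_screen_stretch(
                        SceneTree.STRETCH_MODE_VIEWPORT,
                        SceneTree.STRETCH_ASPECT_KEEP,
                        res)

                func _ready() -> void:
                    apply_resolution(Vector2(3840, 2160))  # e.g. benchmark 4K
                ```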

                Megalomaniak Yes, I said develop against it, not necessarily on it. It's just that the minimum spec you've set yourself is the one you test the most.

                My computer is pretty good, but I have a $100 Intel mini-PC, as well as various laptops, that I use for testing. I would not want to develop on that $100 computer (I tried to, because I was going to make a video about it, and it was so painful). But it's important to test on more than your own machine: different specs, GPU brands, OSes, etc.

                I would imagine res is sort of like sprites? You can shrink them, but it looks bad if you stretch them.

                I do get the point of developing on an overpowered machine, as you want your workflow to be as fast as possible.

                I kinda wish there was some way to emulate/nerf performance realistically to match a given hardware target, but when I eventually do release a game, I'd just alter my target FPS according to https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1080-Ti-vs-Nvidia-GTX-1070/3918vs3609

                This is purely me eyeballing it, but the GTX 1070's performance looks to be roughly the median graphics card for Steam users at the moment.

                  Bimbam I do get the point of developing on an overpowered machine, as you want your workflow to be as fast as possible.

                  Yeah, that's exactly what I was going to say! And the big monitor helps speed things up.

                  Bimbam I kinda wish there was some way to emulate/nerf performance realistically to match a given hardware target

                  I still have the ancient Core i7-950/GTX780 for that.