cybereality It's all just a scam.
Well, considering the price ratio right now… I don't know how it is in the civilized world, but in our country, at the height of the crypto boom, you could get a Quadro almost cheaper than its gaming counterpart.
packrat Is it normal that I get logged out roughly an hour after the website came back up?
Make sure to tick the "remember me" checkbox (or whatever it's called) again when logging in.
A legit sundown in space - the ISS
Tomcat The point of view that it's better to work on a small monitor?
Not what I said. I said cater to your clients, don't expect them to have to cater to you.
Most users are using 1080 or less, so accommodating for that should be the highest priority (minimum requirements), while promoting the benefits of larger resolutions (recommended requirements).
My point was that when a user experiences a bug in Godot's UI on a 1080 screen, the solution is not "get a bigger screen".
Tomcat Is it possible to turn an RTX into a Quadro?
No idea. I haven't supported CAD users for about 7 years, so my knowledge may be out of date, but I imagine firmware hacking is still a thing, as it was back then.
I mean, 1080p is not exactly "low res", even in 2022. Yes, for artists/designers and such, they will usually have larger and higher resolution screens. But for indie developers, most are on a shoe-string budget, with some crappy laptop, and probably making a pixel art game for low end devices. Also, since Godot is free, and OpenGL has low system requirements, you end up attracting an audience that is younger or starting out and never made a game before. Not a professional audience like Unreal Engine. So it's pretty important that the editor works for them.
If it came to that, the user would probably just uninstall and leave a bad review. Not being mean, it's just what almost happened to a game that I bought.
It required a certain resolution, and luckily I had a guru at the time who changed my resolution to fit the game. It eventually became a favorite.
Tomcat I am of the opinion that there is a difference in requirements between users/players and developers/creators. For example, it is recommended to develop a game on a computer one level above the game's recommended spec for maximum quality. Well, that's how I was taught…
I would argue for the opposite, you define the minimum spec for the widest audience you want to possibly capture and develop against that minimum spec. You should still test higher end systems including monitors as well, especially now that we have ultra wides and super ultra wides and whatnot.
All of this does bring up a question, tho. Can you change the res on a Godot game so the user won't have to go through what I did?
The 2D and Keep stretch settings are great, but they might be limiting. Something like a free aspect ratio would be nice.
Bimbam Most users are using 1080 or less, so accommodating for that should be the highest priority
I don't argue with this statement. Yes, when developing games on a large monitor, we can't forget that the game will sometimes be played at lower resolutions. But it's possible to emulate a small monitor on a large one, while the opposite is… difficult.
Megalomaniak I would argue for the opposite, you define the minimum spec for the widest audience you want to possibly capture and develop against that minimum spec.
I don't really understand this point. Optimization usually happens at the end of the work. You can estimate the desired level of minimum requirements, but only approximately. And you have to test on all available configurations.
Yes, Godot fully supports dynamic resolution. For 3D I think it is set like that by default, but you have to adjust settings to get it working properly in 2D. But either way, you'll have to do some work so it looks right on all different sized screens.
https://docs.godotengine.org/en/stable/tutorials/rendering/multiple_resolutions.html
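For reference, you can also set the stretch mode from a script in Godot 3.x. A minimal sketch (the mode, aspect, and base resolution here are just example values, not the only sensible choice):

```gdscript
# Minimal sketch (Godot 3.x): apply a stretch mode at runtime so the game
# scales to whatever resolution the player actually has.
extends Node

func _ready():
	# Scale the 2D canvas from a 1920x1080 design resolution,
	# keeping the aspect ratio (letterboxing if the screen differs).
	get_tree().set_screen_stretch(
		SceneTree.STRETCH_MODE_2D,
		SceneTree.STRETCH_ASPECT_KEEP,
		Vector2(1920, 1080)
	)
```

The same settings can be set once under Project Settings > Display > Window instead of in code.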
Tomcat I don't really understand this point. Optimization usually happens at the end of the work. You can estimate the desired level of minimum requirements, but only approximately. And you have to test on all available configurations.
Yes, I said develop against it, not necessarily on it. It's just that the minimum spec you've set yourself is the one you test the most.
Tomcat But it's possible to emulate a small monitor on a large one, while the opposite is… difficult.
Actually, you can go both ways. I wrote a script that allows you to choose the resolution while in game (the idea was to use it on an options menu) and I can support any resolution, high or low. For example, I used this to test 4K performance on my 1440p monitor. I wouldn't say it was easy, it took me a day or two to write the script, but not that long in terms of how long it takes to make a game (and the code is generic, so could be used on any Godot project). Maybe I should release the source code, it is pretty handy.
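Not that script, but the core idea is roughly this (Godot 3.x; the resolution list and function name are just placeholders):

```gdscript
# Rough sketch of a resolution picker for an options menu (Godot 3.x).
extends Node

# Example list; a real menu would probably query the display instead.
var resolutions = [
	Vector2(1280, 720),
	Vector2(1920, 1080),
	Vector2(2560, 1440),
	Vector2(3840, 2160),
]

func apply_resolution(index: int) -> void:
	var size: Vector2 = resolutions[index]
	OS.window_size = size   # resize the OS window
	OS.center_window()      # keep it centered on the desktop
	# With a stretch mode set (see the docs link above), the viewport
	# rescales the game content to the new window size automatically.
```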
Megalomaniak Yes, I said develop against it, not necessarily on it. It's just that the minimum spec you've set yourself is the one you test the most.
My computer is pretty good, but I have a $100 Intel mini-PC, as well as various laptops, that I use for testing. I would not want to develop on that $100 computer (I tried to, because I was going to make a video about it, and it was so painful). But it's important to test on more than your own machine. Different specs, GPU brands, OSes, etc.
I would imagine res is sort of like sprites? You can shrink them, but it looks bad if you stretch them.
I do get the point of developing on an overpowered machine, as you want as much of your workflow to be fast as possible.
I kinda wish there was some way to emulate/nerf performance realistically to match a given hardware target, but when I eventually do release a game I would just alter my target FPS according to https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1080-Ti-vs-Nvidia-GTX-1070/3918vs3609
This is purely me eyeballing it, but the GTX 1070's performance looks to be roughly the median graphics card for Steam users atm.
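For a rough sense of what that means: if that comparison puts the 1080 Ti at roughly 1.6x the 1070's effective speed, then a GPU-bound scene holding about 96 FPS on the dev machine would land somewhere around 60 FPS on the median card.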
Bimbam I do get the point of developing on an overpowered machine, as you want as much of your workflow to be fast as possible.
Yeah, that's exactly what I was going to say! And the big monitor helps speed things up.
Bimbam I kinda wish there was some way to emulate/nerf performance realistically to match a given hardware target
I still have the ancient Core i7-950/GTX780 for that.
Bimbam I kinda wish there was some way to emulate/nerf performance realistically to match a given hardware target
Me too. I have looked into it, and there is not an easy way to do this. Technically it could work, but it would have to be some sort of hardware switch. For example, disabling a portion of shader cores, slowing the processor, reducing the VRAM allocation, etc. On Windows, you can sort of do this with apps like MSI Afterburner. At least you can reduce the clock rate to an arbitrary number and things like that. But you would also need to limit the VRAM, disable shader cores, and a few other things for an apples to apples comparison. Linux is a little more limited in this sense in terms of overclocking (though there are some command line apps, they aren't as robust as the tools for Windows).
cybereality that is one thing that is neat with the Steam Deck and Steam OS - you can adjust the system wattage and the GPU clock speed easily. It isn’t quite like the Windows tools yet, but it is the closest I’ve seen so far from an easy usability perspective.
On Linux, maybe the tuned daemon could be used to nerf settings. It also has a graphical front-end, I think.
System daemons such as acpid also have tools to scale the CPU clock, I think (never used them). Care should be taken adjusting system clock speeds from a root account, as it might interfere with the system's timestamps.
And I believe both Nvidia and AMD expose some sort of power limiting on the GPU. Stealthy miners depend on them…
I wrote a tool that could nerf networking. It injected itself into the Winsock sendto function and added random packet corruption, packet loss, reordering, etc, even when testing on a LAN or single machine.
It was for testing my students' network game projects (they had to implement their own UDP based network protocols).
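For anyone who wants a similar effect without hooking Winsock, a cruder option is to wrap your own send calls in Godot itself. A rough sketch (Godot 3.x; the drop/corrupt rates and the destination address are made up):

```gdscript
# Crude fault injection for UDP testing (Godot 3.x): randomly drop or corrupt
# outgoing packets before they are sent.
extends Node

var peer := PacketPeerUDP.new()
var drop_rate := 0.1      # 10% of packets silently dropped
var corrupt_rate := 0.05  # 5% of packets get one byte flipped

func _ready():
	randomize()
	peer.set_dest_address("127.0.0.1", 4242)  # example destination

func send_unreliable(packet: PoolByteArray) -> void:
	if randf() < drop_rate:
		return  # simulate packet loss
	if randf() < corrupt_rate and packet.size() > 0:
		var i := randi() % packet.size()
		packet[i] = packet[i] ^ 0xFF  # simulate a corrupted byte
	peer.put_packet(packet)
```

Reordering would need a small buffer of delayed packets on top of this, but the principle is the same.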