Maybe it was a different thread, but this came up before.
Basically, dev on the fastest machine available to you. You want your workflow to be as fast as possible, and whatever you can justify paying for will make that process faster. Does the GPU matter? Well, for high-resolution 3D texture baking, for example, it can hugely speed things up, but I feel like a lot of Godot devs probably don't care about this so much, and ultimately a weak GPU won't 'stop' you, just slow you down.
Target deployment is trickier, as in an ideal world you would just have the hardware on hand to test on. Ultimately this is where early marketing to get alpha builds to happy testers comes in, but generally I would just eyeball the approximate performance difference between your dev card and the target card and apply that ratio to your FPS targets.
This won't pick up on things like OpenGL compatibility differences, though, but there's no perfect solution really.
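To make the eyeballing concrete, here's a minimal sketch of the scaling I mean (the numbers and the linear-scaling assumption are mine, and it only really holds when you're GPU-bound rather than CPU-bound):

```python
# Rough back-of-envelope FPS scaling, assuming the game is GPU-bound and
# performance scales roughly linearly with the relative benchmark score
# between the two cards. The 2.5x ratio below is a made-up example; pull
# your own number from whatever benchmark charts you trust.

def dev_fps_needed(target_fps: float, dev_vs_target_ratio: float) -> float:
    """FPS the game should hit on the dev card to land near `target_fps`
    on a target card that is `dev_vs_target_ratio` times slower."""
    return target_fps * dev_vs_target_ratio

if __name__ == "__main__":
    # Example: aiming for 60 FPS on a target GPU roughly 2.5x slower than the dev GPU
    print(dev_fps_needed(60, 2.5))  # -> 150.0 FPS needed on the dev machine
```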
My personal 'need' for a high-performance GPU is not as simple as 'because hobbies', as I also use my GPU for my day job (I'm an ML/Data Scientist for a Startup/SMB, and believe me, running my own hardware is a lot cheaper than AWS at our size). As such I'm happy to pay fairly hefty amounts, as it's tax-deductible and the gains would be on the order of 20-30 hrs/month saved, but I'm also cognisant that my current solution IS still sufficiently workable, so waiting for the new gen to drop to see how it affects price/performance ratios is fine by me.
I won't be waiting till January though lol.