Plus, crypto just tanked, so GPU prices are the lowest they've been in years. But who knows, what if crypto takes off tomorrow and blows up again? Then you'll be kicking yourself for the next two or three years until the next crash.

    cybereality I'd like to be wrong, but I think we are in for quite a long and hard recession, worldwide. Crypto was a scam from the start and I doubt it's coming back, at least in its present form. I'd say things like new video cards will come in much slower increments, just like cars and all kinds of other things, to conserve resources. A lot of people will be lucky to have food on the table. The federal reserves of the world are probably some of the biggest idiots anyone ever put in charge.

    I guess what I'm saying is, you should make your decision (any decision) based on the information available at this point in time. For example, if you need a new video card, and there is a decent one you like for, say, $500, and you have $500 to spend, then the choice is clear. It is within your budget, it has the performance you need (and that performance will not drop, regardless of what comes out next month), and it is available today. If you base your decision on speculation, then you have at best a 50/50 chance of being right (or maybe less, as there are multiple variables, all unknown). It's just not a good decision-making process.

    For example, there are rumors that a new Nvidia card will come out in 2 months. This is not for sure; there are always wild rumors and most of the time they are bunk. But even if it were true, Nvidia always launches the flagship first, so we might wait another 2 or 3 months for a mid- or low-range version of the chip. This is not guaranteed, but a safe assumption. So, even in the most optimistic version of this rumor, the RTX 4060 would be coming out in November. But inventory is typically low at first, and it will be a hot card everyone wants to buy. So there will be scalpers, the price will be inflated, and there is no telling when it will come down to MSRP.

    But let's make another assumption: that inventory will pick up after 2 months and the price will go down. So now we are in January of 2023, and this is with the optimistic prediction. We still don't know if crypto will come back within that timeframe (unlikely, but there is a chance). We don't know how the economy will change, if inflation flies up, or the stock market crashes, or any other event that might make $500 today cost you $1,000 next year. We don't know if there will be manufacturer defects or bad batches (this has happened frequently with Nvidia in recent years). There are just so many unknowns, and even in the best-case scenario, if all those risks went in your favor, you still have to wait until next year, rather than buy a card today and have it in your computer in 3 days. This is why I say that basing purchases on rumors and speculation is always a mistake.

    I've given up buying things because others suggest I need them. I can't possibly keep up with the rat race anyway. I buy hardware when I need it, not when another sow is being driven through the village (is that a valid English saying? :-)). For my own part, I don't need 4K resolution and imperceptible frame rates, and energy consumption and environmental impact are as important (Edit: proofreading, I'd say more important) to me as the latest tech.

    There is, for instance, currently no car that interests me. I won't buy a new one even if my old car breaks down; I'll take the bus instead and bring something to read. Saves me a lot of time I'd waste holding on to a steering wheel, not the most intelligent occupation ...

    Actually, I feel that this is exactly where Godot stands out from other high-end engines, besides being accessible to everyone through programming interfaces that actually work. Unreal didn't for me, and Unity has too much fluff, technically and legally.

    Yeah, back to work, nice day everyone :-)

    Quadro vs GeForce

    NVIDIA GeForce RTX 3070 Modded With 16 GB GDDR6 Memory

    The Russian modder VIK-on has upgraded the NVIDIA GeForce RTX 3070 graphics card to 16 GB of GDDR6 memory. The mod required some technical experience, but in the end VIK-on had the RTX 3070 running with twice its original memory capacity.

    My point is that if the difference between Quadro and GeForce was insignificant, the redesigns would be mass-produced.

    Perhaps there is a slightly deeper difference in the hardware than is commonly thought.

      Tomcat Perhaps there is a slightly deeper difference in the hardware than is commonly thought.

      I believe the difference is 64 bit performance.

        Pixophir I believe the difference is 64 bit performance.

        This is a somewhat belated response to the old discussion

        cybereality …but there is nothing special about them, besides a firmware device id which the software checks for. It's all just a scam.

        @cybereality 's word has certainly more weight than mine. If there are differences between the Quadro and Geforce, these can certainly be found out.

        I want 64bit on the GPU. That'll make my life so much easier :-)

        Bimbam

        That's my favorite time to buy... the older models. Get 'em dirt cheap and use the money you saved for something fun. 🙂

        Tomcat My point is that if the difference between Quadro and GeForce was insignificant, the redesigns would be mass-produced.

        Well, the board is not exactly the same. I don't know about the current designs, but at one point (about 15 years ago) the GeForce and Quadro cards were exactly the same, with basically a firmware switch that unlocked additional functions. This has changed since then. But in most cases, I would assume the GPU chip itself is the same, because it is much cheaper to mass-produce one chip for all the different models than to spin up a production line for each SKU. That is not to say the boards are identical, though: they can have different amounts of RAM, different memory controllers or bus speeds, etc., so it can get complicated.

        CPUs are the same way. Basically every i7 chip (of a given generation) is the same chip. Intel mass-produces one die, and the ones that test better become the high-end model (which they charge more for), while the ones that are not as good become mid-range or low-end. Some are slightly defective, so they take an 8-core part, disable 2 cores, and sell it as a 6-core. But they are all essentially the same chip.
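        The binning process described above can be sketched as a simple decision rule. This is purely illustrative (not Intel's actual process or SKU names): identical dies come off the line, and post-fab test results decide which product each one becomes.

```python
# Illustrative binning sketch. All dies are physically identical; the SKU is
# decided by how each die performs in testing. SKU names are made up.
def bin_die(working_cores, max_stable_ghz):
    """Assign a hypothetical SKU based on post-fabrication test results."""
    if working_cores >= 8 and max_stable_ghz >= 5.0:
        return "i7 (8-core, high clock)"
    if working_cores >= 8:
        return "i7 (8-core, standard clock)"
    if working_cores >= 6:
        # A die with one or two defective cores: fuse them off, sell as 6-core.
        return "i5 (6-core, cores fused off)"
    return "scrap"
```

        The same die can land anywhere in the product stack, which is the point cybereality is making: price segmentation happens after manufacturing, not before.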

          cybereality But in most cases, I would assume the GPU chip itself is the same.

          The fact that there is only one basic chip is not a secret and is not hidden. But a simple re-flash no longer turns a GeForce into a Quadro the way it did "about 15 years ago". That may have been true then, but I can't call the current situation "all just a scam".

          Unfortunately, I can't check the differences in practice right now. I was going to get a Xeon W-3335 and an RTX A2000 12GB, but circumstances forced me to get a tablet for mobility instead.

          Well, not necessarily a scam, but in some ways an artificial market segmentation. But now that I look at the prices, there is not as huge a difference in the Quadro card prices as there once was. Either way, all companies do this in some way or another. You can decide whether you call that a scam or not, or just good business.

          And this was not just 15 years ago. There was a hack from last year that enabled GPU virtualization on consumer cards, a feature supposedly only possible on Quadro and Tesla chips. You can even look at the code on Github, the driver basically just checks for an id number to decide if the feature is going to work or not.

          https://www.notebookcheck.net/Hack-allows-unlocking-GPU-virtualization-functionality-on-consumer-NVIDIA-cards.531761.0.html
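          The kind of gate described above can be sketched in a few lines. This is a hypothetical illustration, not NVIDIA's actual driver code; the device IDs and names are made up. The feature itself works the same everywhere, and only a whitelist check decides whether it is exposed.

```python
# Hypothetical sketch of software-side feature gating: the driver implements
# vGPU identically for all chips, but only enables it when the card's PCI
# device ID is on an allow list. IDs below are illustrative, not real.
VGPU_ALLOWED_DEVICE_IDS = {0x1EB8, 0x1DB6}  # pretend Tesla/Quadro-class IDs

def vgpu_supported(device_id):
    """Return True if the (made-up) allow list permits vGPU on this device."""
    return device_id in VGPU_ALLOWED_DEVICE_IDS
```

          A mod like vgpu_unlock works by making this check pass for a consumer card, for example by spoofing the device ID the driver sees, rather than by adding any missing hardware capability.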

            I can't help but miss my old game engine sometimes, just because I am used to it. Granted, I used it for eight years straight. Then I come back here after I get sick of it... I am so conflicted, as I just want one engine to learn and stick with. I don't want to spend another eight years learning coding basics, though. Hoping that after a while I will become more satisfied here.

            Okay, rant over.

            cybereality There was a hack from last year that enabled GPU virtualization on consumer cards, a feature supposedly only possible on Quadro and Tesla chips.

            Oh, that's very interesting 👍️ but that's just one of the features of Quadro:

            All said, the vgpu_unlock mod does not fully replace purchasing an NVIDIA-recommended vGPU solution.

            Buying the Autodesk/NVIDIA's 'recommended solution' is almost entirely for warranty and support.

            GeForce cards have been shown to be 100% compliant with Autodesk's own tests before; they just aren't listed as 'verified' because NVIDIA won't support them at the driver level for CAD software if something goes wrong due to a GeForce driver issue (I can't vouch for all card/driver combos obviously, but you can find videos of people demonstrating this).

            If you're running a business where reliability and warranties are paramount, then Quadro is the de facto standard; I'm not debating that. But that's not because dropping in a GeForce card won't do the same job most of the time for a tenth of the price. It's because NVIDIA doesn't profit from it, and it will arguably be somewhat less 'reliable'. It's not a scam, just business really.

            For VRAM-limited tasks like certain ML, pro 3D animation, or simulation work, the extra memory available on the Quadro/Tesla range (24 GB+) and double precision may force your hand, but for most people (surveyors especially) these limits are unlikely to matter.

            Anyway, I guess my point is that if you're an indie, self-employed, or even a startup/SMB, the actual risk of using your GeForce card, VRAM permitting, to do the same job for a tenth of the price is minimal (though it should always be weighed against your use case). And as your GeForce card is likely clocked higher, there's a fair chance it will outperform its like-for-like Quadro in such tasks anyway, though it comes with a shorter third-party warranty and potential (albeit small) driver compatibility problems.

            If however you're paying the ~$2k/yr for Maya already, you're probably not fussed that Quadros are expensive anyway.

            Kojack So holding off until maybe September for the imminent 4000 series seems a good idea. Well, at least to see what's up with them, since current rumors say only the 4090 will be September; the cheaper ones come later.

            And it'll take time, but previous-generation prices should drop a bit in response to the new generation's cards, unless Nvidia pulls an Nvidia and raises the MSRP again...

            I'm glad nothing at all has changed since I went through my consumer electronics phase.

            If your intent is to write games that other people will run on their platforms, does it really matter whether your development system has a state-of-the-art GPU?