Bimbam My first attempt at an actual plugin.
Nice job.
I have one. I got me a new PC last October. Had to, because the old one (bought just 5 months before) suffered from thermo-mechanical alteration when the place was covered by a lava flow from the Cumbre Vieja eruption. I paid 720,- Euros for an RX 6700 XT and 450,- for an AMD Ryzen 5800X :-/
Only problem with this PC is, it draws around 400 watts when overclocked, and still 70 when running within specification. It is blazingly fast, though. I say "problem" because my new place will be totally off grid, solar only, which calls for a somewhat more modest approach to nightly hacking at the PC off the batteries. And, tbh, I am at an age where a cold beer in the fridge is more important than a few fps ;-)
Just read in the news that the EU will rein in the miners, among other things requiring them to file studies on the environmental impact of their ... doings. Goes in the right direction, imo.
tl;dr: I think the approach of developing hardware that's less demanding energy-wise deserves support. Also, the current Vega-based graphics of the 5x00G series are pretty good, but looking at what's ahead with faster RAM, Navi 2, and even lower power consumption, it may be worth waiting. All of these can run what I am doing with CG.
Honestly, we don't even need new video cards or computers. Current hardware is already good enough to render completely photorealistic scenes in real time. The software is just far behind. Take my system: the CPU has 16 cores and 32 threads, yet most software can barely take advantage of a fraction of that (a lot of software is still single-threaded, or parallel with weak algorithms). GPUs are super fast and powerful. So I think my current computer will be fine for another 5 or even 10 years, and still see consistent improvement every year, provided the software can catch up.
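To illustrate what I mean by the software having to catch up: the cores are there, but code only uses them if it's written to. A minimal C++17 sketch (assuming a standard library with the parallel algorithms, e.g. recent MSVC, or GCC/Clang linked against TBB; the toy workload is made up purely for illustration):

```cpp
// Minimal sketch: the machine reports its thread count, but the default
// std::for_each runs on a single core; only the call with an explicit
// execution policy spreads the same work across all of them.
#include <algorithm>
#include <cmath>
#include <execution>
#include <iostream>
#include <thread>
#include <vector>

int main() {
    std::cout << "hardware threads: " << std::thread::hardware_concurrency() << "\n";

    std::vector<double> v(1 << 24, 2.0);

    // Sequential by default -- uses one core no matter how many exist.
    std::for_each(v.begin(), v.end(), [](double& x) { x = std::sqrt(x); });

    // Same work, explicitly opted in to the parallel execution policy.
    std::for_each(std::execution::par, v.begin(), v.end(),
                  [](double& x) { x = std::sqrt(x); });
}
```

On a 16-core/32-thread CPU the first loop pegs a single core while the rest sit idle; only the second one actually spreads the work out.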
That's exactly the point. We have more than enough power; we just rarely use it.
Games as a whole, and preparing data for the render pipeline, don't lend themselves well to parallelization anyway. From my layman's observation, it is all far too sequential. Better applications lie in scientific calculations, where high exactness and numerical stability are needed and models are calculated with a grid-based approach. But games rarely have that requirement.
It's a problem of resources. Making faster processors only requires a handful of technicians, and though it's tedious work, it's only really difficult when they're shrinking the minimum feature size -- and even that is well-researched. Making software that takes advantage of multiple processors is a much harder problem, and everyone who touches any piece of software has to figure it out, or they end up dragging everyone else back. When you program linearly there's no need to synchronize the results of an unknown number of processes (and you can't assume any number of threads beyond one).
Some programmers have told me that multi-threading is easy, but they're usually thinking of solutions that are really inefficient, like just locking anything that has to be shared. If every thread has to wait on something, how are you improving anything? On the other hand, if you split data between threads, you lose efficiency in other ways: threaded compression algorithms, for example, may have exactly the same data in two threads, but the threads don't know that, so they store it twice anyway.
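To make that concrete, here's a rough C++ sketch of the two patterns I mean. The toy task (summing a big array) and all the names are mine, purely for illustration, not from any real codebase:

```cpp
// Rough sketch of the two patterns, both summing a big array across threads.
#include <mutex>
#include <thread>
#include <vector>

// Version A: one shared total behind a mutex. Every element touches the
// lock, so the threads mostly wait on each other and extra cores barely help.
double sum_locked(const std::vector<double>& v, unsigned n_threads) {
    double total = 0.0;
    std::mutex m;
    std::vector<std::thread> workers;
    const std::size_t chunk = v.size() / n_threads;
    for (unsigned t = 0; t < n_threads; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = (t + 1 == n_threads) ? v.size() : begin + chunk;
        workers.emplace_back([&, begin, end] {
            for (std::size_t i = begin; i < end; ++i) {
                std::lock_guard<std::mutex> guard(m);
                total += v[i];
            }
        });
    }
    for (auto& w : workers) w.join();
    return total;
}

// Version B: each thread keeps its own partial sum, merged once at the end.
// No waiting, but every thread now needs its own working state -- the
// duplication cost mentioned above.
double sum_partitioned(const std::vector<double>& v, unsigned n_threads) {
    std::vector<double> partial(n_threads, 0.0);
    std::vector<std::thread> workers;
    const std::size_t chunk = v.size() / n_threads;
    for (unsigned t = 0; t < n_threads; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = (t + 1 == n_threads) ? v.size() : begin + chunk;
        workers.emplace_back([&, t, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                partial[t] += v[i];
        });
    }
    for (auto& w : workers) w.join();
    double total = 0.0;
    for (double p : partial) total += p;
    return total;
}
```

Call either with something like std::max(1u, std::thread::hardware_concurrency()) as n_threads. Version A barely scales because the lock serializes the hot path; Version B scales, but only because every thread carries its own state, which is exactly the duplication cost I mentioned.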
Well, it's theoretically possible. In principle, a compiler could automatically multi-thread code, but no one has really figured it out yet. And I agree, the topic is too complex to expect every programmer to know how to do it. But it seems to be the direction we should be going.
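The closest thing to that in practice, as far as I know, is something like OpenMP: the compiler and runtime do spread the loop across cores, but only because the programmer marks it and promises there are no cross-iteration dependencies. A rough sketch (assuming GCC/Clang with -fopenmp; without that flag the pragma is simply ignored and the loop stays sequential):

```cpp
#include <cstdio>
#include <vector>

int main() {
    std::vector<float> a(1 << 24, 1.0f), b(1 << 24, 2.0f);

    // A plain sequential loop until the programmer adds the pragma and
    // guarantees the iterations are independent; then the runtime splits
    // them across the available cores.
    #pragma omp parallel for
    for (long i = 0; i < static_cast<long>(a.size()); ++i)
        a[i] = a[i] * 0.5f + b[i];

    std::printf("a[0] = %f\n", a[0]);
    return 0;
}
```

Fully automatic parallelization does exist (e.g. GCC's -ftree-parallelize-loops), but it only kicks in for very simple loops, which fits the point that nobody has really cracked it yet.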
Hello Gamers. I've been looking for some nice forums to join as a better alternative to social media. Happy to join here!
Welcome @CLAYOoki
Given that only the top-end RTX cards can handle 4K with ray tracing, and a lot of people are not one but two generations behind thanks to prices over the last two years, I am very happy prices are coming down.
As much as my 1080 Ti has been a beast for a long time, honestly I'm running new games on Medium or worse to get 60 FPS at 4K, and suffering in Cycles render times compared to the latest and greatest.
What I will say is that we are in an annoying limbo: even if prices are dropping, all evidence suggests the next-gen RTX is due in 2-4 months, and spec leaks point to almost double current RTX performance, so buying now both is and isn't a good idea -.-
Similar story for the next Ryzen. I'm still miffed about Threadripper now being Pro-only, which basically means you pay wild prices the more cores you want (and I want more).
There is always something new coming in a few months. By that logic, you will never spend a dime.
A new card, yes. But for Nvidia a new series comes every 2 years.
The 900 series GeForces came out in September 2014.
The 1000 series GeForces came out in May 2016.
The 2000 series GeForces came out in September 2018.
The 3000 series GeForces came out in September 2020.
So holding off until maybe September for the imminent 4000 series seems a good idea. Well, at least to see what's up with them, since current rumors say only the 4090 comes in September; the cheaper ones are later.
Sure. But when a new series comes out, stock is low, everyone wants it, there are scalpers, the price goes up, it's out of inventory. By the time the price drops back to MSRP, a new card is about to come out and you are in the exact same situation again (and probably regret waiting instead of just buying something when you had the chance).
Plus crypto just tanked, so GPU prices are the lowest they've been in years. Who knows, what if crypto takes off tomorrow and blows up? Then you will be hitting yourself in the face for the next 2 or 3 years until the next crash.
cybereality I'd like to be wrong, but I think we are in for a quite long and hard recession, worldwide. Crypto was a scam from the start and I doubt it's coming back, at least in its present form. I'd say things like new video cards will come in much slower increments, just like cars and all kinds of other things, to conserve resources. A lot of people will be lucky to have food on the table. The federal reserves of the world are probably some of the biggest idiots anyone ever put in charge.
I guess what I'm saying is, you should make your decision (any decision) based on the available information at this point in time. For example, if you need a new video card, and there is a decent one you like for say $500, and you have $500 to spend, then the choice is clear. It is within your budget, it has the performance you need (and the performance will not drop, regardless of what comes out next month), and it is available today. If you base your logical process on speculation, then you probably have a 50/50 chance of being right (or maybe less, as there are multiple variables, all unknown). It's just not a good decision making process.
For example, there are rumors that a new Nvidia card will come out in 2 months. This is not for sure; there are always wild rumors and most of the time they are bunk. But even if it were true, Nvidia always launches the flagship first, so we may wait another 2 or 3 months for a mid- or low-range version of the chip. This is not guaranteed, but a safe assumption. So, even in the most optimistic reading of this rumor/assumption, an RTX 4060 card would be coming out in November.

But inventory is typically low at first, and it will be a hot card everyone wants to buy. So there will be scalpers, the price will be inflated, and there is no telling when it will come down to MSRP. But let's make another assumption, that inventory will pick up after 2 months and the price will go down. So now we are in January of 2023, and this is with the optimistic prediction.

We still don't know if crypto will come back within that timeframe (unlikely, but there is a chance). We don't know how the economy will change, if inflation flies up, or the stock market crashes, or any other event that might make $500 today cost you $1,000 next year. We don't know if there will be manufacturer defects or bad batches (this has happened frequently with Nvidia in recent years). There are just so many unknowns, and even in the best case scenario, if all those risks went in your favor, you still have to wait until next year, rather than buy a card today and have it in your computer in 3 days. This is why I say that basing purchases on rumors and speculation is always a mistake.
I've given up buying things because others suggest I need them. I can't possibly keep up with the rat race anyway. I buy hardware when I need it, not whenever another sow is driven through the village (is that a valid English saying? :-)). For my own part, I don't need 4K resolution and imperceptible frame rates, and energy consumption and environmental impact are as important (Edit: proofreading, I'd say more important) to me as the latest tech.
There is, for instance, currently no car that interests me. I won't buy a new one even if my old car breaks down; I'll take the bus then, and take something to read with me. Saves me a lot of time I'd otherwise waste holding on to a steering wheel, not the most intelligent occupation ...
Actually, I feel this is exactly where Godot stands out from other high-end engines, besides being accessible to everyone through programming interfaces that really work. Unreal didn't work for me, and Unity has too much fluff, technically and legally.
Yeah, back to work, nice day everyone :-)
Quadro vs GeForce
NVIDIA GeForce RTX 3070 Modded With 16 GB GDDR6 Memory
Russia-based modder VIK-on has upgraded the NVIDIA GeForce RTX 3070 graphics card with 16 GB of GDDR6 memory. The modding required some technical experience, but in the end VIK-on had the RTX 3070 running with twice its original memory capacity.
My point is that if the difference between Quadro and GeForce were insignificant, such redesigns would be mass-produced.
Perhaps there is a slightly deeper difference in the hardware than is commonly thought.