My first attempt at an actual plugin. Seems the gif is too big to attach. Basically it's a basic terrain generator where a Bézier curve defines the shape. It's pretty much only useful for simple platforms or basic level block-outs, and it's pretty jank, but it might interest some.

https://github.com/Bimbam360/Curve_Terrain
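
I haven't dug into the plugin's code, so treat this as a loose sketch of the general idea rather than how Curve_Terrain actually does it: sample a cubic Bézier curve at regular intervals and turn each sampled height into a terrain column. All names and numbers below are made up for illustration.

```cpp
#include <array>
#include <cstdio>

// Hypothetical sketch (not the plugin's actual code): evaluate a cubic Bézier
// curve and quantize it into column heights, the general idea behind letting
// a curve define a terrain silhouette.
struct Vec2 { double x, y; };

// Cubic Bézier in Bernstein form at parameter t in [0, 1].
Vec2 cubic_bezier(const std::array<Vec2, 4>& p, double t) {
    double u = 1.0 - t;
    double b0 = u * u * u, b1 = 3 * u * u * t, b2 = 3 * u * t * t, b3 = t * t * t;
    return { b0 * p[0].x + b1 * p[1].x + b2 * p[2].x + b3 * p[3].x,
             b0 * p[0].y + b1 * p[1].y + b2 * p[2].y + b3 * p[3].y };
}

int main() {
    // Control points roughly sketching a hill profile (made-up values).
    std::array<Vec2, 4> ctrl{{ {0, 0}, {2, 5}, {6, 5}, {8, 0} }};
    const int columns = 16;
    for (int i = 0; i <= columns; ++i) {
        Vec2 pt = cubic_bezier(ctrl, double(i) / columns);
        // Each sampled y becomes the height of one terrain column / platform block.
        std::printf("column %2d: height %.2f\n", i, pt.y);
    }
    return 0;
}
```

In Godot the sampling would normally go through a Curve2D resource rather than hand-rolled math, but the underlying idea is what the snippet shows.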

    Gorgeous!


    Looking forward to the next generation of AMD onboard graphics (Navi 2) so I can get a PC for development with low power consumption. Yay.

      Pixophir Prices on video cards are starting to drop because of fewer crypto miners, so it should be a good time to buy soon.

      I have one. I got myself a new PC last October. Had to, because the old one (bought just 5 months before) suffered from thermo-mechanical alteration when the place was covered by a lava flow from the Cumbre Vieja eruption. I paid 720 euros for an RX 6700 XT and 450 for an AMD Ryzen 5800X :-/

      Only problem with this PC is that it draws around 400 watts when overclocked, and still 70 when running within specification. Though it is blazingly fast. I say "problem" because my new place will be totally off-grid on solar alone, which calls for a somewhat more modest approach to nightly hacking at the PC from the batteries. And, tbh, I am at an age where a cold beer in the fridge is more important than a few fps ;-)

      Just read in the news that the EU will rein in the miners, among other things demanding that they file studies on the environmental impact of their ... doings. Goes in the right direction, imo.

      tl;dr: I think the push to develop hardware that's less demanding energy-wise deserves support. Also, the current Vega-based graphics of the 5x00G series are pretty good, but looking at what's ahead with faster RAM, Navi 2, and even lower power consumption, it may be worth waiting. All of these can run what I am doing with CG.

      Honestly, we don't even need new video cards or computers. Current hardware is already good enough to render completely photorealistic scenes in real time. The software is just far behind. Take my system: the CPU has 16 cores and 32 threads. Most software can barely take advantage of a fraction of that (a lot of software is still single-threaded, or parallel with weak algorithms). GPUs are super fast and powerful. So I think my current computer will be fine for another 5 or even 10 years, and still see consistent improvement every year, provided the software can catch up.

      That's exactly the point. We have more than enough power; we rarely use it anyway.

      Games as a whole, and preparing data for the render pipeline, don't lend themselves well to parallelization anyway. From my layman's observation, it is all far too sequential. Better applications lie in scientific calculations, where high exactness and numerical stability are needed and models are calculated on a grid-based approach. But games rarely have that requirement.

      It's a problem of resources. Making faster processors only requires a handful of technicians, and though it's tedious work, it's only really difficult when they're shrinking the minimum feature size -- and even that is well-researched. Making software that takes advantage of multiple processors is a much harder problem, and everyone who touches any piece of software has to figure it out, or they end up dragging everyone else back. When you program linearly, there's no need to synchronize the results of an unknown number of processes (and you can't assume any number of threads beyond one).

      Classic Example

      Some programmers have told me that multi-threading is easy, but they're usually thinking of solutions that are really inefficient, like just locking anything that has to be shared. If every process has to wait on something, how are you improving anything? On the other hand, if you split data between threads, you lose efficiency in other ways, like threaded compression algorithms, which may end up with exactly the same data in two threads; the threads don't know that, so they store it twice anyway.
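
      For what it's worth, here's a toy sketch of the contrast being described, using a plain array sum (made up for illustration, not from any project mentioned here). The first version takes a lock for every shared update, so the threads mostly wait on each other; the second gives each thread its own partial result and only merges at the end.

      ```cpp
      #include <cstdio>
      #include <mutex>
      #include <numeric>
      #include <thread>
      #include <vector>

      // Version A: "just lock everything shared" -- correct, but every addition
      // serializes on the mutex, so extra threads buy almost nothing.
      static long long sum_locked(const std::vector<int>& data, int threads) {
          long long total = 0;
          std::mutex m;
          std::vector<std::thread> pool;
          size_t chunk = data.size() / threads;
          for (int t = 0; t < threads; ++t) {
              size_t begin = t * chunk;
              size_t end = (t == threads - 1) ? data.size() : begin + chunk;
              pool.emplace_back([&, begin, end] {
                  for (size_t i = begin; i < end; ++i) {
                      std::lock_guard<std::mutex> lock(m);  // lock per element
                      total += data[i];
                  }
              });
          }
          for (auto& th : pool) th.join();
          return total;
      }

      // Version B: split the data, keep a private partial sum per thread, and
      // combine once at the end -- no contention inside the hot loop.
      static long long sum_partials(const std::vector<int>& data, int threads) {
          std::vector<long long> partial(threads, 0);
          std::vector<std::thread> pool;
          size_t chunk = data.size() / threads;
          for (int t = 0; t < threads; ++t) {
              size_t begin = t * chunk;
              size_t end = (t == threads - 1) ? data.size() : begin + chunk;
              pool.emplace_back([&, t, begin, end] {
                  long long local = 0;            // thread-private accumulator
                  for (size_t i = begin; i < end; ++i) local += data[i];
                  partial[t] = local;             // one shared write per thread
              });
          }
          for (auto& th : pool) th.join();
          return std::accumulate(partial.begin(), partial.end(), 0LL);
      }

      int main() {
          std::vector<int> data(1'000'000, 1);
          std::printf("locked:   %lld\n", sum_locked(data, 4));
          std::printf("partials: %lld\n", sum_partials(data, 4));
      }
      ```

      Both return the same answer; the difference is only in how much time the threads spend blocked, which is the whole argument above.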

      Well, it's theoretical. In principle, a compiler could automatically multi-thread code, but no one has really figured it out yet. And I agree, the topic is too complex to expect every programmer to know how to do it. But it seems to be the direction we should be going.
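
      The closest thing in mainstream use is probably directive-based parallelism like OpenMP: you annotate a loop and the compiler plus runtime generate the threading. It's not fully automatic (the programmer still has to promise the loop is safe to split), but it hints at what compiler-driven multi-threading could look like. A minimal sketch:

      ```cpp
      #include <cstdio>
      #include <vector>

      // Build with e.g. g++ -fopenmp; without the flag the pragma is ignored
      // and the loop simply runs single-threaded.
      int main() {
          std::vector<double> values(1'000'000, 0.5);
          double sum = 0.0;

          // The compiler/runtime split the iterations across threads and
          // combine the per-thread sums (the reduction clause).
          #pragma omp parallel for reduction(+:sum)
          for (long i = 0; i < (long)values.size(); ++i) {
              sum += values[i];
          }

          std::printf("sum = %f\n", sum);
          return 0;
      }
      ```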

      Hello Gamers. I've been looking for some nice forums to join as a better alternative to social media. Happy to join here!

        CLAYOoki Welcome m8! Isn't social media great? Instead I'm here splitting brain cells trying to imagine good ways to use multi-threading.

        Given only the top-end RTX cards can handle 4K with ray tracing, and a lot of people are not one but two generations behind thanks to prices over the last two years, I am very happy prices are coming down.

        As much as my 1080 Ti has been a beast for a long time, honestly, I'm running new games on Medium or worse to get 60 FPS at 4K, and suffering in Cycles render times compared to the latest and greatest.

        What I will say is we are in an annoying limbo: even if prices are dropping, all evidence suggests the next-gen RTX is due in 2-4 months, and spec leaks point to almost double current RTX performance, so buying now both is and isn't a good idea -.-

        Similar story for the next Ryzen. I'm still miffed about Threadripper now being Pro-only (which basically means you pay wild prices the more cores you want, and I want more).

          There is always something new coming in a few months. By that logic, you will never spend a dime.

          A new card, yes. But for Nvidia a new series comes every 2 years.
          The 900 series GeForces came out September 2014.
          The 1000 series GeForces came out May 2016.
          The 2000 series GeForces came out September 2018.
          The 3000 series GeForces came out September 2020.
          So holding off until maybe September for the imminent 4000 series seems a good idea. Well, at least to see what's up with them, since current rumors say only the 4090 will be September; the cheaper ones come later.

            Sure. But when a new series comes out, stock is low, everyone wants it, there are scalpers, the price goes up, and it's out of inventory. By the time the price drops to MSRP, a new card is about to come out and you are in the exact same situation again (and probably regret waiting instead of just buying something when you had the chance).

            Plus crypto just tanked, so GPU prices are the lowest they've been in years. Who knows, what if crypto takes off again tomorrow and blows up? Then you will be kicking yourself for the next 2 or 3 years until the next crash.

              cybereality I'd like to be wrong, but I think we are in for quite a long and hard recession, worldwide. Crypto was a scam from the start, and I doubt it's coming back, at least in its present form. I'd say things like new video cards will come out in much slower increments, just like cars and all kinds of other things, to conserve resources. A lot of people will be lucky to have food on the table. The federal reserves of the world are probably some of the biggest idiots anyone ever put in charge.

              I guess what I'm saying is, you should make your decision (any decision) based on the information available at this point in time. For example, if you need a new video card, there is a decent one you like for, say, $500, and you have $500 to spend, then the choice is clear: it is within your budget, it has the performance you need (and that performance will not drop, regardless of what comes out next month), and it is available today. If you base your decision on speculation, then you probably have a 50/50 chance of being right (or maybe less, as there are multiple variables, all unknown). It's just not a good decision-making process.