My use case: I generate heightmap terrain from random noise, and I already have code that produces a plane mesh at a given resolution by reading 2D noise at each vertex. What I want to do differently in one project is store the height data at a lower resolution than the terrain mesh... this gets me smoother terrain, better performance (reading less noise data and storing less terrain info), and most importantly simpler terraforming, since there are fewer edit points across a large grid while the visual terrain is drawn at higher resolution.
I need to know which interpolation scheme I should use to generate a smooth mesh across a set of 2D points, essentially producing a 2D Bezier surface the way one would a 1D Bezier curve. When a vertex on the heightmap falls between any 4 height data points (along the two horizontal axes), how do I make it take the weighted average of those heights based on its distance from each one? For a simple example, let's say I have the following point data:
- X 0, Y 0 = 1
- X 0, Y 1 = 0
- X 1, Y 0 = 0
- X 1, Y 1 = 1
If a vertex is at position X 0.5, Y 0.5 (the exact center), its height should (probably) be 0.5 in this case: it would increase as it moves closer to either corner with value 1 and decrease as it moves closer to either corner with value 0. When its position lands exactly on one of those corners, the height is precisely 0 or 1.
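To make that concrete, here's a minimal sketch of what I believe is plain bilinear interpolation over those four corners; the function names and argument layout are just my own illustration, not from any engine API:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by factor t in [0, 1]."""
    return a + (b - a) * t

def bilinear(h00, h10, h01, h11, x, y):
    """Bilinearly interpolate four corner heights.

    h00 = height at (0, 0), h10 at (1, 0), h01 at (0, 1), h11 at (1, 1);
    x and y are the vertex position within the cell, each in [0, 1].
    """
    bottom = lerp(h00, h10, x)   # interpolate along X at Y = 0
    top = lerp(h01, h11, x)      # interpolate along X at Y = 1
    return lerp(bottom, top, y)  # interpolate the two results along Y

# The example above: corners 1, 0, 0, 1, sampled at the center.
print(bilinear(1, 0, 0, 1, 0.5, 0.5))  # 0.5, as expected
```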
Obviously I don't want spikes to form: I'm not looking for plain linear interpolation, which would cause prism-shaped holes and pyramids; the surface should go like a sine wave between corners, so each hole and bump located at a point looks like a hemisphere... ideally, if it's not too complicated, spikiness could be made a per-point property so I can choose how to blend between the two modes. For the sake of LOD and being able to hide distant terrain, I'll likely create an individual mesh between every 4 points... as such I wouldn't mind if the function running per patch only works with one set of corners, as long as neighbors end up perfectly seamless.
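To show the kind of smoothing and spikiness blend I'm imagining, here's a hedged sketch building on the lerp/bilinear functions above. My assumption (please correct me if wrong) is that remapping the interpolation factor with smoothstep flattens the surface near each corner; since smoothstep has zero slope at the cell edges, adjacent patches should stay seamless even when each patch only sees its own four corners. Blending between the raw and remapped factor would then act as the spikiness control. All names here are hypothetical:

```python
def smoothstep(t):
    """Hermite smoothing 3t^2 - 2t^3: zero slope at t = 0 and t = 1."""
    return t * t * (3 - 2 * t)

def smooth_bilinear(h00, h10, h01, h11, x, y, spikiness=0.0):
    """Bilinear interpolation with smoothstep-remapped factors.

    spikiness = 0 gives the fully smoothed surface (rounded bumps),
    spikiness = 1 gives plain bilinear (creases at cell edges).
    Because the remapped factor has zero derivative at the cell
    boundary, neighboring patches match slope as well as height.
    """
    sx = lerp(smoothstep(x), x, spikiness)
    sy = lerp(smoothstep(y), y, spikiness)
    return bilinear(h00, h10, h01, h11, sx, sy)

# The center of the example cell is still 0.5 either way:
print(smooth_bilinear(1, 0, 0, 1, 0.5, 0.5))   # 0.5
# Near a corner, the smoothed surface hugs that corner's value
# (~0.736 here) more than plain bilinear does (0.625):
print(smooth_bilinear(1, 0, 0, 1, 0.25, 0.25))
print(bilinear(1, 0, 0, 1, 0.25, 0.25))
```

Is something along these lines the right approach, or is there a standard scheme (bicubic, Bezier patches, etc.) I should be using instead?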