I was trying to do some custom mesh making, originally by creating a variably tall cylinder and then deforming the vertices at different levels. However, I didn't see that scaling well for my goals.

My new attempt is to take a file exported from Blender, create multiple instances of it, deform each level based on my code, and then merge them into one mesh.

The exported model is more or less an uncapped cylinder.

The logic is to take the mesh, offset each MeshInstance3D vertically based on its position in the stack, and then use SurfaceTool to add them together.

func combineMeshes(meshes: Array):
	print("Stitching %s meshes" % meshes.size())
	var surfaceTool = SurfaceTool.new()
	surfaceTool.begin(Mesh.PRIMITIVE_TRIANGLES)
	for instance in meshes:
		assert(instance is MeshInstance3D, "not given a MeshInstance3D")
		# Append the instance's mesh, baking its transform (the vertical offset) into the vertices.
		surfaceTool.append_from(instance.mesh, 0, instance.transform)
	# Recompute normals for the combined geometry and commit it as a single mesh.
	surfaceTool.generate_normals()
	self.meshInstance.mesh = surfaceTool.commit()
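
For context, this is roughly how I build and offset the instances before combining them (the scene variable and level height are just illustrative, not my exact code):

@export var segmentScene: PackedScene  # the Blender-exported segment, name is just for illustration

func buildColumn(levels: int, levelHeight: float) -> void:
	var instances: Array = []
	for i in levels:
		# Assumes the scene's root node is a MeshInstance3D.
		var inst = segmentScene.instantiate() as MeshInstance3D
		# Each level gets pushed up by its index; deformation of its vertices happens elsewhere.
		inst.position.y = i * levelHeight
		instances.append(inst)
	combineMeshes(instances)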

It basically works, in that it creates the instances of the mesh, deforms them, and combines them vertically in a column.

The issue is that even though they are in the same Mesh they don't connect to each other. There is a very clear micro-line separating each one and the lighting makes it even more obvious.

Issue 1: This was especially bad when I was slightly modulating the rotation of the levels, which I don't do anymore.

Issue 2: It was also an issue when I hadn't perfectly leveled the points of my model, which made the AABB of the mesh go way off because a high point in the model broke the alignment. Even so, I feel like if the meshes were really being stitched together they would still be joined, just with an artifact.

I then just tried doing it with a BoxMesh and it actually works perfectly. Here is an example of 6 boxes on top of each other.

My thought is that the vertexes of the BoxMesh just align with each other enough that the system considers them contiguous when they are added to SurfaceTool. And maybe the triangle count of my Blender mesh just happens to miss that? But if that were the case I'd kind of imagine more glitchiness than what I am seeing.

I'd like to just have logic to stitch them together between adding meshes but I don't exactly know how I can do this.

cybereality I took a look; what should I be looking for with that tool? I don't see anything specifically about recreating the surface or anything.

It gives you the vertices of a mesh, which you can copy, alter, do whatever, and make a new mesh.

To be clear, the shading discontinuities are the result of normal interpolation, or rather the lack thereof. Effectively the face normals are pointing in slightly different directions, or at an angle such that the face shading results in that discontinuity. Merging vertices could take care of it; however, if all you need is for the shading to be continuous, just making the relevant normals aligned should be enough.
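
For illustration, a rough sketch of what aligning those normals could look like with MeshDataTool: group vertices that share a position (within a tolerance) and give each group one averaged normal. The function name and tolerance are placeholders, and it assumes a single-surface ArrayMesh.

func alignSeamNormals(mesh: ArrayMesh, tolerance = 0.001) -> ArrayMesh:
	var mdt = MeshDataTool.new()
	mdt.create_from_surface(mesh, 0)
	# Bucket vertex indices by their position snapped to a small grid.
	var groups = {}
	for i in mdt.get_vertex_count():
		var key = mdt.get_vertex(i).snapped(Vector3(tolerance, tolerance, tolerance))
		if not groups.has(key):
			groups[key] = []
		groups[key].append(i)
	# Give every vertex in a bucket the same averaged normal so shading is continuous across the seam.
	for key in groups:
		var normal = Vector3.ZERO
		for i in groups[key]:
			normal += mdt.get_vertex_normal(i)
		normal = normal.normalized()
		for i in groups[key]:
			mdt.set_vertex_normal(i, normal)
	var out = ArrayMesh.new()
	mdt.commit_to_surface(out)
	return out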

    Megalomaniak what about the gaps in the vertices when I move them a bit? Is there a solution to fix that?

    You would have to move the vertices to matching positions. You'll likely have to implement a recursive method to identify matching vertices and move them respectively to their matching median positions.
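
    Not claiming this is the exact algorithm, but a non-recursive sketch of the idea, reusing the same position-bucketing trick as above: snap positions to a tolerance grid and move every vertex in a bucket to the bucket's average position (names and tolerance are placeholders):

    func weldVertices(mesh: ArrayMesh, tolerance = 0.001) -> ArrayMesh:
        var mdt = MeshDataTool.new()
        mdt.create_from_surface(mesh, 0)
        # Bucket vertex indices by their snapped position.
        var buckets = {}
        for i in mdt.get_vertex_count():
            var key = mdt.get_vertex(i).snapped(Vector3(tolerance, tolerance, tolerance))
            if not buckets.has(key):
                buckets[key] = []
            buckets[key].append(i)
        # Move every vertex in a bucket to the shared average position.
        for key in buckets:
            var avg = Vector3.ZERO
            for i in buckets[key]:
                avg += mdt.get_vertex(i)
            avg /= buckets[key].size()
            for i in buckets[key]:
                mdt.set_vertex(i, avg)
        var out = ArrayMesh.new()
        mdt.commit_to_surface(out)
        return out

    Mind that this only moves positions; the normals on either side of the seam will still differ unless you also average them as in the earlier sketch.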

    Mind, if the gaps appear because you are trying to animate it, then bone animation might be a better solution.

      This is correct, but I don't think you have that control at the surface level. That is why you need MeshDataTool, as you can manually set the vertices and normal directions.

      So yeah, somehow they were still making gaps, though making sure that Blender was vertically aligning them has removed the gaps.

      So how does this work? I got the impression it's an array of vertices, so if I'm adding all the vertices together shouldn't it just stitch it all into a cohesive surface? Is it that the vertices are just somehow barely not matching the previous vertices so it doesn't continue the mesh, and since there's a gap it's apparent?

      MeshDataTool, or SurfaceTool or ArrayMesh I guess you mean, @cybereality, but I get that I need some kind of control.

      Megalomaniak I tried to use a CylinderMesh and then programmatically add bones, but I couldn't get good results from it. Possibly just because I didn't know how to apply weights programmatically, or possibly because there are some missing APIs for adding bones.

      Can you explain what you're trying to accomplish? Maybe there is an easier way.

        cybereality That's a good question. The idea is that I want to use proc gen to build a plant, but I want it to have variation. So you can imagine that along its length it might be thinner or thicker, or the branch might bend somehow.

        So far I have tried to:

        1. Build my own mesh, which was a lot of work; I don't quite get the order of building the vertices and it seems rather finicky. I wanted to start with something that already exists.
        2. Build a CylinderMesh and then mutate it. This worked when I was resizing vertices at different levels, but trying to bend the cylinder didn't make sense. So I tried adding bones programmatically but that didn't work. Probably ultimately this was a bad approach because a lot of plants are a long way away from a cylinder anyway.
        3. This approach: Take an open-ended mesh from Blender that I can stack on top of itself. Each piece's vertexes can be mutated as much as I want in this case (roughly like the sketch below), but I need the stacked meshes to glue together more reliably. It seems like I might just have to figure out how to have the vertexes merge more so Godot knows it's homogeneous?
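
        To be concrete, this is roughly what I mean by mutating each level before stacking; the function name and parameters are made up for illustration:

        func deformLevel(base: ArrayMesh, radiusScale: float, bend: Vector3) -> ArrayMesh:
            var mdt = MeshDataTool.new()
            mdt.create_from_surface(base, 0)
            for i in mdt.get_vertex_count():
                var v = mdt.get_vertex(i)
                # Make this level thinner or thicker around the vertical axis.
                v.x *= radiusScale
                v.z *= radiusScale
                # Push the whole level sideways to approximate a bend.
                v += bend
                mdt.set_vertex(i, v)
            var out = ArrayMesh.new()
            mdt.commit_to_surface(out)
            return out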

        What you need to do is export a base mesh from Blender and use MeshDataTool to modify the vertices and normals. It's complex getting it right, but I think that is the only way that is going to work.

          Megalomaniak I have thought of this, but I wanted to depict actual living plants, not imaginary ones. It might just ultimately be something I have to give up.

          cybereality I looked at the MeshDataTool APIs and it seems similar to the other options (ArrayMesh, SurfaceTool), so I'm not sure what exactly this tool does. Do you mean I should basically be iterating through the vertexes? What does that tool do for my purposes, if you know what I mean? Do I still have to do some sort of math to align the meshes, or is there a feature of MDT to help with that?

            L-system is a way to describe the behavior of plant cells and to model the growth processes. It's up to you to make a system and then describe real plants as accurately as possible. The fact that it is general enough that it can describe/generate fictional or new species is not a weakness.

            You could also use a hybrid approach where you use an L-system to do partial modeling of the stem and any further branches, while for the leaves and flowers/crowns or fruits or what have you, you could still use manually premodeled assets.
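
            In case it helps, the rewriting core of an L-system is tiny; a toy sketch (the rule below is made up, not from any real species):

            func expandLSystem(axiom: String, rules: Dictionary, iterations: int) -> String:
                var current = axiom
                for _i in iterations:
                    var next = ""
                    for c in current:
                        # Replace the symbol if a rule exists, otherwise keep it as-is.
                        next += rules.get(c, c)
                    current = next
                return current

            # e.g. expandLSystem("F", {"F": "F[+F]F[-F]F"}, 2) yields a branching string
            # that a later pass could interpret as stacked/bent segments.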

              Megalomaniak Maybe I should look more into it, though I don't think this fixes the actual rendered representation?

              It should help you produce a better base for it, which might in turn fix your issues. I'm assuming that your manually produced base segments, or whatever to call them, might have multiple small but ultimately significant modeling/alignment issues. You could just fix them manually, but assuming you will have a plurality of them, maybe it's just easier to generate them rather than manually model them all.

              Besides, even if you produce those manually you might inevitably still run into needing some form of L-system or adjacent method for producing variations procedurally. Otherwise you might as well just manually model plants fully and just instance them via random picking from a set.

              graggu I looked at the MeshDataTool APIs and it seems similar to the other options (ArrayMesh, SurfaceTool) so I'm not sure what exactly this tool does.

              ArrayMesh and SurfaceTool make copies of an object, not modify an object's topology. MeshDataTool works at the level of vertices, indices, and normals, as you would in OpenGL or Vulkan, so you have full control.

              For example, you can scale the mesh data on the vertex level (same number of vertices but different shape) and it will not look disconnected.