It gives you the vertices of a mesh, which you can copy, alter, do whatever with, and use to make a new mesh.

To be clear, the shading discontinuities are the result of normal interpolation, or rather the lack thereof. Effectively the face normals are pointing in slightly different directions, or at an angle such that the face shading produces that discontinuity. Merging vertices could take care of it, but if all you need is for the shading to be continuous, just aligning the relevant normals should be enough.
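That normal-alignment idea can be sketched language-agnostically. This is not Godot API, just the underlying math in plain Python with made-up names: group vertices that share a position (within a tolerance) and give each group one averaged normal, so shading interpolates identically on both sides of a seam.

```python
def align_seam_normals(vertices, normals, eps=1e-5):
    """Return new normals where co-located vertices share one averaged normal.

    vertices/normals are parallel lists of (x, y, z) tuples. The eps
    value and all names here are assumptions for illustration only.
    """
    groups = {}
    for i, v in enumerate(vertices):
        # Quantize the position so nearly-equal vertices land in one bucket.
        key = tuple(round(c / eps) for c in v)
        groups.setdefault(key, []).append(i)

    out = list(normals)
    for ids in groups.values():
        if len(ids) < 2:
            continue
        # Sum the group's normals and renormalize.
        sx = sum(normals[i][0] for i in ids)
        sy = sum(normals[i][1] for i in ids)
        sz = sum(normals[i][2] for i in ids)
        length = (sx * sx + sy * sy + sz * sz) ** 0.5 or 1.0
        avg = (sx / length, sy / length, sz / length)
        for i in ids:
            out[i] = avg
    return out
```

In Godot you would read and write the per-vertex normals through whichever mesh tool you use, but the grouping-and-averaging step is the same.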

    Megalomaniak what about the gaps in the vertices when I move them a bit? Is there a solution to fix that?

    You would have to move the vertices to matching positions. You'll likely have to implement a recursive method to identify matching vertices and move them to their respective matching median positions.
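    As a sketch of that idea (plain Python, no Godot API; the epsilon and all names are assumptions): bucket vertices by rounded position and snap each bucket to one shared position.

```python
def weld_positions(vertices, eps=1e-4):
    """Snap nearly-coincident vertices to one shared position.

    Bucketing by rounded coordinates is a simplification: points that
    straddle a bucket boundary can be missed, but it shows the idea.
    The mean is used here as a simple stand-in for a median position.
    """
    groups = {}
    for i, v in enumerate(vertices):
        key = tuple(round(c / eps) for c in v)
        groups.setdefault(key, []).append(i)

    out = list(vertices)
    for ids in groups.values():
        if len(ids) < 2:
            continue
        mean = tuple(sum(vertices[i][c] for i in ids) / len(ids)
                     for c in range(3))
        for i in ids:
            out[i] = mean
    return out
```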

    Mind, if the gaps appear because you are trying to animate it, then bone animation might be a better solution.

      This is correct, but I don't think you have control at the surface level. That is why you need MeshDataTool, as you can manually set the vertices and normal directions.

      So yeah, somehow they were still making gaps, though making sure that they were vertically aligned in Blender has removed the gaps.

      So how does this work? I got the impression it's an array of vertices, so if I'm adding all the vertices together, shouldn't it just stitch it all into a cohesive surface? Or is it that the vertices are just barely not matching the previous vertices, so the mesh doesn't continue, and since there's a gap it's apparent?

      MeshDataTool, or SurfaceTool, or ArrayMesh, I guess you mean, @cybereality, but I get that I need some kind of control.

      Megalomaniak I tried to use a cylinder mesh and then programmatically add bones, but I couldn't get good results from it. Possibly just because I didn't know how to apply weights programmatically, or possibly because there are some missing APIs for adding bones.

      Can you explain what you're trying to accomplish? Maybe there is an easier way.

        cybereality That's a good question. The idea is that I want to use proc gen to build a plant, but I want it to have variation. So you can imagine that along its length it might be thinner or thicker, or a branch might bend somehow.

        So far I have tried to:

        1. Build my own mesh, which was a lot of work; I don't quite get the order of building the vertices and it seems rather finicky. I wanted to start with something simpler.
        2. Build a CylinderMesh and then mutate it. This worked when I was resizing vertices at different levels, but trying to bend the cylinder didn't make sense. So I tried adding bones programmatically, but that didn't work. This was probably a bad approach anyway, since a lot of plants are a long way from a cylinder.
        3. This approach: take an open-ended mesh from Blender that I can stack on top of itself. Each piece's vertexes can be mutated as much as I want in this case, but I need the stacked meshes to glue together more reliably. It seems like I might just have to figure out how to have the vertexes mix more, so Godot knows it's homogeneous?
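        For approach 3, one way to make stacked segments line up exactly is to compute each seam ring of positions once and let both neighboring segments reference it; identical floats mean no gap. This is a plain-Python sketch with invented names, not Godot API, and the per-level radius is where the thickness variation mentioned above would go:

```python
import math

def build_rings(sides, levels, radius_at):
    """One ring of positions per level; segment k uses rings[k] as its
    bottom and rings[k + 1] as its top, so adjacent segments share the
    exact same seam positions."""
    rings = []
    for level in range(levels + 1):
        r = radius_at(level)  # vary thickness along the stem
        rings.append([(r * math.cos(2 * math.pi * s / sides),
                       float(level),
                       r * math.sin(2 * math.pi * s / sides))
                      for s in range(sides)])
    return rings

def tube_triangles(rings):
    """Two triangles per wall quad between consecutive rings."""
    sides = len(rings[0])
    tris = []
    for k in range(len(rings) - 1):
        a, b = rings[k], rings[k + 1]
        for s in range(sides):
            t = (s + 1) % sides
            tris.append((a[s], b[s], b[t]))
            tris.append((a[s], b[t], a[t]))
    return tris
```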

        What you need to do is export a base mesh from Blender and use MeshDataTool to modify the vertices and normals. It's complex getting it right, but I think that is the only way that is going to work.

          Megalomaniak I have thought of this, but I wanted to depict actual living plants, not imaginary ones. It might just ultimately be something I have to give up.

          cybereality I looked at the MeshDataTool APIs and it seems similar to the other options (ArrayMesh, SurfaceTool), so I'm not sure what exactly this tool does. Do you mean I should basically be iterating through the vertexes? What does that tool do for my purposes, if you know what I mean? Like, do I still have to do some sort of math to align the meshes, or is there a feature of MeshDataTool to help with that?

            An L-system is a way to describe the behavior of plant cells and to model their growth processes. It's up to you to make a system that describes real plants as approximately accurately as possible. The fact that it is general enough that it can describe/generate fictional or new species is not a weakness.

            You could also use a hybrid approach where you use an L-system for partial modeling of the stem and any further branches, while for the leaves, flowers/crowns, fruits, or what have you, you could still use manually premodeled assets.
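            For reference, the rewriting core of an L-system is tiny; everything else (turtle interpretation, mesh generation) builds on top of it. A minimal Python sketch, with the commonly cited fractal-plant grammar as an example:

```python
def lsystem(axiom, rules, steps):
    """Rewrite every symbol in parallel each step; symbols without a
    rule are copied through unchanged."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Classic fractal-plant rules: F draws forward, +/- turn, [ ] push/pop state.
plant_rules = {"X": "F+[[X]-X]-F[-FX]+X", "F": "FF"}
```

Varying the rules, angles, and step lengths per instance is where the procedural variation comes from.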

              Megalomaniak Maybe I should look more into it, though I don't think this fixes the actual rendered representation?

              It should help you produce a better base for it, which might in turn fix your issues. I'm assuming that your manually produced base segments, or whatever to call them, might have multiple small but ultimately significant issues in modeling/alignment. You could just fix them manually, but assuming you will have many of them, maybe it's just easier to generate them rather than manually model them all.

              Besides, even if you produce those manually, you might inevitably still run into needing some form of L-system or an adjacent method for producing variations procedurally. Otherwise you might as well just fully model the plants manually and instance them by picking randomly from a set.

              graggu I looked at the MeshDataTool APIs and it seems similar to the other options (ArrayMesh, SurfaceTool) so I'm not sure what exactly this tool does.

              ArrayMesh and SurfaceTool are for making copies of an object, not modifying an object's topology. MeshDataTool works at the level of vertices, indices, and normals, as you would in OpenGL or Vulkan, so you have full control.

              For example, you can scale the mesh data at the vertex level (same number of vertices, but a different shape) and it will not look disconnected.
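              The "same number of vertices, different shape" point can be shown language-agnostically (Python sketch, names invented, not Godot API): only the positions change, so the index buffer, and therefore the connectivity, is untouched.

```python
def scale_radially_by_height(vertices, scale_at):
    """Push every vertex in/out in the XZ plane by a height-dependent
    factor. Vertex count and ordering are unchanged, so any existing
    index buffer still describes the same connected surface."""
    return [(x * scale_at(y), y, z * scale_at(y)) for (x, y, z) in vertices]
```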

                cybereality

                I was trying to do this and noticed a few things.

                I tried just merging all the vertexes into an ArrayMesh and it gave me the error "Vertex amount (2680) must be a multiple of the amount of vertices required by the render primitive (3)." Looking into my mesh, its vertex count wasn't a multiple of three, and it used INDEXES in the array as well. So I was able to get it to work by also duplicating the indexes and incrementing them, but I still had the gap problem I had before.
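                What that duplication amounts to is expanding the indexed mesh into a flat triangle list: one vertex per index entry. Since a triangle list has 3 indices per face, the result is always a multiple of 3. A Python sketch of that expansion (not Godot API):

```python
def deindex(vertices, indices):
    """Expand an indexed triangle mesh into a non-indexed triangle list.

    len(indices) is 3 * face_count, so the output vertex count is
    always a multiple of 3, which is what the ArrayMesh error demands.
    """
    assert len(indices) % 3 == 0, "triangle index buffer expected"
    return [vertices[i] for i in indices]
```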

                For MeshDataTool, I just don't know what to do with it. It gives you the option to create from a mesh and then also to commit back to it, as well as to add vertexes and normals. So that's nice, but not really adding anything beyond what SurfaceTool or ArrayMesh already had.

                I tried to just create a MeshDataTool and then add all my vertexes and normals, but realized there is no way to add indexes. So when I do it, I end up with the error above as well, where the number of vertexes needs to be a multiple of 3.

                At this point I'm not sure what to do, besides maybe figuring out how to export a mesh that doesn't use indexes, or ensuring it exports a multiple of 3.

                2680 is not divisible by 3. Seems like a straightforward error. You are manipulating triangles, which consist of 3 points. If you want to share vertices, you would use indices.

                All those solutions would work. I think MeshDataTool is the right way, but you have to understand the math and logic first.
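                To make the sharing-via-indices point concrete with a small count: a quad has only 4 unique vertices (not a multiple of 3), yet its 6 index entries describe 2 whole triangles, which is exactly the situation behind the 2680-vertex error above. Plain Python, just for the arithmetic:

```python
# A unit quad: 4 unique vertices, shared by two triangles via indices.
quad_vertices = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
quad_indices = [0, 1, 2, 0, 2, 3]  # 6 entries = 2 triangles

assert len(quad_vertices) % 3 != 0  # the raw vertex count needn't divide by 3...
assert len(quad_indices) % 3 == 0   # ...because the index count does
```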

                  cybereality I know it's not. And it's not an error; in this scenario I'm not adding or removing any vertices. It's that the indexes are what's making it work (they imply duplication of vertices and presumably make it divisible by 3). The fact that MeshDataTool doesn't operate with indices is the problem and why it's not usable. Either way, I've given up on this. Thanks.