Hi, my goal is to write a shader for a smooth gradient.
For example, I would like to be able to color a sphere so that its bottom is completely black and its top completely white, like this:

My problem is that the coordinates in the VERTEX built-in are in the mesh's local space. This means the coordinates are measured from the mesh's origin and can be both greater than 1 and negative.

The example above only works because I have this magic + 1.0 in the shader code and the sphere has a radius of 1.0.
If I make the sphere larger without changing the shader, the gradient changes:

Is there a way to normalize these VERTEX coordinates to [0..1] in the shader? My goal would be that I could put this gradient shader on meshes of different sizes.
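The exact shader isn't shown above, so here is a minimal sketch of what it presumably looks like (the placement of the `+ 1.0` and the remap to [0, 1] are assumptions based on the description):

```glsl
// Godot spatial shader -- sketch of the gradient described above.
// For a sphere of radius 1.0, VERTEX.y runs from -1 to 1 in local space.
shader_type spatial;

varying float height; // local height passed along to the fragment stage

void vertex() {
    // The "magic" + 1.0 shifts [-1, 1] to [0, 2]; * 0.5 remaps to [0, 1].
    height = (VERTEX.y + 1.0) * 0.5;
}

void fragment() {
    ALBEDO = vec3(height); // black at the bottom, white at the top
}
```

This breaks as soon as the mesh's extent along y is no longer exactly 1.0, which is the problem described above.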

  • You could create a couple of uniforms that you assign via script: the sphere object's position and scale. Or maybe even a single uniform that holds its whole transform matrix; you could then extract the location, scale and rotation from it as needed, though that's a bit more complex in terms of the data structure.

    I'd probably go with just 2 uniforms for scale and global_position

    You could make the spatial shader work in world space via the world_vertex_coords render mode, but then there would be a lot more you'd have to handle manually. Or maybe I'm just thinking of skip_vertex_transform instead...

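A sketch of that two-uniform idea (the uniform names and the use of `world_vertex_coords` are my assumptions; rotation is deliberately ignored here, which is part of the extra manual handling mentioned):

```glsl
shader_type spatial;
render_mode world_vertex_coords; // VERTEX arrives in world space here

// Set these from a script, e.g. in Godot 4:
//   material.set_shader_parameter("obj_position", global_position)
//   material.set_shader_parameter("obj_scale", scale)
uniform vec3 obj_position = vec3(0.0);
uniform vec3 obj_scale = vec3(1.0);

varying float height;

void vertex() {
    // Translate the world-space vertex back to the object's origin and
    // undo its scale, so a unit sphere spans [-1, 1] again at any size.
    // Rotation is NOT undone here.
    vec3 local = (VERTEX - obj_position) / obj_scale;
    height = clamp(local.y * 0.5 + 0.5, 0.0, 1.0);
}

void fragment() {
    ALBEDO = vec3(height);
}
```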

    nilolo Is there a way to normalize these VERTEX coordinates to [0..1] in the shader? My goal would be that I could put this gradient shader on meshes of different sizes.

    Talking in OpenGL terms here, you can calculate the normals in the vertex shader based on current geometry and pass them along to the fragment shader. That'll also save some memory space in the vertex buffers. Normals of a sphere aren't worth storing anyway.

      Thanks for your input. So I guess putting the mesh scale into the shader via uniform would be a good option. In that case I think I'm going to use the AABB, since I'm only interested in the height for now.
      If I divide the AABB dimensions by the scale, I should be able to normalize VERTEX to [0,1].
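A sketch of that AABB idea (the names are my choices; note that `Mesh.get_aabb()` already returns the bounding box in the same local space as VERTEX, so dividing by the node's scale may not be necessary for it):

```glsl
shader_type spatial;

// Set from a script, e.g. in Godot 4:
//   var aabb := $MeshInstance3D.mesh.get_aabb()
//   material.set_shader_parameter("aabb_position", aabb.position)
//   material.set_shader_parameter("aabb_size", aabb.size)
uniform vec3 aabb_position = vec3(-1.0); // defaults match a unit sphere
uniform vec3 aabb_size = vec3(2.0);

varying float height;

void vertex() {
    // Map the local-space height into [0, 1] via the mesh's bounding box.
    height = (VERTEX.y - aabb_position.y) / aabb_size.y;
}

void fragment() {
    ALBEDO = vec3(height);
}
```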

      Pixophir Thanks for your input, but I think you slightly misunderstood my question. I'm not interested in the normal vectors but in transforming the coordinates stored in VERTEX into the interval [0..1], since they're in local space and can be any size depending on the mesh.

      Oh, I see. Yes, the vertex stage acts on the vertex positions passed into it. So one could just parametrize the sphere's radius as a uniform and clamp the vertex coordinates to [-1, 1] right away. That allows applying further model transformations, including scaling, without distorting the gradient.
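That radius-uniform approach might look like this (a sketch; the uniform name is my choice):

```glsl
shader_type spatial;

uniform float radius = 1.0; // set to the mesh's actual radius from a script

varying float height;

void vertex() {
    // Dividing by the radius maps any sphere to [-1, 1]; the clamp guards
    // against vertices slightly outside the stated radius.
    height = clamp(VERTEX.y / radius * 0.5 + 0.5, 0.0, 1.0);
}

void fragment() {
    ALBEDO = vec3(height);
}
```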