Still pretty newbish about shaders so I may be using the wrong terms in places - appreciate any corrections.

I have an image (the recent eROSITA soft X-ray all-sky survey, https://en.wikipedia.org/wiki/EROSITA ), which is presented in the Aitoff projection (https://en.wikipedia.org/wiki/Aitoff_projection). As a learning exercise I wanted to write a shader that would map that image onto the inner surface of a CSG Sphere.

In a vertex() shader, I'm using the VERTEX.xyz coordinates to convert the local model-space position to spherical coordinates so that I can unwarp the Aitoff-projected data, and I'm setting UV.xy values for each vertex. There's a little domain-mapping step, since the Aitoff projection's output is (I think) on [-1, +1] while UV coordinates are on [0, 1], but that's easy.
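For reference, the vertex shader currently looks roughly like this (simplified - the sign conventions and the exact normalization are guesses on my part, since they depend on how the survey image was exported):

```glsl
shader_type spatial;
render_mode cull_disabled; // so the inside faces of the sphere are visible

const float PI = 3.14159265359;

void vertex() {
	// Direction from the sphere's center, in model space.
	vec3 p = normalize(VERTEX);
	float lat = asin(clamp(p.y, -1.0, 1.0)); // latitude, [-PI/2, PI/2]
	float lon = atan(p.x, -p.z);             // longitude, (-PI, PI]

	// Forward Aitoff projection: where does this sky direction
	// land on the projected image?
	float a = acos(cos(lat) * cos(lon * 0.5));
	float sinc_a = (a < 1e-5) ? 1.0 : sin(a) / a;
	float x = 2.0 * cos(lat) * sin(lon * 0.5) / sinc_a; // in [-PI, PI]
	float y = sin(lat) / sinc_a;                        // in [-PI/2, PI/2]

	// Remap to [0, 1] UV space (flip y so north is up).
	UV = vec2(x / (2.0 * PI) + 0.5, 0.5 - y / PI);
}
```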

The problem I think I'm having is how to manage the periodic boundary condition - there are verts at spherical coordinates that are effectively in two places on the map. Consider (in degrees, as latitude/longitude) the point (0, 180), which is the 'same' point as (0, -180) but maps to UV coordinates of [x = 1, y = 0.5] versus [x = 0, y = 0.5].

Fine by itself, but when we texture-map a triangle and interpolate a local UV coordinate, some triangles in my mesh have one or more vertices from the positive domain and one or more from the negative, and the whole texture gets squashed into those triangles as an extra repeat.

I have some ideas about how to fix this, but I imagine I'm wandering off into crazy town and there are probably better-known solutions - my googling has failed me, though. One (non-Godot) paper I saw made reference to checking the 'filter width' - essentially, 'how big a patch of UV space is this triangle sampling?' - and adjusting the UV coordinates if the width seemed too large. I would think that information is accessible in the fragment shader, but I can't figure out where.
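Digging a bit more: I think the built-in that exposes this is fwidth(), which tells you roughly how much a value changes between neighboring pixels. My rough (untested) reading of the trick, for a periodic horizontal coordinate u computed per pixel - v and sky_texture are stand-ins for whatever the real names are, and the sampler is assumed to be set to repeat:

```glsl
// In fragment(): pick whichever wrapping of the periodic coordinate
// is NOT jumping across its seam at this pixel.
float u_a = fract(u);             // seam at u = 0/1
float u_b = fract(u + 0.5) - 0.5; // same coordinate, seam moved to u = 0.5
// The wrapping with the smaller screen-space derivative is the
// continuous one here; the other one is mid-jump.
float u_seamless = (fwidth(u_a) < fwidth(u_b)) ? u_a : u_b;
vec4 sky = texture(sky_texture, vec2(u_seamless, v));
```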

My likely-bad idea is to add alternate UV2 coordinates for the vertices that sit on the boundary - but I'm not sure I see how that would work yet.

Thanks for any help!

Are you calculating the coordinates in the vertex or fragment shader? 'Cause if it's the vertex shader, that might be the problem. Try doing it in the fragment shader instead. Bear in mind that the VERTEX value in the fragment shader is different from the one in the vertex shader: the former is in view space, while the latter is in model space (world space if the shader uses the world_vertex_coords render mode). You'd have to manually pass the model-space vertex using a varying.
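Something like this (a sketch - your Aitoff math would move into fragment() unchanged):

```glsl
shader_type spatial;

// Carry the model-space position across to the fragment stage.
varying vec3 model_pos;

void vertex() {
	model_pos = VERTEX; // model space here (no world_vertex_coords)
}

void fragment() {
	vec3 p = normalize(model_pos);
	// ... spherical coordinates + Aitoff projection go here, per pixel ...
}
```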

They were in the vertex portion. As you say, using the VERTEX value in the fragment shader led to a background that didn't move (which makes sense now that I know it's view space), and copying the model-local position into a varying did remove the seam artifact I was getting. I'm not sure I understand why yet. I assume that in the fragment shader there's a way to unproject the view coordinates back to a model-local coordinate - is it generally more efficient to copy it via a varying, or to unproject it without one?

I believe it would be more efficient to use the varying, since you avoid the extra matrix multiplications needed in the fragment shader to transform the vertex back. The reason the seams happen with vertex-computed UVs is the way the UVs get interpolated: one vertex of a triangle can have a value close to 1.0 while another is close to 0.0, because those triangles straddle the wrap seam, so the UVs in between get interpolated the 'long way' (e.g. from 0.95 down through 0.5 to 0.02, sweeping the whole texture) instead of wrapping across the boundary.
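For reference, the unprojection you mentioned would be something like this (Godot 4 built-in names; in 3.x the equivalents are CAMERA_MATRIX and WORLD_MATRIX) - the inverse() per pixel is exactly the extra cost I mean:

```glsl
// In fragment(): rebuild the model-space position from the
// view-space VERTEX instead of using a varying.
vec3 world_pos = (INV_VIEW_MATRIX * vec4(VERTEX, 1.0)).xyz;
vec3 local_pos = (inverse(MODEL_MATRIX) * vec4(world_pos, 1.0)).xyz;
```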
