I'm having difficulty finding out what the B in RGB controls on a CanvasTexture. As far as I can see it isn't doing anything. Is it just unused?
While I'm asking about this, does alpha do anything either?
Does 2D actually use the Z component of a normal map though? I was messing around with it and seeing no visual changes.
Me:
I found my issue. I was doing all this experimenting using TileMaps, and it's actually each TileSet tile's unique material that does the lighting calculations. That material in particular is what was discarding the blue channel. Actually I have no idea what exactly it does. But if I override that material with a CanvasItemMaterial or ShaderMaterial, then blue behaves how I would think it should. Or, you know, if I just test out normals on a regular Sprite.
award Just as a side note, a normal in a normal map can theoretically be defined by two numbers only. The third (z) component is implicit and can be calculated from the x and y components using the sphere equation. However, this would cost you a needless sphere-equation calculation per fragment, and what's even worse, you lose a lot of precision for steeply inclined normals, because the z gradient diverges towards infinity as you approach the "equator". That can have a significant effect if you encode normals into a byte-per-component format, as is often the case.
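For the curious, a minimal sketch of that reconstruction in Godot shader code (normal_tex and the variable names are just illustrative):

vec2 n_xy = texture(normal_tex, UV).rg * 2.0 - 1.0; // decode stored 0..1 back to -1..1
float n_z = sqrt(max(0.0, 1.0 - dot(n_xy, n_xy))); // sphere equation: x^2 + y^2 + z^2 = 1
vec3 n = vec3(n_xy, n_z); // the reconstructed unit normal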
You wouldn't happen to know if there's any better documentation on 2D (or 3D) shader normals, would you?
I've been experimenting around more and some things are weird and just not explained on the Canvas Shader page. For starters, normal values all go from -1 to 1 rather than 0 to 1. Passing your 0 to 1 values to NORMAL_MAP rather than NORMAL compensates for this, or it's easy enough to compensate for manually. The docs allude to this but don't explain it. If you're generating normals in a shader, like I just was, it can cause quite a headache. I'm sure you already knew about that since it's the same for spatial shaders.
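For concreteness, here are the two equivalent ways to compensate in a canvas_item shader (just a sketch; normal_tex stands in for whatever sampler you use):

vec3 encoded = texture(normal_tex, UV).rgb; // stored in the 0..1 range
NORMAL_MAP = encoded; // Godot remaps this to -1..1 for you
// or compensate manually:
// NORMAL = normalize(encoded * 2.0 - 1.0);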
Secondly, I have very little understanding of how PointLight2D.height interacts with the normals. It's related in some way to all of the channels, but not straightforwardly. "Hits at 45° x units away." doesn't entirely explain it.
Then there's the NORMAL_MAP_DEPTH out variable. Not a clue what it does. Maybe nothing in 2D? ¯\_(ツ)_/¯
If there aren't good docs then I'm gonna go make another GitHub issue...
award Do you fully understand how normal mapping generally works in 3D? I think it's exactly the same for 2D in Godot, as if you were mapping a camera facing quad in 3D. The only difference is that some things are implicit in 2D. Godot implements standard tangent space mapping, everything looks to be done by the book. So read up any general normal mapping info, it should clear things up a bit. Surprisingly, Wikipedia article on normal mapping is quite decent, maybe start there. This is also a good explanation.
The difference between NORMAL and NORMAL_MAP is that the former is in object/camera space while the latter is in tangent space. The two spaces may actually be identical when mapping a 2D sprite.
The light height is just its imaginary z coordinate. If the canvas is the xy plane, then the height is the light's distance from that plane (towards us). So if you move the light source x units from the origin in the canvas plane, and the same number of units "up" from the canvas plane, the light ray will hit the origin at a 45° inclination.
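Something like this, if I sketch it out (illustrative names, not Godot's actual internals):

vec2 to_light = light_pos - frag_pos; // offset in the canvas (xy) plane
vec3 light_dir = normalize(vec3(to_light, light_height)); // height is the implied z
// when length(to_light) == light_height, the ray arrives at exactly 45 degrees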
NORMAL_MAP_DEPTH affects the perceived intensity/depth of the bump effect. IIRC it's a factor that non-proportionally scales the TBN basis, biasing the normal's inclination when it's transformed from tangent space to world/camera space using the TBN matrix. I don't see any reason for it not to work in 2D.
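Assuming it behaves like the usual bump-depth factor (my assumption, I haven't checked Godot's source), the effect would be equivalent to scaling the tangent-space xy before renormalizing:

vec3 n = texture(normal_tex, UV).rgb * 2.0 - 1.0; // decode to -1..1 (normal_tex is illustrative)
n.xy *= depth; // depth = the NORMAL_MAP_DEPTH value; > 1.0 exaggerates the relief, < 1.0 flattens it
n = normalize(n); // same net effect as non-proportionally scaling the TBN basis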
award Nope! I'd always thought it was still 0 to 1.
It'd be a good thing to understand if you're algorithmically generating normal maps.
As an exercise, I suggest writing a "manual" normal mapping spatial shader, effectively implementing in your fragment function what Godot does afterwards with the NORMAL_MAP value.
Something like this: ignore NORMAL_MAP, read and decode/normalize the normal map texture, alter the normal accordingly using a per-pixel TBN (tangent-binormal-normal) matrix, and then write it to NORMAL. If done properly, you should get the same results as when writing from the normal texture to NORMAL_MAP. But in this case you also get the "achievement unlocked" sound effect.
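A rough sketch of that exercise (untested; normal_tex stands in for your normal map sampler):

shader_type spatial;

uniform sampler2D normal_tex;

void fragment() {
	vec3 n_ts = texture(normal_tex, UV).rgb * 2.0 - 1.0; // decode 0..1 to -1..1
	mat3 tbn = mat3(TANGENT, BINORMAL, NORMAL); // per-pixel TBN, all in view space
	NORMAL = normalize(tbn * n_ts); // tangent space -> view space
	// writing the texture value straight to NORMAL_MAP should match:
	// NORMAL_MAP = texture(normal_tex, UV).rgb;
}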
That sounds like a good idea. For a front-facing quad though, I'm not wrong about the relationship between tangent and object space, am I? Specifically, for a front-facing quad, object space (x, y) -> encoded (r, g) = 0.5 * (1.0 + x, 1.0 + y), or no? The next thing I do will be per-pixel TBN, I swear. This project has been generating normals based off of sprite edge-distance, which I think I have at least almost right.
EDIT: Yes, the shader is brute-force and absurdly expensive. Planning on rasterizing what it generates.
award For a camera-facing quad and a 2D sprite, we can assume that object space and tangent space coincide. For a 2D sprite I think object space and camera space also coincide, but this needs to be checked.
However, when writing the normal from the fragment shader, Godot expects NORMAL to be in camera space, while it expects NORMAL_MAP to be in tangent space AND remapped to the 0-1 range, exactly like the values encoded into a standard normal map texture.
Also, if you write to NORMAL_MAP, then NORMAL will be ignored.
So when you want to write your calculated per pixel normal from the fragment shader, you can do one of the following:
NORMAL = camera_space_normal;
// or
NORMAL_MAP = fma(tangent_space_normal, vec3(.5), vec3(.5));
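// (fma(x, vec3(.5), vec3(.5)) is just x * 0.5 + 0.5, i.e. the -1..1 to 0..1 remap)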