I'm making a water shader with refraction, but objects above water also get refracted. I've used the depth buffer to stop this before, but I can't get it to work. I already use the depth buffer for coloring:

float depth = texture(DEPTH_TEXTURE, SCREEN_UV).r; // raw depth of whatever is behind the water at this pixel
depth = depth * 2.0 - 1.0; // to normalized device coordinates
depth = PROJECTION_MATRIX[3][2] / (depth + PROJECTION_MATRIX[2][2]); // linearize: distance from the camera
depth += VERTEX.z; // VERTEX.z is negative, so this leaves the water thickness between the surface and the scene behind it
depth = exp(-depth * beer_factor); // Beer-Lambert falloff
float colorDepth = clamp(1.0 - depth, 0.0, 1.0);

The variable colorDepth is simply depth inverted and clamped between 0 and 1 to use for alpha.
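For what it's worth, colorDepth just ends up driving the alpha, along these lines (the exact mapping isn't the problem):

ALPHA = colorDepth; // more water behind the surface = more opaque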

The refraction in SpatialMaterial also has this problem, so that's no help either.

I thought refracting only if depth is less than zero would work, but that didn't do anything. Does anyone know how to do this?

Some Unity shaders might help if they were adapted, but I haven't found anything yet.

I've collaborated with someone working on a water pack before. The project's abandoned, but the materials there can be useful, including this one.

I assume this is the code to refract only underwater objects?

float depth_tex = texture(DEPTH_TEXTURE, SCREEN_UV + screen_offset).r;
vec4 world_pos = INV_PROJECTION_MATRIX * vec4((SCREEN_UV + screen_offset) * 2.0 - 1.0, depth_tex * 2.0 - 1.0, 1.0);
world_pos.xyz /= world_pos.w;
float depth = distance(VERTEX.xyz, world_pos.xyz);
if (depth < fade_distance)
{
    // code for refraction
}

So is world_pos the world position of the current pixel in the depth texture?

In this case, no. I named it wrong. That's actually the view-space position.

How come world_pos is a vec4? I thought just the depth relative to the water (using VERTEX.z, I think) needed to be compared. Unless I'm mistaken, all I need to do is get the depth relative to the water plane and only refract if depth > 0.

It's a vec4 because the w component is needed for the perspective division, which has to be done after transforming from normalized device coordinates to view space (and vice versa). Notice the part where I divide by w.
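To spell that step out on its own (depth_tex is the same value sampled from DEPTH_TEXTURE as in the snippet above):

// NDC -> view space: multiply by the inverse projection, then do the perspective division
vec4 view_pos = INV_PROJECTION_MATRIX * vec4(SCREEN_UV * 2.0 - 1.0, depth_tex * 2.0 - 1.0, 1.0);
view_pos.xyz /= view_pos.w; // without this divide the reconstructed position is wrong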

And yeah, simply comparing the z components for depth would've sufficed too, but I believe it's more accurate to compare by distance.
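Either one works, for example:

float depth_z = VERTEX.z - world_pos.z; // difference along the view axis only
float depth_dist = distance(VERTEX.xyz, world_pos.xyz); // straight-line distance, which also accounts for the sideways offset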

So what is world_pos, then? How does getting the distance work if it's not actually the world position?

Like I said, I named it wrong. That's the position of the depth sample in view space. The vertex position of the water's surface is also in view space. Figuring out the distance between the two can tell you how much water is in the current fragment being rendered.
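That distance is exactly the kind of value you can feed into the same Beer-Lambert style fade you already have, something like:

float water_amount = distance(VERTEX.xyz, world_pos.xyz); // view-space distance through the water
float fog = 1.0 - exp(-water_amount * beer_factor); // 0 = clear, 1 = fully fogged, same idea as your colorDepth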

You said it's view space position? What does that mean?

@Dschoonmaker said: You said it's view space position? What does that mean?

It means that the position of the fragment, or any arbitrary point really, is expressed relative to the scene's camera. The builtin VERTEX variable is in view space, for example, meaning that its position is relative to the camera.
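For example, if you had some point in world space (some_world_point is just a placeholder here), getting it into view space is one matrix multiply:

// world space -> view space; INV_CAMERA_MATRIX is the view matrix in Godot 3.x spatial shaders
vec3 view_point = (INV_CAMERA_MATRIX * vec4(some_world_point, 1.0)).xyz;
// VERTEX needs no conversion, it's already given in view space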

So it could be compared to VERTEX.z to check if that pixel is underwater.
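Something like this, then? (Untested; refraction_offset is just whatever offset I get from the normal map.)

vec2 refracted_uv = SCREEN_UV + refraction_offset;
float d = texture(DEPTH_TEXTURE, refracted_uv).r;
vec4 view_pos = INV_PROJECTION_MATRIX * vec4(refracted_uv * 2.0 - 1.0, d * 2.0 - 1.0, 1.0);
view_pos.xyz /= view_pos.w;
if (view_pos.z > VERTEX.z)
{
    // the refracted sample is closer to the camera than the water surface,
    // i.e. above the water, so fall back to the un-refracted UV
    refracted_uv = SCREEN_UV;
}
vec3 refraction_color = texture(SCREEN_TEXTURE, refracted_uv).rgb;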

Oops, I spent several minutes trying to figure out why your solution wasn't working. Then I put in an if/else statement for debugging (make the water red if it should refract, green if it shouldn't), but the water remained a clear blue...

Then I realized I was typing in the shader for the bottom of the water. Godot has this weird little bug where if I click on the top of the water it selects the bottom & vice versa. Maybe it will work now :p

Thanks for your help @SIsilicon28, but neither of your solutions (there were two different methods in the code) worked. The second almost did, but some objects too far below the surface didn't refract. I'll try to find out more about the depth buffer & refraction.

Hmm, is your project in GLES2 or GLES3? Also, the code you were referring to at the beginning only handles fogging, not refraction; that's in a different part of the shader.