Can I get the texture coordinates of a plane where a collision was detected (with a raycast)?

cre Posts: 3 Member
in 3D

Is there any method to get the texture coordinates from a raycast? I have created a plane (MeshInstance) that should have a snow/mud type of effect, so when something like a ball or any other object falls onto the plane, it pushes into the texture. The plane and shader are already created and working with a heightmap, so I tried to create a viewport and draw into the heightmap through it, but the projection didn't work completely.
I'm looking for something like RaycastHit.textureCoord in Unity.

Answers

  • Megalomaniak Posts: 2,773 Admin
    edited June 10

    Since the original post that was caught in the moderation queue has a more complete title and otherwise exactly the same content, I'll leave it up and delete the other duplicate topic.

  • cybereality Posts: 926 Moderator
    edited June 10

    I don't believe you can get the texture coordinate built-in, but there should be a way to do it. Maybe do a raycast down to the floor plane and get the collision point in world space. Then you can convert this world-space point into UV space, which should be pretty easy for a square plane with 0,0 to 1,1 texture coordinates. For example, discard the y value and then scale x and z by the reciprocal of the size of your plane.
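
    A minimal GDScript sketch of that idea (Godot 3.x). The 100-unit ray length, the 50x50 plane size, and the assumption that the plane's corner sits at the world origin are mine, not from this post; a plane centered at the origin also needs the extra 0.5 offset discussed further down.

    # Assumes this script sits on a Spatial node; call from _physics_process.
    func get_floor_uv(ray_origin: Vector3) -> Vector2:
        var space_state = get_world().direct_space_state
        # Cast a ray straight down and grab whatever it hits.
        var hit = space_state.intersect_ray(ray_origin, ray_origin + Vector3.DOWN * 100.0)
        if hit.empty():
            return Vector2.ZERO  # nothing below us
        # Discard y, scale x and z by the reciprocal of the plane size.
        var plane_size = Vector2(50, 50)
        return Vector2(hit.position.x / plane_size.x, hit.position.z / plane_size.y)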

  • cybereality Posts: 926 Moderator

    Oh, and welcome to the forum!

  • bitshift-r Posts: 59 Member

    See this repository for an example of using the MeshDataTool.

    Also this question.
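
    Roughly, that kind of MeshDataTool lookup works like this (a sketch of the general technique, not the linked repo's actual code): find the face that contains the hit point, then blend that face's vertex UVs with barycentric weights. The point must be in the mesh's local space, and the mesh must be an ArrayMesh (a PlaneMesh can be converted via its get_mesh_arrays()).

    func uv_from_point(array_mesh: ArrayMesh, local_point: Vector3) -> Vector2:
        var mdt = MeshDataTool.new()
        mdt.create_from_surface(array_mesh, 0)
        for f in mdt.get_face_count():
            var ia = mdt.get_face_vertex(f, 0)
            var ib = mdt.get_face_vertex(f, 1)
            var ic = mdt.get_face_vertex(f, 2)
            var a = mdt.get_vertex(ia)
            var b = mdt.get_vertex(ib)
            var c = mdt.get_vertex(ic)
            # Barycentric weights of the point with respect to triangle abc.
            var v0 = b - a
            var v1 = c - a
            var v2 = local_point - a
            var d00 = v0.dot(v0)
            var d01 = v0.dot(v1)
            var d11 = v1.dot(v1)
            var d20 = v2.dot(v0)
            var d21 = v2.dot(v1)
            var denom = d00 * d11 - d01 * d01  # 0 only for a degenerate triangle
            if denom == 0.0:
                continue
            var v = (d11 * d20 - d01 * d21) / denom
            var w = (d00 * d21 - d01 * d20) / denom
            var u = 1.0 - v - w
            # Inside the triangle: interpolate the three vertex UVs.
            if u >= 0.0 and v >= 0.0 and w >= 0.0:
                return mdt.get_vertex_uv(ia) * u + mdt.get_vertex_uv(ib) * v + mdt.get_vertex_uv(ic) * w
        return Vector2(-1, -1)  # point is not on any face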

  • cre Posts: 3 Member

    Thanks for the answers.

    @bitshift-r said:
    See this repository for an example of using the MeshDataTool.

    Also this question.

    I have started to implement this version. I used the methods from this repo (just changed the ones that don't work in Godot 3.2, for example Matrix3 to Basis), but I got an error because the determinant is 0 every time. I don't know exactly why. Can you explain the principle behind this calculation?

    @cybereality said:
    I don't believe you can get the texture coordinate built-in, but there should be a way to do it. Maybe do a raycast down to the floor plane and get the collision point in world space. Then you can convert this world-space point into UV space, which should be pretty easy for a square plane with 0,0 to 1,1 texture coordinates. For example, discard the y value and then scale x and z by the reciprocal of the size of your plane.

    Thank you, that could work, I'm going to try it. But what would be the best way to handle the negative coordinates? For example, if the plane is at 0, 0, 0 and the size is 50 on both the X and Z axes, then some points will be on the negative side (for example -25, 0, -25), while others are on the positive side (25, 0, 25). Do I just need to use an offset value on both axes in proportion to the distribution?

  • cybereality Posts: 926 Moderator
    Accepted Answer

    Well, you could just add half the side length (or scale it first and add 0.5).

    var a = (-25.0 / 50.0) + 0.5
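
    For a plane centered at the origin, applying that offset on both axes would look roughly like this (hit_pos stands for the collision point from the raycast, and the 50x50 size is an assumption):

    var plane_size = Vector2(50, 50)
    # -25 maps to 0.0, the center to 0.5, and 25 to 1.0 on each axis.
    var uv = Vector2(hit_pos.x / plane_size.x + 0.5, hit_pos.z / plane_size.y + 0.5)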
    
  • cre Posts: 3 Member

    @cybereality said:
    Well, you could just add half the side length (or scale it first and add 0.5).

    var a = (-25.0 / 50.0) + 0.5
    

    Thanks. Finally, I had some time and implemented it. I created a script which is attached to the floor.
    First, it calculates the corner coordinates in world space:
    "self.get_global_transform().origin.x - self.get_mesh().get_size().x/2", and the same on the Z axis of course. These are the upper-left coordinates from the plane's point of view. After that, the offset values are determined on both axes.
    For example, on the X axis: "(-1 * xUpperLeft) - self.get_mesh().get_size().x/2", and the same on Z as well. This runs in the _ready function and stores the offset values.

    Finally, there is the GetUV function with posX and posZ parameters, which calculates the real UV coordinates:
    "((posX + self.offsetX) / self.get_mesh().get_size().x) + 0.5"

    The 0.5 is necessary because the offsets are calculated from the middle of the object (self.get_global_transform().origin.x gives the coordinates of the middle of the object), so this constant has to be added at the end.

    Overall, it works even when the object is not in the middle of the world space, so I can place it anywhere, but it can't handle rotations; that isn't important for me right now.
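
    Put together, the script described above looks roughly like this (my reconstruction, not the original code; names follow the post and the mesh is assumed to be a PlaneMesh):

    extends MeshInstance

    var offsetX: float
    var offsetZ: float

    func _ready() -> void:
        # Upper-left corner of the plane in world space (PlaneMesh size is a Vector2: x maps to X, y maps to Z).
        var xUpperLeft = get_global_transform().origin.x - get_mesh().get_size().x / 2.0
        var zUpperLeft = get_global_transform().origin.z - get_mesh().get_size().y / 2.0
        # These simplify to -origin.x and -origin.z, i.e. the shift that recenters the plane.
        offsetX = (-1 * xUpperLeft) - get_mesh().get_size().x / 2.0
        offsetZ = (-1 * zUpperLeft) - get_mesh().get_size().y / 2.0

    func GetUV(posX: float, posZ: float) -> Vector2:
        # The 0.5 recenters the result because origin is the middle of the plane.
        var u = (posX + offsetX) / get_mesh().get_size().x + 0.5
        var v = (posZ + offsetZ) / get_mesh().get_size().y + 0.5
        return Vector2(u, v)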

  • cybereality Posts: 926 Moderator

    Awesome! So glad to help.
