
Sunday, February 4, 2024

Finding a Perpendicular Vector Using Dot Product

     While setting up a rig for the turtle, I decided to add IK/FK snapping functionality to help with animation. To set this up, I needed a way to calculate the pole vector for the legs when snapping IK to FK. I found this great tutorial by Greg Hendrix explaining how to do that: https://vimeo.com/240328348

    As I incorporated this into my rig though, I was confused about how he found the vector that's perpendicular to the middle joint. After experimenting with it, I found a different method that makes sense to me, so I want to record some notes on how it works.

     Basically, I needed this vector to help place the pole vector. Subtracting it from the mid joint gives a new vector that's perpendicular to the IK line, and that new vector can then be scaled depending on how far I want the pole vector to be from the mid joint.

    The first step was to find the length along the IK line up to the perpendicular point. I found this by calculating the dot product of rootToMid and ikLine. The important thing I realized is that ikLine needs to be normalized for this to work. Without normalizing, the result comes back scaled by the length of ikLine. With ikLine normalized, the dot product gives the exact distance along the line to the perpendicular point.
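    As a quick sanity check with made-up numbers: if rootToMid were (3, 4, 0) and ikLine were (10, 0, 0), then rootToMid * ikLine would return 30 (the real 3-unit projection scaled by ikLine's length of 10), while rootToMid * ikLine.normal() returns 3, the actual distance along the line to the perpendicular point.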

    In Python, I wrote this out as follows:

 ikLineMidLength = rootToMid * ikLine.normal()

    After getting the length, I divided it by the ikLine length to get a scale value. 

ikLineScale = ikLineMidLength / ikLine.length()

    Finally, I multiplied the ikLine vector by the scale value to get a new vector at the perpendicular point.

ikMidVector = (ikLine * ikLineScale) + rootJntVector

    Here's the Python code that I used to find the perpendicular vector and place a locator at its coordinates. After finding that vector, I also needed to offset it by the root vector. The initial vector subtractions made everything relative to the origin, so it had to be moved back to match the original joint positions.

import maya.cmds as cmds
import maya.OpenMaya as om

def addLocator(inputVector, inputName):
    cmds.spaceLocator( p=(inputVector.x, inputVector.y, inputVector.z), n=inputName )
    
# Joint Positions
rootJointPos = cmds.xform('rootJoint', q=True, ws=True, t=True)
midJointPos = cmds.xform('midJoint', q=True, ws=True, t=True)
endJointPos = cmds.xform('endJoint', q=True, ws=True, t=True)

# Joint Vectors
rootJntVector = om.MVector(rootJointPos[0], rootJointPos[1], rootJointPos[2])
midJntVector = om.MVector(midJointPos[0], midJointPos[1], midJointPos[2])
endJntVector = om.MVector(endJointPos[0], endJointPos[1], endJointPos[2])

# Distance Vectors
ikLine = endJntVector - rootJntVector
rootToMid = midJntVector - rootJntVector

ikLineMidLength = rootToMid * ikLine.normal() # Find the distance to the point on ikLine that is perpendicular to the midJoint.
ikLineScale = ikLineMidLength / ikLine.length() # Find the ratio of the perpendicular distance to the length of the ikLine.
ikMidVector = (ikLine * ikLineScale) + rootJntVector # Scale the ikLine by the ratio to find the vector at the perpendicular distance.

# Place Locator at Perpendicular Vector
addLocator(ikMidVector, "ikMidVector")
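
    Building on the earlier note about subtracting this point from the mid joint, here's a rough sketch of how the pole vector position itself could be placed from there. The poleDistance value and locator name are just placeholders for the example; they aren't part of the snippet above.

# Sketch: push the pole vector out from the mid joint, away from the IK line.
poleDistance = 5.0  # placeholder distance; tune per rig
perpVector = midJntVector - ikMidVector  # direction perpendicular to the IK line
poleVectorPos = midJntVector + (perpVector.normal() * poleDistance)
addLocator(poleVectorPos, "poleVectorLoc")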

 

    Initially I had some trouble figuring this out. In particular, I was stuck on the mid length dot product until I realized that the IK Line needed to be normalized. Hopefully I can go back to this post if I ever forget how this works.

    Aside from that, I've also started on a script where you can provide 3 input joints and automatically place a new pole vector based on them. I'll make another post about that when it's ready.

 

Sunday, October 16, 2022

How to Convert World Space Normals to Tangent Space

    Recently, I ran into some shading issues using a world space normal map on my billboards. When instanced meshes have different rotations, it seems Unreal Engine can't keep the world space normals consistent across them. Since Unreal's imposter baker creates world space normal maps, I needed to find a way to convert them to tangent space.

 


Billboards with world space normals.

Billboards with tangent space normals.


    To fix this issue, I converted my world space normal map to tangent space with xNormal. Here are the steps that I took.


  1. First, you need two things: the source mesh, and the world space normal map baked from that mesh.



  2. Rotate the mesh to match xNormal's coordinates. I don't yet know a great way to figure this out, so I just guessed until I got it working. In 3ds Max, I rotated the mesh 90° in X and 180° in Z. Make sure to reset XForm before rotating; any existing transforms on the mesh will interfere with how xNormal interprets the orientation.



  3. Export the mesh as an OBJ file. These are the settings that worked for me.



  4. Go to the tools section in xNormal, open the object/tangent space converter, enter the settings for your mesh, and generate. In this case, xNormal is actually converting from object space to tangent space. As long as you didn't move your mesh from its original pivot point, it should work fine with world space normals.
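
    As a side note, this is roughly what the converter is doing for each texel: the object space normal gets projected onto the mesh's tangent frame (tangent, bitangent, normal) at that point. Here's a small illustrative Python sketch; the frame vectors are made up for the example, since in practice they come from the mesh's geometry and UVs.

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def object_to_tangent(obj_normal, tangent, bitangent, normal):
    # Each component is the object space normal projected onto one
    # axis of the (assumed orthonormal) tangent frame.
    return (dot(obj_normal, tangent),
            dot(obj_normal, bitangent),
            dot(obj_normal, normal))

# A normal pointing straight along the surface normal comes out as (0, 0, 1),
# which packs to the familiar flat blue of a tangent space map.
print(object_to_tangent((0.0, 0.0, 1.0),
                        (1.0, 0.0, 0.0),    # tangent
                        (0.0, 1.0, 0.0),    # bitangent
                        (0.0, 0.0, 1.0)))   # surface normal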




  5. Check the output tangent space map. It should look something like this. This is close, but you will still need to edit some of the channels so it's compatible with Unreal.



  6. Look at the individual RGB channels of the map in an image editor, such as Photoshop. If you imagine a light shining from the edges of the image, you can get a sense of how the channels should look. In this map, red is coming from the left, green from the bottom, and blue from the back. For Unreal Engine, you want the red from the right, green from the bottom, and blue from the front.

    These are the original axis directions in the baked tangent space map.

    These are the directions needed for Unreal Engine. In this case, the red and blue channels are inverted.


  7. Invert the red and blue channels of the texture. I used Substance Designer for this: with an RGBA split, I inverted the red and blue channels and then re-merged them. After that, I used the alpha channel to make a mask for the background areas, expanded its edges a bit with an edge detect, and filled that area with the default normal map color. This should prevent artifacts in the lower mip maps of the texture.



    The final normal map.
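
    If you don't have Substance Designer, the same red/blue flip can be done with a few lines of Python using Pillow. This is just a rough equivalent of the channel inversion step; the file names are placeholders, and it skips the background padding part.

from PIL import Image, ImageChops

img = Image.open("tangent_normal.png").convert("RGB")
r, g, b = img.split()

# Invert only the red and blue channels, then rebuild the map.
r = ImageChops.invert(r)
b = ImageChops.invert(b)
Image.merge("RGB", (r, g, b)).save("tangent_normal_unreal.png")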


  8. Finally, import the normal map into Unreal. There are two options for how you can import it. One is importing it like a standard normal map without changing any settings. The other is importing it as a linear color texture and then adjusting it in the material. I opted for linear color because it seems to take slightly less space.



  9. If you want to use a linear color map, change all settings to default and uncheck sRGB. Whenever you use it in a material, make sure to add a constant bias scale node. Set the bias to -0.5 and the scale to 2.

    Linear color texture settings.


    Make sure to add a constant bias scale node anytime you use a linear color normal map.
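
    For reference, the constant bias scale node just computes (input + bias) * scale, so with a bias of -0.5 and a scale of 2 the packed 0-1 texture values get remapped to the -1 to 1 range that normals need. For example, 0.5 becomes (0.5 - 0.5) * 2 = 0, and 1.0 becomes (1.0 - 0.5) * 2 = 1.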

    That's it! You should now have a functional tangent space normal map.