Here's the setup: http://i.imgur.com/SnJ7Jmd.png
The context I'm working from: http://i.imgur.com/nk9C7hb.png
src: http://www.fastgraph.com/makegames/3drotation/
I'm writing a Python script that fuses sensor data to locate a point in space and emulate a virtual reality (HTC Vive) controller. I have all the hardware pieces communicating with the Python script, but I'm struggling to get the math right. The results are not what I expect, and I'm not sure how to proceed without wasting time fiddling.
Block diagram of sensors and software
https://static1.squarespace.com/static/5258a733e4b0804fa2549966/t/584056dd59cc68075cbad5db/1480611556967
The Leap Motion sensor is fixed to the front of the VR headset and outputs left- and right-hand positions in x, y, z. To get these positions into world space (i.e. the coordinate system the VR headset uses), I need to find the transformation matrix:
VR headset orientation+position defines the transformation matrix
#Transformation Matrix created from Oculus headset data:
hmdTransformMatrix = RotMatrix_yawpitchroll(oculusVR.yaw, oculusVR.pitch, oculusVR.roll, [oculusVR.x,oculusVR.y,oculusVR.z]) #order is important, taken from FreePIE documentation
The order of the Euler angles (yaw, pitch, roll) matters, because the rotations have to be undone in the reverse order. From the software's documentation, line 42 of the source code sets the orientation order as Yaw, Pitch, Roll: https://github.com/AndersMalmgren/FreePIE/blob/master/Lib/OculusVR/Code/src/ovr_freepie.cpp
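To illustrate why the order matters, here is a small sketch (using NumPy purely for illustration; the script itself uses hand-rolled helpers) showing that rotations do not commute, and that undoing a composed rotation requires the reverse order with negated angles:

```python
import numpy as np

def yaw(t):    # rotation about Y, same sign convention as R_matrix_yaw below
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, -s], [0, 1, 0], [s, 0, c]])

def pitch(t):  # rotation about X, same sign convention as R_matrix_pitch below
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, s], [0, -s, c]])

a, b = np.radians(30), np.radians(45)

# Rotations do not commute: yaw-then-pitch differs from pitch-then-yaw
print(np.allclose(pitch(b) @ yaw(a), yaw(a) @ pitch(b)))  # False

# Undoing pitch(b) @ yaw(a) requires the reverse order with negated angles
M = pitch(b) @ yaw(a)
Minv = yaw(-a) @ pitch(-b)
print(np.allclose(M @ Minv, np.eye(3)))  # True
```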
So my utility function RotMatrix_yawpitchroll takes the headset's orientation and position and produces the 4x4 transformation matrix (being careful to undo the rotation first, then translate by the headset position):
Implementation of the HMD-mounted Leap Motion transformation matrix:
def RotMatrix_yawpitchroll(yaw, pitch, roll, posVector):
    # Returns the transformation matrix that orients a point into world space:
    # undo the yaw/pitch/roll in reverse order, then translate by posVector.
    # Angles are in radians.
    v = [[1,0,0,0], [0,1,0,0], [0,0,1,0], [0,0,0,1]]  # identity; start at position 0,0,0
    afterRotateRoll = matrixMultiply(R_matrix_roll(-roll), v)                   # undo roll
    afterRotatePitch = matrixMultiply(R_matrix_pitch(-pitch), afterRotateRoll)  # undo pitch
    afterRotateYaw = matrixMultiply(R_matrix_yaw(-yaw), afterRotatePitch)       # undo yaw
    # Translation matrix holding the HMD position: rotate first, then translate
    v = [[1,0,0,posVector[0]], [0,1,0,posVector[1]], [0,0,1,posVector[2]], [0,0,0,1]]
    transformationMat = matrixMultiply(afterRotateYaw, v)  # HMD transform (orientation + position)
    return transformationMat
# Right-hand-rule rotation matrices (4x4 homogeneous)
def R_matrix_pitch(theta):  # rotation about X
    return [[1, 0, 0, 0],
            [0,  math.cos(theta), math.sin(theta), 0],
            [0, -math.sin(theta), math.cos(theta), 0],
            [0, 0, 0, 1]]

def R_matrix_yaw(theta):  # rotation about Y
    return [[math.cos(theta), 0, -math.sin(theta), 0],
            [0, 1, 0, 0],
            [math.sin(theta), 0,  math.cos(theta), 0],
            [0, 0, 0, 1]]

def R_matrix_roll(theta):  # rotation about Z
    return [[ math.cos(theta), math.sin(theta), 0, 0],
            [-math.sin(theta), math.cos(theta), 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 1]]
def matrixMultiply(m1, m2):
    prodM = []
    for i in range(len(m1)):          # for each row of m1
        row = m1[i]
        newRow = []
        for j in range(len(m2[0])):   # for each column of m2
            y = 0
            for x in range(len(row)):
                y += row[x] * m2[x][j]
            newRow.append(y)
        prodM.append(newRow)
    return prodM
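One way to gain confidence in helpers like these is to cross-check them against NumPy (an assumption on my side; the script above is pure Python). This sketch duplicates matrixMultiply so it runs standalone, compares it against NumPy's matrix product, and verifies that a rotation times its negated-angle counterpart gives the identity:

```python
import math
import numpy as np

def matrixMultiply(m1, m2):
    # Same textbook row-by-column product as in the script above
    prodM = []
    for i in range(len(m1)):
        newRow = []
        for j in range(len(m2[0])):
            y = 0
            for x in range(len(m1[i])):
                y += m1[i][x] * m2[x][j]
            newRow.append(y)
        prodM.append(newRow)
    return prodM

# Two arbitrary 4x4 matrices, cross-checked against NumPy
a = [[(i * 4 + j + 1) % 7 for j in range(4)] for i in range(4)]
b = [[(i - j) * 0.5 for j in range(4)] for i in range(4)]
assert np.allclose(matrixMultiply(a, b), np.array(a) @ np.array(b))

# A pure rotation times its negative-angle counterpart should be the identity
theta = math.radians(30)
R = [[math.cos(theta), 0, -math.sin(theta), 0],    # same layout as R_matrix_yaw
     [0, 1, 0, 0],
     [math.sin(theta), 0, math.cos(theta), 0],
     [0, 0, 0, 1]]
Rinv = [[math.cos(-theta), 0, -math.sin(-theta), 0],
        [0, 1, 0, 0],
        [math.sin(-theta), 0, math.cos(-theta), 0],
        [0, 0, 0, 1]]
assert np.allclose(matrixMultiply(R, Rinv), np.eye(4))
print("matrixMultiply checks pass")
```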
With the transformation matrix we can transform the Leap's hand-position data into the same coordinate space as the VR headset. However, the Leap's coordinate system is a bit different when mounted to the front of an HMD:
Oculus Coordinate System: https://developer3.oculus.com/images/documentation/pcsdk/latest/tracking.jpg
Leap Motion's : https://di4564baj7skl.cloudfront.net/assets/leapjs/Leap_Axes_annotated-d06820cfbcb73e553f65e3774490ac36.png
Furthermore, the documentation reveals:
"The origin of coordinates normalized with an interaction box is the bottom, left, rear corner. You can translate the normalized coordinates to move the origin to a more suitable location."
https://developer.leapmotion.com/documentation/csharp/devguide/Leap_Coordinate_Mapping.html
This is the reason for the 0.5 offset and y/z axis swaps:
Right-hand position given by the Leap Motion device:
# Leap units are roughly 2/5 of Oculus units
leapRx = (leap.rightxpos - 0.5) * 0.4  # offset by 0.5 so the origin matches the headset's
leapRy = (leap.rightzpos - 0.5) * 0.4  # Leap's y/z axes are swapped relative to the HMD
leapRz = leap.rightypos * 0.4
Now transform the Leap's right-hand position into world space using the transformation matrix:
leapTransformation = matrixMultiply(hmdTransformMatrix,
                                    [[1,0,0,-leapRx], [0,1,0,-leapRy], [0,0,1,-leapRz], [0,0,0,1]])
scaleFactor = 1000  # mm per meter
RightHandPos_x = leapTransformation[0][3] * scaleFactor
RightHandPos_y = leapTransformation[1][3] * scaleFactor
RightHandPos_z = leapTransformation[2][3] * scaleFactor
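One detail worth double-checking at this step: reading column 3 of hmdTransformMatrix times a translation matrix built from the negated Leap position is equivalent to applying the transform to the point (-x, -y, -z), not to (x, y, z). A minimal NumPy sketch of that equivalence (the matrix and names here are illustrative, not taken from the script):

```python
import numpy as np

def transform_point(M, p):
    # Apply a 4x4 homogeneous transform to a 3D point
    x, y, z, w = np.asarray(M) @ [p[0], p[1], p[2], 1.0]
    return np.array([x, y, z]) / w

# An arbitrary transform: 90-degree yaw (in the conventions above) plus a translation
M = np.array([[0, 0, -1, 1.0],
              [0, 1,  0, 2.0],
              [1, 0,  0, 3.0],
              [0, 0,  0, 1.0]])
p = np.array([0.1, 0.2, 0.3])

# Column 3 of M @ Trans(-p) ...
T = np.eye(4)
T[:3, 3] = -p
col3 = (M @ T)[:3, 3]

# ... equals M applied to the negated point, not to the point itself
assert np.allclose(col3, transform_point(M, -p))
assert not np.allclose(col3, transform_point(M, p))
print("column-3 equivalence holds")
```

So if the sign flip on leapRx/leapRy/leapRz is not deliberate, it would shift the hand to the mirrored side of the headset, which may be worth ruling out.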
Position the right controller/hand:
hydra[1].x = RightHandPos_x
hydra[1].y = RightHandPos_y
hydra[1].z = RightHandPos_z
Running this code seemingly offsets and rotates the virtual hand to the left of where it should be. Moving my head makes everything rotate, but around the wrong point. At first I thought the issue had to do with the Hydra driver expecting a position relative to a base station, so I subtracted the position of the headset at startup. That didn't work either.
Any help or tips toward solving this would be greatly appreciated. I've been drawing diagrams, stepping through each part of my code, and reading documentation to find what I've overlooked. Is it something more fundamental, or am I missing something simple?