Regarding the optimal value for IMU acceleration #44

yeongjejo opened this issue Nov 1, 2024 · 12 comments
@yeongjejo

I'm currently applying my sensor to a live demo. The upper body movement works naturally, but when it comes to walking or running, the lower body movement is unnatural, and the character does not move.

I suspect this issue might be related to the sensor's acceleration or physics engine parameters. My sensor includes gravitational acceleration, while I've confirmed from the documentation that sensors like Xsens do not include gravitational acceleration. I proceeded with the simulation after removing gravitational acceleration, but the same issue persists.

Could you advise on how to configure the acceleration settings in this case? (e.g., Should gravitational acceleration be removed?)

Also, if it isn't an issue with the acceleration, should I modify the RBDL library code?

Thank you.

(I have a different sensor, not Xsens or Noitom.)

@Xinyu-Yi
Owner

Xinyu-Yi commented Nov 1, 2024

Hi, if your acceleration contains gravity, it typically means the reading is expressed in the sensor frame. So, for most sensors, you need to first transform the acceleration into the global frame by left-multiplying it with the sensor orientation, and then subtract the negative-gravity vector, which is a constant whose direction depends on which global frame your IMU uses. You should check that if you place the IMU on the desk in an arbitrary orientation, it always reports zero acceleration, and that if you drop the IMU in the air, it measures a constant vector with magnitude 9.8. Then you can use this acceleration for the live demo.

If these checks pass, the code should work.
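
A minimal sketch of that conversion, assuming the raw reading includes gravity, is expressed in the sensor frame, and the global frame is y-up (the gravity constant, axis, and names below are illustrative assumptions, not the repository's exact code):

```python
import torch

GRAVITY = torch.tensor([0.0, 9.8, 0.0])  # assumed y-up global frame

def free_acceleration(R_ws: torch.Tensor, acc_sensor: torch.Tensor) -> torch.Tensor:
    """Rotate a raw accelerometer reading into the global frame and remove gravity.

    R_ws:       3x3 sensor orientation (sensor frame -> global frame), from the IMU's orientation output
    acc_sensor: raw 3-vector acceleration in the sensor frame, including gravity
    Returns the free acceleration in the global frame, in m/s^2.
    """
    acc_global = R_ws.matmul(acc_sensor)   # left-multiply by the sensor orientation
    return acc_global - GRAVITY            # subtract the constant upward (negative-gravity) vector

# Sanity checks from the comment above:
#   resting on a desk in any orientation -> result ~ (0, 0, 0)
#   in free fall                         -> result ~ (0, -9.8, 0), i.e. magnitude 9.8
```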

@yeongjejo
Author

Hello. Thank you for your response.

I adjusted the acceleration as you suggested and proceeded with the simulation, but I am still experiencing the same issue with the walking motion. The sensor I have does not match the axis orientation of Xsens, so I manually adjusted the axes. In this process, I also rotated the acceleration axes accordingly. Could this have caused the issue? (For example, when moving the X-axis of the head sensor, the Y-axis of the character moved, so I swapped the X and Y inputs for both the quaternion and acceleration.)

Just in case, I have also attached a video of the simulation.

Thank you again for your time and response.

pip_-_Example_-_Windows._Mac._Linux_-_Unity_2022.3.5f1__DX11__2024-11-03_21-50-57.mp4


@Xinyu-Yi
Owner

Xinyu-Yi commented Nov 4, 2024

> (For example, when moving the X-axis of the head sensor, the Y-axis of the character moved, so I swapped the X and Y inputs for both the quaternion and acceleration.)

This means that your calibration is wrong. You should not change the sensors' readings. The calibration first needs to align a sensor's x-y-z axes with your left-up-forward directions, and then you face that direction to do the T-pose. Maybe your first step is wrong.

You can also take a walking sequence from the TotalCapture dataset and check what the difference is between your calibrated input and the dataset.
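
For instance, a rough comparison could look like the sketch below. It assumes you have already loaded a dataset walking clip and your own recording as tensors of per-frame calibrated accelerations; the loading step depends on how you preprocessed TotalCapture, so the variable names are placeholders:

```python
import torch

def summarize(acc: torch.Tensor, label: str):
    """acc: (num_frames, num_imus, 3) calibrated free accelerations in m/s^2."""
    norms = acc.norm(dim=-1)
    print(f"{label}: mean |acc| = {norms.mean().item():.2f}, max |acc| = {norms.max().item():.2f}, "
          f"per-axis mean = {acc.mean(dim=(0, 1)).tolist()}")

# summarize(dataset_acc, "TotalCapture walking")   # placeholder: a walking clip from the dataset
# summarize(live_acc, "my calibrated input")       # placeholder: your own recording
# A large constant offset along one axis usually means gravity was not removed correctly.
```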

@Xinyu-Yi
Owner

Xinyu-Yi commented Nov 4, 2024

Also, if you directly swap two axes, the handedness will change. This cannot work unless your original coordinate system is left-handed.
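
As a quick way to check this, a hedged sketch: write whatever axis remapping you apply as a 3x3 matrix and look at its determinant; only a determinant of +1 preserves handedness (the matrices below are illustrative examples, not values from the project):

```python
import torch

# Swapping only the X and Y axes has determinant -1, so it flips handedness.
swap_xy = torch.tensor([[0., 1., 0.],
                        [1., 0., 0.],
                        [0., 0., 1.]])
print(torch.linalg.det(swap_xy))   # -1.0 -> not a rotation between two right-handed frames

# Negating one axis as well restores determinant +1, i.e. a proper rotation.
remap = torch.tensor([[0., -1., 0.],
                      [1.,  0., 0.],
                      [0.,  0., 1.]])
print(torch.linalg.det(remap))     # +1.0
```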

@yeongjejo
Author


Thank you for your additional response.

Could you confirm if the coordinate system you mentioned corresponds to the left-handed coordinate system in the image?

Additionally, based on the link we reviewed (https://community.element14.com/learn/learning-center/the-tech-connection/w/documents/27687/tracking-human-motion-with-xsens-dot-wearable-sensors), we understand that the Xsens Dot sensor uses a right-handed coordinate system. When using the Xsens Dot sensor, did you change the coordinate system to a left-handed one?

Thank you for taking the time to assist us.

@Xinyu-Yi
Owner

Xinyu-Yi commented Nov 4, 2024

We always use a right-handed system. You need to confirm whether your IMU uses a right-handed system. In most cases it does, and in those cases you should not swap two axes.

@Xinyu-Yi
Owner

Xinyu-Yi commented Nov 4, 2024

Just put your IMU on the desk and observe its raw acceleration. It should measure g = 9.8 pointing straight upward. Then you can easily figure out the sensor coordinate frame by rotating the sensor and reading the acceleration.
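
A small illustrative helper along those lines (assumes the raw reading still includes gravity, so a sensor at rest reports +9.8 along whichever axis points up; names are placeholders, not from the repository):

```python
import torch

def up_axis_from_static_reading(acc_raw: torch.Tensor) -> str:
    """Given a raw accelerometer reading taken with the sensor at rest on a desk,
    report which sensor axis is currently pointing up (magnitude should be ~9.8)."""
    assert abs(acc_raw.norm().item() - 9.8) < 0.5, "magnitude is not ~9.8; the sensor is probably not at rest"
    idx = int(acc_raw.abs().argmax())
    sign = '+' if acc_raw[idx] > 0 else '-'
    return f"{sign}{'xyz'[idx]} points up"

print(up_axis_from_static_reading(torch.tensor([0.1, 9.75, -0.05])))   # "+y points up"
```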

@yeongjejo
Author

Thank you for your additional response. Based on the test results, the motion has improved significantly compared to before, but it still doesn’t seem completely natural (around 60% of the performance compared to the demo video).

The conditions I understand for the motion to work properly are as follows:

  1. During the I-pose, all sensor axes should be aligned in the same direction.
  2. During the T-pose, the tpose_calibration() function aligns the left arm (imu_set.get()[1][0]) with the Unity axes (x-y-z aligned as left-up-forward) using this transformation: torch.tensor([[0, 1, 0], [0, 0, 1], [1, 0, 0.]]).mm(RSI).

Is there anything wrong with the conditions above?

For our sensors, the axes are already set as shown in the I-pose image. Therefore, in the code torch.tensor([[0, 1, 0], [0, 0, 1], [1, 0, 0.]]).mm(RSI), to align the x-y-z axes as left-up-forward, I modified the matrix to:
[[0, 0, 1], [-1, 0, 0], [0, 1, 0.]].

Is my approach incorrect? Should the sensor raw data axes be fixed during the I-pose, or is it important for all sensor raw data axes to be aligned in the same direction during the T-pose?

Thank you very much for taking the time to respond to this lengthy question.

@yeongjejo
Author

I have set the accelerometer axes and the quaternion axes to be the same.

Thank you.

pose (image attachment)

@Xinyu-Yi
Owner

Xinyu-Yi commented Nov 12, 2024

The calibration process just does two things:

  1. Set the global frame with the y-axis pointing up
  2. Calculate sensor-to-bone orientation

Then, the input to the network is:

  1. Bone orientation in the y-up global frame.
  2. Bone acceleration in the y-up global frame in m/s^2.

I'm not sure whether your process is correct. You need to make sure that these two steps are done correctly.
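
A hedged sketch of what these two steps compute, under the assumption that the reference step physically aligns one sensor's axes with the model's world axes and that every bone orientation is the identity in the T-pose (function and variable names are illustrative, not the repository's exact code):

```python
import torch

def calibrate(R_ref: torch.Tensor, R_tpose: torch.Tensor):
    """Step 1 + Step 2 of the calibration described above.

    R_ref:   3x3 sensor orientation (sensor frame -> IMU global frame) captured while the
             sensor is physically aligned with the model's world axes (assumption of this sketch).
    R_tpose: 3x3 sensor orientation captured while the subject holds the T-pose.
    """
    R_imu2world = R_ref.t()                        # step 1: IMU global frame -> y-up model world frame
    device2bone = R_imu2world.matmul(R_tpose).t()  # step 2: constant sensor-to-bone orientation offset
    return R_imu2world, device2bone

def network_inputs(R_imu2world, device2bone, R_sensor, acc_free_imu):
    """Per-frame inputs listed above: bone orientation and free (gravity-removed)
    acceleration, both expressed in the y-up world frame, in m/s^2."""
    bone_ori = R_imu2world.matmul(R_sensor).matmul(device2bone)   # identity when holding the T-pose
    bone_acc = R_imu2world.matmul(acc_free_imu)
    return bone_ori, bone_acc
```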

@yeongjejo
Author

yeongjejo commented Nov 28, 2024

Hi,
It's been a while since I last wrote.

Thanks to your invaluable response, I was able to succeed in operating the PIP with the sensor I have. I am truly grateful that you took the time to provide such a detailed answer. If I ever have the chance, I would love to meet you in person and treat you to a meal.

I have the utmost respect for your research and will continue to support your work in the future.

Thank you.
@Xinyu-Yi

@Xinyu-Yi
Owner

Oh, that's great!

If you have any further questions, feel free to ask me. :)
