Regarding the optimal value for IMU acceleration #44
Comments
Hi, if your acceleration contains gravity, it typically means that it is expressed in the sensor frame. So, for most sensors, you need to first change the acceleration to the global frame by left-multiplying it with the sensor orientation, and then subtract the negative gravity vector, which is a constant that depends on which global frame your IMU is using. You can check it like this: if you place the IMU on the desk in an arbitrary orientation, it should always report zero acceleration; and if you drop the IMU in the air, it should measure a constant vector with magnitude 9.8. Then you can use this acceleration for the live demo. If the check passes, the code should work.
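To make this concrete, here is a minimal sketch of that frame change in PyTorch. It assumes the orientation is available as a sensor-to-global rotation matrix and that the global up axis is +y; the function and variable names are illustrative, not the repository's API.

```python
import torch

def free_acceleration(acc_sensor, R_GS, g=9.8, up_axis=1):
    """Convert a raw sensor-frame acceleration (which contains gravity)
    into a global-frame free acceleration with gravity removed.

    acc_sensor: (3,) raw accelerometer reading in the sensor frame
    R_GS:       (3, 3) sensor orientation, i.e. rotation from sensor to global frame
    up_axis:    index of the global axis that points up (an assumption here)
    """
    acc_global = R_GS.mm(acc_sensor.view(3, 1)).view(3)  # left-multiply by the orientation
    gravity = torch.zeros(3)
    gravity[up_axis] = g                # constant offset measured when the sensor is at rest
    return acc_global - gravity         # at rest ~0; in free fall, magnitude ~9.8
```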
Hello. Thank you for your response. I adjusted the acceleration as you suggested and proceeded with the simulation, but I am still experiencing the same issue with the walking motion. The sensor I have does not match the axis orientation of Xsens, so I manually adjusted the axes. In this process, I also rotated the acceleration axes accordingly. Could this have caused the issue? (For example, when moving the X-axis of the head sensor, the Y-axis of the character moved, so I swapped the X and Y inputs for both the quaternion and acceleration.) Just in case, I have also attached a video of the simulation (pip_-_Example_-_Windows._Mac._Linux_-_Unity_2022.3.5f1__DX11__2024-11-03_21-50-57.mp4). Thank you again for your time and response. @Xinyu-Yi
This means that your calibration is wrong. You should not change the sensors' readings. The calibration needs to first align a sensor's x-y-z axes with your left-up-forward directions, and then face this direction to do the T-pose. Maybe your first step is wrong. You can also take a walking sequence from the TotalCapture dataset and check how your calibrated input differs from the dataset.
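As an illustrative way to do that comparison (the file names and tensor shapes below are placeholders, not the repository's exact preprocessing output): load your own calibrated accelerations and a reference walking sequence, and compare simple statistics.

```python
import torch

# Hypothetical comparison: both tensors have shape (frames, num_imus, 3) and hold
# calibrated, gravity-free accelerations; the file names are placeholders.
my_acc = torch.load("my_walking_acc.pt")
ref_acc = torch.load("totalcapture_walking_acc.pt")

for name, acc in [("mine", my_acc), ("TotalCapture", ref_acc)]:
    mag = acc.norm(dim=-1)  # per-frame, per-IMU acceleration magnitude
    print(f"{name}: mean |acc| = {mag.mean():.2f}, max |acc| = {mag.max():.2f}")

# If the two differ wildly (e.g. a constant ~9.8 offset on one side),
# gravity removal or the calibration is likely wrong.
```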
Also, if you directly swap two axes, the handedness will change. This cannot work unless your original coordinate system is left-handed.
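A quick way to see this (an illustrative check, not from the repository): an axis remapping can be written as a permutation matrix, and its determinant tells you whether handedness is preserved.

```python
import torch

# Swapping two axes (x <-> y) has determinant -1: it flips handedness.
swap_xy = torch.tensor([[0., 1., 0.],
                        [1., 0., 0.],
                        [0., 0., 1.]])
print(torch.linalg.det(swap_xy))    # -1.0 -> turns a right-handed frame into a left-handed one

# A cyclic permutation of the axes keeps determinant +1: it is just a rotation.
cycle_xyz = torch.tensor([[0., 1., 0.],
                          [0., 0., 1.],
                          [1., 0., 0.]])
print(torch.linalg.det(cycle_xyz))  # +1.0 -> handedness unchanged
```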
Thank you for your additional response. Could you confirm whether the coordinate system you mentioned corresponds to the left-handed coordinate system in the image? Additionally, based on the link we reviewed (https://community.element14.com/learn/learning-center/the-tech-connection/w/documents/27687/tracking-human-motion-with-xsens-dot-wearable-sensors), we understand that the Xsens Dot sensor uses a right-handed coordinate system. When using the Xsens Dot sensor, did you change the coordinate system to a left-handed one? Thank you for taking the time to assist us.
We always use a right-handed system. You need to confirm whether your IMU uses a right-handed system; in most cases it does, and in those cases you should not swap two axes.
Just put your IMU on the desk and observe its raw acceleration. It should measure g = 9.8 pointing straight upwards. Then you can easily figure out the sensor coordinate frame by rotating the sensor and reading the acceleration.
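A small sketch of that desk check, assuming you can read the raw accelerometer as a 3-vector; the helper below is hypothetical, not part of the repository.

```python
import torch

def identify_up_axis(acc_raw, g=9.8, tol=1.0):
    """With the IMU lying still, the raw reading should be ~g along exactly one
    sensor axis (the one pointing straight up). Report which axis and its sign."""
    if abs(acc_raw.norm().item() - g) > tol:
        raise ValueError("Sensor is not at rest, or the reading does not contain gravity")
    axis = int(acc_raw.abs().argmax())
    sign = "+" if acc_raw[axis] > 0 else "-"
    return f"{sign}{'xyz'[axis]} points up"

# e.g. a sensor lying flat with its z axis up would give roughly:
print(identify_up_axis(torch.tensor([0.1, -0.2, 9.7])))   # "+z points up"
```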
Thank you for your additional response. Based on the test results, the motion has improved significantly compared to before, but it still doesn't seem completely natural (around 60% of the performance of the demo video).
The conditions I understand for the motion to work properly are as follows: during the I-pose, all sensor axes should be aligned in the same direction. For our sensors, the axes are already set as shown in the I-pose image. Therefore, in the code torch.tensor([[0, 1, 0], [0, 0, 1], [1, 0, 0.]]).mm(RSI), to align the x-y-z axes as left-up-forward, I modified the matrix to:
Is my approach incorrect? Should the sensor raw data axes be fixed during the I-pose, or is it important for all sensor raw data axes to be aligned in the same direction during the T-pose? Thank you very much for taking the time to respond to this lengthy question.
The calibration process just does two things: (1) align one sensor's x-y-z axes with your left-up-forward directions, which fixes the rotation from the inertial frame to the model frame; and (2) face that direction and hold a T-pose, which fixes the constant offset between each sensor and its bone.
Then, the input to the network is the calibrated bone orientations and the gravity-free accelerations, both expressed in the model frame.
I'm not sure whether your process is correct. You need to make sure that these two steps are done correctly.
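For reference, here is a rough sketch of how those two steps and the resulting network input could look in PyTorch. It assumes that in step 1 the sensor's x, y, z axes are held pointing left, up, forward, and that the model's T-pose has identity bone orientations; the function and variable names are mine, not the repository's exact code.

```python
import torch

def calibrate(R_IS_step1, R_IS_tpose):
    """Sketch of the two calibration steps described above.

    R_IS_step1: (3, 3) orientation of one sensor held so that its x, y, z axes
                point left, up, forward (an assumption; adjust the axis remapping
                if your sensor is held differently).
    R_IS_tpose: (N, 3, 3) orientations of all N sensors while the subject faces
                that direction and holds a T-pose.
    """
    # Step 1: the sensor frame now coincides with the model frame, so the measured
    # orientation is model->inertial; invert it to get inertial->model.
    R_MI = R_IS_step1.t()

    # Step 2: in the T-pose every bone orientation is the identity in the model
    # frame, so the constant sensor-to-bone offset is the inverse of the measured
    # sensor orientation expressed in the model frame.
    R_SB = R_MI.matmul(R_IS_tpose).transpose(-1, -2)
    return R_MI, R_SB

def network_inputs(R_MI, R_SB, R_IS, acc_free_I):
    """After calibration: bone orientations and gravity-free accelerations,
    both expressed in the model frame."""
    R_MB = R_MI.matmul(R_IS).matmul(R_SB)                       # (N, 3, 3) bone orientations
    acc_M = R_MI.matmul(acc_free_I.unsqueeze(-1)).squeeze(-1)   # (N, 3) free accelerations
    return R_MB, acc_M
```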
Hi, thanks to your invaluable responses, I was able to get PIP working with the sensor I have. I am truly grateful that you took the time to provide such detailed answers. If I ever have the chance, I would love to meet you in person and treat you to a meal. I have the utmost respect for your research and will continue to support your work in the future. Thank you.
Oh, that's great! If you have any further questions, feel free to ask me. :)
I'm currently applying my sensor to a live demo. The upper body movement works naturally, but when it comes to walking or running, the lower body movement is unnatural, and the character does not move.
I suspect this issue might be related to the sensor's acceleration or to the physics engine parameters. My sensor's acceleration readings include gravitational acceleration, while I've confirmed from the documentation that sensors like Xsens do not include it. I proceeded with the simulation after removing gravitational acceleration, but the same issue persists.
Could you advise on how to configure the acceleration settings in this case? (e.g., Should gravitational acceleration be removed?)
Also, if it isn't an issue with the acceleration, should I modify the RBDL library code?
Thank you.
(I have a different sensor, not Xsens or Noitom.)