## Analysis of the Motion Tracking System on the Pepper Robot
---
For our project, we want our users to be able to see the movement they're making, so they can
check whether they're executing the fitness exercise correctly. A tracking system like this requires a fair
amount of mathematics.

Our current approach is to define the required path as a set of vertices: points in 3D space.
These points describe the path along which the user has to move their limbs, depending on the activity
and where they've placed the tracking devices.
A path can look like the following:

<img height="128" src="../assets/motion-path-example-vertices.png" width="128"/>
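As a minimal sketch of how such a path could be represented and compared against, the snippet below uses a
NumPy array of `(x, y, z)` vertices and a nearest-vertex distance; the point values, units, and distance metric
are illustrative assumptions, not a fixed design.

```python
import numpy as np

# A path is an ordered set of vertices (x, y, z), here assumed to be in metres,
# relative to the calibration (zero) point.
reference_path = np.array([
    [0.00, 0.00, 0.00],   # start: calibration point
    [0.10, 0.05, 0.20],
    [0.20, 0.10, 0.35],
    [0.30, 0.10, 0.40],   # end of the movement
])

def distance_to_path(point: np.ndarray, path: np.ndarray) -> float:
    """Distance from a measured position to the nearest vertex of the path."""
    return float(np.min(np.linalg.norm(path - point, axis=1)))

# Example: a measured position 5 cm away from the second vertex.
print(distance_to_path(np.array([0.10, 0.05, 0.25]), reference_path))  # 0.05
```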
To measure the position of our tracking device, we'll need sensors that provide information from which
either position or velocity can be derived. The device will have to be calibrated initially, of course,
since the measured position is relative to a starting point.

To acquire our measurements, we've chosen the following configuration for the tracking device:
- ESP8266 (Wi-Fi only)
- Accelerometer / Gyroscope combination (BNO085)
We've chosen this configuration because it covers all of our needs in the smallest form factor,
while still delivering quality measurements.
The trade-off is that the ESP8266 does not have a dedicated Bluetooth chip, which makes
connectivity to our robot a little more complicated.

The calculations behind the tracking system will be done in the following steps:
1. Acquire calibration point (zero point)
2. Convert relative acceleration and rotation to position, relative to the calibration point
3. Generate a path object that can be compared to the correct one
4. Calculate the difference of the path at every measurement sample
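A rough end-to-end sketch of these four steps could look like the code below. The function names, the sample
layout, and the use of NumPy are our own assumptions for illustration; step 2 assumes the acceleration has
already been rotated into the earth's frame (described further below), and the naive double integration shown
here drifts quickly, so it would need filtering in practice.

```python
import numpy as np

def calibrate(still_samples: np.ndarray) -> np.ndarray:
    """Step 1: average a short burst of samples taken while the user stands still."""
    return still_samples.mean(axis=0)

def integrate_to_positions(accel_world: np.ndarray, dt: float) -> np.ndarray:
    """Step 2: double-integrate earth-frame acceleration into positions relative
    to the calibration point (naive integration, accumulates drift)."""
    velocity = np.cumsum(accel_world * dt, axis=0)
    return np.cumsum(velocity * dt, axis=0)

def build_path(positions: np.ndarray) -> np.ndarray:
    """Step 3: the measured path object is the ordered list of positions."""
    return positions

def path_error(measured: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Step 4: per-sample distance from the measured path to the nearest vertex
    of the reference path."""
    return np.array([np.min(np.linalg.norm(reference - p, axis=1)) for p in measured])
```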
First, to get the calibration point, the user has to stand still for a moment.
The device can detect this stillness, after which the calibration data is sent to the Pepper robot.
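As an illustration, stillness could be detected by checking that the acceleration magnitude barely varies over a
short window of samples; the window size and threshold below are arbitrary assumptions.

```python
import numpy as np

def is_still(accel_window: np.ndarray, threshold: float = 0.05) -> bool:
    """Treat the device as 'still' when the acceleration magnitude varies by
    less than `threshold` (m/s^2) over the given window of (N, 3) samples."""
    magnitudes = np.linalg.norm(accel_window, axis=1)
    return float(magnitudes.max() - magnitudes.min()) < threshold

# Example usage with the last 50 samples (roughly 0.5 s at 100 Hz):
# if is_still(np.array(latest_samples)):
#     # Treat the current pose as the zero point and notify the robot.
#     ...
```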
We've decided to send this data over Wi-Fi using a WebSocket connection (set up over HTTP), because the
ESP8266 does not have a Bluetooth chip, and because a WebSocket connection allows for both fast and
secure data transfer.
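A rough sketch of what sending a measurement over such a connection could look like is shown below, written in
desktop Python with the `websockets` package; the endpoint, port, and message format are made-up assumptions,
and the firmware on the ESP8266 itself would use its own WebSocket client.

```python
import asyncio
import json

import websockets  # third-party package: pip install websockets

async def send_sample(accel: list[float], gyro: list[float]) -> None:
    # Hypothetical endpoint exposed by the software running alongside Pepper.
    uri = "ws://pepper.local:8765/tracking"
    async with websockets.connect(uri) as ws:
        # Illustrative message format: one JSON object per sample.
        await ws.send(json.dumps({"accel": accel, "gyro": gyro}))

asyncio.run(send_sample(accel=[0.0, 0.0, 9.81], gyro=[0.0, 0.0, 0.0]))
```

In practice the device would keep a single connection open and stream samples continuously instead of
reconnecting for every sample.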
Second, to convert the relative acceleration and rotation into a useful position, we first have to combine the
acceleration vector `A(x, y, z)` and rotation vector `R(x, y, z)` into an acceleration vector expressed relative
to the earth's axes. This is necessary because the device measures acceleration relative to its own axes,
not relative to the earth's axes.
To convert this, we'll have to multiply the acceleration vector `A(x, y, z)` by the rotation matrix