I need to know how raw magnetometer data is processed please.
When I plot raw magnetometer data with the CHR serial interface, the range is roughly -100 to 150.
When I plot processed magnetometer data, the range is roughly -0.5 to 1.5, and this does not look like 16-bit data. What’s going on?
After the magnetometer is calibrated, the processed data should be normalized so that the 3-element mag vector is unit-norm regardless of the orientation. If the processed data is not unit-norm, then it reflects poor calibration or the possible presence of nearby objects distorting the magnetic field.
Also note that the processed mag data registers (addresses 0x69, 0x6A, and 0x6B) are 32-bit floating-point values, while the raw data is stored as 16-bit integers.
I can’t use the floating point values in my microcontroller.
Can you tell me how to calibrate after I get the data into my microcontroller?
That could be tricky. The process of converting the raw data to processed data involves a floating point matrix multiply and an addition. To do that on a microcontroller, you’d need to pull the floating point calibration terms off the device beforehand, convert them to fixed-point, and implement a fixed-point matrix multiply algorithm on the micro.
Copyright (c) CHRobotics LLC. All rights reserved.