We are using a LidarLite v3HP to measure, for example, snow level. We need the highest possible precision (repeatability); we do NOT need fast sampling. In the collected data we see a slight temperature dependency in the readings. We oversample (e.g. take 100 samples and average them, as sketched below).
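For reference, here is a minimal Arduino-style sketch of that sampling/averaging loop. The helper names are ours; the I2C address (0x62) and the command/status/distance registers (0x00, 0x01, 0x0f/0x10) are taken from the LIDAR-Lite operation manual and should be verified against the v3HP documentation:

```cpp
#include <Wire.h>

const uint8_t LIDAR_ADDR = 0x62;  // default I2C address per the manual

// Write a single register (illustrative helper).
void lidarWrite(uint8_t reg, uint8_t val) {
  Wire.beginTransmission(LIDAR_ADDR);
  Wire.write(reg);
  Wire.write(val);
  Wire.endTransmission();
}

// Read a single register (illustrative helper).
uint8_t lidarRead(uint8_t reg) {
  Wire.beginTransmission(LIDAR_ADDR);
  Wire.write(reg);
  Wire.endTransmission();
  Wire.requestFrom((int)LIDAR_ADDR, 1);
  return (uint8_t)Wire.read();
}

// Trigger one ranging cycle and read the distance in cm.
uint16_t readDistanceCm() {
  lidarWrite(0x00, 0x04);            // ACQ_COMMAND: measure with bias correction
  while (lidarRead(0x01) & 0x01) {}  // wait for the STATUS busy bit to clear
  uint16_t high = lidarRead(0x0f);   // distance high byte
  uint16_t low  = lidarRead(0x10);   // distance low byte
  return (high << 8) | low;
}

void setup() {
  Wire.begin();
  Serial.begin(115200);
}

void loop() {
  const int N = 100;                 // oversampling factor
  uint32_t sum = 0;
  for (int i = 0; i < N; i++) sum += readDistanceCm();
  Serial.println((float)sum / N);    // averaged reading
  delay(1000);
}
```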
We use the following settings:
```
sigCountMax     = 0x80; // Default
acqConfigReg    = 0x08; // Default
refCountMax     = 0x05; // Default
thresholdBypass = 0x00; // Default
```
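For what it's worth, those names should map to the following register writes; the addresses are taken from the LIDAR-Lite v3 operation manual (verify them for the v3HP), and `lidarWrite()` is the illustrative helper from the sketch above:

```cpp
void applyDefaults() {
  lidarWrite(0x02, 0x80);  // SIG_COUNT_VAL    (sigCountMax): max acquisition count
  lidarWrite(0x04, 0x08);  // ACQ_CONFIG_REG   (acqConfigReg): acquisition mode control
  lidarWrite(0x12, 0x05);  // REF_COUNT_VAL    (refCountMax): reference acquisition count
  lidarWrite(0x1c, 0x00);  // THRESHOLD_BYPASS (thresholdBypass): 0 = default algorithm
}
```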
Questions:

- Is there a way to improve the readings / precision?
- Should we change the configuration settings above? The goal is the best precision and range for static targets.
- Is there a known temperature dependency we could compensate for?
- Is there a way to tell whether a distance value is “good” or “bad”/“invalid”, e.g. by reading the “status” register?
I tried asking these questions at Garmin’s support center. However, they replied that I should ask the retailer for help, because the Lidar is an OEM device. Not very helpful.
Regarding thresholdBypass, the operation manual’s description of THRESHOLD_BYPASS says: the default valid-measurement detection algorithm is based on the peak value, signal strength, and noise in the correlation record. This can be overridden to become a simple threshold criterion by writing a non-zero value to the register. The recommended non-default values are 0x20 for higher sensitivity with more frequent erroneous measurements, and 0x60 for reduced sensitivity and fewer erroneous measurements.
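If you want to experiment with that, and to address the status-register question, here is a hedged sketch. It reuses the illustrative `lidarWrite()`/`lidarRead()` helpers from the first sketch; the THRESHOLD_BYPASS address (0x1c) and the STATUS bit positions are from the v3 operation manual and should be confirmed for the v3HP:

```cpp
// Apply the reduced-sensitivity threshold recommended above (0x60).
void applyReducedSensitivityThreshold() {
  lidarWrite(0x1c, 0x60);  // THRESHOLD_BYPASS: non-zero = simple threshold criterion
}

// After a measurement, reject readings that the sensor itself flags as bad.
bool readingLooksValid() {
  uint8_t status = lidarRead(0x01);        // STATUS register
  if (status & (1 << 3)) return false;     // invalid-signal flag: no peak detected
  if (!(status & (1 << 5))) return false;  // health flag clear: bias not operational
  return true;
}
```

In an averaging loop you would then skip (or re-take) samples for which `readingLooksValid()` returns false.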
The best way to answer all of your questions is experimentation in a controlled environment, with a setup that has accurately known distances, temperature control, etc. That way you can determine the effect of the various parameter settings and the effect, if any, of temperature.
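If such an experiment reveals a roughly linear temperature dependence, a simple first-order correction is easy to apply. The slope and reference temperature below are placeholders, not datasheet values; fit them from your own calibration runs:

```cpp
// Hypothetical linear temperature compensation (all constants are placeholders).
const float SLOPE_CM_PER_DEGC = 0.02f;  // fitted from calibration data, not the datasheet
const float T_REF_C           = 20.0f;  // temperature at which calibration was done

float compensateDistanceCm(float rawCm, float ambientC) {
  return rawCm - SLOPE_CM_PER_DEGC * (ambientC - T_REF_C);
}
```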