Hello,
I have three Micro:Climate Kits, and what follows applies to all of them.
When I measure the rain gauge using the extension functions from makecode.microbit.org, I see that most rain events (bucket tips) add 0.022 inches of rain. However, in the guide:
https://learn.sparkfun.com/tutorials/mi … rain-gauge
it says each event corresponds to 0.011 inches of rain. Which one is correct?
Moreover, if I divide the rain measurement by 0.022 and manually trigger an event, most of the time the resulting counter increases by 1, but other times it increases by 1.5. It is as if the libraries were somehow mixing the 0.022 and 0.011 values in an inconsistent way.
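For reference, here is roughly the test I am running (MakeCode JavaScript; a minimal sketch assuming the extension exposes startRainMonitoring() and rain() as named in the repo):

weatherbit.startRainMonitoring()
let lastTicks = 0
basic.forever(function () {
    // rain() reports inches; dividing by 0.022 should give whole ticks
    let ticks = weatherbit.rain() / 0.022
    if (ticks != lastTicks) {
        // on one manual tip this delta is usually 1, but sometimes 1.5
        serial.writeLine("delta: " + (ticks - lastTicks))
        lastTicks = ticks
    }
    basic.pause(200)
})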
I then tried pouring 1 L of water through the gauge, counting the ticks myself while the micro:bit simultaneously recorded the number of events.
The first liter was poured a bit too fast, and I counted 390 real ticks while the code measured 444.
Then I repeated the pour at a reasonably slower rate (not super slow either) and got 404 (me) vs. 495 (readings) on one run, and 401 (me) vs. 494 (readings) on a second run; the micro:bit readings are therefore about 14%, 23%, and 23% higher than my manual counts. Overall, it seems that about 1 in every 4 or 5 ticks, the library registers 1.5 events instead of 1.
If we go to
https://github.com/sparkfun/pxt-weather … therbit.ts
line 178 reads:
let inchesOfRain = ((numRainDumps * 11) / 1000)
which looks correct and consistent with the guide (0.011 inches per event).
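Spelled out with my own (purely illustrative) helper names, that conversion and its inverse would be:

// 0.011 inches per dump, as in the guide
function inchesFromDumps(numRainDumps: number): number {
    return (numRainDumps * 11) / 1000
}
// recover the dump count; this should always be a whole number
function dumpsFromInches(inchesOfRain: number): number {
    return Math.round((inchesOfRain * 1000) / 11)
}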
The code from line 187 onward is more complicated (I can't program in TypeScript), but from inspecting the library I gather that to recover the number of events I should divide the inches by 0.011, which should never yield half-integer values. However, that would mean most events (around 80-85% of them) add +=2 instead of +=1, and I can't understand why. Notably, 0.022 is exactly 2 × 0.011, and the occasional 1.5 readings correspond to 0.033 = 3 × 0.011, so everything I see is consistent with each physical tip being counted two (occasionally three) times internally.
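The only explanation I can imagine (just a guess on my part) is switch bounce: if the gauge's reed switch bounces, one tip could fire the counting event twice, and occasionally three times. Would it make sense to count the tips myself with a debounce to test this? Something like the sketch below, where the pin (P2) and the 100 ms window are my assumptions, not values taken from the extension:

let tips = 0
let lastTipTime = 0
// watch for falling edges on the (assumed) rain-gauge pin
pins.setPull(DigitalPin.P2, PinPullMode.PullUp)
pins.setEvents(DigitalPin.P2, PinEventType.Edge)
control.onEvent(EventBusSource.MICROBIT_ID_IO_P2, EventBusValue.MICROBIT_PIN_EVT_FALL, function () {
    let now = input.runningTime()
    // ignore a second falling edge within 100 ms of the previous one
    if (now - lastTipTime > 100) {
        tips += 1
        lastTipTime = now
    }
})
// press A to display the debounced tip count
input.onButtonPressed(Button.A, function () {
    basic.showNumber(tips)
})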
I would appreciate any help.
Thanks