Has anyone come across the actual deviation in ppm for the Artemis’ RTC? I’ve been digging through the Apollo3 Blue datasheet, but haven’t found any accuracy figures yet. I’m very curious how it would perform across a wide range of temperatures.
Thanks for your reply. I’ve recently done some tests with the Artemis against the Maxim DS3231SN, which has an accuracy of ±3.0 ppm over the temperature range -40°C to +85°C. Considering that it’s integrated with the Apollo3 and is not temperature-compensated, I’ve found the Artemis to perform quite well.
Subjecting an Artemis Nano to significant temperature swings produced 1-2 seconds of drift over 24 hours. This equates to an error of approximately 12-23 ppm. I imagine at a constant temperature above 0°C it would be much more stable.
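For anyone who wants to check the arithmetic, the conversion from observed drift to a ppm figure is just drift divided by elapsed time. A quick sketch (the function name is mine, not from any Artemis library):

```python
# Convert observed RTC drift into a fractional frequency error in ppm.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 s

def drift_to_ppm(drift_seconds: float, elapsed_seconds: float = SECONDS_PER_DAY) -> float:
    """Parts-per-million error for a given drift over a given interval."""
    return drift_seconds / elapsed_seconds * 1e6

# 1-2 s of drift over 24 h:
print(round(drift_to_ppm(1.0), 1))  # 11.6 ppm
print(round(drift_to_ppm(2.0), 1))  # 23.1 ppm
```

For comparison, the DS3231’s ±3.0 ppm spec works out to about ±0.26 s/day worst case.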
I’ve been wondering about this for a while. I always assumed this was a temperature-compensated crystal, but apparently not… maybe that uses too much power. Anyway, I have a bunch of fielded devices with Artemis Nano boards in them. They seem to run ~1-2 s fast per day, so I time-sync them daily. They’re fielded in a very warm climate… my boxes report around 130°F (~54°C) inside, so you can expect similar drift in warm climates as well.