Through this procedure I have a set of calibrated values (the getCalibrated…() functions), matched to my light source and parameters, so I essentially have the spectrum of my light source under certain measurement conditions.
But later, when I make an acquisition by placing a leaf in front of the sensor, is it enough to directly take the value returned by getCalibrated, or do I have to perform operations on the raw data?
I’ve built a pretty darn functional general-purpose spectrometer that displays both absorption and refracted light, and it may be pretty much plug and play for your use case. It includes a sampling function that allows for user-specified gain, LED intensity, and sample length (the sampling method is a little different from what I’ve seen online but seems to work great), plus a calibration script for either an all-white or all-black enclosure. I’ve been lazy and haven’t posted to GitHub yet, but I will, including the enclosure STL file for 3D printing.
The way I’m calibrating is as per below. Note I am using all sensor “channels,” from visible to IR, since all are diagnostic and information-rich, but each has its own sensitivity issues, so I’ve tried to find a simple calibration approach that allows for adjustments per channel/spectral range.
Calibration Output:
• Dark Calibration: During dark calibration, the sensor is capturing ambient noise (zero-reference) when no light is present. These values are saved to a file (e.g., dark_calibration.csv) and represent a “baseline offset” to subtract from future runs.
• White Calibration: During white calibration, the sensor captures a maximum light reference. These values are saved to a file (e.g., white_calibration.csv) and represent the “upper bound” to normalize intensity readings for future runs.
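To make the two calibration steps concrete, here is a minimal sketch of capturing a reference and saving it to a CSV in the shape described above. The helper name `save_calibration` and the one-row-per-channel layout are my assumptions, not the author's actual script; you would pass it repeated dark-frame or white-frame readings from your own sensor loop.

```python
import csv

def save_calibration(readings, path):
    """Average repeated per-channel readings and write one row per channel.

    readings: list of sweeps, each a list of per-channel intensities,
              e.g. several dark frames or several white frames.
    path:     output CSV, e.g. "dark_calibration.csv" or "white_calibration.csv".
    Returns the per-channel means that were written.
    """
    n_channels = len(readings[0])
    # Average each channel across all sweeps to suppress shot noise.
    means = [sum(sweep[ch] for sweep in readings) / len(readings)
             for ch in range(n_channels)]
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["channel", "intensity"])
        for ch, m in enumerate(means):
            writer.writerow([ch, m])
    return means
```

Usage would be something like `save_calibration(dark_sweeps, "dark_calibration.csv")` with the sensor covered, then the same call with `white_sweeps` against the white reference.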
This maps the raw intensity to a range between 0 and 1, where:
• 0 represents the dark (minimum intensity).
• 1 represents the white (maximum intensity).
• Values outside this range are clipped (e.g., negative values become 0).
How it works:
The calibration process adjusts the raw sensor output using the dark and white calibration files:
• Calibrated Intensity Formula:
calibrated intensity = (raw intensity − dark intensity) / (white intensity − dark intensity)
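The formula above, plus the clipping rule from the earlier list, can be sketched in a few lines. This is my own minimal illustration, not the author's code; the zero-span guard for a dead channel is an assumption I added.

```python
def calibrate(raw, dark, white):
    """Map raw per-channel intensities to [0, 1] using dark/white references."""
    out = []
    for r, d, w in zip(raw, dark, white):
        span = w - d
        # Guard against a dead channel where white == dark (assumption, not in the original).
        v = (r - d) / span if span else 0.0
        # Clip out-of-range values: negatives become 0, overshoots become 1.
        out.append(min(max(v, 0.0), 1.0))
    return out
```

For example, with dark = 2 and white = 12 on every channel, a raw reading of 5 maps to 0.3, and a raw reading below the dark reference clips to 0.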
I have not measured organic material yet, but a boatload of minerals and gemstones, and it’s remarkably accurate so far. Despite the lack of full-spectrum analysis, the fact that the board measures the range it does, something no lab spectrometer I know of covers, may even make it exceed lab grade for my use case (i.e., not all bands are equally diagnostic, and some are not even considered in reference libraries of absorption ‘signatures’ by mineral).
I hope that helps. PM me and I can share code (several Python scripts orchestrated by a master shell script, plus the Arduino code).