I’m working on amplifying an almost-DC signal using a precision/instrumentation amplifier, but I’m concerned about how intrinsic noise will affect the measured signal. I’m reading the output with a 24-bit ADC, taking a maximum of 1.25 mV up to ~1 V with a gain of 1000. Should I be filtering the input and output to deal with this noise? I know that op-amp noise density is greater at lower frequencies (1/f noise), but is the noise itself confined to low frequencies? Would a simple low-pass filter take care of most of it?
If the input is “almost DC” and your A/D is relatively fast, then you can filter the results in software. But first just display ~20 consecutive readings and see if there is any noise coming through.
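A minimal sketch of what “filter the results in software” can mean: a boxcar (moving) average over successive raw readings. The function name and window size are my own choices, not from the post; for white noise, averaging N samples improves SNR by roughly sqrt(N), but it does nothing for 1/f noise or a DC offset.

```python
def moving_average(readings, window=16):
    """Boxcar-average successive ADC readings to suppress broadband noise.

    Averaging N samples improves SNR by ~sqrt(N) for white (uncorrelated)
    noise; it does NOT help with 1/f noise or DC offsets.
    """
    if len(readings) < window:
        raise ValueError("need at least one full window of readings")
    return [sum(readings[i:i + window]) / window
            for i in range(len(readings) - window + 1)]

# Alternating noise around a 1-unit level averages back toward 1.0:
print(moving_average([0, 2, 0, 2], window=2))  # [1.0, 1.0, 1.0]
```

Displaying ~20 consecutive raw readings first, as suggested above, tells you whether the noise is random (averaging helps) or a steady interference pattern (averaging may alias it instead).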
So you are hoping to read a 1 mV signal as full scale using a 24-bit ADC?
2^24 = 16777216, so 1 LSB referred to the input will be 0.001 V / 16777216 ≈ 0.000 000 000 060 V = 60 pV (picovolts).
Assuming that the ADC only really delivers 20 bits of noise-free resolution and accuracy, this moves a 20-bit LSB up to about 1 nV.
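The arithmetic above is easy to sanity-check. This little helper (my own naming, using the 1 mV input full scale from the post) divides the full-scale input by the number of codes:

```python
FULL_SCALE_INPUT = 1e-3  # 1 mV at the amplifier input, per the question

def input_referred_lsb(bits, full_scale=FULL_SCALE_INPUT):
    """Size of one ADC code step, referred back to the amplifier input."""
    return full_scale / (2 ** bits)

print(f"24-bit LSB: {input_referred_lsb(24):.2e} V")  # ~6.0e-11 V = 60 pV
print(f"20-bit LSB: {input_referred_lsb(20):.2e} V")  # ~9.5e-10 V ≈ 1 nV
```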
What ADC part do you intend to use?
What is the required bandwidth (DC–100 Hz?)
How fast are you planning to sample the signal?
What op-amp are you planning to use?
What is the source impedance of the transducer?
More information about the application, please.
===============
If you are serious about this, might I suggest you read all you can about op-amps and ADCs.
You are certainly going to need some very quiet power supplies and an RF-proof tin box to house the signal processing in, and you might also want to read up on “input guarding”.
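To see why the LSB numbers above matter, here is a rough noise-budget estimate. The 10 nV/√Hz density and DC–100 Hz bandwidth are illustrative assumptions (a typical low-noise op-amp figure, and the bandwidth guessed at earlier in the thread), not values from the original posts; substitute your part's datasheet numbers:

```python
import math

# Illustrative numbers only -- substitute your op-amp's datasheet values.
EN_DENSITY = 10e-9   # assumed broadband voltage noise density, V/sqrt(Hz)
BANDWIDTH = 100.0    # assumed signal bandwidth, Hz (the DC-100 Hz guess)

def rms_noise(density, bandwidth):
    """Input-referred RMS noise from a flat (white) noise density.

    Ignores 1/f noise, which only makes things worse at near-DC.
    """
    return density * math.sqrt(bandwidth)

noise = rms_noise(EN_DENSITY, BANDWIDTH)
lsb_24bit = 1e-3 / 2 ** 24  # ~60 pV input-referred LSB from the post above

print(f"RMS noise: {noise:.1e} V")              # 1.0e-07 V = 100 nV
print(f"Noise spans ~{noise / lsb_24bit:.0f} LSBs")
```

Under these assumptions, the white noise alone is ~100 nV RMS and buries well over a thousand 24-bit codes before 1/f noise is even counted, which is exactly why the shielding, supply, and guarding precautions above are not optional at this signal level.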