ID: 2996281 • Letter: A
Question
An analog-to-digital converter is connected to a sensor that outputs a voltage between 1 mV and +100 mV. You want to use an 8-bit ADC with an input range of 0 to 10 V to read the signal into a computer, where an input less than X_LSB gives a digital value of 00000000. What is X_LSB? What is the actual magnitude of the percent quantization error when using this ADC without any amplification of the input signal, for input signals of 40 mV and 100 mV? What is the largest possible gain that you can include between your sensor and your ADC to reduce the quantization error? If you use an amplifier with a gain of 75, what would be the approximate quantization error for a 40 mV signal from your sensor?
Explanation / Answer
a) 8 bits means the converter can represent 2^8 = 256 distinct codes.
Spreading the 0-10 V range across those codes gives X_LSB = 10/(256-1) = 10/255 V ≈ 0.039216 V ≈ 39.216 mV. Any input below this value reads as 00000000.
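As a quick numeric check of part (a), a minimal Python sketch using the values from the problem and the answer's convention of dividing the range into 2^8 − 1 = 255 steps:

```python
# Step size of an 8-bit ADC spanning 0-10 V, using 255 steps
# (the convention used in this answer).
full_scale_v = 10.0      # ADC input range, volts
bits = 8
levels = 2 ** bits       # 256 distinct output codes

x_lsb_v = full_scale_v / (levels - 1)    # 10/255 V
print(f"X_LSB = {x_lsb_v * 1e3:.4f} mV")  # → X_LSB = 39.2157 mV
```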
b) Without amplification the step size is still X_LSB ≈ 39.216 mV, which is large compared with the sensor's 0-100 mV range.
For a 40 mV input: 40/39.216 = 1.02, so the output code is 00000001, representing 39.216 mV. The error is 40 − 39.216 = 0.784 mV, i.e. about 1.96 %.
For a 100 mV input: 100/39.216 = 2.55, which truncates to code 00000010, representing 78.431 mV. The error is 100 − 78.431 = 21.569 mV, i.e. about 21.57 %.
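The two error calculations can be sketched in Python. The truncating behavior below follows the problem statement that any input less than X_LSB reads as 00000000:

```python
import math

x_lsb_mv = 10_000.0 / 255   # ADC step in mV (≈ 39.2157 mV, from part a)

def quantize(v_mv, lsb_mv):
    """Truncating quantizer: any input below one LSB reads as code 0."""
    code = math.floor(v_mv / lsb_mv)
    return code, code * lsb_mv   # (output code, voltage the code represents)

for v_mv in (40.0, 100.0):
    code, v_rep = quantize(v_mv, x_lsb_mv)
    err_pct = (v_mv - v_rep) / v_mv * 100
    print(f"{v_mv:5.1f} mV -> code {code}, error {err_pct:.2f} %")
# →  40.0 mV -> code 1, error 1.96 %
# → 100.0 mV -> code 2, error 21.57 %
```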
c) The largest usable gain is the one that maps the sensor's 100 mV maximum onto the ADC's 10 V full scale:
G = 10 V / 100 mV = 100
With this gain, Vin = 0-100 mV becomes Vout = 0-10 V, so the converter's full resolution is used and the input-referred step size shrinks from 39.216 mV to 100 mV/255 ≈ 0.392 mV. Any larger gain would push the amplifier output above 10 V and saturate the converter.
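A small sanity check for the gain choice, using only values given in the problem (10 V full scale, 100 mV maximum sensor output):

```python
# Largest gain that keeps the amplified signal inside the ADC's 0-10 V window,
# and the resulting input-referred step size.
adc_full_scale_mv = 10_000.0   # 10 V
sensor_max_mv = 100.0

max_gain = adc_full_scale_mv / sensor_max_mv
print(max_gain)                  # → 100.0

# The input-referred step size shrinks by the same factor:
input_lsb_mv = (adc_full_scale_mv / 255) / max_gain
print(f"{input_lsb_mv:.4f} mV")  # → 0.3922 mV
```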
d) With a gain of 75:
sensor output = 40 mV, so amplifier output = 75 × 40 mV = 3 V.
(The sensor's 100 mV maximum becomes 7.5 V, still within the 10 V range, so the converter never saturates.)
The step size is set by the ADC's 10 V range, not by the amplifier's maximum output: X_LSB = 10 V/255 ≈ 39.216 mV.
Since 3 V / 39.216 mV = 76.5, the converter outputs code 76, representing 76 × 39.216 mV ≈ 2.980 V, or 2.980 V/75 ≈ 39.74 mV referred back to the sensor.
error = 40 − 39.74 = 0.26 mV, i.e. approximately 0.65 %.
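Part (d) can be sketched the same way; the step size comes from the ADC's 10 V range, and the represented value is divided by the gain to refer the error back to the sensor:

```python
import math

x_lsb_mv = 10_000.0 / 255     # ADC step ≈ 39.2157 mV (set by the 10 V range)
gain = 75
v_sensor_mv = 40.0

v_adc_mv = gain * v_sensor_mv            # 3000 mV at the ADC input
code = math.floor(v_adc_mv / x_lsb_mv)   # truncating conversion
v_ref_mv = code * x_lsb_mv / gain        # represented value, referred to sensor
err_pct = (v_sensor_mv - v_ref_mv) / v_sensor_mv * 100
print(f"code {code}, input-referred error {err_pct:.2f} %")
# → code 76, input-referred error 0.65 %
```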