Question
1) If a 0-5120 mV analog input interval is used, how many bits of resolution are needed if the resolution of the LSB is < 1 mV?
2) If a digital multimeter is to display 5 digits, what resolution is necessary for the ADC?
3) The output of an ADC can also be in binary coded decimal (BCD) format, so that it can directly drive decoders for 7-segment displays. For a BCD output, 4 bits are needed for each decimal position in order to represent the numbers from 0 to 9.
How many output pins would an 8-bit ADC need for BCD output?
Explanation / Answer
1.) Assume the LSB corresponds to 1 mV. The full-scale input of 5120 mV would then read 5120 (decimal) = 1010000000000 in binary, which takes 13 bits.
With 13 bits the converter has 2^13 = 8192 steps, so each LSB is 5120 mV / 8192 = 0.625 mV, which satisfies the requirement of less than 1 mV per step. (With only 12 bits the LSB would be 5120 / 4096 = 1.25 mV, which is too coarse.) So 13 bits of resolution are needed.
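A quick numerical check of this result (a minimal Python sketch; the 5120 mV span and 1 mV threshold come from the question, the variable names are mine):

    # Find the smallest bit count whose LSB step is below 1 mV for a 5120 mV span.
    full_scale_mV = 5120
    target_lsb_mV = 1.0

    bits = 1
    while full_scale_mV / (2 ** bits) >= target_lsb_mV:
        bits += 1

    print(bits)                          # 13
    print(full_scale_mV / (2 ** bits))   # 0.625 mV per LSB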
2.) Five digits means a maximum reading of 99999 (decimal).
A 16-bit ADC reaches only 2^16 - 1 = 65535, while a 17-bit ADC reaches 2^17 - 1 = 131071, which exceeds 99999.
So the ADC must have a resolution of at least 17 bits.
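The same search can be scripted (a small sketch under the assumptions above; only the 99999 target is taken from the question):

    # Smallest bit count whose maximum code covers a 5-digit display (0..99999).
    max_count = 99999

    bits = 1
    while (2 ** bits) - 1 < max_count:
        bits += 1

    print(bits)            # 17
    print(2 ** bits - 1)   # 131071 >= 99999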
3.) An 8-bit ADC gives a maximum output of 255 (decimal), so three BCD digits are needed, and the most significant digit is never greater than 2. That digit therefore needs only 2 bits, while the remaining two digits need 4 bits each, giving a total of 2 + 4 + 4 = 10 output pins as the minimum.
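As a sanity check, a short sketch that splits the largest 8-bit code into its decimal digits and counts the bits each BCD position actually needs (all variable names here are illustrative):

    # BCD pin count for the largest 8-bit ADC code (255 -> digits 2, 5, 5).
    max_code = 255
    digits = [int(d) for d in str(max_code)]   # [2, 5, 5]

    # The most significant digit never exceeds 2, so 2 bits suffice for it;
    # every remaining digit needs the full 4 bits of a BCD position.
    pins = max(digits[0].bit_length(), 1) + 4 * (len(digits) - 1)
    print(pins)   # 10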