Question
The American Standard Code for Information Interchange (ASCII) has 128 binary-coded characters. If a certain computer generates 100,000 characters/second, determine the following:
a. The number of bits (binary digits) required per character.
b. The number of bits/second required to transmit the computer output, and the minimum bandwidth required to transmit the signal.
c. For single-error detection capability, an additional bit (parity bit) is added to the code of each character. Modify your answers in parts (a) and (b) in view of this information.
THANKS FOR THE HELP!! =)
Explanation / Answer
A.) 2^7 = 128, therefore 7 bits are needed to encode each of the 128 characters.
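As a quick sanity check, the bit count is just the base-2 logarithm of the alphabet size, rounded up. A minimal Python sketch (the variable names here are just illustrative):

```python
import math

num_symbols = 128  # size of the ASCII character set
bits_per_char = math.ceil(math.log2(num_symbols))
print(bits_per_char)  # 7
```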
B.) Each character requires 7 bits, so 100,000 characters/second corresponds to 7 × 100,000 = 700,000 bits/second. By the Nyquist criterion, a channel of bandwidth B Hz can carry at most 2B binary pulses/second, so the minimum bandwidth required to transmit the signal is 700,000 / 2 = 350,000 Hz = 350 kHz. (Note that bandwidth is measured in Hz, not bps.)
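The same arithmetic in Python, where the division by 2 assumes binary signaling at the Nyquist limit of 2 bits/s per Hz:

```python
bits_per_char = 7
char_rate = 100_000                    # characters per second
bit_rate = bits_per_char * char_rate   # 700,000 bits per second
min_bandwidth_hz = bit_rate / 2        # Nyquist: 2 bits/s per Hz of bandwidth
print(bit_rate, min_bandwidth_hz)      # 700000 350000.0
```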
C.) With the additional parity bit, each character requires 7 + 1 = 8 bits.
The bit rate therefore becomes 8 × 100,000 = 800,000 bits/second, and the minimum bandwidth required is 800,000 / 2 = 400,000 Hz = 400 kHz.
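For completeness, here is a small sketch of appending an even-parity bit to a 7-bit code and recomputing the rates. The function name and the even-parity choice are my own illustration; the problem only says that a parity bit is added:

```python
def add_even_parity(code7: int) -> int:
    """Append an even-parity bit as the least significant bit of a 7-bit code."""
    parity = bin(code7).count("1") % 2
    return (code7 << 1) | parity

# 'T' is 0b1010100 (0x54): three 1s, so the parity bit is 1.
print(format(add_even_parity(0b1010100), "08b"))  # 10101001

bit_rate = 8 * 100_000           # 800,000 bits per second with the parity bit
min_bandwidth_hz = bit_rate / 2  # 400,000 Hz, again assuming binary signaling
```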