Question
A. At the above rate (8.5 seconds), how long does it take to infect over one thousand servers (i.e. 1,024 or more), if there is only one server currently being infected? Show your work.
B. Continuing from question A., how long does it take to infect over one million servers (i.e. 1,048,576 or more), if there are 1,024 servers currently being infected?
C. From the answers given in A and B, how long would you guess it will take to infect over one billion servers (i.e. 1,073,741,824 or more) if one million servers have already been infected?
D. Suppose the current time is 11:54:20 am (11 o'clock, 54 minutes, 20 seconds). From the answers given in parts A, B, and C, by what time will the SQL Slammer Worm infect over 1 trillion servers if only one server is infected now? Show your work.
E. How much bandwidth (Mbps) is consumed by a server infected by SQL Slammer Worm under the following condition? Show your work.
UDP datagram payload: 376 bytes
UDP header: 8 bytes
IP header: 20 bytes
Ethernet header and trailer: 26 bytes
UDP datagram transmission rate: 500 datagrams per second
Explanation / Answer
A)
We start with 1 infected server, and the number of infected servers doubles every 8.5 seconds.
To reach 1,024 servers, the number of doublings required satisfies:
2^x = 1024
x = 10
So it takes 10 * 8.5 = 85 seconds for the count to reach 1,024.
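As a quick sanity check, here is a minimal Python sketch (the helper name doublings_needed is just an illustrative choice, not from the question) that counts how many 8.5-second doubling periods are needed:

```python
import math

DOUBLING_PERIOD = 8.5  # seconds per doubling, as given in the question

def doublings_needed(current: int, target: int) -> int:
    """Smallest n such that current * 2**n >= target."""
    return math.ceil(math.log2(target / current))

n = doublings_needed(1, 1024)
print(n, n * DOUBLING_PERIOD)  # 10 doublings, 85.0 seconds
```

The same helper reproduces parts B and C below: doublings_needed(1024, 2**20) and doublings_needed(2**20, 2**30) both return 10, i.e. 85 seconds each.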
B)
We now have 1,024 infected servers. One million servers here means 1,048,576, which is 2^20.
The count still doubles every 8.5 seconds, so the number of doublings required satisfies:
2^x * 2^10 = 2^20
x + 10 = 20
x = 10
So it takes another 10 * 8.5 = 85 seconds.
C)
The ratio is the same in this case as well: 2^30 is about 1 billion and 2^20 is about 1 million, so 10 more doublings are needed.
So it will again take 10 * 8.5 = 85 seconds.
D)
One trillion is roughly 2^40 (2^40 = 1,099,511,627,776), so 40 doublings are needed starting from a single server.
It takes 85 seconds to go from 1 server to 1,024, from 1,024 to 1 million, from 1 million to 1 billion, and from 1 billion to 1 trillion.
So the total time is 85 + 85 + 85 + 85 = 340 seconds, which is 5 minutes and 40 seconds.
Adding 5 minutes 40 seconds to 11:54:20 am gives 12:00:00 (noon).
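A small sketch of the clock arithmetic for this part, assuming Python's standard datetime module and an arbitrary calendar date (only the time of day matters):

```python
import math
from datetime import datetime, timedelta

DOUBLING_PERIOD = 8.5  # seconds per doubling
start = datetime(2003, 1, 1, 11, 54, 20)  # arbitrary date; clock time is 11:54:20 am

doublings = math.ceil(math.log2(10**12))                   # 40 doublings to exceed one trillion
elapsed = timedelta(seconds=doublings * DOUBLING_PERIOD)   # 340 seconds
print(start + elapsed)  # 2003-01-01 12:00:00
```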
E)
The payload is 376 bytes; including the headers and trailer, the total size of each frame is
376 + 8 + 20 + 26 = 430 bytes.
The transmission rate is 500 datagrams per second.
So the total traffic is 430 * 500 = 215,000 bytes per second, or 215,000 * 8 = 1,720,000 bits per second = 1.72 Mbps.
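A short check of the bandwidth arithmetic, assuming the usual convention that 1 Mbps = 10^6 bits per second:

```python
frame_bytes = 376 + 8 + 20 + 26   # payload + UDP header + IP header + Ethernet header/trailer = 430
datagrams_per_second = 500

bits_per_second = frame_bytes * 8 * datagrams_per_second
print(bits_per_second / 1_000_000)  # 1.72 Mbps
```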