Question
Help me work through this, please!
4. A certain computer algorithm used to solve very complicated differential equations uses an iterative method. That is, the algorithm solves the problem the first time very approximately, and then uses that first solution to help it solve the problem a second time just a little bit better, and then uses that second solution to help it solve the problem a third time just a little bit better, and so on. Unfortunately, each iteration (each new problem solved by using the previous solution) takes a progressively longer amount of time. In fact, the amount of time it takes to process the k-th iteration is given by T(k)=(1.2^k) +1 seconds.
A. Use a definite integral to approximate the time (in hours) it will take the computer algorithm to run through 60 iterations. (Note that T(k) is the amount of time it takes to process just the k-th iteration.) Explain your reasoning.
B. The maximum error in the computer's solution after k iterations is given by Error = 2k^(-2). Approximately how long (in hours) will it take the computer to process enough iterations to reduce the maximum error to below 0.0001?
Explanation / Answer
A) The total time for 60 iterations is the sum T(1) + T(2) + ... + T(60). Since each term T(k) can be viewed as the area of a rectangle of width 1, this sum is approximated by a definite integral:

Time ≈ integral from 0 to 60 of (1.2^k + 1) dk = [1.2^k / ln(1.2) + k] evaluated from 0 to 60 = (1.2^60 − 1)/ln(1.2) + 60 ≈ 309,000 seconds.

Converting with 1 hour = 3600 seconds gives about 309,000 / 3600 ≈ 86 hours.
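As a quick numerical check (a sketch, not part of the original solution), the integral approximation can be compared against the exact sum of the 60 iteration times:

```python
import math

# T(k): time in seconds to process the k-th iteration
def T(k):
    return 1.2**k + 1

# Antiderivative of T: integral of (1.2^k + 1) dk = 1.2^k / ln(1.2) + k
def integral_T(a, b):
    F = lambda k: 1.2**k / math.log(1.2) + k
    return F(b) - F(a)

approx_seconds = integral_T(0, 60)                # integral approximation
exact_seconds = sum(T(k) for k in range(1, 61))   # exact sum over 60 iterations

print(approx_seconds / 3600)  # ≈ 85.9 hours
print(exact_seconds / 3600)   # ≈ 93.9 hours
```

The integral slightly undershoots the exact sum because T(k) is increasing, but both land in the same ballpark, which is all the question asks for.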
B) Set Error = 2k^(-2) = 0.0001. Then k^(-2) = 0.00005, so k^2 = 1/0.00005 = 20,000 and k = sqrt(20,000) ≈ 141.4. Thus 142 iterations are needed to push the error below 0.0001.

Using the same integral approximation as in part A, the time required is about integral from 0 to 142 of (1.2^k + 1) dk = (1.2^142 − 1)/ln(1.2) + 142 ≈ 9.6 × 10^11 seconds, or roughly 2.7 × 10^8 hours.
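To check part B numerically (a sketch): solving Error = 2k^(-2) = 0.0001 gives k^2 = 20,000, and the run time follows from the antiderivative of T(k) shown in part A:

```python
import math

# Smallest integer k with error 2*k**(-2) below 0.0001
k = math.ceil(math.sqrt(2 / 0.0001))   # ceil(sqrt(20000)) = 142

# Time via the integral approximation: (1.2^k - 1)/ln(1.2) + k seconds
seconds = (1.2**k - 1) / math.log(1.2) + k
hours = seconds / 3600
print(k, hours)  # 142 iterations, on the order of 10^8 hours
```

The exponential growth of 1.2^k dominates here, so "a little bit better each time" becomes astronomically expensive long before the error target is met.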