Question
A worker is offered a job at two different firms. At the first firm, the wage paid increases each year. At the second firm, the wage is constant over time. The worker anticipates working three years at whichever job she accepts and then retiring. The first firm offers a three-year contract paying an annual salary of $50,000 in year 1, $60,000 in year 2, and $70,000 in year 3.
a. If the worker's discount rate is r = 0.1 (10%), what must the annual salary be at the second firm for the worker to accept its job offer?
b. How does the lowest annual salary that must be paid by the second firm in order to attract the worker compare to the average annual salary paid under the three-year contract offered by the first firm?
Explanation / Answer
A)
Calculate the present value (PV) of the income from the first job, discounting each year's salary at r = 10% (salaries assumed to be paid at the end of each year):
PV = 50,000/(1.1)^1 + 60,000/(1.1)^2 + 70,000/(1.1)^3
PV = 45,454.55 + 49,586.78 + 52,592.04
PV = $147,633.36
For the worker to accept the second firm's offer, the PV of its constant salary must be at least this amount.
Let the constant annual salary be $S:
PV = S/(1.1)^1 + S/(1.1)^2 + S/(1.1)^3 = 2.4869 S
Setting 2.4869 S = $147,633.36 gives
S (minimum) = $59,365.56
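As a quick check, here is a minimal Python sketch of the part (a) calculation. It assumes, as above, that salaries are paid at the end of each year and are discounted at r = 10%; the variable names (salaries_firm1, annuity_factor, s_min) are only for illustration.

r = 0.10
salaries_firm1 = [50_000, 60_000, 70_000]   # firm 1 pays these at the end of years 1-3

# Present value of the rising-wage contract
pv_firm1 = sum(w / (1 + r) ** t for t, w in enumerate(salaries_firm1, start=1))

# Present value of $1 paid at the end of each of the three years (annuity factor)
annuity_factor = sum(1 / (1 + r) ** t for t in range(1, 4))

# Minimum constant salary the second firm must offer
s_min = pv_firm1 / annuity_factor

print(f"PV of firm 1 contract: ${pv_firm1:,.2f}")      # ~ $147,633.36
print(f"Annuity factor:        {annuity_factor:.4f}")  # ~ 2.4869
print(f"Minimum salary S:      ${s_min:,.2f}")         # ~ $59,365.56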
B)
Average annual salary from the first job = (50,000 + 60,000 + 70,000)/3 = $60,000
Minimum annual salary the second firm must pay = $59,365.56
The second firm's minimum salary is lower than the first firm's average. Because the first firm's larger payments arrive in later years and are discounted more heavily, a constant salary with the same present value can fall below the simple average of $60,000.
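The part (b) comparison can be appended to the same sketch; salaries_firm1 and s_min are the illustrative names defined above.

avg_firm1 = sum(salaries_firm1) / len(salaries_firm1)
print(f"Average firm 1 salary: ${avg_firm1:,.2f}")   # $60,000.00
print(f"Firm 2 minimum salary: ${s_min:,.2f}")       # ~ $59,365.56, i.e. lower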