This is a geometric progression: each day's salary is the previous day's multiplied by a constant ratio, in this case 2.
Counting the first day as day 0, the salary on day i is
T(i) = 2^i
and the total pay over the first n days is
Σ (i = 0 to n-1) 2^i = 2^n - 1
So the first day's salary is 1 cent (2^0 = 1).
The second day's salary is 2 cents (2^1 = 2).
The third day's salary is 4 cents (2^2 = 4).
....
The nth day's salary is 2^(n-1) cents,
and the sum up to the nth day is
2^n - 1 cents.
Do not forget to divide by 100 to convert cents to dollars.
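To make this concrete, here is a minimal Java sketch of the whole calculation; the class name, the prompt text, and the choice of long are my own, not part of the assignment:

import java.util.Scanner;

public class PennyDoubling {
    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        System.out.print("Number of days: ");
        int days = in.nextInt();

        long salaryCents = 1;   // day 1 pays one cent (2^0)
        long totalCents = 0;

        for (int day = 1; day <= days; day++) {
            System.out.printf("Day %d salary: $%.2f%n", day, salaryCents / 100.0);
            totalCents += salaryCents;   // running total, equals 2^day - 1 cents
            salaryCents *= 2;            // double the pay for the next day
        }

        // divide by 100 to convert cents to dollars
        System.out.printf("Total pay: $%.2f%n", totalCents / 100.0);
    }
}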
Be very careful with the type of variable that you use to store the values.
If you use a 32-bit integer in VB2008 or Java, the maximum value is 2^31 - 1, so you can only handle up to 31 days.
If you use 64-bit long integers, you can handle up to 63 days' worth of salary.
In VB2008, the Decimal type has a 96-bit integer part (about 28-29 significant digits), so it can hold up to roughly 96 days' salary exactly.
However, with Java's BigInteger and BigDecimal classes the number of digits is immaterial; I have written programmes that run with 15,000 accurate decimal digits.
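As a rough sketch of that approach (the day count of 365 and the shift/movePointLeft conversion are just my example, not something from the original problem), the exact total could be computed like this:

import java.math.BigDecimal;
import java.math.BigInteger;

public class PennyDoublingBig {
    public static void main(String[] args) {
        int days = 365;  // far beyond what a long can hold

        // total pay in cents is exactly 2^days - 1
        BigInteger totalCents = BigInteger.ONE.shiftLeft(days).subtract(BigInteger.ONE);

        // shift the decimal point two places: cents -> dollars, exactly
        BigDecimal totalDollars = new BigDecimal(totalCents).movePointLeft(2);

        System.out.println("Total pay after " + days + " days: $" + totalDollars);
    }
}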
Generally, C compilers accommodate less precision in their built-in types.
If your teacher does not require accuracy to the last digit, you can always use double precision in most of the common languages to get about 15 significant digits, with (almost) no problem of overflow.
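For instance, a quick double-precision version might look like the following; keep in mind the result is only an approximation once the total exceeds about 15-16 significant digits (the day count of 100 is an arbitrary example):

public class PennyDoublingDouble {
    public static void main(String[] args) {
        int days = 100;  // arbitrary example day count

        double totalCents = Math.pow(2, days) - 1;   // accurate to ~15-16 significant digits
        System.out.printf("Approximate total: $%.4e%n", totalCents / 100.0);
    }
}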
I have been stuck on this problem in my software design class. I hate coming to these places and looking stupid, but my professor had a family emergency and my homework is due tomorrow night.
Design a program in pseudocode that calculates the amount of money a person would earn over a period of time if their salary is one penny the first day, and then doubles each day. The program should ask the user for the number of days. Display what the salary was for each day, and then show the total pay for the end of the period. The output should be displayed in dollar amount, not number of pennies.
I can't get it for the life of me.