Originally posted by BuzzLY
In essence, the three of you each paid $9 for the room ($10 minus the one you got back), totalling $27, and gave the bellhop $2, for a grand total of $29. What happened to the other dollar?
This is a rounding error, as demonstrated in my first post: 1/3 ≈ 0.33.
Each person anted up $10, for a total of $30.
They were subsequently refunded $5, bringing the room's cost down to $25.
Meaning that each person paid 10 - (5/3) dollars for the room
This comes out to two of the people paying $8.33 and one person paying $8.34.
They each gave the bellhop a tip of 2/3 of a dollar: one person tipping $0.66 and two people tipping $0.67.
And each took back $1.00, for a total of $3.00.
Now when we total all that up:
8.34
8.33
8.33
0.66
0.67
0.67
3.00
we get $30.00
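The accounting above can be checked in a few lines. This is just a sketch (the thread doesn't name a language, so Python's `decimal` module is assumed here to keep the cents exact):

```python
from decimal import Decimal

# The room ($25.00), the tip ($2.00), and the money taken back ($3.00),
# each split three ways with the odd cent assigned to one person.
room_shares = [Decimal("8.34"), Decimal("8.33"), Decimal("8.33")]  # sums to 25.00
tips        = [Decimal("0.66"), Decimal("0.67"), Decimal("0.67")]  # sums to 2.00
taken_back  = [Decimal("1.00")] * 3                                # sums to 3.00

total = sum(room_shares) + sum(tips) + sum(taken_back)
print(total)  # 30.00
```

No dollar goes missing: $25 + $2 + $3 is the original $30, and the riddle's "$27 + $2 = $29" only seems wrong because it adds the tip to a total that already contains it.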
Any programmer who cannot answer a problem that deals with rounding errors is in trouble and had better not be writing software for my bank.
Now the real question is this: why can't a computer exactly represent the value 0.1 in binary?
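For anyone who wants a hint: 0.1 has no finite binary expansion (in base 2 it is 0.000110011001100..., repeating forever), so a 64-bit double stores the nearest representable value instead. A quick Python sketch makes the stored value visible:

```python
from decimal import Decimal

# Decimal(0.1) shows the exact value the double actually holds,
# which is slightly more than one tenth.
print(Decimal(0.1))
# 0.1000000000000000055511151231257827021181583404541015625

# The accumulated error is why this familiar comparison fails:
print(0.1 + 0.2 == 0.3)  # False
```

The same thing happens in any language using IEEE 754 binary floating point; it isn't a Python quirk.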