Hey guys and gals, I was looking at this table and example, and I'm wondering why it feels... wrong. Could one of you explain where I'm going wrong?
"In our example, Joe starts with $10,000 and gets a 100% rate of return in year one, bouncing his balance up to $20,000. The next year the market drops by 50% leaving him with $10,000 again. In year three it goes up again by 100% to $20,000. Then drops again in the fourth year by 50%, setting him right back at $10,000.
In this case the market did average a 25% rate of return. But how much additional cash does Joe have left to show for his 25% average rate of return?
Zero."
Excel table:

year   market return   starting balance   ending balance
1           +100%           10,000            20,000
2            -50%           20,000            10,000
3           +100%           10,000            20,000
4            -50%           20,000            10,000
avg           25%           15,000            15,000
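
Here's a quick Python sketch I threw together (my own, not from the article) just to check the numbers: it compounds the yearly returns to get Joe's actual balance, then compares the simple (arithmetic) average of the returns with the compound (geometric) average, which is what his balance actually follows.

```python
# Yearly market returns from the example (year 1 through 4)
returns = [1.00, -0.50, 1.00, -0.50]

# Compound the returns year by year to track Joe's actual balance
balance = 10_000.0
for year, r in enumerate(returns, start=1):
    balance *= 1 + r
    print(f"year {year}: return {r:+.0%}, ending balance {balance:,.0f}")

# Simple (arithmetic) average: (100% - 50% + 100% - 50%) / 4 = 25%
arithmetic_avg = sum(returns) / len(returns)

# Compound (geometric) average: (2 * 0.5 * 2 * 0.5)^(1/4) - 1 = 0%
growth_factor = 1.0
for r in returns:
    growth_factor *= 1 + r
geometric_avg = growth_factor ** (1 / len(returns)) - 1

print(f"arithmetic average return: {arithmetic_avg:.0%}")   # 25%
print(f"geometric average return:  {geometric_avg:.0%}")    # 0%
```

So the 25% in the table is the arithmetic average of the yearly percentages, but the geometric average (and Joe's wallet) works out to 0%. Is that the whole story, or am I still missing something?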