> As long as the CAGR at the end of the 30 years is > the mortgage rate, you came out ahead, regardless of any crashes along the way. This has happened in every 30-year period in history, IIRC, compared to today's mortgage rates.

Thankfully our powers of recollection are not necessary, because the internet can recall for us. Cfiresim tells us that this failed to happen in less than 5% of historical 30-year periods, assuming a mortgage rate of 4%. Unfortunately, cfiresim.com seems to be down at the moment (an unfortunate coincidence that detracts from my point about the internet's ability to substitute for our human recollection :-P), but when it's back up we can easily check whether this statement is true for any given mortgage rate. For rates less than 3.5%, I think you are definitely correct.

Ok, for anyone interested, I just spent more time in cfiresim than I'd care to admit trying to answer this question and here's what I found:

1. The highest mortgage rate with an absolute 100% historical success rate using cfiresim's default investment parameters (i.e., a 75/25 stock/bond allocation and a 0.18% expense ratio) is **1.88%**. But the conservative anti-leveraged-investing-via-mortgage crowd shouldn't get too excited yet, because the success rate remains in the 99% range for much higher mortgage rates (it's the single pesky 1929 start year that keeps preventing us from achieving absolute 100% success), and the success rate still exceeds 95% for a mortgage rate of 4.0% (95.65%, to be precise). Moreover, see the penultimate paragraph below for an explanation of why these success rates may seriously understate the actual historical success rates.

2. Interestingly, a 100% historical success rate could be achieved at higher mortgage rates by increasing the *bond* portion of the portfolio's asset allocation. The highest mortgage rate I could find with a 100% historical success rate using a non-default allocation was **3.15%**, and it required an allocation of 20-25% equities (and 75-80% bonds). I think two facts explain this: low equity exposure was needed for the 1929 start year (obviously a very bad year for stocks) to clear the hurdle of the investments outperforming the mortgage rate, and each year's principal + interest outlay in a mortgage's amortization schedule represents a relatively large percentage of the mortgage's initial principal balance for all but the cheapest of mortgages.
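To put a rough number on that last point, here's a quick back-of-the-envelope check (my own Python sketch using the standard amortization formula, not anything from cfiresim): the annual principal + interest outlay on a 30-year fixed mortgage, as a percentage of the initial principal, at the three rates discussed above.

```python
# Annual P+I outlay on a 30-year fixed-rate mortgage, expressed as a
# percentage of the initial principal balance (standard amortization formula).

def annual_outlay_pct(annual_rate: float, years: int = 30) -> float:
    r = annual_rate / 12                              # monthly interest rate
    n = years * 12                                    # number of monthly payments
    monthly = r * (1 + r) ** n / ((1 + r) ** n - 1)   # payment per $1 borrowed
    return monthly * 12 * 100                         # annual outlay, % of principal

for rate in (0.0188, 0.0315, 0.04):
    print(f"{rate:.2%} mortgage: {annual_outlay_pct(rate):.2f}% of principal per year")
```

Even at 1.88%, the portfolio has to fund an outlay of roughly 4.4% of its starting balance every year (versus roughly 5.7% at 4.0%), which is why a sequence like 1929 is so punishing for equity-heavy allocations.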

For the above cfiresim testing, I used the methodology I described in post #292 above (i.e., enter the annual principal + interest payments required by the mortgage's amortization schedule and set the spending plan to "not inflation adjusted").

However, if you run a cfiresim test using an assumed fixed investment return (instead of actual historical data) equal to the mortgage's interest rate, cfiresim reports failure. For example, the annual principal + interest outlay on a $1M mortgage with a 4% interest rate is $57,289.80. A cfiresim simulation with a $1M starting portfolio, annual non-inflation-adjusted spending of $57,289.80, and the investment return set to 4% constant market growth should result in perfect success, with a portfolio ending balance of exactly zero dollars. Instead, cfiresim reports failure, with a portfolio ending balance of negative $150k. I think this is most likely due to cfiresim's assumptions about the timing of portfolio withdrawals, which probably don't line up with the monthly payment schedule a mortgage requires. In any event, I think this means that this methodology of testing the historical leveraged-investing-via-mortgage success rate materially misstates the *actual* historical success rate, and, though I'm not sure, I believe the misstatement will usually (or always?) be an *under*statement of the actual success rate.
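As a sanity check on the timing hypothesis, here's a simple simulation (again my own Python sketch, under the simplifying assumptions of annual compounding and annual withdrawals; cfiresim's internals may differ): a $1M portfolio growing at a constant 4%, paying the $57,289.80 annual outlay, with the withdrawal taken either at the end or at the start of each year.

```python
# Simulate the fixed-return scenario described above: a $1M portfolio with
# constant 4% annual growth, withdrawing the annual P+I outlay on a $1M
# 30-year mortgage at 4%. Annual (rather than monthly) timing is a
# simplifying assumption; cfiresim's internal withdrawal timing may differ.

PRINCIPAL, YEARS, RATE = 1_000_000, 30, 0.04
ANNUAL_OUTLAY = 57_289.80  # 12 x the $4,774.15 monthly P+I payment

def ending_balance(withdraw_at_start: bool) -> float:
    bal = PRINCIPAL
    for _ in range(YEARS):
        if withdraw_at_start:
            bal = (bal - ANNUAL_OUTLAY) * (1 + RATE)  # withdraw, then grow
        else:
            bal = bal * (1 + RATE) - ANNUAL_OUTLAY    # grow, then withdraw
    return bal

print(round(ending_balance(False)))  # end-of-year withdrawals: roughly +$30k
print(round(ending_balance(True)))   # start-of-year withdrawals: roughly -$98k
```

Neither timing lands at exactly zero (annual withdrawals of 12 x the monthly payment are slightly cheaper than a true annual-payment annuity), and moving the withdrawal from year-end to year-start swings the 30-year ending balance by roughly $130k, so withdrawal timing alone is plausibly in the right ballpark to explain cfiresim's reported negative $150k.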

Thoughts, comments, and objections relating to this analysis are welcome.