I did a longer post at my little blog about this, but if one buys their research as I do, I think it puts some of the planning we all do with FireCalc, cFIREsim, PortfolioCharts, etc in a different light.
For example, the interest rate data embedded in FireCalc, cFIREsim, PortfolioCharts, and PortfolioVisualizer is too high for thinking about the future. That's maybe especially the case for FireCalc and cFIREsim, which use roughly 150 years of long-term real interest rates that run maybe 1-2% above where rates are today... and the long-term trend is toward ever lower long-term real rates in the future.
Note: When I did my blog post, I think the long-term real interest rate on sovereign debt was pretty close to the 700-year trendline.
Then the other thing their data suggests is that the reason people didn't spot this very steady trend earlier is that the datasets weren't large enough to include all the tail events. I.e., datasets with 150 years of data weren't long enough. Which to me raises the awkward question of how safe it really is to estimate tail events (like for safe withdrawal rates) using 50 years of data (like PortfolioCharts and PortfolioVisualizer do) or even 150 years of data (like FireCalc and cFIREsim do).
BTW I love the above-referenced tools. And as Tyler has often said about PortfolioCharts, he'd love to have more data he can use. So I'm absolutely not throwing rocks here at the fine work the planning tool creators have done. But I think the research from Rogoff, Rossi, and Schmelzing suggests the approach of using a half century or a century of data isn't enough?
Interesting thoughts,
@SeattleCPA! This is awesome stuff; I hope not to mess it up.
Trying to replicate your reasoning from the data in the article, after also reading your blog post, I tentatively agree or disagree as follows.
i. Re the first in-thread post's statement that current interest rates are on trendline: agree or disagree depending on whether "current" means a couple of years ago (yes!) or right now (no: we have negative real rates now, while my first-pass analysis puts the trend at a slightly positive real rate; discussed below in the Arabic-numbered steps).
ii. Re the blog post's idea that rates today are lower than in the cFIREsim data and the recent century: strongly agree. The recent century was near zero in real terms; we're now several percent negative.
iii. Re the note in the post above in thread that, at the time of the blog post, the long-term real interest rate on sovereign debt was close to trend: I estimate below that the trend implies a real rate of about 0.34%, so nominal rates would be very close to the inflation rate. Was that true at the time of the blog post?
Could you share more on the steps you took, and the interest rates that you used when calculating?
I based my analysis attempt on table 6 in the article (page 29; appears on page 30 of my pdf), "Macro Variables: Averages by Period." Steps:
1. Table shows average real interest rate for the period 1311-2021 as 6.02 percent per annum.
2. Article suggests long term decline in rates is about 1.6%/century.
3. 1.6%/century x 7.1 centuries in period = 11.36% change expected from start point to endpoint.
4. Difference expected between average and endpoint = 11.36 / 2 = 5.68%.
5. Expected rate at endpoint = 6.02 - 5.68 = 0.34% for 2021. There's obviously variance around this, just calculating the model.
6. Expected change over the 150-year cFIREsim data period* (ending 2021) = 1.5 centuries x 1.6%/century = 2.4%.
7. Average amount above the 2021 endpoint during those 150 years = 2.4%/2 = 1.2%.
8. Expected rate during cFIREsim era = 0.34 + 1.2 = 1.54%.
9. Expected change during the next 50 years = 1.6%/century x .5 centuries = 0.8%. I am using this as a proxy for the remaining investment period of a young investor, viewing that as the foreseeable future in our lifetimes (though my lifetime will likely be shorter).
10. Expected rate during next 50 years = .34 - (.8/2) = .34-.40 = -.06%.
11. Expected difference between cFIREsim data and foreseeable future = 1.54%-(-.06%)= 1.60%.
12. Obviously if we just look at cFIREsim vs 2021 (aka "now", close enough), difference would be 1.54 - .34 = 1.2% difference between expected rate then and expected rate now.
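The steps above can be sketched in a few lines of Python. The inputs (6.02% average real rate for 1311-2021 from Table 6, the article's ~1.6%/century decline, and the assumed 150-year cFIREsim window) come straight from this thread; the linear-trend model itself is my approximation, not something from the article.

```python
# Inputs from Table 6 and the article's trend estimate (per the thread).
AVG_RATE_1311_2021 = 6.02   # % per annum, Table 6 average
DECLINE_PER_CENTURY = 1.6   # % per century, article's trend estimate
CENTURIES_IN_PERIOD = 7.1   # 1311-2021

# Steps 3-5: under a linear trend, the period average sits halfway
# between start and endpoint, so the 2021 endpoint is the average
# minus half the total decline.
total_decline = DECLINE_PER_CENTURY * CENTURIES_IN_PERIOD      # 11.36
rate_2021 = AVG_RATE_1311_2021 - total_decline / 2             # 0.34

# Steps 6-8: average expected rate over a 150-year window ending 2021
# (the assumed cFIREsim data period).
cfiresim_avg = rate_2021 + (1.5 * DECLINE_PER_CENTURY) / 2     # 1.54

# Steps 9-10: average expected rate over the next 50 years.
next50_avg = rate_2021 - (0.5 * DECLINE_PER_CENTURY) / 2       # -0.06

# Steps 11-12: gaps between the data era and the future / the present.
gap_data_vs_future = cfiresim_avg - next50_avg                 # 1.60
gap_data_vs_now = cfiresim_avg - rate_2021                     # 1.20

print(round(rate_2021, 2), round(cfiresim_avg, 2),
      round(next50_avg, 2), round(gap_data_vs_future, 2))
```

Swapping in different period lengths or decline rates is just a matter of changing the constants, which makes it easy to see how sensitive the -0.06% forward estimate is to the 1.6%/century figure.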
Initial readings and questions:
A. Broadly, my analysis suggests real rates of roughly zero for the foreseeable future, starting at a few tenths of a percent and trending slowly to below zero for long-lived investors.
B. Analysis does suggest rates will be lower than what was expected during cFIREsim data.
C. Ah, but what was the actual rate during cFIREsim?
According to the table, average real rate 1914-2021 was 0.19% already! In which case, why should we expect that the next few decades would be different from cFIREsim's data at all?
Is cFIREsim data really 150 years, and the extra years' higher rate from the 1800s really relevant to now?
Another aspect worth considering might be the effect of variance from trend, versus the effect of the trend line itself. Reading the graphs in Figure 2 and Figure 3 (pages 26 and 28 of the article, 27 and 29 of the pdf), the variance around trend is often several percent per year, as I guess we are experiencing now and have experienced intermittently depending on age and nation of residence (I assume this data pertains most directly to the USA as the provider of the most stable loans). So the variance is far larger than the trend.
From skimming the article and looking at the graphs without year by year data, it seemed like some eras have a subtrend that varies from the trend for up to decades at a time, with additional year to year variances around the subtrend so to speak. It seems to me on first consideration that this leaves room for almost any interest rate outcome during the next few decades, such as:
D. Negative real rates of several percent, similar to the 1970s and this year, perhaps through the same mechanism of high inflation
E. Positive real rates of several percent, perhaps through issuance of high-interest bonds designed to fight inflation
F. dramatic swings between D and E, causing large gains and losses for investors in bonds depending on investor choices
G. F might produce interest rates broadly right on trend despite dramatic short term variances
H. Something completely different that I can't think of right now
Fwiw, it appears to me that rates from 1800-1914 shown in table 6 were on average above trend (4.61% vs a trend expectation of about 2.96%, taking the trend value at the period's midpoint) while rates from 1914-2021 were below trend (0.19% vs a trend expectation of about 1.20%). Yet because the trend is down, staying "on trend" going forward would produce average real rates not far from the 0.19% we're already used to from the past century.
In the short term, should we expect change, or instead similarity, given that we're already used to what the trend would suggest? Or, after all, should we recognize that variances from trend have bigger effects than the trend itself, implying that preparing for the variance contingencies is what investors need to do?
*Was the cFIREsim data period really 150 years? I don't know that part, just using the 150 years mentioned upthread.