Wait, why is marginal vs. average not a correct comparison? The numbers in your quoted Bogleheads example, MDM, are all over the place and look like an apples-to-oranges comparison.
The Bogleheads example made sense to me: the numbers showed that making a Roth contribution resulted in more spendable money in retirement. The marginal contribution rate was 15%, the average withdrawal rate was 13%, and the marginal withdrawal rate was 25%. If marginal vs. average were the correct comparison, then (because 15% > 13%) traditional would have been better. It wasn't, indicating that marginal vs. marginal is the correct comparison.
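A quick sketch of the arithmetic above, using hypothetical dollar amounts (the contribution size and growth factor are assumptions; only the three rates come from the example):

```python
# Hypothetical amounts illustrating the Bogleheads example's rates.
contribution = 1000          # pre-tax dollars available to save this year
growth = 2.0                 # same growth factor for both account types
marginal_now = 0.15          # marginal rate saved by a traditional contribution
average_withdrawal = 0.13    # average rate on ALL retirement withdrawals
marginal_withdrawal = 0.25   # marginal rate on the EXTRA withdrawal this contribution creates

# Traditional: the full $1,000 goes in; the extra withdrawal it creates
# is taxed at the marginal withdrawal rate, not the average rate.
traditional = contribution * growth * (1 - marginal_withdrawal)

# Roth: pay tax now at the marginal contribution rate, withdraw tax-free.
roth = contribution * (1 - marginal_now) * growth

print(traditional)  # 1500.0
print(roth)         # 1700.0 -> Roth wins because 25% > 15%
```

The (incorrect) marginal-vs-average comparison (15% > 13%) would predict traditional wins; running the numbers shows it doesn't.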
I also read the MMM link and am confused by that. At one point in your explanation you state:
Fast forward a few more years, and Joe's investments have grown enough that withdrawals over his retirement life span are projected to hit $39,050/yr. The average rate is irrelevant to Joe's decision. Any future contributions to traditional accounts will be withdrawn at a 15% marginal rate, so Joe might want to switch to Roth (or not) at this point.
That seemed to come out of nowhere. Why did you start comparing $20k withdrawals with $40k withdrawals?
Because, for MFJ with no other income or deductions, $39,050 gross translates to $18,450 taxable, the border between the 10% and 15% brackets.
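A sketch of that arithmetic, assuming 2015 MFJ figures (the year these numbers line up): a $12,600 standard deduction, two $4,000 personal exemptions, and a 10% bracket that tops out at $18,450 of taxable income.

```python
# Translate gross withdrawal into taxable income (2015 MFJ, no other
# income or deductions, two personal exemptions).
gross_withdrawal = 39050
standard_deduction = 12600
exemptions = 2 * 4000

taxable = gross_withdrawal - standard_deduction - exemptions
print(taxable)  # 18450 -- exactly the top of the 10% bracket

# Any traditional withdrawal beyond this point is taxed at 15%.
```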
When you withdraw, you pay taxes at your average rate. When you contribute, you save taxes at your top marginal rate. What am I missing?
Yes, you do pay taxes at your average rate if you look at total income and total taxes. You also pay taxes at your marginal rate if you look at your last dollar of income and how much that was taxed.
Let's look at a year in a working life. In previous years, good mustachian behavior has occurred and retirement accounts have been funded. But this year the lure of cinnamon-spiced lattes and other hedonism is strong. Let's just skip a year of 401k and IRA contributions - what harm could it do? You check your current account balances and determine that, in retirement, you will be able to withdraw $W/yr, paying $X/yr in taxes. Your average tax rate will be Y% and your marginal rate will be Z%. This is the situation if you do not make a 401k/IRA contribution this year.
Then sanity returns, you realize those lattes are not your friend, and you decide to contribute. You see that if a traditional contribution is made, you save at your current marginal rate. Looking further, you also see that this year's contribution will allow you to withdraw more than $W/yr in retirement - and each extra $/yr that you withdraw will be taxed at your marginal rate in retirement.
With that knowledge, you compare the two marginal rates and decide if, this year, you will contribute to traditional or Roth.
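The marginal retirement rate described above can be found numerically: compute the tax owed with and without the extra withdrawal this year's contribution enables, and divide the difference by the extra amount. A minimal sketch, assuming 2015 MFJ taxable-income brackets (partial list, for illustration):

```python
# Progressive tax on taxable income, 2015 MFJ brackets (first three only).
def tax(taxable):
    brackets = [(0, 0.10), (18450, 0.15), (74900, 0.25)]  # (floor, rate)
    owed = 0.0
    for i, (floor, rate) in enumerate(brackets):
        ceiling = brackets[i + 1][0] if i + 1 < len(brackets) else float("inf")
        if taxable > floor:
            owed += (min(taxable, ceiling) - floor) * rate
    return owed

base = 18450   # taxable withdrawal/yr without this year's contribution ($W)
extra = 1000   # extra withdrawal/yr this year's contribution enables

marginal = (tax(base + extra) - tax(base)) / extra
average = tax(base + extra) / (base + extra)
print(marginal)  # 0.15 -- the rate that matters for the decision
print(average)   # ~0.103 -- lower, but irrelevant to this year's choice
```

This makes the point concrete: the last dollars this contribution adds are taxed at 15%, even though the average rate across all withdrawals is only about 10%.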
You also look at the previous post and say "hey, if seattlecyclone sees it this way, it must be true!"