Just a clarification as your reply makes me think I may have been misrepresenting my own position:
I don't favor using a 5% withdrawal rate either, personally; the risk of failure is too high for my tastes. It doesn't sound like any new math or analysis prompted this guy to declare that 4.5% and now 5% are the "new 4% rule," just his own shifting views about the risk tolerance people should be willing to accept. That shift may itself be the result of the fellow being 25 years older than he was when he published his own paper prior to the Trinity study, and of the changes in worldview that age brings.
I hear you that everyone's situation is different, but I think the situation of someone spending a lot of time and money on training, and then being in a spot where they can monetize that training now in a way they may never be able to do again if they walk away, is not uncommon.
Oh, I completely agree that this situation is a common one, particularly on this board, since we tend to enrich for high earners, and high-earning occupations in general are ones where skills go stale late in life.
My point is simply that it is a continuum even when this is the case. Say you're making 4x right now what you would be able to make as an entry-level person in your field coming back to work after a decade off doing other things. If there is a 50% chance of having to come back to work until you earn one year at your current salary (so 4 years of later-in-life work) if you FIRE now, and a 10% chance of the same thing if you work an extra year, clearly it makes sense to work the extra year: 1 year of work saves 4 years of work * (0.5 - 0.1) = 1.6 expected years of work in the future.
But once we get to the point where a year of work pre-FIRE saves you less than one year of expected work post-FIRE (say, going from 10% to 5% odds of having to come back to work long enough to earn one year's current income, which would still take you four years: 4 years of work * (0.1 - 0.05) = 0.2 expected years of work in the future), you are trading more total years worked, on average, for a reduction in the uncertainty about the total number of years you will work.
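The arithmetic above is simple enough to sketch in a few lines. This is just an illustration of the example's assumptions (a 4x salary ratio, so a forced comeback means 4 years of later-in-life work), not a general model:

```python
def expected_years_saved(p_before, p_after, comeback_years=4):
    """Expected post-FIRE work years avoided by one more pre-FIRE year,
    when that year drops the odds of a forced return to work from
    p_before to p_after, and a forced return lasts comeback_years
    (4 here: one year's current salary earned at 1/4 the pay)."""
    return comeback_years * (p_before - p_after)

# Going from 50% to 10% odds: costs 1 year, saves 1.6 expected years -- worth it.
print(expected_years_saved(0.5, 0.1))   # 1.6
# Going from 10% to 5% odds: costs 1 year, saves only 0.2 expected years.
print(expected_years_saved(0.1, 0.05))  # 0.2
```

Once the number that function returns drops below 1.0, each additional working year buys certainty rather than a reduction in expected total years worked.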
Trading reduced expected value for reduced uncertainty is perfectly rational and something we do all the time. It's why people dollar-cost average large windfalls into the market (lower expected value than lump-sum investing, but the worst-case scenarios are less bad). It's why we buy insurance even when not legally required to do so (insurance has to have a negative expected value for the customer, or the insurance company would go bankrupt).
But the differentiator here isn't the ratio of how much money per year you or I make now vs. what we'd make if we had to find work after being out of the workforce for 5 or 10 years. That ratio helps determine when working another year before FIRE flips from decreasing to increasing your expected total years worked. But since so many people work well past that break-even point, I think the degree to which we value the absence of uncertainty is the bigger determiner of how long people will work and how far down we'll push our withdrawal rates before pushing the big red button.