I guess I don't understand why knowing how much I am saving vs. how much I am earning is worthless. The more I save, and the lower my expenses go, the faster I reach FI...
If you maintained a 70% savings rate for 10 years in a row, yeah, everyone would agree that sounds great. But if in the meantime your salary had doubled - uh oh, you've fallen victim to lifestyle inflation! Maybe you calculated ~9 years to FI when you started saving, but now that you're spending twice as much, the original estimate no longer applies. Your savings rate can easily give you false confidence if you only compare it year over year.
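To put rough numbers on that (my own back-of-envelope sketch, not anything anyone posted above; the $100k income, 5% real return, and 4% WR are assumptions I picked for illustration):

```python
# Back-of-envelope sketch: years to FI at a constant 70% savings rate,
# with an optional mid-stream salary doubling where the *rate* stays 70%
# but spending (and therefore the 25x target) doubles too.
# Assumed numbers: $100k income, 5% real return, 4% withdrawal rate.

def years_to_fi(income, savings_rate, real_return=0.05, wr=0.04,
                double_spending_after=None):
    spending = income * (1 - savings_rate)
    saving = income * savings_rate
    balance = 0.0
    year = 0
    while balance < spending / wr:          # FI target = 25x current spending
        balance = balance * (1 + real_return) + saving
        year += 1
        if year == double_spending_after:
            spending *= 2                   # lifestyle inflation: target moves
            saving *= 2                     # savings rate is still 70%
    return year

print(years_to_fi(100_000, 0.70))                           # ~9 years
print(years_to_fi(100_000, 0.70, double_spending_after=5))  # ~12 years here
```

Same 70% savings rate the whole time, but the original ~9-year estimate stops applying once the target itself doubles.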
^this. Plus, what's ultimately important isn't how much you save relative to your earnings, but how much you will need in ER. For many, expenses in ER will differ significantly from their working years: things like commuting, networking, daycare, and business attire drop out, while new spending on things like travel and hobbies may appear. Many find their post-working expenses drop substantially; others actually increase their spending.
Also, changes in your salary and large infrequent expenses can have an almost comical effect on your % saved. As alanB said, if your salary increases 10% each year but your savings rate stays constant, that's a bad sign. OTOH, if you take a pay cut or hit an emergency expense, your savings rate may plummet, but that doesn't automatically mean you have decades more work ahead (particularly if you already have substantial savings... at that point market forces often have a greater effect on your retirement date than any additional contributions).
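Just to illustrate that last point with made-up numbers (a sketch, not anyone's actual portfolio):

```python
# Hypothetical numbers: once the balance is large, an ordinary market year
# moves the portfolio more than a full year of new contributions does.
balance = 600_000         # existing portfolio
annual_savings = 40_000   # a year of new contributions
market_move = 0.10        # a fairly routine up-or-down year

print(balance * market_move)  # 60000.0 from the market alone
print(annual_savings)         # 40000 from new saving
```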
I find total savings, and the ratio of savings to expected expenses, to be much more useful and far less volatile. For example, assuming you are using a 4% WR, you need 25x your future expenses. I can track how much I save relative to that amount (e.g. "this year I saved 1.2x future expenses") while also noting market gains/losses ("with market gains I am at 18x expenses").
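For what it's worth, here's roughly how that tracking looks in code (the $50k expected expenses and the other figures are placeholders I picked to match the 1.2x / 18x example):

```python
# Express both annual savings and the total portfolio as multiples of
# expected annual expenses in ER; a 4% WR implies a 25x target.
expected_annual_expenses = 50_000     # placeholder estimate of ER spending
target_multiple = 1 / 0.04            # 25x

saved_this_year = 60_000              # placeholder
portfolio_value = 900_000             # placeholder

print(f"saved {saved_this_year / expected_annual_expenses:.1f}x future expenses this year")
print(f"at {portfolio_value / expected_annual_expenses:.0f}x expenses "
      f"(target {target_multiple:.0f}x)")
```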
YMMV; I've simply never found savings rates to be more than a broad-brush metric.