I was talking to a pair of coworkers the other week. I adore them both, and while we were talking about income inequality and damaging social conventions, I was down with the conversation. Then one of them said that the American dream is dead, the other agreed, and I was so flabbergasted that I couldn't think of anything to say.
Of the three of us, two have Master's degrees. The third is thinking about getting one, and our employer will cover at least part of the cost. One of them has moved up through the ranks and is now Assistant Director of our department. I was able to leave a toxic job and work part time for a while. All three of us have paychecks that are as steady as you can get right now, plus healthcare and retirement. Granted, the paychecks aren't super duper high, but they're not terrible either. I was able to start a couple of side gigs and put that money aside, and I know both of these women have in-demand skills they could monetize.
All three of us are living the American dream, so it struck me as odd that they would think it was dead.