What Does “Take America Back” Mean?

When politicians say, “we will take America back,” what exactly does that mean? Take us back to what? Where have we gone? That is what I want to know.

America loves the future; we are not ones to dwell on the past. Today’s news is old tomorrow. We read about the latest jello shot, compartmentalize it, and move on. The constant gun violence is a perfect example. The lack of strict federal gun laws is mind-blowing. Do politicians mean to take us back to the Wild West days, because it was so fantastic when there was very little law and order?

When states pass laws to ban abortion after decades of access, is that a big score for taking back America? Back to when women had no choice unless, of course, they had money? When white men controlled women’s bodies, and women who supported this were happy to be controlled by men? When the rich had more control over poor people’s choices, so the poor could never break the cycle of poverty?

Back to when LGBTQ people were not allowed to be honest about their sexuality or gender? Should we close our eyes and pretend that we are all heterosexual and always have been? Do we stop acknowledging everyone in our children’s education by refusing to speak the word “gay” and pulling books about different ways of living happy lives off the shelves? Is that what taking back America means?

Cutting taxes to the point where the cameras in subways don’t work, where police radios aren’t functioning, where we put no capital into maintaining public housing for fifty years, where we cut funding for arts and exercise programs in public education, where transportation in urban areas is falling apart, where homeless people can’t afford a roof over their heads, where mentally challenged people have no safety net? Is this what it means to take America back?

Or are some of us so afraid of the future that we long for the times of Leave It to Beaver, when white men were superior to everyone else, women stayed home, and few were educated? When every person stayed in the lane they were told to stay in? When conservatism was comfortable, and the future was not?

Taking America back goes against the grain of everything we stand for as a nation. We came here to create a new world, so why are so many interested in returning to a time when we were racist, bigoted, white supremacist, and lawless? Or perhaps we never left. Or maybe, when politicians tout “we need to take America back,” all they truly care about is getting elected, staying in power, and continuing the rhetoric.