Dystopian America

Merriam-Webster defines “dystopia” as “an imagined world or society in which people lead wretched, dehumanized, fearful lives.” This is what America is beginning to feel like.

Children are afraid to go to school for fear of a shooting. Cops attack the innocent black man but let the guilty white man walk free. Casual racism that eventually turns into hate crime goes virtually unnoticed by the media. Same-sex couples can’t walk down the street holding hands or show any kind of affection toward their partner for fear of harassment. Women are told to hold keys between their fingers and keep pepper spray on them so they don’t become the next girl to go missing. Immigration and Customs Enforcement (ICE) takes the children of “illegal” immigrants from their parents and keeps them in cages.

All of these situations have become so normalized in American society that people no longer care. This is why America feels so dystopian. All of these bad things are happening, and everything feels like it’s going downhill in this country, but no one cares. The fact that the youth have to shoulder the consequences of problems created by the older generation feels unreal to me. It shouldn’t be our job as children to fix or worry about this, let alone write about it.

There are many ways we can help, and the most important is to speak out. If you want to help raise awareness or simply learn more about these issues, visit https://www.aclu.org/.


Mehvish Khan

