Here’s what we really need to do to make America “great” again. Bring back the jobs!
Category: Society
Anything from trends, politics, and court hearings to pop culture that’s significant to American society.

The US needs to stay out of the war happening in Eastern Europe. We’ve got too many issues to deal with at home.

Our current work culture is broken, but thinking you don’t need to work at all anymore is narcissistic.

Vaccinations were supposed to be the ultimate answer. So why are we still wearing masks? 😷

When I hear cis men complain about being short, I remind them that at least they’re not short at both ends. 😝