I've been living in Europe for the past 2-3 years. I was raised as an American, but I just don't feel like America is it anymore. Everything from the police, the gun laws, school shootings, racism, the quality of Western women, vaccine mandates, and the arrogance and entitlement of the general population compared to other countries puts me off. I don't feel like the quality of women there is even comparable to Eastern European women or women anywhere else. It makes me not want to go back to the U.S. ever again.
Any inputs or thoughts?