Kind of a vague question, but I guess anyone who responds can state their own interpretation.
Edit:
I guess I’m asking because everything I’ve learned about America seems to not be what I was told? Idk how to explain it. It feels like the USA is one event away from outright corruption, a civil war, and turning into a D-class country.
No. I haven’t lived there since I was 6. I am extremely lucky.
I have family there, and the only news I hear is bad. I’m also still a citizen of the country, and I’m worried it might break apart and I’ll end up stateless.