I ask this having been to events featuring national/ethnic dress, food, and other cultural traditions. What can a white American say their culture is? It feels like, for better or worse, it's all been melted together.
Trying to trace back to European roots feels disingenuous because I’ve been disconnected from those roots for a few generations.
This also makes me wonder: was there any political motive in making white American culture everything and nothing?
Calling something a part of white American culture doesn't mean it applies only to that culture. It's the most popular religion here and seeps into many parts of the state.