Black woman in America (my experience)

What does it mean for me to be a black woman in America? Honestly, even though I have dark skin, I have no idea. I’m trying to figure it out and find my way around. Intersectional feminism and racism are what “they” tell me is true. Is my personal experience true, or is it a lie? Or am I just naive to the things of this world that are steeped in white supremacy and systemic racism? Am I the only black woman in her 30s who’s naive to these things, or who just doesn’t believe in them?

I hear all this stuff, and while my brain comprehends it, my heart doesn’t get it. I’m old school: I can’t think of every white person as inherently racist. I grew up with the belief that there are two kinds of racists: overt racists and subtle racists.

Overt racists are not ashamed to admit that they hate people of color. They use racial slurs freely, even to the faces of people of color. They are the KKK members, the neo-Nazis, the skinheads.

Subtle racists are the ones who like black people, as long as they stay in their corner. Just don’t move into our nice neighborhood, or work at my job, or take my promotion (though if my white colleague got it, that’s OK). Or don’t stand next to me at the bus station when I’d feel safer standing next to a white man. You can often tell a subtle racist by their non-verbal cues. They can be harder to identify, but they exist.

But I’m not sure I buy the idea that ALL white people are racist. I would need to have that explained to me. That all white people are inherently against people of color?

It’s hard for me to look at my white husband and think, Gee, I love this exceptional, racist man. Why would I want to be married to a racist? The whole idea seems odd to me.