This is not directed at anyone, just sharing some thoughts -- Growing up in the 60s & 70s, the white schools I attended often ignored the facts of history. I was raised 90 miles from here, yet in school we never heard a single word about what is considered one of the most significant events in United States history. Everything changed after that day, and it took me 49 years to learn what happened almost in my back yard. A lot of us whites are catching up, years later, on what we were never told.
We heard in school about what the Nazis did to the Jews, yet we skipped over what the settlers did to the original Americans, the people who were here first.