Schools in America have a legacy of Indigenous genocide, segregation, and other disgusting manifestations of oppression that haven't been reckoned with or even adequately acknowledged. Although this prejudice in education has become more covert, largely as a result of the successes of the Civil Rights Movement, it hasn't gone away; it has only festered.

There is a significant minority of right-wing extremists among us who are overtly and violently hell-bent on rapidly transforming the United States into a White Christian Fundamentalist Ethno-State by any means necessary. They breed at an alarming rate, they're heavily armed, and they exist within all of our institutions, including law enforcement, the military, all three branches of government, and education. In several states, there is swift and harsh political movement toward banning books and punishing teachers for teaching American history or for merely acknowledging that homosexuality exists.

Last year, theocratic fascists on the Supreme Court also struck down affirmative action in college admissions. While imperfect, affirmative action was one of the primary guardrails against a descent into overt, legal discrimination in academia. This racist and cruel ideology infecting our society must be vigorously rooted out. The role of the educator is to teach rather than indoctrinate, even when doing so becomes politically inconvenient or illegal.