Science and technology shouldn’t have to play into identity politics espoused by the left. But at MIT, numbers are racist.

The Data for Black Lives Conference, set to take place on January 11 in Cambridge, Massachusetts, addresses issues in technology that its organizers feel pertain to racism in the United States. Run by Yeshimabeit Milner, former campaign director for the Soros-funded Color of Change, the group Data for Black Lives claims to study how data analytics can be used to combat racism. One of the topics to be addressed at the conference is how “mathematics classrooms are breeding grounds for racialized myths of superiority and deficiency.”

The title of that particular segment is “Rising Above the Gathering Storm: Education, Justice and Mathematics.” In the description, Data for Black Lives goes after math in a special way (what did addition and subtraction ever do to you?), saying that mathematical proficiency is a “matter of life and death.”

Ask an English major about that. We’re here, we’re mathematically inefficient, get used to it.

“Math,” continues the pamphlet, “is, more than any other subject, associated with notions of fixed intelligence.” And that, presumably, perpetuates racism.

Robots and the automation of jobs are also racist, because “black people bear the brunt of this automation.” The solution? “We cannot achieve the goals of economic justice and equality without seriously reckoning with the history of slavery in the United States and the need for reparations.”

If reparations are the answer, the problem goes a lot deeper than math.

The conference also features a Google-sponsored event titled the Systems Dynamics Workshop, which promises to teach attendees how to “find and address the root causes of disparities and inequities that exist” in criminal justice, healthcare, education, and banking.

The group has been championed in Congress by Senator Cory Booker (D-NJ), who endorsed the demands made at the 2017 Data for Black Lives Conference.