This professor was cited by Trump's impeachment team — he says 'they misrepresent what I wrote quite badly'

President Donald J. Trump delivers a commencement address during a ceremony honoring the Nation's Graduating Class of 2020 Friday, May 22, 2020, in the East Room of the White House. (Official White House Photo by Andrea Hanks)

In a legal brief submitted this week, former President Donald Trump's impeachment lawyers cite a 2001 article by Brian C. Kalt, a Michigan State University law professor. Attorneys Bruce Castor, David Schoen and Michael T. van der Veen use Kalt's article to argue against Trump's second impeachment, but according to a Twitter thread by Kalt, they have misrepresented his arguments "quite badly."

Kalt's 2001 article dealt with late impeachment. Trump, following the Jan. 6 attack on the U.S. Capitol Building, was impeached late in his presidency for incitement to insurrection — too late, according to his impeachment lawyers. But Kalt, noting that the brief "cites my 2001 article on late impeachment a lot," explains, "The article favored late impeachability, but it set out all the evidence I found on both sides — lots for them to use. But in several places, they misrepresent what I wrote quite badly."

Sponsored content by the ACLU

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of "gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

"These platforms are encoded with racism," says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. "The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore… There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls "algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda