DURST: Advice from Cheney: Screw 'Em

Dear Mr. Cheney: I'm a former CEO of a large international oil concern headed into the public sector, but because a couple of deferred pension payments won't vest until I'm in office, some pansy-ass organizations are screaming "conflict of interest." What should I do?

- Dehydrated in Dallas

Dear Dehydrated in Dallas: Screw 'em.

Dear Mr. Cheney: I hit a lucky streak and amassed a salary of about $20 million over the last ten years, but my tax returns show only 1% of that went to charitable contributions. Because my new partner has recently been spouting off about how a policy of private donations is so much better than that old, tired, discredited philosophy of public assistance, I've been catching a lot of heat lately. Any suggestions?

- Wired in Wyoming

Dear Wired in Wyoming: Screw 'em.

Dear Mr. Cheney: Recently, I was on a podium and my boss called a guy wandering past us an asshole, and I agreed with him. So far, no problem. The guy was an asshole. Unfortunately, we were near an open microphone, and all his colleagues heard what we said about him. Should we apologize to him, to his colleagues, or both?

- Irate in Iran

Dear Irate in Iran: Screw 'em all.

Don't forget to read "Dear Mr. Cheney" each and every week, or just for the next nine, for more of his trademark new-generation, kinder, gentler advice.

Will Durst can't wait for the compilation.

By the ACLU (sponsored content)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search for “gorilla.” But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google an ad company, not an information company, one that replicates the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. “The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore… There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda