Why 'Liberal Hollywood' Is a Myth

Hollywood has a reputation as a bastion of liberalism in America. It hasn't entirely earned it.