Hollywood is a politically insular place. It sits well to the left of the rest of the country, and it stays that way by actively refusing to work with anyone who doesn't share its politics. Many top executives, actors, writers, and directors deliberately try to social-engineer the culture through their media: they craft narratives that support leftist agendas, write the characters we're meant to like as holders of leftist views, and cast the villains and unlikable characters as holders of views leftists oppose.
It's an open secret; there is, for example, a book written about it.