Women control most household spending, so it makes sense that the media (which makes its money off sales and advertising) panders to them above all else. Women care about everybody's feelings being respected and about political correctness. Is this enough to explain modern Western culture as it is?