Reactionaries Week: Long March (3)

The cultural right is convinced that Hollywood is determined to make America woke. Is there any truth to the allegation?

The ongoing strike involving actors and writers should tell us that Hollywood is not monolithic. The striking artistic types are mostly young, talented, and financially insecure; it is probably fair to assume that they are predominantly woke. Decisions regarding the films and TV shows that are actually made, however, rest in the hands of capitalists. They are not woke, on the whole; their only concern is making money.

In my experience, some TV shows and movies are indeed advertisements for wokeness; the capitalists agreed to make them because they perceived an audience large enough to justify the investment. Most shows and movies, however, carry no message touching on wokeness. The allegation, taken as a whole, is false.