Why have documentaries become so much more popular in recent years?
Have you ever stopped to think about how much more prominent documentaries have become in the last 25 years?
I remember that pre-2000, documentaries just weren't a big thing at all. Few people sought them out, they rarely entered the pop-cultural conversation, and they were widely regarded as boring educational fare.
Now documentaries are everywhere and OFTEN break out into popular culture. Today they're just a regular part of the film/TV landscape.
I think this is one of the few ways the state of film and the film industry has actually gotten BETTER over time. But I have to ask: what precipitated this change?