Over the past century, the Walt Disney Company has had an outsized influence on American society.
Whether through its theme parks and presence in the travel industry or through the movies and other content it creates, the company touches many people’s lives. Sometimes, in more ways than you might think!