
The Truth About Hollywood Cowboys
Dan Snow's History Hit
Why Did Hollywood Turn to Westerns?
Why did Hollywood turn to westerns? I guess at some stage they just became hugely fashionable. My dad, as a kid, said all he did was watch westerns. But if you watch a western, it's always the Native American who is shown as the aggressor, shown as evil, shown as wicked. There's nothing simple about our history.

And let's talk about cattle. That was business, right? I mean, that was the economic driver. That's the way that expansion west was largely sustained, I'm guessing, before you could move crops and cereals over huge distances. Cattle was another huge business. If you had 300…


