TV shows that teach us more about American History

I genuinely miss Anthony Bourdain’s Parts Unknown. The CNN show introduced us to parts of the world many of us would never get the chance to know or visit in our lifetimes. His passing left a big hole in what we could learn about the world. As someone who didn’t grow up in the US, I also learned a lot from it about where I live.

In the years since, TV shows about food have touched on American history and parts of this country that many of us don’t know. If you’re interested in food and in learning the American history you weren’t taught in school, I can’t recommend these shows enough:

And on the subject of race in America, W. Kamau Bell’s new book with Kate Schatz, Do the Work!, is an excellent workbook for personal anti-racism work. This is a pivotal book, and everyone should buy it. Trust me.

I hope these shows teach you something new about the US and its history. It’s a crucial time to make sure history, in its entirety, is shared.