The Walking Dead depicts life after a zombie apocalypse, following a group of survivors, led by police officer Rick Grimes, as they search for a safe and secure home.
Usually because they depict rape and sexual violence against women in a medium meant for entertainment. Other shows, like The Walking Dead, are just awful.