It's ending after next season. They may have already decided that next season would be the last, and if they hadn't, they needed to. The show has gotten really boring.
Pedophilia will be the next right that Hollywood and liberals will fight for. You have to love Hollywood. They are constantly pushing movies with inappropriate scenes. Yet everyone is all up in arms for some reason
I didn't care for Season 4 that much. I started Season 5 but only got about four episodes in and haven't finished it.
The first season was great, but after that it's gone steadily downhill. I really don't get why it's still praised so much; it's not even remotely close to as good as it was.