For those who love the show and have watched it since episode one, getting to season 4 and seeing everything change is a big surprise.
For three seasons you won't see anything about politics. But in season 4, out of nowhere, the plot brings strong statements about feminism, racism, political views, and so on. Don't get me wrong, I like political shows and the visibility they bring to important conversations. BUT the way season 4 approached those subjects was too forced and out of nowhere. Politics had never been the point of the show, and from that point on it became the center of the plot.
Besides that, I hate how unbalanced the show was, not portraying a single good Republican or a strong conservative figure.
It just pushes the envelope too far for nothing at all.