Not much action here .... what are people thinking about this show now?
This show seemed to make such a big splash, with lots
of people lauding it for being so edgy. I watched for a
while, mostly out of interest in Damian Lewis, who I
thought did such an outstanding job in "Homeland".
After finally finishing the last season, I have to say this
series pretty much disgusts me all around.
Not only are all the characters repulsive and
immoral, but the story itself seems to have no
arc or moral foundation, and it just gets worse and
worse as the episodes go by. I am pretty sure that if
and when there is a new season, I will not be jumping
on it anytime soon, maybe not ever.
Does anyone agree, or disagree? Do you like this show?