Yeah, but assuming episodes of GoT are 1 hour long, you still need to multiply by the number of episodes, which for our purposes can probably be assumed to be all of them.
Personally I'd tend to discount the energy used by the TV, by the way. People would likely watch something else if they couldn't watch GoT. But the energy used in the data centre probably isn't as small as you suspect: each data centre module probably uses a few kW and can serve perhaps a good few dozen people watching at the same time. Apparently episodes normally run for an hour with some time allowed for advertising, so roughly 50 minutes if you've paid to stream it ad-free, with specials running over 80 minutes (call it 82). That's 73 episodes in total, and if say 10% of those are specials, it means a runtime for most people of (50/60*0.9 + 82/60*0.1)*73 ≈ 64.7 hours.
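If you want to sanity-check that runtime figure, here's a quick Python sketch of the same arithmetic. The 82-minute special length and the 10% specials split are just my assumptions from above, not official numbers:

```python
# Back-of-envelope total runtime of the series, per viewer.
EPISODE_MIN = 50      # assumed ad-free episode length, minutes
SPECIAL_MIN = 82      # assumed special length, minutes
EPISODES = 73         # total episodes
SPECIAL_SHARE = 0.10  # assumed fraction of episodes that are specials

avg_minutes = EPISODE_MIN * (1 - SPECIAL_SHARE) + SPECIAL_MIN * SPECIAL_SHARE
total_hours = avg_minutes / 60 * EPISODES
print(f"{total_hours:.1f} hours")  # ~64.7 hours
```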
Here we get into more hand-wavy, back-of-a-beer-mat numbers. Say these data clusters use 5 kW and can serve 100 simultaneous views; that's 5*64.7 kWh for any 100 viewers, or about 324 kWh, spread over whatever time they actually decide to watch the series. Multiply that by 18.6 million viewers divided by the 100 accounted for by each data cluster, and I make that about 60 GWh.
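Continuing the sketch with the same assumed numbers (5 kW per cluster, 100 simultaneous streams per cluster, 18.6 million viewers watching the whole series):

```python
CLUSTER_KW = 5             # assumed power draw per data cluster
STREAMS_PER_CLUSTER = 100  # assumed simultaneous streams per cluster
VIEWERS = 18.6e6           # assumed viewers watching the full series
SERIES_HOURS = 64.7        # from the runtime estimate above

kwh_per_100_viewers = CLUSTER_KW * SERIES_HOURS            # ~324 kWh
total_kwh = kwh_per_100_viewers * VIEWERS / STREAMS_PER_CLUSTER
print(f"{total_kwh / 1e6:.1f} GWh")  # ~60.2 GWh
```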
Now I still make that a fraction of a percent of the 7 TWh figure for bitcoin. The difference is in the multiples: a GWh is a million kWh, while a TWh is a billion kWh, so 1 GWh is only 0.1% of 1 TWh, and 60 GWh works out to under 1% of 7 TWh. To be fair, I dunno if your figure includes those illegally sharing the video files, and they're probably using less efficient computers to share and view the files, but you'd need a lot of them to even get to 10% of what bitcoin uses, I reckon.
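And the final comparison, just to make the orders of magnitude explicit (the 7 TWh bitcoin figure is the one quoted upthread, not something I've checked myself):

```python
STREAMING_GWH = 60.2  # rough estimate from above
BITCOIN_TWH = 7       # bitcoin figure quoted upthread

share = STREAMING_GWH / (BITCOIN_TWH * 1000)  # convert TWh to GWh
print(f"{share:.2%}")  # ~0.86% of the bitcoin figure
```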