WW 701: The Bailiwick of Guernsey

Beep boop - this is a robot. A new show has been posted to TWiT…

What are your thoughts about today’s show? We’d love to hear from you!

I think the evolution of Windows Update naming conventions (Feature Pack, Critical, Cumulative, etc.) grew out of enterprises wanting more contextual information on how urgent it was to get a given patch into production. Like all things - what started out with good intentions (presumably) has grown into an obtuse system that doesn't really convey the information it's intended to.

I do applaud Microsoft's efforts to decouple more and more elements of Windows so that they can be updated/patched independently of an OS update. I contrast this with Apple, which has to release a new OS version just to add emojis to iMessage.


Leo wasn't clear about the name of his new toy, the electronic notepad. At the end of discussions like that, repeat the names of whatever device you've been discussing. This listener is usually doing other things while listening, frequently gets more interested as the conversation goes on, and then has to backtrack to the first mention of the topic or device. Short, simple names based on letters need to be repeated clearly.


I don’t think it’s @Leo’s fault that the product is poorly named and unmemorable.


Not faulting Uncle Leo… just saying that I didn't catch the name… which happens often because I listen to TWiT.tv in the background.

Thanks for the reply… a nice gift for the gf, I just don't know if I like her $400 worth.

Yeah, that looks great, but not at $400.

Re: cloud gaming. I agree with Leo that if high-end games can be effectively rendered and presented to the end user, it's sort of a gold-standard litmus test for what can be accomplished via hosted services in computing. In fact, Nvidia's GeForce Now product is essentially a preview of Microsoft's upcoming Cloud PC service. I don't know if this is still the case, but I was able to escape to an explorer.exe shell pretty easily when I tried it out at launch time.

As an aside, the company I work for has been offering this service for years. We PoC'd our first GPU-driven workloads in 2016, rendering CAD software for engineering firms. My internal "PoC" was running Doom (2016) on a Chromebook :smiley:. The challenge for our users has never been the ability to render in our datacenters; it's almost always last-mile connectivity. The WAN is a turbulent ocean rather than a placid lake, a fact that is amplified by latency-sensitive applications like this.

However, latency is still the limiting factor for serious gaming, and I don't believe this is something that will ever be overcome. Paul made the point that rendering a Call of Duty match on a system in a datacenter would eliminate network latency to other players. But consider the fact that the majority of latency experienced by the player, especially in CoD games, is caused by the netcode - the portion of the game engine that processes player inputs and synchronizes them to the shared virtual world. Based on YouTuber Battlenonsense's analysis, WAN network latency accounts for a mere 17% of the total average latency experienced by the end user in the new CoD. (By the way, Paul, the latest CoD's netcode appears to be particularly atrocious, which probably explains your negative multiplayer experiences.)

Their testing scenario showed 24ms of average in-game latency. In my experience with Google's Stadia service, I see around 8-14ms of latency to their gaming servers. So with a game like CoD, I'd be cutting that 17% figure down to around 7% - not much of a difference (assuming I'd experience similar game-server latency as BN). But WAIT! Latency has now been added to our control inputs, as they must traverse the WAN. Google might have a lot of Stadia datacenters, but the latency still won't be 0ms. So let's add that ~12ms of latency back on, and consider the fact that the game is now unaware of this additional latency. This is significant because nearly all games these days have mechanisms built into their netcode to mitigate network lag and jitter. We've simply shifted the latency from a realm visible to the game engine to one that's not, thus preventing these clever mechanisms from helping out.
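To make the arithmetic concrete, here's a back-of-the-envelope sketch using the figures above (24ms average in-game latency, ~17% of it attributable to the WAN, and ~12ms round trip to a cloud datacenter). The simplification that the old WAN hop disappears entirely in the cloud scenario is my assumption, not a measurement:

```python
# Rough latency comparison: local client vs. cloud-rendered client.
# Figures from the post: 24 ms avg in-game latency (Battlenonsense's CoD
# measurement), ~17% of which is WAN latency; ~12 ms assumed round trip
# to the cloud gaming datacenter for inputs and video.

TOTAL_LOCAL_MS = 24.0   # average in-game latency on a local client
WAN_SHARE = 0.17        # portion of that attributable to WAN network latency
CLOUD_RTT_MS = 12.0     # added round trip to the cloud datacenter (assumed)

wan_local_ms = TOTAL_LOCAL_MS * WAN_SHARE       # ~4 ms crosses the WAN today
engine_ms = TOTAL_LOCAL_MS - wan_local_ms       # latency inside the netcode/engine

# Cloud scenario (simplified): the render host sits next to the game server,
# so assume the old WAN hop vanishes - but every input and frame now crosses
# the WAN to the player, and that hop is invisible to the engine's
# lag-compensation mechanisms.
cloud_total_ms = engine_ms + CLOUD_RTT_MS

print(f"local: {TOTAL_LOCAL_MS:.1f} ms total ({wan_local_ms:.1f} ms visible WAN)")
print(f"cloud: {cloud_total_ms:.1f} ms total ({CLOUD_RTT_MS:.1f} ms WAN, hidden from netcode)")
```

Under these assumptions the cloud path ends up with *more* total latency (~32ms vs. 24ms), and the WAN portion is no longer something the netcode can compensate for.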

This is all super esoteric, and for the majority of casual gamers it doesn't make a lick of difference. But my point is that this shift to cloud gaming is going to result in a poorer product. It'll work fine for Facebook games, but those don't cost much compute power to render locally anyway. I just hate to see things like this diluted for the sake of convenience.

edit: wow that was a rant.

TL;DR: I'm a gaming snob who is not on board with cloud gaming.


Great show - have to say the audio quality is superb with the move away from Skype.