I’m a GPU, how about you?
This piece, from London-based Fearghal Kelly, is the first in a series of posts from KIT’s international team focusing on more technical issues.
It’s easy to get distracted by the glossy black shells and colourful interfaces of the many thousands of video platforms in existence today. As consumers and users, we often lose sight of the advancements that got us here. KIT digital is right in the middle of building these platforms, and we are fortunate enough to work side by side with the key technology enablers.
Underneath the usual 3LAs (Three Letter Acronyms) like Digital Rights Management (DRM) and Content Delivery Network (CDN), there are some less familiar names. In this post, we’d like to celebrate an unsung hero – the Graphics Processing Unit or GPU.
Not only is the GPU at the heart of the video revolution, but while you read this, the world’s GPUs are acting in tandem to disentangle the proteome and create novel medical treatments that will eventually directly impact your life.*
A little background
In a nutshell, the graphical interfaces we love to use on our smartphones, PCs and games consoles are computationally complex. Displaying a beautifully crafted and animated ‘swipe,’ ‘drag,’ or ‘zoom’ is demanding on chipsets. Watching video is even more demanding, with up to 120 screen refreshes every second. We expect our devices to run an Operating System (OS), applications (apps) AND give us this beautiful graphical experience… all at the same time.
Standard Central Processing Units (CPUs) simply could not cut it – being visually stunning while running the farm, so to speak. Anyone who has experienced jittery video playback on their desktop has likely seen this processing limitation in action, triggered by something as seemingly simple as an email arriving in the background.
Make My Player Work!
The engineering approach to this problem has been to segregate display processing from the more mundane, and less computationally expensive, OS processing. The common approach is to build the GPU into the video card of a PC or Mac; these cards are dedicated to high-powered graphics display at resolutions up to and beyond 1080p. Alternatively, CPU designers frequently include a separate graphics processing unit on the CPU die itself, operating almost independently of, yet in parallel with, the main CPU cores. No more jitter.**
Got It! So What?
The GPU is an extremely powerful computer in its own right, designed to address the mathematical problem of displaying 24 frames of video at 1080p in full color. That’s a lot of data transposition. This video data can come from a movie file or from the animated display of your interaction with Windows Aero or the iOS Retina display.
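To put “a lot of data transposition” in numbers, here is a quick back-of-the-envelope calculation, assuming uncompressed 24-bit colour (the figures are illustrative, not a spec of any particular pipeline):

```python
# Back-of-the-envelope: raw pixel data for 1080p video at 24 frames
# per second, assuming uncompressed 24-bit (3 bytes per pixel) colour.
width, height = 1920, 1080
bytes_per_pixel = 3          # 8 bits each for red, green and blue
fps = 24

pixels_per_frame = width * height                  # 2,073,600 pixels
bytes_per_frame = pixels_per_frame * bytes_per_pixel
bytes_per_second = bytes_per_frame * fps

print(f"{bytes_per_second / 1e6:.0f} MB of pixel data every second")
# → roughly 149 MB of pixel data every second
```

Nearly 150 MB of pixel data to move and transform every single second, before compression, scaling or effects are even considered – and that is exactly the kind of repetitive, parallel workload a GPU is built for.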
That’s also a lot of raw processing power – power that sits there doing nothing a lot of the time while you write a document or read an email. On a server in a data center, with no video display requirement, the GPU is doubly idle.
The video processing industry realised that if it could tap into the GPU, it could use it for tasks such as transcoding and transrating video on a commercial scale. Because the GPU is dedicated, conversion throughput is frequently faster than real time – faster than a human could watch the video back on a VCR. An added bonus is lower cost, as these CPUs and graphics cards are significantly cheaper than dedicated HD-SDI cards, for example. The caveat, of course, is that we are working with files here, not tape decks. Cloud-based transcoding was an awakening, but video processing remained device specific.
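“Faster than real time” simply means the transcoder converts more frames per second than the playback rate. A small sketch with hypothetical throughput numbers (the 90 fps figure is illustrative, not a benchmark of any particular GPU):

```python
# Illustrative only: the throughput figure below is hypothetical,
# not a measured benchmark of any particular GPU.
playback_fps = 30            # the rate a viewer watches at
transcode_fps = 90           # frames the GPU converts per second

speedup = transcode_fps / playback_fps   # 3x faster than real time

asset_minutes = 60           # a one-hour programme
print(f"Transcoded in {asset_minutes / speedup:.0f} minutes")
# → Transcoded in 20 minutes
```

At three times real time, a one-hour asset is ready in twenty minutes – which is what makes large-scale file-based transcoding commercially viable.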
Around this time, academics and medical researchers also realized that this latent processing power could be pooled for social good.
GPU and You
Those of you who used SETI@home to try and find ET in the 90s will appreciate this more useful application of the GPU. The Folding@home initiative was set up in 2000 to harness the latent power of CPUs and, more recently, GPUs. Using distributed computing technology, the project reaches over the internet into the GPU in your home PC or data centre server when it’s not required for anything else. By combining all these GPUs, researchers effectively have a supercomputer that accelerates the modelling of diseases such as Alzheimer’s and various cancers. One of the program’s many goals is personalized medical care, through understanding of the unique proteome of each and every human being. The focus is on GPUs again because of the vast data (frame) transposition required to model proteome cycles. The project has even created an application to harness the PlayStation 3’s GPU, which is built for 3D processing and therefore vast data transposition.
KIT and the GPU
A large part of our focus here at KIT is the workflow management and orchestration of not just one video file, but the tens of thousands we handle each day for clients like AT&T, Liberty Global and BSkyB. Every one of these videos starts life on a tape and progresses through tens of transcode and QA steps before you watch it on one of the many playback platforms we have built.
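Conceptually, that orchestration is a pipeline of ordered steps applied to every asset. A toy sketch of the idea – the step names here are hypothetical, not KIT’s actual workflow:

```python
# Toy sketch of workflow orchestration. The step names are
# hypothetical illustrations, not KIT's actual pipeline.
def ingest(history):    return history + ["ingested from tape"]
def transcode(history): return history + ["transcoded on a GPU"]
def qa_check(history):  return history + ["passed QA"]
def publish(history):   return history + ["published to platform"]

PIPELINE = [ingest, transcode, qa_check, publish]

def orchestrate(asset_name):
    """Run one asset through every pipeline step, in order."""
    history = [asset_name]
    for step in PIPELINE:
        history = step(history)
    return history

print(orchestrate("asset-001"))
```

A real system multiplies this by tens of thousands of assets a day, with many more steps, retries and parallel branches – but the ordering guarantee is the heart of it.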
Many of these steps are processed using our partners’ software talking to a GPU somewhere in the world. Our job is to coordinate these steps to create an experience for a consumer that is as flawless and seamless as watching TV. By the time you watch a video at home, it’s been through multiple GPUs, including the one in your Set Top Box (STB) or Connected Digital Television. The GPU can facilitate work and entertainment and when it’s not needed, its processing power can be used to fuel medical breakthroughs.

* Folding.stanford.edu
** If you have fiber or live within a mile of an ADSL exchange