In 2010, Microsoft unveiled the Kinect, touting it as a revolutionary new gaming device. Swing an imaginary lightsaber and that would be translated onscreen. Throw a football and it would be caught on your TV. Fifteen years later, we know the Kinect as an expensive failure. Microsoft overestimated the demand for playing games with your body. But the Kinect did still turn out to be revolutionary - just not for gaming.
It became a robotics game changer, enjoyed a brief dalliance with pornography, and is now upsold as a ghost hunting toy. None of which would have been possible had a community of hackers not come together to fashion open source drivers for the Kinect, freeing it from the limitations of being locked to the Xbox 360 and opening new frontiers of experimentation, creative expression, and commercial advancement.
Technically, nothing the Kinect did was entirely new, says Memo Akten, an artist working with code, data, and AI and an assistant professor at the University of California, San Diego. The small camera projected a grid of infrared dots and read deformities in that pattern to discern depth. In an early example of machine learning, it recognized human limbs and gestures. "Those capabilities existed in research and industrial systems for many years," he adds. Those systems cost in the region of $5,000 to $12,000. Here was Microsoft selling a variation of the technology for $150.
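Part of what made the Kinect so approachable once open drivers existed is how simple its depth output is: each pixel is an 11-bit raw disparity value. As a minimal sketch of what hobbyists did with that data, here is a commonly cited community approximation for converting raw values to metric distance; the constants are from early OpenKinect-era calibration efforts and are illustrative, not Microsoft's official calibration.

```python
from typing import Optional


def kinect_raw_to_meters(raw: int) -> Optional[float]:
    """Convert an 11-bit Kinect raw disparity value (0-2047) to meters.

    Uses an empirical approximation popularized by the early OpenKinect
    community; the constants are illustrative, not official calibration.
    """
    if not 0 <= raw <= 2047:
        raise ValueError("raw disparity must be in 0..2047")
    if raw == 2047:
        return None  # sensor marker for "no reading" (e.g. IR shadow)
    # Larger raw values mean farther away; the approximation only holds
    # within the sensor's working range and diverges as the denominator
    # approaches zero.
    return 1.0 / (raw * -0.0030711016 + 3.3309495161)
```

Raw value 2047 is the sensor's "no reading" marker (cast by infrared shadows or reflective surfaces), and the formula is only meaningful within the camera's working range of roughly half a meter to several meters.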
What had previously required very expensive equipment and/or complex multi-camera setups with manual alignments, calibration, and correspondence was now available off the shelf. Kyle Machulis, CEO of Nonpolynomial and founder of buttplug.io - an open source project for controlling sex toys - was working on $250,000 mapping systems not dissimilar to the Kinect in 2010. He quickly recognized the peripheral as an opportunity to "democratize that technology."
On November 4th, 2010, Machulis headed out to pick up a Kinect to reverse engineer. An hour later, New York-based DIY electronics producer Adafruit announced OpenKinect: a bounty of $1,000 - a prize it would later raise to $3,000 - for whoever offered evidence of the Kinect working on any operating system.
"Imagine being able to use this off the shelf camera for Xbox for Mac, Linux, Win, embedded systems, robotics, etc.," Adafruit wrote in its announcement. "We know Microsoft isn't developing this device for FIRST Robotics, but we could! Let's reverse engineer this together, get the RGB and distance out of it and make cool stuff!"
Doing so was not a simple case of taking the Kinect apart or plugging it in. Though it could connect to a PC via USB, the way the two devices communicated was unknown, and the only way to find out was to watch the Kinect and Xbox 360 speaking to one another.
"Since the Kinect didn't have PC drivers, we needed this piece of hardware called a USB sniffer," Machulis tells The Verge. A colloquial term for a protocol analyzer, a USB sniffer is a tool that records the data passing between the Kinect and the Xbox 360. In 2010, one cost $1,200 and, Machulis says, "I really didn't want to buy it."
Some information could be gleaned by simply connecting the Kinect to a PC, but it was mostly unhelpful - power consumption, packet sizes, and confirmation that the Kinect was, in fact, a camera. Hackers could start sending random packets and possibly work something out, but that was just as liable to brick the Kinect entirely.
Hackers and reverse engineers around the world were raring to go. But it appeared that whoever got their hands on a sniffer would win the bounty almost by default. That race wasn't just for the money, however, but also for the cachet of being the first to hack such a high-profile device. With the community stalled over the massive expense - almost half the bounty - the door was open for someone outside the community to snatch the glory away.
To keep the contest equitable - and, perhaps, to maintain the bounty's and the company's momentum - Adafruit removed the roadblock itself: it bought an analyzer, captured the traffic between a Kinect and an Xbox 360, and released the data for anyone to study. This set off a flurry of creative problem-solving as hackers around the globe worked to crack the code.
In the end, it was a team effort that succeeded in opening up the Kinect to the world. And what a world it has become - one where robots map environments in real time, surgeons examine scans without touching a screen, rapid 3D models of rooms and objects are a real possibility, teachers use the Kinect as an interactive learning device, and, if you really want, someone can control a sex toy over a video call.
"This thing on the front camera," Watson says, pointing to the black bar at the top of his iPhone's screen, "that, I think, is a miniature Kinect."
He's almost wistful. Apple purchased PrimeSense, the Israeli company behind the Kinect's sensor technology, in 2013. "I was so disappointed," he says, "because I just knew that was the end of the Kinect technology."
The sale prompted Microsoft to explore a new sensing system for its next Kinect - OpenKinect went and hacked that one too. Microsoft discontinued Kinect for Windows shortly after its release in 2014 and shut down manufacturing of the original Kinect in 2017 as sales diminished and the company focused on the Kinect 2 and development of the third-generation Azure Kinect. Yet the technology has lived on, incorporated into countless Apple devices as part of their facial recognition and 3D mapping, to the point of being ubiquitous.
That sense of loss extends, in part, to the internet from which OpenKinect emerged. "It was way more punk rock!" Watson laughs. "No one had really established the rules."
In 2010, the internet was unruly; it had yet to coalesce around the hubs it has today. Piracy was in its heyday. AlexNet - the neural network architecture that paved the way for modern AI models like Stable Diffusion - was still two years off, and GitHub, now an online staple, had launched only three years before (the same year as Tumblr and the iPhone's reveal). "We were only four or five years into the maker movement," Machulis says. "The idea of a product like this that has taken a massive amount of R&D cost to be put out and hacked this quickly - it was basically unheard of."
Now, with better tools, it's far more common. Which is part of why we don't hear about it as much as before - that, and not being attached to, as Machulis puts it, a "shining sun" of a product. "It is in general easier to make some of this stuff," Machulis continues. "There's way more communities online, there's more content creators talking about this stuff." The kind of effort that went into opening up the Kinect has lost some of its buccaneering flavor, some of its sense of counterculture, simply by virtue of becoming more mainstream and, in many ways, more frequent. "I don't think anything fizzled out," Machulis adds. "I think it just got quieter and spread out."
Still, there is a sense that how we approach technology has changed irreparably. "I think technology has become more of a product now and less something that you get involved with. That's kind of sad," Watson adds. "I kind of fear that the current generation is growing up just thinking the internet is inflexible. It is the way it is and nothing will ever change. We were constantly surrounded by that change. And it really made things feel more free and more open."
Instead, communities like OpenKinect may simply be invisible without a subject as high-profile as the Kinect. As the economic bubble inflating around AI grows more opaque, with corporate interests scrambling to turn the technology into a profitable industry, hackers have turned their attention to open sourcing its models. Attached to a sprawling technology constantly in the public eye, AI may well grant us our next big communal reverse engineering effort to echo OpenKinect.
"When the original Kinect came out, it took what might have been 100 hours of me writing computer vision code with a standard black-and-white infrared camera and gave me something that would shave that time off our development for a project and give better-quality results," Watson says. "AI with code is doing a similar thing; they just take away the painful aspects of the work and let us focus on the creative part."
Now, 15 years after hackers opened the Kinect to computer vision creatives, AI can do everything it did better, faster, and using standard RGB cameras. Watson shows The Verge a video of AI's real-time tracking, its superior handling of occlusion on display as members of a K-pop group weave around one another, limbs and digits blitzing across the screen, each dancer marked by a colored skeleton - all pulled from an ordinary camera.
"AI is made to make decisions about many things very quickly, and we need a decision about every pixel in an image," Machulis says. "Since we can tell so much just from images now we may not need all the extra hardware, with methods like gaussian splatting we're already seeing that ability to, what looks like, create information from thin air."
"Next time we chat, we might have gone back to infrared cameras," Watson says, before adding: "AI might kill the Kinect.
Written by: Slick Manchetz | The Citizen Edition