Last year Los Angeles-based Kite & Lightning won 2018’s Real-Time Live!, SIGGRAPH’s showcase of live demonstrations. By taking home the winning award, they beat an impressive field of entries. Their “Democratizing MoCap: Real-Time Full-Performance Motion Capture with an iPhone X, Xsens, IKINEMA, and Unreal Engine” presentation showed a real-time character, “Beby”, driven by a combination of body and facial capture in UE4. The Bebylon presentation, which won Best Real-Time Graphics and Interactivity, featured a live performance driving fully expressive, real-time rendered CG characters, proving how an ingenious combination of readily available technology can level the playing field creatively and produce entertaining results.

Cory Strassburger co-founded Kite & Lightning with Ikrima Elhassan. Since SIGGRAPH 2018, the team has been working on their game Bebylon Battle Royale, and they are now making a short film (or, as they call it, a ‘very short feature film’). We caught up with Strassburger recently to discuss how Bebyface and the Bebylon game are progressing.

Bebylon was the inspiration for the Real-Time Live! demonstration, explains Strassburger. “I was desperate for a way to bring our game’s immortal Beby characters to life, and facial capture was the big hurdle…I knew Apple had bought a company called Faceshift who at the time was already democratizing facial capture on the desktop.” He got curious about how much of their tech made it into the iPhone X and ARKit. “My first tests showed they managed to miniaturise the whole core of their face capture tech into the iPhone X and that was very, very exciting.”

Strassburger did not know about the Real-Time Live! event, even though he had been going to SIGGRAPH for 15 years. “I learnt about it from one of the tech guys, Chris from Xsens.”