Cloud Imperium’s much-delayed multiplayer space-trading title Star Citizen has been in development for well over a decade, but it now appears to be nearing completion, having raised more than $867 million through crowdfunding.
In a recent interview with La Presse (via PC Gamer), Cloud Imperium’s CEO Chris Roberts said Star Citizen is aiming for a 2027 or 2028 launch window. Its single-player spin-off title, Squadron 42, is expected to release next year.
Squadron 42 features a superstar cast, including Gary Oldman, Gillian Anderson, Mark Hamill, Andy Serkis, and Henry Cavill – to name a few.
To create digital replicas of these actors, as well as player characters and other NPCs, Cloud Imperium utilised new 4D technology created by a unique partnership including a Hollywood VFX company.
Clear Angle Studios specialises in scanning technology for film and television, including its head-scanning system Dorothy, which features 76 cameras and 1,500 lights to capture both 3D and 4D data.
Last August, it partnered with 4D facial performance capture service DI4D and post-production company TexturingXYZ to launch a new 4D facial mocap service.
GamesIndustry.biz visited Clear Angle’s headquarters at Pinewood Studios to learn more about this collaboration, including a hands-on look at Dorothy.
The interview below has been edited for brevity and clarity.
How did the companies come together to create the Dorothy setup?
Dominic Ridley, Clear Angle founder and director: We did the capture and the processing of raw data, which was passed along to DI4D for mesh tracking. After DI4D was done with it, the data came back to us. It was textured, then sent to Jeremy Celeste at TexturingXYZ, who did the map enhancements and the texture work.
The three companies worked pretty seamlessly; we all got on really well. We all had defined sections that we wanted to showcase to the world. All in all, it took around a month to put the video together.
The end visuals are largely down to TexturingXYZ and their render pipeline. The data was ours, the render pipeline and enhancements were from them.

How did each company combine their areas of expertise?
Ridley: We’ve worked with and alongside DI4D regularly over the years. We have our scanning system on set, and DI4D had theirs on set next to ours.
But since we developed Dorothy and we do the 3D and 4D scanning, the collaboration we’d like to pursue with DI4D is that we capture and process the data, and they do the mesh tracking.
So if they have jobs that come up in LA or London, they get us to do the capture, and then they do the mesh tracking. It’s a very synergistic way of working together as two companies. And then, as part of that, if people want data enhancement, it then goes to TexturingXYZ.
Everyone stands to gain, and the client gains as well, because they get an end product with a quality bar that would otherwise be very hard to reach.
Jordan Fisher, Clear Angle training manager: Everyone’s piece of work is neatly defined; we don’t do any of the stuff that DI4D does.
Ridley: It’s a harmonious relationship. No one’s treading on each other’s toes in any part of the process. It works really, really well. We have no intention of doing mesh tracking, and there’s no way we can do what [TexturingXYZ CEO] Jeremy Celeste does. And Jeremy has no interest in doing scanning, and neither does DI4D. All we want to do is the scanning, so it’s a great partnership.

What’s it like working with Cloud Imperium?
Ali Ingham, Clear Angle producer: [Cloud Imperium] are very much at the forefront of what they’re doing.
It was fantastic working together. It’s really nice seeing how much collaboration there can be between these companies who do different things. Everyone’s much more open these days about their tech and trying to work together, rather than keeping everything secret.
Ridley: Cloud Imperium are quite happy to tell people what they’re doing, because they feel like what they’re doing is quite unique. Whereas companies like Epic Games, for instance, are a lot more protective.
Cloud Imperium were fantastic collaborators, because they were super open. They told us exactly what they wanted, and although it was challenging getting to the high level that they were demanding, the brief was clear and concise. We could do that because we had good instructions, and they’ve got a very strong leadership team. They know what they want – they’re very focused on their goals, and it was nice to have that clarity and to work to a very high level.
How much further can photorealistic graphics go in a way that’s noticeable and affordable for consumers?
Sean Tracy, Cloud Imperium senior director of tools, tech and content: The potential for photorealistic graphics is far from exhausted. While the industry has achieved astonishing results, especially with the use of photogrammetry, there’s still room for innovation – particularly in performance capture.
What’s just as important as visual fidelity, however, is representation – ensuring the diversity of human experiences and appearances is authentically captured. This goes beyond pure realism; it’s about empowering players to see themselves accurately reflected in the game world.
In short, there’s still a path forward in advancing photorealism, inclusivity, and accessibility, all while keeping the technology affordable for players.
Will all players be able to benefit from this technology? Will they be able to see this level of detail on base-price consoles in comparison to the PlayStation 5 Pro and high-end PCs?
Tracy: Absolutely. This technology is designed to benefit all players, regardless of their platform. We’ve developed the kind of system that other companies license out to developers, but instead of offering it as a third-party solution, we integrate it directly into Star Citizen and Squadron 42.
Players gain full access to these advanced capabilities within the game itself, ensuring everyone can experience the level of detail, no matter their setup.

How has Cloud Imperium implemented this technology?
Ingham: For what Cloud Imperium is doing with Star Citizen, it’s about having a range of people, because everyone’s face moves differently.
Ridley: A lot of what they do in games is that they’ll capture a 3D scan in Dorothy, and then they’ll do a 4D capture and use head-mounted camera data. That camera tracks the way the face moves, and it [also] drives the 3D scan.
The high-res scan is driven by the performance from the head-mounted camera. Because there, you can run around with a gun. You’ve got this thing on your head, but at least you’re free to move and jump around, and your face will jiggle depending on how you’re moving, which drives the high-res scan – that’s often how it works.
How affordable is performance capture becoming for smaller or indie studios, or is it still in the realm of AAA devs only?
Tracy: Performance capture is more accessible now than ever. Advances in technology have led to more affordable hardware and software options, making it viable for smaller studios. While high-end setups for large-scale shoots with multiple actors can still be costly, there are scalable solutions available for indie developers.
Tools and software have been democratised to work across a wide range of configurations, meaning mocap is no longer exclusive to AAA studios. For most developers, it’s now easier than ever to integrate this technology into their projects, regardless of budget.
Would Clear Angle like to collaborate with more games studios in the future?
Ridley: That would be the goal. We do a lot of collaborations across all the productions we’re on.
It’s certainly what we’re trying to do, advancing more into the gaming side of things. If there were more games companies interested in this type of high-level scanning, then yes, we’re up for it. We want them to know about us – that’s the key here.
That’s the kind of narrative that we would like the world to have about us, so that we can open up this tech to everyone – even with environment scanning.