zLense Depth revolutionises 3D virtual studio FX and graphics, enabling powerful new augmented reality (AR) techniques and seamless interaction with AR elements
London, 9 September 2015 – zLense, a specialist provider of standalone real-time depth-sensing and modelling platforms for the film, broadcast and gaming industries, today announced the release of zLense Depth, a standalone digital lens add-on for professional film and broadcast cameras that enables virtual studio production teams to generate shots that are unachievable with traditional layer keying solutions.
Revolutionary from both a technical and commercial perspective, zLense Depth simplifies layering workflows, generating the all-important z-composite images that make it possible for presenters to freely move within, interact with and control the virtual environment.
Significantly streamlining the 3D compositing process, the solution enables broadcasters to extend the functionality of their existing VR studio set-up with ease to achieve impressive image-processing and dynamic virtual reality (VR) and augmented reality (AR) capabilities.
“Traditional layer keying solutions are, by their very nature, extremely restrictive in terms of the creative process,” explains Bruno P. Gyorgy, President of zLense. “zLense Depth enriches production values by augmenting the existing VR set with real-time, data-driven 3D objects, graphics and animations that blend virtual and real elements.”
Supporting the creation of more photorealistic virtual elements, the zLense Depth production platform features a matte box sensor unit, which can be mounted on any camera rig, and a built-in rendering engine which generates the z layer that is integrated into the existing VR studio layering system.
The standalone solution, which combines depth-sensing and image-processing technologies, allows full 360-degree freedom of camera movement, giving presenters and anchors significantly greater freedom of performance. Supporting live transmissions or pre-recorded productions, the platform can be used alongside existing rendering engines, VR systems and tracking technologies.
“zLense Depth eliminates the technical limitations and complexities that typically hamper AR studio productions, making it possible for presenters to interact with the virtual environment with ease,” continues Bruno P. Gyorgy. “Representing the ideal complementary technology for today’s AR studios, zLense Depth makes it possible to embed augmented reality simulations, infographics and visualisations with real people or places, and enable seamless interaction between the talent and AR elements.”
The zLense Depth solution features and capabilities include:
- Captures 3D space in real time, generating a high-resolution depth map that is in sync with the camera’s RGB frames
- Attaches as a matte box to any camera type (rigged with standard 15mm bars)
- Includes an SDK that enables interfacing with any rendering software; the SDK is currently being implemented by all major rendering engine software producers
- Works in any studio environment
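The depth-map keying described above can be illustrated in miniature: for each pixel, the compositor compares the camera’s depth map against the virtual scene’s depth buffer and keeps whichever source is nearer, so talent can move in front of or behind virtual objects without a fixed layer order. The following is an illustrative NumPy sketch of that per-pixel decision, not the zLense SDK; the function name and inputs are hypothetical.

```python
import numpy as np

def depth_composite(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel depth keying: keep whichever source (real camera frame
    or rendered virtual layer) is nearer to the camera at each pixel.
    real_rgb/virt_rgb: (H, W, 3) images; real_depth/virt_depth: (H, W)
    distances from the camera."""
    nearer_real = real_depth <= virt_depth            # True where the real scene wins
    # Broadcast the (H, W) mask over the 3 colour channels
    return np.where(nearer_real[..., None], real_rgb, virt_rgb)

# Tiny 1x2-pixel example: real element nearer on the left pixel,
# virtual element nearer on the right pixel.
real_rgb   = np.array([[[255, 0, 0], [255, 0, 0]]], dtype=np.uint8)   # red
virt_rgb   = np.array([[[0, 0, 255], [0, 0, 255]]], dtype=np.uint8)   # blue
real_depth = np.array([[1.0, 5.0]])   # metres from camera
virt_depth = np.array([[2.0, 3.0]])
out = depth_composite(real_rgb, real_depth, virt_rgb, virt_depth)
# Left pixel stays red (real is nearer); right pixel becomes blue (virtual is nearer).
```

Because the depth map is synchronised with the camera’s RGB frames, this comparison can run frame by frame in real time, which is what frees the presenter from the fixed foreground/background layers of traditional keying.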
“This is a one-stop solution that’s set to revolutionise the capabilities of VR studios when it comes to the delivery of enticing and immersive images for viewers,” concludes Bruno P. Gyorgy.
The zLense Depth platform can be seen in action at the IBC 2015 Exhibition, 11-15 September 2015, at the RAI in Amsterdam, Hall 1, Stand #1.A03a.
Zinemath, a leader in reinventing how professional moving images will be processed in the future, is the producer of zLense, a revolutionary standalone real-time depth-sensing and modelling platform that adds a third dimension to the filming process. zLense is the first depth-mapping camera accessory optimised for broadcasters and cinema previsualisation. With an R&D centre in Budapest, Zinemath, part of the Luxembourg-based Docler Group, is spreading this new vision across the television, video game, film and mobile technology sectors. For more information please visit http://www.zlense.com.
For further information please contact
James Cooper or Penny Flood
Email: firstname.lastname@example.org / email@example.com