Why Did the Lytro Cinema Camera Fail?
When introduced, the Lytro Cinema Camera felt like it had traveled from the future, sure to revolutionize filmmaking forever. What happened?
In 2016, we wrote an article called “This Technology Is Going to Change Filmmaking Forever,” and well, that headline didn’t come true.
Understanding light field technology is like going back to school for a difficult physics class. But its promise and the inventiveness of the experimenters behind it, especially in the computer age, have guaranteed its legacy and, hopefully, its eventual commercial success.
That being said, if this technology is so compelling, why (so far) has it failed to succeed and thrive?
Brief History of Light Fields
Formal study of the behavior of light goes back at least to British scientist Michael Faraday, crystallized in an 1846 lecture called “Thoughts on Ray Vibrations.” His premise was that light should be interpreted as a field. “Light traveling in every direction through every point in space” was how researcher Andrey Gershun described a light field in 1936.
But it wasn’t until the ’90s that the term computational photography was first used, describing the measurement and processing of light fields by computer. This was the study of light fields, photography, and high-performance imaging, which today has morphed into the general use of computing power to enable new imaging technology—as we see in smartphones and increasingly in the latest mirrorless cameras.
The technology was so seductive that engineers looked to it for solutions. Famously, Malaysian engineer Ren Ng, while attempting to photograph a particularly lively youngster, wanted a way of taking the picture without the difficulty of focusing on a tricky moving subject. Why couldn’t he choose his focus later on? Ng took this “shoot first, focus later” idea as the tenet for his invention, Lytro.
At the time, Stanford University’s own light field experiment was a huge device comprising an array of 100 digital cameras linked to supercomputers. Ng set out to miniaturize this concept into a consumer product.
His idea attracted $50 million of venture capital and over forty engineers, and his company was born in 2006.
It took Lytro another six years to launch a consumer camera, but in the meantime, the imaging world had radically changed. With several iPhone launches, photo-sharing websites, and extensive R&D from an aggressively marketed photography industry, Lytro was launching into a market that was looking beyond a neat trick that allowed you to focus after the fact.
Goodbye Lytro Photo, Hello Lytro Video
Lytro cameras offered living pictures that created a new level of immersion: you simply clicked on different parts of a picture to re-focus. There was no waiting for an autofocus sensor to work; with Lytro, you just pointed your camera at a scene and chose the focus later in software.
However, in 2012, and again in 2014 when the new Illum camera was launched, this clever but ultimately limited sleight of hand wasn’t enough for a market that wanted to share and emotionally connect with 2D pictures online as quickly as possible—the world had turned.
The industry had rejected Lytro and pigeonholed it as photographically limited, with awkward post-production software, at a time when computational photography on your phone offered instant gratification.
Other companies like Adobe had also tried to market a light field camera and suffered similar rejections from a capricious audience.
Yet there was a professional market that was perhaps more mature in its acceptance of new technology and might welcome light field science. So, trading on the Lytro name, the company aimed at the pro video world, where the full range of advantages that light fields held could be exploited.
Light Field Cameras for Filmmaking
When Premium Beat first looked at the Lytro Cinema Camera in 2016, its unveiling a month earlier at the NAB Show in Las Vegas was still fresh in the memory.
Lytro had introduced its new capture system in a standard Las Vegas convention room off the main show floor. Unfortunately, the company severely miscalculated the turnout for the demonstration: some 600 knowledge-hungry attendees triggered fire safety protocols for such a small suite and spilled into the corridor to get a glimpse of Lytro.
The demo was extremely exciting, lifting a veil on what light fields could offer the industry. So much was explained that it was hard to pinpoint where Lytro Cinema would be welcomed first.
For the first time, cinematographers were offered something more than working with light intensity alone. Light fields would let you work in 3D space with virtual apertures, changing f-stops and shutter angles and creating stereo footage with ease, all from a single camera.
We haven’t even touched on the customization of motion blur through virtual shutter angles or even re-lighting.
You could, in effect, ignore the exposure triangle of shutter speed vs. aperture vs. ISO. The aperture is always fixed wide open, so you don’t have to worry about how shutter speed and aperture interact with ISO.
Also, because you have the full dynamic range instantly available, you don’t have to worry about ISO either. You only have to make sure your sensor is adequately exposed. Beyond that, everything, including your artistic cinematographic choices, is part of a post-production pipeline.
That means you could represent any lens with any parameter. So, the practice of always using a particular brand of lens with a certain character could be substituted by this computational lens ability with light fields.
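The “any lens, any parameter” idea rests on synthetic refocusing: because the camera records which direction each ray arrived from, a new focal plane can be computed after the fact by shifting and averaging the sub-aperture views. Here is a minimal sketch of that classic shift-and-add recipe — an illustration of the general technique, not Lytro’s actual pipeline; the array layout and the `alpha` parameter are assumptions:

```python
import numpy as np

def refocus(lightfield, alpha):
    """Synthetic refocus by shift-and-add over sub-aperture views.

    lightfield: (U, V, H, W) array -- a U x V grid of H x W sub-aperture
                images, the standard 4D light field parameterization.
    alpha: refocus parameter; each view is shifted in proportion to its
           (u, v) offset from the central view, then all views averaged.
    """
    U, V, H, W = lightfield.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            du = int(round(alpha * (u - U // 2)))
            dv = int(round(alpha * (v - V // 2)))
            # Shift this view, then accumulate it into the average.
            out += np.roll(lightfield[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)

# Example: a tiny random 3x3 grid of 4x4-pixel views.
lf = np.random.rand(3, 3, 4, 4)
img = refocus(lf, alpha=0.5)  # nonzero alpha moves the synthetic focal plane
```

With `alpha = 0` the result is just the average of all views (the widest synthetic aperture focused at the captured plane); varying `alpha` sweeps the focal plane through the scene, which is exactly the “shoot first, focus later” trick at the heart of Lytro.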
The VFX Argument
One of the camera’s major draws was its potential in VFX. You could work with footage on a slice-of-time basis, because you know the exact X, Y, and Z coordinates of every pixel in space.
This meant you could dial in coordinates that split a shot into a before and after extent, then use a depth extraction process to separate the foreground and background in extraordinary detail. Light spill is also taken into account, so your edges are perfect and nothing is approximated. You could even extract details in front of the action.
Every single shot could be a green-screen shot, making a VFX workflow incredibly efficient. You could also semi-automate rotoscoping by creating dimensional roto-splines, since you know every movement of light over time; the splines would automatically propagate over an entire sequence.
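The core of that depth-keying idea can be sketched very simply: once every pixel carries a Z coordinate, a matte is just a range test on depth rather than a chroma key. The following toy example assumes a per-pixel depth map is available (function and variable names are illustrative, not from any Lytro tooling):

```python
import numpy as np

def depth_key(rgb, depth, near, far):
    """Split a frame into foreground/background mattes using per-pixel depth.

    rgb:   (H, W, 3) float image
    depth: (H, W) per-pixel distance from the camera, same units as near/far
    near, far: the Z range to keep as foreground
    """
    # The matte is 1 wherever the pixel's depth falls inside [near, far].
    matte = ((depth >= near) & (depth <= far)).astype(rgb.dtype)
    foreground = rgb * matte[..., None]
    background = rgb * (1.0 - matte[..., None])
    return matte, foreground, background

# Toy 2x2 frame: the top row sits at 3m (subject), the bottom at 10m (backdrop).
rgb = np.ones((2, 2, 3))
depth = np.array([[3.0, 3.0], [10.0, 10.0]])
matte, fg, bg = depth_key(rgb, depth, near=1.0, far=5.0)
```

A real pipeline would need soft, sub-pixel edges and handling for transparency and motion blur, which is precisely where the article’s later skepticism about keying comes in; a hard threshold like this only illustrates why “every shot is a green-screen shot” was plausible in principle.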
Why Lytro Cinema Failed
However, in the cold light of day, as a capture device the camera’s specs were wholly unworkable. One second of footage from its 755-megapixel RAW sensor would need 400GB of storage, and back in 2016, that much storage was costly.
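Those numbers roughly check out. A back-of-the-envelope calculation, taking the 755 MP sensor and the 300 fps figure mentioned below, and assuming around 14 bits per pixel of RAW data (the bit depth is my assumption, not a published spec), lands close to the quoted 400GB per second:

```python
# Back-of-the-envelope data rate for the Lytro Cinema sensor.
pixels = 755e6          # 755-megapixel sensor
bits_per_pixel = 14     # assumed RAW bit depth (illustrative)
fps = 300               # maximum frame rate

bytes_per_frame = pixels * bits_per_pixel / 8
gb_per_second = bytes_per_frame * fps / 1e9
print(f"{gb_per_second:.0f} GB/s")  # prints "396 GB/s" -- roughly the 400GB quoted
```

At that rate, a single ten-second take fills the better part of a 2016-era storage array, which makes the economics of the camera easy to understand.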
The camera itself was giant; it extended from around 6 feet to as much as 11 feet, depending on the type of shot being captured. The camera would have been prohibitively expensive if the company had ever priced it to sell. No wonder the only option to use it was rental.
However, while the technology was captivating, ironically, there may have been too much choice on offer. You didn’t need a sensor and processing pipeline offering 300fps, with all the high-resolution data problems that came with it. Why not concentrate on a much smaller camera and just work in 2K, which was the limit of most cinema masters at the time?
Giving Lytro Cinema to VFX artists for keying was also a leap of faith; it’s one thing being able to use a multi-plane approach, but keying is so much more about finely separating blurry and transparent things. In practice, would it have worked to that extent?
VFX Supervisor Adam Valdez, at the time, was still encouraged by the technology, but admitted that it needed to adhere to what the industry could deal with:
“They need to take their big, beautiful prototype and think about today’s world, today’s movies. Right now, even moving to 4K, acquiring 4K is a really big load—working 4K in post is a really big load.
There are very few projectors that project 4K, consumers can’t see the difference between 2K and 4K at home, and economically, it’s a big strain for us in post. Let’s hope Lytro Cinema 2 will be smaller.”
– Adam Valdez, VFX Supervisor
Lytro’s Next Chapter
As many manufacturers have found, making hardware, especially a camera, is hard. Add such an inscrutable technology as light field to the mix, and you may have too many potential failure points.
Will this technology find its place in the future? I hope so. The key technologists from Lytro Cinema are currently driving the development of light-field display technology at a new company called Light Field Lab.
But, as the Lytro still cameras found, technology will always march on despite your best efforts. For cinema, aren’t we seeing roughly the same techniques happening with game engines within virtual production as were promised with light fields? You can change any lens, any lighting, or any VFX element live on set or at a later stage. You don’t need to re-tool your camera channels and can use off-the-shelf camera tracking abilities.
Light field technology has a science-fiction aura about it, which is enticing. Now that the genie is truly out of the bottle, who knows whether such a camera is still a possibility.
Cover image via Lytro Group.