Nicknamed “the godmother of VR”, Nonny de la Peña has been a pioneer in new forms of journalism for years, using virtual reality to put audiences inside the stories.
A longtime journalist involved in documentary films, de la Peña crossed paths in 2012 with Oculus founder Palmer Luckey, who, while working as an intern at Emblematic Group, her company specializing in innovative forms of journalism, provided a first prototype of his virtual reality system.
We interviewed her to explore how the advent of current VR systems is opening up new possibilities for journalists and audiences.
Both pieces you showed this year at Sundance are digital recreations of real life events. What do 3D environments allow that video doesn’t?
The kind of 3D environments that Emblematic Group specializes in — what we call volumetric, or “walk-around” virtual reality — allow the viewer to move around within the scene, to interact with it and change their orientation within it. The result is a fully immersive experience, much more so than regular video, and even than 360° video; 360° video lets you see all around you but does not allow you to change your position.
Because walk-around is so much more immersive, it creates a greater sense of presence on the scene, and generates a stronger sense of empathy with the situations and characters portrayed.
Does VR herald the possibility of using our bodies to experience non-fiction stories?
Absolutely. That’s what we’ve always tried to do, and that’s why we call our pieces embodied narratives. The very first VR piece I made, Hunger in L.A., captured an event at a food bank in L.A. where a diabetic man waiting in line collapsed and went into a coma, because he couldn’t get food in time and his blood sugar dropped too low. In the piece, you’re there on scene as this happens.
It was the first VR piece to go to Sundance, and the response was incredible; person after person would put on the goggles, start the experience and then, when the man collapsed, they would get down on their knees as if to try and help him, and would be very careful to step around and over him. They literally felt as though they were physically present in the scene; many people came out of the experience in tears, they were so moved.
How do you see the overall journalism field making use of these new technologies beyond the immersive factor? What are some of the best practices journalists should apply to this new medium?
In terms of best practices, we have to address all the questions that have always been germane to every form of journalism. Is this an accurate representation of the truth? Have we distorted reality in describing/recreating it? What is appropriate to show and what should not be shown?
We’re working our way through those issues right now; we just received a grant from the Knight Foundation to work with Frontline on a project that involves both making a series of VR documentaries, and drawing up guidelines and best practices for how this medium should be used for journalism.
In terms of how these technologies should be used, one of the primary strengths of the medium is the ability to tell what we call spatial narratives: VR is uniquely effective at conveying the physical dimensions of a story — the place where an event happened, the distance between protagonists, how movement and action transpired, etc. Nothing comes close to VR in terms of giving you that sense of spatial context.
Your pioneering work in immersive journalism has been at the forefront of the festival circuit for years. Now that consumer headsets are hitting the market, how do you reach out to wider audiences? What’s the distribution strategy for that type of content?
This is such an exciting time for us. We’ve been pioneering the field of walk-around, volumetric VR, but up until very recently we’ve had to build our own hardware to showcase these experiences. Now there are three headsets coming to market — the HTC Vive, the Sony PlayStation VR, and the Oculus Rift — all of which enable walk-around experiences. And each of those products is plugged into an existing community of users: the Steam game platform for the Vive, the PlayStation community for Sony, and Facebook for Oculus. So we’re about to be able to reach many, many more people with our content.
We’re also benefitting from the pioneering efforts of the New York Times and Google, who distributed 1.2 million Cardboard headsets with the newspaper last year. The Times chose one of our pieces, Kiya, as a Sundance Op-Doc selection, and it’s now available on their NYTVR app, which is very exciting for us. We’re also distributing content through the Samsung MilkVR channel and we’re in some pretty exciting talks with other major players about other distribution pathways.
Can you tell us a little about the current and upcoming projects you’re working on at Emblematic?
We recently showcased two pieces, Across the Line and Kiya, at Sundance. Across the Line was made with the Planned Parenthood Federation of America, in collaboration with 371 Productions, and it blends CG VR and 360° video to put the viewer on scene as a patient has to run the gauntlet of anti-abortion extremists trying to intimidate those seeking reproductive health care.
Kiya is a piece about domestic violence homicide, based on a Fault Lines documentary called Death in Plain Sight. It tells the story of two sisters trying to save a third sister from being shot by her ex-boyfriend, and it uses recordings of actual 911 calls to anchor the narrative.
Then we have so many things in development: our collaboration with Frontline involves making at least three pieces; we have several other journalistic pieces in the works for media organizations; we’re in talks with a couple of major musical artists about creating experiences they can take on tour with them; and we’re making two major commercial pieces for big luxury brands — those are the more remunerative projects that allow us to keep doing our investigative work at the same time.
Are there any pieces of VR content you have been enthusiastic about recently?
I’ve been blown away by the work of a company called 8i. We call it videogrammetry — doing 3D video scans of actual humans, as well as environments. Up to now the trade-off between 360° video and volumetric VR has been realism: in 360, everything is photo-real but you can’t move around in the environment; in volumetric, you can walk around, but the characters are CG and have a kind of videogame feel. The work of 8i will allow the best of both worlds: fully immersive, walk-around environments peopled by video-quality characters. This is really, really exciting stuff.