The Wireless 3D Hobbit

Dec. 7, 2012
Peter Jackson's The Hobbit: An Unexpected Journey will be out soon and it was filmed using an interesting embedded wireless solution.

I didn't get a preview of The Hobbit: An Unexpected Journey like I did with Skyfall (see Skyfall - James Bond At Fifty Is Better Than Ever) but I did get to chat with Steve Schklair, CEO of 3ality Technica, and Howard Postley, 3ality CTO, about some of the technical challenges of shooting the film in 3D. The Hobbit was shot for 3D and IMAX. I have written about 3D movie making for the films Prometheus (see Prometheus Takes Flight With Cutting-Edge VFX Technology) and The Amazing Spider-Man (see Spider-Man Swings Through A Virtual 3D World), but this time I want to talk about the wireless aspects of the new film.

Director Peter Jackson shot the Hobbit in 3D using RED cameras and 3ality Technica's 3D rigs (Fig. 1). The 3ality rig is just part of the 3D system. There is a major server component that the rig and cameras work with in real time. The server provides feedback to the cameras to adjust the rig so the system can properly record in 3D. Without this feedback the resulting video can vary from bad to unusable. Check out the other articles for a discussion on 3D video issues.

Figure 1. Director Peter Jackson shot the Hobbit in 3D using RED cameras and 3ality Technica's 3D rig using wireless connections for some of the filming.

Peter Jackson was pushing the state of the art. The film was shot at 48 frames/s at 5K resolution. The higher frame rate smooths fast action scenes, and the extra resolution helps with 3D presentation, where more pixels are better. He used 48 RED EPIC cameras on 17 3ality 3D rigs. Some of these were mobile Steadicams, and this is where the wireless link comes into play.

As noted, 3D camera systems do not just consist of a rig that sends back a 3D video stream, so adjusting a 3D camera is not like adjusting a 2D camera, where one fiddles with the f-stop, focus, and zoom. Those adjustments are just some of the many needed by 3D cameras.

3ality calls their control box the Stereo Image Processor (SIP). It controls details like interaxial distance, convergence angle (placement of depth), and zoom. There is a closed-loop control system between the SIP and the rig that includes control of the camera servos and settings. The SIP continuously tweaks the servos and settings so the two cameras can record the desired video.
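To get a feel for the closed-loop idea, the kind of correction involved can be sketched as a simple proportional control step. This is only an illustration: the parameter names, values, and gain below are hypothetical, not 3ality's actual control law.

```python
# Hypothetical sketch of closed-loop rig correction. The SIP-style controller
# repeatedly nudges a rig parameter toward the target it computed from the
# analyzed video. Names and numbers are illustrative only.

def sip_control_step(measured, target, gain=0.5):
    """One proportional-control step: move a rig parameter toward its target."""
    error = target - measured
    return measured + gain * error

# Converge a (made-up) convergence angle, in degrees, toward the desired
# placement of depth over several control iterations.
angle = 2.0          # current angle reported by the rig servo
target_angle = 1.2   # angle the controller wants
for _ in range(10):
    angle = sip_control_step(angle, target_angle)
```

The point of the loop is that the system never sets a value once and walks away; it keeps measuring and correcting, which is why the rig-to-SIP link must stay live during shooting.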

The actual video is recorded on a flash drive on the rig. In addition, a video stream is sent to the "video village" where the director and others can see what the camera sees. This is more for editorial use. The video stream is also sent to the SIP, but this is a one-way trip; the SIP analyzes the video as part of the control process. The SIP records its information, and it is eventually combined with the video from the camera rig using the time codes in both sets of data.
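Combining the two recordings by time code amounts to a simple join. As a rough sketch (the record layouts below are hypothetical; only the time-code matching reflects the text), it might look like this:

```python
# Illustrative sketch of merging SIP metadata with recorded frames by time code.
# Both real data sets carry time codes; the field names here are made up.

def merge_by_timecode(frames, sip_records):
    """Pair each recorded frame with the SIP metadata bearing the same time code."""
    sip_index = {rec["timecode"]: rec for rec in sip_records}
    return [
        {**frame, "sip": sip_index.get(frame["timecode"])}
        for frame in frames
    ]

frames = [{"timecode": "01:00:00:00"}, {"timecode": "01:00:00:01"}]
sip = [
    {"timecode": "01:00:00:00", "convergence": 1.2},
    {"timecode": "01:00:00:01", "convergence": 1.3},
]
merged = merge_by_timecode(frames, sip)
```

Because the merge happens later in post, the SIP does not need to push its metadata into the video stream in real time.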

Normally the control data is handled by exchanging 6 metadata packets/frame between the rig and the SIP. This is a slow data rate, but it needs to be isochronous with low latency. It is possible to drop a packet, but bad things happen if too many packets get dropped. This is not an issue with a wired connection, but it was an issue when it came to wireless operation.
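The numbers in the text give a feel for the timing budget. At 48 frames/s and 6 packets/frame, the link carries only a few hundred packets per second, but each frame leaves only about 21 ms in which those packets must arrive:

```python
# Back-of-the-envelope timing for the control link, using figures from the text.
frame_rate = 48          # frames per second
packets_per_frame = 6    # metadata packets exchanged per frame

packets_per_second = frame_rate * packets_per_frame  # modest throughput: 288 pkt/s
frame_period_ms = 1000 / frame_rate                  # ~20.8 ms budget per frame
```

This is why raw bit rate is not the problem; the hard requirement is that the packets show up on schedule, frame after frame.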

Now to the problems that 3ality had to surmount.

The first was the wireless control communication link. The video streams used COTS wireless support, but the isochronous nature of the control link precluded COTS hardware. WiFi was out of the question. It has the bit rate, but timing-wise it is a poor alternative. Even the packet overhead gets in the way of this particular application.

Bluetooth was tried, but there were issues with this approach as well. Its protocol still had too much overhead, and then there were transmission distances to contend with in some instances.

At the other end of the spectrum are dumb transmitters and receivers with no protocol support. Eventually 3ality moved in that direction, coming up with a proprietary solution that used its own protocol. It is packet based but with little overhead and no error correction, only error detection. This works for the application because dropping a packet is not an issue and the data does not need to be resent; it can be extrapolated from the good packets.
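The "detect, drop, extrapolate" idea can be sketched in a few lines. The checksum and packet format below are illustrative stand-ins, not 3ality's protocol; the point is that a corrupted packet is simply discarded and the missing sample is reconstructed from the trend in the good ones.

```python
# Error detection with no correction, plus extrapolation of dropped samples.
# The additive checksum and the linear extrapolation are hypothetical examples
# of the general technique described in the text.

def checksum_ok(payload, checksum):
    """Detection only: flag a corrupted packet so it can be dropped, not fixed."""
    return sum(payload) % 256 == checksum

def reconstruct(samples):
    """Fill a dropped sample (None) by continuing the trend of the last two."""
    out = []
    for s in samples:
        if s is None and len(out) >= 2:
            out.append(2 * out[-1] - out[-2])  # linear extrapolation
        else:
            out.append(s)
    return out

# A dropped packet leaves a hole in the sample stream; extrapolation fills it.
values = reconstruct([10, 12, None, 16])
```

Skipping retransmission keeps the protocol overhead low, which is exactly what the isochronous control link needs.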

There is also metadata with the stream. The director wanted to see some of this information on his screen, so that support was added to the mix as well.

The approach does have challenges as the distance between systems increases, because error rates tend to increase as well. The radios work very well at distances of about 20 m. They can operate at over 1,000 m, but this brings up another issue.

The second issue was radio noise. As it turns out, much of the film was shot in New Zealand (Fig. 2), essentially out in the "boonies." This worked nicely for the wireless aspect of the system because the locations were nearly devoid of other radio traffic. Frequency-hopping radios help, but anyone who has tried to set up a WiFi system at a trade show can attest to the challenges one faces in a noisy radio environment.

Figure 2. Director Peter Jackson shot much of the film in New Zealand.

In New Zealand, the wireless approach turned out to be very workable. This would not be true on backlot sets, where a lot of film is being shot at the same time. Unfortunately, there is not enough bandwidth to let everyone operate all their cameras wirelessly.

A related issue is troubleshooting problems. We tend to get spoiled by WiFi and by scanners that know about the channels, packets, and all the other details flying by in WiFi communication. These tools do not exist for custom solutions like 3ality's, or for most other wireless devices for that matter. The challenge is even more difficult because the issues can be transient: what works now may not work in an hour. This is definitely not something you want when paying actors, stunt people, camera people, and so on lots of money to get a film done, only to be held up by a camera that does not work.

In the end, things worked out well, and the resulting system can be employed elsewhere. There are still trade-offs, though. For example, going mobile means batteries, which in turn mean limited operational time and more weight. On the other hand, eliminating wires makes a camera operator's job with a Steadicam easier, with one less thing to trip over.

3ality now has the radios and SIP support to handle the new features. They are likely to be used in future films, but wireless is not going to replace the wired solutions that the film industry depends upon. It will be interesting to see if 3D wireless solutions will be usable on Hollywood sets or whether you will need to be in New Zealand to get the wireless 3D connection.

I am looking forward to seeing the Hobbit for a number of reasons. It is impractical to highlight where the wireless 3D camera shots are in the film but this embedded solution did make the job of creating a great film easier.

About the Author

William Wong Blog | Senior Content Director

Bill Wong covers Digital, Embedded, Systems and Software topics at Electronic Design. He writes a number of columns, including Lab Bench and alt.embedded, plus Bill's Workbench hands-on column. Bill is a Georgia Tech alumnus with a B.S. in Electrical Engineering and a master's degree in computer science from Rutgers, The State University of New Jersey.

He has written a dozen books and was the first Director of PC Labs at PC Magazine. He has worked in the computer and publication industry for almost 40 years and has been with Electronic Design since 2000. He helps run the Mercer Science and Engineering Fair in Mercer County, NJ.