
NETWORK CAMERA PROPOSAL

INTRODUCTION

This document describes the next-generation network camera, which we will call the Camera. It is a reimagining of the state of the art in camera technology, designed specifically to be the front end to emerging file-based workflows backed by a cloud infrastructure.

WHAT IS THE CAMERA?

A camera provides the interface between the physical world and the digital world: it converts light, the image formed by the lens, into a data file. In doing so it provides the tools for the Director and the Director of Photography (DP) to control the final image. It needs an interface that allows the DP to:

Measure and control exposure
Manage color through look-up tables (LUTs)
Monitor feedback on camera and signal status and levels

There is a wealth of other information generated during shooting. In addition to its primary role of capturing the image, the Camera simplifies and automates metadata collection and embeds it in the data stream. The metadata includes:

Lens data (focal length, aperture, etc.)
Camera setup parameters (exposure, etc.)
Director of Photography input (LUTs, etc.)
GPS-derived data
o Geolocation
o Time reference (precision reference to automate TC)
Inertial, angular and motion data (similar to iPhone sensors)
Slate data received wirelessly
Additional production notes as needed
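
As an illustration only, the metadata bundle could be represented as a per-clip structure that is serialized and embedded alongside the image data. The field names and JSON serialization below are assumptions for this sketch, not a defined Sony format; a real system would more likely carry the same information in a standard container such as MXF.

    from dataclasses import dataclass, field, asdict
    import json

    @dataclass
    class LensData:
        focal_length_mm: float
        aperture_f: float

    @dataclass
    class ClipMetadata:
        exposure_index: int            # camera setup parameters
        shutter_angle_deg: float
        lut_id: str                    # Director of Photography input
        latitude: float                # GPS-derived geolocation
        longitude: float
        timecode: str                  # precision time reference to automate TC
        scene: str = ""                # slate data received wirelessly
        take: int = 0
        notes: str = ""                # additional production notes
        imu_samples: list = field(default_factory=list)  # inertial/angular/motion data
        lens: LensData = None

    def embed_metadata(meta: ClipMetadata) -> bytes:
        """Serialize the metadata so it can be multiplexed into the data stream."""
        return json.dumps(asdict(meta)).encode("utf-8")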

THE CAMERA IS PART OF A COMPLETE SYSTEM

Color and metadata management are critical functions performed by the Camera system.

In the last century, Kodak was the authority in color management. In the 21st century, Sony should be that voice. Sony has the opportunity to redefine color management in a way that captures the creative decisions made during photography, carries those decisions and preserves them, and allows further refinement when needed in the post production process.

The Camera is part of this process, but only a part; the true power is in the system.

The camera is a networked terminal in the system that converts information from the physical world into usable digital information. Done properly, as part of an integrated system, the camera allows Sony to control the images flowing through the post production process.

Sony needs to acquire the strength to be the 21st Century voice of color management, integrate the technology into its cameras and bring to market the systems that leverage the capability it provides.

Control of the camera is essential.

The Camera does not stand alone. It is the front end of the production system and it is a key component in a complete system.

Figure 1. The Camera System

CAMERA

Unlike a conventional camera, the Camera, as part of an overall system, defers functions that can be done later to downstream components. It is a minimalist approach supported by processing power in the rest of the system. The key features are:

A “Raw” signal output
No onboard processing in the camera except as needed for local monitoring or transmission to storage
Bundling of metadata into the output

These features result in significantly reduced size, weight, complexity and power consumption.

The Camera has two logical data paths: one for control data and the other for image data. The control data path can either be multiplexed with the image data path or carried on a separate wired or wireless (Wi-Fi, Bluetooth, etc.) link.
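
As a minimal sketch of the multiplexed case, the two logical paths could share one physical link by tagging each payload with a channel identifier. The channel IDs and framing below are assumptions for illustration, not a defined protocol.

    import struct

    # Hypothetical channel identifiers for a multiplexed link.
    CH_CONTROL = 0x01   # control data path (parameters, LUT selection, status)
    CH_IMAGE   = 0x02   # image data path (raw sensor data)

    def frame(channel: int, payload: bytes) -> bytes:
        """Prefix a payload with a channel ID and length so the receiver
        can demultiplex the two logical paths."""
        return struct.pack("!BI", channel, len(payload)) + payload

    def deframe(stream: bytes):
        """Split a byte stream back into (channel, payload) tuples."""
        offset = 0
        while offset < len(stream):
            channel, length = struct.unpack_from("!BI", stream, offset)
            offset += 5
            yield channel, stream[offset:offset + length]
            offset += length

When the control path runs over its own wired or wireless link, no multiplexing layer is needed.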

For real time transfer, the image data path could be either 10GigE or 8 Gbps Fibre Channel (dual link for 4K). 10GigE is the preferred interface because it allows the use of commodity IT hardware to build the infrastructure. Here data is passed in real time directly to the system.
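
A rough feasibility check is sketched below; the resolution, bit depth and frame rate are illustrative assumptions, not Camera specifications.

    # Illustrative raw data-rate estimate for a single 4K stream.
    width, height = 4096, 2160          # assumed 4K raster
    bits_per_photosite = 12             # assumed raw bit depth
    frames_per_second = 24

    bits_per_frame = width * height * bits_per_photosite
    data_rate_gbps = bits_per_frame * frames_per_second / 1e9
    print(f"Raw data rate: {data_rate_gbps:.2f} Gbps")   # ~2.55 Gbps

Under these assumptions a single stream fits within a 10GigE link; higher frame rates or deeper bit depths are what push the requirement toward dual-link operation.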

For operation with a local storage cache, the image data path could be wireless 802.11n, with typical data rates of around 145 Mbps and best-case data rates of 600 Mbps.

In this mode the Camera allows “opportunistic” download to nearby data storage: a local server that allows “dumping” and backup of the on-board solid state memory while shooting continues. This shortens the time required for data management during and after the shooting day. The transfer does not need to be real time capable; it could run much slower and still be valuable.
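
A minimal sketch of that opportunistic download, assuming clips accumulate as files on the on-board solid state memory and the local server is mounted over the wireless link. The paths, file extension and polling interval are hypothetical.

    import shutil
    import time
    from pathlib import Path

    ONBOARD = Path("/media/onboard")        # hypothetical on-board solid state mount
    LOCAL_SERVER = Path("/mnt/set_server")  # hypothetical mount of the nearby server

    def opportunistic_dump(poll_seconds: int = 30) -> None:
        """While shooting continues, copy any closed clips to the local
        server in the background and mark them as backed up."""
        while True:
            for clip in ONBOARD.glob("*.raw"):
                target = LOCAL_SERVER / clip.name
                if not target.exists():
                    shutil.copy2(clip, target)          # slower than real time is fine
                    clip.with_suffix(".backedup").touch()
            time.sleep(poll_seconds)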

REMOTE CONTROL MODULE

The remote control module sets the operational parameters of the camera. It does not actually modify the camera’s signals or the data flow; instead it can:

Allow for conventional “video operation”
Create metadata, e.g. LUTs for later rendering
Provide the look and feel of traditional video control
Transmit the metadata to the camera or another module for multiplexing with the image file

It can be wirelessly connected to the camera through the camera’s control data path.
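
For illustration, a control message such as a LUT selection could be sent over that control data path as a small self-describing packet. The transport, host name and message fields below are assumptions, not a Sony protocol.

    import json
    import socket

    def send_control_message(camera_host: str, message: dict, port: int = 9000) -> None:
        """Send one control message over the control data path.
        The camera records it as metadata; the image data is untouched."""
        payload = json.dumps(message).encode("utf-8")
        with socket.create_connection((camera_host, port)) as sock:
            sock.sendall(payload)

    # Example: select a LUT for later rendering rather than baking it in.
    send_control_message("camera.local", {
        "type": "select_lut",
        "lut_id": "scene12_night_v2",   # hypothetical identifier
        "apply_in_camera": False,       # metadata only; rendering is deferred downstream
    })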

DIRECTOR OF PHOTOGRAPHY (DP) INTERFACE

The DP interface allows the DP to address the Camera directly, entering information that is multiplexed with the image and setting parameters (e.g. LUTs) that direct monitoring and post production processing.

It would be a handheld wireless device providing:

Camera status monitoring
Image and exposure analysis
Control over LUT selection

MONITORS

Camera system monitors receive image data with embedded metadata (e.g. LUTs). They apply the LUTs and render the image data for real time display of the corrected image. When used with the remote control module, this allows the impact of real time camera parameter adjustments to be monitored.
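
A minimal sketch of what “apply the LUTs and render” could mean at the monitor, using a 1D LUT per code value. The LUT shape and image format are assumptions; a production monitor path would typically use 3D LUTs and dedicated or GPU hardware.

    import numpy as np

    def apply_1d_lut(raw_frame: np.ndarray, lut: np.ndarray) -> np.ndarray:
        """Map each raw code value through a 1D LUT for display.
        raw_frame holds integer code values in [0, len(lut) - 1]."""
        return lut[raw_frame]

    # Example: a 10-bit raw frame viewed through a simple display-gamma LUT.
    codes = np.arange(1024)
    gamma_lut = ((codes / 1023.0) ** (1 / 2.4) * 255).astype(np.uint8)
    raw_frame = np.random.randint(0, 1024, size=(2160, 4096), dtype=np.uint16)
    display_frame = apply_1d_lut(raw_frame, gamma_lut)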

MODULES FOR RECORDING

Image data is recorded as a lossless raw data stream. Operational considerations (e.g. on location or in a studio) will determine whether the recording is done to small portable media such as solid state storage, to disk arrays, or to a modified SR tape deck recording the raw data stream.

MODULES FOR LIVE TRANSMISSION AND/OR LIVE MONITORING

These add-on modules provide fiber transmission of the raw image data and metadata for real time display and transmission. Different versions can provide full quality or monitor quality as required.

RENDER MODULE

This module is used for live production and is inserted in the data path at or before the vision mixer/switcher. It applies the accumulated LUTs in a system such as Ellcami and renders the image into a conventional HD-SDI format.

Used in a variety of post production roles, the module:

Feeds non-render-capable monitors (e.g. consumer sets in offices or viewing rooms)
Prepares dailies materials for use in editing systems
Provides other similar functions as needed

FOR POST PRODUCED MATERIALS

Since the Camera system tracks metadata with image files throughout the workflow, post production is quicker and therefore cheaper. For final release, images are rendered in the finishing stage or Digital Intermediate as appropriate.

NETWORKING THE CAMERA

The sea change between today’s cameras and the Camera system is that image processing is moved from the camera into the system. This is enabled by networking the Camera and reassessing its role as a source of raw image data rather than a source of preprocessed image data with baked-in LUTs.

The next two sections illustrate how the networked Camera is used.

IN THE STUDIO

As an example, the following diagram shows how an HDC1500 might be used in a studio configuration where the show is recorded as data files for subsequent post production.

Figure 2. File based configuration for HDC1500

Key aspects of this configuration are:

Multiple levels of image processing
A baked-in LUT reduces downstream options
Data format conversion from HD-SDI to file-based
Expensive dedicated ingest hardware (Omneon Spectrum in this example)

This next figure shows how this configuration looks with the Camera system.

Figure 3. Camera system for Studio Recording

Aspects of this configuration are:

Versatile control of the Camera by both the Remote Control Module and the DP Interface
Direct data transfer of image files over the network to the recording module
The hardware is either off-the-shelf IT hardware (e.g. the remote control module could be a laptop) or can be constructed from commodity IT components (e.g. the recording module could be built around a file server motherboard running specialized Sony software on top of Linux); a sketch of how small such a recording module could be follows below
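
As a sketch of that idea, a minimal recording module could accept image files over the network and stream them straight to disk, leaving all image processing to other modules in the system. The port, storage path and framing here are assumptions, not part of the proposal.

    import socket
    from pathlib import Path

    STORAGE = Path("/srv/camera_clips")   # hypothetical storage directory on the server

    def recording_module(port: int = 9100) -> None:
        """Accept one connection per clip and write it straight to disk."""
        STORAGE.mkdir(parents=True, exist_ok=True)
        with socket.create_server(("", port)) as server:
            clip_index = 0
            while True:
                conn, _addr = server.accept()
                clip_index += 1
                with conn, (STORAGE / f"clip_{clip_index:05d}.raw").open("wb") as out:
                    while chunk := conn.recv(1 << 20):
                        out.write(chunk)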

ON LOCATION

A key aspect of shooting on location is the difficulty of getting sufficient bandwidth back to the post production facility so that post production can proceed. This next configuration takes advantage of the Camera’s ability to opportunistically dump content from its attached solid state storage while continuing to shoot. In this case the content dump is to cloud storage to which the post production facility is also connected.

Figure 4. The Camera attached to cloud storage
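
As a sketch of the content dump to cloud storage, one clip at a time could be pushed from the camera’s attached solid state storage with a plain HTTP upload, assuming the cloud side issues pre-signed upload URLs. The function and URL handling are illustrative, not a Sony or cloud-provider API.

    import requests
    from pathlib import Path

    def dump_clip_to_cloud(clip: Path, signed_upload_url: str) -> None:
        """Upload one clip to cloud storage; the post production facility
        reads the same object from its side of the cloud."""
        with clip.open("rb") as f:
            response = requests.put(signed_upload_url, data=f)
        response.raise_for_status()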

CONCLUSION: WHAT DOES THIS MEAN FOR SONY’S PRODUCTS?

The Camera breaks new ground, using modular system design to produce the best possible image quality with advanced workflows. It is designed to reach far beyond where Red has set the bar in file-based camera technology. The Camera is an uncluttered image capture device that defers image processing (applying LUTs, etc.) to downstream modules in the system.

The Camera will be cheaper to make than its equivalent in today’s product line because of the reduced processing in the camera itself; however, Sony cannot look to this fact to increase margins! Rather, it will allow Sony to continue to compete with new cameras like the Red Epic by offering not only better imaging technology but also a superior workflow.

Sony’s product lines are already moving from dedicated hardware to software processing on multipurpose platforms like Ellcami. The Camera system will go further. For image processing, Ellcami has an important role to play, but for functions like data recording, Sony’s new products will be software running on hardware built from commodity IT components.