Creating a Web-based Media Control System using Networked Appliances
JAMES PRESLY
BACHELOR OF SCIENCE IN COMPUTER SCIENCE WITH HONOURS
THE UNIVERSITY OF BATH
APRIL 2008
This dissertation may be made available for consultation within the University
Library and may be photocopied or lent to other libraries for the purposes of
consultation.
Signed:
Creating a Web-based Media Control System Using Networked Appliances
Submitted by: James Presly
COPYRIGHT
Attention is drawn to the fact that copyright of this dissertation rests with its author. The
Intellectual Property Rights of the products produced as part of the project belong to the
University of Bath (see http://www.bath.ac.uk/ordinances/#intelprop).
This copy of the dissertation has been supplied on condition that anyone who consults it is
understood to recognize that its copyright rests with its author and that no quotation from the
dissertation and no information derived from it may be published without the prior written
consent of the author.
Declaration
This dissertation is submitted to the University of Bath in accordance with the requirements of
the degree of Bachelor of Science in the Department of Computer Science. No portion of the
work in this dissertation has been submitted in support of an application for any other degree
or qualification of this or any other university or institution of learning. Except where
specifically acknowledged, it is the work of the author.
Signed:
Abstract
Streaming media over networks and the internet continues to grow in popularity as
technology develops and improves. This project is an investigation into the domain of media
streaming and the associated technology. Specifically, it addresses the problem of streaming
media between networked devices on a LAN.
After researching existing systems and ways of encoding and transmitting media, a
design and implementation are put forward for an extensible system which allows
users to stream media between networked devices in their home using the Java Media
Framework (JMF). There remains scope for future work following this project, which is
discussed with a view to potential advancement in this area.
Contents
Contents __________________________________________________________________ I
List of Figures _____________________________________________________________ III
1. Introduction _____________________________________________________________ 1
1.1. Introduction _________________________________________________________ 1
1.2. Aim ________________________________________________________________ 2
1.3. Objectives ___________________________________________________________ 2
2. Literature Survey _________________________________________________________ 3
2.1. Introduction _________________________________________________________ 3
2.2. Existing Systems ______________________________________________________ 6
2.2.1. TwonkyVision _____________________________________________________________ 6
2.2.2. Nexus ___________________________________________________________________ 6
2.2.3. SlimServer _______________________________________________________________ 8
2.3. Streaming Software ___________________________________________________ 9
2.3.1. Music Player Daemon ______________________________________________________ 9
2.3.2. Jack Audio ________________________________________________________________ 9
2.4. Streaming Media _____________________________________________________ 9
2.4.1. Formats and Codecs _______________________________________________________ 10
2.4.2. Multiplexing and Demultiplexing ____________________________________________ 10
2.5. Streaming Protocol ___________________________________________________ 11
2.5.1. Protocol Types ___________________________________________________________ 11
2.5.2. Multicast and Unicast _____________________________________________________ 12
2.6. Implementation Tools and Considerations ________________________________ 13
2.6.1. Client Side _______________________________________________________________ 14
2.6.2. Server Side ______________________________________________________________ 15
2.7. Summary ___________________________________________________________ 16
3. Requirements ___________________________________________________________ 17
3.1. Requirements elicitation ______________________________________________ 17
3.2. Basic Functional Requirements _________________________________________ 17
3.3. Non-functional Requirements __________________________________________ 18
3.4. Advanced Functional Requirements _____________________________________ 18
4. Design _________________________________________________________________ 19
4.1. Approach___________________________________________________________ 19
4.2. Developing a GUI ____________________________________________________ 21
4.3. Core Functionality ___________________________________________________ 23
5. Implementation and Testing _____________________________________________ 26
6. Conclusions ____________________________________________________________ 31
6.1. Evaluation __________________________________________________________ 31
6.1.1. Evaluation against Functional Requirements ___________________________________ 31
6.1.2. Evaluation against Non-Functional Requirements _______________________________ 31
6.2. Successes __________________________________________________________ 32
6.3. Issues ______________________________________________________________ 32
6.4. The Future __________________________________________________________ 32
7. Bibliography ____________________________________________________________ 34
7.1. JMF Supported Formats _______________________________________________ 37
Test 1: - Rendering media _________________________________________________ 56
Test 2: - Transcoding media _______________________________________________ 57
Test 3: - Demultiplexing media _____________________________________________ 57
Test 4: - Processing media _________________________________________________ 58
Test 5: - Encoding media for transmission ____________________________________ 58
Test 6: - Transmitting media _______________________________________________ 58
List of Figures
FIGURE 1 – NETWORK SETUP SCENARIO ............................................................................................................ 3
FIGURE 2 – STREAM ROUTING FLOW CHART ........................................................................................................ 4
FIGURE 3 – NEXUS HUB WITH NETWORKED DEVICES ............................................................................................. 7
FIGURE 4 – SLIM DEVICES: SQUEEZEBOX ............................................................................................................ 8
FIGURE 5 – MULTICAST VS UNICAST ................................................................................................................ 12
FIGURE 6 – SYSTEM ARCHITECTURE DIAGRAM ................................................................................................... 20
FIGURE 7 – GUI DESIGN – STREAM TAB .......................................................................................................... 21
FIGURE 8 – GUI DESIGN – CHANNEL TAB ......................................................................................................... 22
FIGURE 9 – STREAM PROCESSING PART. 1 ........................................................................................................ 24
FIGURE 10 – STREAM PROCESSING PART. 2 – INDIVIDUAL OUTPUT STREAMS ........................................................... 25
FIGURE 11 – STREAM PROCESSING PART. 2 – MULTIPLEXING THE STREAM ............................................................. 25
FIGURE 12 – JMSTUDIO – SETUP TO RECEIVE RTP STREAM ................................................................................. 30
1. Introduction
1.1. Introduction
The way in which media is used and propagated has changed vastly with the advances of
technology and the internet. Analogue storage systems such as audio cassettes and VHS
have long been superseded by the digital formats of CD and DVD. This digital era allows
media to be manipulated and stored on computers and has opened up a new realm of
options in their potential uses.
Today, many online systems exist for streaming live media from web servers directly to
users’ computer screens. Online radio stations and Video on Demand (VOD) systems
allow users to have streams delivered to their home PC web browser, set-top box, or
even their mobile phone. As internet bandwidth and infrastructure develop rapidly and
the majority of homes in the UK now have broadband connections, the ease and quality
of these services continue to improve.
Storing media in the digital realm has given rise to a great deal of technology
surrounding its encoding and transmission. Many companies have developed their own
media formats for a variety of purposes. Probably the most famous is the MP3, a form of
“lossy” compression allowing music to be stored using much less disk space than the
Pulse Code Modulation (PCM) format found on regular audio CDs. With more advanced
home entertainment technology such as 5.1 surround sound, some media formats
contain a great deal of data and several channels in order to make best use of the
hardware available. When a stream consists of multiple component streams in this way,
the streams must be separated in order to process each one individually (a process
known as demultiplexing) and recombined afterwards (multiplexing). However, this is
only scratching the surface of the world
of media streaming as a variety of complex ways of processing and broadcasting media
exist which will be investigated in this project.
If audio and visual streams can be delivered from one side of the globe to the other in
real-time then it seems logical that similar results could be achieved in the home.
Ethernet adapters for digital sources such as DVD and CD players allow them to be
connected to a local area network (LAN). Wireless technology allows speakers to be
placed in a room arbitrarily and have their positions changed easily without the
restraints imposed by cabling issues. While appliances with built-in network interfaces
are still fairly new on the market, they are the next step in centralising control of home
entertainment systems as well as other appliances in the home.
With networking technology allowing existing appliances to be connected to the same
media server the potential exists to manipulate and control the source media streams
and direct them to any connected output device. A device such as a personal digital
assistant (PDA) could then be used to log in to the system and act as a universal remote
giving the user total freedom to route his appliances together in ways which were never
before possible.
1.2. Aim
The main aim of this project is to design and implement a system to allow users to
control media streams between networked appliances via a web browser. The modern
home contains a variety of devices which produce or render either audio or audio/visual
data. Using network technology and a media server as the central hub, control of these
devices could be centralised to a computer or PDA allowing the user to operate their
home entertainment system in a high-level way. The system should be developed using
knowledge and understanding of the subject acquired in the literature survey.
1.3. Objectives
The proposed objectives for this project are as follows:-
• Research existing media streaming servers and systems and evaluate them
• Gain a sound understanding of media streaming domain and associated
technology
• Use an analysis of the researched material to gather requirements for the
system
• Design and implement a modular, extensible system to fulfil the design
requirements
• Evaluate the work carried out
2. Literature Survey
2.1. Introduction
This project sets out to develop a versatile and low-cost system to allow users to control
media streams between devices on a LAN. As it is web-based, the need for custom
hardware is eliminated due to control being obtainable from a standard home PC,
laptop or PDA. The fundamental concept for the system is to have various networked
media devices connected to a media web server running on a PC in the user’s home. The
user should be able to connect to the server by navigating to the URL on which the
server is running. Using Ethernet adaptors, legacy devices (i.e. ones without built-in
network interfaces) can be included in the setup.
To introduce this section a high-level overview of the project’s goals is described. The
figure below illustrates the type of scenario this project is addressing. This home
network setup contains several devices which are connected to the media server either
using an Ethernet cable or using wireless technology. Using a setup such as this the
potential exists to stream media from a source, such as the CD player, to an output
device, such as the wireless speakers, via the media server.
Figure 1 – Network Setup Scenario
In addition to this, there is an opportunity to process the streams as they pass through
the server using digital signal processing (DSP) algorithms. These are more commonly
known as “plug-ins” which perform some sort of useful function on the stream.
Providing a common interface for these modules will allow the system to be extensible
and modular, allowing any third party to write one and add it to the collection of
available processing modules. A wide range of functions can be provided using DSP
algorithms in the audio/visual streaming domain. These include:-
• Digital filtering/EQs
• Picture size and viewing ratio adjustments (e.g. 4:3, 16:9, etc.)
• Stereo simulation from a mono signal
• 5.1 sound emulation using a stereo signal
• Graphical/Parametric EQs
• Audio/Visual Effects modules
• Stream Recording/Exporting modules
This concept is similar to Virtual Studio Technology (VST) plug-ins, which were
developed by Steinberg (1) to allow third parties to develop virtual instruments and
effects for use with Digital Audio Workstations (DAWs). An interface for VST was defined
and a VST SDK was released which allows programmers to develop modules that
seamlessly integrate with host programs, allowing unlimited scope for expansion.
An example of the modular concept of the system is illustrated in the diagram below.
There will be three main sections to the chain; a source, a modulation section and an
output section. The source could be a networked device such as a CD player, DVD player
or some media on the client or server system. The modulation section consists of any
number of plug-in effects modules to modify and enhance the source input. The output
section will be networked devices such as speakers, TVs or the client system.
[Figure: a CD Player (Kitchen) source is routed through an Effects Group containing a
Graphical Equaliser, then on to a Stereo Speaker Group feeding the Left Speaker and
Right Speaker (Living Room), under the column headings Source, Modulation and Output.]
Figure 2 – Stream routing flow chart
While the above diagram is not intended as a GUI mock-up of any sort, it demonstrates
the central concept of the project: flexibility. The diagram shows
how the source (in this case a CD player in the kitchen) is routed to an effects group
which can have any number of effects attached to it. This is then routed to the stereo
speaker group and outputs to the left and right speakers in the living room. The user
should have the option to add another speaker output to this group (in the kitchen for
example) to allow the stream to be routed to more than one destination in the house.
Graphical routing in this manner is a concept which is used in some software
synthesizers when designing sounds such as Native Instruments Absynth (2).
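The routing idea described above can be sketched in a few lines of Java: a source buffer passes through an ordered effects group, and the result is fanned out to every output in a speaker group. All class and method names here are illustrative assumptions, not part of any existing API.

```java
import java.util.List;
import java.util.function.UnaryOperator;

// Hypothetical sketch of the Source -> Modulation -> Output routing concept.
public class RoutingSketch {

    /** Applies each effect in the modulation group to the buffer, in order. */
    static short[] applyEffects(short[] src, List<UnaryOperator<short[]>> effects) {
        short[] buf = src;
        for (UnaryOperator<short[]> fx : effects) {
            buf = fx.apply(buf);
        }
        return buf;
    }

    /** Stand-in for an equaliser module: here, a simple 2x boost. */
    static short[] boost(short[] in) {
        short[] out = new short[in.length];
        for (int i = 0; i < in.length; i++) out[i] = (short) (in[i] * 2);
        return out;
    }

    public static void main(String[] args) {
        short[] source = {100, 200, 300};  // a block from the CD player (kitchen)
        short[] processed = applyEffects(source, List.of(RoutingSketch::boost));

        // Fan the same processed stream out to each speaker in the group.
        for (String speaker : List.of("Left Speaker (Living Room)",
                                      "Right Speaker (Living Room)")) {
            System.out.println(speaker + " receives sample " + processed[0]);
        }
    }
}
```

Adding a kitchen speaker to the output group is then just another entry in the fan-out loop, which is the flexibility the diagram is meant to convey.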
The purpose of this literature review is to perform research into the relevant domains of
the project and gain an overall understanding of the subject. On completion there
should be sufficient information available to make informed choices when gathering
requirements and architecting the system. Firstly, an analysis is performed on existing
systems in order to ascertain the contrasting methods of approaching the problem and
the advantages and disadvantages of each. This process is also necessary to ensure the
project is indeed innovative and novel. The information gained from this process can be
put to use in gathering requirements and designing the system architecture. Existing
commercial systems as well as open source streaming servers are investigated and
contrasted in order to gain a well-rounded view of the current situation. Secondly,
investigations into media types and their formats are carried out, as understanding how
streams are composed and manipulated is an important aspect of the system. Thirdly,
technology and tools which may aid in the implementation of the system are examined.
2.2. Existing Systems
As one would expect, there are systems on the market already which implement some
of the functionality required in this project. Several systems are examined and evaluated
in this section. Useful features and ideas will be noted as well as potential
disadvantages. Analysing these systems will provide information for formulating
requirements.
2.2.1. TwonkyVision
TwonkyVision is a well-developed system for streaming media in the home (3). It is a
multi-platform system designed for UPnP-enabled client devices and supports many
formats for pictures, music and video. An interesting feature of this system is that
TwonkyVision can be installed on a networked storage device (products are available
with it pre-installed) to stream media without the need for an online server. It will also
interface with many standard third-party applications such as Windows Media Player,
iTunes and Adobe Photoshop Album.
The server is set up using a wizard on a PC and the user designates folders to use as
targets for media storage. Clients are then free to connect to the server and stream the
media sources which are available. On testing the system over a network it became clear
that the server is lacking in some features. While individual songs could be played
(embedded in a webpage within QuickTime), whole albums and playlists could not be
set up (with the exception of iTunes playlists). This is a major drawback for the system,
as queuing up media is something which most users would appreciate.
The interface, although web-based, is fairly complicated, though it compensates for this
with a great deal of functionality. TwonkyVision has built-in internet radio support, as
well as the capability of interfacing with several games consoles amongst other standard
hardware.
2.2.2. Nexus
Nexus is a slightly more sophisticated system than TwonkyVision as it requires certain
specialised equipment and is therefore a lot more costly to the user (4). However, it has
many beneficial features to compensate, which I will elaborate on below.
The system requires the user to design a setup for their home. It works by connecting up
to 12 sources and 18 outputs to a special Nexus hub shown below.
The diagram above shows a number of multimedia devices connected to the Nexus hub.
These include:-
• DVD
• Hi-fi
• TV
• Phone
• PC
• Security cameras
The equipment is cabled into the hub rather than wirelessly connected which is costly to
the user either in the time and effort taken to lay the cables or in the financial cost of
hiring a professional to install them. When ordering the system the user is required to
fill out a form selecting all the devices they want to use and their locations.
Figure 3 – Nexus hub with networked devices
www.avnex.com
The system is divided into zones; each of these zones has its own remote
control to operate it.
While this system is clearly fully featured and well designed, it is expensive and
bespoke. An ideal system would draw on concepts used in Nexus while allowing the
user to set up cheaply (using only software) and making the system expandable using
plug-and-play technology.
2.2.3. SlimServer
SlimServer is a multi-platform open source project which allows users to stream audio
across a home network (5). The software is designed to work with custom hardware,
such as the Squeezebox which has a number of audio outputs and built-in wireless
networking capabilities.
The software supports the MP3 and WMA audio formats and integrates with standard
audio playing software such as Windows Media Player and iTunes. It is a web-browser
based system and has many convenient features for audio such as easy play list creation
and tune browsing. SlimServer is also compatible with third-party plug-ins allowing
support for new audio compression formats.
A clear disadvantage with this system is the need for a specific piece of hardware in
order to output to a standard amplifier and speakers, as opposed to a generic network
interface for the speakers themselves. Without this, the system is limited to streaming
audio between PCs which is far less useful. Its ability to integrate with any MP3 playing
software is a good feature of the system though one of less concern to this project as we
are mainly interested in routing appliances to one another.
Figure 4 – Slim Devices: Squeezebox
www.slimdevices.com
However, a “media preview”
feature would be useful to the project but it needn’t require third-party software to
accomplish this. The ability to use third-party plug-ins and codecs is also one of interest
to this project as it allows expansion of the software and prevents it being made
obsolete by the latest compression formats.
2.3. Streaming Software
There are a number of streaming servers in existence which provide functionality
similar to that required in this project. I will review several here in order to outline
desirable features and gain an understanding of the systems already on the market.
2.3.1. Music Player Daemon
Music Player Daemon (MPD) is an open source project which provides an interesting
alternative approach to streaming media on networks (6). MPD uses a distinct
client/server model, where the program itself acts as the server and can be connected
to by a number of clients. A wide array of MPD clients is available to satisfy the user’s
taste, ranging from command line interfaces to rich graphical applications. It is designed
to be easy to use, flexible and with low overhead. It can also output to media streaming
servers such as Jack (see below).
2.3.2. Jack Audio
Jack Audio is an open-source audio server designed to stream audio at a very low
latency (7). It allows applications to transmit audio information to one another easily
and is designed for professional use. The software tackles the problem of the varied sets
of driver standards which currently exist allowing audio applications to stream
regardless of these. Jack Audio is a Linux/UNIX-based system and would require porting
in order to be used on Windows XP; however, it may be a useful resource in bridging the
gap between audio-using processes which cannot easily be interconnected.
2.4. Streaming Media
Delivering media across a network from a source to a recipient is known as streaming
media. The data packets arrive sequentially and are usually stored in a buffer before
being rendered and presented. Streaming media can be formatted in a variety of ways.
Common audio formats are:-
Uncompressed
• WAV
• AIFF
• AU
Compressed
• MP3
• Ogg Vorbis
• WMA
There are several concepts which must be explored in order to fully understand and
work with media streams.
2.4.1. Formats and Codecs
When an analogue audio signal is encoded digitally on a computer without any type of
compression it is sampled at a specified rate using the ADC (analogue-to-digital
converter) in the computer’s soundcard. Each sample is assigned a numerical value; for
a 16-bit soundcard each sample would be assigned a value between 0 and 65,535. These
samples are recorded in a file sequentially to give a digital representation of an audio
signal. The standard for traditional audio CDs is a sampling rate of 44.1 kHz and a
16-bit value per sample. This type of recording is
known as PCM or pulse code modulation. It is digital sound in its raw form. The word
codec is derived from ‘coder-decoder’ and is an algorithm for turning a PCM stream into
a compressed format and back again. All of the compressed format types mentioned
above require codecs in order for a user to view or listen to the media. This project is
not concerned with producing a system which will be able to use any media type in any
format. Rather, that a framework might be constructed with the capacity to take on new
codecs as they are created so that the system can be added to and kept up to date with
the latest formats. That said, codecs are still important to this project within the realms
of formatting media so that it may be transmitted across a network (See Streaming
Media section).
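The figures above determine the raw data rate of CD audio, which the short calculation below makes explicit. The comparison against a 128 kbit/s MP3 stream uses an assumed, typical bitrate chosen purely for illustration.

```java
// Worked example: the raw PCM data rate of stereo CD audio, compared with
// an assumed typical MP3 bitrate of 128 kbit/s.
public class PcmRate {

    /** Raw PCM bitrate: sample rate x bits per sample x channel count. */
    public static long pcmBitsPerSecond(int sampleRateHz, int bitsPerSample,
                                        int channels) {
        return (long) sampleRateHz * bitsPerSample * channels;
    }

    public static void main(String[] args) {
        long cd = pcmBitsPerSecond(44100, 16, 2);  // stereo CD audio
        System.out.println("CD PCM rate: " + cd + " bit/s");          // 1411200
        System.out.println("Ratio vs 128 kbit/s MP3: " + cd / 128000.0);
    }
}
```

At roughly 1.4 Mbit/s uncompressed, CD audio carries about eleven times the data of a 128 kbit/s MP3, which is why lossy codecs matter so much for streaming.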
2.4.2. Multiplexing and Demultiplexing
A media stream may consist of a number of components. An audio file on a CD will
usually have two stereo channels. A DVD with 5.1 surround sound will have six channels
for sound and one channel for the visual component. If the film is subtitled they may be
encoded in another visual channel, a total of eight individual streams in one. In order to
process these channels in useful ways they must be separated. This is the function of a
demultiplexer; it takes a composite stream as an input and outputs the components as
individual streams. Once the streams have been isolated, effects and processing can take
place. This is a feature which is not offered in the previously described systems and
provides an opportunity to develop a system with a unique feature. After processing the
streams individually they can be encoded and transmitted to different destinations or
alternatively, once all the necessary processing is complete, the stream can be
reassembled using a multiplexer and sent to its destination.
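A minimal illustration of the idea follows, assuming the simplest possible case of sample-interleaved stereo PCM; real containers interleave whole packets of audio and video rather than single samples, but the principle is the same.

```java
import java.util.Arrays;

// Sketch of demultiplexing an interleaved stereo PCM stream (L R L R ...)
// into two independent channel streams, and multiplexing them back.
public class Demux {

    /** Splits an interleaved stereo stream into {left, right} channels. */
    public static short[][] demuxStereo(short[] interleaved) {
        short[] left = new short[interleaved.length / 2];
        short[] right = new short[interleaved.length / 2];
        for (int i = 0; i < left.length; i++) {
            left[i] = interleaved[2 * i];       // even positions: left channel
            right[i] = interleaved[2 * i + 1];  // odd positions: right channel
        }
        return new short[][]{left, right};
    }

    /** The inverse operation: multiplexing the channels back together. */
    public static short[] muxStereo(short[] left, short[] right) {
        short[] out = new short[left.length * 2];
        for (int i = 0; i < left.length; i++) {
            out[2 * i] = left[i];
            out[2 * i + 1] = right[i];
        }
        return out;
    }

    public static void main(String[] args) {
        short[] stream = {1, 2, 3, 4, 5, 6};
        short[][] channels = demuxStereo(stream);
        System.out.println(Arrays.toString(channels[0])); // [1, 3, 5]
        System.out.println(Arrays.toString(channels[1])); // [2, 4, 6]
    }
}
```

Once separated like this, each channel can be processed by its own effects chain and then either transmitted individually or multiplexed back into a single composite stream.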
2.5. Streaming Protocol
2.5.1. Protocol Types
Streaming media of any kind over a network has several associated pitfalls. Using
standard UDP for example, there is no guarantee the packets will arrive at their
destination correctly or in the same order that they were sent as no error checking takes
place. Buffer underrun is also a media streaming issue. This is where the buffer
receiving the incoming data packets is emptied more quickly than it is filled. The result
is a “jumpy” stream which manifests itself in the form of audio or visual dropouts.
There are ways to tackle these problems, however. Buffer underrun can be resolved by
having a large buffer at the client side which is allowed to fill ahead of time. As long as
the stream is not so slow that the buffer runs out during the transmission, dropouts will
not occur. Error checking can be performed by creating a protocol on top of UDP utilising
two-way network communication. In this way, the client can respond when a packet is
received in order to verify it has been received uncorrupted and in the correct order.
There are several protocols like this in existence.
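The pre-fill strategy described above can be sketched as a small buffer class: playback is held back until a safety margin of packets has arrived, so short network stalls no longer cause dropouts. The class name and threshold are illustrative assumptions, not part of any particular streaming stack.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of a client-side pre-fill buffer: playback only starts once a
// safety margin of packets has been received.
public class PrefillBuffer {
    private final Deque<byte[]> packets = new ArrayDeque<>();
    private final int prefillPackets;
    private boolean started = false;

    public PrefillBuffer(int prefillPackets) {
        this.prefillPackets = prefillPackets;
    }

    /** Called as packets arrive from the network. */
    public void receive(byte[] packet) {
        packets.addLast(packet);
        if (packets.size() >= prefillPackets) started = true; // margin reached
    }

    /** Returns the next packet to render, or null while still pre-filling. */
    public byte[] nextForPlayback() {
        if (!started || packets.isEmpty()) return null;
        return packets.pollFirst();
    }

    public static void main(String[] args) {
        PrefillBuffer buf = new PrefillBuffer(3);
        buf.receive(new byte[]{1});
        System.out.println(buf.nextForPlayback() == null); // true: still filling
        buf.receive(new byte[]{2});
        buf.receive(new byte[]{3});
        System.out.println(buf.nextForPlayback()[0]);      // 1: playback begins
    }
}
```

The trade-off is latency: a larger pre-fill margin tolerates longer network stalls but delays the start of playback by the same amount.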
TCP is a widely used internet protocol, largely because it checks that packets have
reached their destination correctly, retransmitting them if they have been corrupted or
lost. While on the surface this seems more favourable than UDP, TCP is suited to
accurate data transmission rather than speed, as it may wait a long time for lost packets
to be retransmitted. This makes TCP a poor choice for media streaming, as a constant
data flow is needed to satisfy the user’s needs (i.e. an uninterrupted programme).
RTP (Real-time Transport Protocol) and RTCP (Real-time Transport Control Protocol)
were developed in the 1990s to tackle media streaming issues. The two protocols work
Creating a Web-based Media Control System using Networked Appliances
12
together on consecutive ports to manage the data stream. RTP carries the media data
itself at a constant rate. RTCP does not transmit any of the data; instead, it sends control
packets to monitor the quality of service and provide feedback to the RTP sender. RTCP has the
following packet types: -
• Sender report packet
• Receiver report packet
• Source Description RTCP packet
• Goodbye RTCP Packet
• Application Specific RTCP packets
The packets contain information on quality of service such as jitter, fraction lost and
total packets lost, amongst others. RTP and RTCP are based on UDP rather than TCP and
so do not suffer from interrupted service while waiting for lost packets.
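To make the receiver's side concrete, the sketch below reads the fields of the 12-byte fixed RTP header as defined in RFC 3550: the 2-bit version, the 7-bit payload type, and the 16-bit sequence number the receiver uses to detect lost or reordered packets. The example packet is hand-built purely for illustration.

```java
// Sketch of reading fixed RTP header fields (RFC 3550).
public class RtpHeader {

    /** Top two bits of byte 0: the RTP version (2 for current RTP). */
    public static int version(byte[] pkt) {
        return (pkt[0] >> 6) & 0x03;
    }

    /** Low seven bits of byte 1: the payload type. */
    public static int payloadType(byte[] pkt) {
        return pkt[1] & 0x7F;
    }

    /** Bytes 2-3, big-endian: the 16-bit sequence number. */
    public static int sequenceNumber(byte[] pkt) {
        return ((pkt[2] & 0xFF) << 8) | (pkt[3] & 0xFF);
    }

    public static void main(String[] args) {
        // A minimal hand-built header: version 2, payload type 10,
        // sequence number 0x0201 = 513.
        byte[] pkt = new byte[12];
        pkt[0] = (byte) 0x80;         // version 2, no padding/extension/CSRC
        pkt[1] = 10;                  // payload type
        pkt[2] = 0x02;
        pkt[3] = 0x01;
        System.out.println(version(pkt) + " " + payloadType(pkt) + " "
                + sequenceNumber(pkt)); // prints "2 10 513"
    }
}
```

A gap in consecutive sequence numbers tells the receiver a packet was lost, which RTCP can then report back as "fraction lost" without ever stalling the stream.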
2.5.2. Multicast and Unicast
There are two distinct types of data streams in existence in the client server model;
multicast and unicast (8). Unicast protocols send a copy of the data stream to each client
separately over a network. The disadvantage of this is that if many clients are connected
to the media server, the amount of network traffic will be very high. If too many clients
connect then the network may become overloaded and incapable of maintaining
constant streams to each client. Multicast protocol sends one copy of the stream over a
part of a network and allows multiple clients to view the same stream. This stops the
network from becoming overloaded when multiple clients connect; however, the
implementation is more complex.
Figure 5 – Multicast vs Unicast
http://www.karen.net.nz/home/
Multicast protocol must be implemented on every node of the network it is operating
on. Difficulties may also be encountered because some firewalls will block it. For
these reasons it is usually only practical for large organisations running their own
networks. In the case of this project a unicast protocol will probably suffice as only a
small number of streams will be used at any one time due to the fact this system is
intended for home use. However, multicast is something to bear in mind for future
expansion of the product: if, for example, the product were used for a corporate media
server, the number of potential clients would be significantly higher.
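The scaling argument above can be made concrete: for N connected clients receiving a stream at bit rate R, a unicast server must push N × R onto its link, while a multicast server pushes R regardless of N. A small illustrative calculation (the client count and bit rate below are hypothetical, not measurements from this project):

```java
public class BandwidthComparison {
    // Bandwidth (kbit/s) the server's link must carry when each client
    // receives its own copy of the stream.
    static long unicastLoad(int clients, long rateKbps) {
        return (long) clients * rateKbps;
    }

    // With multicast, one copy of the stream serves every client on the
    // group, so the load is independent of the client count.
    static long multicastLoad(int clients, long rateKbps) {
        return rateKbps;
    }

    public static void main(String[] args) {
        int clients = 50;   // hypothetical number of connected clients
        long rate = 256;    // hypothetical stream rate in kbit/s
        System.out.println("Unicast:   " + unicastLoad(clients, rate) + " kbit/s");
        System.out.println("Multicast: " + multicastLoad(clients, rate) + " kbit/s");
    }
}
```

For the home-use case argued above, with only a handful of clients the unicast figure stays manageable, which supports the choice of unicast for this project.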
2.6. Implementation Tools and Considerations
The project is intended to be a web-based system, therefore it must be capable of
running entirely from a web browser. Web applications such as this are composed of
two distinct parts: a client side and a server side. The client side is the part the user sees
and interacts with. Interactions and inputted data must be processed in some way and
relayed to the server. The server then uses the input to carry out the core functionality
of the project and sends a response back to the client.
The traditional way of producing dynamic content in a webpage was using the Common
Gateway Interface (CGI) (9). It was not a very efficient way of operating: for each
request/response the server was required to create a new process, load an interpreter
and the script, and then execute the script. Upon finishing, the server would destroy the
process and free up the resources again. This way of operating puts a lot of strain on the
web server and does not scale well.
Fortunately there is now a great deal of technology available to help create efficient
client/server interactions which can be easily integrated into a full system. Server side
technologies include:-
• PHP
• Perl
• ASP
• JSP
Client side technologies include:-
• JavaScript
• ActionScript
• Dynamic HTML
• Flash
In assessing which technology to use for this project the following principles must be
kept in mind:-
• The system must be modular in design in order to make it easily maintainable
and extensible. In other words the system must be comprised of distinct blocks
which interact with each other. This follows the object-oriented design concept
which is consistent with writing maintainable code and allowing code to be
reused easily (10).
• The system must be platform independent where possible so it is not tied to
particular hardware or operating systems. This allows the system to be deployed
on a range of systems easily. For example if the user is running the server side of
the system on one operating system (OS) but accessing it from a PDA running on
another then a platform independent setup would allow this to happen.
• The GUI must be fairly complex by the standards of regular web pages.
With this in mind, the technology assessed will be Java-based, allowing integration and
platform independence as it will run on the JVM.
2.6.1. Client Side
Java Applets
For the client side functionality an Applet is an obvious choice as it can be embedded in
a webpage and allows the client to connect to and interact with the server. Applets
allow dynamic content and complex GUIs to be created. Applets run on the JVM (Java
Virtual Machine) and are therefore cross-platform, i.e. they can be run on a multitude of
browsers and operating systems where Java is enabled.
Applets have certain security issues which must be noted when attempting to access
external network locations. They can only communicate with the server from which they
originate and cannot read or write files on the client system (11).
Java Applets are implemented in Java. This is useful as there is a great deal of resources
on the Internet and in textbooks on Java implementation techniques, as well as free
tools and Integrated Development Environments such as NetBeans.
GUI considerations
The client side software will be in the form of a Java Applet embedded in a web-
browser. Some considerations must therefore be made due to the variety of ways in
which web pages can be viewed. Firstly there will be some issues related to the
particular web-browser in use. Secondly, the type of machine the user is viewing the
page on will vary, e.g. a PDA or laptop. The software must be viewable and easy to use on a
PDA as this is the closest a user will have to a remote control. The GUI must be easy to
use, intuitive and clear.
2.6.2. Server Side
Java Servlets
The Java Servlet API allows Java-based web services to be written. They run far more
efficiently than CGI, with a new thread being created for each request within a process
which is always running on the server. Servlets can provide the link to the server from
the client side Applet. Using an Applet and Servlet together in this way means Java
objects can be transmitted between client and server by implementing the Java
Serializable interface in the classes whose objects require transmitting.
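Object serialization, as used for the Applet–Servlet exchange described above, can be sketched in a few lines: any class implementing java.io.Serializable can be written to and read back from a byte stream. The Message class and its single field below are purely illustrative, not taken from the implementation.

```java
import java.io.*;

public class SerializationDemo {
    // Illustrative payload class; implementing Serializable is all that
    // is required for its objects to travel over an object stream.
    static class Message implements Serializable {
        String command;
        Message(String command) { this.command = command; }
    }

    public static void main(String[] args) throws Exception {
        // Write the object to an in-memory byte stream (standing in for
        // the Applet-to-Servlet connection).
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bytes);
        out.writeObject(new Message("play"));
        out.flush();

        // Read it back, as the Servlet would on the other side.
        ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()));
        Message received = (Message) in.readObject();
        System.out.println(received.command); // prints "play"
    }
}
```

In the real system the object streams would be layered over the HTTP connection between Applet and Servlet rather than a byte array.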
Java Server Pages
Using Java Server Pages (JSP) is another way of creating dynamic web content. They
have the useful feature of allowing the programmer to embed JSP code in HTML pages
using XML style tags. They allow rapid development of web applications and separate
the user interface from functionality, enabling both to be developed independently. JSPs
make use of Java Beans, which are classes with a no-argument constructor and public
getter and setter methods.
Java Beans allow client data to be stored in between client requests. A bean can be
given one of four scopes. They are as follows: -
• Page
• Request
• Session
• Application
Each scope should be used with its lifetime in mind. For example, to enable client
data to persist for the entire duration of the web application, use application scope.
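A minimal bean satisfying the conventions above might look like the following; the StreamSettings name and its properties are invented for illustration.

```java
// A minimal JavaBean: a no-argument constructor plus public
// getter/setter pairs, so a JSP can create and populate it.
public class StreamSettings implements java.io.Serializable {
    private String destinationUrl;
    private int volume;

    public StreamSettings() { }  // required no-argument constructor

    public String getDestinationUrl() { return destinationUrl; }
    public void setDestinationUrl(String destinationUrl) {
        this.destinationUrl = destinationUrl;
    }

    public int getVolume() { return volume; }
    public void setVolume(int volume) { this.volume = volume; }
}
```

In a JSP such a bean would be declared with a jsp:useBean tag, whose scope attribute selects one of the four scopes listed above.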
Java Media Framework
The Java Media Framework is an extension to the standard Java Platform and is
specifically designed with media streaming in mind (12). It provides an abstract layer
between the programmer and the media itself allowing implementation of media
streams without the need for knowledge about the individual format types. It also
provides abstractions for data sources and players.
JMF is compatible with a multitude of audio and visual compression formats (See
Appendix B for full list) and is a suitable candidate for setting up the server side of this
project. It also allows the sending and receiving of RTP streams (see Streaming
Protocols), a useful feature for media streaming. JMF provides several object types to
allow the programmer to create and process media streams.
• Managers
o PlugInManager
o CaptureDeviceManager
• DataSources
• DataSinks
• Processors
• Players
The Manager class provides the fundamentals for media streaming. It can create
instances of the remaining object types. It also provides management of capture devices
and plug-ins. The JMF contains five plug-in types for processing media streams (13).
These are as follows: -
1. Demultiplexer – This parses media streams and allows extraction of the
individual component streams. This is a useful feature as a video and audio
track can be separated from each other, processed separately and sent to
different device sinks on a network.
2. Effect – This takes an input stream, performs some kind of processing on the
content and outputs the stream.
3. Codec – This takes an input stream of a particular format and outputs the
stream in a specified format leaving the content untouched.
4. Multiplexer – This takes several streams and combines them into one.
5. Renderer – This processes the media data in a track and delivers it to a
destination such as a screen or speaker.
2.7. Summary
The review of the commercially available solutions provided only a black-box viewpoint,
which is of limited use. However, the features they provide and the manner in
which they are provided are useful for gathering ideas on the front end of the system. The
features provided by these systems in terms of media processing are fairly limited.
They do not allow streams to be decomposed and processed individually, which presents
an opportunity for a unique feature to be developed in this system. The study on media
streams provides an insight into a way of accomplishing this task. Using appropriate
codecs the media can be decoded into a raw format. The use of demultiplexing modules
will allow the stream to be decomposed into its component streams. From the protocol
study it can be concluded that encoding streams into an RTP compatible format is an
appropriate way to broadcast the media over a network. The JMF provides a good
starting point for manipulating streams and provides many abstractions and useful tools
for working with media.
3. Requirements
3.1. Requirements elicitation
To specify the requirements of the system the research from the literature survey will be
drawn upon. Specifically, the way in which existing systems operate will be used to help
form the structure of the system and the functions it will perform. The research gained
from the study on media streams will be used to determine the order and nature of the
required processing. Using the knowledge gained in the literature survey it can be noted
that existing systems do not exhibit the flexibility this project is attempting to achieve.
Specifically manipulation and control of individual stream components and routing them
to different destinations is not covered by existing systems.
The requirements are separated into three sections. The basic functional requirements
describe the core functionality of the system and are the priority for this project. The
non-functional requirements describe other properties important to the system besides
functional content. The advanced requirements are optional for this project and
describe properties of the full working system.
3.2. Basic Functional Requirements
• Import a media stream into the system
o Rationale: The system must be able to import and handle media streams
in order to process and broadcast them
• Decode the media to a raw format
o Rationale: The media must be decoded to a raw format in order to
decompress it if compression has been applied. This is necessary for
splitting the media and applying effects
• Split media into components
o Rationale: The media must be split into its components in order to
process each part separately and have the option to send the streams to
different destination URLs.
• Process or apply effects to media
o Rationale: This requirement allows the streams to be manipulated in
some useful way before being sent to their destinations, e.g. equalisation
of an audio signal
• Encode media into format suitable for transmission
o Rationale: In order to stream media across a network there must be
some way of encoding it into a format which will make this possible
• Stream media to a destination URL
o Rationale: Once processing is complete the user must be able to
transmit the media to a particular destination on the network
3.3. Non-functional Requirements
• The GUI for the system must be clear, intuitive and easy to use
o Rationale: The GUI must allow the user to use the system easily and
effectively by making the functionality of the system apparent.
• The system must be platform independent
o Rationale: This will allow the system to be used on a variety of browsers
and devices.
• The client side of the system must be efficient and should run on a low
performance system
o Rationale: The user may be using the system on a PDA, which may have
low memory and processor speed in comparison with a standard
workstation
3.4. Advanced Functional Requirements
• The user must be able to log in to the system
o Rationale: The system is web based therefore some sort of log-in
process should be required so that the system can keep track of
individual settings and make sure two users are not attempting
operations at the same time.
• The user must be able to easily select sources, add effects and stream outputs
using the GUI
o Rationale: The system must allow easy access to the system
functionality through a well-made GUI
4. Design
4.1. Approach
Firstly, a high-level breakdown of the web architecture was made using the
requirements as guidance. Shklar and Rosen (14) put forward a set of guidelines for web
application functionality which are kept in mind in the web application architecture. See
Appendix B.
The required layers are as follows:-
• Web interface – the user must be able to log on remotely to the system, which
will be running on a web server in the user’s home.
• Applet Layer – this allows easy creation of a rich graphical interface and the
execution of Java code on the client machine.
• Servlet layer – in order to run code on the server we must have a means of
communicating with the server. As Applet code is downloaded to the client’s
machine before it is executed, linking it to a Java Servlet on the web server will
allow a means of controlling the server remotely.
• JMF engine – All of the core functionality will be in this layer. The Servlet layer
will make calls into the engine in order to perform the various streaming
functions of the system. The results of these calls will be passed back through
the other layers to the user’s web interface.
From a high level the application architecture will be as follows:-
The architecture for the system was designed in a modular fashion to enable the system
to be developed in distinct blocks. Communication between the client and server is
carried out using an Applet and Servlet rather than using the hybrid approach of Java
Server Pages. The Servlet in turn makes calls to the JMF engine using the JMF framework
to carry out all of the core functionality and stream management. Using this structure
the core functionality is isolated from the rest of the system and can be developed
independently. The Applet and Servlet must be able to communicate with each other
effectively in order to pass back useful data and formatted responses, which is aided by
the Java Serializable interface. This interface is a way of allowing Java objects to be read
and written to files or streamed across networks using a process called object
serialization (10). Both the client and server code run on the JVM which provides an
independent platform allowing both sides of the system to potentially run on different
operating systems. The system requirements for the Java platform are as follows:-
Windows 98/ME/2000/XP:-
• Pentium 166MHz or faster
• Minimum of 125MB free disk space
• Minimum of 32MB of RAM
On Linux the requirements are similar except less disk space is needed. The
requirements are fairly low and should enable devices with low processor speeds such
as PDAs or even mobile phones to run the necessary code.
Figure 6 – System architecture diagram
4.2. Developing a GUI
A GUI mock-up was created in order to demonstrate a potential way of using the system
and as a design template. The GUI must be simple and easy to use but also provide all
the necessary functionality. Below is the design for the front page of the GUI.
Using this design the user can select a source type using the top left combo box. This can
either be a file, a capture device or a URL. The “Browse” button can be used to choose a
file in the case of a file source. A combo box is used in the case of a capture device
source. A URL can be typed in to the “Location” combo box, or one can be chosen from a
list of previously used URLs. Once a source has been chosen, a list of available formats
will appear in the “Available Formats” list.
The next section is the Stream Control section. Here the user can pick any of the
composite or individual streams offered by the source chosen. To use a stream the user
must highlight it with the mouse then click “Use Stream”. At this point the stream will
appear as a new tab in the tab controller. In Figure 7, Audio Channel 1 and Audio
Channel 2 have been selected for use, so tabs have been created for them in the tab
controller. These new tabs allow the user to customize the streams before they are
broadcast. The streams can be customized by using any effect plug-ins the system has
such as a graphical equaliser. The output destinations for the streams must also be
selected. Once this is complete the streams can be transmitted to their destinations by
clicking the “Start Transmission” button and stopped again using the “Stop
Transmission” button.
Figure 7 – GUI Design – Stream tab
The figure above shows the details of the channel tab. When an audio stream has been
selected for use a page like this is created. It can be removed again if the user decides
not to use this stream with the “Remove Stream” button. If the user does wish to use
the stream then a destination must be selected. A destination URL may be typed in, or a
previously used destination may be selected from the list. In theory the user could select
a stream twice and send each copy to a different destination. If the user wishes to apply
effects to the stream, they can be selected from the plug-in list. Only applicable plug-ins
should be shown here depending on the stream type. Clicking on “Use Effect” should
add the plug-in to the “Plug-in Bay”, where an interface allowing control of any
parameters should appear. Plug-ins can be removed from the “Plug-in Bay” by clicking
on “Remove Effect”. The stream will have the appropriate effects applied in the order
they appear in the interface. Once all the streams have been customized in this way the
user can go back to the main page in order to start the transmission. When this happens
the JMF engine should do the following:-
• Import the source stream.
• Decompose the stream where necessary and/or make copies to isolate the
required sub streams.
• Apply the effects to each one.
• Begin transmission of the streams to their destinations at the same time.
Figure 8 – GUI Design – Channel tab
4.3. Core Functionality
The core functionality is the part of the code which deals with the media streams, their
processing and delivery to their final destinations. This code runs on the web server and
is called via the web interface. The following parameters should be passed to the stream
manager:-
• Device source – in the form of a URL corresponding to the location of the
networked source or a media file stored on the web server
• Processing options – the user must choose how to process the streams, either
individually or multiplexed
• Destinations – the user must specify destination URLs for the streams
There are two main stream types to be considered, push and pull. A push stream is one
which is initiated by the server. This could be a locally stored file or a file stored on
another web server. A pull stream is a client initiated stream. This could be in the form
of an external device such as a CD player or microphone. Push and pull streams are
processed slightly differently due to the fact that with a push stream the server knows
how long the stream will be and has a greater degree of control over it. With a pull
stream the server has no idea when or if the stream will end, so it is processed in chunks.
The user should make selections for the following:-
• Select the source from either a file or device
• Select a stream from the list of available streams. This includes:-
o Individual components of a stream e.g. Audio Channel Left
o Combined streams e.g. Interleaved Stereo
• Select a destination URL
• Select effects on the stream
When all of the above data has been inputted, the system should be able to transmit the
source stream to its destination. Using this sequence we can compose functions to
perform the necessary tasks.
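As an illustration, the user's selections above could be collected by a simple class before being handed to the JMF engine. All names here are invented for the sketch, not taken from the implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative holder for the user's selections: source, stream,
// destination and an ordered list of effects.
public class StreamSelection {
    private String sourceUrl;
    private String streamName;      // e.g. "Audio Channel Left"
    private String destinationUrl;
    private final List<String> effects = new ArrayList<>();

    void selectSource(String url) { this.sourceUrl = url; }
    void selectStream(String name) { this.streamName = name; }
    void selectDestination(String url) { this.destinationUrl = url; }
    void addEffect(String effect) { effects.add(effect); }

    // Transmission can begin once source, stream and destination are set;
    // effects remain optional.
    boolean readyToTransmit() {
        return sourceUrl != null && streamName != null && destinationUrl != null;
    }
}
```

A class like this makes the precondition explicit: the engine only starts once the mandatory selections have been made.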
The flow diagram below illustrates a possible chain of processing for an incoming media
stream. Firstly it is separated into its audio and video components using a demultiplexer.
Then both components are passed through a codec to decode them into raw formats
ready for processing. The video component has some optional processing applied and is
then ready for the next stage.
The audio channel, which is now in raw PCM form, can be split further. This is because
stereo audio files have interleaved channels. In other words, for a 16-bit stereo stream,
the first 16 bits contain the value for the left channel, the second 16 bits contain the
value for the right channel, and so on. Using values obtained from the audio format a
simple algorithm can be written to separate the streams. The necessary parameters for
this are as follows:-
• Number of channels
• Audio resolution in bits
• Whether the data is signed or unsigned
So, for a 2-channel signal using 16-bit signed values, two new streams can be created.
The first channel should start with the first 16-bit sample, then a 16-bit silence value
should be inserted before proceeding to the third 16-bit sample from the original
stream. Silence values can be derived from the resolution and also depend on whether
the samples are signed or not. For signed samples the calculation is as follows:-

Silence Value = -1 * 2^B

where B is the number of bits. For unsigned samples the silence value is 0. So, for 16-bit
signed samples the silence value is -65536. Below is an example of how the streams
would look.
would look.
Figure 9 – Stream processing Part. 1

Original Interleaved Stream:-
1009 -45 65 256 -519 2156 57 -2195
Left Channel:-
1009 -65536 65 -65536 -519 -65536 57 -65536
Right Channel:-
-65536 -45 -65536 256 -65536 2156 -65536 -2195
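The de-interleaving step described above can be sketched in a few lines. The code below assumes the samples have already been decoded into an int array and uses the document's silence value of -1 * 2^16 for 16-bit signed audio; the class and method names are illustrative.

```java
import java.util.Arrays;

public class ChannelSplitter {
    static final int SILENCE_16BIT_SIGNED = -65536; // -1 * 2^16, as derived above

    // Split an interleaved stereo stream (L, R, L, R, ...) into two
    // streams of equal length, padding the other channel's slots with
    // the silence value so the sample timing is preserved.
    static int[][] split(int[] interleaved) {
        int[] left = new int[interleaved.length];
        int[] right = new int[interleaved.length];
        for (int i = 0; i < interleaved.length; i++) {
            if (i % 2 == 0) {             // even index: left-channel sample
                left[i] = interleaved[i];
                right[i] = SILENCE_16BIT_SIGNED;
            } else {                      // odd index: right-channel sample
                left[i] = SILENCE_16BIT_SIGNED;
                right[i] = interleaved[i];
            }
        }
        return new int[][]{left, right};
    }

    public static void main(String[] args) {
        int[] original = {1009, -45, 65, 256, -519, 2156, 57, -2195};
        int[][] channels = split(original);
        System.out.println(Arrays.toString(channels[0]));
        System.out.println(Arrays.toString(channels[1]));
    }
}
```

Running this on the original interleaved stream reproduces the left- and right-channel examples shown above.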
After this processing is complete there should be three separate, post-processed
streams ready for the next stage. At this point the streams can either be encoded for
transmission and sent to their destinations over the network, as shown in the figure
below:-
Alternatively, the streams can be reassembled into one stream and transmitted to a
single destination.
Figure 10 – Stream processing Part. 2 – Individual output streams
Figure 11 – Stream processing Part. 2 – Multiplexing the stream
5. Implementation and Testing
The implementation was carried out iteratively in order to become familiar with the
tools and technology required for the project. A series of experimental pieces of code
were produced in order to acquire the necessary functionality.
Requirement:- Import a media stream into the system
This requirement can be accomplished by specifying the media source as an inputted
URL, then using JMF to create an instance of the MediaLocator object or the DataSource
object.
MediaLocator mlSource = new MediaLocator(URL);
Alternatively a DataSource object can be created using the Manager:
DataSource ds1 = Manager.createDataSource(URL);
DataSource ds2 = Manager.createDataSource(MediaLocator);
To prove that a media stream was being imported into the system and to become
familiar with the framework, the function PlayMedia was designed. The purpose of this
function was to take a URL as input and render the media at this source.
public void PlayMedia(String URL) {
    try {
        // source file
        MediaLocator mlSource = new MediaLocator(URL);
        Processor p = null;
        // GUI components
        Component controlComponent = null;
        Component visualComponent = null;
        // create processor model from source file
        // format null - automatically decide
        // content description null - render to screen
        ProcessorModel model = new ProcessorModel(mlSource, null, null);
        // create the processor from the model
        p = Manager.createRealizedProcessor(model);
        // add control panel
        if ((controlComponent = p.getControlPanelComponent()) != null) {
            System.err.println("control panel found");
            MainPanel.add("South", controlComponent);
        }
        // get the video if there is one
        if ((visualComponent = p.getVisualComponent()) != null) {
            System.err.println("visual panel found");
            MainPanel.add("Center", visualComponent);
        }
    } catch (Exception ex) {
        Log(Level.SEVERE, "Creating DS went wrong", ex);
    }
}
This function works successfully and was tested using both audio and video data. See
Appendix E for results.
Requirement:- Decode the media to a raw format
This requirement can be satisfied by a function which takes a URL as input and then
creates a Processor object and returns the output of the processor as a DataSource
object. The output of the processor can be specified using a ContentDescriptor object.
ContentDescriptor outputType = new ContentDescriptor(ContentDescriptor.RAW);
The function GetRawForm was designed and tested successfully.
public Processor GetRawForm(String URL) {
    Processor proc = null;
    MediaLocator ml = new MediaLocator(URL);
    // set output format to raw
    ContentDescriptor outputType = new ContentDescriptor(ContentDescriptor.RAW);
    // create processor model and processor using this content descriptor
    ProcessorModel model = new ProcessorModel(ml, null, outputType);
    try {
        proc = Manager.createRealizedProcessor(model);
    } catch (Exception ex) {
        Log(Level.SEVERE, "Creating Processor went wrong" + ex.toString(), ex);
        return null;
    }
    return proc;
}
Requirement:- Split media into components
There are two main stages to this problem. The first is the task of decomposing a stream
into its component tracks. This is carried out using the function below. Several file types
were tested and they were all decomposed successfully. The implementation was
achieved by extending the PushBufferDataSource.
Each stream in the track was separated into an instance of the SplitSource class,
which is listed in the appendix. The track information was then outputted.
public void SplitMedia(Processor proc) {
    PushBufferDataSource pbds = (PushBufferDataSource) proc.getDataOutput();
    PushBufferStream pbs[] = pbds.getStreams();
    SplitSource[] ss = new SplitSource[pbs.length];
    for (int i = 0; i < pbs.length; i++) {
        ss[i] = new SplitSource(proc, i);
        System.out.println("Track Info: " + ss[i].getStreamFormat().toString());
    }
}
The second part of this process is separating the channels of an interleaved audio
stream. This functionality was attempted but was not completed.
Requirement:- Process or apply effects to media
This was attempted using a plug-in acquired from the Sun Java website. However,
difficulty was encountered when trying to configure the processor to use an effect:
processors have several stages they must go through before reaching their final realized
state.
Requirement:- Encode media into format suitable for transmission
A TransmitMedia class was created to fulfil this requirement. The first function,
EncodeMediaForTransmission was designed to encode the media into an RTP
streamable format.
public void EncodeMediaForTransmission() {
    try {
        // desired track format
        Format[] form = new Format[]{new AudioFormat(AudioFormat.MPEG_RTP)};
        // desired output format
        ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
        // create processor model using source media, format and content desc
        ProcessorModel model = new ProcessorModel(mlSource, form, cd);
        // use model to create instance of processor
        p = Manager.createRealizedProcessor(model);
        // create sink from processor output and dest URL
        sink = Manager.createDataSink(p.getDataOutput(), mlDest);
    } catch (Exception ex) {
        Log(Level.SEVERE, "Creating DS went wrong" + ex.toString(), ex);
    }
}
This function was tested in conjunction with the solution for the next requirement. That
is, by transmitting a stream over the network.
Requirement:- Stream media to a destination URL
To make the TransmitMedia class useful two more functions were added to allow the
transmission to be started and stopped.
public void StartTransmission() {
    try {
        // Start transmitting
        p.start();
        sink.open();
        sink.start();
    } catch (IOException ex) {
        Log(Level.SEVERE, "Transmission Problem:" + ex.toString(), ex);
    }
}

public void StopTransmission() {
    try {
        p.stop();
        sink.stop();
    } catch (IOException ex) {
        Log(Level.SEVERE, "Transmission Problem:" + ex.toString(), ex);
    }
}
JMStudio is a tool which comes with the JMF package. It has an Open RTP Session
function which allows it to receive an incoming RTP stream. This was used to test the
transmission function.
Figure 12 – JMStudio – Setup to receive RTP stream
6. Conclusions
6.1. Evaluation
6.1.1. Evaluation against Functional Requirements
• Import a media stream into the system.
This requirement was implemented successfully within the scope of file streams.
• Decode the media to a raw format.
This requirement was implemented successfully.
• Split media into components.
This requirement was met to some degree. While the implementation allows
media streams to be demultiplexed into their component tracks, separating an
interleaved stereo or multichannel signal was not achieved.
• Process or apply effects to media.
Implementation was not entirely successful for this requirement as problems
were encountered when attempting to configure and chain processors before
they are in the realized state.
• Encode media into format suitable for transmission.
This requirement was implemented successfully within the scope of audio files.
For video streaming additional formatting problems were encountered.
• Stream media to a destination URL.
This requirement was implemented successfully, also in the scope of audio files.
6.1.2. Evaluation against Non-Functional Requirements
• The GUI for the system must be clear, intuitive and easy to use
The GUI was only partially developed and not linked to the main functionality.
• The system must be platform independent
The code was written in Java and therefore runs on the Java Virtual Machine
making it effectively platform independent.
• The client side of the system must be efficient and should run on a low
performance system
6.2. Successes
Even though problems were experienced during the implementation stages of the
project, most of the basic requirements were fulfilled. In particular, encoding an audio
source into an RTP stream and transmitting it to a multicast IP address was a particular
point of success and is essentially a core part of what this project was trying to achieve.
Decomposing streams into their component tracks was also achieved, although the
algorithm for separating an interleaved stereo stream was not completed. It was noted in
the analysis of existing systems that demultiplexing streams was not offered as a feature,
which makes achieving this more valuable. Overall some solid progress was made in this
field.
6.3. Issues
The project implementation did not reach the standard that was desired. This was partly
due to spending a lot of time researching new, unfamiliar technology, where each
piece of the system was a much larger piece of work than estimated. Attempts were
made to develop the full application and with hindsight this was clearly spreading the
available time and resources too thinly. The JMF proved quite difficult to use for
complex problems and as it is not widely used, resources to aid with problems were
relatively scarce. The core part of the project should have been the primary focus from
the outset
6.4. The Future
There is plenty of scope for future work, as the implementation achieved only some of
the possible options available. Firstly, development of a full-scale web application
would allow the system to work as it was originally intended.
The current design only allows one source stream to be used at a time. A more
advanced version of the code could be multithreaded in order to allow multiple
processors and streams to be set up at once. When JMF creates a new Processor, for
example, the call blocks and nothing else can happen during creation, which takes a
few seconds. To improve system performance, a new thread could be created for
each new Processor. The implementation as it stands uses files as the source streams.
Having access to real networked devices would have been a useful asset, as this could
have resulted in a fully working demonstration.
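The threading suggestion above can be sketched in outline. The example below is hypothetical and deliberately avoids the JMF classes: createSource is a placeholder standing in for any slow, blocking creation call such as Manager.createRealizedProcessor.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch only: runs several blocking "create" calls in parallel so the
// caller is not stalled while each one completes.
public class ParallelSetup {

    // Placeholder for a slow, blocking creation call such as
    // Manager.createRealizedProcessor (hypothetical, for illustration).
    static String createSource(String url) throws InterruptedException {
        Thread.sleep(100);
        return "processor:" + url;
    }

    public static List<String> setUpAll(List<String> urls) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(urls.size());
        try {
            List<Callable<String>> tasks = new ArrayList<>();
            for (String u : urls) {
                tasks.add(() -> createSource(u));
            }
            // all creations run concurrently instead of one after another
            List<Future<String>> futures = pool.invokeAll(tasks);
            List<String> result = new ArrayList<>();
            for (Future<String> f : futures) {
                result.add(f.get()); // collect each result once it is ready
            }
            return result;
        } finally {
            pool.shutdown();
        }
    }
}
```

With three sources, the total set-up time approaches that of the single slowest creation rather than the sum of all three.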
7. Bibliography
1. Steinberg. [Online] http://www.steinberg.net.
2. Native Instruments. [Online] http://www.native-instruments.com/.
3. TwonkyVision. [Online] http://www.twonkyvision.de.
4. Nexus. [Online] http://www.avnex.co.uk.
5. Slim Devices. [Online] http://www.slimdevices.com/.
6. MPD: Music Player Daemon. [Online] http://www.musicpd.org/.
7. JackAudio. [Online] http://www.jackaudio.org.
8. Realtime control protocol and its improvements for Internet Protocol Television.
BURGET, R and KOMOSNY, D. Brno : Department of Telecommunications, Faculty of
Engineering and Communication, UT Brno, 2006.
9. BERGSTEN, H. JavaServer Pages. s.l. : O'Reilly, 2002.
10. BARNES, B. J. Object-Oriented Programming with Java. s.l. : Prentice-Hall, 2000.
11. Applet FAQ. [Online] Sun Microsystems. http://java.sun.com/sfaq/#prevent.
12. JMF. [Online] Sun Microsystems. http://java.sun.com/products/java-
media/jmf/index.jsp.
13. SUN MICROSYSTEMS. Java Media Framework API Guide. California : s.n., 1999.
14. SHKLAR, L and ROSEN, R. Web Application Architecture. Principles, Protocols and
Practices. s.l. : John Wiley & Sons Ltd, 2003.
15. Analysis of bandwidth redistribution algorithm for single source. KOMOSNY, D and
NOVOTNY, V. s.l. : Department of Telecommunications, Brno University of Technology,
2006.
16. STEVENS, W. Richard. TCP/IP Illustrated, Volume 1. s.l. : Addison-Wesley, 2001.
17. JOHNSON, J. GUI Bloopers. s.l. : Morgan Kaufmann Publishers, 2000.
18. Java Applets. [Online] Sun Microsystems. http://java.sun.com/applets.
19. RIPLEY, M, et al. Content Protection in the Digital Home. Intel Technical Journal.
2002, Vol. 06, 04.
20. SOMMERVILLE, I. Software Engineering. s.l. : Addison-Wesley, 2004.
21. MYATT, A. Pro NetBeans IDE 5.5 Enterprise Edition. s.l. : Apress, 2007.
22. VALACICH, J.S, GEORGE, J.F and HOFFER, J.A. Essentials of Systems Analysis and
Design. s.l. : Prentice Hall, 2004.
23. SLOANE, A. Internet Multimedia. s.l. : Palgrave Macmillan, 2005.
24. Java Developers Journal. 2000, Vol. 5, 4.
25. DAWSON, C. W. The Essence of Computing Projects. A Student's Guide. s.l. : Pearson
Education Limited, 2000.
Appendix A. Glossary
The following glossary of terms is provided to help the reader understand specialist
terminology and abbreviations as used in this document. It is not intended to be
definitive.
CGI Common Gateway Interface
DAW Digital Audio Workstation
DSP Digital Signal Processing
JMF Java Media Framework
JSF JavaServer Faces
JSP JavaServer Pages
LAN Local Area Network
OS Operating System
PDA Personal Digital Assistant
RTCP Real-time Transport Control Protocol
RTP Real-time Transport Protocol
TCP Transmission Control Protocol
UDP User Datagram Protocol
VOD Video on Demand
VST Virtual Studio Technology
Appendix B. Literature Review
B.1. JMF Supported Formats
JMF supports audio sample rates from 8 kHz to 48 kHz. Note that the cross-platform
version of JMF only supports the following rates: 8, 11.025, 11.127, 16, 22.05, 22.254,
32, 44.1, and 48 kHz.
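As a small illustration of this constraint, a sender targeting the cross-platform version of JMF could validate a requested sample rate against the fixed list before building a processor. The rate list below is taken from the text above; the helper itself is a hypothetical sketch, not part of JMF.

```java
import java.util.Arrays;

// Sketch only: checks a requested rate against the sample rates supported
// by the cross-platform version of JMF, as listed above (values in kHz).
public class CrossPlatformRates {

    private static final double[] SUPPORTED_KHZ = {
        8, 11.025, 11.127, 16, 22.05, 22.254, 32, 44.1, 48
    };

    /** True when rateKHz exactly matches one of the supported rates. */
    public static boolean isSupported(double rateKHz) {
        return Arrays.stream(SUPPORTED_KHZ).anyMatch(r -> r == rateKHz);
    }
}
```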
The JMF 2.1.1 Reference Implementation supports the media types and formats listed in
the table below. In this table:
• D indicates the format can be decoded and presented.
• E indicates the media stream can be encoded in the format.
• read indicates the media type can be used as input (read from a file).
• write indicates the media type can be generated as output (written to a file).
Media Type                            JMF 2.1.1         JMF 2.1.1         JMF 2.1.1
                                      Cross-Platform    Solaris/Linux     Windows
                                      Version           Performance Pack  Performance Pack

AIFF (.aiff)                          read/write        read/write        read/write
  8-bit mono/stereo linear            D,E               D,E               D,E
  16-bit mono/stereo linear           D,E               D,E               D,E
  G.711 (U-law)                       D,E               D,E               D,E
  A-law                               D                 D                 D
  IMA4 ADPCM                          D,E               D,E               D,E

AVI (.avi)                            read/write        read/write        read/write
  Audio: 8-bit mono/stereo linear     D,E               D,E               D,E
  Audio: 16-bit mono/stereo linear    D,E               D,E               D,E
  Audio: DVI ADPCM compressed         D,E               D,E               D,E
  Audio: G.711 (U-law)                D,E               D,E               D,E
  Audio: A-law                        D                 D                 D
  Audio: GSM mono                     D,E               D,E               D,E
  Audio: ACM**                        -                 -                 D,E
  Video: Cinepak                      D                 D,E               D
  Video: MJPEG (422)                  D                 D,E               D,E
  Video: RGB                          D,E               D,E               D,E
  Video: YUV                          D,E               D,E               D,E
  Video: VCM**                        -                 -                 D,E

GSM (.gsm)                            read/write        read/write        read/write
  GSM mono audio                      D,E               D,E               D,E

HotMedia (.mvr)                       read only         read only         read only
  IBM HotMedia                        D                 D                 D

MIDI (.mid)                           read only         read only         read only
  Type 1 & 2 MIDI                     -                 D                 D

MPEG-1 Video (.mpg)                   -                 read only         read only
  Multiplexed System stream           -                 D                 D
  Video-only stream                   -                 D                 D

MPEG Layer II Audio (.mp2)            read only         read/write        read/write
  MPEG layer 1, 2 audio               D                 D,E               D,E

QuickTime (.mov)                      read/write        read/write        read/write
  Audio: 8-bit mono/stereo linear     D,E               D,E               D,E
  Audio: 16-bit mono/stereo linear    D,E               D,E               D,E
  Audio: G.711 (U-law)                D,E               D,E               D,E
  Audio: A-law                        D                 D                 D
  Audio: GSM mono                     D,E               D,E               D,E
  Audio: IMA4 ADPCM                   D,E               D,E               D,E
  Video: Cinepak                      D                 D,E               D
  Video: H.261                        -                 D                 D
  Video: H.263                        D                 D,E               D,E
  Video: JPEG (420, 422, 444)         D                 D,E               D,E
  Video: RGB                          D,E               D,E               D,E

Sun Audio (.au)                       read/write        read/write        read/write
  8-bit mono/stereo linear            D,E               D,E               D,E
  16-bit mono/stereo linear           D,E               D,E               D,E
  G.711 (U-law)                       D,E               D,E               D,E
  A-law                               D                 D                 D

Wave (.wav)                           read/write        read/write        read/write
  8-bit mono/stereo linear            D,E               D,E               D,E
  16-bit mono/stereo linear           D,E               D,E               D,E
  G.711 (U-law)                       D,E               D,E               D,E
  A-law                               D                 D                 D
  GSM mono                            D,E               D,E               D,E
  DVI ADPCM                           D,E               D,E               D,E
  MS ADPCM                            D                 D                 D
  ACM**                               -                 -                 D,E
Notes:
• ACM** - Windows Audio Compression Manager support. Tested for these
formats: A-law, GSM610, MSNAudio, MSADPCM, Truespeech, mp3, PCM,
Voxware AC8, Voxware AC10.
• VCM** - Windows Video Compression Manager support. Tested for these
formats: IV41, IV51, VGPX, WINX, YV12, I263, CRAM, MPG4.
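For reference-style lookups, part of the table above can be modeled directly in code. The sketch below is a hypothetical helper (not part of JMF) that encodes a few of the rows and answers whether a format can be encoded on a given platform.

```java
import java.util.Map;

// Sketch only: encodes a handful of rows from the JMF 2.1.1 format table.
// Key is "format/platform"; value is the table cell ("D", "D,E", or "-").
public class FormatTable {

    private static final Map<String, String> CAPS = Map.of(
        "Video: Cinepak/CrossPlatform", "D",
        "Video: Cinepak/SolarisLinux", "D,E",
        "Video: Cinepak/Windows", "D",
        "Audio: GSM mono/CrossPlatform", "D,E",
        "Video: H.261/CrossPlatform", "-",
        "Video: H.261/SolarisLinux", "D");

    /** True when the table lists "E" (encode) for this format/platform. */
    public static boolean canEncode(String format, String platform) {
        return CAPS.getOrDefault(format + "/" + platform, "-").contains("E");
    }
}
```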
Appendix C. Design
Shklar and Rosen (14) put forward the following functionality, which should be
performed by a web application:
• Interpreting and routing user requests:
The web server takes responsibility for processing the request. This
could mean carrying out some complex functionality by executing a
script or Servlet which makes further calls to code on the server.
• Controlling access to the web application:
This means providing some form of authentication to restrict access to
the site, unless the site is an unrestricted one.
• Enabling data access:
Allowing users to retrieve information from the server.
• Accessing and modifying content:
As well as accessing and displaying content, the application must be able
to perform content updates and provide feedback to the user. This can be
the result of user actions or produced automatically.
• Customizing responses:
Transforming generated responses based on some criteria. This could be
for the specific user or the browser the user is running.
• Transmitting formatted responses:
This means processing the response using technology such as an XSL
stylesheet or other form of presentation prior to transmission.
• Recording and logging application activity:
Keeping records of application usage for administrative purposes.
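Several of these responsibilities can be seen together in a minimal dispatcher. The sketch below is illustrative only (plain Java, no Servlet container): it routes a request path to a handler, applies a trivial access check, and logs the activity — roughly the first, second, and last items in Shklar and Rosen's list.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Sketch only: a toy request dispatcher showing routing, access control
// and activity logging. A real web application would delegate all of
// this to a Servlet container or framework.
public class MiniDispatcher {

    private final Map<String, Function<String, String>> routes = new HashMap<>();
    private final List<String> log = new ArrayList<>();

    public void route(String path, Function<String, String> handler) {
        routes.put(path, handler);
    }

    public String handle(String path, String user) {
        log.add(user + " -> " + path);                 // recording activity
        if (user == null) {
            return "401 Unauthorized";                 // controlling access
        }
        Function<String, String> h = routes.get(path); // routing the request
        return (h != null) ? h.apply(user) : "404 Not Found";
    }

    public List<String> activityLog() {
        return log;
    }
}
```

A media-control page might register a handler along the lines of `d.route("/play", u -> "playing for " + u)`; the handler names here are hypothetical.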
Appendix D. Implementation
TransmitMedia.java
package Proto;

import java.io.IOException;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.media.*;
import javax.media.format.AudioFormat;
import javax.media.protocol.ContentDescriptor;

public class TransmitMedia {

    Processor p = null;
    DataSink sink = null;
    // source file
    MediaLocator mlSource = null;
    // destination IP
    MediaLocator mlDest = null;

    public TransmitMedia(String medURL, String destURL) {
        this.mlSource = new MediaLocator(medURL);
        this.mlDest = new MediaLocator(destURL);
    }

    public void EncodeMediaForTransmission() {
        try {
            // desired track format
            Format[] form = new Format[] { new AudioFormat(AudioFormat.MPEG_RTP) };
            // desired output format
            ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
            // create processor model using source media, format and content desc
            ProcessorModel model = new ProcessorModel(mlSource, form, cd);
            // use model to create instance of processor
            p = Manager.createRealizedProcessor(model);
            // create sink from processor output and dest URL
            sink = Manager.createDataSink(p.getDataOutput(), mlDest);
        } catch (Exception ex) {
            Log(Level.SEVERE, "Creating DS went wrong" + ex.toString(), ex);
        }
    }

    public void StartTransmission() {
        try {
            // Start transmitting
            p.start();
            sink.open();
            sink.start();
        } catch (IOException ex) {
            Log(Level.SEVERE, "Transmission Problem:" + ex.toString(), ex);
        }
    }
    public void StopTransmission() {
        try {
            p.stop();
            sink.stop();
        } catch (IOException ex) {
            Log(Level.SEVERE, "Transmission Problem:" + ex.toString(), ex);
        }
    }

    public void Log(Level l, String msg) {
        Logger.getLogger(Prototype.class.getName()).log(l, msg);
    }

    public void Log(Level l, String msg, Object ex) {
        Logger.getLogger(Prototype.class.getName()).log(l, msg, ex);
    }
}
SplitDataSource.java
package Proto;

import java.io.IOException;
import javax.media.Control;
import javax.media.Format;
import javax.media.MediaLocator;
import javax.media.Processor;
import javax.media.Time;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.PushBufferDataSource;
import javax.media.protocol.PushBufferStream;

public class SplitDataSource extends PushBufferDataSource {

    Processor p;
    PushBufferDataSource ds;
    PushBufferStream pbs[];
    SplitStream streams[];
    int idx;
    boolean done = false;

    public SplitDataSource(Processor p, int idx) {
        this.p = p;
        this.ds = (PushBufferDataSource) p.getDataOutput();
        this.idx = idx;
        pbs = ds.getStreams();
        streams = new SplitStream[1];
        streams[0] = new SplitStream(pbs[idx]);
    }

    public Format getStreamFormat() {
        return pbs[idx].getFormat();
    }

    public MediaLocator getLocator() {
        return ds.getLocator();
    }

    @Override
    public PushBufferStream[] getStreams() {
        return streams;
    }

    @Override
    public String getContentType() {
        return ContentDescriptor.RAW;
    }

    @Override
    public void connect() throws IOException {
        throw new UnsupportedOperationException("Not supported yet.");
    }

    @Override
    public void disconnect() {
        throw new UnsupportedOperationException("Not supported yet.");
    }

    @Override
    public void start() throws IOException {
        p.start();
        ds.start();
    }

    @Override
    public void stop() throws IOException {
        p.stop();
        ds.stop();
    }

    @Override
    public Object getControl(String arg0) {
        return new Control[0];
    }

    @Override
    public Object[] getControls() {
        return null;
    }

    @Override
    public Time getDuration() {
        return ds.getDuration();
    }
}
SplitStream.java
package Proto;

import javax.media.Buffer;
import javax.media.Control;
import javax.media.Format;
import javax.media.protocol.BufferTransferHandler;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.PushBufferStream;

class SplitStream implements PushBufferStream, BufferTransferHandler {

    PushBufferStream pbs;
    BufferTransferHandler bth;
    Format format;
    public SplitStream(PushBufferStream pbs) {
        this.pbs = pbs;
        pbs.setTransferHandler(this);
    }

    public void read(Buffer buf) {
    }

    public ContentDescriptor getContentDescriptor() {
        return new ContentDescriptor(ContentDescriptor.RAW);
    }

    public boolean endOfStream() {
        return pbs.endOfStream();
    }

    public long getContentLength() {
        return LENGTH_UNKNOWN;
    }

    public Format getFormat() {
        return pbs.getFormat();
    }

    public void setTransferHandler(BufferTransferHandler bth) {
        this.bth = bth;
    }

    public Object getControl(String name) {
        // No controls
        return null;
    }

    public Object[] getControls() {
        // No controls
        return new Control[0];
    }

    public synchronized void transferData(PushBufferStream pbs) {
        if (bth != null) {
            bth.transferData(pbs);
        }
    }
}
AudioChannelSplitter.java
package prototype;
import javax.media.Buffer;
import javax.media.format.AudioFormat;
import javax.media.protocol.PushBufferStream;

public class AudioChannelSplitter {

    private AudioFormat f;
    private int iChannels;
    private int iSilenceValue;
    private int iBits;
    private PushBufferStream pbs;

    public AudioChannelSplitter(PushBufferStream pbs) {
        f = (AudioFormat) pbs.getFormat();
        this.pbs = pbs;
        this.iChannels = f.getChannels();
        this.iBits = f.getSampleSizeInBits();
        if (f.getSigned() == 1) {
            iSilenceValue = 0;
        } else {
            iSilenceValue = (int) (-1 * Math.pow(2.0, (double) iBits));
        }
        System.err.println("CHANNELS: " + iChannels);
        System.err.println("SILENCE: " + iSilenceValue);
        System.err.println("BITS: " + iBits);
    }

    // NOTE: unfinished - the de-interleaving of left/right samples was
    // never completed (see section 6.2); the method currently copies the
    // buffer and returns null.
    public synchronized PushBufferStream[] StereoToMono() {
        Buffer left = new Buffer();
        Buffer right = new Buffer();
        try {
            pbs.read(left);
            right.copy(left);
        } catch (Exception ex) {
            System.err.println("AudioSplit buffer problem");
        }
        return null;
    }
}
PrototypeApplet.java
package Proto;

import java.awt.Component;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.media.DataSink;
import javax.media.Format;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Processor;
import javax.media.ProcessorModel;
import javax.media.format.AudioFormat;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.PushBufferDataSource;
import javax.media.protocol.PushBufferStream;

/**
 * @author James Presly
 */
public class PrototypeApplet extends javax.swing.JApplet {

    public void Main() {
    }

    public void SplitMedia(Processor proc) {
        try {
            PushBufferDataSource pbds = (PushBufferDataSource) proc.getDataOutput();
            PushBufferStream[] pbs = pbds.getStreams();
            SplitDataSource[] ss = new SplitDataSource[pbs.length];
            for (int i = 0; i < pbs.length; i++) {
                ss[i] = new SplitDataSource(proc, i);
                System.out.println("Track Info: " + ss[i].getStreamFormat().toString());
            }
        } catch (Exception ex) {
            Log(Level.SEVERE, "Creating DS went wrong" + ex.toString(), ex);
        }
    }

    public void TransmitMedia(String medURL, String destURL) {
        try {
            // source file
            MediaLocator mlSource = new MediaLocator(medURL);
            // destination IP
            MediaLocator mlDest = new MediaLocator(destURL);
            Processor p = null;
            DataSink sink = null;
            // desired track format
            Format[] form = new Format[] { new AudioFormat(AudioFormat.MPEG_RTP) };
            // desired output format
            ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
            // create processor model using source media, format and content desc
            ProcessorModel model = new ProcessorModel(mlSource, form, cd);
            // use model to create instance of processor
            p = Manager.createRealizedProcessor(model);
            // create sink from processor output and dest URL
            sink = Manager.createDataSink(p.getDataOutput(), mlDest);
            // Start transmitting
            p.start();
            sink.open();
            sink.start();
        } catch (Exception ex) {
            Log(Level.SEVERE, "Creating DS went wrong" + ex.toString(), ex);
        }
    }

    public void PlayMedia(String URL) {
        try {
            // source file
            MediaLocator mlSource = new MediaLocator("file://c:/airplane.wav");
            Processor p = null;
            // GUI components
            Component controlComponent = null;
            Component visualComponent = null;
            // create processor model from source file
            // format null - automatically decide
            // content description null - render to screen
            ProcessorModel model = new ProcessorModel(mlSource, null, null);
            // create the processor from the model
            p = (Processor) Manager.createRealizedProcessor(model);
            // add control panel
            if ((controlComponent = p.getControlPanelComponent()) != null) {
                System.err.println("control panel found");
                MainPanel.add("South", controlComponent);
            }
            // get the video if there is one
            if ((visualComponent = p.getVisualComponent()) != null) {
                System.err.println("visual panel found");
                MainPanel.add("Center", visualComponent);
            }
        } catch (Exception ex) {
            Log(Level.SEVERE, "Creating DS went wrong" + ex.toString(), ex);
        }
    }

    public Processor GetRawForm(String URL) {
        Processor proc = null;
        MediaLocator ml = new MediaLocator(URL);
        // set output format to raw
        Format[] form = new Format[] { new AudioFormat(AudioFormat.LINEAR) };
        ContentDescriptor outputType = new ContentDescriptor(ContentDescriptor.RAW);
        // create processor model and processor using this content descriptor
        ProcessorModel model = new ProcessorModel(ml, null, outputType);
        try {
            proc = Manager.createRealizedProcessor(model);
        } catch (Exception ex) {
            Log(Level.SEVERE, "Creating Processor went wrong" + ex.toString(), ex);
            return null;
        }
        return proc;
    }

    public void Log(Level l, String msg) {
        Logger.getLogger(Prototype.class.getName()).log(l, msg);
    }

    public void Log(Level l, String msg, Object ex) {
        Logger.getLogger(Prototype.class.getName()).log(l, msg, ex);
    }

    /** Initializes the applet Prototype */
    public void init() {
        try {
            java.awt.EventQueue.invokeAndWait(new Runnable() {
                public void run() {
                    initComponents();
                    Main();
                }
            });
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
Prototype_Effect.java
package Proto;

import java.awt.Component;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.media.Codec;
import javax.media.Effect;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Processor;
import javax.media.ProcessorModel;
import javax.media.control.TrackControl;
import javax.media.format.AudioFormat;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.DataSource;

public class Prototype_Effect extends javax.swing.JApplet {

    MediaLocator ml = new MediaLocator("file://c:/airplane.wav");
    Processor p = null;
    Processor q = null;
    DataSource ds = null;
    Component controlComponent = null;
    Component visualComponent = null;

    public void Main() {
        try {
            Effect e = new GainEffect();
            AudioFormat afs[] = new AudioFormat[1];
            afs[0] = new AudioFormat(AudioFormat.LINEAR, 44100, 16, 2);
            ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW);
            ProcessorModel model = new ProcessorModel(ml, afs, null);
            q = Manager.createRealizedProcessor(model);
            ds = q.getDataOutput();
            p = getConfiguredProc(ds);
            TrackControl[] tc = p.getTrackControls();
            Codec[] c = new Codec[1];
            c[0] = e;
            tc[0].setCodecChain(c);
            // add control panel
            if ((controlComponent = p.getControlPanelComponent()) != null) {
                System.err.println("control panel found");
                MainPanel.add("South", controlComponent);
            }
            // get the video if there is one
            if ((visualComponent = p.getVisualComponent()) != null) {
                System.err.println("visual panel found");
                MainPanel.add("Center", visualComponent);
            }
        } catch (Exception ex) {
            Log(Level.SEVERE, "Creating DS went wrong " + ex.toString(), ex);
        }
    }

    public Processor getConfiguredProc(DataSource ds) throws InterruptedException {
        Processor p = null;
        try {
            p = Manager.createProcessor(ds);
        } catch (Exception e) {
            return null;
        }
        p.configure();
        return p;
    }

    public void Log(Level l, String msg) {
        Logger.getLogger(Prototype.class.getName()).log(l, msg);
    }

    public void Log(Level l, String msg, Object ex) {
        Logger.getLogger(Prototype.class.getName()).log(l, msg, ex);
    }

    /** Initializes the applet Prototype */
    public void init() {
        try {
            java.awt.EventQueue.invokeAndWait(new Runnable() {
                public void run() {
                    initComponents();
                    Main();
                }
            });
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
GainEffect.java – http://java.sun.com/
/*
 * @(#)GainEffect.java 1.4 01/03/13
 *
 * Copyright (c) 1999-2001 Sun Microsystems, Inc. All Rights Reserved.
 *
 * Sun grants you ("Licensee") a non-exclusive, royalty free, license to use,
 * modify and redistribute this software in source and binary code form,
 * provided that i) this copyright notice and license appear on all copies of
 * the software; and ii) Licensee does not utilize the software in a manner
 * which is disparaging to Sun.
 *
 * This software is provided "AS IS," without a warranty of any kind. ALL
 * EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY
 * IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR
 * NON-INFRINGEMENT, ARE HEREBY EXCLUDED. SUN AND ITS LICENSORS SHALL NOT BE
 * LIABLE FOR ANY DAMAGES SUFFERED BY LICENSEE AS A RESULT OF USING, MODIFYING
 * OR DISTRIBUTING THE SOFTWARE OR ITS DERIVATIVES. IN NO EVENT WILL SUN OR ITS
 * LICENSORS BE LIABLE FOR ANY LOST REVENUE, PROFIT OR DATA, OR FOR DIRECT,
 * INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL OR PUNITIVE DAMAGES, HOWEVER
 * CAUSED AND REGARDLESS OF THE THEORY OF LIABILITY, ARISING OUT OF THE USE OF
 * OR INABILITY TO USE SOFTWARE, EVEN IF SUN HAS BEEN ADVISED OF THE
 * POSSIBILITY OF SUCH DAMAGES.
 *
 * This software is not designed or intended for use in on-line control of
 * aircraft, air traffic, aircraft navigation or aircraft communications; or in
 * the design, construction, operation or maintenance of any nuclear
 * facility. Licensee represents and warrants that it will not use or
 * redistribute the Software for such purposes.
 */
package Proto;

import javax.media.*;
import javax.media.format.*;

public class GainEffect implements Effect {

    /** The effect name **/
    private static String EffectName = "GainEffect";
    /** chosen input Format **/
    protected AudioFormat inputFormat;

    /** chosen output Format **/
    protected AudioFormat outputFormat;

    /** supported input Formats **/
    protected Format[] supportedInputFormats = new Format[0];

    /** supported output Formats **/
    protected Format[] supportedOutputFormats = new Format[0];

    /** selected Gain **/
    protected float gain = 1.2F;

    /**
     * initialize the formats
     */
    public GainEffect() {
        supportedInputFormats = new Format[] {
            new AudioFormat(
                AudioFormat.LINEAR,
                Format.NOT_SPECIFIED,
                16,
                Format.NOT_SPECIFIED,
                AudioFormat.BIG_ENDIAN,
                AudioFormat.SIGNED,
                16,
                Format.NOT_SPECIFIED,
                Format.byteArray)
        };
        supportedOutputFormats = new Format[] {
            new AudioFormat(
                AudioFormat.LINEAR,
                Format.NOT_SPECIFIED,
                16,
                Format.NOT_SPECIFIED,
                AudioFormat.BIG_ENDIAN,
                AudioFormat.SIGNED,
                16,
                Format.NOT_SPECIFIED,
                Format.byteArray)
        };
        System.err.println("Set gain by: " + gain);
    }

    /**
     * get the resources needed by this effect
     */
    public void open() throws ResourceUnavailableException {
    }

    /**
     * free the resources allocated by this codec
     */
    public void close() {
    }

    /**
     * reset the codec
     */
    public void reset() {
    }

    /**
     * no controls for this simple effect
     */
    public Object[] getControls() {
        return (Object[]) new Control[0];
    }

    /**
     * Return the control based on a control type for the effect.
     */
    public Object getControl(String controlType) {
        try {
            Class cls = Class.forName(controlType);
            Object cs[] = getControls();
            for (int i = 0; i < cs.length; i++) {
                if (cls.isInstance(cs[i]))
                    return cs[i];
            }
            return null;
        } catch (Exception e) {
            // no such controlType or such control
            return null;
        }
    }

    /************** format methods *************/

    /**
     * set the input format
     */
    public Format setInputFormat(Format input) {
        // the following code assumes valid Format
        inputFormat = (AudioFormat) input;
        return (Format) inputFormat;
    }

    /**
     * set the output format
     */
    public Format setOutputFormat(Format output) {
        // the following code assumes valid Format
        outputFormat = (AudioFormat) output;
        return (Format) outputFormat;
    }

    /**
     * get the input format
     */
    protected Format getInputFormat() {
        return inputFormat;
    }

    /**
     * get the output format
     */
    protected Format getOutputFormat() {
        return outputFormat;
    }

    /**
     * supported input formats
     */
    public Format[] getSupportedInputFormats() {
        return supportedInputFormats;
    }

    /**
     * output Formats for the selected input format
     */
    public Format[] getSupportedOutputFormats(Format in) {
        if (!(in instanceof AudioFormat))
            return new Format[0];
        AudioFormat iaf = (AudioFormat) in;
        if (!iaf.matches(supportedInputFormats[0]))
            return new Format[0];
        AudioFormat oaf = new AudioFormat(
            AudioFormat.LINEAR,
            iaf.getSampleRate(),
            16,
            iaf.getChannels(),
            AudioFormat.BIG_ENDIAN,
            AudioFormat.SIGNED,
            16,
            Format.NOT_SPECIFIED,
            Format.byteArray);
        return new Format[] { oaf };
    }

    /**
     * gain accessor method
     */
    public void setGain(float newGain) {
        gain = newGain;
    }

    /**
     * return effect name
     */
    public String getName() {
        return EffectName;
    }

    /**
     * do the processing
     */
    public int process(Buffer inputBuffer, Buffer outputBuffer) {
        // == prolog
        byte[] inData = (byte[]) inputBuffer.getData();
        int inLength = inputBuffer.getLength();
        int inOffset = inputBuffer.getOffset();
        byte[] outData = validateByteArraySize(outputBuffer, inLength);
        int outOffset = outputBuffer.getOffset();
        int j = outOffset;
        int outLength = inLength;
        int samplesNumber = inLength / 2;

        // == main
        int tempH, tempL;
        short sample;
        for (int i = 0; i < samplesNumber; i++) {
            tempH = inData[inOffset++] & 0xff;
            tempL = inData[inOffset++] & 0xff;
            sample = (short) ((tempH << 8) | tempL);
            sample = (short) (sample * gain);
            outData[j++] = (byte) (sample >> 8);
            outData[j++] = (byte) (sample & 0xff);
        }

        // == epilog
        updateOutput(outputBuffer, outputFormat, outLength, outOffset);
        return BUFFER_PROCESSED_OK;
    }

    /**
     * Utility: validate that the Buffer object's data size is at least
     * newSize bytes.
     * @return array with sufficient capacity
     **/
    protected byte[] validateByteArraySize(Buffer buffer, int newSize) {
        Object objectArray = buffer.getData();
        byte[] typedArray;
        if (objectArray instanceof byte[]) {
            // is correct type AND not null
            typedArray = (byte[]) objectArray;
            if (typedArray.length >= newSize) {
                // is sufficient capacity
                return typedArray;
            }
        }
        typedArray = new byte[newSize];
        buffer.setData(typedArray);
        return typedArray;
    }

    /**
     * utility: update the output buffer fields
     */
    protected void updateOutput(Buffer outputBuffer, Format format,
                                int length, int offset) {
        outputBuffer.setFormat(format);
        outputBuffer.setLength(length);
        outputBuffer.setOffset(offset);
    }
}
Appendix E. Testing
Test 1: - Rendering media
Requirement: - Import a media stream into the system
Input: 1
"file://c:/airplane.wav"
Result: 1
Input: 2
file://c:/angel.avi
Result: 2
Test 2: - Transcoding media
Requirement: - Decode the media to a raw format
Input: 1
file://c:/angel.avi
Result: 1
com.sun.media.multiplexer.RawBufferMux$RawBufferDataSource@88df60
Test 3: - Demultiplexing media
Requirement: - Split media into components
Input: 1
"file://c:/creme.mov"
Result: 1
Track Info: AVC1, 640x360, FrameRate=29.9, Length=10339
Track Info: mp4a, 22050.0 Hz, 16-bit, Mono, BigEndian, Signed, FrameSize=16 bits
Track Info: mp4a/rtp, 22050.0 Hz, 8-bit, Mono
Input: 2
file://c:/intro.mov
Result: 2
Track Info: SVQ3, 720x480, FrameRate=15.0, Length=95470
Track Info: mp4a, 44100.0 Hz, 16-bit, Stereo, BigEndian, Signed, FrameSize=32 bits
Input: 3
"file://c:/angel.avi"
Result: 3
Track Info: DIVX, 480x360, FrameRate=25.0, Length=518400 0 extra bytes
Track Info: mpeglayer3, 44100.0 Hz, 0-bit, Stereo, Unsigned, 15999.0 frame rate,
FrameSize=9216 bits
Test 4: - Processing media
Requirement: - Process or apply effects to media
Test 5: - Encoding media for transmission
Requirement: - Encode media into format suitable for transmission
Test 6: - Transmitting media
Requirement: - Stream media to a destination URL
Input: 1
file://c:/airplane.wav
rtp://224.0.0.1:4000/audio
Result: 1
Media opened by JMStudio on IP 224.0.0.1:4000