<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml"><head><meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /><title>The Be Book - System Overview - The Media Kit</title><link rel="stylesheet" href="be_book.css" type="text/css" media="all" /><link rel="shortcut icon" type="image/vnd.microsoft.icon" href="./images/favicon.ico" /><!--[if IE]>
<link rel="stylesheet" type="text/css" href="be_book_ie.css" />
<![endif]--><meta name="generator" content="DocBook XSL Stylesheets V1.73.2" /><meta name="keywords" content="Access, BeOS, BeBook, API" /><link rel="start" href="index.html" title="The Be Book" /><link rel="up" href="TheMediaKit_Overview.html" title="The Media Kit" /><link rel="prev" href="TheMediaKit_Overview.html" title="The Media Kit" /><link rel="next" href="TheMediaKit_Overview_ReadingWriting.html" title="Reading and Writing Media Files" /></head><body><div id="header"><div id="headerT"><div id="headerTL"><a accesskey="p" href="TheMediaKit_Overview.html" title="The Media Kit"><img src="./images/navigation/prev.png" alt="Prev" /></a> <a accesskey="u" href="TheMediaKit_Overview.html" title="The Media Kit"><img src="./images/navigation/up.png" alt="Up" /></a> <a accesskey="n" href="TheMediaKit_Overview_ReadingWriting.html" title="Reading and Writing Media Files"><img src="./images/navigation/next.png" alt="Next" /></a></div><div id="headerTR"><div id="navigpeople"><a href="http://www.haiku-os.org"><img src="./images/People_24.png" alt="haiku-os.org" title="Visit The Haiku Website" /></a></div><div class="navighome" title="Home"><a accesskey="h" href="index.html"><img src="./images/navigation/home.png" alt="Home" /></a></div><div class="navigboxed" id="navigindex"><a accesskey="i" href="ClassIndex.html" title="Index">I</a></div><div class="navigboxed" id="naviglang" title="English">en</div></div><div id="headerTC">The Be Book - System Overview - The Media Kit</div></div><div id="headerB">Prev: <a href="TheMediaKit_Overview.html">The Media Kit</a>  Up: <a href="TheMediaKit_Overview.html">The Media Kit</a>  Next: <a href="TheMediaKit_Overview_ReadingWriting.html">Reading and Writing Media Files</a></div><hr /></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h2 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_Overview_Introduction"></a>Introduction</h2></div></div></div><p>The Media Kit provides powerful support for all forms of media
(including, but not limited to, audio and video), with both playback
and recording from and to a wide variety of media and devices.</p><p>There are two levels of Media Kit programming: the high level, where an
application accesses the Media Kit to play and record sound and video,
and the low level, which involves actually creating the nodes that
manipulate media data.</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><hr /><div xmlns:d="http://docbook.org/ns/docbook"><h3 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_Concepts"></a>Media Kit Concepts</h3></div></div></div><p>This section is a general overview of key Media Kit concepts.</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_Nodes"></a>Nodes</h4></div></div></div><p>The first thing that you need to understand as a media programmer working
with the BeOS is the concept of a node. Generically speaking, a node is a
specialized object in the media system that processes buffers of media
data. All nodes are indirectly derived from the
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>
class (but never directly from
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>;
instead, nodes are derived from specialized node type classes).</p><p>Nodes can be loaded from add-on modules, or they can be created within an
application itself. See the
<a class="link" href="BMediaAddOn.html" title="BMediaAddOn"><code class="classname">BMediaAddOn</code></a>
class for details.</p><p>The node kind (defined by the
<a class="link" href="TheMediaKit_Constants.html#Enums_node_kind" title="node_kind"><span class="type">node_kind</span></a>
type) is a description of the basic capabilities of a node.
See "<a class="xref" href="TheMediaKit_Overview_Introduction.html#TheMediaKit_TypesOfNodes" title="Types of Nodes">Types of Nodes</a>".</p><p>A <a class="link" href="TheMediaKit_DefinedTypes.html#media_node" title="media_node"><span class="type">media_node</span></a>
is a structure that an application uses when working with
media nodes; interactions with the Media Roster are almost always done
using this structure instead of an object actually derived from
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>.
The reason for this is that applications will often need to
share nodes, and due to protected memory, it's not really feasible to
pass a
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>
pointer among the applications.</p><div class="admonition note"><div class="title">Note</div><div class="graphic"><img class="icon" alt="Note" width="32" src="./images/admonitions/Info_32.png" /><div class="text"><p>No portion of the media node protocol is optional; if you don't follow
all the rules precisely, you risk hurting media performance, and since
BeOS is the Media OS, that would be a bad thing.</p></div></div></div><p>For more detailed information about the architecture of the Media Kit (in
particular, how nodes relate to one another), please see the
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>
class.</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TypesOfNodes"></a>Types of Nodes</h5></div></div></div><p>There are several basic kinds of nodes. Each of them is derived
originally from the <a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>
class. Any nodes that you might implement
will be derived, in turn, from one or more of these node types. The node
kind indicates which of these types a node implements.</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h6 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TypesOfNodes_Producers"></a>Producers</h6></div></div></div><p>A producer (a node derived from the
<a class="link" href="BBufferProducer.html" title="BBufferProducer"><code class="classname">BBufferProducer</code></a> class) outputs media
buffers, which are then received by consumers. A producer might be
generating buffers on its own (for example, a tone generator might be using
a mathematical formula to generate a sound, or an audio file player might
be loading data from disk and sending buffers containing its audio data).
Other producers might be responsible for acquiring data from media
hardware (such as a video camera) and passing the media buffers to
consumers down the line.</p><p>In the example in the
"<a class="xref" href="TheMediaKit_Overview_Introduction.html#TheMediaKit_CommunicatingWithNodes" title="Communicating With Nodes">Communicating With Nodes</a>" section, the sound file
reader node is an example of a producer.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h6 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TypesOfNodes_Consumers"></a>Consumers</h6></div></div></div><p>Consumers (nodes derived from
<a class="link" href="BBufferConsumer.html" title="BBufferConsumer"><code class="classname">BBufferConsumer</code></a>,
receive buffers from a
producer and process them in some manner. For example, a sound card's
software would provide a consumer node that receives audio buffers and
plays them through the card's hardware.</p><p>In the example in the
"<a class="xref" href="TheMediaKit_Overview_Introduction.html#TheMediaKit_CommunicatingWithNodes" title="Communicating With Nodes">Communicating With Nodes</a>"
section, the sound player node is an example of a consumer.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h6 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TypesOfNodes_Filters"></a>Consumer/Producers (Filters)</h6></div></div></div><p>A consumer/producer (a node that derives from both
<a class="link" href="BBufferConsumer.html" title="BBufferConsumer"><code class="classname">BBufferConsumer</code></a> and
<a class="link" href="BBufferProducer.html" title="BBufferProducer"><code class="classname">BBufferProducer</code></a>)
is also called a filter. A filter accepts buffers (like
a consumer), processes them in some manner, then sends them back out
again (like a producer). This can be used to alter sound or video data.</p><p>For example, an audio filter might add a reverb effect to sound buffers,
or a video filter might add captioning to a video stream.</p><p>In the example in the
"<a class="xref" href="TheMediaKit_Overview_Introduction.html#TheMediaKit_CommunicatingWithNodes" title="Communicating With Nodes">Communicating With Nodes</a>"
section, the equalizer node is an example of a filter node.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h6 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_Controllables"></a>Controllables</h6></div></div></div><p>If a node wishes to provide the user options for configuring its
functionality, the node can derive from
<a class="link" href="BControllable.html" title="BControllable"><code class="classname">BControllable</code></a>. This provides
features for creating a network of controllable parameters, and for
publishing this information to Media Kit-savvy applications (including
the Media preference applications).</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h6 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TimeSources"></a>Time Sources</h6></div></div></div><p>A time source node broadcasts timing information that can be used by
other nodes. All nodes are slaved to a time source, which provides
synchronization among all nodes slaved to that time source. Typically,
applications won't need to worry about this, because any node created
through the
<a class="link" href="BMediaRoster.html" title="BMediaRoster"><code class="classname">BMediaRoster</code></a>
class is automatically slaved to the system
(default) time source.</p><div class="admonition note"><div class="title">Note</div><div class="graphic"><img class="icon" alt="Note" width="32" src="./images/admonitions/Info_32.png" /><div class="text"><p>A node can be derived from any of these types
(<a class="link" href="BBufferProducer.html" title="BBufferProducer"><code class="classname">BBufferProducer</code></a>,
<a class="link" href="BBufferConsumer.html" title="BBufferConsumer"><code class="classname">BBufferConsumer</code></a>,
<a class="link" href="BTimeSource.html" title="BTimeSource"><code class="classname">BTimeSource</code></a>, and
<a class="link" href="BControllable.html" title="BControllable"><code class="classname">BControllable</code></a>),
in any combination, as appropriate.</p></div></div></div></div></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TheNodeKind"></a>The Node Kind</h5></div></div></div><p>The <a class="link" href="TheMediaKit_Constants.html#Enums_node_kind" title="node_kind"><span class="type">node_kind</span></a>
type is used to identify which of these interfaces a node
implements; this lets the Media Kit know which APIs your node supports.
These flags include <code class="constant">B_BUFFER_PRODUCER</code>,
<code class="constant">B_BUFFER_CONSUMER</code>, <code class="constant">B_TIME_SOURCE</code>,
and <code class="constant">B_FILE_INTERFACE</code>.</p><p>There are other flags available in the node kind; these indicate special
node functions supported by the node. These are <code class="constant">B_PHYSICAL_INPUT</code>, which
indicates that the node is a physical input device, <code class="constant">B_PHYSICAL_OUTPUT</code>,
which indicates that the node is a physical output device, and
<code class="constant">B_SYSTEM_MIXER</code>, which indicates that the node is the system mixer.</p><p>The primary purpose to these flags is to help user interfaces determine
how to draw the interface for these nodes; they get special icons in the
Media preference application, for example.</p>
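<p>For example, an application that wants to single out the physical output devices among the running nodes might test these bits directly. This is only a sketch; it assumes the Media Roster is available and uses <code class="methodname">GetLiveNodes()</code> to fill in the array:</p><pre class="programlisting example cpp">live_node_info info[32];
int32 count = 32;
if (BMediaRoster::Roster()-&gt;GetLiveNodes(info, &amp;count) == B_OK) {
    for (int32 i = 0; i &lt; count; i++) {
        if (info[i].node.kind &amp; B_PHYSICAL_OUTPUT) {
            /* this node drives real hardware, such as a sound card output */
        }
    }
}</pre></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_DerivingFromMultipleClasses"></a>Deriving From Multiple Classes</h5></div></div></div><p>For example, if you're creating a sound card node that plays audio to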
stereo speakers, you would need to derive from
<a class="link" href="BBufferConsumer.html" title="BBufferConsumer"><code class="classname">BBufferConsumer</code></a> (in order
to receive audio buffers). You could also derive from
<a class="link" href="BTimeSource.html" title="BTimeSource"><code class="classname">BTimeSource</code></a> if your
sound card has the ability to provide timing information to others. And
if you want the user to be able to control the volume, balance, and so
forth, you would also derive from
<a class="link" href="BControllable.html" title="BControllable"><code class="classname">BControllable</code></a>.</p><p>If your sound card also provides a digitizer input, you would actually
create a second node to support that feature. It would inherit from
<a class="link" href="BBufferProducer.html" title="BBufferProducer"><code class="classname">BBufferProducer</code></a>
(so it can generate audio buffers for other nodes to
use). It might also derive from
<a class="link" href="BTimeSource.html" title="BTimeSource"><code class="classname">BTimeSource</code></a> and
<a class="link" href="BControllable.html" title="BControllable"><code class="classname">BControllable</code></a>.</p><p>But not all nodes necessarily represent a physical hardware device. If
you want to create a filter—for example, a noise-reduction
filter—you can create a node to do this too. Simply derive from
both <a class="link" href="BBufferConsumer.html" title="BBufferConsumer"><code class="classname">BBufferConsumer</code></a>
(so you can receive buffers) and
<a class="link" href="BBufferProducer.html" title="BBufferProducer"><code class="classname">BBufferProducer</code></a>
(so you can send back out the altered buffers).</p></div></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_SourceDestinationvsOutputInput"></a>Source &amp; Destination vs. Output &amp; Input</h4></div></div></div><p>Beginning Media Kit programmers may have trouble understanding the
difference between a
<a class="link" href="TheMediaKit_DefinedTypes.html#media_source" title="media_source"><span class="type">media_source</span></a> and a
<a class="link" href="TheMediaKit_DefinedTypes.html#media_output" title="media_output"><span class="type">media_output</span></a>, and a
<a class="link" href="TheMediaKit_DefinedTypes.html#media_destination" title="media_destination"><span class="type">media_destination</span></a>
and a
<a class="link" href="TheMediaKit_DefinedTypes.html#media_input" title="media_input"><span class="type">media_input</span></a>.</p><p>The <a class="link" href="TheMediaKit_DefinedTypes.html#media_source" title="media_source"><span class="type">media_source</span></a> and
<a class="link" href="TheMediaKit_DefinedTypes.html#media_destination" title="media_destination"><span class="type">media_destination</span></a>
structures describe a "socket" of
sorts (much like in networking). These are the ends of the connection,
much like the jacks you might plug cables into to connect various
components of a stereo system. They're relatively small, lightweight
descriptions containing only the information needed during real-time
manipulation of nodes. Buffers travel from the source to the destination.
You can use the basic operators (=, ==, and !=) on these objects.</p><p><span class="code"><span class="type">media_source</span>::<code class="constant">null</code></span> and
<span class="code"><span class="type">media_destination</span>::<code class="constant">null</code></span>
represent uninitialized endpoints.</p>
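<p>Together with the comparison operators, these null values let you check whether an endpoint has been hooked up yet; a minimal sketch:</p><pre class="programlisting example cpp">media_source source = media_source::null;
media_destination destination = media_destination::null;

/* ...fill these in as connections are made... */

if (source != media_source::null &amp;&amp; destination != media_destination::null) {
    /* both ends of this connection have been hooked up */
}</pre><p>The <a class="link" href="TheMediaKit_DefinedTypes.html#media_output" title="media_output"><span class="type">media_output</span></a>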
and <a class="link" href="TheMediaKit_DefinedTypes.html#media_input" title="media_input"><span class="type">media_input</span></a>
structures describe an actual connection between a
<a class="link" href="TheMediaKit_DefinedTypes.html#media_source" title="media_source"><span class="type">media_source</span></a>
and a
<a class="link" href="TheMediaKit_DefinedTypes.html#media_destination" title="media_destination"><span class="type">media_destination</span></a>,
including the source and
destination, the connection's name, and the format of the data the
connection is intended to handle. These are larger, and contain
additional information needed when presenting a user interface describing
the connections between nodes.</p><p>The <a class="link" href="TheMediaKit_DefinedTypes.html#media_input" title="media_input"><span class="type">media_input</span></a>
structure describes the receiving (downstream) end of a
connection;
<a class="link" href="TheMediaKit_DefinedTypes.html#media_output" title="media_output"><span class="type">media_output</span></a>
describes the sender (the upstream end).</p><p>Although <a class="link" href="TheMediaKit_DefinedTypes.html#media_output" title="media_output"><span class="type">media_output</span></a>
and <a class="link" href="TheMediaKit_DefinedTypes.html#media_input" title="media_input"><span class="type">media_input</span></a>
contain all the information of the
<a class="link" href="TheMediaKit_DefinedTypes.html#media_source" title="media_source"><span class="type">media_source</span></a> and
<a class="link" href="TheMediaKit_DefinedTypes.html#media_destination" title="media_destination"><span class="type">media_destination</span></a>
structures, the latter structures
exist because when you're doing real-time manipulation of media data, you
don't want to be tossing large blocks of data around unless you have to.
And you don't have to.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_MediaFormats"></a>Media Formats</h4></div></div></div><p>The <a class="link" href="TheMediaKit_DefinedTypes.html#media_format" title="media_format"><span class="type">media_format</span></a>
structure describes the type of media being passed
through a connection. The application and the nodes with which it's
working negotiate the format through a sequence of calls and callbacks.</p><p>This structure contains a basic media type (such as <code class="constant">B_MEDIA_RAW_AUDIO</code> or
<code class="constant">B_MEDIA_ENCODED_VIDEO</code>) and a union containing additional information
depending on the basic type.</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_RawAudio"></a>Raw Audio</h5></div></div></div><p>The media_format structure describing <code class="constant">B_MEDIA_RAW_AUDIO</code> media contains
the following information:</p><table class="variablelist"><tbody><tr><td><p><span class="term">Frame rate</span></p></td><td><p>The number of frames of audio per second.</p></td></tr><tr><td><p><span class="term">Format</span></p></td><td><p>The data type of each audio sample.</p></td></tr><tr><td><p><span class="term">Channel count</span></p></td><td><p>The number of samples per frame. For example, stereo
sound has two channels (left and right).</p></td></tr><tr><td><p><span class="term">Byte order</span></p></td><td><p>Indicates whether the data is big-endian or little-endian.</p></td></tr><tr><td><p><span class="term">Buffer size</span></p></td><td><p>The number of bytes of data per buffer.</p></td></tr></tbody></table></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_RawVideo"></a>Raw Video</h5></div></div></div><p>The <a class="link" href="TheMediaKit_DefinedTypes.html#media_format" title="media_format"><span class="type">media_format</span></a>
structure describing <code class="constant">B_MEDIA_RAW_VIDEO</code> media contains
the following information:</p><table class="variablelist"><tbody><tr><td><p><span class="term">Field rate</span></p></td><td><p>The number of fields (buffers) of video per second.</p></td></tr><tr><td><p><span class="term">Interlace</span></p></td><td><p>The number of fields per frame.</p></td></tr><tr><td><p><span class="term">Display information</span></p></td><td><p>The color space, line width, line count, bytes
per row, and so forth.</p></td></tr></tbody></table></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_FormatWildcards"></a>Format Wildcards</h5></div></div></div><p>A format wildcard indicates unspecified parts of the format. This is used
during negotiation so a node or application can say "I don't care about
this particular part of the format; I'm flexible." For example, if your
application is negotiating with a video node, and you can handle any bit
depth of video, you would specify a wildcard in the color space field of
the format.</p><p>Each format type has one wildcard object.</p><p>For example, if you're preparing to negotiate with an audio node, and
don't care about the frame rate, the following code will set up the
media_format structure for negotiation:</p><pre class="programlisting example cpp"><span class="type">media_format</span> <code class="varname">format</code>;
<span class="type">media_raw_audio_format</span> <code class="varname">wc</code>;
<code class="varname">wc</code> = <span class="type">media_raw_audio_format</span>::<code class="varname">wildcard</code>;
<code class="varname">format</code>.<code class="varname">type</code> = <code class="constant">B_MEDIA_RAW_AUDIO</code>;
<code class="varname">format</code>.<code class="varname">u</code>.<code class="varname">raw_audio</code>.<code class="varname">frame_rate</code> = <code class="varname">wc</code>.<code class="varname">frame_rate</code>;</pre></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_Audio"></a>Audio: Samples and Frames and Buffers (Oh my)</h5></div></div></div><p>A sample is a single value that defines the amplitude of an audio
waveform at a given instant in time. This value can be represented in a
number of ways: as a one-byte integer, as a two-byte integer, as a
floating-point value, or in other ways. The native audio sample format in
BeOS is floating-point, where a value of 0.0 represents a wave point with
no amplitude. Positive values indicate a sample whose amplitude is above
the zero point, and negative values represent samples below the zero
point.</p><p>A frame is the set of samples that describes a sound at a given instant
in time. If the audio contains multiple channels, a frame contains
multiple samples, one for each channel. A monaural audio stream contains
one sample per frame, while a stereo audio stream contains two samples
per frame. Surround sound formats contain even more samples per frame of
audio.</p><p>A buffer is a more efficient way of sending audio from one node to
another. Instead of beaming around thousands of individual frames, frames
are grouped into buffers that are then transferred en masse along the
chain of nodes. This improves throughput and reduces overhead.</p>
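<p>These quantities are related by simple arithmetic. The following sketch, which assumes a filled-in <span class="type">media_raw_audio_format</span> named <span class="code">fmt</span>, computes how many frames fit into one buffer:</p><pre class="programlisting example cpp">/* the low nybble of the sample format encodes the sample size in bytes */
size_t bytes_per_sample = fmt.format &amp; media_raw_audio_format::B_AUDIO_SIZE_MASK;
size_t bytes_per_frame = bytes_per_sample * fmt.channel_count;
size_t frames_per_buffer = fmt.buffer_size / bytes_per_frame;</pre><p>See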
"<a class="link" href="TheMediaKit_DefinedTypes.html#media_raw_audio_format" title="media_raw_audio_format"><span class="type">media_raw_audio_format</span></a>"
for more information about describing audio formats.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_Video"></a>Video: Interlacing, Fields, and Frames</h5></div></div></div><p>Video can be represented either as non-interlaced or interlaced data.
Both <acronym class="acronym">NTSC</acronym>
video (the standard format for television video in the United
States) and <acronym class="acronym">PAL</acronym> video (the standard in many other countries) are
interlaced.</p><p>Typically, a video stream will have either one or two fields per frame.
If the video isn't interlaced, there's only one field, which contains all
the scan lines in the video stream. If the video is interlaced, there are
usually two fields per frame (it might be more, but two is most common).
One field contains the even lines of the frame, the other contains the
odd lines.</p><p>Video buffers in the BeOS Media Kit always contain either a complete
frame or a complete field; buffers never contain partial frames or fields.</p><p>See
"<a class="link" href="TheMediaKit_DefinedTypes.html#media_raw_video_format" title="media_raw_video_format"><span class="type">media_raw_video_format</span></a>"
for more information about describing the format of video data.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_ReferenceMaterial"></a>Reference Material</h5></div></div></div><p>This chapter doesn't pretend to be a tutorial on the intricacies of audio
and video data formats. There are plenty of good reference books on these
subjects. Here's a list of books our engineers suggest:</p><ul class="itemizedlist"><li><p>The Art of Digital Audio by John Watkinson (ISBN 0240513207)</p></li><li><p>Compression in Audio and Video by John Watkinson (ISBN 0240513940)</p></li><li><p>Digital Audio Signal Processing by Udo Zolzer (ISBN 0471972266)</p></li><li><p>Introduction to Signal Processing by Sophocles Orphanidis (ISBN )</p></li><li><p>Principles of Digital Audio by Ken C. Pohlmann (ISBN 0070504695)</p></li><li><p>A Programmer's Guide to Sound by Tim Kientzle (ISBN 0201419726)</p></li><li><p>Video Demystified by Keith Jack (ISBN 18780723X)</p></li></ul></div></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_Buffers"></a>Buffers</h4></div></div></div><p>A <a class="link" href="BBuffer.html" title="BBuffer"><code class="classname">BBuffer</code></a>
is a packet of media data that can be passed between nodes. It
contains a header that describes the data and the actual data being
transmitted.</p><p>The buffer header contains information that tells you what the data is:</p><table class="variablelist"><tbody><tr><td><p><span class="term">Start time</span></p></td><td><p>The time at which the data should be performed.</p></td></tr><tr><td><p><span class="term">Size used</span></p></td><td><p>The number of valid data bytes in the buffer.</p></td></tr></tbody></table><p>Additional information is included in the header depending on the media type.</p>
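<p>A node that receives a buffer typically looks at these header fields before touching the data itself. This is a minimal sketch, assuming the buffer arrived through a hook such as <code class="methodname">BBufferConsumer::BufferReceived()</code>:</p><pre class="programlisting example cpp">/* inside a consumer's BufferReceived(BBuffer* buffer) hook, for example: */
media_header* header = buffer-&gt;Header();
bigtime_t when = header-&gt;start_time;   /* the performance time at which to perform the data */
size_t valid = header-&gt;size_used;      /* the number of valid bytes in the payload */
void* data = buffer-&gt;Data();           /* the payload itself, stored in shared memory */</pre><p>For optimal performance, the media system stores buffer data in shared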
memory areas so that different applications and nodes can access the same
buffer, without having to copy it around in memory. Buffers are passed
from one node to the next by giving each node a
<a class="link" href="BBuffer.html" title="BBuffer"><code class="classname">BBuffer</code></a> that references
the same area of memory.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_CommunicatingWithNodes"></a>Communicating With Nodes</h4></div></div></div><p>The <a class="link" href="BMediaRoster.html" title="BMediaRoster"><code class="classname">BMediaRoster</code></a>
class provides the application interface to all
available nodes (whether they're created by the application or by an
add-on). An application instantiates the nodes it needs, then establishes
the connections between them that will accomplish the desired task.</p><p>For example, if an application wants to play a sound file through a
graphic equalizer, it might first instantiate a node that reads sound
data from a disk file and outputs audio buffers, then instantiate a node
that performs filtering on audio buffers, and finally, a node that plays
sound buffers to the speakers.</p><p>Once these three nodes are instantiated, the application creates the
links between them. The output of the audio file reading node is
connected to the input of the equalizer node, then the equalizer node's
output is connected to the sound player node's input:</p><div class="mediaobject"><img src="./images/TheMediaKit/intro1.png" alt="Node Tree" /></div>
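<p>In code, each of these hook-ups follows the same general pattern. The sketch below is only an outline (error checking is omitted, and the <span class="code">producer</span> and <span class="code">consumer</span> <span class="type">media_node</span> values are assumed to have been obtained already); it connects one free output to one free input and lets the format negotiation fill in the details:</p><pre class="programlisting example cpp">BMediaRoster* roster = BMediaRoster::Roster();

media_output out;
media_input in;
int32 count = 0;
roster-&gt;GetFreeOutputsFor(producer, &amp;out, 1, &amp;count, B_MEDIA_RAW_AUDIO);
roster-&gt;GetFreeInputsFor(consumer, &amp;in, 1, &amp;count, B_MEDIA_RAW_AUDIO);

/* start the negotiation from the format the producer advertises for this output */
media_format format = out.format;
roster-&gt;Connect(out.source, in.destination, &amp;format, &amp;out, &amp;in);</pre><p>Once these connections are established, the application can then begin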
playing the sound file, by telling the first node what sound file to
play, and then starting all of the nodes running. The sound file reader
creates buffers containing audio data, and passes those along to the
equalizer node, which alters them, then passes them along to the sound
player node, which plays the buffers and recycles them for reuse. A more
detailed example of how to work with nodes to play back media data is
given in the
<a class="link" href="BMediaRoster.html" title="BMediaRoster"><code class="classname">BMediaRoster</code></a> class.</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_ConnectionsAtTheLowLevel"></a>Connections at the Low Level</h5></div></div></div><p>Each media node maintains a control port. The Media Kit interacts with
the node by sending messages to the node's control port.</p></div></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_Time"></a>Time</h4></div></div></div><p>One of the most important issues in manipulating media data is to
properly track and synchronize precisely-timed events. There are several
kinds of time (and you don't even have to be Stephen Hawking to
understand them):</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_MediaTime"></a>Media Time</h5></div></div></div><p>Media time is time relative to a particular media file, and is
represented in terms of microseconds since the beginning of the file.
Seek operations are performed using media time.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_RealTime"></a>Real Time</h5></div></div></div><p>Real time is the time reported by the system clock, and is represented in
terms of microseconds since the computer was booted. It's used in
conjunction with system calls, such as
<a class="link" href="TheKernelKit_ThreadsAndTeams.html#snooze" title="snooze(), snooze_until()"><code class="function">snooze()</code></a>,
<a class="link" href="TheKernelKit_Ports.html#read_port_etc"><code class="function">read_port_etc()</code></a>, and so
forth.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_PerformanceTime"></a>Performance Time</h5></div></div></div><p>Performance time is the time at which events should occur at the final
output. It's reported by a time source node, which establishes the
relationship between real time and performance time.</p>
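<p>A <code class="classname">BTimeSource</code> object obtained from the Media Roster can convert between the two kinds of time. A minimal sketch, assuming a valid <span class="code">roster</span> and <span class="code">node</span>:</p><pre class="programlisting example cpp">BTimeSource* ts = roster-&gt;MakeTimeSourceFor(node);
bigtime_t perfNow = ts-&gt;Now();                    /* the current performance time */
bigtime_t realNow = ts-&gt;RealTimeFor(perfNow, 0);  /* the matching real time, with no extra latency */
ts-&gt;Release();                                    /* time sources are reference counted */</pre><p>Usually a single time source is the master clock for all the nodes in a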
node chain.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_PerformanceTimevsRealTime"></a>Performance Time vs. Real Time</h5></div></div></div><p>There are two reasons why performance time and real time may differ: if
the master clock isn't based on the system clock, the two may drift apart
over time. Also, the latency incurred as data passes from one node
to another can cause drift as well.</p></div></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_Latency"></a>Latency</h4></div></div></div><p>Latency is the amount of time it takes to do something. There are four
kinds of latency:</p><div class="orderedlist"><ol><li><p>Algorithmic latency is any intentional or unavoidable delay added to
the processing of buffers, such as retimestamping buffers so they're
performed later. This also includes delays caused, for example, by a
filter node requiring that it hold samples for a while to compute delay
or echo effects.</p></li><li><p>Processing latency is the time it takes for a node to process a
buffer.</p></li><li><p>Scheduling latency is the amount of time it takes to schedule a
buffer.</p></li><li><p>Downstream latency is the amount of time that passes between the time
a node sends a buffer to the time at which the buffer can be performed.
This includes the internal latencies of every downstream node.</p></li></ol></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_ALatencyExample"></a>A Latency Example</h5></div></div></div><p>Let's consider a case in which three nodes are connected. The first node
has a processing latency of 3 microseconds, the second has a processing
latency of 2 microseconds, and the last has a processing latency of 1
microsecond.</p><div class="admonition note"><div class="title">Note</div><div class="graphic"><img class="icon" alt="Note" width="32" src="./images/admonitions/Info_32.png" /><div class="text"><p>This example uses the term "microseconds" because the Media Kit
measures time in microseconds; however, the latencies used in this
example may not be indicative of a real system.</p></div></div></div><p>In addition, 2 microseconds is required for buffers to pass from one node
to the next. The total latency of this chain of nodes, then, is 3 + 2 + 2
+ 2 + 1 = 10 microseconds.</p><p>A buffer is scheduled to be played at a performance time of 50
microseconds. In order to get this buffer to the last node in time to be
played at the right time, it needs to begin being processed at 40
microseconds. We see this in the diagram below.</p><div class="mediaobject"><img src="./images/TheMediaKit/latency1.png" alt="Node Tree" /></div><p>After the buffer has been processed by Node 1 and has been passed along
to Node 2, 5 microseconds have passed. Node 2 will take 2 microseconds to
process the buffer. This gets us to a performance time of 45 microseconds:</p><div class="mediaobject"><img src="./images/TheMediaKit/latency2.png" alt="Node Tree" /></div><p>Node 2 processes the buffer, and passes it along to Node 3. This takes a
total of 4 microseconds (2 microseconds of processing time plus 2
microseconds to be sent to Node 3). We arrive at the performance time of
49 microseconds:</p><div class="mediaobject"><img src="./images/TheMediaKit/latency3.png" alt="Node Tree" /></div><p>Finally, Node 3 processes the buffer; this requires 1 microsecond of
processing time. At this point, the performance time of 50 microseconds
has been reached, and the buffer has been performed on schedule.</p><div class="mediaobject"><img src="./images/TheMediaKit/latency4.png" alt="Node Tree" /></div></div></div></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><hr /><div xmlns:d="http://docbook.org/ns/docbook"><h3 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_UsingTheMediaKit"></a>Using the Media Kit</h3></div></div></div><p>If you're writing an application that wants to record or play back some
form of media data (such as a sound or a video file), all your media
needs are served by the
<a class="link" href="BMediaRoster.html" title="BMediaRoster"><code class="classname">BMediaRoster</code></a>
class. This class provides access to
the various nodes, and lets you establish the relationships among them
that are necessary to perform the tasks you'd like to accomplish.</p><div class="admonition note"><div class="title">Note</div><div class="graphic"><img class="icon" alt="Note" width="32" src="./images/admonitions/Info_32.png" /><div class="text"><p>The <a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>
is an abstract class; you don't call its functions
directly. Instead, you use
<a class="link" href="BMediaRoster.html" title="BMediaRoster"><code class="classname">BMediaRoster</code></a>
calls to issue requests to the
various nodes available on the BeOS system on which your application is
running. In addition, you can't derive a new class directly from
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>;
instead, derive from one of the system-defined subclasses
(<a class="link" href="BBufferConsumer.html" title="BBufferConsumer"><code class="classname">BBufferConsumer</code></a>,
<a class="link" href="BBufferProducer.html" title="BBufferProducer"><code class="classname">BBufferProducer</code></a>,
<a class="link" href="BControllable.html" title="BControllable"><code class="classname">BControllable</code></a>, and so forth).</p></div></div></div><p>Media Kit error code constants can be found in
<code class="filename">media/MediaDefs.h</code>.</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TheMediaRoster"></a>The Media Roster</h4></div></div></div><p>The Media Roster manages an application's communication with the media
system. Each application has at most one instance of the media roster.
The roster is obtained by calling
<span class="code"><a class="link" href="BMediaRoster.html#BMediaRoster_Roster" title="Roster(), CurrentRoster()"><code class="classname">BMediaRoster</code>::<code class="methodname">Roster()</code></a></span>;
if it already exists, the current roster object is returned.</p>
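<p>Getting at the roster therefore takes a single call; a minimal sketch:</p><pre class="programlisting example cpp">BMediaRoster* roster = BMediaRoster::Roster();
if (roster == NULL) {
    /* no roster could be obtained; don't attempt any further media calls */
}</pre><p>This section briefly summarizes some of the functions served by the Media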
Roster; for more detailed information, see the
<a class="link" href="BMediaRoster.html" title="BMediaRoster"><code class="classname">BMediaRoster</code></a> class.</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_FindingTheRightNodes"></a>Finding the Right Nodes</h5></div></div></div><p>There are several standard nodes, which the user configures using the
Media preference application, plus the system mixer. The
<a class="link" href="BMediaRoster.html" title="BMediaRoster"><code class="classname">BMediaRoster</code></a>
class provides convenience routines to quickly get references to these
nodes, such as
<a class="link" href="BMediaRoster.html#BMediaRoster_GetAudioMixer"><code class="methodname">GetAudioMixer()</code></a> and
<a class="link" href="BMediaRoster.html#BMediaRoster_GetVideoOutput"><code class="methodname">GetVideoOutput()</code></a>.
See the <a class="link" href="BMediaRoster.html" title="BMediaRoster"><code class="classname">BMediaRoster</code></a>
class for details.</p>
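<p>For instance, fetching the system mixer is a single call (a minimal sketch; release the node when you're finished with it):</p><pre class="programlisting example cpp">media_node mixer;
if (BMediaRoster::Roster()-&gt;GetAudioMixer(&amp;mixer) == B_OK) {
    /* ...connect to or inspect the mixer here... */
    BMediaRoster::Roster()-&gt;ReleaseNode(mixer);
}</pre><p>If you need some other node, you can browse through the available nodes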
to find the one best-suited for your needs. Nodes are created from
dormant nodes, which live inside media add-ons. Each dormant node is a
reference to a node flavor, a structure that describes the nodes the
dormant node can create.</p><p>Once you've identified the best node for your purposes, you negotiate and
establish a connection to the node. This is discussed in the
<a class="link" href="BMediaRoster.html" title="BMediaRoster"><code class="classname">BMediaRoster</code></a>
overview.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_ControllingNodes"></a>Controlling Nodes</h5></div></div></div><p>Once your nodes are created and connected to each other, you can control
them by using the
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a> functions
<a class="link" href="BMediaNode.html#BMediaNode_Preroll" title="Preroll()"><code class="methodname">Preroll()</code></a>,
<a class="link" href="BMediaNode.html#BMediaNode_Seek" title="Seek()"><code class="methodname">Seek()</code></a>,
<a class="link" href="BMediaNode.html#BMediaNode_Start" title="Start()"><code class="methodname">Start()</code></a>, and
<a class="link" href="BMediaNode.html#BMediaNode_Stop" title="Stop()"><code class="methodname">Stop()</code></a>.
These let you move to specific points in the media file and start
and stop playback or recording.</p><p>You can also set the nodes' time sources, run modes, and play rates.</p>
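<p>From application code, these requests are normally issued through the corresponding <code class="classname">BMediaRoster</code> calls rather than on the node object itself. A minimal sketch, assuming a valid <span class="code">roster</span> and a connected <span class="code">node</span>:</p><pre class="programlisting example cpp">BTimeSource* ts = roster-&gt;MakeTimeSourceFor(node);
roster-&gt;StartNode(node, ts-&gt;Now() + 10000);   /* start 10 ms from now (an arbitrary lead time) */

/* ...later... */
roster-&gt;StopNode(node, 0, true);              /* stop immediately rather than at a scheduled time */
ts-&gt;Release();</pre></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_DisplayingAUserInterface"></a>Displaying a User Interface</h5></div></div></div><p><a class="link" href="BControllable.html" title="BControllable"><code class="classname">BControllable</code></a>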
nodes can present a user interface representing the aspects
of themselves that the user can configure. Each of these configurable aspects
is called a parameter. The
<a class="link" href="BMediaRoster.html" title="BMediaRoster"><code class="classname">BMediaRoster</code></a>
provides functions that let you
create a user interface for a node's parameters. See
<a class="link" href="BMediaRoster.html#BMediaRoster_StartControlPanel" title="StartControlPanel()"><code class="methodname">BMediaRoster::StartControlPanel()</code></a>
for the easiest way to do this.</p>
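<p>A minimal sketch, assuming a controllable <span class="code">node</span> has already been obtained:</p><pre class="programlisting example cpp">BMessenger panel;
if (BMediaRoster::Roster()-&gt;StartControlPanel(node, &amp;panel) == B_OK) {
    /* 'panel' can now be used to communicate with the node's control panel */
}</pre></div></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_MediaFiles"></a>Media Files</h4></div></div></div><p>The approved way to access files containing media data is via the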
<a class="link" href="BMediaFile.html" title="BMediaFile"><code class="classname">BMediaFile</code></a> and
<a class="link" href="BMediaTrack.html" title="BMediaTrack"><code class="classname">BMediaTrack</code></a>
classes. If you're using a node-based playback
or recording system, and you want to have easy access to media files, you
can get access to the node used by the
<a class="link" href="BMediaFile.html" title="BMediaFile"><code class="classname">BMediaFile</code></a> class by calling
<a class="link" href="BMediaRoster.html#BMediaRoster_SniffRef" title="SniffRef(), SniffRefFor()"><code class="methodname">BMediaRoster::SniffRef()</code></a>.</p><p>See "<a class="xref" href="TheMediaKit_Overview_ReadingWriting.html" title="Reading and Writing Media Files">Reading and Writing Media Files</a>"
for an example of how to access media files the right way.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TheAudioMixer"></a>The Audio Mixer</h4></div></div></div><p>The audio mixer accepts as input audio data which it then mixes and
outputs to the audio output device or devices the user has selected in
the Audio preference application. Your application can get a media_node
referencing the audio mixer using the
<a class="link" href="BMediaRoster.html#BMediaRoster_GetAudioMixer"><code class="methodname">BMediaRoster::GetAudioMixer()</code></a>
function. You can't intercept audio being output by the audio mixer; it
goes directly to the output device.</p><p>Buffers containing any standard raw audio format can be sent to the audio
mixer; the mixer will convert the data into the appropriate format for
playback.</p><p>The audio mixer is always running, and is slaved to the most appropriate
time source. You should never change its time source or start or stop the
audio mixer (in other words, don't call the
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a> calls
<a class="link" href="BMediaNode.html#BMediaNode_SetTimeSource" title="SetTimeSource(), TimeSource()"><code class="methodname">SetTimeSourceFor()</code></a>,
<a class="link" href="BMediaNode.html#BMediaNode_Start" title="Start()"><code class="methodname">Start()</code></a>, or
<a class="link" href="BMediaNode.html#BMediaNode_Stop" title="Stop()"><code class="methodname">Stop()</code></a>
on the audio mixer).</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TheAudioInput"></a>The Audio Input</h4></div></div></div><p>The audio input creates audio buffers from external sources, such as
microphones or line-in ports. The physical hardware device from which the
sound is input is configured by the user using the Audio preference
application.</p><p>In the current implementation of the Media Kit, the audio input doesn't
let you change the sampling rate. This may change in the future. To
ensure that your application will continue to work in the future, don't
assume that the current sampling rate will remain in effect; instead, you
should look at the
<a class="link" href="TheMediaKit_DefinedTypes.html#media_format" title="media_format"><span class="type">media_format</span></a>
structure in the
<a class="link" href="TheMediaKit_DefinedTypes.html#media_output" title="media_output"><span class="type">media_output</span></a>
you're using for your connection to the audio input:</p><pre class="programlisting example cpp"><span class="comment">/* it's the wrong frame rate */</span>
if (<code class="varname">input</code>-&gt;<code class="varname">format</code>.<code class="varname">media_raw_audio_format</code>.<code class="varname">frame_rate</code> !=
<code class="constant">MY_FRAME_RATE</code>) {
}</pre><p>The audio input is exclusive: only one connection to it is allowed at a
time. If you need to receive buffers from the input from two consumers,
you'll need to create a special node that receives audio buffers, then
sends copies of them to all the consumers that are attached to it.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_AudioPlaybackMadeEasy"></a>Audio Playback Made Easy</h4></div></div></div><p>If all you want to do is play back raw audio (such as
<acronym class="acronym">AIFF</acronym> or <acronym class="acronym">WAVE</acronym>
files), the Media Kit provides the
<a class="link" href="BSoundPlayer.html" title="BSoundPlayer"><code class="classname">BSoundPlayer</code></a>
class to simplify this process.
<a class="link" href="BSoundPlayer.html" title="BSoundPlayer"><code class="classname">BSoundPlayer</code></a>
hides the inner workings of the Media Kit from you
to make your life simple. See that class for more information; an
example of how to play audio files is given in the
<a class="link" href="BSoundPlayer.html" title="BSoundPlayer"><code class="classname">BSoundPlayer</code></a>
<a class="link" href="BSoundPlayer_Overview.html" title="BSoundPlayer">class overview</a>.</p><p>You might also want to consider the various sound playback classes
provided by the Game Kit, such as
<a class="link" href="BSimpleGameSound.html" title="BSimpleGameSound"><code class="classname">BSimpleGameSound</code></a> and
<a class="link" href="BFileGameSound.html" title="BFileGameSound"><code class="classname">BFileGameSound</code></a>.</p><p>If this is still too much for you, you can use the
<a class="link" href="TheMediaKit_Functions.html#play_sound" title="play_sound()"><code class="function">play_sound()</code></a>
global C function to play a sound file. The
<a class="link" href="TheMediaKit_Functions.html#stop_sound" title="stop_sound()"><code class="function">stop_sound()</code></a>
function can be used to
stop a sound started using
<a class="link" href="TheMediaKit_Functions.html#play_sound" title="play_sound()"><code class="function">play_sound()</code></a>, and
<a class="link" href="TheMediaKit_Functions.html#wait_for_sound" title="wait_for_sound()"><code class="function">wait_for_sound()</code></a>
lets you block until the sound finishes playing.</p>
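<p>A minimal sketch of this simplest route (the file path here is just a placeholder):</p><pre class="programlisting example cpp">entry_ref ref;
if (get_ref_for_path("/boot/home/sample.wav", &amp;ref) == B_OK) {
    /* mix with other sounds, don't queue, and play in the background */
    sound_handle handle = play_sound(&amp;ref, true, false, true);
    if (handle &gt;= 0)                /* a negative handle indicates an error */
        wait_for_sound(handle);     /* block until playback is done */
}</pre></div></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><hr /><div xmlns:d="http://docbook.org/ns/docbook"><h3 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_CreatingNewNodeClasses"></a>Creating New Node Classes</h3></div></div></div><p>You can create your own nodes to perform different types of media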
processing. Nodes can be provided in add-ons that the Media Kit can load
dormant nodes from, or in the application itself. This is discussed in
detail in the sections on
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>,
<a class="link" href="BBufferConsumer.html" title="BBufferConsumer"><code class="classname">BBufferConsumer</code></a>, and
<a class="link" href="BBufferProducer.html" title="BBufferProducer"><code class="classname">BBufferProducer</code></a>.</p><div class="admonition note"><div class="title">Note</div><div class="graphic"><img class="icon" alt="Note" width="32" src="./images/admonitions/Info_32.png" /><div class="text"><p>If your node uses multiple threads, make sure you thread-protect calls
to other nodes (in particular, be sure you thread-protect calls to the
<a class="link" href="BTimeSource.html" title="BTimeSource"><code class="classname">BTimeSource</code></a>).
Use a semaphore or other appropriate protection mechanism.</p></div></div></div><p>As a general rule, you should use the
<a class="link" href="BMediaEventLooper.html" title="BMediaEventLooper"><code class="classname">BMediaEventLooper</code></a>
class to handle the low-level scheduling and queuing of media events. See
"A BMediaEventLooper Example"
for an example of how this is done, including
an explanation of the key points of creating a new media node.</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_CreatingAMediaAddOn"></a>Creating a Media Add-on</h4></div></div></div><p>This is discussed in detail in the
<a class="link" href="BMediaAddOn.html" title="BMediaAddOn"><code class="classname">BMediaAddOn</code></a>
class overview.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_ApplicationBasedNodes"></a>Application-based Nodes</h4></div></div></div><p>You can create your own node subclasses in an application if your
application has special needs; just derive from the appropriate base
class (such as
<a class="link" href="BBufferConsumer.html" title="BBufferConsumer"><code class="classname">BBufferConsumer</code></a>)
as normal. Note, however, that your
application should never directly call any of your subclass' functions;
instead, you should register the node with the media roster, and control
it via <a class="link" href="BMediaRoster.html" title="BMediaRoster"><code class="classname">BMediaRoster</code></a>
calls, just like any other node, by using the
<a class="link" href="TheMediaKit_DefinedTypes.html#media_node" title="media_node"><span class="type">media_node</span></a>
that describes your node.</p><p>Once you've written the code for your node class, you can register it
with the Media Server by calling
<a class="link" href="BMediaRoster.html#BMediaRoster_RegisterNode" title="RegisterNode(), UnregisterNode()"><code class="methodname">BMediaRoster::RegisterNode()</code></a>. When
you're done with the node, you need to unregister it by calling
<a class="link" href="BMediaRoster.html#BMediaRoster_UnregisterNode"><code class="methodname">BMediaRoster::UnregisterNode()</code></a>.
The easiest way to do this is just have
the node class unregister itself when it's deleted.</p>
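<p>The overall shape of an application-based node's life cycle might look like the following sketch, where <code class="classname">MyConsumer</code> is a hypothetical <code class="classname">BBufferConsumer</code> subclass:</p><pre class="programlisting example cpp">MyConsumer* consumer = new MyConsumer("my consumer");
BMediaRoster* roster = BMediaRoster::Roster();
roster-&gt;RegisterNode(consumer);

/* ...control the node through the roster, using consumer-&gt;Node()... */

roster-&gt;UnregisterNode(consumer);
consumer-&gt;Release();    /* nodes are reference counted; don't delete them directly */</pre></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h4 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TimingIssues"></a>Timing Issues</h4></div></div></div><p>When dealing with a number of nodes cooperating in processing data, there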
are always important timing concerns. This section covers how various
types of nodes need to behave in order to maintain proper timing.</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_CalculatingBufferStartTimes"></a>Calculating Buffer Start Times</h5></div></div></div><p>To calculate the presentation time at which a buffer should be performed,
you should keep track of how many frames have been played, then multiply
that value by 1000000LL/sample_rate (and, if your calculation is being
done using floating-point math, you should <code class="function">floor()</code> the result). You can
then apply whatever offset you want to
<a class="link" href="BTimeSource.html#BTimeSource_Seek" title="Seek()"><code class="methodname">Seek()</code></a> to.</p><pre class="programlisting example cpp"><code class="varname">buf</code>-&gt;<code class="methodname">Header</code>()-&gt;<code class="varname">size_used</code> = <code class="varname">your_buf_frames</code> * <code class="varname">your_frame_size</code>;
<code class="varname">buf</code>-&gt;<code class="methodname">Header()</code>-&gt;<code class="varname">start_time</code> =
<code class="varname">your_total_frames</code>*1000000LL/<code class="varname">your_format</code>.<code class="varname">frame_rate</code>;
<code class="varname">your_total_frames</code> += <code class="varname">your_buf_frames</code>;</pre><p>You shouldn't compute the start time by adding the previous buffer's
duration to its start time; the accumulation of rounding errors over time
will cause dropped samples about three times per second if you do.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TimingIssues_Producers"></a>Producers</h5></div></div></div><p>Producers that produce buffers intended for output need to stamp each
buffer they create with a <code class="parameter">startTime</code>, which indicates the performance time
at which the buffer should be played. If the producer is playing media
from a file, or synchronizing sound, this is the time at which the media
should become analog.</p><p>In order to compute this <code class="parameter">startTime</code>
properly, the producer must prepare
the buffers in advance, by the amount of time reported by
<a class="link" href="BBufferProducer.html#BBufferProducer_FindLatencyFor" title="FindLatencyFor()"><code class="methodname">BBufferProducer::FindLatencyFor()</code></a>.
The producer also needs to respond to
the <a class="link" href="BBufferProducer.html#BBufferProducer_LateNoticeReceived" title="LateNoticeReceived()"><code class="methodname">BBufferProducer::LateNoticeReceived()</code></a>
hook function by at least
updating the time stamps it's putting on the buffers it's sending out, so
they'll be played by the downstream nodes, which should be checking those
times to play them at the correct time (and may be dropping buffers if
they're late). If this isn't done, things will tend to get further and
further behind.</p>
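<p>For example, a producer derived from <code class="classname">BBufferProducer</code> and <code class="classname">BMediaEventLooper</code> might respond along these lines (a hedged sketch; <code class="varname">fOutput</code>, <code class="varname">fInternalLatency</code>, <code class="varname">fDownstreamLatency</code>, and <code class="varname">fTimestampOffset</code> are hypothetical members):</p><pre class="programlisting example cpp">void MyProducer::LateNoticeReceived(const media_source&amp; what,
    bigtime_t howMuch, bigtime_t performanceTime)
{
    if (what != fOutput.source)
        return;

    if (RunMode() == B_INCREASE_LATENCY) {
        // Produce further ahead of time so downstream nodes stop seeing late buffers.
        fInternalLatency += howMuch;
        SetEventLatency(fDownstreamLatency + fInternalLatency);
    } else {
        // Skip ahead: stamp subsequent buffers that much later so they arrive on time.
        fTimestampOffset += howMuch;
    }
}</pre>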
<div class="admonition note"><div class="title">Note</div><div class="graphic"><img class="icon" alt="Note" width="32" src="./images/admonitions/Info_32.png" /><div class="text"><p>In general, it's best to try to produce buffers as late as possible
without actually causing them to arrive at their destination late (i.e.,
they should be sent at or before the time <span class="code"><code class="varname">presentationTime</code> -
<code class="varname">downstreamLatency</code></span>). This will ensure the best overall performance by
reducing the number of buffers that are pending (especially if the user
starts playing with time such that your node gets seeked or stopped).
Also, if you're producing buffers that have a real-world connection, such
as to a video display, producing them too early might cause them to be
displayed early.</p></div></div></div><p>Producers whose buffers are generated by a physical
input (such as a microphone jack) handle things somewhat
differently. They stamp the generated buffers with the performance time
at which they were captured (this should be the time at which the first
sample in the buffer was taken). This means that when these buffers are
transmitted downstream, they'll always be "late" in the eyes of any node
they arrive at.</p><p>This also means you can't easily hook a physical input to a physical
output, because buffers will always arrive at the output later than the
timestamped value. You need to insert another node between the two to
adjust the time stamps appropriately so they won't be "late" anymore.</p><div class="admonition note"><div class="title">Note</div><div class="graphic"><img class="icon" alt="Note" width="32" src="./images/admonitions/Info_32.png" /><div class="text"><p>You can call
<a class="link" href="BMediaRoster.html#BMediaRoster_SetProducerRunModeDelay" title="SetProducerRunModeDelay()"><code class="methodname">BMediaRoster::SetProducerRunModeDelay()</code></a>
on a physical
input producer to have it automatically retimestamp the buffers it generates
when recording.</p></div></div></div><p>Additionally, nodes that record data (such as file-writing nodes), in the
<code class="constant">B_RECORDING</code> run mode, shouldn't care about buffers that arrive late; this
lets data be recorded without concern for this issue.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TimingIssues_Consumers"></a>Consumers</h5></div></div></div><p>If the consumer is the device that renders the media (i.e., it plays the
audio or video contained in the buffers it receives), it needs to report
the correct latency back to the producer for the time it takes buffers to
reach the analog world (i.e., the amount of time it takes to present the
data in the buffer to the user, whether it's audio or video). Buffers
that are received shouldn't be played until the <code class="parameter">startTime</code> stamped on the
buffers arrives. If buffers arrive late, the consumer should send a late
notice to the producer, so it can make the necessary adjustments, and not
pass the buffer along at all; be sure to
<a class="link" href="BBuffer.html#BBuffer_Recycle" title="Recycle()"><code class="methodname">Recycle()</code></a>
the buffers so they can be reused.</p>
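<p>A hedged sketch of how a <code class="classname">BMediaEventLooper</code>-based consumer might check for late buffers (<code class="varname">fInput</code> is a hypothetical member describing its connection):</p><pre class="programlisting example cpp">void MyConsumer::BufferReceived(BBuffer* buffer)
{
    bigtime_t now = TimeSource()-&gt;Now();
    bigtime_t lateBy = now - buffer-&gt;Header()-&gt;start_time;
    if (lateBy &gt; 0) {
        // Tell the producer how late the buffer was, then drop it.
        NotifyLateProducer(fInput.source, lateBy, now);
        buffer-&gt;Recycle();
        return;
    }
    // Otherwise queue the buffer to be handled at its start_time.
    EventQueue()-&gt;AddEvent(media_timed_event(buffer-&gt;Header()-&gt;start_time,
        BTimedEventQueue::B_HANDLE_BUFFER, buffer,
        BTimedEventQueue::B_RECYCLE_BUFFER));
}</pre>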
</div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_TimingIssues_Filters"></a>Consumer/Producers (Filters)</h5></div></div></div><p>A consumer/producer (filter) must report the correct latency for the time
a buffer takes to pass through the filter from the time it's received to
the time it's retransmitted, plus the downstream latency. It shouldn't
change the time stamp, unless doing so is explicitly part of the filter's
purpose. The filter should also handle late buffers as described under
Producers and Consumers above.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_MediaApplications"></a>Media Applications</h5></div></div></div><p>The application that starts the nodes and the time source to which
they're slaved needs to provide them with the correct starting times. For
example, if several nodes have been connected, they've all been slaved to
an appropriate time source, and you want to start them all up, you need
to take the following steps:</p><div class="orderedlist"><ol><li><p>Find the latency of the entire network, so you can give it time to
come up to speed.</p></li><li><p>Start all the nodes with the performance time at which they should
start playing.</p></li><li><p>Seek the time source to the performance time and real time that you
want the performance time to be related to. This is crucial: it
establishes the relationship between performance time and real time; if
you forget to do this, things will look and sound very unpleasant as
the media tries desperately to adjust to the actual time.</p></li></ol></div><pre class="programlisting example cpp"><span class="type">bigtime_t</span> <code class="varname">latency</code>;
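// 1. Find the total latency of the chain so it can be given time to come up to speed.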
<code class="varname">Roster</code>-&gt;<code class="methodname">GetLatencyFor</code>(<code class="varname">node1</code>, &amp;<code class="varname">latency</code>);
<code class="varname">Roster</code>-&gt;<code class="methodname">PrerollNode</code>(<code class="varname">node1</code>);
<code class="varname">Roster</code>-&gt;<code class="methodname">PrerollNode</code>(<code class="varname">node2</code>);
<code class="varname">Roster</code>-&gt;<code class="methodname">PrerollNode</code>(<code class="varname">node3</code>);
<code class="varname">Roster</code>-&gt;<code class="methodname">StartNode</code>(<code class="varname">node1</code>, 0);
<code class="varname">Roster</code>-&gt;<code class="methodname">StartNode</code>(<code class="varname">node2</code>, 0);
<code class="varname">Roster</code>-&gt;<code class="methodname">StartNode</code>(<code class="varname">node3</code>, 0);
<span class="type">bigtime_t</span> <code class="varname">now</code> = <code class="function">system_time</code>();
<code class="varname">Roster</code>-&gt;<code class="methodname">SeekNode</code>(<code class="varname">timesourceNode</code>, -<code class="varname">latency</code>, <code class="varname">now</code> + 10000);
<code class="varname">Roster</code>-&gt;<code class="methodname">StartNode</code>(<code class="varname">timesourceNode</code>, <code class="varname">now</code> + 10000);</pre><p>The extra 10,000 microseconds is added in case the code gets preempted
while preparing to start the <code class="varname">timesourceNode</code>; this gives us a little fudge
factor so we don't start out behind.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h5 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_B_OFFLINERunMode"></a>B_OFFLINE Run Mode</h5></div></div></div><p>Nodes that run in offline mode are a special case in the timing world.</p><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h6 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_OfflineConsumers"></a>Consumers</h6></div></div></div><p>Consumers in <code class="constant">B_OFFLINE</code> mode derive their current time from the arrival of buffers
on their inputs. The current performance time is the minimum of all times
received on their active inputs. Active inputs are those inputs that are
connected and haven't received a
<a class="link" href="BBufferConsumer.html#BBufferConsumer_ProducerDataStatus" title="ProducerDataStatus()"><code class="methodname">ProducerDataStatus()</code></a>
call indicating that there are no buffers coming. A time arrives either as a buffer's
<code class="varname">start_time</code> or as a
<a class="link" href="BBufferConsumer.html#BBufferConsumer_ProducerDataStatus" title="ProducerDataStatus()"><code class="methodname">ProducerDataStatus()</code></a>
call with <code class="constant">B_PRODUCER_STOPPED</code>.</p><p>Consumers in offline mode should call
<a class="link" href="BBufferConsumer.html#BBufferConsumer_RequestAdditionalBuffer" title="RequestAdditionalBuffer()"><code class="methodname">RequestAdditionalBuffer()</code></a>
once they've received and processed a buffer on one of their inputs in order to obtain
further buffers.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><div xmlns:d="http://docbook.org/ns/docbook"><h6 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_OfflineProducers"></a>Producers</h6></div></div></div><p>Just send buffers in sequence in <code class="constant">B_OFFLINE</code> mode. The recommended behavior
is to send the first buffer, then wait for an
<a class="link" href="BBufferProducer.html#BBufferProducer_AdditionalBufferRequested" title="AdditionalBufferRequested()"><code class="methodname">AdditionalBufferRequested()</code></a>
call before sending the next buffer. If this request doesn't arrive
within a reasonable amount of time (a second or so, depending on your
application), your node should accept that it's working with a
not-so-bright consumer and start sending buffers at your convenience.</p><div class="admonition warning"><div class="title">Warning</div><div class="graphic"><img class="icon" alt="Warning" width="32" src="./images/admonitions/Stop_32.png" /><div class="text"><p>Don't call the time source from your producer in <code class="constant">B_OFFLINE</code> mode.</p></div></div></div><p>If a producer has ever received an
<a class="link" href="BBufferProducer.html#BBufferProducer_AdditionalBufferRequested" title="AdditionalBufferRequested()"><code class="methodname">AdditionalBufferRequested()</code></a>
call, it should assume that the consumer knows what it's doing and should only
send buffers on request.</p></div></div></div></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><hr /><div xmlns:d="http://docbook.org/ns/docbook"><h3 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_InstallingMediaNodesAndDrivers"></a>Installing Media Nodes and Drivers</h3></div></div></div><p>Media node add-ons should be installed in the
<code class="filename">/boot/home/config/add-ons/media</code> directory.</p><p>Media drivers should be installed in
<code class="filename">/boot/home/config/add-ons/kernel/drivers/bin</code>.
Then create a symlink to the driver in
<code class="filename">/boot/home/config/add-ons/kernel/drivers/dev/<em class="replaceable"><code>type</code></em></code>, where
<em class="replaceable"><code>type</code></em> is the type of driver you're installing (audio, video, etc).</p><p>After installing a media node add-on, you have to restart the Media
Server for it to become available for use.</p></div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><hr /><div xmlns:d="http://docbook.org/ns/docbook"><h3 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_AboutEnumMembersOfClasses"></a>About enum Members of Classes</h3></div></div></div><p>The Media Kit has several classes (most notably,
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>) that
contain, as members, enums. For instance, in
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>,
you'll find the following:</p><pre class="programlisting example cpp">class <code class="classname">BMediaNode</code> {
...
enum <span class="type">run_mode</span> {
<code class="constant">B_OFFLINE</code> = 1,
<code class="constant">B_DECREASE_PRECISION</code>,
<code class="constant">B_INCREASE_LATENCY</code>,
<code class="constant">B_DROP_DATA</code>,
<code class="constant">B_RECORDING</code>
};
...
};</pre><p>In this case, you can freely use <code class="constant">B_OFFLINE</code> and so forth from within
objects derived from
<a class="link" href="BMediaNode.html" title="BMediaNode"><code class="classname">BMediaNode</code></a>,
but if you want to use these values from
other classes (or outside any class), you need to use the notation
<span class="code"><code class="classname">BMediaNode</code>::<code class="constant">B_OFFLINE</code></span> to use these constants. This is true of any enum
defined within a class; this will be called out specifically in the
descriptions of any constants in this chapter.</p>
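<p>For example, from application code outside any <code class="classname">BMediaNode</code> subclass, the constant must be qualified (a minimal sketch, assuming <code class="varname">node</code> is a registered <span class="type">media_node</span>):</p><pre class="programlisting example cpp">BMediaRoster* roster = BMediaRoster::Roster();
// Outside the class, the enum member needs the BMediaNode:: qualifier:
roster-&gt;SetRunModeNode(node, BMediaNode::B_RECORDING);</pre>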
</div><div class="section"><div xmlns="" xmlns:d="http://docbook.org/ns/docbook" class="titlepage"><div><hr /><div xmlns:d="http://docbook.org/ns/docbook"><h3 xmlns="http://www.w3.org/1999/xhtml" class="title"><a id="TheMediaKit_AboutMultipleVirtualInheritance"></a>About Multiple Virtual Inheritance</h3></div></div></div><p>Virtual inheritance is slightly different from regular inheritance in
C++. The constructor for the virtual base class has to be explicitly (or
implicitly) called from the most-derived class being instantiated, rather
than being called from the direct descendant class actually defining the
virtual inheritance.</p>
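<p>For instance, a class derived from <code class="classname">BBufferConsumer</code> (which virtually inherits <code class="classname">BMediaNode</code>) must construct the virtual base itself, along these lines (a minimal sketch; the hook implementations are omitted):</p><pre class="programlisting example cpp">class MyConsumer : public BBufferConsumer {
public:
    MyConsumer(const char* name)
        :
        BMediaNode(name),                    // the virtual base; constructed here
        BBufferConsumer(B_MEDIA_RAW_AUDIO)
    {
    }
};</pre>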
<p>In simple terms, this means that whenever you derive a new class from a
class that uses virtual inheritance, your derived class's constructor
should explicitly call the parent class's constructor.</p></div></div><div id="footer"><hr /><div id="footerT">Prev: <a href="TheMediaKit_Overview.html">The Media Kit</a>  Up: <a href="TheMediaKit_Overview.html">The Media Kit</a>  Next: <a href="TheMediaKit_Overview_ReadingWriting.html">Reading and Writing Media Files</a> </div><div id="footerB"><div id="footerBL"><a href="TheMediaKit_Overview.html" title="The Media Kit"><img src="./images/navigation/prev.png" alt="Prev" /></a> <a href="TheMediaKit_Overview.html" title="The Media Kit"><img src="./images/navigation/up.png" alt="Up" /></a> <a href="TheMediaKit_Overview_ReadingWriting.html" title="Reading and Writing Media Files"><img src="./images/navigation/next.png" alt="Next" /></a></div><div id="footerBR"><div><a href="http://www.haiku-os.org"><img src="./images/People_24.png" alt="haiku-os.org" title="Visit The Haiku Website" /></a></div><div class="navighome" title="Home"><a accesskey="h" href="index.html"><img src="./images/navigation/home.png" alt="Home" /></a></div></div><div id="footerBC"><a href="http://www.access-company.com/home.html" title="ACCESS Co."><img alt="Access Company" src="./images/access_logo.png" /></a></div></div></div><div id="licenseFooter"><div id="licenseFooterBL"><a rel="license" href="http://creativecommons.org/licenses/by-nc-nd/3.0/" title="Creative Commons License"><img alt="Creative Commons License" style="border-width:0" src="https://licensebuttons.net/l/by-nc-nd/3.0/88x31.png" /></a></div><div id="licenseFooterBR"><a href="./LegalNotice.html">Legal Notice</a></div><div id="licenseFooterBC"><span id="licenseText">This work is licensed under a
<a rel="license" href="http://creativecommons.org/licenses/by-nc-nd/3.0/">Creative
Commons Attribution-Non commercial-No Derivative Works 3.0 License</a>.</span></div></div></body></html>