
playbin

Properties

GstElement * audio-sink Read / Write
GstElement * audio-stream-combiner Read / Write
gint64 av-offset Read / Write
gint64 buffer-duration Read / Write
gint buffer-size Read / Write
guint64 connection-speed Read / Write
gint current-audio Read / Write
gchar * current-suburi Read
gint current-text Read / Write
gchar * current-uri Read
gint current-video Read / Write
GstPlayFlags flags Read / Write
gboolean force-aspect-ratio Read / Write
gboolean mute Read / Write
gint n-audio Read
gint n-text Read
gint n-video Read
guint64 ring-buffer-max-size Read / Write
GstSample * sample Read
GstElement * source Read
gchar * subtitle-encoding Read / Write
gchar * subtitle-font-desc Write
gchar * suburi Read / Write
GstElement * text-sink Read / Write
GstElement * text-stream-combiner Read / Write
gchar * uri Read / Write
GstElement * video-sink Read / Write
GstElement * video-stream-combiner Read / Write
GstElement * vis-plugin Read / Write
gdouble volume Read / Write
GstElement * audio-filter Read / Write
GstElement * video-filter Read / Write
GstVideoMultiviewFlags video-multiview-flags Read / Write
GstVideoMultiviewFramePacking video-multiview-mode Read / Write

Types and Values

struct GstPlayBin

Object Hierarchy

    GObject
    ╰── GInitiallyUnowned
        ╰── GstObject
            ╰── GstElement
                ╰── GstBin
                    ╰── GstPipeline
                        ╰── GstPlayBin

Implemented Interfaces

GstPlayBin implements GstChildProxy, GstStreamVolume, GstVideoOverlay, GstNavigation and GstColorBalance.

Description

Playbin provides a stand-alone everything-in-one abstraction for an audio and/or video player.

Playbin can handle both audio and video files and features

  • automatic file type recognition and, based on that, automatic selection and use of the right audio/video/subtitle demuxers and decoders

  • visualisations for audio files

  • subtitle support for video files. Subtitles can be stored in external files.

  • stream selection between different video/audio/subtitle streams

  • meta info (tag) extraction

  • easy access to the last video sample

  • buffering when playing streams over a network

  • volume control with mute option

Usage

A playbin element can be created just like any other element using gst_element_factory_make(). The file/URI to play should be set via the “uri” property. This must be an absolute URI; relative file paths are not allowed. Example URIs are file:///home/joe/movie.avi or http://www.joedoe.com/foo.ogg

Playbin is a GstPipeline. It will notify the application of everything that's happening (errors, end of stream, tags found, state changes, etc.) by posting messages on its GstBus. The application needs to watch the bus.

Playback can be initiated by setting the element to PLAYING state using gst_element_set_state(). Note that the state change will take place in the background in a separate thread; when the function returns, playback is probably not happening yet and any errors might not have occurred yet. Applications using playbin should ideally be written to deal with things in a completely asynchronous way.
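A minimal sketch of this basic usage might look as follows (my_bus_callback stands for a hypothetical GstBusFunc supplied by the application; the URI is just an example):

#include <gst/gst.h>

static void
start_playback (const gchar * uri)
{
  GstElement *playbin;
  GstBus *bus;

  /* create playbin and point it at the media to play */
  playbin = gst_element_factory_make ("playbin", "play");
  g_object_set (playbin, "uri", uri, NULL);

  /* watch the bus for ERROR, EOS, TAG, BUFFERING, ... messages */
  bus = gst_element_get_bus (playbin);
  gst_bus_add_watch (bus, my_bus_callback, NULL);
  gst_object_unref (bus);

  /* start playback; the state change happens asynchronously */
  gst_element_set_state (playbin, GST_STATE_PLAYING);
}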

When playback has finished (an EOS message has been received on the bus), an error has occurred (an ERROR message has been received on the bus), or the user wants to play a different track, playbin should be set back to READY or NULL state, the “uri” property should be set to the new location, and playbin should then be set to PLAYING state again.

Seeking can be done using gst_element_seek_simple() or gst_element_seek() on the playbin element. Again, the seek will not be executed instantaneously, but will be done in a background thread. When the seek call returns, the seek will most likely still be in progress. An application may wait for the seek to finish (or fail) using gst_element_get_state() with -1 as the timeout, but this will block the user interface and is not recommended at all.
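For example, a seek to the 2-minute mark could look like this (a hedged sketch; the flags and position are illustrative only):

/* flush-seek to 2 minutes, snapping to the nearest keyframe */
gst_element_seek_simple (playbin, GST_FORMAT_TIME,
    GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, 2 * 60 * GST_SECOND);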

Applications may query the current position and duration of the stream via gst_element_query_position() and gst_element_query_duration(), passing GST_FORMAT_TIME as the format. If the query was successful, the duration or position will have been returned in units of nanoseconds.
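For example (a sketch; playbin is assumed to already be in PAUSED or PLAYING state):

gint64 pos = GST_CLOCK_TIME_NONE;
gint64 dur = GST_CLOCK_TIME_NONE;

if (gst_element_query_position (playbin, GST_FORMAT_TIME, &pos) &&
    gst_element_query_duration (playbin, GST_FORMAT_TIME, &dur)) {
  /* both values are in nanoseconds */
  g_print ("position: %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\n",
      GST_TIME_ARGS (pos), GST_TIME_ARGS (dur));
}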

Advanced Usage: specifying the audio and video sink

By default, if no audio sink or video sink has been specified via the “audio-sink” or “video-sink” property, playbin will use the autoaudiosink and autovideosink elements to find the best available output method. This should work in most cases, but is not always desirable. Often either the user or the application will want to specify more explicitly what to use for audio and video output.

If the application wants more control over how audio or video should be output, it may create the audio/video sink elements itself (for example using gst_element_factory_make()) and provide them to playbin using the “audio-sink” or “video-sink” property.

GNOME-based applications, for example, will usually want to create gconfaudiosink and gconfvideosink elements and make playbin use those, so that output happens to whatever the user has configured in the GNOME Multimedia System Selector configuration dialog.

The sink elements do not necessarily need to be ready-made sinks. It is possible to create container elements that look like a sink to playbin, but in reality contain a number of custom elements linked together. This can be achieved by creating a GstBin and putting elements in there and linking them, and then creating a sink GstGhostPad for the bin and pointing it to the sink pad of the first element within the bin. This can be used for a number of purposes, for example to force output to a particular format or to modify or observe the data before it is output.
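A sketch of such a sink bin, wrapping a filter in front of an automatically chosen video sink (the element names used here are examples only):

GstElement *bin, *filter, *sink;
GstPad *pad;

bin = gst_bin_new ("video_sink_bin");
filter = gst_element_factory_make ("videoflip", "filter");
sink = gst_element_factory_make ("autovideosink", "sink");

gst_bin_add_many (GST_BIN (bin), filter, sink, NULL);
gst_element_link (filter, sink);

/* expose the filter's sink pad as the sink pad of the whole bin */
pad = gst_element_get_static_pad (filter, "sink");
gst_element_add_pad (bin, gst_ghost_pad_new ("sink", pad));
gst_object_unref (pad);

g_object_set (playbin, "video-sink", bin, NULL);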

It is also possible to 'suppress' audio and/or video output by using 'fakesink' elements (or capture it from there using the fakesink element's "handoff" signal, which, nota bene, is fired from the streaming thread!).

Retrieving Tags and Other Meta Data

Most of the common meta data (artist, title, etc.) can be retrieved by watching for TAG messages on the pipeline's bus (see above).

Other more specific meta information like width/height/framerate of video streams or samplerate/number of channels of audio streams can be obtained from the negotiated caps on the sink pads of the sinks.
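For example, the negotiated caps of the first video stream could be read once the pipeline is at least in PAUSED state (a sketch using the "get-video-pad" action signal documented below):

GstPad *pad = NULL;
GstCaps *caps;

/* ask playbin for the combiner sink pad of video stream 0 */
g_signal_emit_by_name (playbin, "get-video-pad", 0, &pad);
if (pad != NULL) {
  caps = gst_pad_get_current_caps (pad);
  if (caps != NULL) {
    gchar *str = gst_caps_to_string (caps);
    g_print ("video caps: %s\n", str);
    g_free (str);
    gst_caps_unref (caps);
  }
  gst_object_unref (pad);
}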

Buffering

Playbin handles buffering automatically for the most part, but applications need to handle parts of the buffering process as well. Whenever playbin is buffering, it will post BUFFERING messages on the bus with a percentage value that shows the progress of the buffering process. Applications need to set playbin to PLAYING or PAUSED state in response to these messages. They may also want to convey the buffering progress to the user in some way. Here is how to extract the percentage information from the message:

switch (GST_MESSAGE_TYPE (msg)) {
  case GST_MESSAGE_BUFFERING: {
    gint percent = 0;
    gst_message_parse_buffering (msg, &percent);
    g_print ("Buffering (%d percent done)", percent);
    break;
  }
  ...
}

Note that applications should keep/set the pipeline in the PAUSED state when a BUFFERING message is received with a buffer percent value < 100 and set the pipeline back to PLAYING state when a BUFFERING message with a value of 100 percent is received (if PLAYING is the desired state, that is).
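A sketch of that pause/resume logic, assuming PLAYING is the state the application otherwise wants to be in:

case GST_MESSAGE_BUFFERING: {
  gint percent = 0;

  gst_message_parse_buffering (msg, &percent);
  if (percent < 100)
    gst_element_set_state (playbin, GST_STATE_PAUSED);
  else
    gst_element_set_state (playbin, GST_STATE_PLAYING);
  break;
}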

Embedding the video window in your application

By default, playbin (or rather the video sink used) will create its own window. Applications will usually want to force output to a window of their own, however. This can be done using the GstVideoOverlay interface, which most video sinks implement. See the documentation there for more details.
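A sketch of the usual approach, assuming the application has already obtained a native window handle (window_handle) from its GUI toolkit; the handler would be installed with gst_bus_set_sync_handler():

#include <gst/video/videooverlay.h>

static guintptr window_handle;  /* assumed: obtained from the GUI toolkit */

static GstBusSyncReply
bus_sync_handler (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  if (!gst_is_video_overlay_prepare_window_handle_message (msg))
    return GST_BUS_PASS;

  /* tell the video sink to render into the application's window */
  gst_video_overlay_set_window_handle (
      GST_VIDEO_OVERLAY (GST_MESSAGE_SRC (msg)), window_handle);

  gst_message_unref (msg);
  return GST_BUS_DROP;
}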

Specifying which CD/DVD device to use

The device to use for CDs/DVDs needs to be set on the source element playbin creates before it is opened. The most generic way of doing this is to connect to playbin's "source-setup" (or "notify::source") signal, which will be emitted by playbin when it has created the source element for a particular URI. In the signal callback you can check if the source element has a "device" property and set it appropriately. In some cases the device can also be set as part of the URI, but it depends on the elements involved if this will work or not. For example, for DVD menu playback, the following syntax might work (if the resindvd plugin is used): dvd://[/path/to/device]
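A sketch of that approach using the "source-setup" signal (the device path is just an example):

static void
source_setup_cb (GstElement * playbin, GstElement * source, gpointer user_data)
{
  /* only set the property if the source actually has one called "device" */
  if (g_object_class_find_property (G_OBJECT_GET_CLASS (source), "device"))
    g_object_set (source, "device", "/dev/sr0", NULL);
}

/* in the application's setup code: */
g_signal_connect (playbin, "source-setup", G_CALLBACK (source_setup_cb), NULL);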

Handling redirects

Some elements may post 'redirect' messages on the bus to tell the application to open another location. These are element messages containing a structure named 'redirect' along with a 'new-location' field of string type. The new location may be a relative or an absolute URI. Examples for such redirects can be found in many quicktime movie trailers.
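A sketch of picking up such a message in a bus handler (resolving relative URIs against the current uri is left to the application):

case GST_MESSAGE_ELEMENT: {
  const GstStructure *s = gst_message_get_structure (msg);

  if (s != NULL && gst_structure_has_name (s, "redirect")) {
    const gchar *new_location = gst_structure_get_string (s, "new-location");

    /* stop the current playback, set the new "uri" and play again */
  }
  break;
}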

Examples

gst-launch-1.0 -v playbin uri=file:///path/to/somefile.mp4

This will play back the given video file, given that the video and audio decoders required to decode the content are installed. Since no special audio sink or video sink is supplied (via playbin's audio-sink or video-sink properties), playbin will try to find a suitable audio and video sink automatically using the autoaudiosink and autovideosink elements.

gst-launch-1.0 -v playbin uri=cdda://4

This will play back track 4 on an audio CD in your disc drive (assuming the drive is detected automatically by the plugin).

gst-launch-1.0 -v playbin uri=dvd://

This will play back the DVD in your disc drive (assuming the drive is detected automatically by the plugin).

Synopsis

Element Information

plugin:  playback
author:  Wim Taymans <wim.taymans@gmail.com>
class:   Generic/Bin/Player


Types and Values

struct GstPlayBin

struct GstPlayBin;

playbin element structure

Property Details

The “audio-sink” property

  “audio-sink”               GstElement *

the audio output element to use (NULL = default sink).

Flags: Read / Write


The “audio-stream-combiner” property

  “audio-stream-combiner”    GstElement *

Get or set the current audio stream combiner. By default, an input-selector is created and deleted as-needed.

Flags: Read / Write


The “av-offset” property

  “av-offset”                gint64

Control the synchronisation offset between the audio and video streams. Positive values make the audio ahead of the video and negative values make the audio go behind the video.

Flags: Read / Write

Default value: 0


The “buffer-duration” property

  “buffer-duration”          gint64

Buffer duration when buffering network streams.

Flags: Read / Write

Allowed values: >= -1

Default value: -1


The “buffer-size” property

  “buffer-size”              gint

Buffer size when buffering network streams.

Flags: Read / Write

Allowed values: >= -1

Default value: -1


The “connection-speed” property

  “connection-speed”         guint64

Network connection speed in kbps (0 = unknown).

Flags: Read / Write

Allowed values: <= 18446744073709551

Default value: 0


The “current-audio” property

  “current-audio”            gint

Get or set the currently playing audio stream. By default the first audio stream with data is played.

Flags: Read / Write

Allowed values: >= -1

Default value: -1


The “current-suburi” property

  “current-suburi”           gchar *

The currently playing subtitle uri.

Flags: Read

Default value: NULL


The “current-text” property

  “current-text”             gint

Get or set the currently playing subtitle stream. By default the first subtitle stream with data is played.

Flags: Read / Write

Allowed values: >= -1

Default value: -1


The “current-uri” property

  “current-uri”              gchar *

The currently playing uri.

Flags: Read

Default value: NULL


The “current-video” property

  “current-video”            gint

Get or set the currently playing video stream. By default the first video stream with data is played.

Flags: Read / Write

Allowed values: >= -1

Default value: -1


The “flags” property

  “flags”                    GstPlayFlags

Control the behaviour of playbin.

Flags: Read / Write

Default value: Render the video stream|Render the audio stream|Render subtitles|Use software volume|Deinterlace video if necessary|Use software color balance


The “force-aspect-ratio” property

  “force-aspect-ratio”       gboolean

When enabled, scaling will respect original aspect ratio.

Flags: Read / Write

Default value: TRUE


The “mute” property

  “mute”                     gboolean

Mute the audio channel without changing the volume.

Flags: Read / Write

Default value: FALSE


The “n-audio” property

  “n-audio”                  gint

Get the total number of available audio streams.

Flags: Read

Allowed values: >= 0

Default value: 0


The “n-text” property

  “n-text”                   gint

Get the total number of available subtitle streams.

Flags: Read

Allowed values: >= 0

Default value: 0


The “n-video” property

  “n-video”                  gint

Get the total number of available video streams.

Flags: Read

Allowed values: >= 0

Default value: 0


The “ring-buffer-max-size” property

  “ring-buffer-max-size”     guint64

The maximum size of the ring buffer in bytes. If set to 0, the ring buffer is disabled.

Flags: Read / Write

Allowed values: <= G_MAXUINT

Default value: 0


The “sample” property

  “sample”                   GstSample *

Get the currently rendered or prerolled sample in the video sink. The GstCaps in the sample will describe the format of the buffer.

Flags: Read


The “source” property

  “source”                   GstElement *

Source element.

Flags: Read


The “subtitle-encoding” property

  “subtitle-encoding”        gchar *

Encoding to assume if input subtitles are not in UTF-8 encoding. If not set, the GST_SUBTITLE_ENCODING environment variable will be checked for an encoding to use. If that is not set either, ISO-8859-15 will be assumed.

Flags: Read / Write

Default value: NULL


The “subtitle-font-desc” property

  “subtitle-font-desc”       gchar *

Pango font description of font to be used for subtitle rendering.

Flags: Write

Default value: NULL


The “suburi” property

  “suburi”                   gchar *

Set the next subtitle URI that playbin will play. This property can be set from the about-to-finish signal to queue the next subtitle media file.

Flags: Read / Write

Default value: NULL


The “text-sink” property

  “text-sink”                GstElement *

the text output element to use (NULL = default subtitleoverlay).

Flags: Read / Write


The “text-stream-combiner” property

  “text-stream-combiner”     GstElement *

Get or set the current text stream combiner. By default, an input-selector is created and deleted as-needed.

Flags: Read / Write


The “uri” property

  “uri”                      gchar *

Set the next URI that playbin will play. This property can be set from the about-to-finish signal to queue the next media file.

Flags: Read / Write

Default value: NULL


The “video-sink” property

  “video-sink”               GstElement *

the video output element to use (NULL = default sink).

Flags: Read / Write


The “video-stream-combiner” property

  “video-stream-combiner”    GstElement *

Get or set the current video stream combiner. By default, an input-selector is created and deleted as-needed.

Flags: Read / Write


The “vis-plugin” property

  “vis-plugin”               GstElement *

the visualization element to use (NULL = default).

Flags: Read / Write


The “volume” property

  “volume”                   gdouble

Get or set the current audio stream volume. 1.0 means 100%, 0.0 means mute. This uses a linear volume scale.

Flags: Read / Write

Allowed values: [0,10]

Default value: 1


The “audio-filter” property

  “audio-filter”             GstElement *

the audio filter(s) to apply, if possible.

Flags: Read / Write


The “video-filter” property

  “video-filter”             GstElement *

the video filter(s) to apply, if possible.

Flags: Read / Write


The “video-multiview-flags” property

  “video-multiview-flags”    GstVideoMultiviewFlags

Override details of the multiview frame layout.

Flags: Read / Write


The “video-multiview-mode” property

  “video-multiview-mode”     GstVideoMultiviewFramePacking

Re-interpret a video stream as one of several frame-packed stereoscopic modes.

Flags: Read / Write

Default value: GST_VIDEO_MULTIVIEW_FRAME_PACKING_NONE

Signal Details

The “about-to-finish” signal

void
user_function (GstPlayBin *playbin,
               gpointer    user_data)

This signal is emitted when the current uri is about to finish. You can set the uri and suburi to make sure that playback continues.

This signal is emitted from the context of a GStreamer streaming thread.
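A sketch of a handler that queues the next file (next_uri is an assumed application-provided string passed as user data):

static void
about_to_finish_cb (GstPlayBin * playbin, gpointer user_data)
{
  const gchar *next_uri = user_data;

  /* setting "uri" here makes playback continue with the next file */
  g_object_set (playbin, "uri", next_uri, NULL);
}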

Parameters

playbin

a GstPlayBin

 

user_data

user data set when the signal handler was connected.

 

Flags: Run Last


The “audio-changed” signal

void
user_function (GstPlayBin *playbin,
               gpointer    user_data)

This signal is emitted whenever the number or order of the audio streams has changed. The application will most likely want to select a new audio stream.

This signal may be emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.

Parameters

playbin

a GstPlayBin

 

user_data

user data set when the signal handler was connected.

 

Flags: Run Last


The “audio-tags-changed” signal

void
user_function (GstPlayBin *playbin,
               gint        stream,
               gpointer    user_data)

This signal is emitted whenever the tags of an audio stream have changed. The application will most likely want to get the new tags.

This signal may be emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.

Parameters

playbin

a GstPlayBin

 

stream

stream index with changed tags

 

user_data

user data set when the signal handler was connected.

 

Flags: Run Last


The “convert-sample” signal

GstSample*
user_function (GstPlayBin *playbin,
               GstCaps    *caps,
               gpointer    user_data)

Action signal to retrieve the currently playing video frame in the format specified by caps . If caps is NULL, no conversion will be performed and this function is equivalent to the “sample” property.
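A sketch of emitting this action signal to fetch the current frame as raw RGB (the caps string is illustrative only):

GstCaps *caps;
GstSample *sample = NULL;

caps = gst_caps_from_string ("video/x-raw,format=RGB");
g_signal_emit_by_name (playbin, "convert-sample", caps, &sample);
gst_caps_unref (caps);

if (sample != NULL) {
  /* inspect gst_sample_get_caps() / gst_sample_get_buffer() here */
  gst_sample_unref (sample);
}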

Parameters

playbin

a GstPlayBin

 

caps

the target format of the frame

 

user_data

user data set when the signal handler was connected.

 

Returns

a GstSample of the current video frame converted to caps. The caps on the sample will describe the final layout of the buffer data. NULL is returned when no current buffer can be retrieved or when the conversion failed.

Flags: Action


The “get-audio-pad” signal

GstPad*
user_function (GstPlayBin *playbin,
               gint        stream,
               gpointer    user_data)

Action signal to retrieve the stream-combiner sinkpad for a specific audio stream. This pad can be used for notifications of caps changes, stream-specific queries, etc.

Parameters

playbin

a GstPlayBin

 

stream

an audio stream number

 

user_data

user data set when the signal handler was connected.

 

Returns

a GstPad, or NULL when the stream number does not exist.

Flags: Action


The “get-audio-tags” signal

GstTagList*
user_function (GstPlayBin *playbin,
               gint        stream,
               gpointer    user_data)

Action signal to retrieve the tags of a specific audio stream number. This information can be used to select a stream.

Parameters

playbin

a GstPlayBin

 

stream

an audio stream number

 

user_data

user data set when the signal handler was connected.

 

Returns

a GstTagList with tags or NULL when the stream number does not exist.

Flags: Action


The “get-text-pad” signal

GstPad*
user_function (GstPlayBin *playbin,
               gint        stream,
               gpointer    user_data)

Action signal to retrieve the stream-combiner sinkpad for a specific text stream. This pad can be used for notifications of caps changes, stream-specific queries, etc.

Parameters

playbin

a GstPlayBin

 

stream

a text stream number

 

user_data

user data set when the signal handler was connected.

 

Returns

a GstPad, or NULL when the stream number does not exist.

Flags: Action


The “get-text-tags” signal

GstTagList*
user_function (GstPlayBin *playbin,
               gint        stream,
               gpointer    user_data)

Action signal to retrieve the tags of a specific text stream number. This information can be used to select a stream.

Parameters

playbin

a GstPlayBin

 

stream

a text stream number

 

user_data

user data set when the signal handler was connected.

 

Returns

a GstTagList with tags or NULL when the stream number does not exist.

Flags: Action


The “get-video-pad” signal

GstPad*
user_function (GstPlayBin *playbin,
               gint        stream,
               gpointer    user_data)

Action signal to retrieve the stream-combiner sinkpad for a specific video stream. This pad can be used for notifications of caps changes, stream-specific queries, etc.

Parameters

playbin

a GstPlayBin

 

stream

a video stream number

 

user_data

user data set when the signal handler was connected.

 

Returns

a GstPad, or NULL when the stream number does not exist.

Flags: Action


The “get-video-tags” signal

GstTagList*
user_function (GstPlayBin *playbin,
               gint        stream,
               gpointer    user_data)

Action signal to retrieve the tags of a specific video stream number. This information can be used to select a stream.

Parameters

playbin

a GstPlayBin

 

stream

a video stream number

 

user_data

user data set when the signal handler was connected.

 

Returns

a GstTagList with tags or NULL when the stream number does not exist.

Flags: Action


The “source-setup” signal

void
user_function (GstPlayBin *playbin,
               GstElement *source,
               gpointer    user_data)

This signal is emitted after the source element has been created, so it can be configured by setting additional properties (e.g. set a proxy server for an http source, or set the device and read speed for an audio cd source). This is functionally equivalent to connecting to the notify::source signal, but more convenient.

This signal is usually emitted from the context of a GStreamer streaming thread.

Parameters

playbin

a GstPlayBin

 

source

source element

 

user_data

user data set when the signal handler was connected.

 

Flags: Run Last


The “text-changed” signal

void
user_function (GstPlayBin *playbin,
               gpointer    user_data)

This signal is emitted whenever the number or order of the text streams has changed. The application will most likely want to select a new text stream.

This signal may be emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.

Parameters

playbin

a GstPlayBin

 

user_data

user data set when the signal handler was connected.

 

Flags: Run Last


The “text-tags-changed” signal

void
user_function (GstPlayBin *playbin,
               gint        stream,
               gpointer    user_data)

This signal is emitted whenever the tags of a text stream have changed. The application will most likely want to get the new tags.

This signal may be emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.

Parameters

playbin

a GstPlayBin

 

stream

stream index with changed tags

 

user_data

user data set when the signal handler was connected.

 

Flags: Run Last


The “video-changed” signal

void
user_function (GstPlayBin *playbin,
               gpointer    user_data)

This signal is emitted whenever the number or order of the video streams has changed. The application will most likely want to select a new video stream.

This signal is usually emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.

Parameters

playbin

a GstPlayBin

 

user_data

user data set when the signal handler was connected.

 

Flags: Run Last


The “video-tags-changed” signal

void
user_function (GstPlayBin *playbin,
               gint        stream,
               gpointer    user_data)

This signal is emitted whenever the tags of a video stream have changed. The application will most likely want to get the new tags.

This signal may be emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.

Parameters

playbin

a GstPlayBin

 

stream

stream index with changed tags

 

user_data

user data set when the signal handler was connected.

 

Flags: Run Last


The “element-setup” signal

void
user_function (GstPlayBin *playbin,
               GstElement *element,
               gpointer    user_data)

This signal is emitted when a new element is added to playbin or any of its sub-bins. This signal can be used to configure elements, e.g. to set properties on decoders. This is functionally equivalent to connecting to the deep-element-added signal, but more convenient.

This signal is usually emitted from the context of a GStreamer streaming thread, so might be called at the same time as code running in the main application thread.
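A sketch of such a handler; the factory-name prefix and the property checked here are examples only:

static void
element_setup_cb (GstElement * playbin, GstElement * element, gpointer user_data)
{
  GstElementFactory *factory = gst_element_get_factory (element);

  if (factory == NULL)
    return;

  /* e.g. limit decoder threads on libav video decoders, if they expose it */
  if (g_str_has_prefix (GST_OBJECT_NAME (factory), "avdec_") &&
      g_object_class_find_property (G_OBJECT_GET_CLASS (element), "max-threads"))
    g_object_set (element, "max-threads", 2, NULL);
}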

Parameters

playbin

a GstPlayBin

 

element

an element that was added to the playbin hierarchy

 

user_data

user data set when the signal handler was connected.

 

Flags: Run Last

Since: 1.10
