Return type | Signal | Flags
void | about-to-finish | Run Last
void | audio-changed | Run Last
void | audio-tags-changed | Run Last
GstSample* | convert-sample | Action
GstPad* | get-audio-pad | Action
GstTagList* | get-audio-tags | Action
GstPad* | get-text-pad | Action
GstTagList* | get-text-tags | Action
GstPad* | get-video-pad | Action
GstTagList* | get-video-tags | Action
void | source-setup | Run Last
void | text-changed | Run Last
void | text-tags-changed | Run Last
void | video-changed | Run Last
void | video-tags-changed | Run Last
GObject ╰── GInitiallyUnowned ╰── GstObject ╰── GstElement ╰── GstBin ╰── GstPipeline ╰── GstPlayBin
GstPlayBin implements GstChildProxy, GstStreamVolume, GstVideoOverlay, GstNavigation and GstColorBalance.
Playbin provides a stand-alone everything-in-one abstraction for an audio and/or video player.
Playbin can handle both audio and video files and features automatic file type recognition, stream selection, meta-data (tag) extraction, buffering of network streams, and volume control.
A playbin element can be created just like any other element using gst_element_factory_make(). The file/URI to play should be set via the “uri” property. This must be an absolute URI; relative file paths are not allowed. Example URIs are file:///home/joe/movie.avi or http://www.joedoe.com/foo.ogg
Playbin is a GstPipeline. It will notify the application of everything
that's happening (errors, end of stream, tags found, state changes, etc.)
by posting messages on its GstBus. The application needs to watch the
bus.
Playback can be initiated by setting the element to PLAYING state using gst_element_set_state(). Note that the state change will take place in the background in a separate thread; when the function returns, playback is probably not happening yet and any errors might not have occurred yet. Applications using playbin should ideally be written to deal with things happening completely asynchronously.
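The steps above (create a playbin, set the “uri” property, watch the bus, set the state to PLAYING) can be combined into a minimal player. This is an illustrative sketch assuming the GStreamer 1.x C API; the URI is a placeholder:

```c
#include <gst/gst.h>

/* Bus watch: quit the main loop on error or end-of-stream. */
static gboolean
bus_cb (GstBus *bus, GstMessage *msg, gpointer user_data)
{
  GMainLoop *loop = user_data;

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR: {
      GError *err = NULL;
      gchar *dbg = NULL;
      gst_message_parse_error (msg, &err, &dbg);
      g_printerr ("Error: %s\n", err->message);
      g_clear_error (&err);
      g_free (dbg);
      g_main_loop_quit (loop);
      break;
    }
    case GST_MESSAGE_EOS:
      g_main_loop_quit (loop);
      break;
    default:
      break;
  }
  return TRUE;
}

int
main (int argc, char **argv)
{
  GMainLoop *loop;
  GstElement *playbin;
  GstBus *bus;

  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* Create playbin and point it at an absolute URI (placeholder). */
  playbin = gst_element_factory_make ("playbin", "player");
  g_object_set (playbin, "uri", "file:///home/joe/movie.avi", NULL);

  /* Watch the pipeline's bus for messages. */
  bus = gst_element_get_bus (playbin);
  gst_bus_add_watch (bus, bus_cb, loop);
  gst_object_unref (bus);

  /* Start playback; the state change completes asynchronously. */
  gst_element_set_state (playbin, GST_STATE_PLAYING);
  g_main_loop_run (loop);

  gst_element_set_state (playbin, GST_STATE_NULL);
  gst_object_unref (playbin);
  g_main_loop_unref (loop);
  return 0;
}
```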
When playback has finished (an EOS message has been received on the bus), or an error has occurred (an ERROR message has been received on the bus), or the user wants to play a different track, playbin should be set back to READY or NULL state, then the “uri” property should be set to the new location, and then playbin should be set to PLAYING state again.
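A track change following this pattern might look like the sketch below; the playbin element and the new URI are assumed to already exist in the application:

```c
/* Sketch: switch to a new track. Drop back to READY (or NULL), swap the
 * "uri" property, then return to PLAYING. The URI is a placeholder. */
gst_element_set_state (playbin, GST_STATE_READY);
g_object_set (playbin, "uri", "file:///home/joe/next.ogg", NULL);
gst_element_set_state (playbin, GST_STATE_PLAYING);
```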
Seeking can be done using gst_element_seek_simple() or gst_element_seek() on the playbin element. Again, the seek will not be executed instantaneously, but will be done in a background thread. When the seek call returns, the seek will most likely still be in progress. An application may wait for the seek to finish (or fail) using gst_element_get_state() with -1 as the timeout, but this will block the user interface and is not recommended at all.
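A simple seek on playbin can be sketched as follows (assuming the GStreamer 1.x API; the 30-second target is arbitrary):

```c
/* Sketch: seek to 30 seconds. The call returns immediately; the seek
 * itself completes asynchronously in a streaming thread. */
if (!gst_element_seek_simple (playbin, GST_FORMAT_TIME,
        GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
        30 * GST_SECOND))
  g_printerr ("Seek failed\n");
```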
Applications may query the current position and duration of the stream via gst_element_query_position() and gst_element_query_duration(), passing GST_FORMAT_TIME as the format. If the query was successful, the duration or position will have been returned in units of nanoseconds.
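A position/duration query might look like this sketch (assuming the GStreamer 1.x signatures, where the format is passed by value):

```c
/* Sketch: query position and duration in nanoseconds and print them
 * with the convenience time-formatting macros. */
gint64 pos = 0, dur = 0;

if (gst_element_query_position (playbin, GST_FORMAT_TIME, &pos) &&
    gst_element_query_duration (playbin, GST_FORMAT_TIME, &dur)) {
  g_print ("Position %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\n",
      GST_TIME_ARGS (pos), GST_TIME_ARGS (dur));
}
```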
By default, if no audio sink or video sink has been specified via the
“audio-sink” or “video-sink” property, playbin will use the autoaudiosink
and autovideosink elements to find the first-best available output method.
This should work in most cases, but is not always desirable. Often either
the user or application might want to specify more explicitly what to use
for audio and video output.
If the application wants more control over how audio or video should be output, it may create the audio/video sink elements itself (for example using gst_element_factory_make()) and provide them to playbin using the “audio-sink” or “video-sink” property.
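Providing an explicit sink is a two-line sketch; "alsasink" here is just an example element, and any audio sink could be used instead:

```c
/* Sketch: hand playbin a specific audio sink instead of letting it
 * pick one via autoaudiosink. */
GstElement *sink = gst_element_factory_make ("alsasink", NULL);
g_object_set (playbin, "audio-sink", sink, NULL);
```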
GNOME-based applications, for example, will usually want to create
gconfaudiosink and gconfvideosink elements and make playbin use those,
so that output happens to whatever the user has configured in the GNOME
Multimedia System Selector configuration dialog.
The sink elements do not necessarily need to be ready-made sinks. It is
possible to create container elements that look like a sink to playbin,
but in reality contain a number of custom elements linked together. This
can be achieved by creating a GstBin and putting elements in there and
linking them, and then creating a sink GstGhostPad for the bin and pointing
it to the sink pad of the first element within the bin. This can be used
for a number of purposes, for example to force output to a particular
format or to modify or observe the data before it is output.
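The GstBin/GstGhostPad approach described above can be sketched like this; the equalizer is just an example of a custom element placed in front of the real sink:

```c
/* Sketch: a custom "sink" bin containing an equalizer followed by the
 * real audio sink, exposed to playbin through a ghost pad. */
GstElement *bin, *eq, *sink;
GstPad *pad;

bin = gst_bin_new ("audio_sink_bin");
eq = gst_element_factory_make ("equalizer-3bands", NULL);
sink = gst_element_factory_make ("autoaudiosink", NULL);

gst_bin_add_many (GST_BIN (bin), eq, sink, NULL);
gst_element_link (eq, sink);

/* Expose the first element's sink pad as the bin's own sink pad. */
pad = gst_element_get_static_pad (eq, "sink");
gst_element_add_pad (bin, gst_ghost_pad_new ("sink", pad));
gst_object_unref (pad);

/* The bin now looks like a single sink element to playbin. */
g_object_set (playbin, "audio-sink", bin, NULL);
```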
It is also possible to 'suppress' audio and/or video output by using
'fakesink' elements (or capture it from there using the fakesink element's
"handoff" signal, which, nota bene, is fired from the streaming thread!).
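Suppressing one of the outputs is as simple as installing a fakesink, as in this sketch:

```c
/* Sketch: discard all video output by using fakesink as the video sink. */
GstElement *fakesink = gst_element_factory_make ("fakesink", NULL);
g_object_set (playbin, "video-sink", fakesink, NULL);
```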
Most of the common meta data (artist, title, etc.) can be retrieved by watching for TAG messages on the pipeline's bus (see above). Other more specific meta information like width/height/framerate of video streams or samplerate/number of channels of audio streams can be obtained from the negotiated caps on the sink pads of the sinks.
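Reading such stream properties from the negotiated caps might look like the following sketch (assuming the GStreamer 1.x API and a `videosink` variable held by the application; the pipeline must be at least PAUSED for the caps to be negotiated):

```c
/* Sketch: read the negotiated width/height from the video sink's
 * sink pad. */
GstPad *pad = gst_element_get_static_pad (videosink, "sink");
GstCaps *caps = gst_pad_get_current_caps (pad);

if (caps) {
  GstStructure *s = gst_caps_get_structure (caps, 0);
  gint width, height;

  if (gst_structure_get_int (s, "width", &width) &&
      gst_structure_get_int (s, "height", &height))
    g_print ("Video is %dx%d\n", width, height);
  gst_caps_unref (caps);
}
gst_object_unref (pad);
```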
A BUFFERING message posted on the bus can be handled like this:

```c
switch (GST_MESSAGE_TYPE (msg)) {
  case GST_MESSAGE_BUFFERING: {
    gint percent = 0;
    gst_message_parse_buffering (msg, &percent);
    g_print ("Buffering (%u percent done)", percent);
    break;
  }
  ...
}
```
Some elements may post 'redirect' messages on the bus to tell the application to open another location. These are element messages containing a structure named 'redirect' along with a 'new-location' field of string type. The new location may be a relative or an absolute URI. Examples of such redirects can be found in many QuickTime movie trailers.
```shell
gst-launch -v playbin uri=file:///path/to/somefile.avi
gst-launch -v playbin uri=cdda://4
gst-launch -v playbin uri=dvd://
```
plugin | playback
author | Wim Taymans <wim.taymans@gmail.com>
class | Generic/Bin/Player
Extra flags to configure the behaviour of the sinks.
video | Enable rendering of the video stream
audio | Enable rendering of the audio stream
text | Enable rendering of subtitles
vis | Enable rendering of visualisations when there is no video stream
soft-volume | Use software volume
native-audio | Only allow native audio formats; this omits configuration of audioconvert and audioresample
native-video | Only allow native video formats; this omits configuration of videoconvert and videoscale
download | Enable progressive download buffering for selected formats
buffering | Enable buffering of the demuxed or parsed data
deinterlace | Deinterlace raw video (if native video is not forced)
force-filters | Force audio/video filters to be applied if set
“audio-sink”
property “audio-sink” GstElement *
The audio output element to use (NULL = default sink).
Flags: Read / Write
“audio-stream-combiner”
property “audio-stream-combiner” GstElement *
Get or set the current audio stream combiner. By default, an input-selector is created and deleted as needed.
Flags: Read / Write
“av-offset”
property “av-offset” gint64
Control the synchronisation offset between the audio and video streams, in nanoseconds. Positive values make the audio run ahead of the video; negative values make the audio lag behind the video.
Flags: Read / Write
Default value: 0
“buffer-duration”
property “buffer-duration” gint64
Buffer duration when buffering network streams.
Flags: Read / Write
Allowed values: >= -1
Default value: -1
“buffer-size”
property “buffer-size” gint
Buffer size when buffering network streams.
Flags: Read / Write
Allowed values: >= -1
Default value: -1
“connection-speed”
property “connection-speed” guint64
Network connection speed in kbps (0 = unknown).
Flags: Read / Write
Allowed values: <= 18446744073709551
Default value: 0
“current-audio”
property “current-audio” gint
Get or set the currently playing audio stream. By default the first audio stream with data is played.
Flags: Read / Write
Allowed values: >= -1
Default value: -1
“current-suburi”
property “current-suburi” gchar *
The URI of the currently playing subtitle file.
Flags: Read
Default value: NULL
“current-text”
property “current-text” gint
Get or set the currently playing subtitle stream. By default the first subtitle stream with data is played.
Flags: Read / Write
Allowed values: >= -1
Default value: -1
“current-uri”
property “current-uri” gchar *
The currently playing URI.
Flags: Read
Default value: NULL
“current-video”
property “current-video” gint
Get or set the currently playing video stream. By default the first video stream with data is played.
Flags: Read / Write
Allowed values: >= -1
Default value: -1
“flags”
property “flags” GstPlayFlags
Control the behaviour of playbin.
Flags: Read / Write
Default value: Render the video stream | Render the audio stream | Render subtitles | Use software volume | Deinterlace video if necessary | Use software color balance
“force-aspect-ratio”
property “force-aspect-ratio” gboolean
When enabled, scaling will respect the original aspect ratio.
Flags: Read / Write
Default value: TRUE
“mute”
property “mute” gboolean
Mute the audio channel without changing the volume.
Flags: Read / Write
Default value: FALSE
“n-audio”
property “n-audio” gint
Get the total number of available audio streams.
Flags: Read
Allowed values: >= 0
Default value: 0
“n-text”
property “n-text” gint
Get the total number of available subtitle streams.
Flags: Read
Allowed values: >= 0
Default value: 0
“n-video”
property “n-video” gint
Get the total number of available video streams.
Flags: Read
Allowed values: >= 0
Default value: 0
“ring-buffer-max-size”
property “ring-buffer-max-size” guint64
The maximum size of the ring buffer, in bytes. If set to 0, the ring buffer is disabled.
Flags: Read / Write
Allowed values: <= G_MAXUINT
Default value: 0
“sample”
property “sample” GstSample *
Get the currently rendered or prerolled sample in the video sink. The GstCaps in the sample will describe the format of the buffer.
Flags: Read
“subtitle-encoding”
property “subtitle-encoding” gchar *
Encoding to assume if input subtitles are not in UTF-8 encoding. If not set, the GST_SUBTITLE_ENCODING environment variable will be checked for an encoding to use. If that is not set either, ISO-8859-15 will be assumed.
Flags: Read / Write
Default value: NULL
“subtitle-font-desc”
property “subtitle-font-desc” gchar *
Pango font description of the font to be used for subtitle rendering.
Flags: Write
Default value: NULL
“suburi”
property “suburi” gchar *
Set the next subtitle URI that playbin will play. This property can be set from the about-to-finish signal to queue the next subtitle media file.
Flags: Read / Write
Default value: NULL
“text-sink”
property “text-sink” GstElement *
The text output element to use (NULL = default subtitleoverlay).
Flags: Read / Write
“text-stream-combiner”
property “text-stream-combiner” GstElement *
Get or set the current text stream combiner. By default, an input-selector is created and deleted as needed.
Flags: Read / Write
“uri”
property “uri” gchar *
Set the next URI that playbin will play. This property can be set from the about-to-finish signal to queue the next media file.
Flags: Read / Write
Default value: NULL
“video-sink”
property “video-sink” GstElement *
The video output element to use (NULL = default sink).
Flags: Read / Write
“video-stream-combiner”
property “video-stream-combiner” GstElement *
Get or set the current video stream combiner. By default, an input-selector is created and deleted as needed.
Flags: Read / Write
“vis-plugin”
property “vis-plugin” GstElement *
The visualization element to use (NULL = default).
Flags: Read / Write
“volume”
property “volume” gdouble
Get or set the current audio stream volume. 1.0 means 100%, 0.0 means mute. This uses a linear volume scale.
Flags: Read / Write
Allowed values: [0,10]
Default value: 1
“audio-filter”
property “audio-filter” GstElement *
The audio filter(s) to apply, if possible.
Flags: Read / Write
“video-filter”
property “video-filter” GstElement *
The video filter(s) to apply, if possible.
Flags: Read / Write
“about-to-finish”
signal void user_function (GstPlayBin *playbin, gpointer user_data)
This signal is emitted when the current URI is about to finish. You can set the uri and suburi properties to make sure that playback continues.
This signal is emitted from the context of a GStreamer streaming thread.
Flags: Run Last
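Gapless playback can be achieved by setting the next URI from this signal, as in the following sketch; `next_uri_for()` is a hypothetical application helper:

```c
/* Sketch: queue the next URI from the about-to-finish callback. Note
 * that this runs in a GStreamer streaming thread. */
static void
about_to_finish_cb (GstElement *playbin, gpointer user_data)
{
  const gchar *next = next_uri_for (user_data);  /* hypothetical helper */
  if (next)
    g_object_set (playbin, "uri", next, NULL);
}

/* Somewhere during setup: */
g_signal_connect (playbin, "about-to-finish",
    G_CALLBACK (about_to_finish_cb), app_data);
```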
“audio-changed”
signal void user_function (GstPlayBin *playbin, gpointer user_data)
This signal is emitted whenever the number or order of the audio streams has changed. The application will most likely want to select a new audio stream.
This signal may be emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.
Flags: Run Last
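Forwarding the notification to the main thread might look like this sketch (assuming the GStreamer 1.x API; the structure name is arbitrary):

```c
/* Sketch: from the streaming-thread signal handler, post an application
 * message that the main thread's bus watch can pick up. */
static void
audio_changed_cb (GstElement *playbin, gpointer user_data)
{
  gst_element_post_message (playbin,
      gst_message_new_application (GST_OBJECT (playbin),
          gst_structure_new_empty ("audio-changed")));
}
```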
“audio-tags-changed”
signal void user_function (GstPlayBin *playbin, gint stream, gpointer user_data)
This signal is emitted whenever the tags of an audio stream have changed. The application will most likely want to get the new tags.
This signal may be emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.
playbin | the GstPlayBin
stream | stream index with changed tags
user_data | user data set when the signal handler was connected.
Flags: Run Last
“convert-sample”
signal GstSample* user_function (GstPlayBin *playbin, GstCaps *caps, gpointer user_data)
Action signal to retrieve the currently playing video frame in the format specified by caps. If caps is NULL, no conversion will be performed and this function is equivalent to the “sample” property.
playbin | the GstPlayBin
caps | the target format of the frame
user_data | user data set when the signal handler was connected.
Returns: a GstSample of the current video frame converted to caps. The caps on the sample will describe the final layout of the buffer data. NULL is returned when no current buffer can be retrieved or when the conversion failed.
Flags: Action
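Invoking this action signal might look like the following sketch (assuming GStreamer 1.x caps names; RGB is an arbitrary choice of target format):

```c
/* Sketch: grab the current video frame converted to raw RGB. */
GstCaps *caps = gst_caps_from_string ("video/x-raw,format=RGB");
GstSample *sample = NULL;

g_signal_emit_by_name (playbin, "convert-sample", caps, &sample);
gst_caps_unref (caps);

if (sample) {
  GstBuffer *buf = gst_sample_get_buffer (sample);
  /* ... process the RGB frame in buf ... */
  gst_sample_unref (sample);
}
```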
“get-audio-pad”
signal GstPad* user_function (GstPlayBin *playbin, gint stream, gpointer user_data)
Action signal to retrieve the stream-combiner sinkpad for a specific audio stream. This pad can be used for notifications of caps changes, stream-specific queries, etc.
playbin | the GstPlayBin
stream | an audio stream number
user_data | user data set when the signal handler was connected.
Flags: Action
“get-audio-tags”
signal GstTagList* user_function (GstPlayBin *playbin, gint stream, gpointer user_data)
Action signal to retrieve the tags of a specific audio stream number. This information can be used to select a stream.
playbin | the GstPlayBin
stream | an audio stream number
user_data | user data set when the signal handler was connected.
Flags: Action
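Using this action signal together with the “n-audio” property to list stream languages might look like this sketch (assuming the GStreamer 1.x tag API):

```c
/* Sketch: iterate the audio streams and print each one's language tag,
 * if present. */
gint i, n_audio = 0;
g_object_get (playbin, "n-audio", &n_audio, NULL);

for (i = 0; i < n_audio; i++) {
  GstTagList *tags = NULL;
  gchar *lang = NULL;

  g_signal_emit_by_name (playbin, "get-audio-tags", i, &tags);
  if (tags && gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &lang)) {
    g_print ("Stream %d: language %s\n", i, lang);
    g_free (lang);
  }
  if (tags)
    gst_tag_list_unref (tags);
}
```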
“get-text-pad”
signal GstPad* user_function (GstPlayBin *playbin, gint stream, gpointer user_data)
Action signal to retrieve the stream-combiner sinkpad for a specific text stream. This pad can be used for notifications of caps changes, stream-specific queries, etc.
playbin | the GstPlayBin
stream | a text stream number
user_data | user data set when the signal handler was connected.
Flags: Action
“get-text-tags”
signal GstTagList* user_function (GstPlayBin *playbin, gint stream, gpointer user_data)
Action signal to retrieve the tags of a specific text stream number. This information can be used to select a stream.
playbin | the GstPlayBin
stream | a text stream number
user_data | user data set when the signal handler was connected.
Flags: Action
“get-video-pad”
signal GstPad* user_function (GstPlayBin *playbin, gint stream, gpointer user_data)
Action signal to retrieve the stream-combiner sinkpad for a specific video stream. This pad can be used for notifications of caps changes, stream-specific queries, etc.
playbin | the GstPlayBin
stream | a video stream number
user_data | user data set when the signal handler was connected.
Flags: Action
“get-video-tags”
signal GstTagList* user_function (GstPlayBin *playbin, gint stream, gpointer user_data)
Action signal to retrieve the tags of a specific video stream number. This information can be used to select a stream.
playbin | the GstPlayBin
stream | a video stream number
user_data | user data set when the signal handler was connected.
Flags: Action
“source-setup”
signal void user_function (GstPlayBin *playbin, GstElement *source, gpointer user_data)
This signal is emitted after the source element has been created, so it can be configured by setting additional properties (e.g. set a proxy server for an http source, or set the device and read speed for an audio cd source). This is functionally equivalent to connecting to the notify::source signal, but more convenient.
This signal is usually emitted from the context of a GStreamer streaming thread.
playbin | the GstPlayBin
source | the source element
user_data | user data set when the signal handler was connected.
Flags: Run Last
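Configuring the source from this signal might look like the following sketch. Whether a given property exists depends on the actual source element chosen for the URI (for example, HTTP sources such as souphttpsrc have a "user-agent" property), so the sketch probes for it first:

```c
/* Sketch: set a property on the source element once it exists, but only
 * if the element actually has that property. */
static void
source_setup_cb (GstElement *playbin, GstElement *source, gpointer user_data)
{
  if (g_object_class_find_property (G_OBJECT_GET_CLASS (source),
          "user-agent"))
    g_object_set (source, "user-agent", "MyPlayer/1.0", NULL);
}

/* Somewhere during setup: */
g_signal_connect (playbin, "source-setup",
    G_CALLBACK (source_setup_cb), NULL);
```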
“text-changed”
signal void user_function (GstPlayBin *playbin, gpointer user_data)
This signal is emitted whenever the number or order of the text streams has changed. The application will most likely want to select a new text stream.
This signal may be emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.
Flags: Run Last
“text-tags-changed”
signal void user_function (GstPlayBin *playbin, gint stream, gpointer user_data)
This signal is emitted whenever the tags of a text stream have changed. The application will most likely want to get the new tags.
This signal may be emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.
playbin | the GstPlayBin
stream | stream index with changed tags
user_data | user data set when the signal handler was connected.
Flags: Run Last
“video-changed”
signal void user_function (GstPlayBin *playbin, gpointer user_data)
This signal is emitted whenever the number or order of the video streams has changed. The application will most likely want to select a new video stream.
This signal is usually emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.
Flags: Run Last
“video-tags-changed”
signal void user_function (GstPlayBin *playbin, gint stream, gpointer user_data)
This signal is emitted whenever the tags of a video stream have changed. The application will most likely want to get the new tags.
This signal may be emitted from the context of a GStreamer streaming thread. You can use gst_message_new_application() and gst_element_post_message() to notify your application's main thread.
playbin | the GstPlayBin
stream | stream index with changed tags
user_data | user data set when the signal handler was connected.
Flags: Run Last