Type | Name | Flags
GValueArray * | channel-positions | Read / Write
GstRawAudioParseFormat | format | Read / Write
gboolean | interleaved | Read / Write
gint | num-channels | Read / Write
GstAudioFormat | pcm-format | Read / Write
gint | sample-rate | Read / Write
GObject
 ╰── GInitiallyUnowned
      ╰── GstObject
           ╰── GstElement
                ╰── GstBaseParse
                     ╰── GstRawBaseParse
                          ╰── GstRawAudioParse
This element parses incoming data as raw video frames and timestamps them. It also handles seek queries in said raw video data, and ensures that output buffers contain exactly one frame, even if the input buffers contain only partial frames or multiple frames. In the former case, it continues to receive buffers until there is enough input data to output one frame. In the latter case, it extracts the first frame from the buffer and outputs it, then the second one, and so on, until the remaining unparsed bytes are not enough to form a complete frame; it then continues as described in the former case.
The element implements the properties and sink caps configuration as specified in the GstRawBaseParse documentation. The properties configuration can be modified by using the width, height, pixel-aspect-ratio, framerate, interlaced, top-field-first, plane-strides, plane-offsets, and frame-size properties.
If the properties configuration is used, plane strides and offsets are computed with gst_video_info_set_format(). This can be overridden by passing GValueArrays to the plane-offsets and plane-strides properties. Once this is done, these custom offsets and strides are used even if new width, height, format etc. property values are set later. To switch back to computed plane strides and offsets, pass NULL to one or both of the plane-offsets and plane-strides properties.
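For illustration only, here is a minimal C sketch (not part of the original documentation) of driving the properties configuration described above; the element and property names are the ones listed on this page, everything else, including the omitted error handling, is an assumption:

#include <gst/gst.h>

int
main (int argc, char **argv)
{
  GstElement *parse;

  gst_init (&argc, &argv);

  /* Assumes the rawparse plugin is available; error handling omitted. */
  parse = gst_element_factory_make ("rawvideoparse", NULL);

  /* Basic frame geometry; plane strides and offsets are then computed
   * internally (via gst_video_info_set_format()). */
  g_object_set (parse,
      "use-sink-caps", FALSE,
      "width", 500,
      "height", 400,
      NULL);

  /* If custom plane-offsets / plane-strides were set earlier, passing
   * NULL switches back to the computed values, as described above. */
  g_object_set (parse, "plane-offsets", NULL, "plane-strides", NULL, NULL);

  gst_object_unref (parse);
  return 0;
}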
The frame-size property is useful in cases where there is extra data between the frames (for example, trailing metadata or headers). The parser calculates the actual frame size from the other properties and compares it with the frame-size value. If the frame-size value is larger than the calculated size, the extra bytes after the end of the frame are skipped. For example, with 8-bit grayscale frames, an actual frame size of 100x10 pixels (1000 bytes) and a frame-size of 1500 bytes, there are 500 excess bytes at the end of each frame, which are then skipped. It is safe to set frame-size to a value smaller than the actual frame size (in fact, its default value is 0); in that case, no trailing data is skipped.
If a framerate of 0 Hz is set (for example, 0/1), output buffers have no duration set. The first output buffer has a PTS of 0; all subsequent ones have no PTS set.
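The frame-size and framerate values from the two preceding paragraphs can also be set programmatically. A hedged sketch, assuming frame-size is an unsigned integer property and framerate a GstFraction property:

#include <gst/gst.h>

static void
configure_frame_size_and_rate (GstElement *parse)
{
  GValue rate = G_VALUE_INIT;

  /* 1500-byte frames: any bytes beyond the computed frame size are
   * skipped, as explained above. */
  g_object_set (parse, "frame-size", 1500, NULL);

  /* A 0/1 framerate leaves output buffer durations unset; fraction
   * properties are set through a GValue of GST_TYPE_FRACTION. */
  g_value_init (&rate, GST_TYPE_FRACTION);
  gst_value_set_fraction (&rate, 0, 1);
  g_object_set_property (G_OBJECT (parse), "framerate", &rate);
  g_value_unset (&rate);
}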
gst-launch-1.0 filesrc location=video.raw ! rawvideoparse use-sink-caps=false \
    width=500 height=400 format=y444 ! autovideosink
Read raw data from a local file and parse it as video data with 500x400 pixels and Y444 video format.
gst-launch-1.0 filesrc location=video.raw ! queue ! "video/x-raw, width=320, \
    height=240, format=I420, framerate=1/1" ! rawvideoparse \
    use-sink-caps=true ! autovideosink
Read raw data from a local file and parse it as video data with 320x240 pixels and I420 video format. The queue element here is to force push-based scheduling. See the GstRawBaseParse documentation for the reason why.
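The same pipeline can also be built from application code with gst_parse_launch(). A minimal sketch (the pipeline description is the second example above; error handling and the main loop are trimmed):

#include <gst/gst.h>

int
main (int argc, char **argv)
{
  GstElement *pipeline;
  GError *error = NULL;

  gst_init (&argc, &argv);

  pipeline = gst_parse_launch (
      "filesrc location=video.raw ! queue ! "
      "video/x-raw, width=320, height=240, format=I420, framerate=1/1 ! "
      "rawvideoparse use-sink-caps=true ! autovideosink",
      &error);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    g_clear_error (&error);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  /* ... run a GMainLoop / wait for EOS or an error here ... */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}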
plugin | rawparse
author | Carlos Rafael Giani <dv@pseudoterminal.org>
class | Codec/Parser/Video
name | sink
direction | sink
presence | always
details |
video/x-unaligned-raw, format=(string){ I420, YV12, YUY2, UYVY, AYUV, RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, Y41B, Y42B, YVYU, Y444, v210, v216, NV12, NV21, GRAY8, GRAY16_BE, GRAY16_LE, v308, RGB16, BGR16, RGB15, BGR15, UYVP, A420, RGB8P, YUV9, YVU9, IYU1, ARGB64, AYUV64, r210, I420_10BE, I420_10LE, I422_10BE, I422_10LE, Y444_10BE, Y444_10LE, GBR, GBR_10BE, GBR_10LE, NV16, NV24, NV12_64Z32, A420_10BE, A420_10LE, A422_10BE, A422_10LE, A444_10BE, A444_10LE, NV61, P010_10BE, P010_10LE, IYU2, VYUY, GBRA, GBRA_10BE, GBRA_10LE, GBR_12BE, GBR_12LE, GBRA_12BE, GBRA_12LE, I420_12BE, I420_12LE, I422_12BE, I422_12LE, Y444_12BE, Y444_12LE, GRAY10_LE32, NV12_10LE32, NV16_10LE32 }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ] |
video/x-raw, format=(string){ I420, YV12, YUY2, UYVY, AYUV, RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, Y41B, Y42B, YVYU, Y444, v210, v216, NV12, NV21, GRAY8, GRAY16_BE, GRAY16_LE, v308, RGB16, BGR16, RGB15, BGR15, UYVP, A420, RGB8P, YUV9, YVU9, IYU1, ARGB64, AYUV64, r210, I420_10BE, I420_10LE, I422_10BE, I422_10LE, Y444_10BE, Y444_10LE, GBR, GBR_10BE, GBR_10LE, NV16, NV24, NV12_64Z32, A420_10BE, A420_10LE, A422_10BE, A422_10LE, A444_10BE, A444_10LE, NV61, P010_10BE, P010_10LE, IYU2, VYUY, GBRA, GBRA_10BE, GBRA_10LE, GBR_12BE, GBR_12LE, GBRA_12BE, GBRA_12LE, I420_12BE, I420_12LE, I422_12BE, I422_12LE, Y444_12BE, Y444_12LE, GRAY10_LE32, NV12_10LE32, NV16_10LE32 }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ] |
name | src
direction | source
presence | always
details |
video/x-raw, format=(string){ I420, YV12, YUY2, UYVY, AYUV, RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, Y41B, Y42B, YVYU, Y444, v210, v216, NV12, NV21, GRAY8, GRAY16_BE, GRAY16_LE, v308, RGB16, BGR16, RGB15, BGR15, UYVP, A420, RGB8P, YUV9, YVU9, IYU1, ARGB64, AYUV64, r210, I420_10BE, I420_10LE, I422_10BE, I422_10LE, Y444_10BE, Y444_10LE, GBR, GBR_10BE, GBR_10LE, NV16, NV24, NV12_64Z32, A420_10BE, A420_10LE, A422_10BE, A422_10LE, A444_10BE, A444_10LE, NV61, P010_10BE, P010_10LE, IYU2, VYUY, GBRA, GBRA_10BE, GBRA_10LE, GBR_12BE, GBR_12LE, GBRA_12BE, GBRA_12LE, I420_12BE, I420_12LE, I422_12BE, I422_12LE, Y444_12BE, Y444_12LE, GRAY10_LE32, NV12_10LE32, NV16_10LE32 }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ] |
“channel-positions”
property “channel-positions” GValueArray *
Channel positions used on the output.
Flags: Read / Write
“format”
property “format” GstRawAudioParseFormat
Format of the raw audio stream.
Flags: Read / Write
Default value: PCM
“interleaved”
property “interleaved” gboolean
True if audio has interleaved layout.
Flags: Read / Write
Default value: TRUE
“num-channels”
property “num-channels” gint
Number of channels in raw stream.
Flags: Read / Write
Allowed values: >= 1
Default value: 2
“pcm-format”
property “pcm-format” GstAudioFormat
Format of audio samples in PCM stream (ignored if format property is not set to pcm).
Flags: Read / Write
Default value: GST_AUDIO_FORMAT_UNKNOWN
“sample-rate”
property “sample-rate” gint
Rate of audio samples in raw stream.
Flags: Read / Write
Allowed values: >= 1
Default value: 44100
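The property reference above lists the raw audio parse properties. Purely as an illustration (not taken from this page), such properties could be set from code: the integer and boolean ones via g_object_set(), and the enum-typed ones from string values with gst_util_set_object_arg(). The exact nick strings ("pcm", "s16le") used below are assumptions:

#include <gst/gst.h>

static void
configure_raw_audio_parse (GstElement *parse)
{
  /* Integer and boolean properties as documented above. */
  g_object_set (parse,
      "num-channels", 2,
      "sample-rate", 48000,
      "interleaved", TRUE,
      NULL);

  /* Enum properties set from string values; the nick strings used
   * here are assumptions, not taken from this page. */
  gst_util_set_object_arg (G_OBJECT (parse), "format", "pcm");
  gst_util_set_object_arg (G_OBJECT (parse), "pcm-format", "s16le");
}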