.. _SD.Multimedia.AcceleratedGstreamer:

.. include:: /content/swdocs.rsts

.. spelling::
   actmon
   bitrate
   br
   camerasrc
   cdn
   dec
   DivX
   dsink
   EGLimage
   ep
   gbr
   gcdn
   gep
   gicr
   gmo
   gpcr
   gso
   gst
   gstreamer
   gvcr
   gwb
   intermode
   Intra
   jx
   Kp
   macroblock
   macroblocks
   maxperf
   Nano
   nv
   Nvarguscamera
   nvarguscamerasrc
   nvcompositor
   nvdrmvideosink
   nveglglessink
   nveglstreamsrc
   nvegltransform
   nvgstcam
   nvgstcapture
   nvgstenc
   nvgstplayer
   nvivafilter
   nvjpegdec
   nvjpegenc
   nvoverlaylink
   nvoverlaysink
   nvoverlysink
   nvv
   nvvidconv
   nvvideosink
   omx
   omxh
   omxmpeg
   omxvp
   opencv
   pcr
   perf
   poc
   pre
   src
   th
   Theora
   Transcode
   transcoding
   unsetting
   vbv
   videocuda
   videodec
   videosink
   vp
   wb
   whitebalance
   Xorg
   xvimagesink
   YUV

Accelerated GStreamer
!!!!!!!!!!!!!!!!!!!!!

This topic describes the hardware-accelerated GStreamer solution, based on GStreamer versions 1.0 and 1.14, that is included in |NVIDIA(r)| |Jetson(tm)| Linux.

.. todo::
   "Solution" makes it sound like some kind of application that uses GStreamer. Wouldn't it be more natural to call it "a hardware-accelerated version of GStreamer"?

   As a related issue, is it necessary to say "version 1.0" over and over? It seems simpler just to say "GStreamer." We've already stated which versions Jetson Linux supports (although the situation seems to be much more complex than claimed here, see later comments).

   As a further related issue, the term "GStreamer-1.0" is used repeatedly, but is never explained. It looks like the name of a software element, although using upper case letters in the name of a software element would be unusual. Perhaps it's just another way of writing "GStreamer version 1.0."

.. note:: References to GStreamer version 1.0 apply also to GStreamer version 1.14.

GStreamer-1.0 Installation and Setup
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

This section explains how to install and configure GStreamer.

.. todo::
   There's another section on installing ("building") GStreamer much later, at the start of the GStreamer part of the topic. I plan to consolidate this section with that one.

   I need some information to coordinate them. The procedures described here (install with apt) and there (build with gst-install, or build manually) are completely different, and the reader is given no direction for choosing one.

   The information about versions is also very spotty: above we say that Jetson Linux runs with GStreamer 1.0 or 1.14, but this section doesn't explain how to install 1.14 (assuming it installs 1.0 as given, which isn't clearly stated). The gst-install procedure doesn't say what versions it works with, but has an example with v1.16.2. The manual procedure instructions appear to say that it works only with the latest version, which it calls 1.16.2, but in fact the latest stable version is 1.18.4.

To install GStreamer-1.0
########################

- Enter the commands::

     $ sudo add-apt-repository universe
     $ sudo add-apt-repository multiverse
     $ sudo apt-get update
     $ sudo apt-get install gstreamer1.0-tools gstreamer1.0-alsa \
          gstreamer1.0-plugins-base gstreamer1.0-plugins-good \
          gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly \
          gstreamer1.0-libav
     $ sudo apt-get install libgstreamer1.0-dev \
          libgstreamer-plugins-base1.0-dev \
          libgstreamer-plugins-good1.0-dev \
          libgstreamer-plugins-bad1.0-dev

To check the GStreamer-1.0 version
##################################

- Enter the command::

     $ gst-inspect-1.0 --version

GStreamer-1.0 Plugin Reference
##############################

.. note:: The ``gst-omx`` plugin was deprecated in |NVIDIA(r)| Tegra\ |reg| Linux Driver Package (now Jetson Linux) release 32.1. Use the ``gst-v4l2`` plugin for development.

.. todo::
   I don't think the reader needs to know that ``gst-omx`` was deprecated in r32.1, which will be eight releases old by the time r34.1 is published.

   I understand that we normally remove a feature in the next minor release after it is deprecated. Has that happened yet? If so, we should say it is no longer supported, or just not mention it at all.

   As a related issue, it seems inappropriate to give extensive information about the use of gst-omx when it has long been deprecated if not removed. The whole point of deprecating a feature is to inform the reader that it is not to be used for development. Providing extensive information about how to use it will cause confusion: faced with huge amounts of information about gst-omx throughout the topic, the reader is likely to forget that it is deprecated, or not notice in the first place. Readers who need information about it to maintain existing code can refer to an old version of the document.

GStreamer version 1.0 includes the following ``gst-omx`` video decoders:

================ ==============================
Video decoder    Description
================ ==============================
omxh265dec       OpenMAX IL H.265 video decoder
omxh264dec       OpenMAX IL H.264 video decoder
omxmpeg4videodec OpenMAX IL MPEG4 video decoder
omxmpeg2videodec OpenMAX IL MPEG2 video decoder
omxvp8dec        OpenMAX IL VP8 video decoder
omxvp9dec        OpenMAX IL VP9 video decoder
================ ==============================

GStreamer version 1.0 includes the following ``gst-v4l2`` video decoders:

+---------------+--------------------------+
| Video decoder | Description              |
+===============+==========================+
| nvv4l2decoder | V4L2 H.265 video decoder |
|               +--------------------------+
|               | V4L2 H.264 video decoder |
|               +--------------------------+
|               | V4L2 VP8 video decoder   |
|               +--------------------------+
|               | V4L2 VP9 video decoder   |
|               +--------------------------+
|               | V4L2 MPEG4 video decoder |
|               +--------------------------+
|               | V4L2 MPEG2 video decoder |
+---------------+--------------------------+

GStreamer version 1.0 includes the following ``gst-omx`` video encoders:

+----------------+----------------------------------------------------+
| Video encoder  | Description                                        |
+================+====================================================+
| omxh264enc     | OpenMAX IL H.264/AVC video encoder                 |
+----------------+----------------------------------------------------+
| omxh265enc     | OpenMAX IL H.265/HEVC video encoder                |
+----------------+----------------------------------------------------+
| omxvp9enc      | OpenMAX IL VP9 video encoder (supported on         |
|                | |NVIDIA(r)| |Jetson AGX Xavier(tm)| series)        |
|                |                                                    |
|                | .. todo::                                          |
|                |    The r32.6 doc did not say whether this is       |
|                |    supported on Jetson Xavier NX. Or, of course,   |
|                |    whether it is supported on Orin.                |
|                |                                                    |
|                |    I have noted Orin support issues in other       |
|                |    places, but not necessarily in every affected   |
|                |    place. A PIC needs to review the entire topic   |
|                |    for places where platform dependencies need     |
|                |    updating. That may mean adding "...and Jetson   |
|                |    Orin" to a list of supported platforms, removing|
|                |    a list because all platforms supported by r34.1 |
|                |    are supported by a feature, or adding a list    |
|                |    because I mistakenly removed one.               |
|                |                                                    |
+----------------+----------------------------------------------------+

GStreamer version 1.0 includes the following ``gst-v4l2`` video encoders:

+----------------+----------------------------------------------------+
| Video encoder  | Description                                        |
+================+====================================================+
| nvv4l2h264enc  | V4L2 H.264 video encoder                           |
+----------------+----------------------------------------------------+
| nvv4l2h265enc  | V4L2 H.265 video encoder                           |
+----------------+----------------------------------------------------+
| nvv4l2vp9enc   | V4L2 VP9 video encoder (supported with |NVIDIA(r)| |
|                | |Jetson Xavier(tm) NX| series and Jetson AGX Xavier|
|                | series only)                                       |
|                |                                                    |
|                | .. todo::                                          |
|                |    Are they supported on Orin? If so, the          |
|                |    limitation is no longer needed.                 |
|                |                                                    |
+----------------+----------------------------------------------------+

GStreamer version 1.0 includes the following ``gst-omx`` video sink:

============= ===============================
Video sink    Description
============= ===============================
nvoverlaysink OpenMAX IL videosink element
============= ===============================

GStreamer version 1.0 includes the following EGL\ |tm| image video sink:

.. todo:: This topic and others refer often to "EGLimage." Which form is correct or preferred?

+---------------+-----------------------------------------------------+
| Video sink    | Description                                         |
+===============+=====================================================+
| nveglglessink | EGL/GLES videosink element; supports both the X11   |
|               | and Wayland backends                                |
+---------------+-----------------------------------------------------+
| nv3dsink      | EGL/GLES videosink element                          |
+---------------+-----------------------------------------------------+

.. todo:: In the preceding table and others, the heading refers to "Video sink" and the body to "videosink." Which is preferred?

GStreamer version 1.0 includes the following DRM video sink:

============== =====================
Video sink     Description
============== =====================
nvdrmvideosink DRM videosink element
============== =====================

.. note:: The ``nvoverlaysink`` plugin was deprecated in Tegra Linux Driver Package (now Jetson Linux) release 32.1. For development, use ``nvdrmvideosink`` or ``nv3dsink`` to render pipelines that use the ``gst-v4l2`` decoder.

.. todo::
   The comments about gst-omx apply also to nvoverlaysink, since they were deprecated at the same time. Later several examples use nvoverlaysink, and need to be updated or deleted.

   I note that nvoverlaysink was introduced three tables above, then we passed onward to the EGL image sink and the DRM image sink; only then do we say, "Oh, by the way, nvoverlaysink was deprecated eight releases ago, so don't use it!" Even if there is still reason to mention it, we should organize the information better than this.

GStreamer version 1.0 includes the following proprietary NVIDIA plugins:

+---------------------------+-----------------------------------------+
| NVIDIA proprietary plugin | Description                             |
+===========================+=========================================+
| nvarguscamerasrc          | Camera plugin for ARGUS API             |
|                           |                                         |
|                           | .. todo::                               |
|                           |    ARGUS is used in all caps in several |
|                           |    places, something I have not seen    |
|                           |    anywhere else. Is there a reason why |
|                           |    it's used that way here?             |
|                           |                                         |
+---------------------------+-----------------------------------------+
| nvv4l2camerasrc           | Camera plugin for V4L2 API              |
+---------------------------+-----------------------------------------+
| nvvidconv                 | Video format conversion and scaling     |
+---------------------------+-----------------------------------------+
| nvcompositor              | Video compositor                        |
+---------------------------+-----------------------------------------+
| nveglstreamsrc            | Acts as GStreamer Source Component,     |
|                           | accepts EGLStream from EGLStream        |
|                           | producer                                |
+---------------------------+-----------------------------------------+
| nvvideosink               | Video Sink Component. Accepts YUV-I420  |
|                           | format and produces EGLStream (RGBA)    |
+---------------------------+-----------------------------------------+
| nvegltransform            | Video transform element for NVMM to     |
|                           | EGLimage (supported with nveglglessink  |
|                           | only)                                   |
+---------------------------+-----------------------------------------+
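
The ``nvvidconv`` element listed above appears throughout the examples in this topic wherever a pipeline needs format conversion or scaling. The fragment below is an illustrative sketch (not taken from the Jetson Linux examples; the variable names are assumptions) that assembles a conversion pipeline turning CPU-memory I420 frames from ``videotestsrc`` into NVMM NV12 surfaces::

     # Build a format-conversion pipeline description in stages.
     # Element names come from the plugin table above.
     CAPS_IN='video/x-raw, format=(string)I420, width=(int)640, height=(int)480'
     CAPS_OUT='video/x-raw(memory:NVMM), format=(string)NV12'
     PIPELINE="videotestsrc num-buffers=60 ! $CAPS_IN ! nvvidconv ! $CAPS_OUT ! fakesink"
     echo "$PIPELINE"
     # On a Jetson board, run it with:
     #    gst-launch-1.0 videotestsrc num-buffers=60 ! "$CAPS_IN" ! nvvidconv ! \
     #         "$CAPS_OUT" ! fakesink -e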

GStreamer version 1.0 includes the following ``libjpeg``\ -based JPEG image
video encode/decode plugins:

========= ====================
JPEG      Description
========= ====================
nvjpegenc JPEG encoder element
nvjpegdec JPEG decoder element
========= ====================

.. note::
   Enter this command before starting the video decode pipeline using ``gst-launch`` or ``nvgstplayer``::

      $ export DISPLAY=:0

   Enter this command to start X server if it is not already running::

      $ xinit &
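
Taken together, the display setup steps in the note above can be scripted. This is a sketch only: the ``xset q`` probe for a running X server is an assumption about your setup, not part of the Jetson Linux instructions::

     # Point GStreamer video sinks at the local display.
     export DISPLAY=:0
     # Start X only if no server is already answering on this display.
     if ! xset q >/dev/null 2>&1; then
         xinit >/dev/null 2>&1 &
     fi
     echo "DISPLAY=$DISPLAY"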

Decode Examples
@@@@@@@@@@@@@@@

The examples in this section show how you can perform audio and video
decode with GStreamer.

.. note:: GStreamer version 0.10 support is deprecated in Tegra Linux Driver Package (now Jetson Linux) Release 24.2. Use of GStreamer version 1.0 is recommended for development.

.. todo:: Release 24.2 is *24 releases old*. Is it really necessary to caution the reader against using components that far out of date? (If it is, this is the wrong place to do it.)

Audio Decode Examples Using gst-launch-1.0
##########################################

The following examples show how you can perform audio decode using
GStreamer-1.0.

- AAC Decode (OSS Software Decode)::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
        qtdemux name=demux demux.audio_0 ! \
        queue ! avdec_aac ! audioconvert ! alsasink -e

  .. todo::
     PDF documents have a fixed column width, so we customarily continue commands after a maximum of 80 characters to avoid line wrapping. HTML documents allow the reader to avoid line wrapping by temporarily widening the browser window, so we customarily continue commands only if they are very long (e.g. over 120 characters). Can we observe that convention and consolidate each command on a longer line?

- AMR-WB Decode (OSS Software Decode)::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux name=demux demux.audio_0 ! queue ! avdec_amrwb ! \
          audioconvert ! alsasink -e

- AMR-NB Decode (OSS Software Decode)::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux name=demux demux.audio_0 ! queue ! avdec_amrnb ! \
          audioconvert ! alsasink -e

- MP3 Decode (OSS Software Decode)::

     $ gst-launch-1.0 filesrc location=<filename.mp3> ! mpegaudioparse ! \
          avdec_mp3 ! audioconvert ! alsasink -e

  .. note::
     To route audio over |HDMI(r)|, set the ``alsasink`` property ``device`` to the value given for your platform in the table
     :ref:`Port to device ID map <SD.Communications.AudioSetupAndDevelopment-PortToDeviceIdMap>`
     in the topic
     :ref:`Audio Setup and Development <SD.Communications.AudioSetupAndDevelopment>`.

     For example, use ``device=hw:0,7`` to route audio over the Jetson TX2 HDMI/DP 1 (HDMI) port.

     .. todo::
        Replace outdated TX2 example. I tried to do this but I don't understand how to read the table. There is no entry for TX2 which contains values 0 and 7. Perhaps 0 is a fixed value rather than a table value, or the table is wrong.

Video Decode Examples Using gst-launch-1.0
##########################################

The following examples show how you can perform video decode on
GStreamer-1.0.

Video Decode Using gst-omx
##########################

The following examples show how you can perform video decode using the ``gst-omx`` plugin on GStreamer-1.0.

- H.264 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux name=demux demux.video_0 ! queue ! h264parse ! omxh264dec ! \
          nveglglessink -e

- H.265 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux name=demux demux.video_0 ! queue ! h265parse ! omxh265dec ! \
          nvoverlaysink -e

- 10-bit H.265 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_10bit.mkv> ! \
          matroskademux ! h265parse ! omxh265dec ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)NV12' ! \
          nvoverlaysink -e

- 12-bit H.265 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_12bit.mkv> ! \
          matroskademux ! h265parse ! omxh265dec ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)NV12' ! \
          nvoverlaysink -e

  .. note:: For decode use cases with low memory allocation requirements (for example, on Jetson Nano), use the ``enable-low-outbuffer`` property of the ``gst-omx`` decoder plugin.

     .. todo::
        Replace outdated reference to Nano (or delete the note because it refers to a long-deprecated plugin!).

        The next note also refers to gst-omx.

  For example::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux ! h265parse ! omxh265dec enable-low-outbuffer=1 ! \
          'video/x-raw(memory:NVMM), format=(string)NV12' ! fakesink sync=1 -e

  .. note::
     To enable max perf mode, use the ``disable-dvfs`` property of the ``gst-omx`` decoder plugin. Expect increased power consumption in max perf mode.

  For example::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux ! h265parse ! omxh265dec disable-dvfs=1 ! \
          'video/x-raw(memory:NVMM), format=(string)NV12' ! fakesink sync=1 -e

- VP8 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux name=demux demux.video_0 ! queue ! omxvp8dec ! \
          nvoverlaysink -e

  .. note::
     If the primary display is **not** used to render video, use the ``display-id`` property of ``nvoverlaysink``.

  For example::

      $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
           qtdemux name=demux demux.video_0 ! queue ! omxvp8dec ! \
           nvoverlaysink display-id=1 -e

- VP9 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          matroskademux name=demux demux.video_0 ! queue ! omxvp9dec ! \
          nvoverlaysink display-id=1 -e

- MPEG-4 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux name=demux demux.video_0 ! queue ! mpeg4videoparse ! \
          omxmpeg4videodec ! nveglglessink -e

- MPEG-2 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename.ts> ! \
          tsdemux name=demux demux.video_0 ! queue ! mpegvideoparse ! \
          omxmpeg2videodec ! nveglglessink -e

Video Decode Using gst-v4l2
###########################

The following examples show how you can perform video decode using
the ``gst-v4l2`` plugin on GStreamer-1.0.


- H.264 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_h264.mp4> ! \
          qtdemux ! queue ! h264parse ! nvv4l2decoder ! nv3dsink -e

  .. note::
     To enable max perf mode, use the ``enable-max-performance`` property of the ``gst-v4l2`` decoder plugin. Expect increased power consumption in max perf mode.

  For example::

     $ gst-launch-1.0 filesrc location=<filename_h264.mp4> ! \
          qtdemux ! queue ! h264parse ! nvv4l2decoder \
          enable-max-performance=1 ! nv3dsink -e

  .. note::
     To decode H.264/H.265 GDR streams you must enable error reporting by setting the property ``enable-frame-type-reporting`` to ``true``.

  For example::

      $ gst-launch-1.0 filesrc \
           location=<filename_h264.mp4> ! \
           qtdemux ! queue ! h264parse ! nvv4l2decoder \
           enable-frame-type-reporting=1 ! nv3dsink -e

- H.265 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_h265.mp4> ! \
          qtdemux ! queue ! h265parse ! nvv4l2decoder ! nv3dsink -e

- 10-bit H.265 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_10bit.mkv> ! \
          matroskademux ! queue ! h265parse ! nvv4l2decoder ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)NV12' ! \
          nv3dsink -e

- 12-bit H.265 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_12bit.mkv> ! \
          matroskademux ! queue ! h265parse ! nvv4l2decoder ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)NV12' ! \
          nv3dsink -e

- 8-bit YUV444 (NV24) H.265 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_8bit_YUV444.265> ! \
          h265parse ! nvv4l2decoder ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)NV12' ! \
          nv3dsink -e

- VP9 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_vp9.mkv> ! \
          matroskademux ! queue ! nvv4l2decoder ! nv3dsink -e

- VP8 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_vp8.mkv> ! \
          matroskademux ! queue ! nvv4l2decoder ! nv3dsink -e

- MPEG-4 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_mpeg4.mp4> ! \
          qtdemux ! queue ! mpeg4videoparse ! nvv4l2decoder ! nv3dsink -e

- MPEG-4 Decode DivX 4/5 (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_divx.avi> ! \
          avidemux ! queue ! mpeg4videoparse ! nvv4l2decoder ! nv3dsink -e

- MPEG-2 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename_mpeg2.ts> ! \
          tsdemux ! queue ! mpegvideoparse ! nvv4l2decoder ! nv3dsink -e
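
All of the ``gst-v4l2`` decode pipelines above share the same shape: a file source, a container demuxer, a queue, a stream parser, ``nvv4l2decoder``, and a sink. The sketch below assembles the H.264 variant of that description; the variable names and the concrete file name are illustrative, not part of Jetson Linux::

     # Stage the common decode shape: source ! demux ! queue ! parse ! decode ! sink
     SRC='filesrc location=video_h264.mp4'
     FRONTEND='qtdemux ! queue ! h264parse'
     PIPELINE="$SRC ! $FRONTEND ! nvv4l2decoder ! nv3dsink"
     echo "$PIPELINE"
     # On a Jetson board, run it with:
     #    gst-launch-1.0 $PIPELINE -e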

Image Decode Examples Using gst-launch-1.0
$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$

The following example shows how you can perform JPEG decode on GStreamer-1.0.

- JPEG Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location=<filename.jpg> ! nvjpegdec ! \
          imagefreeze ! xvimagesink -e

Encode Examples
@@@@@@@@@@@@@@@

The examples in this section show how you can perform audio and video
encode with GStreamer.

Audio Encode Examples Using gst-launch-1.0
##########################################

The following examples show how you can perform audio encode on
GStreamer-1.0.

- AAC Encode (OSS Software Encode)::

     $ gst-launch-1.0 audiotestsrc ! \
          'audio/x-raw, format=(string)S16LE,
          layout=(string)interleaved, rate=(int)44100, channels=(int)2' ! \
          voaacenc ! qtmux ! filesink location=test.mp4 -e

- AMR-WB Encode (OSS Software Encode)::

     $ gst-launch-1.0 audiotestsrc ! \
          'audio/x-raw, format=(string)S16LE, layout=(string)interleaved, \
          rate=(int)16000, channels=(int)1' ! voamrwbenc ! qtmux ! \
          filesink location=test.mp4 -e

Video Encode Examples Using gst-launch-1.0
##########################################

The following examples show how you can perform video encode with
GStreamer-1.0.

Video Encode Using gst-omx
$$$$$$$$$$$$$$$$$$$$$$$$$$

The following examples show how you can perform video encode using the
``gst-omx`` plugin with GStreamer-1.0.

- H.264 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 videotestsrc ! \
          'video/x-raw, format=(string)I420, width=(int)640, \
          height=(int)480' ! omxh264enc ! \
          'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! \
          qtmux ! filesink location=test.mp4 -e

  .. note::
     Jetson AGX Xavier series can support 8Kp30 H.265 encode. For example::

        $ gst-launch-1.0 nvarguscamerasrc ! \
             'video/x-raw(memory:NVMM), width=(int)3840, \
             height=(int)2160, format=(string)NV12, \
             framerate=(fraction)30/1' ! nvvidconv ! \
             'video/x-raw(memory:NVMM), width=(int)7680, \
             height=(int)4320, format=(string)NV12' ! nvv4l2h265enc \
             preset-level=1 control-rate=1 bitrate=40000000 ! \
             h265parse ! matroskamux ! \
             filesink location=<filename_8k_h265.mkv> -e

- H.265 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 videotestsrc ! \
          'video/x-raw, format=(string)I420, width=(int)640, \
          height=(int)480' ! omxh265enc ! filesink location=test.h265 -e

- 10-bit H.265 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
          format=(string)NV12, framerate=(fraction)30/1' ! \
          nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420_10LE' ! \
          omxh265enc ! matroskamux ! filesink location=test_10bit.mkv -e

- VP8 Encode (NVIDIA Accelerated, Supported with Jetson TX2/TX2i and Jetson Nano)::

     $ gst-launch-1.0 videotestsrc ! \
          'video/x-raw, format=(string)I420, width=(int)640, \
          height=(int)480' ! omxvp8enc ! matroskamux ! \
          filesink location=test.mkv -e

  .. todo::
     Above is one of several examples that refer to omxvp8enc, which must be dropped in r34 because it runs only on devices that are no longer supported. (See the initial reference to omxvp8enc in the r32.6 documentation.) I presume that all of these should be deleted. I don't know whether they need to be replaced by anything else.

     I deleted some earlier references to VP8 before writing this note. If VP8 is to be retained, I'll check for omissions.

- VP9 Encode (NVIDIA Accelerated, Supported on Jetson AGX Xavier series)::

     $ gst-launch-1.0 videotestsrc ! \
          'video/x-raw, format=(string)I420, width=(int)640, \
          height=(int)480' ! omxvp9enc ! matroskamux ! \
          filesink location=test.mkv -e

  .. todo:: Presumably also Jetson Xavier NX, and perhaps Jetson Orin.

- MPEG-4 Encode (OSS Software Encode)::

     $ gst-launch-1.0 videotestsrc ! \
          'video/x-raw, format=(string)I420, width=(int)640, \
          height=(int)480' ! avenc_mpeg4 ! qtmux ! \
          filesink location=test.mp4 -e

- H.263 Encode (OSS Software Encode)::

     $ gst-launch-1.0 videotestsrc ! \
          'video/x-raw, format=(string)I420, width=(int)704, \
          height=(int)576' ! avenc_h263 ! qtmux ! filesink location=test.mp4 -e

Video Encode Using gst-v4l2
$$$$$$$$$$$$$$$$$$$$$$$$$$$

The following examples show how you can perform video encode using the
``gst-v4l2`` plugin with GStreamer-1.0.

- H.264 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
          format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2h264enc \
          bitrate=8000000 ! h264parse ! qtmux ! filesink \
          location=<filename_h264.mp4> -e

  .. note::
     To enable max perf mode, use the maxperf-enable property of the ``gst-v4l2`` encoder plugin. Expect increased power consumption in max perf mode.

  For example::

      $ gst-launch-1.0 nvarguscamerasrc ! \
           'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
           format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2h264enc \
           maxperf-enable=1 bitrate=8000000 ! h264parse ! qtmux ! filesink \
           location=<filename_h264.mp4> -e

- 8-bit YUV444 (NV24) H.264 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 filesrc location=<filename_nv24_352_288.yuv> ! \
       videoparse width=352 height=288 format=52 framerate=30 ! \
       'video/x-raw, format=(string)NV24' ! nvvidconv ! \
       'video/x-raw(memory:NVMM), format=(string)NV24' ! nvv4l2h264enc \
       profile=High444 ! h264parse ! filesink \
       location=<filename_8bit_nv24.264> -e

  .. note:: 8-bit YUV444 H.264 encode is supported with High444 profile.

- H.265 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
          format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2h265enc \
          bitrate=8000000 ! h265parse ! qtmux ! filesink \
          location=<filename_h265.mp4> -e

.. todo:: These examples' line divisions passed without comment in r32.6.1, and probably several earlier releases, but I wonder whether a continuation character inside a quoted string is really legal. The Gnu Bash shell documentation says no: "If a \\newline pair appears, *and the backslash itself is not quoted*, the \\newline is treated as a line continuation." It also says, "A single quote may not occur between single quotes, even when preceded by a backslash," implying that a backslash inside single quotes is not treated as an escape character at all.

- 10-bit H.265 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
          format=(string)NV12, framerate=(fraction)30/1' ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)P010_10LE' ! \
          nvv4l2h265enc bitrate=8000000 ! h265parse ! qtmux ! \
          filesink location=<filename_10bit_h265.mp4> -e

- 8-bit YUV444 (NV24) H.265 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 filesrc location=<filename_nv24_352_288.yuv> ! \
       videoparse width=352 height=288 format=52 framerate=30 ! \
       'video/x-raw, format=(string)NV24' ! nvvidconv ! \
       'video/x-raw(memory:NVMM), format=(string)NV24' ! nvv4l2h265enc \
       profile=Main ! h265parse ! filesink location=<filename_8bit_nv24.265> -e

  .. note:: 8-bit YUV444 H.265 encode is supported with Main profile.

- VP9 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
          format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2vp9enc \
          bitrate=8000000 ! matroskamux ! filesink \
          location=<filename_vp9.mkv> -e

  .. note::
     Jetson Orin does not support VP9 encode using gst-v4l2.

- VP9 Encode with IVF Headers (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! \
          nvv4l2vp9enc enable-headers=1 bitrate=8000000 ! \
          filesink location=<filename_vp9.vp9> -e

- VP8 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! \
          nvv4l2vp8enc bitrate=8000000 ! matroskamux ! \
          filesink location=<filename_vp8.mkv> -e

- VP8 Encode with IVF Headers (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! \
          nvv4l2vp8enc enable-headers=1 bitrate=8000000 ! \
          filesink location=<filename_vp8.vp8> -e

Image Encode Examples Using gst-launch-1.0
##########################################

The following example shows how you can perform JPEG encode using GStreamer-1.0.

- Image Encode::

     $ gst-launch-1.0 videotestsrc num-buffers=1 ! \
          'video/x-raw, width=(int)640, height=(int)480, format=(string)I420' ! \
          nvjpegenc ! filesink location=test.jpg -e

Supported H.264/H.265/VP8/VP9 Encoder Features with GStreamer-1.0
#################################################################

This section describes example ``gst-launch-1.0`` usage for features
supported by the NVIDIA accelerated H.264/H.265/VP8/VP9 encoders.

Features Supported Using gst-omx
$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$

This section describes example ``gst-launch-1.0`` usage for features
supported by the NVIDIA accelerated H.264/H.265/VP8/VP9 ``gst-omx``
encoders.

.. note::
   Display detailed information about the ``omxh264enc``, ``omxh265enc``, ``omxvp8enc``, or ``omxvp9enc`` encoder properties with the command::

      $ gst-inspect-1.0 [omxh264enc | omxh265enc | omxvp8enc | omxvp9enc]

- Set the I-Frame interval::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc iframeinterval=100 ! qtmux ! \
          filesink location=test.mp4 -e

- Set temporal tradeoff (rate at which the encoder should drop frames)::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc temporal-tradeoff=1 ! qtmux ! \
          filesink location=test.mp4 -e

  Configuring temporal tradeoff causes the encoder to intentionally drop input frames at regular intervals. The following modes are supported:

  - 0: Disable.
  - 1: Drop 1 in 5 frames.
  - 2: Drop 1 in 3 frames.
  - 3: Drop 1 in 2 frames.
  - 4: Drop 2 in 3 frames.

- Set rate control mode::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc control-rate=1 ! qtmux ! \
          filesink location=test.mp4 -e

  The following modes are supported:

  - 0: Disable.
  - 1: Variable bit rate.
  - 2: Constant bit rate.
  - 3: Variable bit rate with frame skip. The encoder skips frames as necessary to meet the target bit rate.
  - 4: Constant bit rate with frame skip.

- Set peak bitrate::

     $ gst-launch-1.0 videotestsrc num-buffers=200 is-live=true ! \
          'video/x-raw,width=1280,height=720,format=I420' ! \
          omxh264enc bitrate=6000000 peak-bitrate=6500000 ! qtmux ! \
          filesink location=test.mp4 -e

  Peak bitrate takes effect only in variable bit rate mode (``control-rate=1``). By default, the value is configured as (1.2\ |times|\ bitrate).
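
  As a quick check of that default (illustrative Python, not part of GStreamer or the Jetson tooling; the function name is hypothetical), a bitrate of 6000000 gives a default peak bitrate of 7200000:

  .. code-block:: python

     # Default peak bitrate is 1.2 x bitrate. Integer arithmetic avoids
     # float rounding, since 1.2 is not exactly representable in binary.
     def default_peak_bitrate(bitrate_bps):
         return bitrate_bps * 12 // 10

     print(default_peak_bitrate(6000000))  # 7200000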

- Set quantization range for I, P, and B frames: The format for the range is::

     "<I_range>:<P_range>:<B_range>"

  Where ``<I_range>``, ``<P_range>``, and ``<B_range>`` are each expressed as a ``<min>,<max>`` pair, as in this example::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc qp-range="10,30:10,35:10,35" ! qtmux ! \
          filesink location=test.mp4 -e


  The range of B frames does not take effect if the number of B frames is 0.
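
  As an illustration of the property format only (Python, not part of GStreamer or the Jetson tooling; the helper is hypothetical), a small parser that splits a ``qp-range`` value into per-frame-type pairs, assuming the H.264/H.265 QP range of 0 through 51:

  .. code-block:: python

     # Hypothetical helper: parses "<I_min>,<I_max>:<P_min>,<P_max>:<B_min>,<B_max>".
     def parse_qp_range(qp_range):
         """Split a qp-range property value into per-frame-type (min, max) pairs."""
         parts = qp_range.split(":")
         if len(parts) != 3:
             raise ValueError("expected three colon-separated ranges: I:P:B")
         result = {}
         for frame_type, part in zip(("I", "P", "B"), parts):
             lo, hi = (int(v) for v in part.split(","))
             if not 0 <= lo <= hi <= 51:  # QP values lie in 0..51
                 raise ValueError("invalid QP range for %s frames: %s" % (frame_type, part))
             result[frame_type] = (lo, hi)
         return result

     print(parse_qp_range("10,30:10,35:10,35"))
     # {'I': (10, 30), 'P': (10, 35), 'B': (10, 35)}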

- Set hardware preset level::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc preset-level=0 ! qtmux ! \
          filesink location=test.mp4 -e

  The following modes are supported:

  - 0: **UltraFastPreset**.
  - 1: **FastPreset**: Only integer pixel (``integer-pel``) block motion is estimated. For I/P macroblock mode decision, only Intra 16\ |times|\ 16 cost is compared with Inter modes costs. Supports Intra 16\ |times|\ 16 and Intra 4\ |times|\ 4 modes.
  - 2: **MediumPreset**: Supports up to half pixel (``half-pel``) block motion estimation. For an I/P macroblock mode decision, only Intra 16\ |times|\ 16 cost is compared with Inter modes costs. Supports Intra 16\ |times|\ 16 and Intra 4\ |times|\ 4 modes.
  - 3: **SlowPreset**: Supports up to quarter pixel (``Qpel``) block motion estimation. For an I/P macroblock mode decision, Intra 4\ |times|\ 4 as well as Intra 16\ |times|\ 16 cost is compared with Inter modes costs. Supports Intra 16\ |times|\ 16 and Intra 4\ |times|\ 4 modes.

- Set profile::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc profile=8 ! qtmux ! \
          filesink location=test.mp4 -e

  From ``omxh264enc``, the following profiles are supported:

  - 1: Baseline profile.
  - 2: Main profile.
  - 8: High profile.

- Set level::

     $ gst-launch-1.0 videotestsrc num-buffers=200 is-live=true ! \
          'video/x-raw, format=(string)I420, width=(int)256, height=(int)256, framerate=(fraction)30/1' ! \
          omxh264enc bitrate=40000 ! \
          'video/x-h264, level=(string)2.2' ! qtmux ! \
          filesink location=test.mp4 -e

  The following levels are supported:

  - From ``omxh264enc``:

    ``1``, ``1b``, ``1.2``, ``1.3``, ``2``, ``2.1``, ``2.2``, ``3``, ``3.1``, ``3.2``, ``4``, ``4.1``, ``4.2``, ``5``, ``5.1``, and ``5.2``.

  - From ``omxh265enc``:

    ``main1``, ``main2``, ``main2.1``, ``main3``, ``main3.1``, ``main4``, ``main4.1``, ``main5``, ``high1``, ``high2``, ``high2.1``, ``high3``, ``high3.1``, ``high4``, ``high4.1``, and ``high5``.

- Set number of B frames between two reference frames::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc num-B-Frames=2 ! qtmux ! \
          filesink location=test.mp4 -e

  .. note:: B-frame-encoding is not supported with ``omxh265enc``.

- Insert SPS and PPS at IDR::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc insert-sps-pps=1 ! qtmux ! \
          filesink location=test.mp4 -e

  If enabled, a sequence parameter set (SPS) and a picture parameter set (PPS) are inserted before each IDR frame in the H.264/H.265 stream.

- Enable two-pass CBR::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc EnableTwopassCBR=1 control-rate=2 ! qtmux ! \
          filesink location=test.mp4 -e

  Two-pass CBR must be enabled along with constant bit rate (``control-rate=2``).

- Set virtual buffer size::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc vbv-size=10 ! qtmux ! \
          filesink location=test.mp4 -e

  If the decoder's buffer size or the network bandwidth is limited, configure the virtual buffer size so that the generated video stream conforms to those limits, according to the following formula:

     virtual buffer size = vbv-size |times| (bitrate/fps)
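
  As a quick worked example of this formula (illustrative Python, not part of GStreamer or the Jetson tooling; the function name is hypothetical), with ``vbv-size=10`` at 6 Mbit/s and 30 frames per second, the virtual buffer holds ten frames' worth of bits at the target rate:

  .. code-block:: python

     # Worked example of the formula above; names are illustrative only.
     def virtual_buffer_size_bits(vbv_size, bitrate_bps, fps):
         """virtual buffer size = vbv-size * (bitrate / fps)"""
         return vbv_size * bitrate_bps // fps

     print(virtual_buffer_size_bits(10, 6000000, 30))  # 2000000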

- Enable stringent bitrate::

     $ gst-launch-1.0 nvarguscamerasrc num-buffers=200 ! \
          'video/x-raw(memory:NVMM),width=1920,height=1080,format=(string)NV12' ! \
          omxh264enc control-rate=2 vbv-size=1 EnableTwopassCBR=true \
          EnableStringentBitrate=true ! qtmux ! filesink location=test.mp4 -e

  Stringent bitrate must be enabled along with constant bit rate (``control-rate=2``), two-pass CBR, and a virtual buffer size.

- Slice header spacing in terms of macroblocks::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc slice-header-spacing=200 bit-packetization=0 ! \
          qtmux ! filesink location=test.mp4 -e

  The parameter ``bit-packetization=0`` configures the network abstraction layer (NAL) packet as macroblock-based, and ``slice-header-spacing=200`` configures each NAL packet as 200\ |nbsp|\ macroblocks maximum.

- Slice header spacing in terms of number of bits::

     $ gst-launch-1.0 videotestsrc num-buffers=200 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420' ! \
          omxh264enc slice-header-spacing=1024 bit-packetization=1 ! \
          qtmux ! filesink location=test1.mp4 -e

  The parameter ``bit-packetization=1`` configures the network abstraction layer (NAL) packet as size-based, and ``slice-header-spacing=1024`` configures each NAL packet as 1024 bytes maximum.

Features Supported Using gst-v4l2
$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$

This section describes example ``gst-launch-1.0`` usage for features supported by the NVIDIA accelerated H.264/H.265/VP8/VP9 ``gst-v4l2`` encoders.

.. note::
   Display detailed information on the ``nvv4l2h264enc``, ``nvv4l2h265enc``, ``nvv4l2vp8enc``, or ``nvv4l2vp9enc`` encoder properties with the command::

      $ gst-inspect-1.0 [nvv4l2h264enc | nvv4l2h265enc | nvv4l2vp8enc | nvv4l2vp9enc]

- Set I-frame interval (supported with H.264/H.265/VP9 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc iframeinterval=100 ! h264parse ! qtmux ! \
          filesink location=<filename_h264.mp4> -e

  This property sets the frequency at which the encoder inserts intra (I) frames.

- Set rate control mode and bitrate (supported with H.264/H.265/VP9 encode):

  The supported modes are 0 (variable bit rate, or VBR) and 1 (constant bit rate, or CBR).

  - Set variable bitrate mode::

       $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
            'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
            nvvidconv ! \
            'video/x-raw(memory:NVMM), format=(string)I420' ! \
            nvv4l2h264enc control-rate=0 bitrate=30000000 ! h264parse ! qtmux ! \
            filesink location=<filename_h264_VBR.mp4> -e

  - Set constant bitrate mode::

       $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
            'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
            nvvidconv ! \
            'video/x-raw(memory:NVMM), format=(string)I420' ! \
            nvv4l2h264enc control-rate=1 bitrate=30000000 ! h264parse ! qtmux ! \
            filesink location=<filename_h264_CBR.mp4> -e

- Set quantization range for I, P and B frame (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc ratecontrol-enable=0 quant-i-frames=30 \
          quant-p-frames=30 quant-b-frames=30 num-B-Frames=1 ! \
          filesink location=<filename_h264.264> -e

  The range of B frames does not take effect if the number of B frames is 0.

- Set hardware preset level (supported with H.264/H.265/VP9 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc preset-level=4 MeasureEncoderLatency=1 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  The following modes are supported:

  - 0: **DisablePreset**.
  - 1: **UltraFastPreset**.
  - 2: **FastPreset**: Only integer pixel (``integer-pel``)
    block motion is estimated. For I/P macroblock mode decisions, only Intra 16\ |times|\ 16 cost is compared with intermode costs. Supports Intra 16\ |times|\ 16 and Intra 4\ |times|\ 4 modes.
  - 3: **MediumPreset**: Supports up to half pixel (``half-pel``)
    block motion estimation. For I/P macroblock mode decisions, only Intra 16\ |times|\ 16 cost is compared with intermode costs. Supports Intra 16\ |times|\ 16 and Intra 4\ |times|\ 4 modes.
  - 4: **SlowPreset**: Supports up to quarter pixel (``Qpel``)
    block motion estimation. For I/P macroblock mode decisions, Intra 4\ |times|\ 4 as well as Intra 16\ |times|\ 16 cost is compared with intermode costs. Supports Intra 16\ |times|\ 16 and Intra 4\ |times|\ 4 modes.

- Set profile (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc profile=0 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  The following profiles are supported for H.264 encode:

  - 0: Baseline profile
  - 2: Main profile
  - 4: High profile

  The following profiles are supported for H.265 encode:

  - 0: Main profile
  - 1: Main10 profile

- Insert SPS and PPS at IDR (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc insert-sps-pps=1 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  If enabled, a sequence parameter set (SPS) and a picture parameter set (PPS) are inserted before each IDR frame in the H.264/H.265 stream.

- Enable two-pass CBR (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc control-rate=1 bitrate=10000000 EnableTwopassCBR=1 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  Two-pass CBR must be enabled along with constant bit rate (``control-rate=1``).

  .. note:: For multi-instance encode with two-pass CBR enabled, enable max perf mode by using the ``maxperf-enable`` property of the ``gst-v4l2`` encoder to achieve best performance. Expect increased power consumption in max perf mode.

- Slice header spacing in terms of macroblocks (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc slice-header-spacing=8 bit-packetization=0 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  The parameter ``bit-packetization=0`` configures the network abstraction layer (NAL) packet as macroblock (MB)-based, and ``slice-header-spacing=8`` configures each NAL packet as 8\ |nbsp|\ macroblocks maximum.


- Slice header spacing in terms of number of bits (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc slice-header-spacing=1400 bit-packetization=1 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  The parameter ``bit-packetization=1`` configures the network abstraction layer (NAL) packet as size-based, and ``slice-header-spacing=1400`` configures each NAL packet as 1400\ |nbsp|\ bytes maximum.

- Enable CABAC-entropy-coding (supported with H.264 encode for main or high profile)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc profile=2 cabac-entropy-coding=1 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  The following entropy coding types are supported:

  - 0: CAVLC
  - 1: CABAC

- Set number of B frames between two reference frames (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc num-B-Frames=1 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  This property sets the number of B frames between two reference frames.

  .. note:: For multi-instance encode with ``num-B-Frames=2``, enable max perf mode by specifying the maxperf-enable property of the ``gst-v4l2`` encoder for best performance. Expect increased power consumption in max perf mode.

- Set ``qp-range`` (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc qp-range="24,24:28,28:30,30" num-B-Frames=1 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  This property sets the quantization range for I, P, and B frames.

- Enable ``MVBufferMeta`` (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc EnableMVBufferMeta=1 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  This property enables motion vector metadata for encoding.

- Insert AUD (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc insert-aud=1 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  This property inserts an H.264/H.265 Access Unit Delimiter (AUD).

- Insert VUI (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! \
          nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc insert-vui=1 ! \
          'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au' ! \
          filesink location=<filename_h264.264> -e

  This property inserts H.264/H.265 video usability information (VUI) in SPS.

- Set picture order count (POC) type (supported with H.264 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
          'video/x-raw, width=1920, height=1080, format=I420' ! nvvidconv ! \
          nvv4l2h264enc \
          poc-type=2 ! h264parse ! filesink location=<filename_h264.264> -e

  The following values are supported for the ``poc-type`` property:

  -  0: POC explicitly specified in each slice header (the default)
  -  2: Decoding/coding order and display order are the same

Camera Capture with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

To display ``nvgstcapture-1.0`` usage information, enter the command::

   $ nvgstcapture-1.0 --help

.. note:: By default, the ``nvgstcapture-1.0`` application supports only the Argus API, using the ``nvarguscamerasrc`` plugin. Support for the legacy ``nvcamerasrc`` plugin is deprecated.

For more information, see `nvgstcapture-1.0 Reference <#nvgstcapture-1-0-reference>`__.

CSI Camera Capture Using nvarguscamerasrc
#########################################

Use the following command to capture using ``nvarguscamerasrc`` and preview display with ``nvdrmvideosink``::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! \
          nvdrmvideosink -e

.. note:: The ``nvarguscamerasrc`` plugin's ``maxperf`` property has been removed, because VIC actmon DFS now handles VIC frequency scaling according to load, enabling clients to get the required performance.

.. todo::
   This note may be important or pointless, depending on *when* the property was removed. The references to plugins that were deprecated eight releases back put me on my guard. We should follow our customary practice of pruning notices of removed features after one minor release except where there is a substantial reason to deviate from it.

Progressive Capture Using nvv4l2camerasrc
#########################################

To capture and preview display with ``nv3dsink``, enter the command::

   $ gst-launch-1.0 nvv4l2camerasrc device=/dev/video3 ! \
        'video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)1920, height=(int)1080, interlace-mode=(string)progressive, framerate=(fraction)30/1' ! \
        nvvidconv ! \
        'video/x-raw(memory:NVMM), format=(string)NV12' ! \
        nv3dsink -e

.. note::
   By default, the ``nvv4l2camerasrc`` plugin currently supports only the DMABUF (importer role) streaming I/O mode (``V4L2_MEMORY_DMABUF``).

   The ``nvv4l2camerasrc`` plugin is currently verified using the NVIDIA V4L2 driver with a sensor that supports YUV capture in UYVY format.

   If you need to use a different type of sensor for capture in other YUV formats, see the topic
   :ref:`Sensor Software Driver Programming <SD.CameraDevelopment.SensorSoftwareDriverProgramming>`.
   In that case, ``nvv4l2camerasrc`` must also be enhanced to support the required YUV format.

The ``nvgstcapture-1.0`` application uses the ``v4l2src`` plugin to capture still images and video.

The following table shows USB camera support.

+--------------------+--------------------------------------------------------+
| USB camera support | Feature                                                |
+====================+========================================================+
| YUV                | - Preview display                                      |
|                    | - Image capture (VGA, 640\ |times|\ 480)               |
|                    | - Video capture (480p, 720p, H.264/H.265/VP8/VP9       |
|                    |   encode)                                              |
+--------------------+--------------------------------------------------------+

Raw-YUV Capture Using v4l2src
#############################

Use the following command to capture raw YUV (YUY2 format) using ``v4l2src``
and preview display with ``xvimagesink``::

   $ gst-launch-1.0 v4l2src device="/dev/video0" ! \
        "video/x-raw, width=640, height=480, format=(string)YUY2" ! \
        xvimagesink -e

Camera Capture and Encode Support with OpenCV
#############################################

The OpenCV sample application ``opencv_nvgstcam`` simulates the camera
capture pipeline. Similarly, the OpenCV sample application
``opencv_nvgstenc`` simulates the video encode pipeline.

Both sample applications are based on GStreamer 1.0, and are currently
supported only by OpenCV version 3.3.

.. todo:: Are these statements still current?

   These cases continue to occur throughout the topic. Please search for "version," "release," and any other appropriate keywords and take appropriate action if they are outdated. It is better to make them version-neutral, rather than update their numbers, where possible.

- ``opencv_nvgstcam``: Camera capture and preview.

  To simulate the camera capture pipeline with the ``opencv_nvgstcam`` sample application, enter the command::

     $ ./opencv_nvgstcam --help

  .. note:: As distributed, ``opencv_nvgstcam`` currently supports only single-instance CSI capture using the ``nvarguscamerasrc`` plugin. You can modify and rebuild the application to support GStreamer pipelines for CSI multi-instance capture and USB camera capture using the ``v4l2src`` plugin. The application uses an OpenCV-based videosink for display.

  For camera CSI capture and preview rendering with OpenCV, enter the command::

     $ ./opencv_nvgstcam --width=1920 --height=1080 --fps=30

- ``opencv_nvgstenc``: Camera capture and video encode.

  To simulate the camera capture and video encode pipeline with the ``opencv_nvgstenc`` sample application, enter the command::

     $ ./opencv_nvgstenc --help

  .. note:: ``opencv_nvgstenc`` as distributed currently supports only camera CSI capture using the ``nvarguscamerasrc`` plugin and video encode in H.264 format using the ``nvv4l2h264enc`` plugin with an MP4 container file. You can modify and rebuild the application to support GStreamer pipelines for different video encoding formats. The application uses an OpenCV-based videosink for display.

  For camera CSI capture and video encode with OpenCV, enter the command::

     $ ./opencv_nvgstenc --width=1920 --height=1080 --fps=30 --time=60 \
             --filename=test_h264_1080p_30fps.mp4

Video Playback with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

To display ``nvgstplayer-1.0`` usage information, enter the command::

   $ nvgstplayer-1.0 --help

Video can be output to HD displays using the HDMI connector on the Jetson device. GStreamer-1.0-based applications currently support the following video sinks:

.. note:: The ``nvoverlaysink`` plugin is deprecated in Tegra Linux Driver Package (now Jetson Linux) release 32.1. Use the ``nvdrmvideosink`` plugin for development.

.. todo:: Has this plugin been dropped yet? If not, either it's way overdue, or it was deprecated far too soon!

To use an overlay sink (video playback on an overlay in full-screen mode), enter the command::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux name=demux ! h264parse ! omxh264dec ! nvoverlaysink -e


Video Playback Examples
#######################

The following examples show how you can perform video playback using
GStreamer-1.0.

- Overlay sink (video playback using overlay parameters).

  .. todo:: A note as long as the following one is somewhat self-defeating: if this much text is marked as important by putting it in a note, the reader will not perceive any of it as particularly important. That's particularly true in a case like this, where there's no text for the note to apply to -- just an example! Can the contents of the note be made ordinary text, following the command? I think that would be equally effective and easier to read.

  .. note::
     The following steps are required to use the “overlay” property on Jetson Xavier NX series and Jetson AGX Xavier series.

       .. todo::
          And Orin? If so, the platform qualification is unneeded.

          As a separate issue, what does "property" mean here? It customarily refers to a name/value pair associated with an object. No object is in sight here, so I suspect it means something else, and it would be better to use another word.

     ``win_mask`` is a sysfs attribute of the display device: a bitmask that assigns the display controller's overlay windows to a display head, with each of its six low-order bits enabling one window. Enter these commands to set ``win_mask``::

        # sudo -s
        # cd /sys/class/graphics/fb0
        # echo 4 > blank               # Blank the monitor before changing the display setting.
        # echo 0x0 > device/win_mask   # Clear the current window setting.
        # echo 0x3f > device/win_mask  # Assign all six overlay windows in the display controller to display 0 (fb0).
        # echo 0 > blank               # Unblank the display.

     Enter these commands to stop X11::

        $ sudo systemctl stop gdm
        $ sudo loginctl terminate-seat seat0


     For more information about the overlay windows in the display controller, see the *Tegra X2 Technical Reference Manual* (TRM).

     .. todo:: Maybe this was never updated when AGX Xavier and Xavier NX were introduced, or maybe the information actually is in the Tegra X2 TRM, regardless of what processor the user has. In the latter case we need to find some other reference, since TX2 is no longer supported.

     Because X11 uses one window, you must disable it to use all six overlays. Disabling X11 also helps avoid memory bandwidth contention when using a non-X11 overlay.

     .. todo:: Will the reader understand the logical connection between X11 using one window and having to disable it? I don't... it suggests to me that there are a limited number of windows and the overlays need all of them, which seems unlikely.

.. code-block::

      $ gst-launch-1.0 filesrc location=<filename_1080p.mp4> ! \
           qtdemux ! h264parse ! omxh264dec ! \
           nvoverlaysink overlay-x=100 overlay-y=100 overlay-w=640 \
           overlay-h=480 overlay=1 \
           overlay-depth=0 & gst-launch-1.0 filesrc \
           location=<filename_1080p.mp4> ! qtdemux ! h264parse ! omxh264dec ! \
           nvoverlaysink overlay-x=250 overlay-y=250 overlay-w=640 \
           overlay-h=480 overlay=2 overlay-depth=1 -e

- ``nveglglessink`` (windowed video playback, NVIDIA EGL/GLES videosink using the default X11 backend):

  Enter this command to start the GStreamer pipeline using ``nveglglessink`` with the default X11 backend::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux name=demux ! h264parse ! omxh264dec ! nveglglessink -e

  The ``nvgstplayer-1.0`` application accepts command line options that specify window position and dimensions for windowed playback::

     $ nvgstplayer-1.0 -i <filename> --window-x=300 --window-y=300 \
          --window-width=500 --window-height=500

- ``nveglglessink`` (windowed video playback, NVIDIA EGL/GLES videosink using the Wayland backend):

  You can use ``nveglglessink`` with the Wayland backend instead of the default X11 backend.

  Ubuntu 16.04 does not support the Wayland display server; there is no UI support for switching from Xorg to Wayland. You must start the Wayland server (Weston) from the target's shell before performing any Weston-based operation.

  .. todo:: Jetson Linux now is based on Ubuntu 18.04. Does that affect this limitation?

- To start Weston: The following steps are required before you first run the GStreamer pipeline with the Wayland backend. They are not required on subsequent runs.

  .. todo::
     At this point I lost track of the document structure due to the large number of poorly distinguished levels. Is "To start Weston..." a sibling of "nveglglessink (window video playback... using Wayland backend)," or is it a child?

     More generally, we need to reorganize this topic to reduce the number of levels. We have two or three levels of bulleted lists under five levels of section headings. We can't chop up the document this finely and get a coherent result. The reader will lose track of where they are, as I have.

  #. Stop the display manager::

     $ sudo systemctl stop gdm
     $ sudo loginctl terminate-seat seat0

  #. For Weston 6.0, use the ``tegra_drm`` driver::

     $ sudo ln -sf /usr/lib/aarch64-linux-gnu/tegra/libnvgbm.so /usr/lib/aarch64-linux-gnu/libgbm.so.1
     $ sudo modprobe tegra-udrm modeset=1

  #. Unset the ``DISPLAY`` environment variable::

        $ unset DISPLAY

  #. Create a temporary ``xdg/`` directory::

        $ mkdir /tmp/xdg
        $ chmod 700 /tmp/xdg

  #. Start the Weston compositor::

     $ sudo XDG_RUNTIME_DIR=/tmp/xdg weston --idle-time=0 &

- To run the GStreamer pipeline with the Wayland backend: Enter this command to start the GStreamer pipeline using ``nveglglessink`` with the Wayland backend::

     $ sudo XDG_RUNTIME_DIR=/tmp/xdg gst-launch-1.0 filesrc \
             location=<filename.mp4> ! qtdemux name=demux ! h264parse ! \
             omxh264dec ! nveglglessink winsys=wayland

- DRM video sink (video playback using DRM): This sink element uses DRM to render video on connected displays.

  .. todo::
     The text doesn't say what the following procedure does.

  #. Stop the display manager::

        $ sudo systemctl stop gdm
        $ sudo loginctl terminate-seat seat0

  #. Enter this command to start the GStreamer pipeline using ``nvdrmvideosink``::

        $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
             qtdemux ! queue ! h264parse ! nvv4l2decoder ! nvdrmvideosink -e

Properties
##########

``nvdrmvideosink`` supports these properties:

- ``conn_id``: Set the connector ID for the display.
- ``plane_id``: Set the plane ID.
- ``set_mode``: Set the default mode (resolution) for playback.

The following command illustrates the use of these properties::

   $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
        qtdemux ! queue ! h264parse ! nvv4l2decoder ! nvdrmvideosink \
        conn_id=0 plane_id=1 set_mode=0 -e

- ``nv3dsink`` video sink (video playback using 3D graphics API): This video sink element works with NVMM buffers and renders using the 3D graphics rendering API. It performs better than ``nveglglessink`` with NVMM buffers.

  .. todo:: I'm assuming that we're back at the first level of the bulleted list under the heading "Properties," which is a sibling of the heading "Video Playback Examples." The old *Developer Guide* is no guide to document structure because its hierarchy clearly doesn't match its content in this part.

  This command starts the GStreamer pipeline using ``nv3dsink``::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux ! queue ! h264parse ! nvv4l2decoder ! nv3dsink -e

  The sink supports setting a specific window position and dimensions using the properties shown in this example::

     $ nv3dsink window-x=300 window-y=300 window-width=512 window-height=512

Video Decode Support with OpenCV
################################

You can simulate a video decode pipeline using the GStreamer-1.0-based
OpenCV sample application ``opencv_nvgstdec``.

.. note:: The sample application currently operates only with OpenCV version 3.3.

To display usage information for ``opencv_nvgstdec``, enter the command::

   $ ./opencv_nvgstdec --help

.. note::
   ``opencv_nvgstdec`` as distributed currently supports only video decode of H264 format using the ``nvv4l2decoder`` plugin. You can modify and rebuild the application to support GStreamer pipelines for video decode of different formats. For display, the application uses an OpenCV-based videosink component.

To perform video decoding with ``opencv_nvgstdec``, enter the command::

   $ ./opencv_nvgstdec --file-path=test_file_h264.mp4
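Applications like ``opencv_nvgstdec`` typically hand OpenCV a GStreamer pipeline string that ends in ``appsink``. The following Python sketch is illustrative only (it is not the shipped application source); it composes such a string from the ``gst-v4l2`` elements used throughout this guide:

```python
# Illustrative sketch (not the shipped opencv_nvgstdec source): an
# OpenCV application typically passes a pipeline string like this,
# ending in appsink, to cv2.VideoCapture with the CAP_GSTREAMER backend.

def h264_decode_pipeline(file_path):
    """Build a decode pipeline string for OpenCV's GStreamer backend."""
    return (
        "filesrc location={} ! qtdemux ! queue ! h264parse ! "
        "nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
    ).format(file_path)

# Usage (requires OpenCV built with GStreamer support; shown as a comment):
#   cap = cv2.VideoCapture(h264_decode_pipeline("test_file_h264.mp4"),
#                          cv2.CAP_GSTREAMER)
```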

Video Streaming with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

This section describes procedures for video streaming with GStreamer 1.0.

To perform video streaming with nvgstplayer-1.0
###############################################

- Using ``nvgstplayer-1.0``: Enter the command::

     $ nvgstplayer-1.0 \
          -i rtsp://10.25.20.77:554/RTSP_contents/VIDEO/H264/test_file_h264.3gp \
          --stats

  The supported formats for video streaming are:

  .. raw:: html
     :file: AcceleratedGstreamer/ToPerformVideoStreamingWithNvgstplayer10.htm

- Using gst-launch-1.0 pipeline:

  - Streaming and video rendering:

    - Transmitting (from target): CSI camera capture + video encode + RTP streaming using network sink::

         $ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), \
              format=NV12, width=1920, height=1080' ! \
              nvv4l2h264enc insert-sps-pps=true ! h264parse ! \
              rtph264pay pt=96 ! udpsink host=127.0.0.1 port=8001 sync=false -e

    - Receiving (on target): network source + video decode + video render::

         $ gst-launch-1.0 udpsrc address=127.0.0.1 port=8001 \
              caps='application/x-rtp, encoding-name=(string)H264, payload=(int)96' ! \
              rtph264depay ! queue ! h264parse ! nvv4l2decoder ! nv3dsink -e

  - Streaming and file dump:

    - Transmitting (from target): CSI camera capture + video encode + RTP streaming using network sink::

         $ gst-launch-1.0 nvarguscamerasrc ! \
              'video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080' ! \
              nvv4l2h264enc insert-sps-pps=true ! h264parse ! \
              rtph264pay pt=96 ! udpsink host=127.0.0.1 port=8001 sync=false -e

    - Receiving (on target): network source + video decode + file dump::

         $ gst-launch-1.0 udpsrc address=127.0.0.1 port=8001 \
              caps='application/x-rtp, encoding-name=(string)H264, payload=(int)96' ! \
              rtph264depay ! queue ! h264parse ! nvv4l2decoder ! nvvidconv ! \
              'video/x-raw, format=(string)I420' ! filesink location=test.yuv -e

.. _SD.Multimedia.AcceleratedGstreamer-VideoFormatConversionWithGStreamer10:

Video Format Conversion with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

The NVIDIA proprietary ``nvvidconv`` GStreamer-1.0 plugin allows conversion between OSS (raw) video formats and NVIDIA video formats. The ``nvvidconv`` plugin currently supports the format conversions described in this section.

Raw-YUV Input Formats
#####################

Currently ``nvvidconv`` supports the ``I420``, ``UYVY``, ``YUY2``, ``YVYU``, ``NV12``, ``NV16``, ``NV24``, ``P010_10LE``, ``GRAY8``, ``BGRx``, ``RGBA``, and ``Y42B`` raw-YUV input formats.

- Using the ``gst-omx`` encoder::

     $ gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)UYVY, \
          width=(int)1280, height=(int)720' ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! omxh264enc ! \
          'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! \
          qtmux ! filesink location=test.mp4 -e

- Using the ``gst-v4l2`` encoder (for formats other than GRAY8)::

     $ gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)UYVY, \
          width=(int)1280, height=(int)720' ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc ! 'video/x-h264, \
          stream-format=(string)byte-stream' ! h264parse ! \
          qtmux ! filesink location=test.mp4 -e

- Using the ``gst-v4l2`` encoder with the GRAY8 pipeline::

     $ gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)GRAY8, \
          width=(int)640, height=(int)480, framerate=(fraction)30/1' ! \
          nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! \
          nvv4l2h264enc ! 'video/x-h264, \
          stream-format=(string)byte-stream' ! h264parse ! qtmux ! \
          filesink location=test.mp4 -e

.. note:: Format conversion with raw YUV input is CPU-intensive due to the “software to hardware” memory copies involved.
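The caps strings in these pipelines all follow the same pattern: a media type, optionally tagged with ``(memory:NVMM)`` for hardware buffers, followed by typed fields. A small illustrative helper (not part of the SDK) makes the pattern explicit:

```python
# Illustrative helper (not part of the SDK): composes the caps strings
# used throughout these nvvidconv examples. Field order follows the
# documented pipelines: width, height, then format.

def caps(media, fmt, width=None, height=None, nvmm=False):
    name = "{}(memory:NVMM)".format(media) if nvmm else media
    fields = []
    if width is not None:
        fields.append("width=(int){}".format(width))
    if height is not None:
        fields.append("height=(int){}".format(height))
    fields.append("format=(string){}".format(fmt))
    return ", ".join([name] + fields)

# Examples: the UYVY source caps and the NVMM I420 caps from the
# encoder pipelines above.
src_caps = caps("video/x-raw", "UYVY", 1280, 720)
hw_caps = caps("video/x-raw", "I420", nvmm=True)
```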

Raw-YUV Output Formats
######################

Currently ``nvvidconv`` supports the ``I420``, ``UYVY``, ``YUY2``, ``YVYU``, ``NV12``, ``NV16``, ``NV24``, ``GRAY8``, ``BGRx``, ``RGBA``, and ``Y42B`` raw-YUV output formats.

- Using the ``gst-omx`` decoder::

     $ gst-launch-1.0 filesrc location=640x480_30p.mp4 ! qtdemux ! \
          queue ! h264parse ! omxh264dec ! nvvidconv ! \
          'video/x-raw, format=(string)UYVY' ! videoconvert ! xvimagesink -e

- Using the ``gst-v4l2`` decoder (for formats other than GRAY8)::

     $ gst-launch-1.0 filesrc location=640x480_30p.mp4 ! qtdemux ! \
          queue ! h264parse ! nvv4l2decoder ! nvvidconv ! \
          'video/x-raw, format=(string)UYVY' ! videoconvert ! xvimagesink -e

- Using the ``gst-v4l2`` decoder with the GRAY8 pipeline::

     $ gst-launch-1.0 filesrc location=720x480_30i_MP.mp4 ! qtdemux ! \
          queue ! h264parse ! nvv4l2decoder ! nvvidconv ! 'video/x-raw, \
          format=(string)GRAY8' ! videoconvert ! xvimagesink -e

.. note:: Format conversion with raw YUV output is CPU-intensive due to the “hardware to software” memory copies involved.

NVIDIA Input and Output Formats
###############################

Currently ``nvvidconv`` supports the combinations of NVIDIA input and output formats described in the following table. Any format in the left column may be converted to any format in the right column of the same row.

.. raw:: html
   :file: AcceleratedGstreamer/NvidiaInputAndOutputFormats.htm

Enter these commands to convert between NVIDIA formats:

- Using the ``gst-omx`` decoder::

     $ gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! \
          h264parse ! omxh264dec ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)RGBA' ! nvoverlaysink -e

- Using the ``gst-v4l2`` decoder::

     $ gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! \
          h264parse ! nvv4l2decoder ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)RGBA' ! nvdrmvideosink -e

- Using the ``gst-omx`` encoder::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
          format=(string)NV12, framerate=(fraction)30/1' ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! omxh264enc ! \
          qtmux ! filesink location=test.mp4 -e

- Using the ``gst-v4l2`` encoder::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
          format=(string)NV12, framerate=(fraction)30/1' ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc ! \
          h264parse ! qtmux ! filesink location=test.mp4 -e

- Using the ``gst-v4l2`` decoder and ``nv3dsink`` with the GRAY8 pipeline::

     $ gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! \
          h264parse ! nvv4l2decoder ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)GRAY8' ! nvvidconv ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! nv3dsink -e

.. _SD.Multimedia.AcceleratedGstreamer-VideoScalingWithGStreamer10:

Video Scaling with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

The NVIDIA proprietary ``nvvidconv`` GStreamer-1.0 plugin also allows you to
perform video scaling. The ``nvvidconv`` plugin currently supports scaling
with the format conversions described in this section.

- Raw-YUV input formats:

  Currently ``nvvidconv`` supports the I420, UYVY, YUY2, YVYU, NV12, NV16, NV24, P010_10LE, GRAY8, BGRx, RGBA, and Y42B RAW-YUV input formats for scaling.

  - Using the ``gst-omx`` encoder::

       $ gst-launch-1.0 videotestsrc ! \
            'video/x-raw, format=(string)I420, width=(int)1280, \
            height=(int)720' ! nvvidconv ! \
            'video/x-raw(memory:NVMM), width=(int)640, height=(int)480, \
            format=(string)I420' ! omxh264enc ! \
            'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! \
            qtmux ! filesink location=test.mp4 -e

  - Using the ``gst-v4l2`` encoder::

       $ gst-launch-1.0 videotestsrc ! \
            'video/x-raw, format=(string)I420, width=(int)1280, \
            height=(int)720' ! nvvidconv ! \
            'video/x-raw(memory:NVMM), width=(int)640, height=(int)480, \
            format=(string)I420' ! nvv4l2h264enc ! \
            'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! \
            qtmux ! filesink location=test.mp4 -e

.. note:: Video scaling with raw YUV input is CPU-intensive due to the “software to hardware” memory copies involved.

- Raw-YUV output formats:

  Currently ``nvvidconv`` supports the I420, UYVY, YUY2, YVYU, NV12, NV16, NV24, GRAY8, BGRx, RGBA, and Y42B RAW-YUV output formats for scaling.

  - Using the ``gst-omx`` decoder::

       $ gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! \
            queue ! h264parse ! omxh264dec ! nvvidconv ! \
            'video/x-raw, format=(string)I420, width=640, height=480' ! \
            xvimagesink -e

  - Using the ``gst-v4l2`` decoder::

       $ gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! \
            queue ! h264parse ! nvv4l2decoder ! nvvidconv ! \
            'video/x-raw, format=(string)I420, width=640, height=480' ! \
            xvimagesink -e

.. note:: Video scaling with raw YUV output is CPU-intensive due to the “hardware to software” memory copies involved.

.. _SD.Multimedia.AcceleratedGstreamer-VideoCroppingWithGstreamer10:

Video Cropping with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

The NVIDIA proprietary ``nvvidconv`` GStreamer-1.0 plugin also allows you to
perform video cropping:

- Using the ``gst-omx`` decoder::

     $ gst-launch-1.0 filesrc location=<filename_1080p.mp4> ! qtdemux ! \
          h264parse ! omxh264dec ! \
          nvvidconv left=400 right=1520 top=200 bottom=880 ! \
          nvoverlaysink display-id=1 -e

- Using the ``gst-v4l2`` decoder::

     $ gst-launch-1.0 filesrc location=<filename_1080p.mp4> ! qtdemux ! \
          h264parse ! nvv4l2decoder ! \
          nvvidconv left=400 right=1520 top=200 bottom=880 ! nv3dsink -e
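In the examples above, the ``left``, ``right``, ``top``, and ``bottom`` properties appear to give the crop rectangle's edge coordinates in source pixels: a 1920x1080 frame cropped to a centered 1120x680 region yields ``left=400 right=1520 top=200 bottom=880``. Under that assumption, a centered crop can be computed as follows (the helper name is ours, for illustration only):

```python
# Assumption (inferred from the examples above): nvvidconv's
# left/right/top/bottom properties are the crop rectangle's edge
# coordinates in source pixels.

def center_crop(src_w, src_h, crop_w, crop_h):
    """Return nvvidconv crop properties for a centered crop_w x crop_h region."""
    left = (src_w - crop_w) // 2
    top = (src_h - crop_h) // 2
    return "left={} right={} top={} bottom={}".format(
        left, left + crop_w, top, top + crop_h)
```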

Video Transcode with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

You can perform video transcoding between the following video formats.

- H.264 decode to VP9 Encode (NVIDIA accelerated decode to NVIDIA accelerated encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h264parse ! \
            omxh264dec ! omxvp9enc bitrate=20000000 ! matroskamux name=mux ! \
            filesink location=<Transcoded_filename.mkv> -e

  - Using the ``gst-v4l2`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h264parse ! nvv4l2decoder ! \
            nvv4l2vp9enc bitrate=20000000 ! queue ! matroskamux name=mux ! \
            filesink location=<Transcoded_filename.mkv> -e

- H.265 decode to VP9 encode (NVIDIA accelerated decode to NVIDIA accelerated encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h265parse ! \
            omxh265dec ! omxvp9enc bitrate=20000000 ! matroskamux name=mux ! \
            filesink location=<Transcoded_filename.mkv> -e

  - Using the ``gst-v4l2`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h265parse ! nvv4l2decoder ! \
            nvv4l2vp9enc bitrate=20000000 ! queue ! matroskamux name=mux ! \
            filesink location=<Transcoded_filename.mkv> -e

- VP8 decode to H.264 encode (NVIDIA accelerated decode to NVIDIA accelerated encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.webm> ! \
            matroskademux name=demux demux.video_0 ! queue ! omxvp8dec ! \
            omxh264enc bitrate=20000000 ! qtmux name=mux ! \
            filesink location=<Transcoded_filename.mp4> -e

  - Using the ``gst-v4l2`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.webm> ! \
            matroskademux name=demux demux.video_0 ! queue ! nvv4l2decoder ! \
            nvv4l2h264enc bitrate=20000000 ! h264parse ! queue ! \
            qtmux name=mux ! filesink location=<Transcoded_filename.mp4> -e

- VP9 decode to H.265 encode (NVIDIA accelerated decode to NVIDIA accelerated encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.webm> ! \
            matroskademux name=demux demux.video_0 ! queue ! omxvp9dec ! \
            omxh265enc bitrate=20000000 ! qtmux name=mux ! \
            filesink location=<Transcoded_filename.mp4> -e

  - Using the ``gst-v4l2`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.webm> ! \
            matroskademux name=demux demux.video_0 ! queue ! nvv4l2decoder ! \
            nvv4l2h265enc bitrate=20000000 ! h265parse ! queue ! \
            qtmux name=mux ! filesink location=<Transcoded_filename.mp4> -e

- MPEG-4 decode to VP9 encode (NVIDIA accelerated decode to NVIDIA accelerated encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! mpeg4videoparse ! \
            omxmpeg4videodec ! omxvp9enc bitrate=20000000 ! matroskamux \
            name=mux ! filesink location=<Transcoded_filename.mkv> -e

  - Using the ``gst-v4l2`` pipeline::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
          qtdemux name=demux demux.video_0 ! queue ! mpeg4videoparse ! \
          nvv4l2decoder ! nvv4l2vp9enc bitrate=20000000 ! queue ! \
          matroskamux name=mux ! filesink \
          location=<Transcoded_filename.mkv> -e

- MPEG-4 decode to H.264 encode (NVIDIA accelerated decode to NVIDIA accelerated encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! mpeg4videoparse ! \
            omxmpeg4videodec ! omxh264enc bitrate=20000000 ! \
            qtmux name=mux ! filesink location=<Transcoded_filename.mp4> -e

  - Using the ``gst-v4l2`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! mpeg4videoparse ! \
            nvv4l2decoder ! nvv4l2h264enc bitrate=20000000 ! h264parse ! \
            queue ! qtmux name=mux ! filesink \
            location=<Transcoded_filename.mp4> -e

- H.264 decode to VP8 encode (NVIDIA accelerated decode to NVIDIA accelerated encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h264parse ! \
            omxh264dec ! omxvp8enc bitrate=20000000 ! queue ! \
            matroskamux name=mux ! \
            filesink location=<Transcoded_filename.mkv> -e

  - Using the ``gst-v4l2`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h264parse ! \
            nvv4l2decoder ! nvv4l2vp8enc bitrate=20000000 ! queue ! \
            matroskamux name=mux ! \
            filesink location=<Transcoded_filename.mkv> -e

- H.265 decode to VP8 encode (NVIDIA accelerated decode to NVIDIA accelerated encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h265parse ! \
            omxh265dec ! omxvp8enc bitrate=20000000 ! queue ! \
            matroskamux name=mux ! \
            filesink location=<Transcoded_filename.mkv> -e

  - Using the ``gst-v4l2`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h265parse ! \
            nvv4l2decoder ! nvv4l2vp8enc bitrate=20000000 ! queue ! \
            matroskamux name=mux ! \
            filesink location=<Transcoded_filename.mkv> -e

- VP8 decode to MPEG-4 encode (NVIDIA accelerated decode to OSS software encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mkv> ! \
            matroskademux name=demux demux.video_0 ! queue ! omxvp8dec ! \
            nvvidconv ! avenc_mpeg4 bitrate=4000000 ! queue ! \
            qtmux name=mux ! filesink location=<Transcoded_filename.mp4> -e

  - Using the ``gst-v4l2`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mkv> ! \
            matroskademux name=demux demux.video_0 ! queue ! nvv4l2decoder ! \
            nvvidconv ! avenc_mpeg4 bitrate=4000000 ! queue ! \
            qtmux name=mux ! filesink location=<Transcoded_filename.mp4> -e

- VP9 decode to MPEG-4 encode (NVIDIA accelerated decode to OSS software encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mkv> ! \
            matroskademux name=demux demux.video_0 ! queue ! omxvp9dec ! \
            nvvidconv ! avenc_mpeg4 bitrate=4000000 ! qtmux name=mux ! \
            filesink location=<Transcoded_filename.mp4> -e

  - Using the ``gst-v4l2`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mkv> ! \
            matroskademux name=demux demux.video_0 ! queue ! nvv4l2decoder ! \
            nvvidconv ! avenc_mpeg4 bitrate=4000000 ! qtmux name=mux ! \
            filesink location=<Transcoded_filename.mp4> -e

- H.264 decode to Theora encode (NVIDIA accelerated decode to OSS software encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h264parse ! \
            omxh264dec ! nvvidconv ! theoraenc bitrate=4000000 ! \
            oggmux name=mux ! filesink location=<Transcoded_filename.ogg> -e

  - Using the ``gst-v4l2`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h264parse ! \
            nvv4l2decoder ! nvvidconv ! theoraenc bitrate=4000000 ! \
            oggmux name=mux ! filesink location=<Transcoded_filename.ogg> -e

- H.264 decode to H.263 encode (NVIDIA accelerated decode to OSS software encode):

  - Using the ``gst-omx`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h264parse ! \
            omxh264dec ! nvvidconv ! \
            'video/x-raw, width=(int)704, height=(int)576, \
            format=(string)I420' ! avenc_h263 bitrate=4000000 ! qtmux ! \
            filesink location=<Transcoded_filename.mp4> -e

  - Using the ``gst-v4l2`` pipeline::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux demux.video_0 ! queue ! h264parse ! \
            nvv4l2decoder ! nvvidconv ! \
            'video/x-raw, width=(int)704, height=(int)576, \
            format=(string)I420' ! avenc_h263 bitrate=4000000 ! qtmux ! \
            filesink location=<Transcoded_filename.mp4> -e
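The ``gst-v4l2`` transcode pipelines above all share one shape: ``filesrc ! demux ! queue ! parser ! nvv4l2decoder ! encoder ! [output parser] ! queue ! muxer ! filesink``. The following sketch captures that pattern; the element table is an illustrative subset drawn from the examples in this section, not an exhaustive capability list:

```python
# Pattern behind the gst-v4l2 transcode examples above. The table maps
# an output codec to (encoder element, output parser or None, muxer),
# following the pairings used in this section.

V4L2_TARGETS = {
    "h264": ("nvv4l2h264enc", "h264parse", "qtmux"),
    "h265": ("nvv4l2h265enc", "h265parse", "qtmux"),
    "vp8":  ("nvv4l2vp8enc",  None,        "matroskamux"),
    "vp9":  ("nvv4l2vp9enc",  None,        "matroskamux"),
}

def v4l2_transcode(src, dst, in_parser, out_codec, bitrate=20000000):
    """Compose a gst-launch-1.0 transcode command line (illustrative)."""
    enc, out_parser, mux = V4L2_TARGETS[out_codec]
    parse_step = "{} ! ".format(out_parser) if out_parser else ""
    return (
        "gst-launch-1.0 filesrc location={src} ! "
        "qtdemux name=demux demux.video_0 ! queue ! {inp} ! "
        "nvv4l2decoder ! {enc} bitrate={br} ! {parse}queue ! "
        "{mux} name=mux ! filesink location={dst} -e"
    ).format(src=src, inp=in_parser, enc=enc, br=bitrate,
             parse=parse_step, mux=mux, dst=dst)
```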

CUDA Video Post-Processing with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

This section describes GStreamer-1.0 plugins for |NVIDIA(r)| CUDA\ |reg|  post-processing operations.

gst-videocuda
#############

This GStreamer-1.0 plugin performs CUDA post-processing operations on
decoder-provided EGL images and renders video using ``nveglglessink``.

Following are sample commands for creating pipelines and using them in applications.

- Sample decode pipeline::

     $ gst-launch-1.0 filesrc location=<filename_h264_1080p.mp4> ! \
          qtdemux name=demux ! h264parse ! omxh264dec ! videocuda ! \
          nveglglessink max-lateness=-1 -e

- Sample decode command::

     $ nvgstplayer-1.0 -i <filename_h264_1080p.mp4> --svd="omxh264dec" \
          --svc="videocuda" --svs="nveglglessink # max-lateness=-1" \
          --disable-vnative --no-audio --window-x=0 --window-y=0 \
          --window-width=960 --window-height=540

gst-nvivafilter
###############

This NVIDIA proprietary GStreamer-1.0 plugin performs pre/post and CUDA
post-processing operations on CSI camera-captured or decoded frames, and
renders video using an overlay video sink or video encode.

.. note:: The ``gst-nvivafilter`` pipeline requires unsetting the ``DISPLAY`` environment variable (with the command ``unset DISPLAY``) if ``lightdm`` is stopped.

- Sample decode pipeline:

  - Using the ``gst-omx`` decoder::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! qtdemux ! \
            h264parse ! omxh264dec ! nvivafilter cuda-process=true \
            customer-lib-name="libnvsample_cudaprocess.so" ! \
            'video/x-raw(memory:NVMM), format=(string)NV12' ! nvoverlaysink -e

  - Using the ``gst-v4l2`` decoder::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! qtdemux ! queue ! \
            h264parse ! nvv4l2decoder ! nvivafilter cuda-process=true \
            customer-lib-name="libnvsample_cudaprocess.so" ! \
            'video/x-raw(memory:NVMM), format=(string)NV12' ! \
            nvdrmvideosink -e

- Sample CSI camera pipeline::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, \
          format=(string)NV12, framerate=(fraction)30/1' ! \
          nvivafilter cuda-process=true \
          customer-lib-name="libnvsample_cudaprocess.so" ! \
          'video/x-raw(memory:NVMM), format=(string)NV12' ! nvoverlaysink -e

.. note:: See ``nvsample_cudaprocess_src.tbz2`` for the ``libnvsample_cudaprocess.so`` library sources. The sample CUDA implementation of ``libnvsample_cudaprocess.so`` can be replaced by a custom CUDA implementation.

.. _SD.Multimedia.AcceleratedGstreamer-VideoRotationWithGstreamer10:

Video Rotation with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

The NVIDIA proprietary ``nvvidconv`` GStreamer-1.0 plugin also allows you to
perform video rotation operations.

The following table shows the supported values for the ``nvvidconv`` ``flip-method`` property.

================================ ==============================
Flip method                      ``flip-method`` property value
================================ ==============================
Identity (no rotation; default)  0
Counterclockwise 90 degrees      1
Rotate 180 degrees               2
Clockwise 90 degrees             3
Horizontal flip                  4
Upper right diagonal flip        5
Vertical flip                    6
Upper left diagonal flip         7
================================ ==============================

.. note:: To get information on the ``nvvidconv`` ``flip-method`` property, enter the command::

      $ gst-inspect-1.0 nvvidconv
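The table above can be expressed as a lookup when composing ``nvvidconv`` rotation pipelines programmatically. The key names below are our own illustrative labels; ``nvvidconv`` itself takes only the numeric value:

```python
# flip-method values from the table above; the string keys are
# illustrative labels, not names recognized by nvvidconv.

FLIP_METHOD = {
    "identity": 0,        # no rotation (default)
    "rotate-90-ccw": 1,   # counterclockwise 90 degrees
    "rotate-180": 2,
    "rotate-90-cw": 3,    # clockwise 90 degrees
    "horizontal-flip": 4,
    "upper-right-diagonal": 5,
    "vertical-flip": 6,
    "upper-left-diagonal": 7,
}

def rotation_element(method):
    """Return the nvvidconv pipeline fragment for the named rotation."""
    return "nvvidconv flip-method={}".format(FLIP_METHOD[method])
```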

- To rotate video 90 degrees counterclockwise:

  - Using the ``gst-omx`` decoder::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux ! h264parse ! omxh264dec ! \
            nvvidconv flip-method=1 ! \
            'video/x-raw(memory:NVMM), format=(string)I420' ! nvoverlaysink -e

  - Using the ``gst-v4l2`` decoder::

       $ gst-launch-1.0 filesrc location=<filename.mp4> ! \
            qtdemux name=demux ! h264parse ! nvv4l2decoder ! \
            nvvidconv flip-method=1 ! \
            'video/x-raw(memory:NVMM), format=(string)I420' ! \
            nvdrmvideosink -e

- To rotate video 90 degrees clockwise::

     $ gst-launch-1.0 filesrc location=<filename.mp4> ! qtdemux name=demux ! \
          h264parse ! omxh264dec ! nvvidconv flip-method=3 ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! \
          omxh264enc ! qtmux ! filesink location=test.mp4 -e

- To rotate 180 degrees::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
          format=(string)NV12, framerate=(fraction)30/1' ! \
          nvvidconv flip-method=2 ! \
          'video/x-raw(memory:NVMM), format=(string)I420' ! nvoverlaysink -e

- To scale and rotate video 90 degrees counterclockwise:

  - Using the ``gst-omx`` decoder::

       $ gst-launch-1.0 filesrc location=<filename_1080p.mp4> ! qtdemux ! \
            h264parse ! omxh264dec ! nvvidconv flip-method=1 ! \
            'video/x-raw(memory:NVMM), width=(int)480, height=(int)640, \
            format=(string)I420' ! nvoverlaysink -e

  - Using the ``gst-v4l2`` decoder::

       $ gst-launch-1.0 filesrc location=<filename_1080p.mp4> ! qtdemux ! \
            h264parse ! nvv4l2decoder ! nvvidconv flip-method=1 ! \
            'video/x-raw(memory:NVMM), width=(int)480, height=(int)640, \
            format=(string)I420' ! nvdrmvideosink -e

- To scale and rotate video 90 degrees clockwise::

     $ gst-launch-1.0 nvarguscamerasrc ! \
          'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
          format=(string)NV12, framerate=(fraction)30/1' ! \
          nvvidconv flip-method=3 ! 'video/x-raw(memory:NVMM), \
          width=(int)480, height=(int)640, format=(string)I420' ! \
          nvoverlaysink -e


- To scale and rotate video 180 degrees:

  - Using the ``gst-omx`` decoder::

       $ gst-launch-1.0 filesrc location=<filename_1080p.mp4> ! \
            qtdemux ! h264parse ! omxh264dec ! nvvidconv flip-method=2 ! \
            'video/x-raw(memory:NVMM), width=(int)640, height=(int)480, \
            format=(string)I420' ! nvoverlaysink -e

  - Using the ``gst-v4l2`` decoder::

       $ gst-launch-1.0 filesrc location=<filename_1080p.mp4> ! \
            qtdemux ! h264parse ! nvv4l2decoder ! nvvidconv flip-method=2 ! \
            'video/x-raw(memory:NVMM), width=(int)640, height=(int)480, \
            format=(string)I420' ! nvdrmvideosink -e

Video Composition with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

With the NVIDIA proprietary ``nvcompositor`` GStreamer-1.0 plugin, you can
perform video composition operations on decoded video streams.

.. note:: ``nvcompositor`` supports ``gst-omx`` video decoding with the overlay render pipeline in GStreamer 1.14.

Prerequisites
#############

- Install the following dependent GStreamer package::

     $ sudo apt-get install gstreamer1.0-plugins-bad

- If the command ``gst-inspect-1.0 nvcompositor`` fails to display information
  about the plugin, clear the registry cache file and run the command again::

     $ rm ~/.cache/gstreamer-1.0/registry.aarch64.bin

To composite decoded streams with different formats
###################################################

The following pipelines composite four simultaneously decoded streams with
different formats (H.264, H.265, VP8, and VP9) into a single output.

- Using the ``gst-omx`` decoder::

     $ gst-launch-1.0 nvcompositor \
          name=comp sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1920 \
          sink_0::height=1080 sink_1::xpos=0 sink_1::ypos=0 \
          sink_1::width=1600 sink_1::height=1024 sink_2::xpos=0 \
          sink_2::ypos=0 sink_2::width=1366 sink_2::height=768 \
          sink_3::xpos=0 sink_3::ypos=0 sink_3::width=1024 \
          sink_3::height=576 ! nvoverlaysink display-id=1 \
          filesrc location=<filename_h264_1080p_30fps.mp4> ! qtdemux ! \
          h264parse ! omxh264dec ! comp. filesrc \
          location=<filename_h265_1080p_30fps.mp4> ! qtdemux ! h265parse ! \
          omxh265dec ! comp. filesrc \
          location=<filename_vp8_1080p_30fps.webm> ! matroskademux ! \
          omxvp8dec ! \
          comp. filesrc location=<filename_vp9_1080p_30fps.webm> ! \
          matroskademux ! omxvp9dec ! comp. -e

- Using the ``gst-v4l2`` decoder::

     $ gst-launch-1.0 nvcompositor \
          name=comp sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1920 \
          sink_0::height=1080 sink_1::xpos=0 sink_1::ypos=0 \
          sink_1::width=1600 sink_1::height=1024 sink_2::xpos=0 \
          sink_2::ypos=0 sink_2::width=1366 sink_2::height=768 \
          sink_3::xpos=0 sink_3::ypos=0 sink_3::width=1024 \
          sink_3::height=576 ! nv3dsink \
          filesrc location=<filename_h264_1080p_30fps.mp4> ! qtdemux ! \
          h264parse ! nvv4l2decoder ! comp. filesrc \
          location=<filename_h265_1080p_30fps.mp4> ! qtdemux ! \
          h265parse ! nvv4l2decoder ! comp. filesrc \
          location=<filename_vp8_1080p_30fps.webm> ! matroskademux ! \
          nvv4l2decoder ! \
          comp. filesrc location=<filename_vp9_1080p_30fps.webm> ! \
          matroskademux ! nvv4l2decoder ! comp. -e

Interpolation Methods for Video Scaling
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

The NVIDIA proprietary ``nvvidconv`` GStreamer-1.0 plugin allows you to
choose the interpolation method used for scaling.

The following table shows the supported values for the ``nvvidconv``
``interpolation-method`` property.

======================== =======================================
Interpolation method     ``interpolation-method`` property value
======================== =======================================
Nearest                  0
Bilinear                 1
5-tap                    2
10-tap                   3
Smart (default)          4
Nicest                   5
======================== =======================================
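As an illustration, the mapping in the table can be expressed as a small shell helper. This is a hypothetical convenience function, not part of any NVIDIA tool:

```shell
# Map an interpolation-method name from the table above to the numeric
# value that the nvvidconv interpolation-method property expects.
interpolation_value() {
  case "$1" in
    nearest)  echo 0 ;;
    bilinear) echo 1 ;;
    5-tap)    echo 2 ;;
    10-tap)   echo 3 ;;
    smart)    echo 4 ;;
    nicest)   echo 5 ;;
    *)        echo 4 ;;   # fall back to the Smart default
  esac
}

interpolation_value bilinear   # prints 1
```

A pipeline could then set the property as ``interpolation-method=$(interpolation_value bilinear)``.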

.. note:: To display information about the ``nvvidconv`` ``interpolation-method`` property, enter the command::

      $ gst-inspect-1.0 nvvidconv

To use the bilinear interpolation method for scaling
####################################################

- Using the ``gst-omx`` pipeline::

     $ gst-launch-1.0 filesrc location=<filename_1080p.mp4> ! \
          qtdemux name=demux ! h264parse ! omxh264dec ! \
          nvvidconv interpolation-method=1 ! \
          'video/x-raw(memory:NVMM), format=(string)I420, width=1280, \
          height=720' ! nvoverlaysink -e

- Using the ``gst-v4l2`` pipeline::

     $ gst-launch-1.0 filesrc location=<filename_1080p.mp4> ! \
          qtdemux name=demux ! h264parse ! nvv4l2decoder ! \
          nvvidconv interpolation-method=1 ! \
          'video/x-raw(memory:NVMM), format=(string)I420, width=1280, \
          height=720' ! nvdrmvideosink -e

EGLStream Producer Example
@@@@@@@@@@@@@@@@@@@@@@@@@@

The NVIDIA proprietary ``nveglstreamsrc`` and ``nvvideosink`` GStreamer-1.0
plugins allow simulation of an EGLStream producer pipeline (for preview only).

To simulate an EGLStream producer pipeline, enter the command::

   $ nvgstcapture-1.0 --camsrc=3

EGL Image Transform Example
@@@@@@@@@@@@@@@@@@@@@@@@@@@

The NVIDIA proprietary ``nvegltransform`` GStreamer-1.0 plugin allows simulation of an EGLImage transform pipeline.

To simulate an EGL Image transform pipeline:

- Using the ``gst-omx`` pipeline::

     $ gst-launch-1.0 filesrc location=<filename_h264_1080p.mp4> ! \
          qtdemux ! h264parse ! omxh264dec ! nvvidconv ! \
          'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, \
          format=(string)NV12' ! nvegltransform ! nveglglessink -e

- Using the ``gst-v4l2`` pipeline::

     $ gst-launch-1.0 filesrc location=<filename_h264_1080p.mp4> ! \
          qtdemux ! h264parse ! nvv4l2decoder ! nvegltransform ! nveglglessink -e

GStreamer Build Instructions
@@@@@@@@@@@@@@@@@@@@@@@@@@@@

This section explains how to install a specific GStreamer version, either with the ``gst-install`` script or by building it manually.

To build GStreamer using gst-install
####################################

1. Run the command::

      $ gst-install [--prefix=<install_path>] [--version=<version>]

   Where:

   - ``<install_path>`` is the location where GStreamer is to be installed.

   - ``<version>`` is the GStreamer version to be installed.

2. Run the commands::

      $ export LD_LIBRARY_PATH=<install_path>/lib/aarch64-linux-gnu
      $ export PATH=<install_path>/bin:$PATH

   Where ``<install_path>`` is the location where GStreamer has been installed.

For example::

   $ gst-install --prefix=/home/ubuntu/gst-1.16.2 --version=1.16.2
   $ export LD_LIBRARY_PATH=/home/ubuntu/gst-1.16.2/lib/aarch64-linux-gnu
   $ export PATH=/home/ubuntu/gst-1.16.2/bin:$PATH

To build GStreamer manually
###########################

1. Download the GStreamer source packages, available from the
   `freedesktop.org GStreamer source directory <https://gstreamer.freedesktop.org/src/>`__,
   where each module has its own subdirectory.

   This procedure uses version 1.16.2 and requires the following files:

   - ``gstreamer-1.16.2.tar.xz``
   - ``gst-plugins-base-1.16.2.tar.xz``
   - ``gst-plugins-good-1.16.2.tar.xz``
   - ``gst-plugins-bad-1.16.2.tar.xz``
   - ``gst-plugins-ugly-1.16.2.tar.xz``

   .. todo::
      First, up to now the whole topic has concerned GStreamer 1.0 and 1.14; now we're giving instructions for building GStreamer version 1.16.2, which (although we do not explicitly say so) appears to be the current version. There's no explanation of the differences between 1.0/1.14 and 1.16.2. Nor are there instructions for making GStreamer applications work with v1.16.2 or cautions about things you can't do or must do when using it.

      It seems logical that if the reader is going to bother building GStreamer themselves, they should get something they didn't already have, and the current release is a logical thing for them to get. But it also seems logical that if there are no compatibility issues worth mentioning, we might as well give them a package containing v1.16.2 and save them the trouble.

      Second, according to the `GStreamer Releases page <https://gstreamer.freedesktop.org/releases/>`__, the latest stable release is 1.18.4, not 1.16.2.

2. To install required packages, enter the command::

      $ sudo apt-get install build-essential dpkg-dev flex bison \
            autotools-dev automake liborc-dev autopoint libtool \
            gtk-doc-tools libgstreamer1.0-dev

3. In the home (``~``) directory, create a subdirectory named ``gst_<version>``, where ``<version>`` is the version number of GStreamer you are building.

4. Copy the downloaded ``.tar.xz`` files to the ``gst_<version>`` directory.

5. Uncompress the ``.tar.xz`` files in the ``gst_<version>`` directory.

6. Set the environment variable ``PKG_CONFIG_PATH`` by entering the command::

      $ export PKG_CONFIG_PATH=/home/ubuntu/gst_1.16.2/out/lib/pkgconfig

7. Build GStreamer (in this example, ``gstreamer-1.16.2``) by entering the commands::

      $ ./configure --prefix=/home/ubuntu/gst_1.16.2/out
      $ make
      $ make install

8. Build ``gst-plugins-base-1.16.2`` by entering the commands::

      $ sudo apt-get install libxv-dev libasound2-dev libtheora-dev \
            libogg-dev libvorbis-dev
      $ ./configure --prefix=/home/ubuntu/gst_1.16.2/out
      $ make
      $ make install

9. Build ``gst-plugins-good-1.16.2`` by entering the commands::

      $ sudo apt-get install libbz2-dev libv4l-dev libvpx-dev \
            libjack-jackd2-dev libsoup2.4-dev libpulse-dev
      $ ./configure --prefix=/home/ubuntu/gst_1.16.2/out
      $ make
      $ make install

10. Build ``gst-plugins-bad-1.16.2`` by entering the commands::

       $ sudo apt-get install faad libfaad-dev libfaac-dev
       $ ./configure --prefix=/home/ubuntu/gst_1.16.2/out
       $ make
       $ make install

11. Build ``gst-plugins-ugly-1.16.2`` by entering the commands::

       $ sudo apt-get install libx264-dev libmad0-dev
       $ ./configure --prefix=/home/ubuntu/gst_1.16.2/out
       $ make
       $ make install

12. Set the environment variable ``LD_LIBRARY_PATH`` by entering the command::

       $ export LD_LIBRARY_PATH=/home/ubuntu/gst_1.16.2/out/lib/

13. Copy the NVIDIA-provided ``gstreamer-1.0`` plugin libraries to the ``gst_1.16.2`` plugin directory by entering the commands::

       $ cd /usr/lib/aarch64-linux-gnu/gstreamer-1.0/
       $ cp libgstnv\* libgstomx.so ~/gst_1.16.2/out/lib/gstreamer-1.0/

    The NVIDIA-provided ``gstreamer-1.0`` plugin libraries include:

    - ``libgstnvarguscamera.so``
    - ``libgstnvv4l2camerasrc.so``
    - ``libgstnvcompositor.so``
    - ``libgstnvdrmvideosink.so``
    - ``libgstnveglglessink.so``
    - ``libgstnveglstreamsrc.so``
    - ``libgstnvegltransform.so``
    - ``libgstnvivafilter.so``
    - ``libgstnvjpeg.so``
    - ``libgstnvtee.so``
    - ``libgstnvvidconv.so``
    - ``libgstnvvideo4linux2.so``
    - ``libgstnvvideocuda.so``
    - ``libgstnvvideosink.so``
    - ``libgstnvvideosinks.so``
    - ``libgstomx.so``

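The directory preparation and environment setup from the steps above can be sketched as a single script. The paths and version 1.16.2 are illustrative, and the archives must already be downloaded into the current directory:

```shell
# Illustrative sketch of steps 3-6 and 12 (hypothetical paths, version 1.16.2).
GST_VERSION=1.16.2
BUILD_DIR="$HOME/gst_${GST_VERSION}"

# Steps 3-5: create the build directory and unpack any downloaded archives.
mkdir -p "$BUILD_DIR"
for module in gstreamer gst-plugins-base gst-plugins-good \
              gst-plugins-bad gst-plugins-ugly; do
    tarball="${module}-${GST_VERSION}.tar.xz"
    if [ -f "$tarball" ]; then
        cp "$tarball" "$BUILD_DIR/"
        tar -xf "$BUILD_DIR/$tarball" -C "$BUILD_DIR"
    fi
done

# Steps 6 and 12: point pkg-config and the dynamic linker at the build prefix.
export PKG_CONFIG_PATH="$BUILD_DIR/out/lib/pkgconfig"
export LD_LIBRARY_PATH="$BUILD_DIR/out/lib/"
```

Each module is then configured with ``--prefix=$BUILD_DIR/out`` and built with ``make`` and ``make install``, as shown in the numbered steps.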
.. _SD.Multimedia.AcceleratedGstreamer-Nvgstcapture10Reference:

nvgstcapture-1.0 Reference
@@@@@@@@@@@@@@@@@@@@@@@@@@

This section describes the ``nvgstcapture-1.0`` application.

.. note::
   By default, ``nvgstcapture-1.0`` supports only the ARGUS API, using the ``nvarguscamerasrc`` plugin. The legacy ``nvcamerasrc`` plugin is no longer supported.

Command Line Options
####################

To display command usage information, run ``nvgstcapture-1.0`` with one of these command line options:

- ``-h`` or ``--help``: Shows command line options except for GStreamer options.
- ``--help-all``: Shows all command line options.
- ``--help-gst``: Shows GStreamer command line options.

The following table describes the application’s other command-line options:

.. raw:: html
   :file: AcceleratedGstreamer/Nvgstcapture10CommandLineOptions.htm

CSI Camera Supported Resolutions
################################

The CSI camera supports the following image resolutions with ``Nvarguscamera``:

-  640\ |times|\ 480
-  1280\ |times|\ 720
-  1920\ |times|\ 1080
-  2104\ |times|\ 1560
-  2592\ |times|\ 1944
-  2616\ |times|\ 1472
-  3840\ |times|\ 2160
-  3896\ |times|\ 2192
-  4208\ |times|\ 3120
-  5632\ |times|\ 3168
-  5632\ |times|\ 4224

CSI Camera Runtime Commands
###########################

The following table describes the runtime commands that ``nvgstcapture-1.0`` supports for ``Nvarguscamera``.

+--------------------------------------------------------------------------+
| Nvarguscamera runtime commands                                           |
+-------------+--------------------------+---------------------------------+
| Command     | Description              | Value and examples              |
+=============+==========================+=================================+
| h           | Help.                    | |mdash|                         |
+-------------+--------------------------+---------------------------------+
| q           | Quit.                    | |mdash|                         |
+-------------+--------------------------+---------------------------------+
| mo:<value>  | Set capture mode.        | 1: image                        |
|             |                          |                                 |
|             |                          | 2: video                        |
+-------------+--------------------------+---------------------------------+
| gmo         | Get capture mode.        | |mdash|                         |
+-------------+--------------------------+---------------------------------+
| so:<val>    | Set sensor orientation.  | 0: none                         |
|             |                          |                                 |
|             |                          | 1: rotate counter-clockwise     |
|             |                          | 90\ |deg|                       |
|             |                          |                                 |
|             |                          | 2: rotate 180\ |deg|            |
|             |                          |                                 |
|             |                          | 3: rotate clockwise 90\ |deg|   |
+-------------+--------------------------+---------------------------------+
| gso         | Get sensor orientation.  | |mdash|                         |
+-------------+--------------------------+---------------------------------+
| wb:<value>  | Set white balance mode.  | 0: off                          |
|             |                          |                                 |
|             |                          | 1: auto                         |
|             |                          |                                 |
|             |                          | 2: incandescent                 |
|             |                          |                                 |
|             |                          | 3: fluorescent                  |
|             |                          |                                 |
|             |                          | 4: warm-fluorescent             |
|             |                          |                                 |
|             |                          | 5: daylight                     |
|             |                          |                                 |
|             |                          | 6: cloudy-daylight              |
|             |                          |                                 |
|             |                          | 7: twilight                     |
|             |                          |                                 |
|             |                          | 8: shade                        |
|             |                          |                                 |
|             |                          | 9: manual                       |
+-------------+--------------------------+---------------------------------+
| gwb         | Get white balance mode.  | |mdash|                         |
+-------------+--------------------------+---------------------------------+
| st:<value>  | Set saturation.          | 0-2                             |
|             |                          |                                 |
|             |                          | Example: ``st:1.25``            |
+-------------+--------------------------+---------------------------------+
| gst         | Get saturation.          | |mdash|                         |
+-------------+--------------------------+---------------------------------+
| j           | Capture one image.       | |mdash|                         |
+-------------+--------------------------+---------------------------------+
| jx<ms>      | Capture after a delay of | Example: ``jx5000`` for a       |
|             | ``<ms>`` milliseconds.   | 5\ |nbsp|\ second delay.        |
+-------------+--------------------------+---------------------------------+
| j:<n>       | Capture ``<n>`` images   | Example: ``j:6`` to capture     |
|             | in succession.           | 6\ |nbsp|\ images.              |
+-------------+--------------------------+---------------------------------+
| 0           | Stop recording video.    | |mdash|                         |
+-------------+--------------------------+---------------------------------+
| 1           | Start recording video.   | |mdash|                         |
+-------------+--------------------------+---------------------------------+
| 2           | Video snapshot (while    | |mdash|                         |
|             | recording video).        |                                 |
+-------------+--------------------------+---------------------------------+
| gpcr        | Get preview resolution.  | |mdash|                         |
+-------------+--------------------------+---------------------------------+
| gicr        | Get image capture        | |mdash|                         |
|             | resolution.              |                                 |
+-------------+--------------------------+---------------------------------+
| gvcr        | Get video capture        | |mdash|                         |
|             | resolution.              |                                 |
+-------------+--------------------------+---------------------------------+

USB Camera Runtime Commands
###########################

The following table describes USB camera runtime commands.

+------------------------------------------------------------------------------+
| USB camera runtime commands                                                  |
+-------------+-----------------------------------+----------------------------+
| Command     | Description                       | Value and examples         |
+=============+===================================+============================+
| h           | Help.                             | |mdash|                    |
+-------------+-----------------------------------+----------------------------+
| q           | Quit.                             | |mdash|                    |
+-------------+-----------------------------------+----------------------------+
| mo:<value>  | Set capture mode.                 | 1: image                   |
|             |                                   |                            |
|             |                                   | 2: video                   |
+-------------+-----------------------------------+----------------------------+
| gmo         | Get capture mode.                 | |mdash|                    |
+-------------+-----------------------------------+----------------------------+
| j           | Capture one image.                | |mdash|                    |
+-------------+-----------------------------------+----------------------------+
| jx<ms>      | Capture after a delay of          | |mdash|                    |
|             | ``<ms>`` milliseconds.            |                            |
|             |                                   | Example: ``jx5000`` to     |
|             |                                   | capture after a            |
|             |                                   | 5000\ |nbsp|\ millisecond  |
|             |                                   | (5\ |nbsp|\ second) delay. |
+-------------+-----------------------------------+----------------------------+
| j:<n>       | Capture ``<n>`` images in         | |mdash|                    |
|             | succession.                       |                            |
|             |                                   | Example: ``j:6`` to capture|
|             |                                   | 6\ |nbsp|\ images.         |
+-------------+-----------------------------------+----------------------------+
| 1           | Start recording video.            | |mdash|                    |
+-------------+-----------------------------------+----------------------------+
| 0           | Stop recording video.             | |mdash|                    |
+-------------+-----------------------------------+----------------------------+
| pcr:<value> | Set preview resolution.           | 0: 176\ |times|\ 144       |
|             |                                   |                            |
|             |                                   | 1: 320\ |times|\ 240       |
|             |                                   |                            |
|             |                                   | 2: 640\ |times|\ 480       |
|             |                                   |                            |
|             |                                   | 3: 1280\ |times|\ 720      |
+-------------+-----------------------------------+----------------------------+
| gpcr        | Get preview resolution.           | |mdash|                    |
+-------------+-----------------------------------+----------------------------+
| gicr        | Get image capture resolution.     | |mdash|                    |
+-------------+-----------------------------------+----------------------------+
| gvcr        | Get video capture resolution.     | |mdash|                    |
+-------------+-----------------------------------+----------------------------+
| br:<value>  | Set encoding bit rate in bytes.   | Example: ``br:4000000``    |
+-------------+-----------------------------------+----------------------------+
| gbr         | Get encoding bit rate.            | |mdash|                    |
+-------------+-----------------------------------+----------------------------+
| cdn:<value> | Set capture device node.          | 0: ``/dev/video0``         |
|             |                                   |                            |
|             |                                   | 1: ``/dev/video1``         |
|             |                                   |                            |
|             |                                   | 2: ``/dev/video2``         |
+-------------+-----------------------------------+----------------------------+
| gcdn        | Get capture device node.          | |mdash|                    |
+-------------+-----------------------------------+----------------------------+

Runtime Video Encoder Configuration Options
###########################################

The following table describes runtime video encoder configuration
options supported for ``Nvarguscamera``.

+------------------------------------------------------------------------+
| Runtime video encoder options                                          |
+-------------+--------------------------------+-------------------------+
| Command     | Description                    | Value and examples      |
+=============+================================+=========================+
| br:<val>    | Sets encoding bit-rate in      | Example: ``br:4000000`` |
|             | bytes.                         |                         |
+-------------+--------------------------------+-------------------------+
| gbr         | Gets encoding bit-rate in      | |mdash|                 |
|             | bytes.                         |                         |
+-------------+--------------------------------+-------------------------+
| ep:<val>    | Sets encoding profile (for     | 0: baseline             |
|             | H.264 only).                   |                         |
|             |                                | 1: main                 |
|             |                                |                         |
|             |                                | 2: high                 |
|             |                                |                         |
|             |                                | Example: ``ep:1``       |
+-------------+--------------------------------+-------------------------+
| gep         | Gets encoding profile (for     | |mdash|                 |
|             | H.264 only).                   |                         |
+-------------+--------------------------------+-------------------------+
| f           | Forces IDR frame on video      | |mdash|                 |
|             | encoder (for H.264 only).      |                         |
+-------------+--------------------------------+-------------------------+

Notes
#####

- ``nvgstcapture-1.0`` generates image and video output files in the same directory as the application itself.

- Filenames use the following formats:

  - Image content: ``nvcamtest_<pid>_<sensor_id>_<counter>.jpg``
  - Video content: ``nvcamtest_<pid>_<sensor_id>_<counter>.mp4``

  Where:

  - ``<pid>`` is the process ID.
  - ``<sensor_id>`` is the sensor ID.
  - ``<counter>`` is a counter starting from 0 each time the application is run.

- Rename or move files between runs to avoid overwriting results you want to save.

- The application supports native capture mode (video only) by default.

- Advanced features, such as setting zoom, brightness, exposure, and whitebalance levels, are not supported for USB cameras.
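The filename formats above can be illustrated with a short sketch. The PID, sensor ID, and counter values below are hypothetical; ``nvgstcapture-1.0`` fills them in at run time:

```shell
# Hypothetical values standing in for the ones nvgstcapture-1.0 generates.
pid=1234
sensor_id=0
counter=0

echo "nvcamtest_${pid}_${sensor_id}_${counter}.jpg"   # prints nvcamtest_1234_0_0.jpg
echo "nvcamtest_${pid}_${sensor_id}_${counter}.mp4"   # prints nvcamtest_1234_0_0.mp4
```
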

nvgstplayer-1.0 Reference
@@@@@@@@@@@@@@@@@@@@@@@@@

This section describes the operation of the ``nvgstplayer-1.0`` application.

nvgstplayer-1.0 Command Line Options
####################################

.. note::
   To list supported options, enter the command::

      $ nvgstplayer-1.0 --help

This table describes ``nvgstplayer-1.0`` command line options.

.. raw:: html
   :file: AcceleratedGstreamer/Nvgstplayer10CommandLineOptions.htm

nvgstplayer-1.0 Runtime Commands
################################

This table describes ``nvgstplayer-1.0`` runtime commands.

.. raw:: html
   :file: AcceleratedGstreamer/Nvgstplayer10RuntimeCommands.htm

Video Encoder Features
@@@@@@@@@@@@@@@@@@@@@@

The respective GStreamer-1.0-based ``gst-omx`` video encoders support the following features:

.. raw:: html
   :file: AcceleratedGstreamer/VideoEncoderFeatures~GstOmx.htm

The respective GStreamer-1.0-based ``gst-v4l2`` video encoders support the following features:

.. raw:: html
   :file: AcceleratedGstreamer/VideoEncoderFeatures~GstV4l2.htm

Supported Cameras
@@@@@@@@@@@@@@@@@

This section describes the supported cameras.

CSI Cameras
###########

- Jetson AGX Xavier series can capture camera images via the CSI interface.

  .. todo:: Presumably also Xavier NX? Orin?

- Jetson AGX Xavier series supports both YUV and RAW Bayer capture data.

  .. todo:: Presumably also Xavier NX? Orin?

- GStreamer supports simultaneous capture from multiple CSI cameras. Support is validated using the ``nvgstcapture`` application.

- Capture is validated for SDR, PWL HDR, and DOL HDR modes for various sensors using the ``nvgstcapture`` application.

- Jetson AGX Xavier series also supports the MIPI CSI virtual channel feature. The virtual channel is a unique channel identifier used for multiplexed sensor streams sharing the same CSI port/brick and CSI stream through supported GMSL (Gigabit Multimedia Serial Link) aggregators.

- GMSL + VC capture is validated on Jetson AGX Xavier series using the ``nvgstcapture`` application. The reference GMSL module (MAX9295 serializer/MAX9296 deserializer/IMX390 sensor) is used for validation purposes.

USB 2.0 Cameras
###############

The following camera has been validated on Jetson platforms running Jetson Linux with USB 2.0 ports. This camera is UVC compliant.

- Logitech C920

Industrial Camera Details
#########################

The following USB 3.0 industrial camera is validated on Jetson AGX Xavier series under Jetson Linux:

- `See3CAM_CU130 <https://www.e-consystems.com/UltraHD-USB-Camera.asp>`__

Characteristics of this camera are:

-  USB 3.0
-  UVC compliant
-  3840\ |times|\ 2160 at 30 FPS; 4224\ |times|\ 3156 at 13 FPS
-  Purpose\ |mdash|\ embedded navigation
-  Tested using the ``nvgstcapture`` application.
-  Issues encountered:

   -  FPS cannot be fixed; it changes based on exposure.
   -  FPS cannot be changed; the vendor requires payment to add this support to their firmware.
