Features¶
Here are a few of the key features and concepts of the Camera Module. Information about how to connect and configure a camera can be found in the Installation guide chapter. We recommend using the GStreamer open source multimedia framework to handle camera streams through pipelines.
GStreamer Pipelines¶
GStreamer is pre-installed on the displays and is a handy way to create pipelines that enable advanced video display and manipulation. You can use GStreamer either as a quick way to test a camera stream or as the “video engine” integrated into your own QML application.
Figure 1. The principles of a GStreamer pipeline.
The main idea of a GStreamer pipeline is that it has a source, for example a camera or a video file, and a sink that is the output of the pipeline, normally a video display component. Each part of the pipeline is called an element. Between the source and the sink of the pipeline, there are normally several elements that decode, filter, convert, format or manipulate the video image. Each element also has a source and a sink, so you can see the pipeline as a chain of elements connected via their sources and sinks. Each sink must be “compatible” with the source it connects to.
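As a minimal illustration of the source, element and sink chain, the following generic pipeline (not specific to any particular display) connects a test video source to an automatically selected sink via a colorspace converter:

```shell
# source (videotestsrc) -> converter (videoconvert) -> sink (autovideosink)
gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink
```

Each `!` links one element's source pad to the next element's sink pad.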
The following table shows a few common elements, mainly for i.MX8, and their functions. Sometimes a hardware accelerated version is available.
Element | Description | Alternative element
---|---|---
v4l2h264dec, v4l2jpegdec, etc. | Video compression decoders | decodebin (auto-decoder)
autovideosink | Automatically detected video sink | kmssink (gst-launch-1.0), qml6glsink (Qt6 applications)
videoconvert | Colorspace converter | imxvideoconvert_g2d
compositor | Multiple video stream compositor | imxcompositor_g2d
Below are a few example pipelines that show how to display the video stream from an IP camera. These pipelines are executed directly in the Linux shell and the elements at the end of the pipelines differ depending on the hardware. We use a camera that streams on port 50004 as source.
i.MX6 (VS):
gst-launch-1.0 udpsrc port=50004 caps="application/x-rtp,encoding-name=JPEG,payload=26" ! rtpjpegdepay ! jpegparse ! imxvpudec ! autovideosink sync=false
i.MX6 (VI):
gst-launch-1.0 udpsrc port=50004 caps="application/x-rtp,encoding-name=JPEG,payload=26" ! rtpjpegdepay ! jpegparse ! jpegdec ! autovideosink sync=false
i.MX8 (V510, V700, V705, V710, V1000, V1200):
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! decodebin ! imxvideoconvert_g2d ! kmssink sync=false
Intel Atom (X1200):
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! vaapidecodebin ! queue ! kmssink sync=false
You can also create a test stream that simulates a camera for testing the pipeline on the display. From the Virtual Machine or another display, run one of the following pipelines to stream a video test source or a replayed video file to the target display:
gst-launch-1.0 videotestsrc ! jpegenc ! rtpjpegpay ! udpsink host='IP' port=50004
gst-launch-1.0 filesrc location=myvideo.mp4 ! decodebin ! jpegenc ! rtpjpegpay ! udpsink host='IP' port=50004
where ‘IP’ is the IP address of the receiving display.
💡 Since the processing of high resolution images is resource-intensive, it is a good idea to adapt the resolution of the camera stream to fit the size of the area where it is going to be displayed.
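For example, a stream can be scaled down on the sender side before encoding, using the standard videoscale element and a caps filter (a sketch based on the test pipeline above; the 400x240 size and port 50004 are just examples, and ‘IP’ is the address of the receiving display):

```shell
# scale the source to 400x240 before JPEG encoding to reduce the load on the receiver
gst-launch-1.0 videotestsrc ! videoscale ! video/x-raw,width=400,height=240 ! jpegenc ! rtpjpegpay ! udpsink host='IP' port=50004
```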
Wayland vs. EGLFS¶
Wayland is a display server protocol that makes it possible to have several applications running simultaneously. With a Wayland compositor, you can for example run a camera display application on one half of the screen and a machine control system on the other. You can read more about this in the Waylandsink section below.
EGLFS is the default “basic” platform that can only manage one application at a time in full screen.
The CCpilot displays are equipped with Weston, the reference implementation of a Wayland compositor. If you run GStreamer pipelines with the gst-launch-1.0 CLI, you need to adapt the video sink to the platform you are currently running on, or try autovideosink to let GStreamer pick the most suitable sink.
Weston can be started and stopped by the following commands:
i.MX6:
/etc/init.d/weston stop
/etc/init.d/weston start
i.MX8:
systemctl stop weston
systemctl start weston
systemctl disable weston (will disable autostart of Weston at bootup)
GStreamer in QML Applications¶
A GStreamer operated camera view within a QML application can be achieved in several ways. CrossControl currently supports two techniques: using the GStreamer SDK with the QmlGlSink element, or using the QtMultimedia package of Qt 5.x.
Qml6GlSink/QmlGlSink¶
An advantage of using the Qml6GlSink (Qt6) or QmlGlSink (Qt5) element is that it has better support for hardware acceleration and that the construction of the pipeline is fully in the hands of the developer. The disadvantage is that you have to construct the pipeline element by element in code, connect the video sink to its QML viewer and handle the execution of the pipeline.
A good middle ground is to use the gst_parse_launch function, which offers a string-based way to create the pipeline, just like with gst-launch-1.0, but still with the possibility to alter the pipeline from code later on.
You can use the Eth Camera QML Example to get a few examples of how to view a camera stream on a display using the sink and Qt.
⚠️ Tests have shown that QmlGlSink is not fully functional when running on the Wayland platform: application execution stalls unless an active video stream is constantly running. This has been solved with Qml6GlSink.
The QML sink (Qt6/Qt5) and the Eth Camera QML Example (Qt6) can be easily downloaded and installed from the Downloads page.
The QML example currently covers the following:
MJPEG camera stream
H264/H265 camera stream
RTSP camera stream
gst-parse-launch example
MP4 video clip playback
QMultimedia and MediaPlayer¶
⚠️ Important: QMultimedia is currently only available in Qt 5.x. With Qt 6.5.3, the Qml6GlSink is recommended for GStreamer integration in Qt applications.
An alternative to QmlGlSink is to use the MediaPlayer component from the QtMultimedia package. MediaPlayer is easy to use, and the pipeline can be defined in the QML code as a single string with the elements separated by “!” in the usual way. MediaPlayer also has built-in functions for starting, stopping and auto-replaying the pipeline.
A minimal example of how to use MediaPlayer may look as follows:
MediaPlayer {
    id: mediaplayer
    source: "gst-pipeline: udpsrc port=50004 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! qtvideosink"
    Component.onCompleted: {
        mediaplayer.play();
    }
}
VideoOutput {
    anchors.fill: parent
    source: mediaplayer
}
See the QtMultimedia QML example project for a complete example of how a camera stream can be displayed in QML using QtMultimedia and MediaPlayer. You can request the source code for it by contacting our support.
Waylandsink¶
The waylandsink is a GStreamer sink that renders the video stream into its own window. In combination with a Wayland compositor, for example the Window Manager, it is possible to create an advanced user interface with an embedded video view without the need to create a separate video viewer application.
Figure 2. Example of Waylandsink used in Window Manager combined with a gauge cluster application on top.
The waylandsink can be launched directly from the command line interface, for example:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! decodebin ! imxvideoconvert_g2d ! queue ! waylandsink sync=false
Displaying Multiple Camera Feeds¶
Several camera feeds can be combined into one output stream using GStreamer's imxcompositor_g2d compositor element.
Example:
gst-launch-1.0 imxcompositor_g2d name=comp \
sink_0::xpos=0 \
sink_0::ypos=0 \
sink_0::width=400 \
sink_0::height=240 \
sink_1::xpos=400 \
sink_1::ypos=0 \
sink_1::width=400 \
sink_1::height=240 \
sink_2::xpos=0 \
sink_2::ypos=240 \
sink_2::width=400 \
sink_2::height=240 \
sink_3::xpos=400 \
sink_3::ypos=240 \
sink_3::width=400 \
sink_3::height=240 \
! imxvideoconvert_g2d ! kmssink sync=false \
udpsrc port=50001 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegparse ! v4l2jpegdec ! comp.sink_0 \
udpsrc port=50002 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegparse ! v4l2jpegdec ! comp.sink_1 \
udpsrc port=50003 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegparse ! v4l2jpegdec ! comp.sink_2 \
udpsrc port=50004 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegparse ! v4l2jpegdec ! comp.sink_3
Figure 3. Example of the GStreamer compositor used to merge four camera inputs into one composite output.
This is a convenient solution in cases where there is no need for interactive control of the locations or dimensions of the camera views during runtime. More flexible solutions require individual handling of each camera stream in the application or to use several window contexts such as in the Window Manager.
QtWidgets¶
A QtWidgets example using GStreamer is available on request. Please contact our support.
Troubleshooting¶
💡 If the camera stream appears with green lines or other distortion, the reason might be a mismatch between standard and hardware accelerated elements in the pipeline.
This happens especially when using the decodebin element which, according to the GStreamer documentation, “auto-magically constructs a decoding pipeline using available decoders and demuxers via auto-plugging”. Adding an imxvideoconvert_g2d element may solve the problem.
💡 It can sometimes be hard to get the pipeline to work the way you want. When running a QML application, you can set the environment variable GST_DEBUG to a value between 1 and 9. This enables increasingly detailed debug information to be written to the output of the application. Setting the variable to 0 disables the debug information.
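The same variable also works for pipelines started from the shell. For example (debug level 3 is an arbitrary choice; the test source is used here so the command works without a camera):

```shell
# level 3 prints errors, warnings and fixme messages while the pipeline runs
GST_DEBUG=3 gst-launch-1.0 videotestsrc ! autovideosink
```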
💡 You can use the gst-inspect-1.0 command to get a list of all available GStreamer elements or help about a specific element.
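For example (the exact element set listed depends on the plugins installed on the target):

```shell
# list all installed elements whose name or description mentions jpeg
gst-inspect-1.0 | grep -i jpeg

# show the properties, pad capabilities and rank of a specific element
gst-inspect-1.0 jpegdec
```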
💡 An error about insufficient access rights to create the GStreamer pipeline, or a message that XDG_RUNTIME_DIR is not set, may occur when running a pipeline under Wayland or as a user with insufficient execution rights. The following export can be tested to resolve the problem:
export XDG_RUNTIME_DIR=/run/usr/994 (CCLinux 3.4)
or
export XDG_RUNTIME_DIR=/run/user/0 (other CCLinux versions)
Then try to execute the CLI or the application with elevated user rights, with environment preserved:
sudo -E gst-launch-1.0 ...
or
sudo -E ./<app>
Eth Camera Configurator¶
The eth-camera-configurator is a configuration tool that is included in the Camera Module. With this tool you can configure any camera that supports the ISO17215 standard. See the Installation guide page for more details.
Eth Camera Settings¶
The Eth Camera Settings is a library that you can use to embed camera configuration functionality into your own program, for example for initializing the camera with the correct settings before starting to receive a video stream. The eth-camera-configurator mentioned above also uses this API. See the IPCamera API Reference for a complete reference guide to the API of the library functions.
Known Issues¶
When making several calls to the library, there might be an application failure if a new call is made without waiting for the camera's response to the previous call.
CODESYS¶
From v3.5.17, CODESYS is based on QML, so using a camera stream in a CODESYS application is similar to implementing it in a QML application.
See the CameraView Guide for more information and an example of how a camera stream can be displayed in a CODESYS application.
Read more:
Using the ETHCamerasettings library to control ETH Cameras
Using gstreamer to show MJPEG IP camera stream
Stream ETH-Video from Display to Display or VM to Display
Sending SubscribeROIVideo through the terminal