Here are a few of the key features and concepts of the Camera Module. Information about how to connect and configure a camera can be found in the Installation guide chapter. We recommend using the open-source GStreamer multimedia framework to handle camera streams through pipelines.

GStreamer Pipelines

GStreamer is pre-installed on the displays and is a handy way to create pipelines that enable advanced video display and manipulation. You can use GStreamer either as a quick way to test a camera stream or as the “video engine” in your own QML application.

flowchart LR
    A[source] --> B[decoder]
    B --> C[filter]
    C --> D[manipulator]
    D --> E[...]
    E --> F[sink]

Figure 2. The principles of a GStreamer pipeline.

The main idea of a GStreamer pipeline is that it has a source, for example a camera or a video file, and a sink, which is the output of the pipeline, normally a video display component. Each part of the pipeline is called an element. Between the source and the sink of the pipeline, there are normally several elements that decode, filter, convert, format or manipulate the video image. Each element also has a source and a sink, so you can see the pipeline as a chain of elements connected via their respective sources and sinks. Each sink must be “compatible” with the source it connects to.
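As a small illustration of this chain, the sketch below composes a pipeline description from individual elements and prints it. The elements used here (videotestsrc, videoconvert, autovideosink) are standard GStreamer elements; the variable names are ours.

```shell
# Sketch: a pipeline is a chain of elements joined by "!".
# Each element's source pad feeds the next element's sink pad.
SOURCE="videotestsrc"     # produces raw test video frames
FILTER="videoconvert"     # converts between pixel formats
SINK="autovideosink"      # lets GStreamer pick a working display sink

PIPELINE="$SOURCE ! $FILTER ! $SINK"
echo "$PIPELINE"
# On a display, run the composed pipeline with:
#   gst-launch-1.0 $PIPELINE
```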

Below are a few example pipelines that show how to display the video stream from an IP camera. These pipelines are executed directly in the Linux shell, and the elements at the end of the pipelines differ depending on the hardware. We use a camera that streams on port 50004 as the source.

i.MX6 (VS):

gst-launch-1.0 udpsrc port=50004 caps="application/x-rtp,encoding-name=JPEG,payload=26" ! rtpjpegdepay ! jpegparse ! imxvpudec ! autovideosink sync=false

i.MX6 (VI):

gst-launch-1.0 udpsrc port=50004 caps="application/x-rtp,encoding-name=JPEG,payload=26" ! rtpjpegdepay ! jpegparse ! jpegdec ! autovideosink sync=false

i.MX8 (V700, V1000, V1200):

gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! decodebin ! videoconvert ! kmssink sync=false


gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! queue ! videoconvert ! vaapisink display=5 fullscreen=true
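Since the available sinks differ between hardware platforms, it can help to probe for an element before building a pipeline around it. The sketch below uses `gst-inspect-1.0`, the standard GStreamer introspection tool; the fallback to autovideosink is our assumption, not a platform requirement.

```shell
# Pick a video sink that actually exists on this unit (sketch).
# gst-inspect-1.0 exits non-zero (or is absent) for unknown elements.
has_element() {
    gst-inspect-1.0 "$1" >/dev/null 2>&1
}

if has_element kmssink; then
    SINK="kmssink"
else
    SINK="autovideosink"    # safe default: GStreamer probes for a sink
fi
echo "$SINK"
```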

You can also create a test stream that simulates a camera for testing the pipeline on the display. From the Virtual Machine or another display, use one of the following pipelines that will display a video test source or a video file replay on the target display:

gst-launch-1.0 videotestsrc ! jpegenc ! rtpjpegpay ! udpsink host='IP' port=50004
gst-launch-1.0 filesrc location=myvideo.mp4 ! decodebin ! jpegenc ! rtpjpegpay ! udpsink host='IP' port=50004

where ‘IP’ is the IP address of the receiving display.
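A convenient way to handle the address is to keep it in a variable and substitute it into the pipeline string, as sketched below. The address 192.168.1.100 is a placeholder; use the actual IP address of your display.

```shell
# Compose the sender pipeline with the receiver's address filled in.
TARGET_IP="192.168.1.100"   # placeholder: IP address of the receiving display
PORT=50004

PIPELINE="videotestsrc ! jpegenc ! rtpjpegpay ! udpsink host=$TARGET_IP port=$PORT"
echo "$PIPELINE"
# Start streaming with:
#   gst-launch-1.0 $PIPELINE
```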

💡 Since processing high-resolution images is resource-intensive, it is a good idea to adapt the resolution of the camera stream to the size of the area where it will be displayed.
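One way to do this on the receiving side is to insert a videoscale element followed by a caps filter that fixes the output size, as in the sketch below. The 640x480 target is an arbitrary example; pick the size of your display area.

```shell
# Downscale the decoded stream before display (sketch; 640x480 is arbitrary).
# videoscale resizes raw video; the caps filter fixes the output resolution.
SCALE="videoscale ! video/x-raw,width=640,height=480"

PIPELINE="udpsrc port=50004 caps=\"application/x-rtp,encoding-name=JPEG,payload=26\" \
 ! rtpjpegdepay ! jpegdec ! $SCALE ! videoconvert ! autovideosink sync=false"
echo "$PIPELINE"
```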

Wayland vs. EGLFS

Wayland is a display server protocol that makes it possible to have several applications running simultaneously. With a Wayland compositor, you can for example run a camera display application on half the screen and a machine control system on the other half.

EGLFS is the default “basic” platform plugin that can only manage one full-screen application at a time.

The CCpilot displays are equipped with Weston, the reference implementation of a Wayland compositor. If you run GStreamer pipelines with the CLI gst-launch-1.0, you need to adapt the video sink to the platform you are currently running, or try autovideosink to let GStreamer pick the most suitable sink.
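That adaptation can be scripted: a Wayland compositor such as Weston exports WAYLAND_DISPLAY for its clients, so a script can pick waylandsink when that variable is set. The fallback below is our assumption; autovideosink is always a safe default.

```shell
# Choose a video sink based on whether a Wayland compositor is running.
# Weston sets WAYLAND_DISPLAY for its clients; without it, fall back to
# autovideosink and let GStreamer probe for a working sink.
if [ -n "$WAYLAND_DISPLAY" ]; then
    SINK="waylandsink"
else
    SINK="autovideosink"
fi
echo "Using video sink: $SINK"
```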

Weston can be started and stopped by the following commands:


/etc/init.d/weston stop
/etc/init.d/weston start

systemctl stop weston
systemctl start weston
systemctl disable weston    # disables autostart of Weston at bootup

GStreamer in QML Applications

A GStreamer-operated camera view within a QML application can be achieved in several ways. CrossControl currently supports two techniques: using the GStreamer SDK with the QmlGlSink element, or using the QtMultimedia module of Qt.


One advantage of using the QmlGlSink element is that it has better support for hardware acceleration. The disadvantage is that you have to build up the pipeline element by element in code, connect the video sink to its QML viewer, and handle the threading of the pipeline.

You can use the Eth Camera QML Example to get an example of how to display a camera stream on a display using QmlGlSink and Qt. Please contact our support to request the source code for the example.

⚠️ Tests have shown that QmlGlSink is not fully functional when running on the Wayland platform. The application execution stalls if there is no active video stream constantly running.

QtMultimedia and MediaPlayer

An alternative to QmlGlSink is to use the MediaPlayer component from the QtMultimedia module. MediaPlayer is easy to use, and the pipeline can be defined in QML as a single string with the elements separated by “!” in the usual way. MediaPlayer also has built-in functions for starting, stopping and auto-replaying the pipeline.

A minimal example of how to use MediaPlayer may look as follows:

import QtQuick 2.12
import QtMultimedia 5.12

Item {
    anchors.fill: parent

    MediaPlayer {
        id: mediaplayer
        // The "gst-pipeline:" scheme passes a custom GStreamer pipeline
        // to MediaPlayer; qtvideosink hands the frames to VideoOutput.
        source: "gst-pipeline: udpsrc port=50004 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! qtvideosink"
        Component.onCompleted: mediaplayer.play()
    }

    VideoOutput {
        anchors.fill: parent
        source: mediaplayer
    }
}
See the QtMultimedia QML example project for a complete example of how a camera stream can be displayed in QML using QtMultimedia and MediaPlayer. You can request the source code for it by contacting our support.

💡 It can sometimes be hard to get the pipeline to work the way you want. When running a QML application, you can set the environment variable GST_DEBUG to a value between 1 and 9. This enables increasingly detailed debug information to be written to the application's output. Setting the variable to 0 disables the debug information.
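In practice the variable is set before starting the application, as sketched below. The application name is a placeholder; the per-category syntax is standard GStreamer behavior.

```shell
# Enable GStreamer debug output before starting the application.
# 0 disables output; higher values (up to 9) are progressively more verbose.
export GST_DEBUG=3
# ./my-camera-app                          # placeholder application name
# A single element category can also be raised on its own, e.g.:
#   GST_DEBUG=rtpjpegdepay:6 ./my-camera-app
```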

Eth Camera Configurator

The eth-camera-configurator is a configuration tool that is included in the Camera Module. With this tool you can configure any camera that supports the ISO17215 standard. See the Installation guide page for more details.

Eth Camera Settings

The Eth Camera Settings is a library that you can use to embed camera configuration functionality into your own program, for example for initializing the camera with the correct settings before starting to receive a video stream. The eth-camera-configurator mentioned above also uses this API. See the IPCamera API Reference for a complete reference guide to the API of the library functions.


From version 3.5.17, CODESYS visualization is based on QML, so using a camera stream in a CODESYS application is similar to implementing it in a QML application.

See the CODESYS Camera Example Project for an example of how a camera stream can be displayed in a CODESYS application. You can request it from our support.

Read more:
Using the ETHCamerasettings library to control ETH Cameras
Using gstreamer to show MJPEG IP camera stream
Stream ETH-Video from Display to Display or VM to Display
Sending SubscribeROIVideo through the terminal