Ever generated a large GStreamer graph?
Turn it into a convenient, zoomable map with OpenLayers.
The gst-launch command from the previous article:
gst-launch-1.0 ximagesrc xid=0x200004c ! video/x-raw,framerate=30/1 ! videoconvert ! vaapih264enc quality-level=1 ! h264parse config-interval=10 ! queue ! mux. pulsesrc device=alsa_output.pci-0000_00_1f.3.analog-stereo.monitor ! "audio/x-raw,rate=16000,channels=2,depth=16" ! audioconvert ! avenc_aac ! aacparse ! queue ! mux. mpegtsmux name=mux ! rtpmp2tpay ! udpsink host=126.96.36.199 port=5000
You can dump a graph of the pipeline by adding
GST_DEBUG_DUMP_DOT_DIR=/tmp in front of the command.
See: gstreamer debug docs
Reminder: the dot graph is converted into a PNG image with this command:
dot -Tpng 0.00.02.419736943-gst-launch.PLAYING_PAUSED.dot > pipeline.png
Because Terminals are cool~
There were simpler times on the internet when everything was text-based.
A time when cathode-ray tubes displayed 80-by-24 grids of glowing, colorful characters.
Don’t worry, Python comes to the rescue. It has a module to generate the ASCII art without the pain:
Rich — https://github.com/willmcgugan/rich
Learn more about Rich’s capabilities in this article or this one.
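To give a feel for what Rich automates, here is a rough stdlib-only sketch of the kind of Unicode box-drawing it handles for you (the `panel` function is a hypothetical toy, not part of Rich):

```python
# Minimal stdlib sketch of the box-drawing Rich does for you.
def panel(text: str) -> str:
    lines = text.splitlines()
    width = max(len(line) for line in lines)
    top = "╭" + "─" * (width + 2) + "╮"
    bottom = "╰" + "─" * (width + 2) + "╯"
    body = ["│ " + line.ljust(width) + " │" for line in lines]
    return "\n".join([top, *body, bottom])

print(panel("GStreamer pipeline"))
```

Rich does this, plus colors, tables, progress bars and more, with a one-line API.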
Tips for sharing a movie over the internet and watching it together with someone else.
Do you miss watching a movie with friends or family? Isn’t it great when you can pause a movie and discuss it?
If you try to share your desktop, running a movie player, inside your favorite video meeting solution, the result is probably not going to be great: low framerate and often bad sound. Screen sharing is designed for business presentations, not video.
This article shows how to stream your desktop movie player directly. It even works directly from a DVD. …
Using simple OpenGL 3D view rotation
Do you remember inastitch, the video stitcher for live stream?
Quick summary: inastitch merges three live video streams into a single wider stream.
In this previous project, homography of the input streams was simply achieved in 2D by warping pixels using an OpenGL pixel shader. This article introduces homography in 3D with a vertex shader. It becomes very easy to rotate the view.
The first evolution from the previous demo is to replace the convex camera setup with a concave camera setup.
Delays as low as 100 µs can be reached with a stock Raspberry Pi and no custom hardware.
Example use cases for synchronized video streams are:
Let’s see an early stitching test with free running cameras.
Each camera is capturing at 90 fps, hence there are about 11 ms between consecutive frames in the stream.
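The arithmetic behind those numbers, as a quick sketch (the full-period misalignment bound is an assumption about free-running cameras, not a measured value):

```python
# Frame period at 90 fps.
fps = 90
period_ms = 1000 / fps
print(round(period_ms, 2))  # 11.11 ms between consecutive frames

# Two free-running cameras can be offset by anything up to a full period;
# synchronization brings that down to roughly 100 microseconds.
worst_case_offset_us = period_ms * 1000
sync_offset_us = 100
print(round(worst_case_offset_us / sync_offset_us))  # ~111x improvement
```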
Scanning the streets might not need an expensive setup.
Do you remember the 8-camera video stitcher project?
Eight Raspberry Pis, each with its own camera, are connected on a local network and synchronized to capture 360° panoramic images.
What if you let it grow taller and attach a bike to it?
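As a back-of-the-envelope sketch of what eight cameras imply: each one must cover its slice of the circle plus some overlap so the stitcher can match features across seams (the overlap value below is an assumption, not the project’s actual figure):

```python
# Minimum horizontal field of view per camera in an 8-camera 360-degree rig.
num_cameras = 8
slice_deg = 360 / num_cameras   # 45 degrees each with zero overlap
overlap_deg = 15                # assumed total seam overlap per camera
min_hfov_deg = slice_deg + overlap_deg
print(min_hfov_deg)  # 60.0
```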
Stitching with OpenCV is all nice, but what if you wanted to do it all in hardware on GPU with vanilla OpenGL?
OpenCV comes with an advanced sample implementation which produces great results on still images. However, running this program on every single frame of a video stream is, unsurprisingly, extremely slow. This is the solution that was used for 360° video with Raspberry Pi(s).
This article follows the one mentioned above, in which video frames were transformed on the CPU with OpenCV, and introduces a full GPU pipeline.
It should be noted that OpenCV has GPU support for many operations, but…
PTP on Raspberry Pi: it’s apt install linuxptp and done, right?
Maybe not. Here are some tips to run it better than the default settings.
This article is part of a series about inastitch.
In this project, clock synchronization is critical for frame capture alignment.
It all started with this git repo of patches to make PTP work on Raspberry Pi 3: https://github.com/twteamware/raspberrypi-ptp. Those patches are now integrated into the official Raspberry Pi OS release.
performance.md is the most interesting. …
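As background for the tuning tips, here is a toy sketch of the PI servo idea behind ptp4l’s default clock_servo: measured offsets from the master are fed into a proportional-integral controller that steers the local clock frequency. The gains and offset values are made up for illustration, not linuxptp’s defaults:

```python
# Toy PI clock servo: steer the local clock frequency from measured offsets.
kp, ki = 0.7, 0.3                        # hypothetical gains
integral = 0.0
offsets_ns = [1000, 600, 300, 120, 40]   # hypothetical master-to-local offsets
for offset in offsets_ns:
    integral += offset
    freq_adj_ppb = -(kp * offset + ki * integral)  # slow a fast clock down
print(freq_adj_ppb)  # negative: the local clock is running ahead
```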
You have a cool project idea and want to make a prototype more compatible with the automotive world? Here are some design choices for you…
Of course, some old-school hard real-time C code running bare-metal on a microcontroller will always rekindle the hearts of automotive engineers, but for quick prototyping it is definitely not the most convenient target. A Linux-based target, on the other hand, is unfortunately not real-time, but lets you leverage many open-source libraries.
This article uses inastitch as a sample project to explain how each library was used in a practical example.
Inastitch is a prototype…
What would it take to shoot your own 360-degree video footage?
The answer: a bunch of cameras and some open source software.
First, we need to talk about image stitching. Unsurprisingly, OpenCV has a great sample implementation for this, which often gives impressively seamless results. The thing is, we want to produce video, not still images, so next we need to talk about camera synchronization on the Raspberry Pi, in order to capture frames at the same time. And finally: how do we make a 360-degree video with all of this?