
I've been trying to set up an HDMI pass-through pipeline from an HDMI input to an HDMI output on an OrangePi 5 Plus. I could talk for a long time (now) about the issues with vendor-supplied kernels and unsupported hardware. I was completely naive until I had the hardware in hand, having not done any embedded work.

Right now, the best plan I have is to run Weston, have a full-screen Qt application, and use DMA buffers so I can do some zero-copy processing. Rockchip has its own MPP and RGA libraries that are tied into the Mali GPU, and I'm not well-versed enough in the current level of driver/userspace support to avoid leveraging these libraries.

Rockchip and the ARM ecosystem are such a mess.

If anyone has any pointers, experience, approaches, code, etc, I would love to see it.



Not sure what kind of processing you need to do on the video stream, but have you considered giving `ffmpeg` a try if you just need plain pass-through from video input to output? `ffmpeg` might be built with support for the Mali libraries you mention on the OS you are using. If you are able to run `weston`, `ffmpeg` should be able to output directly to the DRM card through SDL2 (assuming it was built with it).

If the HDMI-USB capture card that outputs `mjpeg` exposes a `/dev/video` node, then it might be as simple as running:

`SDL_VIDEODRIVER=kmsdrm ffmpeg -f video4linux2 -input_format mjpeg -i /dev/video0 -f opengl "hdmi output"`

An alternative could be to get a Raspberry Pi 3, or even a 2, and a distro where `omxplayer` can still be installed. You can then use `omxplayer` to display your MJPEG stream on the output of your choice. Just make sure the `kms`/`fkms` dtoverlay is not loaded, because `omxplayer` works directly with DispmanX/the GPU, which contains a hardware MJPEG decoder, so for the most part bytes are sent directly to the GPU. (BTW, this is not compatible with the Pi 4 and above, much less the 5.)

Hope some of this info can be of help.


Looks helpful! I assume ffmpeg needs to be built with SDL for this to work? I couldn't get it to work with my current minimal compile, and I don't think the board I'm working on has SDL, so I might need to install that and recompile.


That's correct, `ffmpeg` needs to be built with SDL (SDL2, really, on all recent versions). When `ffmpeg` is built and the dev files for SDL2 are present, ffmpeg's build configuration picks them up automatically and links against the library unless instructed otherwise by a configure flag. When you run `ffmpeg`, the first lines usually show the configuration it was built with, which might hint at what's available, but if you want to confirm what it actually links against you can do a quick:

$ ldd `which ffmpeg`

And you should get the list of dynamic libraries your build is linked against. If SDL2 is indeed included, you should see a line starting with "libSDL2-2...".

If I remember correctly, you should be able to output to the framebuffer even if there is no SDL2 support; just change the previous command from `-f opengl "hdmi output"` to `-pix_fmt bgra -f fbdev /dev/fb0`.

You can also use any other framebuffer device if one is present and you'd prefer it (e.g. /dev/fb1, /dev/fb2). Also, you might need something different from `bgra` on your board, but `ffmpeg` will usually drop a hint as to what.


In general DRM/KMS can be quite confusing, as there seems to be little userland documentation available. I assume you get the DMA buffers from the HDMI input somehow? If so, you should be able to use drmModeAddFB2WithModifiers to create a DRM framebuffer from them. Then attach that to a DRM plane, place the plane on a CRTC, and schedule a page flip after modesetting a video mode.

The advantage is that you can run directly, without booting into any kind of visual environment first. But it's a huge mess to get going: I recently wrote quite a bit of Pi 4/5 code to get a zero-copy HEVC/H.264 decoder working, and it was quite a challenge. Maybe code like https://github.com/dvdhrm/docs/tree/master/drm-howto can help?
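To make the sequence above concrete (import buffer, AddFB2, plane, CRTC), here is a minimal, untested sketch against libdrm. It assumes you already have a dma-buf fd for the frame plus a discovered `plane_id`/`crtc_id` with a mode already set, and it hard-codes single-plane linear XRGB8888; a real decoder will likely hand you NV12 with a vendor modifier instead.

```c
#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>
#include <drm_fourcc.h>

/* Hypothetical helper: put one dma-buf on screen via a DRM plane. */
int show_dmabuf(int drm_fd, int dmabuf_fd, uint32_t plane_id,
                uint32_t crtc_id, uint32_t w, uint32_t h, uint32_t pitch)
{
    uint32_t handle, fb_id;

    /* 1. Turn the dma-buf fd into a GEM handle on this DRM device. */
    if (drmPrimeFDToHandle(drm_fd, dmabuf_fd, &handle))
        return -1;

    /* 2. Wrap the handle in a framebuffer object. */
    uint32_t handles[4]   = { handle };
    uint32_t pitches[4]   = { pitch };
    uint32_t offsets[4]   = { 0 };
    uint64_t modifiers[4] = { DRM_FORMAT_MOD_LINEAR };
    if (drmModeAddFB2WithModifiers(drm_fd, w, h, DRM_FORMAT_XRGB8888,
                                   handles, pitches, offsets, modifiers,
                                   &fb_id, DRM_MODE_FB_MODIFIERS))
        return -1;

    /* 3. Attach the framebuffer to a plane on the CRTC. Source
     *    coordinates are 16.16 fixed point, hence the << 16. */
    return drmModeSetPlane(drm_fd, plane_id, crtc_id, fb_id, 0,
                           0, 0, w, h,              /* dst rect on screen */
                           0, 0, w << 16, h << 16); /* src rect in buffer */
}
```

Connector/CRTC/plane discovery, atomic commits, and page flipping are omitted; the drm-howto linked above walks through those parts.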


The HDMI receive device on the OrangePi 5 Plus is in a semi-functional state. Collabora is in the process of upstreaming code so the RK3588 will work with the mainline Linux kernel.

Until that happens, working driver code is a moving target.

To get going and sidestep that problem, I've purchased HDMI-to-USB capture cards that use MacroSilicon chips. I have some thought of using a cheaper CPU in the future with a daughterboard based on this project, which also uses MacroSilicon chips: https://github.com/YuzukiHD/YuzukiLOHCC-PRO, so digging into them is potentially not a waste of time.

The MacroSilicon HDMI-to-USB capture cards output MJPEG, which Rockchip's MPP library has a decoder for.
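One nice property of MJPEG from these UVC cards is that each frame is a standalone JPEG image, so frames can be sanity-checked before being handed to the hardware decoder. A small, self-contained check (the function name is my own, not part of any library):

```c
#include <stddef.h>

/* An MJPEG frame must start with the JPEG SOI marker (FF D8) and end
 * with the EOI marker (FF D9). Returns 1 if the buffer looks like a
 * complete JPEG image, 0 otherwise. */
static int looks_like_jpeg(const unsigned char *buf, size_t len)
{
    if (len < 4)
        return 0;
    return buf[0] == 0xFF && buf[1] == 0xD8 &&           /* SOI */
           buf[len - 2] == 0xFF && buf[len - 1] == 0xD9; /* EOI */
}
```

Dropping truncated frames here (USB capture cards do produce them) is cheaper than letting the decoder choke on them.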

So the thought is: (1) allocate a DMA buffer, (2) set that DMA buffer as the MJPEG decoder's output target, (3) get the decoded data to the display (sounds like I may need to encode again?), plus a parallel processing pipeline.
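Steps (1)-(3) roughly map onto MPP's decoder API. The following is an untested sketch based on MPP's `mpi_dec_test` sample, not a working pipeline: it decodes one captured JPEG and exports the decoded frame's dma-buf fd for zero-copy hand-off to RGA or a DRM plane. Buffer-group setup, polling, EOS, and error cleanup are all omitted, and the header path varies by install.

```c
#include <stddef.h>
#include <rk_mpi.h>  /* sometimes installed as <rockchip/rk_mpi.h> */

/* Hypothetical helper: decode one MJPEG frame and return its dma-buf fd. */
int decode_one_mjpeg_frame(void *jpeg_data, size_t jpeg_size,
                           int *out_dmabuf_fd)
{
    MppCtx ctx;
    MppApi *mpi;
    MppPacket packet;
    MppFrame frame = NULL;

    if (mpp_create(&ctx, &mpi))
        return -1;
    if (mpp_init(ctx, MPP_CTX_DEC, MPP_VIDEO_CodingMJPEG))
        return -1;

    /* Wrap the captured JPEG bytes in an MppPacket and push it in. */
    mpp_packet_init(&packet, jpeg_data, jpeg_size);
    if (mpi->decode_put_packet(ctx, packet))
        return -1;

    /* Pull the decoded frame; a real pipeline polls in a loop. */
    if (mpi->decode_get_frame(ctx, &frame) || !frame)
        return -1;

    /* The frame's MppBuffer is dma-buf backed: export its fd for
     * zero-copy hand-off to RGA or a DRM framebuffer. */
    *out_dmabuf_fd = mpp_buffer_get_fd(mpp_frame_get_buffer(frame));

    mpp_frame_deinit(&frame);
    mpp_packet_deinit(&packet);
    mpp_destroy(ctx);
    return 0;
}
```

With the fd in hand, re-encoding shouldn't be necessary for display: the decoded frame can go straight onto a DRM plane (or through RGA for format conversion first).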

I'll dig into the stuff you've sent over, very helpful thanks for the pointers!

I've thought about switching to Pi4/5 for this. Based on your experience, would you recommend that platform?


> I've thought about switching to Pi4/5 for this. Based on your experience, would you recommend that platform?

Their kernel fork is well maintained, and if there is a reproducible problem it usually gets fixed quite quickly. Overall I'm pretty happy. KMS/DRM was a bit wonky, as there was a transition phase where they used a hacky mix of KMS and the old proprietary Broadcom APIs (FakeKMS). But those days are over, and so far KMS/DRM works pretty well for what I'm using it for.


Not the same thing, but there is this project that does digital RGB to HDMI using a Pi: https://github.com/hoglet67/RGBtoHDMI. I believe they use custom firmware on the Pi and a CPLD, but you could probably eliminate that doing HDMI to HDMI.


Fascinating, thanks for pointing this project out!


I know there is at least one ffmpeg fork with Rockchip mpp and rga support, although I haven’t tested it myself yet: https://github.com/nyanmisaka/ffmpeg-rockchip

I have tested the mpp SDK a bit and the code is easy to work with, with examples for encode and decode, both sync and async.


They don't have an MJPEG decoder yet, which is a blocker for hardware acceleration, but I'm going to try to patch the library with the author and get it added. Thanks for pointing it out!


You can also run Qt directly on the console framebuffer without Wayland/X (e.g. via Qt's linuxfb or eglfs platform plugins).


I might end up doing that. When I was first digging into it, the Qt documentation seemed confusing. But after sinking 10-20 hours into everything, it's starting to click a lot more.

Thanks for the pointer!



