
SDR & HDR 1080p/4k capable setup with Hyperion-NG for Media Center

Discussion in 'Hyperion Setup Showcase' started by Awawa, 27 July 2020.

  1. Awawa

    Awawa Active Member

    Messages:
    227
    Hardware:
    RPi1/Zero, RPi3, +nodeMCU/ESP8266
    I want to share my configuration for HDR & SDR content with Hyperion.NG on a Raspberry Pi 3. I've come a long way from the UTV007 & RPi3 setup I used for SDR years ago. SDR is not a problem, but HDR is another story. This solution applies to 1080p/4K PC movie-player configurations only. Non-standard gaming refresh rates (FreeSync, G-Sync etc.) are not covered: hardware for that doesn't exist yet, apart from the almost-analog option of an external capture camera.

    On the market there are only two budget products that can handle it: the HDFury X4 (1080p only, but that's not so critical for Hyperion usage) and the HDFury Diva. You can save yourself the money spent testing other Chinese matrices/scalers/grabbers that are supposed to do the same: they can't, at least for now. The HDR images they output have bleak colors and are almost useless for our Ambilight experience. The Ezcap I use is no better than them... but it allows one hack that makes HDR work almost perfectly.

    So you can buy one of the HDFury devices or go for a compromise: the results are almost as good, but not quite. You then need either to always output movies in the BT.2020 color space (the solution I chose) or to switch the configuration manually between HDR and SDR content (otherwise you end up with totally over-saturated colors).

    Hardware used:
    RPi3→RPi4, WS2801, Ezcap 269, PC media player (MPC-HC, latest madVR, the oldie-but-goldie myHTPC as frontend) with Flirc for remote control, an Nvidia GPU, and an HDR-capable TV/projector.
    The scheme is in the picture below:

    [​IMG]

    USB 2.0 is fully sufficient for the Ezcap 269 in terms of speed (thanks to MJPEG compression) and power consumption (according to my measurements it draws 0 mA there, so it's probably powered over the HDMI line).

    [​IMG]
    [​IMG]

    USB bus usage in 1080p/30 mode is very low, even for USB 2.0:

    [​IMG]

    Some information about Ezcap 269:

    [​IMG]

    If the Ezcap is connected to USB 2.0 it reports only MJPEG encoding:
    [​IMG]

    But when connected to USB 3.0, YUV encoding is also available:
    [​IMG]


    There are significant differences between this grabber and solutions based on the analog (or HDMI→analog) UTV007: the picture quality is better, and you don't have to crop the frame or manually calibrate colors (apart from the HDR trick).

    Sorry for any spelling mistakes: English is not my primary language.
     
    Last edited: 28 September 2020
  2. Awawa

    Preparation

    I strongly suggest installing Raspbian in read-only mode, for the sake of the SD card. The setup also allows switching back to write mode when needed.

    The solution is described at the link below, but pay attention to the comments: I had some trouble with the /tmp folder, and fixing it is crucial (some folders are hardlinked into /tmp and need to be re-created). Otherwise /tmp remains a read-only folder after each reboot.

    https://medium.com/swlh/make-your-raspberry-pi-file-system-read-only-raspbian-buster-c558694de79

    Always verify the output format on the TV, because you can get an "almost" very nice output for an HDR movie & Hyperion while the content is actually being transcoded to SDR on the fly by software or the graphics card (the least desirable option, because it degrades the TV picture).
     
    Last edited: 27 July 2020
  3. Awawa

    MadVR configuration

    This configuration gives us:
    - automatic hardcoded black-bar removal (not strictly necessary, but then we don't need to do it in Hyperion, and the movie is always stretched to full screen when the MPC-HC player is notified by madVR)
    [​IMG]

    - automatic resolution switching
    [​IMG]

    - and most importantly: SDR output into the BT.2020 color space (this option is only available for Nvidia).
    [​IMG]

    - and, equally important, HDR passthrough
    [​IMG]

    For the Ezcap, set this in the Nvidia driver panel: RGB, 8-bit, full range — and of course enable HDR in Windows.
    [​IMG]

    CoreELEC:
    [​IMG]
     
    Last edited: 4 October 2020
  4. Awawa

    Copy the Hyperion configuration folder from the read-only backup to a read-write temporary folder

    RaspberryPi configuration: copy the Hyperion configuration from the read-only backup folder into the working folder at boot.

    SSH into Raspbian and create a script in the user's home folder:

    hyperion.sh (make it executable, switch write access on first, and run it from /etc/rc.local)
    Code:
#!/bin/bash
# Restore the Hyperion configuration into tmpfs-backed /tmp after boot.
# The marker file "me" shows the copy has already been done this boot.
FILE=/tmp/.hyperion/me
if [ -f "$FILE" ]; then
    echo "config exists"
else
    # /tmp is wiped on reboot, so the target folder must be re-created
    # before copying (cp fails if it is missing).
    mkdir -p /tmp/.hyperion
    /bin/cp -av /home/pi/hyperion/. /tmp/.hyperion/
fi

    First make a symlink from /home/pi/.hyperion/ to /tmp/.hyperion/, and create the marker file /home/pi/hyperion/me (it is copied along with the config, so the script copies only once per boot).
     
    Last edited: 17 September 2020
  5. Awawa

    Fork installation

    The Ezcap 269 works on USB 2.0 with MJPEG encoding.
    The penalty of MJPEG is JPEG artifacts, which can be visible on the LEDs, especially in dark scenes. Even when the movie is stopped but the grabber is still running, there is a little noise. YUY2 is better and free of these artifacts. A higher resolution helps the quality of the MJPEG stream. If you try to use MJPEG with the current Hyperion.NG release, you will be limited by resources to a few FPS and lag may appear. You can reduce the lag (and the quality) with size decimation, or you can try my multithreaded fork:

    https://github.com/awawa-dev/HyperHDR
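    The size decimation mentioned above is just pixel subsampling before processing. A minimal sketch of the idea (hypothetical code, not Hyperion's actual implementation, which works on raw frame buffers):

```python
def decimate(frame, width, height, factor):
    """Naive size decimation: keep every `factor`-th pixel on both axes.

    `frame` is a flat, row-major list of (r, g, b) tuples. Returns the
    reduced frame and its new dimensions. Fewer pixels means less work
    for the LED color averaging, at the cost of detail.
    """
    out = []
    for y in range(0, height, factor):
        for x in range(0, width, factor):
            out.append(frame[y * width + x])
    new_w = (width + factor - 1) // factor
    new_h = (height + factor - 1) // factor
    return out, new_w, new_h
```

    A factor of 2 already cuts the per-frame work roughly by 4x, which is why decimation trades quality for lag.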

    Installation from package

    Before proceeding, make sure that you have a working Hyperion.NG setup, because basic & common configuration isn't part of this topic. Then uninstall it before installing my fork.

    1. Connect the Ezcap 269 to a USB 2.0 (MJPEG) or USB 3.0 (YUY2) port of the RPi3/RPi4.

    2. Install Hyperion from my fork.

    3. Generate a LUT table from the LUT generator page (linked on the grabber configuration page) and upload it to the Hyperion configuration folder.

    Typically /home/pi/.hyperion/.....
    That file (lut_lin_tables.3d) is also needed for YUV grabbers — in that case Hyperion can generate an internal table if it's missing, but without HDR support.

    4. Restart the service or the RPi, then enable HDR tone mapping in the grabber properties. If you are using MJPEG encoding, then for better performance try Border mode (mainly for the LEDs), or full-screen mode to preview the result.

    5. Check the result in the live feed (upper right corner) and the debug log (System->Log). Without those I unfortunately can't help if something goes wrong.
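    To illustrate what the LUT from step 3 does: it is a full 3D table holding one corrected output color per possible input color, so per-pixel correction becomes a single lookup. The sketch below uses a reduced table size; the actual binary layout of lut_lin_tables.3d is an assumption here — only the principle matters:

```python
def lut_index(r, g, b, size=256):
    # Flat index of entry (r, g, b) in a size x size x size table.
    return (r * size + g) * size + b

def build_identity_lut(size=256):
    # Identity LUT: every color maps to itself. A real HDR LUT instead
    # stores the tone-mapped value for each input triplet.
    lut = bytearray(size ** 3 * 3)
    for r in range(size):
        for g in range(size):
            for b in range(size):
                i = lut_index(r, g, b, size) * 3
                lut[i], lut[i + 1], lut[i + 2] = r, g, b
    return lut

def apply_lut(lut, r, g, b, size=256):
    # Correcting a pixel is one table lookup - cheap at frame rate.
    i = lut_index(r, g, b, size) * 3
    return lut[i], lut[i + 1], lut[i + 2]
```

    A real 8-bit table has 256³ entries, which is why the file is generated once offline rather than computed per frame.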


    Installation on SD card


    Rpi 1 / Zero
    SD-card-image-rpi1-armv6.zip

    Rpi 2 / 3 / 4
    SD-card-image-rpi234-armv7.zip

    Just write the img file to an SD card, like HyperBian:
    https://docs.hyperion-project.org/en/user/HyperBian.html

    LUT table file is included!

    The default hostname changed from Hyperbian to hyperhdr, so look for the web panel at:
    http://hyperhdr:8090/
     
    Last edited: 30 September 2020
  6. Awawa

    Results (benchmark)

    Keep in mind that the delay figures below cover only the grabber image processing.
    Image-to-LED mapping, effects, black border detection, portal handling etc. take additional resources (if any are left).

    Live testing: do not measure lag by comparing the TV against the Hyperion web live preview (other factors come into play), but the TV against the LEDs.

    Hyperion.NG, YUV, Rpi 3, single core
    640x480 15fps => delay 25ms, 40% CPU, 15FPS
    1280x720 10fps => delay 70ms, 70% CPU, 10FPS
    1920x1080 2fps => delay 170ms, 40% CPU, 2FPS

    Hyperion.NG, MJPEG, Rpi 3, single core
    640x480 30fps => delay 84ms, 100% CPU, 12FPS
    1024x768 30fps => delay 208ms, 100% CPU, 5FPS
    1920x1080 30fps => delay 542ms, 100% CPU, 2FPS

    HyperHDR, YUV, Rpi 3, multithreaded
    640x480 15fps => delay 8ms, 10/5/0/0% CPUS, 15FPS
    1280x720 10fps => delay 22ms, 20/10/0/0% CPUS, 10FPS
    1920x1080 2fps => delay 57ms, 5/2/0/0% CPUS, 2FPS

    HyperHDR, MJPEG, Rpi 3, multithreaded
    640x480 30fps => delay 10ms, 25/5/0/0% CPUS, 30FPS
    1024x768 30fps => delay 19ms, 50/10/10/0% CPUS, 30FPS
    1920x1080 30fps => delay 50ms, 60/50/30/0% CPUS, 30FPS

    HyperHDR, YUV, Rpi 4, multithreaded
    640x480 => delay 7ms
    1280x720 => delay 11ms
    1920x1080 => delay 30ms
     
    Last edited: 15 September 2020
  7. Awawa

    Results (HDR video)

    Before and after, on some HDR/BT.2020 content that was broken by the Ezcap 269 video grabber:
    [​IMG]
    [​IMG]
    [​IMG]
     
    Last edited: 15 September 2020
  8. Awawa

    Results (live HDR)

    Testing lag:


     
    Last edited: 25 September 2020
  9. Awawa

    Summary of HyperHDR fork changes:
    - Overall performance without tone mapping for USB grabbers improved 10x (MJPEG) and 3x (YUV) over Hyperion 2.0.0.8A, thanks to optimization & multi-threading
    - Direct support for USB grabbers under Windows 10 using Microsoft Media Foundation (really fast & of course multi-threaded)
    - Built for the newer Raspbian Buster — a complete migration from the older Raspbian Stretch
    - Option in hyperion-remote, the JSON API and the web GUI to turn HDR tone mapping on/off
    - MJPEG & YUV HDR LUT tone mapping
    - Hardware brightness & contrast control for USB grabbers (both Windows and Linux)
    - Ready-to-write SD images of HyperHDR
    - New option to choose the video encoding format (for multi-format grabbers, e.g. Ezcap 269, MS2109 clones)
    - Configurable Signal Threshold Counter option for signal detection
    - Luminescence & saturation options for hyperion-remote
    - New advanced LED mean color algorithm in image→LED mapping
    - New weighted advanced LED mean color algorithm in image→LED mapping
    - Improved backlight algorithm to minimize LED flickering in dark scenes (smoothing with continuous output is still recommended)
    - Old-style color calibration added
    - Fix for SK9822 LEDs on SPI (aka fake APA102)
    - Required libglvnd library dependency included in the tar container
    - Improved YUV decoding using LUT tables for speed-up
    - The Windows installer contains a default LUT table
    - The DEB & RPM installers now include the LUT table
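    As an illustration of the "weighted LED mean color" idea above: instead of a plain average over a LED's pixel region, each pixel contributes with a weight. The actual weighting scheme used by HyperHDR is not documented here; this sketch only shows the principle:

```python
def weighted_mean_color(pixels, weights):
    """Weighted mean of a LED's pixel region.

    `pixels` is a list of (r, g, b) tuples; `weights` a matching list
    of floats. With all weights equal this reduces to the plain mean.
    """
    total = sum(weights)
    r = sum(p[0] * w for p, w in zip(pixels, weights)) / total
    g = sum(p[1] * w for p, w in zip(pixels, weights)) / total
    b = sum(p[2] * w for p, w in zip(pixels, weights)) / total
    return round(r), round(g), round(b)
```

    With weights biased toward pixels nearest the screen edge (one plausible choice), the LED tracks the border color more faithfully than a plain mean would.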
     
    Last edited: 31 October 2020
  10. TPmodding

    TPmodding Administrator Staff Member Administrator

    Messages:
    1,890
    Hardware:
    RPi1/Zero, RPi2, RPi3, +Arduino, +nodeMCU/ESP8266
    Hey, awesome!
    Wouldn't it be possible to add this to Hyperion directly, with a toggle in the WebUI?
     
  11. Awawa

    Thanks! I will try, but I'm not promising anything, because that part of the web interface code is a bit of magic to me (as are Qt & C++, which I've forgotten... too many years developing in C# ;) )
     
    Last edited: 31 July 2020
  12. Awawa

    Code:
    https://github.com/awawa-dev/hyperion.ng.git
     
    Last edited: 19 August 2020
  13. Paulchen-Panther

    Paulchen-Panther Moderator Staff Member Developer

    Messages:
    864
    Hardware:
    RPi1/Zero, RPi3, 32/64bit, +Arduino, +nodeMCU/ESP8266
  14. Awawa

  15. Andrew_xXx

    Andrew_xXx Software Developer

    Messages:
    48
    As great as this research is, I would not add it to Hyperion. It's not real HDR-to-SDR tone mapping; it's a needlessly resource-intensive trick to adjust an SDR image that no longer contains any HDR data, and the resulting images are practically the same as those posted in the 4K HDR grabber thread — frankly, the latter are a little better. So there is no real advantage; it's just wasting CPU cycles.

    Calling this option "HDR to SDR tone mapping" is false and misleads users — it's more like an SDR enhancer.

    It works for the fast-hdr project guy because he has direct, true HDR video input with all the HDR data present, which Hyperion doesn't have.

    The thing with HDR is that it's way more complicated than before. HDR streams carry a lot of metadata, and that data needs to be there: at minimum the mandatory Mastering Display Color Volume metadata (RGB primaries, white point, display maximum and minimum light levels) is needed to calibrate the end TV device. There is also MaxFALL (Maximum Frame-Average Light Level) and MaxCLL (Maximum Content Light Level) metadata.

    Every HDR device, like a TV, needs this data to display HDR correctly. Every TV has its own HDR capabilities, calibrated at the factory and hardcoded into it; it then calculates the right values for its display from the Mastering Display metadata, the other parameters in the HDR stream, and its own calibration data.

    It gets even worse: basic HDR10 uses static metadata for the whole stream (the easiest case), while HDR10+ and Dolby Vision use dynamic HDR metadata per scene or even per frame. That metadata lives in SEI headers in the HDR video, and there can be a lot of it; processing it properly is very complicated.

    To sum it up, it's impossible for Hyperion to do the HDR tone mapping correctly. The only way would be a grabber that really outputs the raw HDR metadata, which probably doesn't exist — and even then I'm not sure it would be possible, due to all the parameters I described. It's a lot of work.

    Those parameters are needed by any HDR device, including grabbers, so a working HDR-to-SDR grabber must already do all of this — and it looks like it would need to embed its own virtual SDR display parameters to calculate the proper output.

    So those proper grabbers probably assume low MaxFALL and MaxCLL values, like most non-HDR SDR TVs. But the colors are always estimated: there is no such thing as exact HDR or SDR colors — with HDR they are always interpreted from the HDR values and the display's or grabber's embedded parameters. That also means a single 3D LUT file will not work for every HDR video, since each can have different properties that need to be taken into account when tone mapping.

    So, as hard as it is: we can't win this. There's too much to calculate and process, and the raw HDR data is unavailable and probably always will be. It's not an HDR-to-SDR tone mapper; I would not add this — I see no point. But if you do, don't give it this misleading name.
     
  16. Awawa

    OK, I understand that, and I have repeated many times on this forum that we can't recover HDR from an SDR result because some information is lost. As I review fast-hdr, there is real math behind those results, and they are way better than fiddling with saturation/brightness/contrast (that doesn't work with HDR10+ or Dolby Vision either — I don't know of any Windows player that can handle DV), even if it's a more resource-hungry solution. So there is tone mapping, no doubt — but rather from "HDR" (broken by the grabber/matrix/splitter/scaler) to SDR. I thought that was clear.
    @Andrew_xXx
    You have the alternative in the first post, and I mentioned it: the HDFury Diva or X4, for $$$. Other software alternatives are always workarounds; this one produces very good results in real time on an RPi3 and is the first attempt to implement it in Hyperion. You don't have to enable it — it's only an option that you can use or not. Besides, if you could read the post first before replying, I would appreciate it:
    [​IMG][​IMG]
     
    Last edited: 2 August 2020
  17. Awawa

    Last edited: 8 August 2020
  18. Andrew_xXx

    Sorry I didn't answer — rough week.

    It's just that it is not an HDR-to-SDR tone mapper and never will be, so the function name is misleading. On top of that, with HDR everyone could need different adjustments depending on their setup and devices, so it needs to offer some options.

    I was analysing this and comparing it with your own (did you change your login?) adjusted screenshots from the topic https://hyperion-project.org/threads/4k-hdr-capable-hdmi-splitter-supposedly.631/page-12 and it all looks the same — just look:

    corrected, from the HDR thread: https://postimg.cc/5jgLNDJz
    corrected, this thread, with the 3D LUT: https://postimg.cc/kVpVwf2Y

    As I said, they're very close — I might even prefer the non-3D-LUT one. So what is the point of a 3D LUT if it looks basically the same? It's a very resource-intensive option that even has to lower the FPS to work in real time.
    Based on the screenshots, I don't think there will be much difference in the LED colors — at least not noticeably, if at all.

    So it would be great to research and refine this more before releasing anything, and not call it an HDR-to-SDR tone mapper. A more interesting option would be a universal image adjuster that everyone can calibrate for their own purposes. A quick-switch option should be mandatory, since we can't detect whether the content was HDR, so it needs to be easy to toggle — maybe with presets.

    I can't remember now — does Hyperion allow saving a v4l2 device settings configuration, like these, to fix the image?

    Code:
    /usr/bin/v4l2-ctl --set-ctrl contrast=220
    /usr/bin/v4l2-ctl --set-ctrl saturation=255
    /usr/bin/v4l2-ctl --set-ctrl brightness=100
    /usr/bin/v4l2-ctl --set-ctrl hue=2
    Then we would have a switchable image adjuster that does it all in one place: set the v4l2 controls, and if that is not enough, use the 3D LUT or other software fixes.
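    If Hyperion can't store them, the v4l2-ctl settings above can at least be re-applied automatically at boot. A small sketch of that idea (the device path and control values are just the examples from this post):

```python
import subprocess

# Example controls taken from the commands above; adjust per device.
CONTROLS = {"contrast": 220, "saturation": 255, "brightness": 100, "hue": 2}

def v4l2_commands(device="/dev/video0", controls=CONTROLS):
    # Build the v4l2-ctl invocations without executing them.
    return [
        ["/usr/bin/v4l2-ctl", "-d", device, f"--set-ctrl={name}={value}"]
        for name, value in controls.items()
    ]

def apply_controls(device="/dev/video0"):
    # Run from /etc/rc.local (or a systemd unit) after the grabber enumerates.
    for cmd in v4l2_commands(device):
        subprocess.run(cmd, check=True)
```

    Many grabbers forget their controls on a power cycle, so hooking this into the boot sequence keeps the image fix persistent.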

    Also, regarding the HDR data and the 3D LUT: a 3D LUT is a non-linear color mapping, but we don't have the HDR data, and it is impossible to know from which Rec. 2020 colors the image was transformed into the Rec. 709 space. So we can't do the non-linear mapping — it's pure guesswork — and I'm just confused how a non-linear transformation could help.

    A helper image

    [​IMG]
    All the data between the big triangle and the small triangle is lost without information about what it was. If, say, we had a green-ish 2020 color between the upper triangles' points, and a 709 device transformed it in a broken way, then it could have been any 2020 color between those points — there is no way to find out which.

    All I'm saying is: we can't use a non-linear transformation without knowing how it was done. We could do it if we knew how the wrong colors are produced, but for now I don't see how the output carries the information we need.
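    The information loss in the gamut picture can be shown numerically. Assuming (as a simplification) that a broken device reduces BT.2020 to BT.709 by simply clipping out-of-gamut components, two different wide-gamut colors collapse to the same output, so no LUT can tell them apart afterwards:

```python
def clip_to_709(rgb_linear):
    # Stand-in for a device that converts BT.2020 to BT.709 by clamping
    # each linear component into [0, 1] (an assumed failure mode).
    return tuple(min(max(c, 0.0), 1.0) for c in rgb_linear)

# Two distinct BT.2020-only colors (red component out of 709 range
# after the gamut conversion) end up identical once clipped:
a = clip_to_709((1.4, 0.2, 0.1))
b = clip_to_709((1.1, 0.2, 0.1))
assert a == b  # the original difference is unrecoverable
```

    Any mapping that merges inputs like this has no inverse — which is exactly the argument against reconstructing true HDR colors from a broken SDR feed.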

    I was also thinking: what if every device transforms the colors wrongly in the same way? If we knew that way, and if the output carried some data that helps to revert it, then we would have something to work on.

    If only we knew how the wrong colours are made. I did some tests with the ffdshow and madVR tone mappers but couldn't get definitive results.

    I see you also did some research about this here: https://github.com/hyperion-project/hyperion.ng/pull/928 — I haven't had time to analyse it yet, but it's very interesting.

    And yes, I thought the same way — use a 3D LUT to revert the wrong colors — but there is no data to revert from. That's what I realised, and I felt dumb :) The information was never there, at least to my current state of knowledge.

    On the other hand, even if we had the raw HDR data, it would be too complicated for an RPi to transform: it's not only the 3D LUT, it's all the format data — Dolby Vision, HDR10, HDR10+, the SMPTE/PQ transfer functions, light levels, per-scene HDR changes. It's a lot to handle, so it is much better to have a dedicated device do it for us. I read some research from the creators of HDFury describing all the issues I mentioned in my previous post: there is no guarantee that the same HDR data looks the same on different devices, it needs adjusting, and they really do keep virtual display parameters in the grabber to calculate the SDR — but it is always an approximation.

    This is probably why only a couple of (expensive) devices work, and I'm not even sure they support the full Dolby Vision specification.

    So I think we need to research this better and not rush it — but there is light at the end of the tunnel for sure.
     
  19. Awawa

    Added support for multi-threading for MJPEG encoding (used by the Ezcap 269, among others).
    Before, on an RPi 3+, I got 7–8 FPS, a big delay of around 140 ms, and the process used only one core, reaching almost 100%.
    That caused an unpleasant feeling when watching an action movie, with the LEDs lagging behind the action on the TV.

    With multi-threading I've reached almost 50 FPS and an average delay around 30 ms at 800x600, applying the LUT without any decimation.
    The load is balanced across all 4 cores as needed.

    The patch for HDR/LUT correction is also already included.

    I won't push this to the mainstream this time, because the multithreading is well tested only on the RPi3+ (and somewhat on x86), and it brings too much upheaval for the current Hyperion alpha stage.
    There are also some real stability and performance issues with video frame decoding in the current Hyperion version that the LUT operation would only make worse. I fixed the issues I found in my fork, for testing.
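    The structure of the multi-threaded pipeline can be sketched like this (Python threads stand in for the fork's C++ worker threads; only the fan-out/fan-in shape is the point, not the performance):

```python
from concurrent.futures import ThreadPoolExecutor

def apply_lut_parallel(frame, lut, workers=4):
    """Split the frame into row bands, correct each band on its own
    worker, then stitch the results back together in order.

    `frame` is a flat pixel list; `lut` is any per-pixel correction
    function (a real implementation would index a 3D LUT instead).
    """
    n = max(1, len(frame) // workers)
    chunks = [frame[i:i + n] for i in range(0, len(frame), n)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(lambda chunk: [lut(p) for p in chunk], chunks)
    out = []
    for part in parts:  # map() preserves chunk order
        out.extend(part)
    return out
```

    Because each band is independent, the work spreads across however many cores are available — the same reason the fork's load balances over all 4 RPi cores.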

    Download sources & binaries from:
    https://github.com/awawa-dev/hyperion.ng


    Yes — and so we've been waiting since at least 2017 for some solution to HDR's bleak & washed-out colors.
    And yes, there is a light at the end: Philips' Ambilight patents are expiring... If any new and better transformation shows up, it can be used with the current LUT feature.
    It's only a matter of creating a generator for a new table — the table isn't built in and can be changed by the user at any time.
     
    Last edited: 19 August 2020
  20. Andrew_xXx

    I reassessed this again, and there is no data for a proper LUT — at least not one that brings back the true (or close to true) HDR-to-SDR colors. A LUT is for color mapping, and that mapping is non-linear; the basic methods with v4l2-ctl work equally well, and the LUT correction is not noticeable on the LEDs either, so I see no point in such changes and the performance cost.
    The best thing we could have is the configurable image adjuster I mentioned, with built-in v4l2-ctl options — saturation, brightness, contrast and maybe more. That's enough.

    The Philips Ambilight patents are partially gone, so I guess no one is interested. TV producers could also add an HDMI output with a decimated (or full) image, but they don't bother either.