ALSA style routing for video

A project log for Silly software wishlist

Motivation to do some software projects by writing them down.

lion mclionhead 04/17/2020 at 18:42 • 1 Comment

This will never happen, but it was an interesting idea.  Linux famously had a way to route audio using the .asoundrc file & the ALSA driver.  You could route audio to a loopback device & record it.  You could mix channels in different ways, route audio to a network, change sample formats & maybe sample rates.  The conversions were done in the kernel.
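As a concrete reminder of how it worked, here's a minimal .asoundrc sketch in the classic style: it duplicates whatever is played to the default device onto both the sound card & the snd-aloop loopback, so another program can record the output.  The card names (hw:0, hw:Loopback) & channel counts are assumptions that have to match the local hardware, & as the next paragraph notes, glitch-free results were never guaranteed.

# ~/.asoundrc sketch: send default playback to the sound card & a loopback.
# Assumes the snd-aloop module is loaded (modprobe snd-aloop) & the real
# card is card 0.  Adjust the hw: names for the machine at hand.
pcm.!default {
    type plug
    slave.pcm "both"
}

pcm.both {
    type route
    slave.pcm {
        type multi
        slaves.a.pcm "hw:0,0"              # real sound card
        slaves.a.channels 2
        slaves.b.pcm "hw:Loopback,0,0"     # loopback playback half
        slaves.b.channels 2
        bindings.0.slave a
        bindings.0.channel 0
        bindings.1.slave a
        bindings.1.channel 1
        bindings.2.slave b
        bindings.2.channel 0
        bindings.3.slave b
        bindings.3.channel 1
    }
    # Copy left & right to both slaves.
    ttable.0.0 1
    ttable.1.1 1
    ttable.0.2 1
    ttable.1.3 1
}

Anything played to the default device can then be captured from the loopback's other half, e.g. arecord -D hw:Loopback,1,0 -f cd capture.wav.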

1 problem was that the routing was highly dependent on sample rate & buffer size.  They didn't always have to match, but most combinations didn't work.  The loopback device absolutely never worked for lions.  The buffers always glitched.

Eventually, ALSA's kernel-based routing framework was abandoned as the user space JACK framework appeared.  The fact that JACK got right what ALSA got wrong appears to be a human issue more than a physics issue.  Thus, the kernel now has tons of dead code which is replicated in user space.

No routing framework was ever implemented for video.  Video stopped at video4linux 2.  It supported recording & playback on dedicated video hardware.  You'd actually have a video decoding board output to a separate monitor, or to another capture board in the same computer which could overlay it on the PC's display.

Video4linux was originally heading towards a routing framework, the same as the audio drivers.  A routing framework for video would have required the output of any hardware decoder to end up in a frame buffer.  It would have had ways to loop back video sent to a playback device & mix it in different ways.  Perhaps it would have amounted to picture in picture, layer modes, chroma keying, garbage mattes, & standard effects, all enabled with a .videorc file & performed in the kernel.  Maybe it could have gone as far as time shifting in kernel space.  It might have been useful for video conferencing.
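Nothing like this was ever built, so the following is pure fiction: a .videorc imagined in the same style as .asoundrc, just to make the wish concrete.  Every plugin type, keyword, & device name here is made up.

# Entirely hypothetical ~/.videorc, imagining ALSA-style routing for video.
# None of these plugin types or device nodes exist anywhere.
video.!default {
    type overlay
    slave.a.device "/dev/video_decoder0"   # decoded program, full frame
    slave.b.device "/dev/video0"           # camera, keyed on top
    layer.b.mode chroma_key
    layer.b.key 0x00ff00                   # key out green
    layer.b.rect 640x360+1240+680          # picture in picture placement
}

The point is less the syntax than the idea: the compositing (layers, keys, mattes) would be declared once in a config file & performed in the kernel, the same way .asoundrc declared audio conversions.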

1 problem is that while video playback originally required dedicated hardware to get good results, it eventually took many paths, while audio playback continued to require hardware.  Today, most video playback is purely in software & the highest quality video requires GPU hardware.  The video4linux 2 driver could have been expanded to accept H.265 frames & abstract all the steps involving the GPU, giving the user decompressed RGB frames.  The routing framework could have just worked with the RGB frames.  Instead, we got some routines in the X server for handling low level macroblocks & motion vectors.  Of course, getting RGB frames out of the hardware could have been hard.


Multimedia on the desktop stopped progressing around 2010.  Since then, all progress has been on phones.  Android on the lion kingdom's phone no longer uses ALSA for audio or video4linux 2 for video.  Those device nodes are gone, replaced by vendor-specific drivers.

Today, both the kernel & JACK frameworks have been replaced yet again by user space APIs in Java, Kotlin, or this week's publicity-generating language.  The modern audio APIs expose none of the routing frameworks that existed before, just the basic volume levels, input source, & output destination of the Windows 3.1 control panel.  There is still no such framework for video in the modern user space.

Discussions

Ken Yap wrote 04/18/2020 at 01:57

>the kernel now has tons of dead code which is replicated in user space

With kernel modules they only exist in the source archives and are not in memory on a running machine.
