Lesson 2: Decoding

Download lesson [here]

Let’s consider the following filtergraph:

Streaming part               | Decoding part
                             |
(LiveThread:livethread) -->> (AVThread:avthread) --> {InfoFrameFilter:info_filter}

Like in the previous lessons, we are reading frames from an IP camera. Instead of churning them through a series of filters, we pass them to another, independently running thread that performs decoding (AVThread).

Let’s list all the symbols used until now and the corresponding objects:

Symbol   Base class    Explanation
------   -----------   -------------------------------
()       Thread        An independently running thread
>>                     Crossover between two threads
{}       FrameFilter   A framefilter

That’s all you need to create complex filtergraphs with Valkka.
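To make the notation concrete, here is a toy Python snippet (an illustration only, not part of the Valkka API) that maps the symbols above to the kinds of objects they denote:

```python
import re

def parse_filtergraph(graph: str):
    """Toy parser for the filtergraph notation used in these lessons:
    (...) marks a Thread, {...} marks a FrameFilter, and ">>" marks a
    crossover between two threads."""
    nodes = []
    # match "(ClassName:instance_name)" or "{ClassName:instance_name}"
    for kind, name in re.findall(r"([({])\s*\w+:(\w+)\s*[)}]", graph):
        nodes.append(("Thread" if kind == "(" else "FrameFilter", name))
    crossovers = graph.count(">>")
    return nodes, crossovers

graph = ("(LiveThread:livethread) -->> (AVThread:avthread) "
         "--> {InfoFrameFilter:info_filter}")
print(parse_filtergraph(graph))
# → ([('Thread', 'livethread'), ('Thread', 'avthread'),
#     ('FrameFilter', 'info_filter')], 1)
```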

We start as usual, by constructing the filterchain from end-to-beginning:

# decoding part
info_filter = InfoFrameFilter("info_filter")
avthread    = AVThread("avthread", info_filter)

AVThread is a thread that does the heavy lifting of decoding H264 into bitmap images. It decodes on the CPU. If you want to use a hardware accelerator instead, you can substitute AVThread with VAAPIThread.

Next, we need a framefilter to feed the frames into AVThread. This framefilter is requested from the AVThread itself:

# streaming part
av_in_filter = avthread.getFrameFilter()
livethread   = LiveThread("livethread")

Finally, proceed as before: pass av_in_filter as a parameter to the connection context, start threads, etc.

ctx = LiveConnectionContext(LiveConnectionType_rtsp, "rtsp://admin:nordic12345@192.168.1.41", 1, av_in_filter)

Start the threads. This should be done in end-to-beginning order (the same order in which we constructed the filterchain).

avthread.startCall()
livethread.startCall()

# start decoding
avthread.decodingOnCall()

livethread.registerStreamCall(ctx)
livethread.playStreamCall(ctx)
time.sleep(5)

# stop decoding
# avthread.decodingOffCall()

Stop threads. Stop threads in beginning-to-end order (i.e., following the filtergraph from left to right).

livethread.stopCall()
avthread.stopCall()

print("bye")

You will see output like this:

InfoFrameFilter: info_filter start dump>>
InfoFrameFilter: FRAME   : <AVBitmapFrame: timestamp=1525870759898 subsession_index=0 slot=1 / h=1080; w=1920; l=(1920,960,960); f=12>
InfoFrameFilter: PAYLOAD : [47 47 47 47 47 47 47 47 47 47 ]
InfoFrameFilter: timediff: -22
InfoFrameFilter: info_filter <<end dump
InfoFrameFilter: info_filter start dump>>
InfoFrameFilter: FRAME   : <AVBitmapFrame: timestamp=1525870759938 subsession_index=0 slot=1 / h=1080; w=1920; l=(1920,960,960); f=12>
InfoFrameFilter: PAYLOAD : [47 47 47 47 47 47 47 47 47 47 ]
InfoFrameFilter: timediff: -11
InfoFrameFilter: info_filter <<end dump
...
...

So, instead of H264 packets, we now have decoded bitmap frames.
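The h, w and l (linesize) values in the dump are consistent with a planar 4:2:0 YUV layout: one full-resolution luma plane plus two chroma planes subsampled by two in both directions, which gives linesizes (1920, 960, 960) for a 1920x1080 frame. A quick sanity check in plain Python (assuming 4:2:0 subsampling):

```python
def yuv420_plane_sizes(width: int, height: int):
    """Plane sizes in bytes for a planar 4:2:0 YUV frame."""
    y = width * height                 # full-resolution luma plane
    u = (width // 2) * (height // 2)   # chroma subsampled by 2 in each axis
    v = u
    return y, u, v

y, u, v = yuv420_plane_sizes(1920, 1080)
print(y, u, v, y + u + v)  # → 2073600 518400 518400 3110400
```

That is about 3 MB per frame, which explains why decoded bitmap frames are far heavier than the compressed H264 packets of the previous lesson.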

In the next lesson, we’ll dump them on the screen.

That’s all you need to know for now about passing frames between threads with the API.

“Under the hood”, however, things are a bit more complex.

The framefilter requested from AVThread writes into AVThread’s internal FrameFifo. This is a first-in-first-out queue into which a copy of each incoming frame is placed. Frames are copied into pre-reserved frames, taken from a pre-reserved stack. Both the fifo and the stack are thread-safe and mutex-protected. The user can define the size of the stack when instantiating AVThread.
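The scheme can be sketched in plain Python (an illustration of the idea only; Valkka’s actual FrameFifo is a C++ class with a different interface):

```python
import threading
from collections import deque

class MiniFrameFifo:
    """Sketch of a mutex-protected FIFO that copies incoming payloads
    into frames taken from a pre-reserved stack of reusable buffers."""

    def __init__(self, n_frames: int, payload_size: int):
        self.lock = threading.Lock()
        # pre-reserved stack of reusable frame buffers
        self.stack = [bytearray(payload_size) for _ in range(n_frames)]
        self.fifo = deque()

    def write(self, payload: bytes) -> bool:
        with self.lock:
            if not self.stack:               # stack exhausted => overflow
                return False
            frame = self.stack.pop()
            frame[:len(payload)] = payload   # copy, don't keep a reference
            self.fifo.appendleft((frame, len(payload)))
            return True

    def read(self):
        with self.lock:
            if not self.fifo:
                return None
            frame, n = self.fifo.pop()
            payload = bytes(frame[:n])
            self.stack.append(frame)         # recycle the buffer
            return payload
```

Because frames are copied into pre-reserved buffers, no memory is allocated on the hot path, and an exhausted stack signals overflow instead of growing without bound.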

For more details, see the cpp documentation and especially the FrameFifo class.

However, all these gory details are not a concern for the API user at this stage. :)

Note

There are several FrameFifo and Thread classes in Valkka. See the inheritance diagram. Only a small subset of the methods should be called by the API user. These typically end with the word “Call” (and are marked with the “<pyapi>” tag).