FFmpeg
doc/multithreading.txt File Reference

FFmpeg multithreading methods
==============================================

FFmpeg provides two methods for multithreading codecs.

Slice threading decodes multiple parts of a frame at the same time, using
AVCodecContext execute() and execute2().

Frame threading decodes multiple frames at the same time. It works only for
some codecs; in particular, it does not work for codecs whose streams don't
reset across frames.

Restrictions on clients
==============================================

Slice threading -
* The client's draw_horiz_band() must be thread-safe according to the comment
  in avcodec.h.

Frame threading -
* Restrictions with slice threading also apply.
* Custom get_buffer2() and get_format() callbacks must be thread-safe.
* There is one frame of delay added for every thread beyond the first one.
  Clients must be able to handle this; the pkt_dts and pkt_pts fields in
  AVFrame will work as usual.
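The following client-side sketch is not part of the original text; it shows one
way to enable frame threading through the public API, assuming an H.264 decoder
and automatic thread-count selection purely for illustration.

    #include <libavcodec/avcodec.h>

    /* Sketch: open a decoder with frame threading enabled. */
    static AVCodecContext *open_threaded_decoder(void)
    {
        const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
        AVCodecContext *avctx = avcodec_alloc_context3(codec);
        if (!avctx)
            return NULL;

        avctx->thread_count = 0;               /* 0 lets libavcodec pick   */
        avctx->thread_type  = FF_THREAD_FRAME; /* or FF_THREAD_SLICE       */

        /* Any custom get_buffer2()/get_format() callbacks installed here
         * must be thread-safe, as required above. */

        if (avcodec_open2(avctx, codec, NULL) < 0) {
            avcodec_free_context(&avctx);
            return NULL;
        }
        /* Expect roughly one extra frame of output delay per thread beyond
         * the first; timestamps on decoded AVFrames behave as usual. */
        return avctx;
    }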

Restrictions on codec implementations
==============================================

Slice threading -
 None except that there must be something worth executing in parallel.

Frame threading -
* Codecs can only accept entire pictures per packet.
* Codecs similar to ffv1, whose streams don't reset across frames, will not
  work because their bitstreams cannot be decoded in parallel.
* The contents of buffers must not be read before ff_thread_await_progress()
  has been called on them. reget_buffer() and buffer age optimizations no
  longer work.
* The contents of buffers must not be written to after
  ff_thread_report_progress() has been called on them. This includes
  draw_edges().
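As a hedged illustration of the two buffer rules above (not code from the
FFmpeg tree): ThreadFrame and the ff_thread_*_progress() declarations come from
libavcodec's internal threadframe.h and have varied between versions, and the
context struct and row arithmetic are invented for the example.

    #include "threadframe.h"   /* internal header: ThreadFrame,
                                * ff_thread_report/await_progress()          */

    typedef struct ExampleDecContext {
        int max_mv_extent;     /* hypothetical: furthest row an MV can reach */
    } ExampleDecContext;

    /* Reader side: block until the owning thread has decoded far enough
     * into the reference frame before reading its pixels or MVs. */
    static void wait_for_reference(ExampleDecContext *s, ThreadFrame *ref,
                                   int mb_row)
    {
        ff_thread_await_progress(ref, mb_row * 16 + s->max_mv_extent, 0);
        /* ref->f->data[] may now be read up to that row.  The writer side,
         * conversely, must not touch rows it has already reported. */
    }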

Porting codecs to frame threading
==============================================

Find all context variables that are needed by the next frame. Move all code
changing them, as well as code calling get_buffer(), up to before the decode
process starts. Call ff_thread_finish_setup() afterwards. If some code can't
be moved, have update_thread_context() run it in the next thread.

Add AV_CODEC_CAP_FRAME_THREADS to the codec capabilities. There will be very
little speed gain at this point but it should work.
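A hedged skeleton of that reordering in a decode callback follows; the callback
signature matches recent FFmpeg trees, ff_thread_finish_setup() is declared in
the internal thread.h, and the context struct and the parse_picture_header()
and decode_picture() helpers are hypothetical.

    #include "avcodec.h"
    #include "thread.h"        /* internal: ff_thread_finish_setup() */

    typedef struct ExampleDecContext {
        int last_frame_state;  /* hypothetical inter-frame state */
    } ExampleDecContext;

    static int exdec_decode(AVCodecContext *avctx, AVFrame *frame,
                            int *got_frame, AVPacket *pkt)
    {
        ExampleDecContext *s = avctx->priv_data;
        int ret;

        /* Everything the next frame's thread depends on (header parsing,
         * buffer allocation, shared state updates) runs first. */
        if ((ret = parse_picture_header(s, pkt)) < 0)   /* hypothetical */
            return ret;

        /* From this point on, the next decoding thread may start. */
        ff_thread_finish_setup(avctx);

        /* The expensive per-picture work happens after setup is finished. */
        if ((ret = decode_picture(s, frame)) < 0)       /* hypothetical */
            return ret;

        *got_frame = 1;
        return pkt->size;
    }

    /* State that could not be moved above ff_thread_finish_setup() is
     * copied into the next thread's context here instead. */
    static int exdec_update_thread_context(AVCodecContext *dst,
                                           const AVCodecContext *src)
    {
        ExampleDecContext *d = dst->priv_data;
        const ExampleDecContext *s = src->priv_data;
        d->last_frame_state = s->last_frame_state;
        return 0;
    }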

If there are inter-frame dependencies, so the codec calls
ff_thread_report/await_progress(), set FF_CODEC_CAP_ALLOCATE_PROGRESS in
FFCodec.caps_internal and use ff_thread_get_buffer() to allocate frames.
Otherwise decode directly into the user-supplied frames.

Call ff_thread_report_progress() after some part of the current picture has
decoded. A good place to put this is where draw_horiz_band() is called - add
this if it isn't called anywhere, as it's useful too and the implementation is
trivial when you're doing this. Note that draw_edges() needs to be called
before reporting progress.

Before accessing a reference frame or its MVs, call ff_thread_await_progress().
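Finally, a hedged sketch of per-band progress reporting and the matching codec
declaration; FFCodec, codec_internal.h and FF_CODEC_CAP_ALLOCATE_PROGRESS are
internal and version-dependent, draw_band_edges() is a hypothetical stand-in
for the codec's edge extension, and exdec_update_thread_context() refers to the
previous sketch.

    #include "codec_internal.h"  /* internal: FFCodec, FF_CODEC_CAP_*       */
    #include "threadframe.h"     /* internal: ff_thread_report_progress()   */

    /* Report progress once per decoded band.  Edge extension must happen
     * before the report; this is also the natural point to drive the
     * draw_horiz_band() machinery. */
    static void finish_band(ThreadFrame *cur, int band_start, int band_height)
    {
        draw_band_edges(cur->f, band_start, band_height);  /* hypothetical */
        ff_thread_report_progress(cur, band_start + band_height, 0);
    }

    const FFCodec ff_exdec_decoder = {
        .p.name                = "exdec",                   /* hypothetical */
        .p.type                = AVMEDIA_TYPE_VIDEO,
        .p.capabilities        = AV_CODEC_CAP_FRAME_THREADS,
        .caps_internal         = FF_CODEC_CAP_ALLOCATE_PROGRESS,
        .update_thread_context = exdec_update_thread_context,
        /* codec id, init/close and the decode callback are omitted here */
    };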
