Compile FFmpeg for Android

Generating libraries from the FFmpeg source code is not an easy task, as I learned when I faced this challenge myself. Fortunately, I found a handy tutorial. In short, compiling FFmpeg for Android takes two steps:

1. Compile x264 using this tutorial.

2. Link x264 to FFmpeg and compile FFmpeg using this tutorial.

After these steps you will have a result like this:

What is H.264 and why do we need it?

H.264 is a video compression standard. It is the most popular video codec used on Android. A raw video is a sequence of frames, and one frame is a matrix that contains information about each pixel. Each pixel takes 3 bytes, because RGB has 3 color channels of 1 byte each. So the size of a single 1080p frame is 1920 x 1080 x 3 = ~5.9 megabytes. Moreover, if we want to show 24 frames per second (FPS): 5.9 MB * 24 = ~141.6 MB for one second of video. An encoded frame can take a few kilobytes instead of 5.9 megabytes, so we need to compress the raw video to decrease its size. x264 is a program that can do this according to the H.264 standard. The result of video encoding is a file with the *.h264 extension.

Application Binary Interfaces(ABIs)

As you can see, the ffmpeg-3.3.2 folder now contains an android folder, which in turn contains folders for different ABIs. An ABI defines the CPU instruction set(s) that the machine code should use. FFmpeg supports only ARM CPU architectures here, but that is not a big problem because the majority of smartphones use ARM. Each ABI folder contains two folders of interest to us: include and lib. include contains the header files, and lib contains the shared libraries.

C/C++ libraries

There are two types of libraries: shared and static.

Shared libraries have the *.so extension. The code stays in the library: at run time, a program using a shared library refers to the code it needs in the library instead of carrying its own copy.

Static libraries have the *.a extension. All code is linked directly into the program at compile time: a program using a static library copies the code it needs from the library and makes it part of the program.
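In CMake terms, the two kinds of library are built like this (foo.c is a placeholder source file, just to illustrate the distinction):

```cmake
# Build the same source once as a shared and once as a static library.
add_library(foo_shared SHARED foo.c)   # produces libfoo_shared.so
add_library(foo_static STATIC foo.c)   # produces libfoo_static.a
```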

Integration of pre-built C/C++ libraries to an Android project

To include FFmpeg’s headers and *.so files in the project I use the following folder structure:

However, you can organize it in another way. This project does remuxing, so I added only the libraries that are needed.

To link the *.so files to our project we have to configure Gradle as follows:
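The exact Gradle setup depends on your project; a minimal sketch for a CMake-based module (the path is illustrative) looks like this:

```groovy
android {
    externalNativeBuild {
        cmake {
            // Points Gradle at the CMake build script of the module.
            path "CMakeLists.txt"
        }
    }
}
```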

Also, we have to add the following lines to CMakeLists.txt:

I use the set function to define a new variable named ffmpeg_DIR with the value ${CMAKE_SOURCE_DIR}/src/main/cpp/ffmpeg. CMAKE_SOURCE_DIR is a CMake variable that contains the path to the CMakeLists.txt file.
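The corresponding line in CMakeLists.txt, reconstructed from the name and value just described:

```cmake
# Root folder of the pre-built FFmpeg headers and libraries.
set(ffmpeg_DIR ${CMAKE_SOURCE_DIR}/src/main/cpp/ffmpeg)
```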

ANDROID_ABI is another CMake variable; it contains the name of the ABI. It changes at build time and can take the following values: armeabi, armeabi-v7a, arm64-v8a, x86, x86_64, mips, mips64. We can restrict the build to certain ABIs using Gradle:

ndk {
    abiFilters "armeabi", "armeabi-v7a", "arm64-v8a"
}

So now the build process runs only for these ABIs.

The add_library function adds a library to our project. With the SHARED keyword we specify that we use a *.so file, and with the IMPORTED keyword we indicate that the library file is located outside the project. Then we use set_target_properties to define the path to the library file.
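A sketch of these two calls for one of the libraries (avformat-57 is named later in the article; the exact path depends on your folder layout, so treat it as illustrative):

```cmake
# Declare the pre-built library; the actual .so lives outside the build tree.
add_library(avformat-57 SHARED IMPORTED)
set_target_properties(avformat-57 PROPERTIES
        IMPORTED_LOCATION ${ffmpeg_DIR}/lib/${ANDROID_ABI}/lib/libavformat-57.so)
```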

To add FFmpeg’s headers we can use this function:

include_directories(${ffmpeg_DIR}/lib/${ANDROID_ABI}/include)

After this we can link the pre-built libraries avutil-55, avformat-57, and avcodec-57 to our library:
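The linking step, assuming the wrapper target is named vpl (after the vpl.cpp wrapper mentioned below) and the three imported targets were declared as above:

```cmake
# Link the imported FFmpeg libraries into our own wrapper library.
target_link_libraries(vpl avutil-55 avformat-57 avcodec-57)
```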

Creating a C/C++ wrapper to make the pre-built libraries easier to use

So now we are ready to use the FFmpeg libraries in our project. The structure of the cpp subfolders looks as follows:

We have already linked the FFmpeg libraries to our vpl.cpp wrapper using the target_link_libraries function in CMakeLists.txt.

To use functions from the libraries we have to include the headers in our *.cpp files:

Please pay attention to the fact that we have to use an extern "C" {} block: FFmpeg is written in C, and without it the C++ compiler would apply name mangling and the linker would not find the library functions.
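A typical include block in the wrapper looks like this (the header paths are the standard FFmpeg ones):

```cpp
// FFmpeg headers are C, so wrap them for a C++ translation unit.
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libavutil/avutil.h>
}
```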

Now we can use these samples, or explore the ffmpeg.c file (the source code of FFmpeg's command-line tool), to create new functions:

In this way I created several functions that can do demuxing, rotation of the display matrix, trimming of video, and merging of audio and video streams from different files. Here is a sample function that returns the duration:
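A sketch of such a duration function (the JNI class path and method name are illustrative, not the library's actual API; the FFmpeg calls are the standard libavformat ones for the 3.3 era):

```cpp
#include <jni.h>

extern "C" {
#include <libavformat/avformat.h>
}

// Returns the duration of the media file in seconds, or -1 on error.
extern "C" JNIEXPORT jlong JNICALL
Java_com_example_vpl_VideoProcessing_getDuration(JNIEnv *env, jclass, jstring jpath) {
    const char *path = env->GetStringUTFChars(jpath, nullptr);
    AVFormatContext *fmt_ctx = nullptr;
    jlong result = -1;

    av_register_all();  // required before opening input in FFmpeg 3.x
    if (avformat_open_input(&fmt_ctx, path, nullptr, nullptr) == 0) {
        if (avformat_find_stream_info(fmt_ctx, nullptr) >= 0) {
            // AVFormatContext::duration is in AV_TIME_BASE units.
            result = fmt_ctx->duration / AV_TIME_BASE;
        }
        avformat_close_input(&fmt_ctx);
    }
    env->ReleaseStringUTFChars(jpath, path);
    return result;
}
```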

I created a small library with these functions, so, if you need it, you can add it to your project.

You can use it as follows:
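A hedged sketch of the Java side (the class, method, and library names are illustrative and must match your own JNI declarations):

```java
public class VideoProcessing {
    static {
        System.loadLibrary("vpl");  // loads libvpl.so built from the C++ wrapper
    }

    // Implemented in the C++ wrapper via JNI.
    public static native long getDuration(String path);
}

// Usage:
// long seconds = VideoProcessing.getDuration("/sdcard/video.mp4");
```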

I am going to extend this library and will be glad to merge your pull requests.

Conclusions

FFmpeg is a handy library for video and audio processing, and a lot of open-source projects use it. You can check out my sample to see that FFmpeg is a fast and powerful tool. FFmpeg is a low-level library, though, so you need basic knowledge of video and audio processing. Keep in mind that transcoding takes a long time, so avoid it when it is unnecessary.