Android MediaCodec “Decoded”

Posted: July 28, 2012 in Android, Uncategorized
Android has a great media library that enables all sorts of things. Until recently, though, there was no way to encode or decode audio and video directly, which limited what developers could build. Fortunately, the Jelly Bean release introduced the android.media.MediaCodec API.
The API follows the same principles and architecture as OpenMAX, a well-known standard in the media industry.
Transitioning from the high-level MediaPlayer down to the encoder/decoder level can be a big pain, though. There is a lot more to be aware of when you are manipulating the tiny little bits that make up great media 🙂
In this post I will describe how to use the API, highlighting the essential things to be aware of.
1. Get To Know Your Media
Another new class introduced in Jelly Bean is android.media.MediaExtractor. Its purpose is pretty clear from the name: it extracts metadata from your media, and a lot more.
AssetFileDescriptor sampleFD = getResources().openRawResourceFd(R.raw.sample);

MediaExtractor extractor;
MediaCodec codec;
ByteBuffer[] codecInputBuffers;
ByteBuffer[] codecOutputBuffers;

extractor = new MediaExtractor();
extractor.setDataSource(sampleFD.getFileDescriptor(), sampleFD.getStartOffset(), sampleFD.getLength());

Log.d(LOG_TAG, String.format("TRACKS #: %d", extractor.getTrackCount()));
MediaFormat format = extractor.getTrackFormat(0);
String mime = format.getString(MediaFormat.KEY_MIME);
Log.d(LOG_TAG, String.format("MIME TYPE: %s", mime));
2. Create your Decoder
A decoder is generally seen as a NODE with INPUT and OUTPUT buffers. You take an input buffer from it, fill it and give it back to the decoder for decoding to take place. On the other side of the NODE, you take an output buffer and “render” it. This example will play an audio sample file using the android.media.AudioTrack API.
codec = MediaCodec.createDecoderByType(mime);
codec.configure(format, null /* surface */, null /* crypto */, 0 /* flags */);
codec.start();
codecInputBuffers = codec.getInputBuffers();
codecOutputBuffers = codec.getOutputBuffers();

extractor.selectTrack(0); // <= You must select a track. You will read samples from the media from this track!
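Since the decoded PCM will be handed to an AudioTrack, it needs to be created before the buffer loop starts. The post doesn't show this step, so here is a hedged sketch: it assumes the decoder outputs 16-bit PCM and pulls the sample rate and channel count from the MediaFormat extracted above.

```java
// Hypothetical AudioTrack setup (not from the original post): assumes 16-bit
// PCM output. Sample rate and channel count come from the extracted format.
int sampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);
int channelConfig = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT) == 1
        ? AudioFormat.CHANNEL_OUT_MONO
        : AudioFormat.CHANNEL_OUT_STEREO;

// Let the platform tell us the smallest workable buffer for this config.
int minBufferSize = AudioTrack.getMinBufferSize(sampleRate, channelConfig,
        AudioFormat.ENCODING_PCM_16BIT);

AudioTrack audioTrack = new AudioTrack(
        AudioManager.STREAM_MUSIC,
        sampleRate,
        channelConfig,
        AudioFormat.ENCODING_PCM_16BIT,
        minBufferSize,
        AudioTrack.MODE_STREAM);
audioTrack.play(); // start consuming as soon as we write decoded chunks
```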
3. It's All About Buffers
Let the buffer party begin 🙂 See below how the INPUT side of the decoder is managed:
int inputBufIndex = codec.dequeueInputBuffer(TIMEOUT_US);
if (inputBufIndex >= 0) {
    ByteBuffer dstBuf = codecInputBuffers[inputBufIndex];

    int sampleSize = extractor.readSampleData(dstBuf, 0);
    long presentationTimeUs = 0;
    if (sampleSize < 0) {
        sawInputEOS = true;
        sampleSize = 0;
    } else {
        presentationTimeUs = extractor.getSampleTime();
    }

    codec.queueInputBuffer(inputBufIndex,
                           0, //offset
                           sampleSize,
                           presentationTimeUs,
                           sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
    if (!sawInputEOS) {
        extractor.advance();
    }
 }
And now how to pull OUTPUT buffers with the decoded media from the decoder:
final int res = codec.dequeueOutputBuffer(info, TIMEOUT_US);
if (res >= 0) {
 int outputBufIndex = res;
 ByteBuffer buf = codecOutputBuffers[outputBufIndex];

 final byte[] chunk = new byte[info.size];
 buf.get(chunk); // Read the buffer all at once
 buf.clear(); // MUST DO! Otherwise the buffer will still hold stale data the next time it is dequeued

 if (chunk.length > 0) {
 audioTrack.write(chunk, 0, chunk.length);
 }
 codec.releaseOutputBuffer(outputBufIndex, false /* render */);

 if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
 sawOutputEOS = true;
 }
} else if (res == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
 codecOutputBuffers = codec.getOutputBuffers();
} else if (res == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
 final MediaFormat oformat = codec.getOutputFormat();
 Log.d(LOG_TAG, "Output format has changed to " + oformat);
 audioTrack.setPlaybackRate(oformat.getInteger(MediaFormat.KEY_SAMPLE_RATE));
}
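Putting the pieces together: the input and output snippets above run inside one loop until the END_OF_STREAM flag travels from the input side through to the output side. A sketch of that driver loop, using the same variables declared earlier plus the usual cleanup calls (the exact loop structure is my arrangement, not verbatim from the post):

```java
// Drive both sides of the codec until EOS propagates to the output.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean sawInputEOS = false;
boolean sawOutputEOS = false;

while (!sawOutputEOS) {
    // INPUT side: feed the extractor's next sample to the codec.
    if (!sawInputEOS) {
        int inputBufIndex = codec.dequeueInputBuffer(TIMEOUT_US);
        if (inputBufIndex >= 0) {
            ByteBuffer dstBuf = codecInputBuffers[inputBufIndex];
            int sampleSize = extractor.readSampleData(dstBuf, 0);
            long presentationTimeUs = 0;
            if (sampleSize < 0) {
                sawInputEOS = true; // no more samples; flag EOS to the codec
                sampleSize = 0;
            } else {
                presentationTimeUs = extractor.getSampleTime();
            }
            codec.queueInputBuffer(inputBufIndex, 0, sampleSize, presentationTimeUs,
                    sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
            if (!sawInputEOS) {
                extractor.advance();
            }
        }
    }

    // OUTPUT side: drain decoded PCM and hand it to the AudioTrack.
    int res = codec.dequeueOutputBuffer(info, TIMEOUT_US);
    if (res >= 0) {
        ByteBuffer buf = codecOutputBuffers[res];
        byte[] chunk = new byte[info.size];
        buf.get(chunk);
        buf.clear(); // must clear before the buffer is reused
        if (chunk.length > 0) {
            audioTrack.write(chunk, 0, chunk.length);
        }
        codec.releaseOutputBuffer(res, false /* render */);
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            sawOutputEOS = true;
        }
    } else if (res == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        codecOutputBuffers = codec.getOutputBuffers();
    } else if (res == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        MediaFormat oformat = codec.getOutputFormat();
        audioTrack.setPlaybackRate(oformat.getInteger(MediaFormat.KEY_SAMPLE_RATE));
    }
}

// Tear down once EOS has been seen on the output side.
codec.stop();
codec.release();
extractor.release();
```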
And that’s it. This is the simplest usage of this very powerful API. For further questions send me a note and I’ll give you more insights…
Comments
  1. vineeth says:

    Hi marques,
    I want to test the media codec api on jellybean . Could you share the source file with me to understand it more. Thank you

    • vineeth says:

      Hello Marques ,

      in the tutorial , you specified 3 parts….but how do they fit in the activity life cycle ???

      for part 1 and 2 in your explanation ……
      #################################
      public void onCreate(Bundle savedInstanceState) {
      super.onCreate(savedInstanceState);

      extractor = new MediaExtractor();
      extractor.setDataSource(“/sdcard/clip.mp4”);
      int numTracks = extractor.getTrackCount();

      for (int i = 0; i < numTracks; ++i) {
      MediaFormat extractedMediaFormat = extractor.getTrackFormat(i);
      String mime = extractedMediaFormat.getString(MediaFormat.KEY_MIME);

      codec = MediaCodec.createDecoderByType(mime);
      codec.configure(extractedMediaFormat, surface ??????
      null, 0);
      codec.start();
      codecInputBuffers = codec.getInputBuffers();
      codecOutputBuffers = codec.getOutputBuffers();

      extractor.selectTrack(i);

      extractor.release();
      extractor = null;

      }

      }

      }

      part 3 — How do I handle the buffers
      #############################

      I need to transcode a 1080p video to 480p video …

      • dpsm says:

        Hello Vineeth,

        None of the parts really relate to the Activity life-cycle. Most of the encoding/decoding happens on a separate background thread. What you might want to do in case your application is a player is to notify the background thread that decodes audio/video to stop itself based on a service/activity life cycles.
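        A minimal shape for that wiring (my sketch, not code from the post): run the feed/drain loop on a worker thread and flag it to stop from the Activity's lifecycle callbacks.

```java
// Hypothetical lifecycle wiring: decode on a worker thread and signal it
// to stop from onPause()/onDestroy().
private volatile boolean stopped = false;
private Thread decodeThread;

void startDecoding() {
    decodeThread = new Thread(new Runnable() {
        @Override
        public void run() {
            while (!stopped /* && !sawOutputEOS */) {
                // ... feed input buffers / drain output buffers as in the post ...
            }
            // release codec/extractor here, on the same thread that used them
        }
    }, "decoder");
    decodeThread.start();
}

@Override
protected void onPause() {
    super.onPause();
    stopped = true; // tell the decode loop to wind down
}
```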

        I'll post a sample soon on github and we'll post the url here…

        Cheers,

        David

  2. vineeth says:

    Hi David,
    Thanks a lot . An eagerly waiting to fully harness this capability of android 🙂

  3. Kimi says:

    Hello
    I’m very glad that found you article and it’s gives me a guide about how to use the MediaCodec apis, but these codes posted are most used in the background thread, is there any sample of how to use the apis decoding a mp4 file or avc file. If there has one ,can you please send it to me? thanks.

  4. Hello sir
    please share sample source code for Live audio streaming by using new media API..
    Waiting for reply
    thank you

  5. here is my code please it show error please take a look at this

    public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(“http://radio.fm/test-64.ogg”);
    int numTracks = extractor.getTrackCount();
    for (int i = 0; i = 0)
    {
    int trackIndex = (int) extractor.getSampleTime();
    long presentationTimeUs = extractor.getSampleTime();
    }

    MediaCodec codec = MediaCodec.createDecoderByType(mime);
    codec.configure(format, null /* surface */, null /* crypto */, 0 /* flags */);
    codec.start();
    ByteBuffer[] inputBuffers = codec.getInputBuffers();
    ByteBuffer[] outputBuffers = codec.getOutputBuffers();
    format = codec.getOutputFormat();
    Long timeoutUs=(long) 1;
    for (;;) {
    int inputBufferIndex = codec.dequeueInputBuffer(timeoutUs);
    if (inputBufferIndex >= 0) {
    // fill inputBuffers[inputBufferIndex] with valid data

    codec.queueInputBuffer(inputBufferIndex, 0, 128, 0,0);

    }
    MediaCodec.BufferInfo info = new BufferInfo();
    int outputBufferIndex = codec.dequeueOutputBuffer(info, timeoutUs);
    if (outputBufferIndex >= 0) {
    // outputBuffer is ready to be processed or rendered.

    codec.releaseOutputBuffer(outputBufferIndex, false);
    } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
    outputBuffers = codec.getOutputBuffers();
    } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    // Subsequent data will conform to new format.
    format = codec.getOutputFormat();
    AudioTrack mAudioTrack = null;
    mAudioTrack.setPlaybackRate(format.getInteger(MediaFormat.KEY_SAMPLE_RATE));

    }
    codec.stop();
    codec.release();
    codec = null;

    }

    }
    }

    correct my code please

  6. xena says:

    I have a question how to obtain info which is passed to dequeueOutputBuffer(..,..) method? and how to do the same thing but for video track?

  7. xena says:

    Thank you for your answer. I’ve understood almost everything and successfully executed this example for audio. But I’ve got another issue: if I want to play video track, some samples are too big to be written into dstBuf with standard size 8192. If I cut the sample, then IllegalStateException arises during codec.queueInputBuffer(…). What should I do in this case? Can you show me a successful story of playing a video file?)

    • xena says:

      I’ve already resolved this issue….I have another question about synchronization of audio and video streams. How to achieve this?

      • BitFish says:

        I would love to know the answer to this one as well. If I need to create a file with both an audio and video track, how is this accomplished? Is there a “standard” api to combine audio and video?

      • BitFish says:

        I would love to know the answer to this one as well. If I need to create a file with both an audio and video track, how is this accomplished? Is there a “standard” api to combine audio and video?

      • BitFish says:

        I would love to know this as well. Can I create a file that has both audio and video (both from a custom source?) I would assume this is easy, but I know I shouldn’t. Is there a “standard” api method to combine audio and video after the encode?

      • Jignesh says:

        Hey,
        Have you got your code running successfully for video part?
        i am trying to decode video and facing some problems.
        hope you can help.

      • shem says:

        Hey xena, I’m trying to do so too, what have you done with this?

    • Andy Atom says:

      Hi, could you send me your code? I am trying to run a video stream, but with no success… :-/
      Here is my email: androidx86atom@gmail.com
      Thanks in advance

      Andy

  8. Larry says:

    Hello, Is there a way to split a video file into its sequence video frames and reassembling it internally into another format like say mp4 or 3gp without relying on libraries like ffmpeg and xuggler?

  9. Rahul01483 says:

    vineeth/dpsm,

    Any source code for this project? Thanks in advance.

  10. lydia says:

    final byte[] chunk = new byte[info.size];
    buf.get(chunk); // Read the buffer all at once

    The above code is not working.
    info.size is always 0

  11. Kevin says:

    This tutorial makes a lot of sense, but I’m having some trouble executing it. When I go to execute the line codec.configure(format, null, null, 0), I get an error in logcat that reads “OMX_getextensionindex failed”. I was wondering if you knew why this was happening or how I could go about fixing it. Thanks!

  12. sh124 says:

    Could you please tell me how can I configure the decoder without MediaExtractor? I created a MediaFormat object by the method MediaFormat.createVideoFormat(String mime, int width, int height) and used it to configure the mediaCodec object but it fails.

  13. timmocci says:

    I’ve successfully tested this example, but I failed when applied this to the video track, is there any sample about how to decode a mp4 file?

  14. fadden says:

    Android 4.3 (API 18) added Surface input to MediaCodec, and a new MediaMuxer class that provides a way to convert a raw video stream to .mp4. I put together some samples here: http://bigflake.com/mediacodec/

  15. Ivan Drago says:

    In your example you doesn’t show how you created an instance of AudioTrack.
    When the AudioTrack is created you need to provide the size of internal input buffer.
    What is the best way to find out what would be proper buffer size?
    Currently I do not see it in your code.

    If, suppose, AudioTrack’s buffer is smaller than the chunk you are writing, what’s going to happen?

    • dpsm says:

      Ivan,

      The “ideal” buffer size depends on several things (chunk size, target network speeds, etc). It can't be too small, otherwise you will block often on the AudioTrack writes, which depending on your design might cause under-runs; on the other hand it can't be too big, since it will use more memory than you need and might cause issues on low-end devices.

      Put it this way: “How many seconds of audio do I think I need in the AudioTrack's buffer to avoid silence?” Hopefully you can answer that question and narrow it down to a “good” size.
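      As a concrete starting point (my suggestion, not part of the original reply): AudioTrack.getMinBufferSize() reports the platform minimum for a given configuration, and scaling it by a small factor buys headroom against under-runs without wasting much memory.

```java
// Hypothetical sizing helper: start from the platform minimum and scale it.
// 44100 Hz / stereo / 16-bit PCM are example parameters, not from the post.
int minSize = AudioTrack.getMinBufferSize(44100,
        AudioFormat.CHANNEL_OUT_STEREO,
        AudioFormat.ENCODING_PCM_16BIT);
int bufferSize = minSize * 4; // a few hundred ms of headroom at 44.1 kHz stereo
```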

      David

  16. Syed says:

    Hi,

    I have a code for a custom decoder which is in C. Using Android ndk, i had build a basic player to render the decoded frames. So this means that i have a working code for android decoder.

    In Jelly bean, is it possible to use this code and along with the API you have specified above, so that i need not create a player with play and pause functionality.

    Please Provide some help regarding this.

    • dpsm says:

      Hello @Syed,

      There is no way to add custom decoders to the MediaCodec APIs at this moment. I would recommend using the available decoders on the phone and falling back to yours.

      Regards

      David
