Posts Tagged - aac

GStreamer H264/MP4 decoding: C/C++ basics and encoding/decoding buffer manipulation

Exploring GStreamer and pipelines

Before diving into the code, let's look at what we can do without it. GStreamer ships with useful command-line utilities, in particular:

  • gst-inspect-1.0 lists the available codecs and modules, so you can immediately see what you have to work with and pick a set of filters and codecs.
  • gst-launch-1.0 lets you start any pipeline. GStreamer uses a decoding scheme in which a stream passes through a chain of components, from the source to the sink output. The source can be anything: a file or a device; the output (sink) can likewise be a file, the screen, or a network output/protocol (such as RTP).

A simple example of using gst-launch-1.0 to connect elements and play audio:

gst-launch-1.0 filesrc location=/path/to/file.ogg ! decodebin ! alsasink
How src and sink work

filesrc opens the file, decodebin decodes it, and alsasink outputs the audio.

Another, more complex example is playing an mp4 file:

gst-launch-1.0 filesrc location=file.mp4 ! qtdemux ! h264parse ! avdec_h264 ! videoconvert ! autovideosink

The pipeline takes the mp4 file as input; it goes through the mp4 demuxer (qtdemux), then through the h264 parser, then the decoder, the converter, and finally the output sink.

You can replace autovideosink with a filesink (using its location parameter) and write the decoded stream directly to a file.

Programming an application with the GStreamer C/C++ API. Let's try to decode

Now that we know how to use gst-launch-1.0, let's do the same thing within our application. The principle remains the same: we build a decoding pipeline, but now using the GStreamer library and glib events.

We will consider a live example of H264 decoding.

Initialization of the GStreamer application takes place once with the help of

gst_init (NULL, NULL);
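
For illustration only (this is not the article's full example), here is a minimal sketch that assembles the same mp4/H264 chain as the gst-launch-1.0 example above with gst_parse_launch(), plays it, and waits on the bus for an error or end-of-stream. The file name file.mp4 is just a placeholder.

#include <gst/gst.h>

int main (void)
{
  gst_init (NULL, NULL);

  /* Assemble the same chain as the gst-launch-1.0 example above. */
  GError *error = NULL;
  GstElement *pipeline = gst_parse_launch (
      "filesrc location=file.mp4 ! qtdemux ! h264parse ! "
      "avdec_h264 ! videoconvert ! autovideosink", &error);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    g_clear_error (&error);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Block until the bus reports an error or end-of-stream. */
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      (GstMessageType) (GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
  if (msg != NULL)
    gst_message_unref (msg);

  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}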

Read More

Android NDK AAC decoder ADTS alignment

After long thought, I finally decided to switch my blog articles to English and to keep posting some rare and mostly uninteresting info in free form. Today we will talk about Android, the NDK and some undocumented video/audio functionality; maybe we will discover some new knowledge about AAC, and maybe it will help with your own problem, as it did with mine. The focus of this article is the Android AAC decoder and a little detail of how decoding in Android works behind the NDK documentation.

AMediaCodec usage steps

First, let's take a very, very surface-level look at how to start decoding using the NDK (a minimal sketch follows the list):

  1. Create an AMediaCodec using the codec name.
  2. Configure the AMediaCodec via AMediaCodec_configure.
  3. Start decoding with AMediaCodec_start.
  4. Get an input buffer using AMediaCodec_getInputBuffer.
  5. Return the buffer with AMediaCodec_queueInputBuffer.
  6. Repeat while you have a buffer ;).
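
To make the steps concrete, here is a minimal sketch under a few assumptions: the stream is AAC at 44100 Hz stereo, and feed_aac_frame()/consume_pcm() are hypothetical helpers (not part of the NDK) that supply compressed frames and consume decoded PCM. A real decoder also needs either ADTS-framed input or a csd-0 (AudioSpecificConfig) buffer in the format; error handling is trimmed.

#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>
#include <stdbool.h>
#include <stdint.h>
#include <sys/types.h>

/* Hypothetical helpers, not part of the NDK: supply one compressed AAC
 * frame and consume decoded PCM. */
size_t feed_aac_frame(uint8_t *dst, size_t capacity, bool *input_done);
void consume_pcm(const uint8_t *pcm, int32_t size);

static void decode_aac(void)
{
    /* Steps 1-2: create the decoder by MIME type and configure it. */
    AMediaCodec *codec = AMediaCodec_createDecoderByType("audio/mp4a-latm");
    AMediaFormat *format = AMediaFormat_new();
    AMediaFormat_setString(format, AMEDIAFORMAT_KEY_MIME, "audio/mp4a-latm");
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_SAMPLE_RATE, 44100);
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_CHANNEL_COUNT, 2);
    AMediaCodec_configure(codec, format, NULL /*surface*/, NULL /*crypto*/, 0);

    /* Step 3: start decoding. */
    AMediaCodec_start(codec);

    bool input_done = false;
    while (!input_done) {
        /* Step 4: ask for a free input buffer and fill it. */
        ssize_t in_idx = AMediaCodec_dequeueInputBuffer(codec, 10000 /*us*/);
        if (in_idx >= 0) {
            size_t capacity = 0;
            uint8_t *in_buf = AMediaCodec_getInputBuffer(codec, in_idx, &capacity);
            size_t frame_size = feed_aac_frame(in_buf, capacity, &input_done);
            /* Step 5: hand the filled buffer back to the codec. */
            AMediaCodec_queueInputBuffer(codec, in_idx, 0, frame_size, 0 /*pts*/,
                input_done ? AMEDIACODEC_BUFFER_FLAG_END_OF_STREAM : 0);
        }

        /* Step 6: drain whatever decoded PCM is ready. */
        AMediaCodecBufferInfo info;
        ssize_t out_idx = AMediaCodec_dequeueOutputBuffer(codec, &info, 0);
        if (out_idx >= 0) {
            size_t out_size = 0;
            uint8_t *pcm = AMediaCodec_getOutputBuffer(codec, out_idx, &out_size);
            consume_pcm(pcm + info.offset, info.size);
            AMediaCodec_releaseOutputBuffer(codec, out_idx, false);
        }
    }

    AMediaCodec_stop(codec);
    AMediaCodec_delete(codec);
    AMediaFormat_delete(format);
}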

Looks very simple, and it works well too. I could end the article right here, but then I would have told you nothing about buffer requirements and other details, and in the NDK/SDK everything looks just as simple. So what is going on behind this Android decoding? What if you get an error with your buffer, or in some rare cases there is no sound? To see how the Android decoder works, let's take the AAC audio decoder as an example. Let's start simple.

Android AAC decoder architecture

As you can see in this bad jpeg picture :), Android has different implementations of AAC decoders as OMX components. But that's not all: besides the software implementations, some platforms have hardware implementations, like on Broadcom chips. Keep that in mind; now let's move on to the SoftAAC2 decoder and take a deeper look.

Read More