A Cloud Guru is in the business of education. They provide high-quality, curated courses for IT professionals. Their students come from all over the world and, because of this, use a wide range of connection types, speeds and devices, so the platform needs to be flexible in how it delivers content.

(Want to know more? Check out our book: Serverless Architectures on AWS)

Producing and serving quality video is a must-have. A typical course can have up to 50 different video lectures that span from 2 minutes to 20 minutes. As Sam has written before, the platform is built on a serverless architecture. Naturally, this extends to how it deals with media too.

At the moment A Cloud Guru uses S3 buckets & CloudFront to help serve video & audio content. Videos are placed in a bucket behind CloudFront and authorized users are granted a signed CloudFront URL that is valid for a limited time. Users play videos through the A Cloud Guru website and automatically re-authorize their access for as long as they continue to be members.
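The URL-signing step itself is compact. As a rough sketch (assuming the aws-sdk v2 AWS.CloudFront.Signer helper; the key pair, distribution URL and TTL below are placeholders, and the signer is passed in rather than constructed so the helper can be exercised without real keys), granting time-limited access looks something like this:

```javascript
// Sketch: issue a signed CloudFront URL that expires after ttlSeconds.
// In real code the signer would be new AWS.CloudFront.Signer(keyPairId, privateKey);
// it is injected here so the helper stays easy to test without credentials.
function signVideoUrl(signer, videoUrl, ttlSeconds) {
  var expires = Math.floor(Date.now() / 1000) + ttlSeconds;
  return signer.getSignedUrl({ url: videoUrl, expires: expires });
}
```

The website would call something like this on each page load while the user's membership check passes, which is how access is effectively re-authorized for as long as they remain members.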

When work began on A Cloud Guru, the team had to manually encode videos (using the excellent Handbrake app) from 1080p down to 720p and other encodings & resolutions. Although it was fine initially, it didn’t really fit with the team’s ethos of complete automation. It was slow and, frankly, painful to do.

Recently I got a chance to help overhaul the transcoding process and automate it using Elastic Transcoder, S3 and Lambda. Thanks to AWS, I had an initial implementation done in less than a day. The process not only transcodes videos to 720p, webm and HLSv3, but also generates thumbnails, watermarks our videos and nicely organises everything in S3.

Elastic Transcoder

It is so easy to create an Elastic Transcoder pipeline

Elastic Transcoder serves as the backbone of our implementation. To get it up and running you need to do two things: define a pipeline and create a job. A pipeline essentially defines a queue for future jobs. To create a pipeline you need to specify the input bucket (where the source videos will be), the output bucket and a bucket for thumbnails. You can also, optionally, set SNS topics for various events that’ll happen in the system, such as On Completion and On Error. We recommend setting these and using SNS to fire off notifications, especially for the On Error case.
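If you'd rather script the pipeline than click through the console, the definition looks roughly like this via the JavaScript SDK. This is a sketch only: the bucket names, role ARN and SNS topic ARNs are made-up placeholders, and the Elastic Transcoder client is passed in rather than constructed so the helper can be exercised without AWS credentials.

```javascript
// Sketch: create an Elastic Transcoder pipeline with separate content and
// thumbnail buckets plus SNS notifications. All ARNs and bucket names below
// are illustrative placeholders, not real resources.
function createVideoPipeline(eltr, callback) {
  var params = {
    Name: 'acloud-video-pipeline',
    InputBucket: 'acloud-video-input',
    Role: 'arn:aws:iam::123456789012:role/Elastic_Transcoder_Default_Role',
    Notifications: {
      Progressing: '',
      Completed: 'arn:aws:sns:us-east-1:123456789012:transcode-completed',
      Warning: '',
      Error: 'arn:aws:sns:us-east-1:123456789012:transcode-error'
    },
    ContentConfig: { Bucket: 'acloud-video-output', StorageClass: 'Standard' },
    ThumbnailConfig: { Bucket: 'acloud-video-thumbs', StorageClass: 'ReducedRedundancy' }
  };
  eltr.createPipeline(params, callback);
  return params;
}
```

Note that when you give the pipeline a ContentConfig and ThumbnailConfig you specify the IAM role and per-bucket storage classes explicitly, which maps onto the "optional extras" mentioned below.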

As with anything AWS there are optional extras to be set as well — such as storage classes for the files and extra permissions.

Having created a pipeline, you can immediately create a job and kick off a transcoding task. Creating a job involves specifying the name of the input file, selecting one or more transcoding presets (e.g. Generic 720p, webm, HLS), and optionally setting up a playlist, adding metadata or overriding input parameters such as the frame rate or aspect ratio.

Naturally we played with various jobs to get a good sense of the Elastic Transcoder but our end-goal was total automation.

Enter Lambda

In a serverless approach one way to run custom code is to use AWS Lambda. At the moment, there is no way to automatically run an Elastic Transcoder job by uploading a file to a bucket. However, it is easy to invoke Lambda from an S3 event and from a Lambda function invoke Elastic Transcoder.

This is the whole process: S3 invokes Lambda and Lambda invokes Elastic Transcoder

Invoking a Lambda function from an S3 bucket is straightforward — open the required bucket in S3 and click on Properties. From there click on Events, choose Lambda, and from the drop-down select the Lambda function you want to execute. Finally, select the event (ObjectCreated (All)) that will trigger the process.
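When the trigger fires, Lambda receives an event whose Records array carries the bucket and object details. A trimmed-down example of the payload is below; the field names follow the standard S3 event structure, while the bucket and key values are made up for illustration.

```javascript
// Trimmed example of the S3 event a Lambda function receives on ObjectCreated.
var sampleEvent = {
  Records: [{
    eventName: 'ObjectCreated:Put',
    s3: {
      bucket: { name: 'acloud-video-input' },
      object: { key: 'my+lecture.mp4' } // keys arrive URL-encoded
    }
  }]
};

// The handler reads the bucket and key straight off the first record.
var bucket = sampleEvent.Records[0].s3.bucket.name;
var key = sampleEvent.Records[0].s3.object.key;
```

These two fields are all the function needs to validate the source bucket and build the Elastic Transcoder job.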

Here’s all you need to do to configure a Lambda function invocation

Digging in to Lambda

Previously I mentioned that you can create and execute jobs using the Elastic Transcoder console. That’s good but we need to be able to create and run Elastic Transcoder jobs from our Lambda function too. Thankfully it’s pretty easy to do using the AWS JavaScript SDK. Here are the main steps you would go through:

1. Create a basic Lambda function and add the AWS SDK to it.
2. Extract the bucket and the filename of the uploaded file from the event object that is passed in to the handler function.
3. Create a job and specify the required outputs as well as any other settings.
4. Kick off the job and enjoy seeing the progress in the Elastic Transcoder console. If you subscribed to an SNS topic (when you created the Elastic Transcoder pipeline) and configured your email, you’ll get notifications for the various stages of the job.

Monitor your jobs from the Elastic Transcoder console

An example Lambda function that meets our requirements is given below. Note that it creates a job with three outputs: Generic 720p, webm and HLSv3.

'use strict';

var AWS = require('aws-sdk');

var eltr = new AWS.ElasticTranscoder({
  apiVersion: '2012-09-25',
  region: 'us-east-1'
});

exports.handler = function(event, context) {
  console.log('Executing Elastic Transcoder Orchestrator');

  var pipelineId = '112321321343-2abcc1';
  var bucket = event.Records[0].s3.bucket.name;
  var key = event.Records[0].s3.object.key;

  if (bucket !== 'acloud-video-input') {
    context.fail('Incorrect Video Input Bucket');
    return;
  }

  // S3 delivers the key URL-encoded; the object name may contain spaces
  var srcKey = decodeURIComponent(key.replace(/\+/g, ' '));
  var newKey = srcKey.split('.')[0];

  var params = {
    PipelineId: pipelineId,
    OutputKeyPrefix: newKey + '/',
    Input: {
      Key: srcKey,
      FrameRate: 'auto',
      Resolution: 'auto',
      AspectRatio: 'auto',
      Interlaced: 'auto',
      Container: 'auto'
    },
    Outputs: [{
      Key: 'mp4-' + newKey + '.mp4',
      ThumbnailPattern: 'thumbs-' + newKey + '-{count}',
      PresetId: '1351620000001-000010', // Generic 720p
      Watermarks: [{
        InputKey: 'watermarks/logo-horiz-large.png',
        PresetWatermarkId: 'BottomRight'
      }]
    }, {
      Key: 'webm-' + newKey + '.webm',
      ThumbnailPattern: '',
      PresetId: '1351620000001-100240', // Webm 720p
      Watermarks: [{
        InputKey: 'watermarks/logo-horiz-large.png',
        PresetWatermarkId: 'BottomRight'
      }]
    }, {
      Key: 'hls-' + newKey + '.ts',
      ThumbnailPattern: '',
      PresetId: '1351620000001-200010', // HLS v3 2Mb/s
      Watermarks: [{
        InputKey: 'watermarks/logo-horiz-large.png',
        PresetWatermarkId: 'BottomRight'
      }]
    }]
  };

  console.log('Starting Job');

  eltr.createJob(params, function(err, data) {
    if (err) {
      console.log(err);
      context.fail(err);
      return;
    }
    console.log(data);
    context.succeed('Job well done');
  });
};
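One subtlety worth calling out from the function above: S3 delivers object keys URL-encoded, with spaces turned into plus signs, so the key must be decoded before it is handed to Elastic Transcoder. In isolation, the decoding step looks like this:

```javascript
// S3 event keys are URL-encoded and use '+' for spaces;
// decode before using the key as an Elastic Transcoder input.
function decodeS3Key(key) {
  return decodeURIComponent(key.replace(/\+/g, ' '));
}

console.log(decodeS3Key('AWS+Lambda+101.mp4')); // 'AWS Lambda 101.mp4'
```

Skipping this step means the job references a key that doesn't exist in the bucket, and the pipeline fails with a "file not found" style error rather than anything obviously related to encoding.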

Presets

One detail you might notice in the Lambda function above is the use of presets (e.g. 1351620000001-000010). Presets describe how to encode the given file. The full list of available presets can be found in the AWS documentation.

You can also create your own presets. Click the Presets link in the Elastic Transcoder console to see the full list of system presets, and hit the Create New Preset button to define a new one.

Presets allow for a lot of granular control over how files are encoded. Naturally, you can find a lot of detail in the appropriate AWS documentation.
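Presets can also be created through the SDK. As a rough illustration of the shape of such a call (the preset name and codec settings below are invented for the example, not recommendations, and the client is injected so nothing hits AWS):

```javascript
// Sketch: define a custom H.264 preset via the Elastic Transcoder SDK.
// All names and settings here are illustrative placeholders.
function createCustomPreset(eltr, callback) {
  var params = {
    Name: 'acloud-540p',
    Description: 'Custom 540p preset for low-bandwidth students',
    Container: 'mp4',
    Video: {
      Codec: 'H.264',
      BitRate: '1200',
      FrameRate: 'auto',
      MaxWidth: '960',
      MaxHeight: '540',
      SizingPolicy: 'ShrinkToFit',
      PaddingPolicy: 'NoPad',
      DisplayAspectRatio: 'auto',
      KeyframesMaxDist: '90',
      FixedGOP: 'false'
    },
    Audio: { Codec: 'AAC', BitRate: '128', SampleRate: '44100', Channels: '2' },
    Thumbnails: {
      Format: 'png',
      Interval: '60',
      MaxWidth: '192',
      MaxHeight: '108',
      SizingPolicy: 'ShrinkToFit',
      PaddingPolicy: 'NoPad'
    }
  };
  eltr.createPreset(params, callback);
  return params;
}
```

The response includes the generated preset ID, which you would then reference in the Outputs of a job exactly like the system preset IDs above.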

Next Steps

There is much more that the Elastic Transcoder can do that we haven’t yet touched on. Captions and DRM immediately come to mind and both are extremely important for the platform. The A Cloud Guru team have big plans for the serverless media encoding and serving system, so watch this space.

Have you used the Elastic Transcoder and what were your impressions? I am keen to hear about your experience and what you think about the platform.

A Cloud Guru

The mission of A Cloud Guru is to engage individuals in a journey to level-up their cloud computing skills by delivering the world’s leading educational content designed to evolve both mindsets and careers.

“Let no man in the world live in delusion. Without a Guru, none can cross over to the other shore.” — Guru Nanak

Our courses are delivered by industry experts with a shared passion for cloud computing. We strive to serve our growing community of cloud gurus, who generously contribute their insights in our forums, workshops, meet-ups, and conferences.

Keep up with the A Cloud Guru crew @acloudguru.