Using Azure Media Services for everyday video content delivery

What’s Azure Media Services?

If you want the detailed explanation, here you go:

The short version: it’s a set of services geared towards scalable video transcoding and publishing. If you are familiar with AWS, think of something along the lines of Elastic Transcoder. You will not need these kinds of services for making a video available online once in a while; for that, a simple file upload and exposing the file online does the job. However, things change if you have to manage a lot of videos and video formats with constraints like Digital Rights Management (DRM) and, on top of that, need to cater for great scale and resilience. That is where Media Services kicks in. Ultimately, it does a lot of processing for you and makes your video content available via so-called Streaming Endpoints.

In Microsoft Azure Media Services, a Streaming Endpoint represents a streaming service that can deliver content directly to a client player application, or to a Content Delivery Network (CDN) for further distribution. Media Services also provides seamless Azure CDN integration. The outbound stream from a Streaming Endpoint service can be a live stream or a video on demand Asset in your Media Services account.

Scaling out media delivery via Media Services works by scaling the Streaming Endpoints and/or using a CDN. At the moment, the integrated option would have to be Verizon, though; for other CDN providers you have to apply some manual configuration.

How to put it all to practical use

In the next paragraphs I will give you a practical, technology- and programming-language-independent implementation guide. If you are a NodeJS developer this guide will be particularly interesting, given that at the time this article was written (1st of November 2016) there was no official Microsoft SDK for Media Services.

A couple of things on taxonomy and some preparation

So let us assume a simple video lifecycle: upload the video, en-/transcode it, and make it available for streaming. Things could be more sophisticated once DRM comes into play, and I will touch on that briefly later, but for now let us keep it all clean and simple.


Key thing for uploading: the Media Services REST API does not handle the actual upload. That is done through the Storage REST API.

So all in all, the Media Services API and the Storage API, together with the more general ACS (Access Control Service) API, are the main APIs we are looking at.

An asset is a container for multiple types or sets of objects in Media Services, including video, audio, images, thumbnail collections, text tracks, and closed caption files.

You need permissions for both the Media Service and the Storage Service it uses.

Here is a Postman collection that will make working with the API endpoints a lot easier, kindly contributed by John: get it now.

A recipe for using the Media Services

First things first, obtain an access token from ACS (Access Control Services employed by Azure).
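As a minimal NodeJS sketch (no SDK needed), the token request is a plain OAuth client-credentials POST. The endpoint URL, scope, and helper name below are my assumptions based on the classic ACS flow for Media Services; verify them against your account before relying on them:

```javascript
// Sketch: building the OAuth request body for an ACS access token.
// ASSUMPTIONS: endpoint URL and scope follow the classic Media Services
// ACS flow; clientId/clientSecret are the account name and account key.
const ACS_TOKEN_URL =
  'https://wamsprodglobal001acs.accesscontrol.windows.net/v2/OAuth2-13';

function buildTokenRequestBody(clientId, clientSecret) {
  return new URLSearchParams({
    grant_type: 'client_credentials',
    client_id: clientId,
    client_secret: clientSecret,
    scope: 'urn:WindowsAzureMediaServices',
  }).toString();
}

// POST this body with Content-Type: application/x-www-form-urlencoded
// to ACS_TOKEN_URL; the JSON response carries an `access_token` field.
```

Every subsequent REST call then carries that token in an Authorization: Bearer header.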

Find out about the closest Media Endpoint API, using the header x-ms-version: 2.11.

That would be something like:

NOTE: You should initially connect to the Media Services root URI, and if you get a 301 redirect back in response, you should make subsequent calls to the new URI. In addition, do not use any auto-redirect/follow logic in your requests. HTTP verbs and request bodies will not be forwarded to the new URI.
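The manual-redirect rule above can be captured in a tiny helper (a sketch; the function name is mine). The point is that verbs and bodies are not forwarded automatically, so you must pick up the Location header yourself and reuse it for every later call:

```javascript
// Decide which URI all subsequent Media Services calls should target.
// If the initial call to the root URI answers 301, switch to the URI
// from the Location header; otherwise keep the current one.
function resolveApiEndpoint(statusCode, locationHeader, currentUri) {
  if (statusCode === 301 && locationHeader) {
    return locationHeader; // use this for every subsequent request
  }
  return currentUri; // no redirect: keep talking to the current URI
}
```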

Next, create an Asset. (It’s metadata.) That works via {{Media Endpoint API}}/Assets.

A call to /Assets will create a new Blob service container, making use of the Storage Account associated with the Media Service. The container URI looks like this in the response: “Uri”: “
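A sketch of what the /Assets request could look like from NodeJS. The header set follows the classic Media Services REST conventions (OData verbose JSON plus the x-ms-version from above); treat the exact values and the helper name as assumptions to double-check:

```javascript
// Sketch: building the POST /Assets request that creates the Asset
// metadata (and, server-side, the backing blob container).
function buildAssetRequest(apiEndpoint, accessToken, assetName) {
  return {
    method: 'POST',
    url: apiEndpoint + '/Assets',
    headers: {
      'Content-Type': 'application/json;odata=verbose',
      Accept: 'application/json;odata=verbose',
      DataServiceVersion: '3.0',
      MaxDataServiceVersion: '3.0',
      'x-ms-version': '2.11',
      Authorization: 'Bearer ' + accessToken,
    },
    body: JSON.stringify({ Name: assetName }),
  };
}
```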

Then you upload the physical video file; that happens via the Storage API, the Blob service to be more precise.

Alternatively, use the Azure CLI or one of the SDKs, like the NodeJS storage SDK that deals with blobs.

(NOTE: Storing media files might not be as straightforward as just dropping a file into Azure Blob Storage. You might have to encode the file (think of DRM); here is a C#-coded Azure Function that demonstrates how this would work. Not relevant for content without DRM, yet good to be aware of.)

The AssetFile entity – metadata again, like the Asset – represents a video or audio file that is stored in a blob container. An asset file is always associated with an asset, and an asset may contain one or many AssetFiles. The Media Services Encoder task fails if an asset file object is not associated with a digital file in a blob container.

That is why you have to “merge” the blob previously uploaded with the Asset metadata, using the {{Media Endpoint API}}/Files API.

For that you will need the Asset Id that was returned as Id in your initial /Assets call; it looks like this:

"Id": "nb:cid:UUID:8264c73e-6ba4-4c68-ba9c-62caf49e84a9"
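A sketch of the /Files payload that ties the uploaded blob to the Asset. The field names follow the classic AssetFile entity; the default MIME type and the helper name are my assumptions:

```javascript
// Sketch: building the body for the /Files call that associates the
// uploaded blob with the Asset metadata.
function buildAssetFileBody(assetId, blobName, mimeType) {
  return JSON.stringify({
    Name: blobName,          // must match the uploaded blob's name exactly
    ParentAssetId: assetId,  // the "Id" returned by the POST /Assets call
    MimeType: mimeType || 'video/mp4', // assumed default for MP4 uploads
    IsPrimary: 'false',
  });
}
```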

Once this is done it’s time for encoding.

For this, create a job via the /Jobs endpoint (HTTP POST) and, to get started, specify one of the default media processors, like the one with the id nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56. Combined with the H264 Multiple Bitrate 720p configuration, this results in neat MP4s stored right in the blob container, where they will be ready for publishing.
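The job body is OData JSON that references the input Asset and carries the task configuration, including a small XML TaskBody wiring input to output. A sketch under the assumption that the standard single-task layout is used (job name and helper name are mine):

```javascript
// Sketch: building the POST /Jobs body for a single encode task using
// the media processor id and preset name mentioned above.
function buildEncodeJobBody(apiEndpoint, assetId) {
  const taskBody =
    '<?xml version="1.0" encoding="utf-8"?><taskBody>' +
    '<inputAsset>JobInputAsset(0)</inputAsset>' +
    '<outputAsset>JobOutputAsset(0)</outputAsset></taskBody>';
  return JSON.stringify({
    Name: 'Encode to H264 Multiple Bitrate 720p',
    InputMediaAssets: [
      { __metadata: { uri: apiEndpoint + "/Assets('" + assetId + "')" } },
    ],
    Tasks: [
      {
        Configuration: 'H264 Multiple Bitrate 720p',
        MediaProcessorId: 'nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56',
        TaskBody: taskBody,
      },
    ],
  });
}
```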

Get more out-of-the-box media processors via /MediaProcessors.

Check the state of the job via the GET /Jobs endpoint. Status 3 indicates processing is done. (More details here.)
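Since the state comes back as a number, a small lookup keeps the polling code readable. The mapping below is how I understand the documented job states; verify it against the docs before depending on the non-terminal values:

```javascript
// Sketch: the numeric Media Services job states as a lookup table.
const JOB_STATES = {
  0: 'Queued',
  1: 'Scheduled',
  2: 'Processing',
  3: 'Finished', // the state the article polls for
  4: 'Error',
  5: 'Canceled',
  6: 'Canceling',
};

function isJobDone(state) {
  return state === 3; // Finished
}
```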

Alternatively you can establish a webhook using /NotificationEndPoints.

Last but not least, you need to publish it all. For this you will have to do two additional things.

First, you will need a so-called Access Policy. (It can be shared, so you only need to create it once.)

Generate the Access Policy; it will then allow you to create a so-called Locator.

Both operations, when going via the REST API, require you to supply a whole bunch of HTTP headers. The Postman collection provided earlier will tell you in detail which ones.

In both cases you need to use the Media Endpoint API you found out about at the beginning of the sequence described here, not the root URI!

For the Access Policy you will get an ID like this:

"Id": "nb:pid:UUID:853c765f-04ca-46c3-a519-26ac2c817f4a"

Use this in conjunction with the Locator. The Locator finally determines when the video is published.

Note that there is a choice of Progressive vs. Streaming locators, reflected by the “Type” parameter in the call for creating the Locator: 1 = Progressive, 2 = On Demand Streaming.
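The two publishing payloads can be sketched as follows. Field names follow the classic REST entities; the policy name, the 1 = Read permission value, and the helper names are assumptions to verify:

```javascript
// Sketch: body for POST /AccessPolicies. Permissions 1 is assumed
// to mean Read, which is all a playback locator needs.
function buildAccessPolicyBody(days) {
  return JSON.stringify({
    Name: 'ReadPolicy',
    DurationInMinutes: days * 24 * 60,
    Permissions: 1, // Read
  });
}

// Sketch: body for POST /Locators, tying the policy to the asset.
function buildLocatorBody(accessPolicyId, assetId, type) {
  return JSON.stringify({
    AccessPolicyId: accessPolicyId, // "Id" from the AccessPolicy response
    AssetId: assetId,               // "Id" from the /Assets response
    Type: type,                     // 1 = Progressive, 2 = On Demand Streaming
  });
}
```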

How to test the streaming endpoints

In Azure you are given a couple of Streaming endpoints with every published video, simply because of different supported asset formats.

GET {{ApiEndpoint}}/Locators(‘<locatorid>’) returns a property named Path that you can use to construct the publishing URL.

Like so: <Path to streaming endpoint>/<name of the encoded asset file without suffix>.ism/manifest

Let’s say the movie you encoded was named then the URI should be something like this:


The .ism/manifest suffix is for streaming, whereas the “Progressive Streaming” (= downloading) URLs usually end in .mp4.
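The URL construction rule just described is simple enough to sketch as two helpers (names are mine):

```javascript
// Sketch: turning the Locator's Path plus the encoded file name into
// the two kinds of publishing URLs described above.
function smoothStreamingUrl(locatorPath, assetFileName) {
  const base = assetFileName.replace(/\.[^.]+$/, ''); // strip the suffix
  return locatorPath.replace(/\/?$/, '/') + base + '.ism/manifest';
}

function progressiveUrl(locatorPath, mp4FileName) {
  return locatorPath.replace(/\/?$/, '/') + mp4FileName;
}
```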

Test both types of endpoints in a browser by copying the full URL into the resource text box here: 

Just make sure to cut off the “http:” part at the beginning; the streaming source URL starts with “//”.

If you want to set up the Azure Media Player yourself, that is easy enough; check out this JSFiddle for inspiration. (I might have taken the published streaming endpoint down by the time you read this, in which case no video is displayed; sorry for that.)


Doing all of this with Java:

Another NB:

Azure Media Services come with these limitations: 

Most of them are soft limits, though, and can be lifted on request.