What is Elastic Transcoder

This recipe explains what Elastic Transcoder is.

What is Elastic Transcoder?

Elastic Transcoder is an AWS service that converts media files stored in an S3 bucket into media files supported by various devices.

 

Elastic Transcoder is a cloud-based media transcoder.

 

It is used to convert media files from their original source format into formats that can be played on smartphones, tablets, PCs, and other devices.

 

It includes transcoding presets for popular output formats, so you don't have to guess which settings will work best on which devices.

 

With Elastic Transcoder, you pay based on the number of minutes of output and the resolution at which you transcode.

 

Components of Elastic Transcoder

Elastic Transcoder is made up of four parts:

  • Jobs
  • Pipelines
  • Presets
  • Notifications

   
  • Jobs
   

The job does the actual transcoding work. Each job can convert one file into up to 30 output formats; for example, if you want a media file in eight different formats, a single job creates all eight output files. When you create a job, you specify the name of the file to be transcoded.
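A job like the one described above can be sketched with the AWS SDK for Python (boto3). The sketch below only builds the request parameters as a plain dict; the pipeline ID, object keys, and preset ID are hypothetical, and the actual `create_job` call (which needs AWS credentials) is shown in a comment.

```python
# Hypothetical IDs and S3 object keys, for illustration only.
PIPELINE_ID = "1111111111111-abcde1"   # ID of an existing pipeline (assumed)
WEB_PRESET_ID = "1351620000001-000010" # an assumed preset ID

def build_job_request(pipeline_id, input_key, output_key, preset_id):
    """Build the parameters for an Elastic Transcoder create_job() call."""
    return {
        "PipelineId": pipeline_id,
        "Input": {"Key": input_key},   # object key in the pipeline's input bucket
        "Outputs": [                   # a single job supports up to 30 outputs
            {"Key": output_key, "PresetId": preset_id},
        ],
    }

request = build_job_request(PIPELINE_ID, "raw/talk.mov",
                            "web/talk-720p.mp4", WEB_PRESET_ID)

# With valid AWS credentials, the job would be submitted like this:
# client = boto3.client("elastictranscoder", region_name="us-east-1")
# response = client.create_job(**request)
```

To request eight formats, you would simply append eight entries to the `Outputs` list, one per preset.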

   

   
  • Pipelines
   

Pipelines are the queues that contain your transcoding jobs. When you create a job, you must specify which pipeline you want to add your job to. If you specify multiple formats in a job, Elastic Transcoder creates files for each format in the order you specify.

    You can create two kinds of pipelines: one for standard-priority jobs and one for high-priority jobs. Most jobs fall into the standard-priority category. When you need a file transcoded quickly, submit the job to the high-priority pipeline.
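Since priority is just a matter of which pipeline you submit a job to, a common pattern is to create two pipelines over the same buckets. The sketch below builds `create_pipeline` parameters as plain dicts; the bucket names and IAM role ARN are hypothetical.

```python
# Hypothetical bucket names and IAM role ARN, for illustration only.
def build_pipeline_request(name, input_bucket, output_bucket, role_arn):
    """Build the parameters for an Elastic Transcoder create_pipeline() call."""
    return {
        "Name": name,                  # e.g. one pipeline per priority level
        "InputBucket": input_bucket,   # S3 bucket holding the source media
        "OutputBucket": output_bucket, # S3 bucket for the transcoded output
        "Role": role_arn,              # IAM role Elastic Transcoder assumes
    }

ROLE = "arn:aws:iam::123456789012:role/transcoder"  # assumed role ARN

# Route most jobs to "standard"; submit urgent jobs to "high-priority".
standard = build_pipeline_request("standard", "media-in", "media-out", ROLE)
urgent = build_pipeline_request("high-priority", "media-in", "media-out", ROLE)

# With credentials: boto3.client("elastictranscoder").create_pipeline(**standard)
```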
   

   
  • Presets
   

Presets are templates that contain the settings for converting a media file from one format to another. Elastic Transcoder includes standard presets for common formats, and you can also create your own presets for formats that the defaults don't cover. When you create a job, you specify the preset you want to use.
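A custom preset bundles container, video, audio, and thumbnail settings. The sketch below builds a `create_preset`-style request as a plain dict; all of the values are illustrative settings for an assumed 480p MP4 preset, not recommendations.

```python
# Sketch of a custom preset, modeled on the create_preset() parameters.
# All values below are illustrative assumptions, not recommendations.
def build_preset_request(name):
    return {
        "Name": name,
        "Description": "Custom 480p MP4 preset",
        "Container": "mp4",
        "Video": {
            "Codec": "H.264",
            "BitRate": "1200",        # kbps; the API takes strings
            "FrameRate": "30",
            "MaxWidth": "854",
            "MaxHeight": "480",
            "SizingPolicy": "ShrinkToFit",
            "PaddingPolicy": "NoPad",
            "DisplayAspectRatio": "auto",
            "KeyframesMaxDist": "90",
            "FixedGOP": "false",
        },
        "Audio": {
            "Codec": "AAC",
            "SampleRate": "44100",
            "BitRate": "128",
            "Channels": "2",
        },
        "Thumbnails": {
            "Format": "png",
            "Interval": "60",         # one thumbnail per minute of video
            "MaxWidth": "192",
            "MaxHeight": "108",
            "SizingPolicy": "ShrinkToFit",
            "PaddingPolicy": "NoPad",
        },
    }

preset = build_preset_request("custom-480p")
# With credentials: boto3.client("elastictranscoder").create_preset(**preset)
```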

   

   
  • Notifications
   

Notifications are an optional feature of Elastic Transcoder. Delivered through Amazon Simple Notification Service (SNS), they keep you up to date on the status of a job: when Elastic Transcoder begins processing it, when it finishes, and whether any warning or error conditions occur. You configure Notifications when you create a pipeline.
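In the pipeline configuration, notifications map each job event to an SNS topic. The sketch below shows the shape of that mapping; the topic ARN is hypothetical.

```python
# The Notifications setting of a pipeline maps each job event to an SNS
# topic ARN; an empty string disables that event. The ARN is hypothetical.
TOPIC = "arn:aws:sns:us-east-1:123456789012:transcode-status"

notifications = {
    "Progressing": TOPIC,  # Elastic Transcoder has started processing a job
    "Completed": TOPIC,    # the job finished successfully
    "Warning": TOPIC,      # the job finished with a warning
    "Error": TOPIC,        # the job failed ("" would disable this event)
}

# Passed when the pipeline is created, e.g.:
# client.create_pipeline(Name=..., InputBucket=..., OutputBucket=...,
#                        Role=..., Notifications=notifications)
```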

   
 

Features

 
   
  • Easy to use
   

Amazon Elastic Transcoder is intended to be simple to use. To begin, use the AWS Management Console, the service API, or the SDKs. Transcoding presets in the system make it simple to get transcoding settings right the first time. We offer pre-defined presets for creating media files that will play on a variety of devices (such as smartphones and tablets), as well as presets for creating media files that are optimised for playback on a specific device (like the Amazon Kindle Fire HD or Apple iPod touch). You can also create segmented files and playlists for delivery to compatible devices via the HLS, Smooth, or MPEG-DASH protocols. Developers building applications that require transcoding can use the AWS SDKs for Java, .NET, Node.js, PHP, Python, and Ruby, as well as the AWS Command Line Interface.
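The segmented files and playlists mentioned above are requested as part of a job. The sketch below builds a job request that produces HLS segments plus a master playlist; the pipeline ID, keys, and preset ID are hypothetical, and the actual API call is left as a comment.

```python
# Sketch of a job request producing HLS segments and a master playlist.
# The IDs and keys below are hypothetical.
def build_hls_job_request(pipeline_id, input_key, hls_preset_id):
    outputs = [{
        "Key": "hls/video-1m",       # base name for the segmented output
        "PresetId": hls_preset_id,   # an HLS preset (assumed ID)
        "SegmentDuration": "10",     # seconds per HLS segment
    }]
    return {
        "PipelineId": pipeline_id,
        "Input": {"Key": input_key},
        "OutputKeyPrefix": "talk/",  # S3 prefix for all outputs of this job
        "Outputs": outputs,
        "Playlists": [{
            "Name": "master",        # the master playlist tying outputs together
            "Format": "HLSv3",       # or "Smooth" / "MPEG-DASH"
            "OutputKeys": [o["Key"] for o in outputs],
        }],
    }

hls_request = build_hls_job_request("1111111111111-abcde1",
                                    "raw/talk.mov", "hls-preset-id")
# With credentials: boto3.client("elastictranscoder").create_job(**hls_request)
```

Adding more entries to `outputs` (one per bitrate) and listing their keys in `OutputKeys` is how an adaptive-bitrate ladder would be expressed.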

   

   
  • Elastically Scalable
   

Amazon Elastic Transcoder is built to scale with your media transcoding workload. Amazon Elastic Transcoder is designed to handle large amounts of media files as well as large file sizes. Transcoding pipelines allow you to perform multiple transcodes at the same time. To provide scalability and reliability, Amazon Elastic Transcoder makes use of other Amazon Web Services such as Amazon S3, Amazon EC2, Amazon DynamoDB, Amazon Simple Workflow (SWF), and Amazon Simple Notification Service (SNS).

   

   
  • Cost Effective
   

Amazon Elastic Transcoder has a content duration-based pricing model, which means you pay based on the length of the output media in minutes. For example, if the transcoded output of your video is 30 minutes long, you will be charged for 30 minutes of transcoding. Similarly, if you make a 20-minute video clip out of a 30-minute input file, you will be charged for 20 minutes of transcoding. Alternatively, if you combine two 5-minute input files into a single 10-minute output file, you will be charged for 10 minutes of transcoding. There are no minimum transcoding volumes, monthly commitments, or long-term contracts with Amazon Elastic Transcoder.
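The three billing examples above can be checked with a few lines of arithmetic. The per-minute rate below is an assumption for illustration, not a published AWS price.

```python
# Duration-based pricing: you pay per minute of *output*, not input.
# The rate below is a hypothetical figure, not a published AWS price.
RATE_PER_OUTPUT_MINUTE = 0.03  # assumed USD per output minute

def transcoding_cost(output_minutes, rate=RATE_PER_OUTPUT_MINUTE):
    """Cost of a job, determined solely by the output duration."""
    return round(output_minutes * rate, 2)

full = transcoding_cost(30)      # a 30-minute output bills 30 minutes
clip = transcoding_cost(20)      # a 20-minute clip from a 30-minute input
joined = transcoding_cost(5 + 5) # two 5-minute inputs joined: 10 minutes
```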

   

   
  • Managed
   

Amazon Elastic Transcoder allows you to concentrate on your content rather than managing transcoding software in a distributed cloud environment. The service manages the process of keeping codecs up to date as well as scaling and operating the system. When combined with our service API and SDKs, this makes it simple to create media solutions that utilise Amazon Elastic Transcoder.

   

   
  • Secured
   

Your content is in your hands: your assets are stored in your own Amazon S3 buckets, which you grant us access to via IAM roles. This makes it simple to integrate into your existing security and identity framework without sacrificing control. We used security best practices learned while building other Amazon Web Services to build Amazon Elastic Transcoder. Please visit the AWS Security Center for more information on AWS security. AWS Compliance has more information on compliance, including MPAA best practices.

   

   
  • Seamless Delivery
   

You can store, transcode, and deliver your content using Amazon Elastic Transcoder, Amazon S3, and Amazon CloudFront. By setting the S3 permissions for your CloudFront distribution in Amazon Elastic Transcoder, transcoding content and then delivering the output videos via progressive download or adaptive bitrate streaming (HLS, Smooth, or MPEG-DASH) through CloudFront becomes a simple one-step process.

   

   
  • AWS integration
   

Amazon Elastic Transcoder is a critical media building block for AWS end-to-end media solutions. For example, you can use Amazon Glacier to store master content, Amazon Elastic Transcoder to convert masters to renditions for distribution stored in Amazon S3, Amazon CloudFront to stream these renditions at scale over the Internet, and CloudWatch to monitor the health of your transcoding workflow.

   
 

