

This reference architecture provides a concrete example of how to create a scalable, portable, and cost-effective media processing workflow. It demonstrates a traditional Video On-Demand (VOD) workflow in which a source video is transcoded into an online distribution format. The prescribed architecture can be cost-effectively scaled and extended to the formats required for your video transcoding use cases. Some use cases addressed by this workflow include:

  • Supporting content for blogs or social media
  • Producing video or digital assets for streaming services
  • Embedding content in business applications

Review the video transcoding architecture diagrams for a high-level depiction of a general video transcoding workflow, as well as a more granular version which prescribes specific technologies to implement the workflow.

Technologies Used

The workflow in this document is implemented on the Akamai Connected Cloud (in particular, the Linode Kubernetes Engine), Akamai CDN, and a CI/CD pipeline powered by GitHub Actions. The full list of technologies used includes:

  • Akamai Connected Cloud technologies:

    Linode Kubernetes Engine (LKE): A fully-managed Kubernetes container orchestration engine for deploying and managing containerized applications and workloads
    NodeBalancers: Managed cloud load balancers
    Object Storage: S3-compatible object storage, used to manage unstructured data like video files
    Block Storage: Network-attached block storage volumes
    API: Programmatic access to Linode products and services
    DNS Manager: Domain management, free for Akamai Connected Cloud customers
  • Other software and services:

    Argo: Kubernetes-native workflow engine
    FFmpeg: Encoding/decoding/transcoding multimedia framework
    PyTranscoder: Python wrapper for FFmpeg
    MediaInfo: Gathers metadata for audio/video files
    GitHub: Git-based managed version control service
    Terraform: Infrastructure-as-code provisioning tool
    Helm: Package manager for Kubernetes
    DockerHub: Container image library
    Let’s Encrypt: Free, automated, open certificate authority
    cert-manager: Cloud native certificate management
    NGINX: Load balancer, web server, and reverse proxy
    Prometheus: Monitoring system and time series database
    Grafana: Observability platform
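
As an illustration of the transcoding step at the heart of this workflow, the following Python sketch builds the FFmpeg command line for converting a source video into an H.264/AAC MP4 suitable for web distribution. The function name, file names, and parameter defaults are hypothetical examples, not part of the reference architecture; in practice, a tool like PyTranscoder would drive FFmpeg from a profile configuration.

```python
import shlex


def build_transcode_cmd(src: str, dst: str, height: int = 720, crf: int = 23) -> list[str]:
    """Build an FFmpeg command that transcodes `src` into an H.264/AAC MP4.

    `height` and `crf` are illustrative knobs: the scale filter caps the
    output height while preserving aspect ratio, and CRF trades quality
    against file size (lower CRF = higher quality).
    """
    return [
        "ffmpeg", "-y",              # overwrite output without prompting
        "-i", src,                   # source video
        "-vf", f"scale=-2:{height}", # resize; -2 keeps width divisible by 2
        "-c:v", "libx264",           # H.264 video codec
        "-crf", str(crf),            # constant rate factor (quality target)
        "-preset", "medium",         # encode speed / compression trade-off
        "-c:a", "aac",               # AAC audio codec
        "-b:a", "128k",              # audio bitrate
        "-movflags", "+faststart",   # relocate moov atom for fast web playback
        dst,
    ]


if __name__ == "__main__":
    cmd = build_transcode_cmd("source.mov", "output.mp4")
    print(shlex.join(cmd))
    # Inside a workflow pod, this command would then be executed, e.g.:
    # subprocess.run(cmd, check=True)
```

In the architecture described here, a command like this runs inside a short-lived container scheduled by Argo, with the source and output files staged to and from Object Storage.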

Business Benefits

  • Extensibility: This reference architecture supports a wide range of media output formats and workflow step definitions. It can be configured to output to any device, platform, or audience specification.

  • Scalability: This solution scales horizontally by adding more Linodes to the Kubernetes cluster, which enables high throughput. Scaling out allows you to process a large amount of content in a short period of time, such as during a service launch or marketing campaign.

  • Cost-effectiveness: Traditional media workflows must maintain deployed capacity sized for peak usage. This reference architecture is built on Kubernetes and uses the Argo workflow engine, which schedules pods dynamically and tears them down when work completes. Because resources are only consumed while jobs run, your cost footprint is minimized.

