Building a Simple Video Processing Web App with Node.js and Coconut API


In this tutorial, we'll walk through using the Coconut Transcoding API with Node.js and Express.js to upload and transcode video files to MP4, as well as generate thumbnails. We'll cover the following steps:

  1. Creating an HTML form to receive a video file.
  2. Uploading the video file to Amazon S3 with a pre-signed URL.
  3. Creating a transcoding job with the Coconut Transcoding API.
  4. Receiving a webhook notification with the result URLs.
  5. Showing the video player to play the transcoded video.


Before we start, make sure you have the following:

  1. Node.js and npm installed on your local machine.
  2. An Amazon S3 bucket for storing uploaded files.
  3. An account on the Coconut Transcoding API website and your API key.

Let's begin with creating an HTML form to receive a video file.

Creating an HTML form to upload a video file directly to AWS S3

First, create a new HTML file named index.html and add the following code to create a basic form that allows users to select a video file:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Transcode a video</title>
  <script src="https://cdn.tailwindcss.com"></script>
  <style>
    .spinner {
      border: 2px solid rgba(0, 0, 0, 0.1);
      border-left-color: #818cf8;
      border-top-color: #818cf8;
      animation: spin 1s linear infinite;
      border-radius: 50%;
      width: 2rem;
      height: 2rem;
    }

    @keyframes spin {
      to {
        transform: rotate(360deg);
      }
    }
  </style>
</head>
<body class="bg-gray-100 flex items-center justify-center min-h-screen">
  <div class="w-full h-screen flex items-center justify-center" id="upload-form-container">
    <form id="upload-form" class="bg-white p-8 rounded-lg shadow-md w-full max-w-md">
      <h1 id="title" class="font-semibold text-xl text-center mb-4">Transcode a video</h1>
      <label for="file-input" id="file-label" class="bg-purple-500 text-white font-medium cursor-pointer py-2 px-4 rounded mr-4 block">Choose File</label>
      <span id="file-name" class="inline"></span>
      <input type="file" name="file" id="file-input" class="hidden" accept="video/*" required />
      <button type="submit" id="upload-button" class="hidden block bg-purple-500 text-white font-medium cursor-pointer py-2 px-4 rounded mt-4 w-full">Upload</button>
      <div class="progress h-4 w-full bg-gray-300 mt-4 rounded hidden">
        <div class="progress-bar h-full w-0 bg-purple-500 rounded"></div>
      </div>
      <div id="spinner" class="flex justify-center items-center mt-4 hidden">
        <div class="spinner mx-auto"></div>
      </div>
      <div id="video-container" class="w-full max-w-md mx-auto mt-8 hidden">
        <video id="video-player" controls preload="none" poster="" class="w-full">
          <source id="video-source" src="" type="video/mp4">
        </video>
      </div>
      <button id="reset-btn" class="mt-4 bg-purple-500 text-white py-2 px-4 rounded mx-auto" style="display: none;">New job</button>
    </form>
  </div>
  <script src="/static/coconut.js"></script>
</body>
</html>

This form contains an input field of type file with the accept attribute set to "video/*", which restricts the file selection to video files only. The form also includes a submit button for uploading the selected file.

The JavaScript code static/coconut.js is responsible for handling the file upload process, communicating with the server-side code to get the presigned URL to upload the file directly to AWS S3 (PUT request), and starting the transcoding process with the Coconut API when the upload is done.

const uploadForm = document.getElementById('upload-form');
const fileInput = document.getElementById('file-input');
const progressBar = document.querySelector('.progress-bar');
const progressContainer = document.querySelector('.progress');
const resetBtn = document.getElementById('reset-btn');
const uploadFormContainer = document.getElementById('upload-form-container');
const fileInputName = document.getElementById('file-name');
const spinner = document.getElementById('spinner');
const title = document.getElementById('title');

async function getPresignedUrl(filename, contentType, operation) {
  const response = await fetch('/getPresignedUrl', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ filename, contentType, operation }),
  });

  if (!response.ok) {
    throw new Error('Error fetching presigned URL');
  }

  const { presignedUrl } = await response.json();
  return presignedUrl;
}

async function checkJobStatus(jobId) {
  const response = await fetch(`/status/${jobId}`);
  if (!response.ok) {
    throw new Error('Error fetching job status');
  }

  const { status, resultUrls } = await response.json();
  return { status, resultUrls };
}

function handleProgressEvent(e) {
  if (e.lengthComputable) {
    const percentComplete = Math.round((e.loaded / e.total) * 100);
    progressBar.style.width = `${percentComplete}%`;
  }
}

async function handleFileUpload(uploadedFileUrl) {
  const response = await fetch('/transcode', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ url: uploadedFileUrl }),
  });

  if (!response.ok) {
    throw new Error('Error starting transcoding');
  }

  const data = await response.json();
  return data.job;
}

function initializeUploadUI() {
  fileInputName.style.display = 'none';
  progressContainer.style.display = 'none';
  spinner.style.display = 'block';
  title.textContent = 'Transcoding file';
}
async function monitorJobStatus(jobId) {
  const checkStatusInterval = setInterval(async () => {
    try {
      const { status, resultUrls } = await checkJobStatus(jobId);
      console.log(`Job #${jobId} status: ${status}`);
      if (status === 'completed') {
        console.log(`Job #${jobId} completed. Result URLs:`, resultUrls);
        clearInterval(checkStatusInterval);
        displayVideoPlayer(resultUrls);

        resetBtn.style.display = 'block';
        spinner.style.display = 'none';
        title.textContent = 'Ready to play';
      } else if (status === 'error') {
        console.error(`Job #${jobId} encountered an error.`);
        clearInterval(checkStatusInterval);
      }
    } catch (error) {
      console.error('Error checking job status:', error);
      clearInterval(checkStatusInterval);
    }
  }, 2000);
}

function displayVideoPlayer(resultUrls) {
  const videoContainer = document.getElementById('video-container');
  const videoPlayer = document.getElementById('video-player');
  const videoSource = document.getElementById('video-source');

  videoPlayer.setAttribute('poster', resultUrls.jpg[0]);
  videoSource.setAttribute('src', resultUrls.mp4);
  videoPlayer.load();
  videoContainer.style.display = 'block';
}
async function uploadFile(putPresignedUrl, file) {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.upload.addEventListener('progress', handleProgressEvent);
    xhr.onreadystatechange = async () => {
      if (xhr.readyState === XMLHttpRequest.DONE) {
        if (xhr.status === 200) {
          const uploadedFileUrl = putPresignedUrl.split('?')[0];
          try {
            initializeUploadUI();
            const job = await handleFileUpload(uploadedFileUrl);
            console.log('File uploaded and transcoding started');
            console.log('Job object:', job);

            await monitorJobStatus(job.id);
            resolve(job);
          } catch (error) {
            console.log('Error starting transcoding', error);
            reject(error);
          }
        } else {
          console.log('Error uploading file');
          reject(new Error('Error uploading file'));
        }
      }
    };

    xhr.open('PUT', putPresignedUrl, true);
    xhr.setRequestHeader('Content-Type', file.type);
    xhr.send(file);

    progressContainer.style.display = 'block';
    title.textContent = 'Uploading file';
  });
}
async function submitForm(event) {
  event.preventDefault();

  document.getElementById('upload-button').style.display = 'none';
  fileInputName.textContent = '';

  const file = fileInput.files[0];
  if (!file) {
    alert('Please select a file to upload.');
    return;
  }

  const fileKey = `uploads/${Math.random().toString(36).substring(2)}-${file.name}`;

  try {
    const putPresignedUrl = await getPresignedUrl(fileKey, file.type, 'put');
    await uploadFile(putPresignedUrl, file);
  } catch (error) {
    console.error('Upload failed:', error);
  }
}
function handleFileInputChange(event) {
  const fileName = event.target.files[0].name;
  fileInputName.textContent = fileName;
  document.getElementById('file-label').style.display = 'none';
  document.getElementById('upload-button').style.display = 'block';
}

function resetUI() {
  fileInput.value = '';

  fileInputName.style.display = 'block';
  fileInputName.textContent = '';
  document.getElementById('file-label').style.display = 'block';
  document.getElementById('upload-button').style.display = 'none';

  resetBtn.style.display = 'none';
  progressContainer.style.display = 'none';
  uploadForm.style.display = 'block';

  const videoContainer = document.getElementById('video-container');
  const videoPlayer = document.getElementById('video-player');
  const videoSource = document.getElementById('video-source');

  videoContainer.style.display = 'none';
  videoPlayer.currentTime = 0;
  videoSource.setAttribute('src', '');
  videoPlayer.setAttribute('poster', '');

  spinner.style.display = 'none';
  progressBar.style.width = '0';

  title.textContent = 'Transcode a video';
}
uploadForm.addEventListener('submit', submitForm);
fileInput.addEventListener('change', handleFileInputChange);
resetBtn.addEventListener('click', resetUI);

Adding CORS Policy to your S3 bucket

The S3 bucket needs to have the correct CORS (Cross-Origin Resource Sharing) policy in place to accept PUT requests directly from the browser. The CORS policy allows you to control which origins can access your S3 bucket and what methods they can use.

To enable PUT requests from the browser, you need to add a CORS policy to your S3 bucket that allows PUT requests from the desired origins. Here's an example of a CORS policy that allows PUT requests from any origin (for production, replace the "*" in AllowedOrigins with your site's origin):

[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["PUT"],
    "AllowedOrigins": ["*"],
    "ExposeHeaders": []
  }
]
To add the CORS policy to your S3 bucket:

  1. Sign in to the AWS Management Console.
  2. Open the Amazon S3 console at https://console.aws.amazon.com/s3/.
  3. Click on the name of the bucket to which you want to add the CORS policy.
  4. Click on the "Permissions" tab.
  5. Scroll down to the "Cross-origin resource sharing (CORS)" section and click "Edit."
  6. Paste the CORS policy JSON into the text area.
  7. Click "Save changes."

After you've updated the CORS policy, your S3 bucket should accept PUT requests directly from the browser, as specified in the policy.

Generating pre-signed URLs server-side

In this step, we will create a server-side application using Node.js and Express.js to handle the generation of a pre-signed URL for uploading the video file directly to Amazon S3.

In the same directory, initialize a new Node.js project and install the required packages:

npm init -y
npm install express multer body-parser @aws-sdk/client-s3 @aws-sdk/s3-request-presigner @aws-sdk/util-create-request @aws-sdk/util-format-url

Next, create a config file named config.json in the same directory to store the S3 bucket name, credentials, and other settings. Note that JSON does not allow comments, so the file contains only the keys below; we'll fill in NOTIFICATION_URL later, in the webhook section.

{
  "S3_BUCKET": "your-bucket",
  "S3_REGION": "your-bucket-region",
  "AWS_ACCESS_KEY_ID": "your-access-key-id",
  "AWS_SECRET_ACCESS_KEY": "your-secret-access-key",
  "COCONUT_API_KEY": "your-coconut-api-key",
  "NOTIFICATION_URL": "the-url-to-receive-notification",
  "PORT": 3000
}

Now, create a new file named app.js and add the following code:

const { S3Client, GetObjectCommand, PutObjectCommand } = require('@aws-sdk/client-s3');
const { S3RequestPresigner } = require('@aws-sdk/s3-request-presigner');
const { createRequest } = require('@aws-sdk/util-create-request');
const { formatUrl } = require('@aws-sdk/util-format-url');
const express = require('express');
const bodyParser = require('body-parser');
const path = require('path');
const config = require('./config.json');

// In AWS SDK v3, credentials must be nested under the `credentials` key.
const s3Client = new S3Client({
  region: config.S3_REGION,
  credentials: {
    accessKeyId: config.AWS_ACCESS_KEY_ID,
    secretAccessKey: config.AWS_SECRET_ACCESS_KEY,
  },
});

const app = express();

app.use(bodyParser.json());

app.get('/', (req, res) => {
  res.sendFile(path.join(__dirname, 'index.html'));
});

app.use('/static', express.static(path.join(__dirname, 'static')));

app.post('/getPresignedUrl', async (req, res) => {
  try {
    const { filename, contentType, operation } = req.body;

    if (!filename || !contentType || !operation) {
      return res.status(400).json({ error: 'Missing required parameters' });
    }

    const params = {
      Bucket: config.S3_BUCKET,
      Key: filename,
      ContentType: contentType,
    };

    if (operation === 'put') {
      params.ACL = 'public-read';
    }

    const command = operation === 'put' ? new PutObjectCommand(params) : new GetObjectCommand(params);
    const requestPresigner = new S3RequestPresigner(s3Client.config);
    const request = await createRequest(s3Client, command);
    const presignedUrl = formatUrl(await requestPresigner.presign(request, { expiresIn: 60 * 5 })); // Expires in 5 minutes

    res.json({ presignedUrl });
  } catch (error) {
    console.error('Error generating presigned URL:', error);
    res.status(500).json({ error: 'Error generating presigned URL' });
  }
});

app.listen(config.PORT, () => {
  console.log(`Server running on port ${config.PORT}`);
});
When the /getPresignedUrl route receives a request, it generates a pre-signed URL (valid for five minutes) for uploading the video file directly to Amazon S3 and returns that URL in the response.

Creating a transcoding job with the Coconut Transcoding API

Now that we have set up the server-side application to handle video file uploads, let's integrate the Coconut Transcoding API to transcode the uploaded video file to MP4, and generate 3 thumbnails.

First, install the coconutjs package:

npm install coconutjs

Now, update the app.js file with the following code:

const Coconut = require('coconutjs');

const coconut = new Coconut.Client(config.COCONUT_API_KEY);

coconut.notification = {
  'type': 'http',
  'url': config.NOTIFICATION_URL,
};

coconut.storage = {
  'service': 's3',
  'bucket': config.S3_BUCKET,
  'region': config.S3_REGION,
  'credentials': {
    'access_key_id': config.AWS_ACCESS_KEY_ID,
    'secret_access_key': config.AWS_SECRET_ACCESS_KEY,
  },
};

async function createTranscodingJob(sourceUrl) {
  return new Promise((resolve, reject) => {
    coconut.Job.create(
      {
        input: { url: sourceUrl },
        outputs: {
          'mp4': {
            key: 'mp4',
            path: '/transcode/video.mp4',
            format: {
              quality: 4,
            },
          },
          'jpg:300x': {
            key: 'jpg',
            number: 3,
            path: '/transcode/thumbnails/thumb_%04d.jpg',
          },
        },
      },
      (job, err) => {
        if (err) return reject(err);
        resolve(job);
      }
    );
  });
}

app.post('/transcode', async (req, res) => {
  const { url } = req.body;

  try {
    // Create transcoding job
    const job = await createTranscodingJob(url);
    res.status(200).json({ job: job });
  } catch (error) {
    console.error('Error creating transcoding job:', error);
    res.status(500).json({ error: 'Error creating transcoding job' });
  }
});
This code initializes the Coconut client with your API key defined in config.json and sets up the notification and storage configurations. It also adds a new createTranscodingJob function that creates a transcoding job using the Coconut API with the uploaded video file's URL as input.

Now, the server-side application is ready to generate presigned URLs and create transcoding jobs using the Coconut Transcoding API.

Receiving a webhook notification with the result URLs

In this step, we will handle the webhook notification sent by the Coconut Transcoding API when the transcoding job is completed. The webhook will contain the result URLs for the converted video files and generated thumbnails.
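For reference, a completed-job notification has roughly the following shape. This payload is illustrative and trimmed to the fields our handler relies on (event, data.id, data.outputs); consult the Coconut documentation for the full schema:

```json
{
  "event": "job.completed",
  "data": {
    "id": 12345,
    "outputs": [
      {
        "key": "mp4",
        "type": "video",
        "url": "https://your-bucket.s3.amazonaws.com/transcode/video.mp4"
      },
      {
        "key": "jpg",
        "type": "image",
        "urls": [
          "https://your-bucket.s3.amazonaws.com/transcode/thumbnails/thumb_0001.jpg",
          "https://your-bucket.s3.amazonaws.com/transcode/thumbnails/thumb_0002.jpg",
          "https://your-bucket.s3.amazonaws.com/transcode/thumbnails/thumb_0003.jpg"
        ]
      }
    ]
  }
}
```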

We will save the result in a simple in-memory key-value store, so at the beginning of app.js, add the following code:

// Create a simple in-memory key-value store for our job results 
const DB = {};

const s3Client = ...

Update the app.js file with the following code to handle the webhook notification:

app.post('/webhook', (req, res) => {
  const notification = req.body;

  if (notification.event === 'job.completed') {
    // Handle the completed job notification
    console.log('Job completed:', notification);
    const outputs = notification.data.outputs;
    const resultUrls = {};

    for (const output of outputs) {
      if (output.type === 'video') {
        resultUrls[output.key] = output.url;
      } else if (output.type === 'image' || output.type === 'httpstream') {
        resultUrls[output.key] = output.urls;
      }
    }

    console.log('Result URLs:', resultUrls);
    DB[notification.data.id] = resultUrls;

  } else if (notification.event === 'job.error') {
    // Handle the error notification
    console.error('Job error:', notification);
  } else {
    console.log('Received event:', notification);
  }

  // Acknowledge the notification so the API does not retry it
  res.sendStatus(200);
});

Showing a video player when the job is completed

In the static/coconut.js file, once the upload completes, the job status is checked periodically by the checkJobStatus(jobId) function. When the job is completed, we use the mp4 URL to display the transcoded video in a video player. To support this, add the following code to app.js; it checks whether the job has finished and returns the corresponding result URLs:

app.get('/status/:jobID', (req, res) => {
  const jobID = req.params.jobID;

  if (DB.hasOwnProperty(jobID)) {
    res.status(200).json({ jobID, status: 'completed', resultUrls: DB[jobID] });
  } else {
    res.status(200).json({ jobID, status: 'processing', resultUrls: {} });
  }
});
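As a variation, the client's fixed setInterval loop can be written as a promise-based poller that stops itself on completion, error, or timeout. A plain-JavaScript sketch (pollStatus and its options are illustrative, not part of the app above):

```javascript
// Repeatedly invoke an async status check until it reports completion.
// checkFn must resolve to { status, resultUrls }, mirroring the /status endpoint.
async function pollStatus(checkFn, { intervalMs = 2000, maxAttempts = 150 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const { status, resultUrls } = await checkFn();
    if (status === 'completed') return resultUrls;
    if (status === 'error') throw new Error('Transcoding job failed');
    // Wait before the next poll.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Timed out waiting for the transcoding job');
}
```

In static/coconut.js this could replace monitorJobStatus with a single call such as const resultUrls = await pollStatus(() => checkJobStatus(job.id)), followed by displayVideoPlayer(resultUrls).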

Using Ngrok to tunnel webhook notifications to your local Node.js app

In the development phase, you might want to test webhook notifications on your local machine. Ngrok is a handy tool that allows you to create a secure tunnel from a public endpoint to your local development environment. This way, you can receive webhook notifications from the Coconut Transcoding API directly to your local Node.js application.

First, download and install Ngrok for your platform from the official website: https://ngrok.com/download

Ensure your Node.js server is running by executing the following command in the terminal:

node app.js

This will start your server on port 3000.

Open a new terminal window and navigate to the directory where you downloaded the Ngrok executable. Run the following command to create a secure tunnel to your local server:

ngrok http 3000

After starting Ngrok, you will see a screen with information about the created tunnel. Look for the "Forwarding" section, which contains two URLs: one with HTTP and another with HTTPS. Copy the HTTPS URL, as it is the secure version.

Update the webhook URL in the config.json file:

  "NOTIFICATION_URL": "https://your-ngrok-subdomain.ngrok.io/webhook"

Note that the path must match the webhook route defined in app.js (/webhook).

Now, when the Coconut Transcoding API sends a webhook notification, it will use the Ngrok URL, which will forward the request to your local Node.js server. This enables you to test and debug webhook notifications in your development environment.

Note: Keep in mind that Ngrok tunnels are temporary and have limited usage for the free plan. When you restart Ngrok, you will receive a new subdomain. You will need to update your webhook URL accordingly. For a more permanent solution, consider using a paid plan or deploying your application to a public server.


In this article, we've explored how to integrate the Coconut Transcoding API with a Node.js and Express.js application to handle video transcoding tasks. We covered the process step by step, from setting up the server and creating an HTML form for uploading videos to configuring the transcoding job and receiving webhook notifications with the result URLs.

We also demonstrated how to use Ngrok to tunnel webhook notifications to a local development environment, enabling you to test and debug your application more efficiently.

By following this guide, you can build a solid video transcoding solution on top of the Coconut Transcoding API: converting video files to MP4 and generating thumbnails as shown here, with the same API also able to produce other formats such as HTTP Live Streaming (HLS). This can significantly enhance the multimedia capabilities and user experience of your web application or platform.