Create movie from image and add background music

For my service, I would like to create a short movie from a still image and use an MP3 file for background music.

Is there an easy way to do this?

By default it is unfortunately not possible to create a movie from only an image and an audio file using Bitmovin. However, a work-around is possible.

You will need:

  • Audio file (MP3, WAV etc.)
  • Video file (list of supported formats) - longer than or equal to the duration of the output file you would like - it could just be x minutes of black.
  • Still image (e.g. JPG, PNG)

What you will do is transcode the video file, overlay the still image on it as a fully opaque, full-frame watermark, and at the same time use the audio file as the input when creating the audio stream.

The following uses the FixedBitrateLadder example as its starting point: bitmovin-api-sdk-examples/FixedBitrateLadder.java at main · bitmovin/bitmovin-api-sdk-examples · GitHub

First, create a couple of variables to hold the paths to the audio and video files (relative to the input storage):

// Reading links to input files from examples.properties file
String inputFilePathAudio = configProvider.getParameterByKey("AUDIO");
String inputFilePathVideo = configProvider.getParameterByKey("VIDEO");

Next you create IngestInputStreams based on the audio and video:

// Creating the ingest streams
IngestInputStream audioInputStream =
    createAudioIngestInputStream(encoding, input, inputFilePathAudio);
IngestInputStream videoInputStream =
    createVideoIngestInputStream(encoding, input, inputFilePathVideo);

As we want to trim the finished file to the length of the audio file (or to a fixed duration), we need to know the duration of the audio file in advance.

At the end of this guide, there is a function that extracts the duration of a media file using ffprobe. Under the hood, it runs a command similar to:

-> % ffprobe -i "file_example_MP3_5MG.mp3" -show_entries format=duration -v quiet -of csv="p=0"
132.205714

Using the function, we can extract the duration of a given media file:

// Getting the duration of the audio file
Double audioDuration =
    getDuration(
        configProvider.getHttpInputHost() + "/" + configProvider.getParameterByKey("AUDIO"));
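Note that the ffprobe helper shown at the end of this guide falls back to 0.0 when ffprobe prints nothing, so it can be worth sanity-checking the value before trimming with it. A minimal illustrative guard — the class and method names here are my own, not part of the SDK or the example:

```java
// Hypothetical guard: a usable duration must be a positive number of seconds.
public class DurationGuard {
  static boolean isValidDuration(Double duration) {
    return duration != null && !duration.isNaN() && duration > 0.0;
  }
}
```

If the check fails, it usually means ffprobe could not read the file (wrong URL, unreachable storage, or an unsupported container).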

Next, we trim the duration of the input streams to the extracted value (more about trimming can be found here):

// Trimming the duration of the audio stream to the duration of the audio
TimeBasedTrimmingInputStream audioTime =
    createTimeBasedTrimmingInputStream(encoding, audioInputStream, 0.0, audioDuration);

// Trimming the duration of the video stream to the duration of the audio
// Current limitation: source video duration must be equal to or longer than audio duration
TimeBasedTrimmingInputStream videoTime =
    createTimeBasedTrimmingInputStream(encoding, videoInputStream, 0.0, audioDuration);
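Given the limitation mentioned above — the source video must be at least as long as the audio — a small pre-flight check can avoid a failed encoding. This is just a sketch; neither the class nor the method exists in the SDK:

```java
// Hypothetical pre-flight check for the trimming step: the work-around only
// works when the source video covers the whole audio duration.
public class TrimPreflight {
  static boolean videoCoversAudio(double videoDuration, double audioDuration) {
    return videoDuration >= audioDuration;
  }
}
```

You could run ffprobe against both inputs and bail out early when the video is too short, instead of waiting for the encoding itself to fail.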

The next step is adding a watermark filter to the encoding (more about filters here):

// Adding the watermark-filter that creates the image as a full-frame overlay to the video
// stream(s)
List<Filter> filters = new ArrayList<>();
filters.add(createWatermarkFilter());

for (Stream videoStream : videoStreams) {
  createStreamFilterList(encoding, videoStream, filters);
}

The createWatermarkFilter function looks like this:

  private static EnhancedWatermarkFilter createWatermarkFilter() throws BitmovinException {
    EnhancedWatermarkFilter watermarkFilter = new EnhancedWatermarkFilter();
    watermarkFilter.setImage(configProvider.getWatermarkImagePath());
    watermarkFilter.setUnit(PositionUnit.PERCENTS);
    watermarkFilter.setHeight(100.00);
    watermarkFilter.setWidth(100.00);
    watermarkFilter.setLeft(0.00);
    watermarkFilter.setTop(0.00);

    return bitmovinApi.encoding.filters.enhancedWatermark.create(watermarkFilter);
  }
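Because the filter uses PositionUnit.PERCENTS, the width, height, left, and top values are interpreted relative to the video frame rather than in pixels. As a rough illustration of that mapping (my own helper, not SDK code), 100% width on a 1920x1080 output covers the full 1920 pixels:

```java
// Illustration of PositionUnit.PERCENTS: a percentage is resolved against the
// corresponding frame dimension. 100% width/height with 0% offsets therefore
// covers the entire frame, which is what makes the image full-screen.
public class PercentUnits {
  static int percentToPixels(double percent, int frameDimension) {
    return (int) Math.round(percent / 100.0 * frameDimension);
  }
}
```

Using percentages also means the same filter works unchanged across every rendition in the bitrate ladder, regardless of resolution.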

Finally, in this example, we create an MP4 file for each video configuration:

for (Stream videoStream : videoStreams) {
  createMp4Muxing(
      encoding,
      output,
      videoStream.getCodecConfigId(),
      Arrays.asList(videoStream, audioStream),
      "video_h264.mp4");
}

And at the end, we execute the encoding:

executeEncoding(encoding);

Additional: Script to extract the duration of a media file

  // Function to check OS family
  private static boolean isWindows() {
    String osName = System.getProperty("os.name");
    return osName.startsWith("Windows");
  }

  // Function to extract the duration of a mediafile using ffprobe
  private static Double getDuration(String mediaFileUrl) throws IOException {
    String[] command;
    Double duration = 0.0;

    // Setting the correct command based on platform
    if (isWindows()) {
      command =
          new String[] {
            "cmd",
            "/c",
            "ffprobe -show_entries format=duration -v quiet -of csv=\"p=0\" -i " + mediaFileUrl
          };
    } else {
      command =
          new String[] {
            "/bin/bash",
            "-c",
            "ffprobe -show_entries format=duration -v quiet -of csv=\"p=0\" -i " + mediaFileUrl
          };
    }

    // Executing the command
    Process p = Runtime.getRuntime().exec(command);

    // Buffers to hold output
    BufferedReader stdInput = new BufferedReader(new InputStreamReader(p.getInputStream()));
    BufferedReader stdError = new BufferedReader(new InputStreamReader(p.getErrorStream()));

    // Read the output from the command
    String s = null;
    while ((s = stdInput.readLine()) != null) {
      if (s.trim().length() > 0) {
        duration = Double.parseDouble(s.trim());
      }
    }

    // Read any errors from the attempted command
    while ((s = stdError.readLine()) != null) {
      logger.error(s);
    }

    return duration;
  }
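As an alternative to going through cmd/bash, ProcessBuilder lets you pass each argument separately, which sidesteps shell quoting (and shell injection, should the media URL ever contain spaces or metacharacters) and makes the OS check unnecessary. A sketch along those lines — the same ffprobe arguments as above, just without the shell wrapper:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

// Sketch: run a command that prints a single number and parse its output.
// getDuration() builds the same ffprobe invocation as in the guide above.
public class FfprobeDuration {
  static double runAndParseNumber(String... command) throws IOException, InterruptedException {
    // No shell involved: each argument is passed to the process as-is.
    Process p = new ProcessBuilder(command).redirectErrorStream(true).start();
    StringBuilder out = new StringBuilder();
    try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
      String line;
      while ((line = r.readLine()) != null) {
        out.append(line.trim());
      }
    }
    p.waitFor();
    return Double.parseDouble(out.toString());
  }

  static double getDuration(String mediaFileUrl) throws IOException, InterruptedException {
    return runAndParseNumber(
        "ffprobe", "-show_entries", "format=duration",
        "-v", "quiet", "-of", "csv=p=0", "-i", mediaFileUrl);
  }
}
```

Note that this version throws on unparseable output rather than silently returning 0.0, which makes failures easier to spot.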