FFmpeg: FAPA (Frame-Averaged Pixel Array)


Preamble: When I create a blog post about a film, I will often include a cryptic-looking pixelated image somewhere in the body of the post. When possible, I will create one of these images for every film I watch. I create them as a type of 'fingerprint', showing the overall tonality and temporal dynamics of the film's visuals.

The image contains every frame of a given film. Each pixel represents the average colour of its particular frame. This colour is calculated by nothing more than scaling the frame down to dimensions of '1x1' with an FFmpeg 'scale' filter. The frames [pixels] are then tiled into a single image of suitable dimensions.
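In essence, the whole transformation is just the 'scale' filter followed by the 'tile' filter. A minimal sketch of the idea (the filename and tile size here are only placeholders; the script below derives the real tile size from the video's duration and frame rate):

ffmpeg -i "film.mkv" -frames:v 1 -vf "scale=1:1,tile=640x360" "film_pixarray.png"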

The example video is taken from 'Summer in February (2013)' and shows a scene involving tropospheric lightning near the end of the film. The section of the 'pixel array' image relating to this scene has been highlighted and magnified. The contrast in lighting between frames means each frame can be clearly discerned as the video plays, even without the aid of the arrow.



The Bash script outputs basic information before and while processing, and the process can take a considerable length of time to finish. The version here uses two instances of FFmpeg to process the video, so that progress feedback is displayed during execution: the first instance reduces each frame to a single pixel, the second tiles the pixels into the final image. A simpler single-instance alternative is included in the 'Notes' section of the script, as well as an idea for showing progress while using this version. The script has not been updated since its initial creation and can probably be improved upon.
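For reference, an invocation looks something like this (the filename is only an example; the optional second argument sets the output width and defaults to 640):

bash video2pixarray.sh "film.mkv" 640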

#!/bin/bash
################################################################################
# Create a 'Frame-Averaged Pixel Array' of a given video. Works by reducing
# each frame to a single pixel, and appending all frames into a single image.
# - Takes: $1=Filename [$2=width]
# - Requires: ffmpeg + ffprobe
#   ver. 1.1 - 10th November, 2015
# source: https://oioiiooixiii.blogspot.com
###############################################################################

width="${2:-640}" # If no width given, set as 640
duration="$(ffprobe "$1" 2>&1 \
            | grep Duration \
            | awk  '{ print $2 }')"
seconds="$(echo $duration \
           | awk -F: '{ print ($1 * 3600) + ($2 * 60) + $3 }' \
           | cut -d '.' -f 1)"
fps="$(ffprobe "$1" 2>&1 \
       | sed -n 's/.*, \(.*\) fps,.*/\1/p' \
       | awk '{printf("%d\n",$1 + 0.5)}')"
frames="$(( seconds*fps ))"
height="$(( frames/width ))"
filters="tile=${width}x${height}"

clear
printf "$(pwd)/$1
___Duration: ${duration::-1}
____Seconds: $seconds
________FPS: $fps
_____Frames: $frames
_____Height: $height
____Filters: $filters\n"

# The first FFmpeg instance reduces each frame to a single pixel; the second tiles them.
ffmpeg \
   -y \
   -i "$1" \
   -vf "scale=1:1" \
   -c:v png \
   -f image2pipe pipe:1 \
   -loglevel quiet \
   -stats \
| ffmpeg \
    -y \
    -i pipe:0 \
    -vf "$filters" \
    -loglevel quiet \
    "${1%.*}_$width".png

################################ NOTES #######################################

# Single-line solution, but doesn't show progress:
# filters="scale=1:1,tile=${width}x${height}" # Filter-chain used with the single-line version
# ffmpeg -i "$1" -frames 1 -vf "$filters" "${1%.*}".png -y
# View ingest progress instead by piping the source file through 'pv' into ffmpeg.
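# e.g. (untested sketch; reading from stdin may not suit every container format):
#   pv "$1" | ffmpeg -i pipe:0 -frames 1 -vf "scale=1:1,tile=${width}x${height}" "${1%.*}".png -y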
download: video2pixarray.sh

[Note: I have struggled with giving a name to this process since I created the script, and have left it as the first thing I thought of. Perhaps others who have created something similar have better names for it.]

film review: https://oioiiooixiii.blogspot.com/2017/11/summer-in-february-2013.html