Monday, October 21, 2013

FFMPEG - Creating a timelapse from JPEG images with FFmpeg

If you have a series of JPEGs taken over a period of time, you can use the following command to create a timelapse video.

ffmpeg -f image2 -r 1 -i your_image%02d.jpg -r 15 -s hd1080 -vcodec libx264 your_output.mp4

Let me walk through this command.

In general, any options placed before -i apply to the input file. Here, we have -f image2 -r 1 -i your_image%02d.jpg

-f forces the input format to image2, FFmpeg's image-sequence demuxer, which reads a series of image files (JPEG, PNG, etc.) as video frames

-r sets the input frame rate. This is important, and omitting it is a common mistake: if you do not put -r before -i, it defaults to 25. If your ffmpeg output reports many duplicated or dropped frames, you need to set an input frame rate that matches the output frame rate.

-i sets the input file name pattern. %02d means ffmpeg will look for two-digit, zero-padded sequence numbers, i.e. your_image00.jpg through your_image99.jpg.
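If your source images are not already numbered in this form, a small rename loop can produce the expected sequence. This is a minimal sketch; the original file names are hypothetical, and it assumes the originals sort into the desired chronological order by name:

```shell
# Rename the JPEGs in the current directory into a zero-padded sequence
# (your_image00.jpg, your_image01.jpg, ...) matching the %02d pattern.
i=0
for f in *.jpg; do
  new=$(printf 'your_image%02d.jpg' "$i")
  mv -- "$f" "$new"
  i=$((i + 1))
done
```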

Next, any options after -i apply to the output file. Here, we have -r 15 -s hd1080 -vcodec libx264 your_output.mp4

-r sets the output frame rate.

-s sets the output frame size; hd1080 is an abbreviation for 1920x1080.

-vcodec selects the codec used to encode the output file. libx264 is the standard H.264 encoder in FFmpeg.

your_output.mp4 is the output file name.

Monday, October 7, 2013

Linux - Back To Basic Part 1 - Shell Command-line Processing, Quoting and Example

Command-line Processing

Be it bash or ksh, when you enter a command at the command line, the shell processes your input before running the command.

Below is the flowchart for ksh command-line processing (Figure 7.1); see the linked reference for more details.


It is always good to double-quote ("...") or single-quote ('...') your command parameters. Without quoting, you may get unexpected results.

Quoting tells the shell to bypass certain steps in the above flowchart during command-line processing. Typically, we want the shell to leave pipe characters, aliases, tilde substitution and wildcard expansion alone in command parameters.
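A quick way to see the effect (the file names here are purely illustrative):

```shell
# Demonstrate wildcard expansion vs. quoting in a scratch directory.
cd "$(mktemp -d)"
touch file_a file_b

echo file_*      # the shell expands the glob first: prints "file_a file_b"
echo "file_*"    # quoting bypasses wildcard expansion: prints "file_*"
```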

See section 7.3 of the linked reference on command-line processing for more details.


The effects of command-line processing and quoting can easily be seen with grep.

Grep expects the following syntax
    grep PATTERN file

where PATTERN must obey these rules:

  • No spaces if quotes are not used
  • If spaces are required in PATTERN, you must quote the PATTERN string

Now, there are 3 general scenarios when resolving the following command

ls /your/current/dir | grep filename_*

where filename_* is used as the PATTERN without quotes. (The shell expands an unquoted wildcard against its own current working directory, so assume the shell is running inside /your/current/dir.)

1. No file with "filename_" in /your/current/dir

With no match for the wildcard, the shell leaves the pattern untouched, so the command remains

ls /your/current/dir | grep filename_*

after command-line processing, and it returns no result. The result is correct.

2. There is a single file called "filename_a" in /your/current/dir

The wildcard expands to its single match, so the command resolves to

ls /your/current/dir | grep filename_a

during command-line processing, and it returns a single result, filename_a. The result is correct.

3. There are N files, e.g. "filename_a, filename_b, filename_c", in /your/current/dir

The wildcard expands to every match, so the command resolves to

ls /your/current/dir | grep filename_a filename_b filename_c

during command-line processing, and it returns no result. This is WRONG, because we expect grep to return the N file names prefixed with filename_.

If you compare scenarios 2 and 3, both use exactly the same command

        ls /your/current/dir | grep filename_*

yet scenario 3 produces the wrong result. This unexpected error is caused by the shell's wildcard expansion during command-line processing: grep receives filename_a as the PATTERN and searches the files filename_b and filename_c as input, ignoring the piped input from ls entirely.

Thus, for the command to work correctly in all cases, it should be written as

       ls /your/current/dir | grep "filename_*"

Quoting prevents wildcard expansion, so the command remains

      ls /your/current/dir | grep "filename_*"

after command-line processing in all 3 scenarios, and each returns the correct result. (Note that grep treats * as a regex repeat of the preceding character, so the pattern "filename_*" matches "filename" followed by zero or more underscores, which still matches each of the file names above.)
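The unquoted and quoted behaviour can be reproduced in a scratch directory; the directory and file names below are illustrative:

```shell
# Reproduce scenario 3 and the quoted fix.
cd "$(mktemp -d)"
touch filename_a filename_b filename_c

# Unquoted: the glob expands, grep takes filename_a as PATTERN and reads
# the (empty) files filename_b and filename_c, ignoring the piped input.
ls | grep filename_*       # prints nothing

# Quoted: the pattern reaches grep intact and matches all three lines.
ls | grep "filename_*"     # prints filename_a, filename_b, filename_c
```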
