http://www.acticom.de/videometer.html

Patrick Seeling, Frank Fitzek, and Martin Reisslein

Summary:
(see also IEEE Network Jan 2003)
The VideoMeter is a software tool for the comparative evaluation of the
quality of raw video data in the YUV format. The tool reports the PSNR
quality differences between two or three YUV video sequences and also
includes a player for YUV video streams. In a typical application
scenario, the tool is used to assess the quality of videos that have
been encoded with a lossy compression scheme and transported over a
lossy network. The tool can simultaneously play (1) the original YUV
video sequence, (2) the encoded (and subsequently decoded) video
sequence, and (3) the video sequence obtained after encoding, network
transport, and subsequent decoding. It reports the quality differences
in PSNR (in dB) between the original video sequence and the video
sequences obtained after encoding and network transport. The VideoMeter
also displays the difference pictures for the Y (luminance) component
to visualize the errors. The quality differences are provided both for
the current frame and as an average over the past 20 frames.
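
As an illustration, the per-frame luminance PSNR and difference picture
described above can be sketched as follows. This is a minimal sketch in
Python with NumPy; it is not the tool's actual implementation, and the
helper names are purely illustrative.

    import numpy as np

    def y_psnr(orig, degraded):
        # PSNR (in dB) between two 8-bit luminance (Y) planes of equal size.
        mse = np.mean((orig.astype(np.float64) -
                       degraded.astype(np.float64)) ** 2)
        if mse == 0.0:
            return float("inf")  # identical frames
        return 10.0 * np.log10(255.0 ** 2 / mse)

    def y_difference_picture(orig, degraded):
        # Absolute per-pixel Y difference, the basis of a difference
        # picture that visualizes where the errors occur.
        return np.abs(orig.astype(np.int16) -
                      degraded.astype(np.int16)).astype(np.uint8)

The running average the tool reports would then simply be the mean of
the per-frame PSNR values over the past 20 frames.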

The VideoMeter takes YUV video in the 4:2:0 format as input. Both the
CIF (352×288 pixels) and the QCIF (176×144 pixels) video frame formats
are supported. Synchronization in the case of frame drops (whether in a
rate-adaptive encoder or in the network) is achieved by supplying the
VideoMeter with a freeze file. The freeze file lists the frames that
are to be held for comparison because the following frame(s) have been
dropped. The VideoMeter has been tested on Linux (SuSE 7.x and 8,
Mandrake 8/9, and Red Hat 7.3) and requires an X11 terminal with a 16,
24, or 32 bit display depth.
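
For concreteness, reading raw 4:2:0 frames and a freeze file might be
sketched as follows (a Python sketch, not the tool's code; the
frame-size arithmetic follows from the 4:2:0 format, while the
one-frame-number-per-line freeze-file format is an assumption made for
illustration; see the technical report below for the actual format):

    import numpy as np

    W, H = 176, 144               # QCIF; use 352, 288 for CIF
    FRAME_BYTES = W * H * 3 // 2  # 4:2:0: Y plane plus quarter-size U and V

    def read_y_planes(path):
        # Yield the Y (luminance) plane of each frame in a raw YUV file.
        with open(path, "rb") as f:
            while True:
                raw = f.read(FRAME_BYTES)
                if len(raw) < FRAME_BYTES:
                    break
                yield np.frombuffer(raw, dtype=np.uint8)[:W * H].reshape(H, W)

    def read_freeze_frames(path):
        # Frame numbers to hold during comparison (assumed format:
        # one frame number per line).
        with open(path) as f:
            return {int(line) for line in f if line.strip()}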

Full Description:
Patrick Seeling, Frank Fitzek, and Martin Reisslein.
VideoMeter tool for YUV bitstreams. Technical Report acticom-02-001,
Telecommunications Research Center, ASU, and acticom GmbH,
Tempe, AZ and Berlin, Germany, October 2002.

Download VideoMeter:  videometer-bin.tar.gz

Acknowledgment: Supported in part by the National
Science Foundation under CAREER Grant No. ANI-0133252 and
Grant No. ANI-0136774, as well as by the State of Arizona through
the IT301 initiative.
Any opinions, findings, and conclusions or recommendations
expressed in this material are those of the authors and do not
necessarily reflect the views of the National Science Foundation.