Digital particle image velocimetry (DPIV) is a non-intrusive analysis technique that is very popular for mapping flows quantitatively. To obtain accurate results, particularly in complex flow fields, a number of challenges must be addressed: the quality of the flow measurements is affected by computational details such as image pre-conditioning, sub-pixel peak estimators, data validation procedures, interpolation algorithms and smoothing methods. The accuracy of several algorithms was determined, and the best-performing methods were implemented in a user-friendly, GUI-based open-source tool for performing DPIV flow analysis in MATLAB.

Digital particle image velocimetry (DPIV) is a common technique for non-intrusive, quantitative and qualitative flow visualization. A number of research articles deal with the implementation and optimization of the DPIV technique [

In DPIV, the motion of a fluid (either gaseous or liquid) is visualized by illuminating a thin sheet of fluid containing reflective and neutrally buoyant tracer particles. A digital image sensor is positioned parallel to the illuminated sheet, capturing the movement of the particles (see Figure ). Image pairs are recorded at times t_0 and t_0 + Δt.

Principle of DPIV: A laser sheet illuminates the particles contained in the fluid. A high-speed camera records the displacement of the particle pattern.

PIVlab is programmed in MATLAB and additionally requires the Image Processing Toolbox to run. The main file is ‘PIVlab_GUI.m’. It contains all GUI-related program parts and most of the functions that are accessible from the GUI. The most important functions for performing a PIV analysis (pre-processing: ‘PIVlab_preproc.m’; cross-correlation: ‘piv_FFTmulti.m’ and ‘piv_DCC.m’) are also accessible from the command line (see the example in ‘PIVlab_commandline.m’) when the user wishes to automate the whole process or to include it in another application.

A DPIV analysis typically consists of three main steps (image pre-processing, image evaluation, post-processing, see Figure

DPIV analyses in PIVlab. Overview of the workflow and the implemented features that are presented in the next sections.

The following section will give an overview of the relevant features and techniques that are accessible in PIVlab:

One common approach to improve the measurement quality is the enhancement of images before the actual image correlation takes place [

The effect of several pre-processing techniques, see text for a description.

Contrast limited adaptive histogram equalization (CLAHE) was developed to increase the readability of image data in medical imaging [

Inhomogeneous lighting can cause low frequency background information which can be removed by applying a high-pass filter that mostly conserves the high frequency information from the particle illumination [

The DPIV method assumes that all particles within an interrogation window have the same motion. This will not be the case in reality, as perfectly uniform flow hardly exists. Bright particles or bright spots within the area will contribute statistically more to the correlation signal, which may bias the result in non-uniform flows [
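The intensity capping idea can be sketched as follows. PIVlab itself is implemented in MATLAB; this is a minimal Python/NumPy illustration, and the function name and the default cap level of median + 2·std are illustrative assumptions, not necessarily PIVlab's settings:

```python
import numpy as np

def intensity_capping(img, n_std=2.0):
    """Limit extreme pixel intensities (illustrative sketch).
    Pixels brighter than median + n_std * std are capped so that
    individual bright particles cannot dominate the correlation."""
    img = img.astype(float)
    limit = np.median(img) + n_std * np.std(img)
    return np.minimum(img, limit)
```

In contrast to CLAHE or high-pass filtering, this approach does not alter the shape of the particle images; it only limits extreme intensities.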

The most sensitive part of a DPIV analysis is the cross-correlation algorithm: small sub-images (interrogation areas) of an image pair are cross-correlated to derive the most probable particle displacement in the interrogation areas. In essence, the cross-correlation is a statistical pattern matching technique that tries to find the particle pattern from interrogation area A back in interrogation area B. This statistical technique is implemented with the discrete cross correlation function [

where A and B are corresponding interrogation areas from image A and image B.

The location of the intensity peak in the resulting correlation matrix

There are two common approaches to solving equation 1: The most straightforward is to compute the correlation matrix in the spatial domain (see Figure

Calculation of the correlation matrix using DCC as it is performed in MATLAB. Interrogation area A (size 4·4 pixels) is correlated with interrogation area B (size 8·8 pixels) and yields the correlation matrix (size 9·9 pixels). Adapted from [
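The nested-sum form of the DCC can be sketched in Python/NumPy (PIVlab's own implementation is in MATLAB; the function names here are illustrative):

```python
import numpy as np

def dcc(A, B):
    """Spatial-domain direct cross-correlation (minimal sketch).
    A and B are 2-D interrogation areas; B may be larger than A.
    C[m, n] sums A[i, j] * B[i + m, j + n] over all pixels of A."""
    A = A.astype(float)
    B = B.astype(float)
    out_r = B.shape[0] - A.shape[0] + 1
    out_c = B.shape[1] - A.shape[1] + 1
    C = np.empty((out_r, out_c))
    for m in range(out_r):
        for n in range(out_c):
            C[m, n] = np.sum(A * B[m:m + A.shape[0], n:n + A.shape[1]])
    return C

def dcc_displacement(A, B):
    """Most probable integer displacement: correlation peak location
    relative to the zero-shift position of A centred in B."""
    C = dcc(A, B)
    m, n = np.unravel_index(np.argmax(C), C.shape)
    center = ((B.shape[0] - A.shape[0]) // 2,
              (B.shape[1] - A.shape[1]) // 2)
    return m - center[0], n - center[1]
```

Because B is larger than A, displacements up to half the size difference are measured without any loss of particle pattern, which is why the DCC noise floor stays flat over that range.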

Another approach is to compute the correlation matrix in the frequency domain (discrete Fourier transform, DFT). The DFT is calculated using a fast Fourier transform [
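The DFT approach can be sketched like this in Python/NumPy (an illustration of the principle, not PIVlab's MATLAB code). Both interrogation areas must have the same size, and the correlation is circular, which is the reason for the wrap-around ambiguity of displacements larger than half the window size:

```python
import numpy as np

def dft_correlation(A, B):
    """Circular cross-correlation via the FFT (minimal sketch).
    A and B must have identical sizes; means are removed to
    suppress the background offset in the correlation plane."""
    A = A - A.mean()
    B = B - B.mean()
    C = np.real(np.fft.ifft2(np.conj(np.fft.fft2(A)) * np.fft.fft2(B)))
    return np.fft.fftshift(C)  # zero displacement at the matrix centre

def dft_displacement(A, B):
    """Integer displacement from the correlation peak location."""
    C = dft_correlation(A, B)
    peak = np.unravel_index(np.argmax(C), C.shape)
    return (peak[0] - C.shape[0] // 2, peak[1] - C.shape[1] // 2)
```

For an n·n interrogation area the FFT reduces the cost from O(n^4) for the direct sum to O(n^2 log n), which is what makes multi-pass evaluations affordable.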

The direct cross correlation computes the correlation matrix in the spatial domain. In DCC, the interrogation areas A and B can have two different sizes [

Correlation matrices of the DCC (top) and the DFT approach (bottom), interrogation area A is 64·64 pixels for both DCC and DFT. Area B is 128·128 pixels in DCC and 64·64 pixels in DFT. In DCC, the background noise does not increase up to a displacement of 32 pixels. In DFT, background noise immediately increases if the displacement is larger than 0 pixels. A displacement of more than 32 pixels will flip the correlation peak to the opposite side of the correlation matrix, and thus makes correct measurements impossible.

The potential drawback of DCC – the computational cost – can be resolved by calculating the correlation matrix in the frequency domain [

This disadvantage can be offset by running several passes of the DFT on the same dataset [

In real flows, the particle patterns will additionally be sheared and rotated; the non-uniform particle motion will broaden the intensity peak in the correlation matrix and deteriorate the result. Several methods that account for the transformation of the interrogation areas have been proposed [

The choice of the peak finding technique is – similar to the choice of the cross correlation technique – another important factor for the accuracy of DPIV. The integer displacement of two interrogation areas can be determined directly from the location of the intensity peak of the correlation matrix. This location can be refined with sub-pixel precision using a range of methods [

Principle of the Gaussian 2·3-point fit: Sub-pixel precision is achieved by fitting a one-dimensional Gaussian function (solid line) to the integer intensity distribution of the correlation matrix (dots) for both axes independently (only one axis is shown here).
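For a Gaussian-shaped peak, the three-point fit has a closed-form solution per axis; a Python sketch of the one-dimensional fit (applied independently to each axis in practice) is:

```python
import numpy as np

def gauss_subpixel(c_minus, c_zero, c_plus):
    """One-dimensional three-point Gaussian peak fit (sketch).
    c_minus, c_zero, c_plus are the correlation values at the integer
    peak and its two neighbours along one axis; the returned sub-pixel
    offset lies in [-0.5, 0.5]. The logarithm turns the Gaussian into
    a parabola, so the fit is exact for a truly Gaussian peak."""
    lm, l0, lp = np.log(c_minus), np.log(c_zero), np.log(c_plus)
    return (lm - lp) / (2.0 * lm - 4.0 * l0 + 2.0 * lp)
```

The logarithm requires strictly positive correlation values around the peak, which is one reason why correlation planes are often offset or clipped before the fit.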

If the particle displacement within an interrogation area is exposed to shear or rotation or if the images suffer from excessive motion blur, the displacement peak may have an elliptical shape [

Post processing of DPIV data is generally required to obtain reliable results [. A basic validation method is to reject vectors that fall outside of velocity thresholds, which can be derived semi-automatically from the data:

threshold_lower = mean(u) − n·σ_u        threshold_upper = mean(u) + n·σ_u

where σ_u is the standard deviation of u (analogous thresholds apply to v). The user defined value of n determines the strictness of this filter.
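The standard deviation based validation can be sketched in Python/NumPy (an illustration; the default n = 3 is an assumption, not PIVlab's setting):

```python
import numpy as np

def stddev_filter(u, n=3.0):
    """Mark velocities outside mean(u) +/- n * std(u) as invalid (NaN).
    The strictness of the filter is controlled by the user-defined n."""
    u = u.astype(float)
    lower = np.nanmean(u) - n * np.nanstd(u)
    upper = np.nanmean(u) + n * np.nanstd(u)
    out = u.copy()
    out[(u < lower) | (u > upper)] = np.nan
    return out
```

The filter is applied to each velocity component separately; flagged vectors are later replaced by interpolated data.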

A more universal outlier detection method that automatically adapts to local flow situations is the normalized median test [
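A sketch of the normalized median test, following the formulation commonly published in the literature (the threshold of 2 and the noise level eps = 0.1 are the values usually suggested there; this Python version is illustrative, not PIVlab's MATLAB code):

```python
import numpy as np

def normalized_median_test(u, threshold=2.0, eps=0.1):
    """Flag outliers by comparing each vector with the median of its
    neighbours (sketch). The residual of a vector to the neighbourhood
    median is normalised by the median residual of the neighbours;
    eps accounts for the remaining measurement noise."""
    rows, cols = u.shape
    flagged = np.zeros_like(u, dtype=bool)
    for i in range(rows):
        for j in range(cols):
            neigh = np.array([u[ii, jj]
                              for ii in range(max(0, i - 1), min(rows, i + 2))
                              for jj in range(max(0, j - 1), min(cols, j + 2))
                              if (ii, jj) != (i, j)])
            med = np.median(neigh)
            resid = np.median(np.abs(neigh - med))
            if np.abs(u[i, j] - med) / (resid + eps) > threshold:
                flagged[i, j] = True
    return flagged
```

Because the residual is normalised locally, the same threshold works in slow, uniform regions and in fast, strongly sheared regions alike, which is what makes the test "universal".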

After the removal of outliers, missing vectors should be replaced by interpolated data [
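The boundary value approach to gap filling can be illustrated with a simple Jacobi iteration that solves Laplace's equation over the missing region (a sketch of the principle; PIVlab itself relies on the inpaint_nans routine by John D'Errico, which is more general):

```python
import numpy as np

def fill_laplace(u, tol=1e-6, max_iter=10000):
    """Fill NaN gaps by iteratively relaxing each missing value towards
    the mean of its four neighbours (Jacobi iterations), i.e. by solving
    Laplace's equation with the valid vectors as boundary values."""
    u = u.astype(float).copy()
    mask = np.isnan(u)
    u[mask] = np.nanmean(u)  # initial guess for the missing values
    for _ in range(max_iter):
        padded = np.pad(u, 1, mode='edge')
        smoothed = 0.25 * (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                           padded[1:-1, :-2] + padded[1:-1, 2:])
        change = np.max(np.abs(smoothed[mask] - u[mask])) if mask.any() else 0.0
        u[mask] = smoothed[mask]  # only missing values are updated
        if change < tol:
            break
    return u
```

Valid vectors are never modified; only the gaps are relaxed, so the interpolant passes exactly through the measured data.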

A certain amount of measurement noise will be inevitable in DPIV analyses [

Many DPIV studies reveal very complex flow patterns. Such complexity is hard to describe with vector maps alone. The strength of PIVlab is that it offers a large number of possibilities to further process and distil the results: derivatives such as vorticity and divergence can be calculated, data can be extracted along paths or from areas, and integral quantities can be calculated conveniently.
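As an illustration of the derived quantities, vorticity and divergence can be computed from the velocity field with central differences; a Python/NumPy sketch (grid spacings dx, dy are assumed uniform):

```python
import numpy as np

def vorticity_divergence(u, v, dx=1.0, dy=1.0):
    """Out-of-plane vorticity (dv/dx - du/dy) and in-plane divergence
    (du/dx + dv/dy) from a 2-D velocity field, using central
    differences in the interior and one-sided differences at edges."""
    du_dy, du_dx = np.gradient(u, dy, dx)  # axis 0 = y, axis 1 = x
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    return dv_dx - du_dy, du_dx + dv_dy
```

For incompressible planar flow the divergence should vanish, so the divergence map doubles as a quick quality check of the measurement.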

Functional testing has been performed on Windows XP and Windows 7 with MATLAB releases R2010a, R2011a and R2013b. Although the authors do not have access to additional operating systems and MATLAB versions, reports from many users indicate that PIVlab works flawlessly on Mac OS X and UNIX/Linux too. Bugs that were discovered in some of the last ten PIVlab releases were corrected before the next release. The user acceptance of PIVlab has been monitored and optimized while the software was used by more than 150 supervised students with several different operating systems and MATLAB releases.

Furthermore, extensive tests on the quality of the results obtained with PIVlab were performed using more than 6·10^{4} synthetic particle images. The image properties were modified in a way that allowed the accuracy to be calculated for all important PIV image parameters according to [
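The principle of such accuracy tests can be sketched by rendering particle images with fully known properties: Gaussian particle images at random positions, displaced by a prescribed shift. All parameter names and defaults below are illustrative, not the settings used in the PIVlab evaluation:

```python
import numpy as np

def synthetic_particle_image(size=64, n_particles=80, diameter=3.0,
                             shift=(0.0, 0.0), seed=0):
    """Render a synthetic particle image (sketch): Gaussian spots at
    random positions, optionally displaced by a known shift so that
    the true displacement is available for accuracy evaluation."""
    rng = np.random.default_rng(seed)
    ys = rng.uniform(0, size, n_particles)
    xs = rng.uniform(0, size, n_particles)
    yy, xx = np.mgrid[0:size, 0:size].astype(float)
    img = np.zeros((size, size))
    sigma = diameter / 4.0  # particle image width parameter
    for y0, x0 in zip(ys + shift[0], xs + shift[1]):
        img += np.exp(-((yy - y0) ** 2 + (xx - x0) ** 2) / (2 * sigma ** 2))
    return img
```

Generating an image pair with the same seed but different shifts yields a test case where the measured displacement can be compared directly against the ground truth.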

The quality of the DPIV measurements in PIVlab was extensively evaluated using synthetic particle images with known properties. The effect of particle image diameter, particle density, sensor noise, particle pair loss, motion blur and shear was determined and is reported in detail elsewhere [

The performance of the most popular interpolation techniques and of the boundary value solver was tested: a number of real and synthetic image pairs were analyzed using standard DPIV (see Figure

Procedure for testing several interpolation techniques. Left: Original velocity data. Middle: Data is removed at random positions. Right: Gaps are filled with interpolation and compared to the original velocity data.

Performance of popular interpolators. The boundary value solver performs best under the presence of larger amounts (> 10%) of missing data.

The performance of the smoothing algorithms is tested using DPIV data of synthetic particle images (more detail is given in [

Validation of smoothing algorithms. Left: Maximum absolute difference between the calculated velocities and the true velocities in percent of the maximum true velocity. Right: Mean absolute difference.

Support is available through the website of PIVlab (

Based on MATLAB 7.10.0 (R2010a): Windows, UNIX/Linux, Macintosh

MATLAB 7.10.0 (R2010a), upward compatible

MATLAB 7.10.0 (R2010a): 1 GB disk space, 1 GB RAM

MATLAB's Image Processing Toolbox is required

WT: Programming of the tool and the graphical user interface (GUI); selection, implementation and evaluation of the algorithms; writing this article; support and maintenance of the tool.

EJS: Support with the layout of both the tool and the GUI; feedback, tips and discussions on the functionality.

CC-BY

William Thielicke

03/07/2014

English

The analysis of flow velocities is an integral part of many research disciplines. Here, DPIV has become a very popular method. The development of user-friendly, accurate and free DPIV tools greatly enhances the practicability and availability of this technique. In many cases, it can even be sufficient to use low-cost digital cameras and lasers to study complex flows with PIVlab (e. g. [

PIVlab has proved its value in a number of research projects. The tool has been used for ’conventional’ measurements of fluid velocities, but also in a very different context, for example flow visualizations within cells, echocardiographic velocimetry of the human heart and deformation of sand and gravel [

PIVlab can be extended with custom functionalities and features using MATLAB's GUI editor and many of MATLAB's pre-built functions. It is, for example, the basis of ’PTVlab’, an independent software tool that was designed to track particles. Individual functions of PIVlab (e. g. the image correlation code) can be used for custom projects.

PIVlab is highly integrated with MATLAB and benefits from MATLAB's extensive plotting and data handling features. But data can also be exported to generic ASCII files, which can be processed e. g. with Excel (Microsoft, Redmond, WA). As PIVlab can also export to binary VTK files, Paraview (Kitware, Inc., Clifton Park, NY) is another good option for visualizing and exploring the flow data.

See

inpaint_nans by John D’Errico,

The accuracy tests are part of the PIVlab documentation and they are made available on the website of our tool (