(1) Overview

Introduction

In rats and mice, the facial whiskers (vibrissae) are repetitively and rapidly swept back and forth to obtain information from the environment that can help guide many activities, such as locomotion and object exploration [, ]. This movement, termed ‘whisking’, generally occurs in bouts of variable duration and at fast rates of 3 – 25 Hz [, ]. Movements of the whiskers are closely coordinated with those of the head and body, allowing the animal to locate interesting stimuli through whisker contact and investigate them further using both the macrovibrissae (the large, moveable whiskers) and an array of shorter, non-actuated microvibrissae on the chin and lips [, ]. The whisker system is of great interest to neuroscientists, as each individual whisker is represented by a discrete corresponding structure in the cortex, in a topographic map []. In addition, the fast movements are controlled by a complex muscle architecture and modulated by multiple neural structures []. Analysing whisker movements in animals is becoming an increasingly popular research area within the fields of animal behaviour, physiology and healthcare. Indeed, recent research has examined the relationship between whisker movements, muscles and associated brain areas, finding that whisker movements correlated well with muscle atrophy and motor neuron decline in a mouse model of neurodegeneration [].

Analysing whisker movements and locomotion offers a way to gather quantitative behavioural data from rodents, which remain a key species in generating translational data for healthcare research. Quantitative measurements of whisker and locomotion behaviours are integral to the non-invasive monitoring of rodent health and welfare: they reduce stress and allow the same animal to be studied over the time-course of a disease. Measurement of whisker movements and locomotion is carried out by analysing video footage (usually high-speed video) using automatic trackers [, , ]. However, these automatic trackers tend to be task-specific, so cannot be applied more generally, and in addition have yet to be fully validated against manual data. A manual tracker is therefore required both to validate these automatic approaches and to annotate complex videos where the automatic trackers fail. There is little freely available software to manually annotate whiskers and gather the desired data in an easy and intuitive manner. Many studies have developed their own tracking programs to address this, but these programs have not been published and are not available for download [, , ]. Freely available programs provide some good analytical tools for modelling an object’s trajectory or acceleration, but they were not designed for tracking multiple objects or handling whiskers [].

The aim of the software package, the Manual Whisker Annotator (MWA), is to increase annotation speed compared to existing, freely available software. We do this by i) creating an easy-to-use and intuitive interface; ii) allowing easy extraction of all raw data in an intuitive format; iii) creating an analytical suite to analyse the raw data; and iv) packaging the software in a format that is easy to install, with as little effort required from the user as possible (i.e. no additional assemblies required).

Implementation and Architecture

MWA was written in C# using WPF and the .NET Framework, which allow high-performance, robust and scalable solutions for Windows []. It was developed using the Model-View-ViewModel (MVVM) design pattern, and as such the code can be split into three main sections: the View, the Model and the Repository. The View folder contains all the windows; converters, behaviours, commands, controls and other classes created specifically for the windows can be found in their respective folders. The Model layer contains all the objects that represent the real-world content and can be found in the Model folder, with the interfaces for these objects in the ModelInterface folder. The Model and View are linked by a ViewModel, and these classes can be found in the ViewModel folder. Finally, the Repository handles the loading and saving of data, which is in XML format for this project; the files for these can be found in the Repository folder. Using this pattern, additional modules can easily be added, such as further analytical functions or a trackable object other than a whisker.
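
As an illustration of this structure, the following is a minimal view-model sketch in the MVVM pattern, kept compatible with C# 4.0. The class, interface and property names are hypothetical and are not taken from the MWA source; the View would bind to the Angle property, while the model object sits behind an interface as in the ModelInterface folder.

```csharp
using System.ComponentModel;

// Hypothetical model interface, standing in for the ModelInterface layer.
public interface IWhisker
{
    double Angle { get; set; }
}

// Hypothetical view-model: the View binds to Angle, and PropertyChanged
// notifications keep the UI in sync with the underlying model.
public class WhiskerViewModel : INotifyPropertyChanged
{
    private readonly IWhisker _model;

    public WhiskerViewModel(IWhisker model)
    {
        _model = model;
    }

    public double Angle
    {
        get { return _model.Angle; }
        set
        {
            if (_model.Angle == value) return;
            _model.Angle = value;            // push the change down to the model
            OnPropertyChanged("Angle");      // notify the bound View
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    protected void OnPropertyChanged(string name)
    {
        var handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(name));
    }
}
```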

All image processing functions are provided by the open source image processing library OpenCV []. Although the program was developed for Microsoft Windows, as this is the most widely used operating system by a considerable margin, with 88% of the market share [], the Model layer, the Repository and OpenCV can all be compiled under Mono [], which allows cross-platform support for Mac OS X and Linux, with only a new View and ViewModel layer being required.

Starting an Annotation

Researchers annotate videos in a number of different ways: many will only want to annotate a certain portion of the video, while others may want to annotate only every other frame. MWA aims to be a complete package and was developed to cater for the different needs a researcher may have. The first step when starting a new annotation is to define the setup of the video. The user enters the desired start and end frames, the frame interval, and the number and names of the whiskers/points they wish to track. Once this is complete, the user can begin the annotation process.

The annotation process simply involves the user clicking on the desired point. If the settings are left in their default state, once a single point is clicked, the program automatically progresses to the next point. This alone provides a significant decrease in annotation time, as the user does not have to switch between each desired point after clicking, as was done in previous annotators []. One of the problems discovered here was correctly calculating the exact co-ordinates of the user’s click when the image being displayed is not at its original size. To solve this, instead of storing the co-ordinates of the click, the ratio of the clicked point to the displayed image’s size is saved. When exported, this ratio is multiplied by the original size of the video, yielding the correct co-ordinates.
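
A minimal sketch of this resolution-independent scheme is shown below; the class and method names are illustrative rather than taken from the MWA source.

```csharp
using System.Windows;

public static class ClickScaler
{
    // Store the click as a ratio of the displayed image's size (both components
    // 0..1), so the annotation is independent of the on-screen display size.
    public static Point ToRatio(Point clicked, double displayWidth, double displayHeight)
    {
        return new Point(clicked.X / displayWidth, clicked.Y / displayHeight);
    }

    // On export, multiply the ratio by the original video resolution
    // to recover the true pixel co-ordinates.
    public static Point ToVideoCoordinates(Point ratio, double videoWidth, double videoHeight)
    {
        return new Point(ratio.X * videoWidth, ratio.Y * videoHeight);
    }
}
```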

Many datasets are of low quality, and distinguishing the whiskers clearly can be incredibly difficult. Image equalisation is therefore provided to enhance the whisker features and make them more pronounced. The image equalisation methods are provided by OpenCV [].
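
The text states only that OpenCV supplies the equalisation methods; the sketch below assumes the Emgu CV .NET wrapper for OpenCV, whose Image class exposes in-place histogram equalisation, and is illustrative rather than the MWA implementation.

```csharp
using Emgu.CV;              // Emgu CV wrapper for OpenCV -- an assumption;
using Emgu.CV.Structure;    // the text only states that OpenCV is used

public static class FrameEnhancer
{
    // Enhance a greyscale frame by histogram equalisation, spreading the
    // intensity distribution so that faint whiskers stand out more clearly.
    public static Image<Gray, byte> Equalise(Image<Gray, byte> frame)
    {
        Image<Gray, byte> enhanced = frame.Copy();
        enhanced._EqualizeHist();   // in-place histogram equalisation
        return enhanced;
    }
}
```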

Metrics

After the user has finished annotating, MWA provides several analysis methods. A number of important metrics were identified that researchers may want to analyse regarding whisking behaviour. Two distinct types of whisking behaviour have been found: exploratory and foveal []. Each type has a specific profile with regards to metrics, including amplitude, frequency and angular velocities [, ].

Whisker Curvature

Whisker curvature is used to estimate bending moments and forces in the whisker follicle, in order to make associations with neural measurements [, , , ]. Curvature of the whiskers is usually approximated using Bezier curves [, ], hence Bezier curves are also used here []. Given that the user has manually annotated the points along a whisker, using either 2, 3 or 4 points per whisker, a Bezier equation (linear, quadratic or cubic respectively) was straightforward to implement. The option to extract whisker curvature is only available when the user tracks more than two points on the curve. The user is encouraged to spread their 3+ points equally along the whisker shaft so that the program can reliably estimate whisker curvature. The program uses the user’s points, plus additional automatically-generated control points along the whisker shaft, to fit a curve specifically to the user’s points. From this curve, any co-ordinate on the line can then be found by specifying how far along the line it lies. This value is known as the T Value of the Bezier curve and lies between 0 and 1, with 0 equating to the tip of the whisker, and 1 the base of the whisker. Curvature is measured in pixels⁻¹. If the user’s manual points are not equally distributed, the program will display a curve that does not perfectly overlap the whisker; however, these instances are rare and the margin for error is relatively wide.
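
For the quadratic (3-point) case, curve evaluation and curvature follow the standard parametric formulae; the sketch below is illustrative, and its names are not taken from the MWA source.

```csharp
using System;
using System.Windows;

public static class BezierMath
{
    // Point on the quadratic Bezier at parameter t:
    // B(t) = (1-t)^2 * P0 + 2(1-t)t * P1 + t^2 * P2
    public static Point Evaluate(Point p0, Point p1, Point p2, double t)
    {
        double u = 1 - t;
        return new Point(
            u * u * p0.X + 2 * u * t * p1.X + t * t * p2.X,
            u * u * p0.Y + 2 * u * t * p1.Y + t * t * p2.Y);
    }

    // Curvature k(t) = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2), in pixels^-1.
    public static double Curvature(Point p0, Point p1, Point p2, double t)
    {
        // First derivative: B'(t) = 2(1-t)(P1 - P0) + 2t(P2 - P1)
        double dx = 2 * (1 - t) * (p1.X - p0.X) + 2 * t * (p2.X - p1.X);
        double dy = 2 * (1 - t) * (p1.Y - p0.Y) + 2 * t * (p2.Y - p1.Y);

        // Second derivative: B''(t) = 2(P2 - 2*P1 + P0), constant for a quadratic
        double ddx = 2 * (p2.X - 2 * p1.X + p0.X);
        double ddy = 2 * (p2.Y - 2 * p1.Y + p0.Y);

        return Math.Abs(dx * ddy - dy * ddx) / Math.Pow(dx * dx + dy * dy, 1.5);
    }
}
```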

Whisker Angle/Mean Offset

Possibly the most important metric to detect is the angle of the whisker, and its average position over the clip (mean offset). This is due to the number of other metrics that are calculated from the whisker angle and used in subsequent studies, including, but not limited to, angular velocity, frequency and amplitude []. Mechanoreceptors situated around the follicle-sinus complex provide input to 150 – 400 neurons, and these neurons can encode the position of the whiskers with high precision by firing at specific deflection angles []. Mean offset is used by some studies to determine how far a whisker has moved [], or to compare the resting position of whiskers in experimental models against healthy groups after contact with an object [, , ].

Several methods were identified for calculating the whisker angle [, ], and a method was developed that incorporates aspects of all of them. The main differences between them are the reference against which the angle is measured (i.e. the side of the face or the midline of the head), and where along the whisker shaft the angle is taken (i.e. at the base, or at a specified point along the shaft). To cover these cases, three options are provided for the reference from which to take the angle: the horizontal (in the image plane), the vertical (in the image plane), and the body center line (the midline of the head). A further option then controls how far along the whisker shaft the angle is taken (0 being the tip, 0.5 the middle, 1 the base), as shown in Fig. 1. Angles are measured in degrees.
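
A minimal sketch of one such measurement is shown below, taking the angle against the vertical in the image plane. Whether MWA measures the chord from the base or the local tangent at the chosen T Value is not specified in the text, so the chord used here is an assumption, and the names are illustrative.

```csharp
using System;
using System.Windows;

public static class AngleMath
{
    // Angle in degrees between the upward vertical and the chord from the
    // whisker base to the point on the fitted curve at the chosen T Value.
    public static double AngleFromVertical(Point basePoint, Point pointAtT)
    {
        double dx = pointAtT.X - basePoint.X;
        double dy = pointAtT.Y - basePoint.Y;   // image Y grows downwards

        // Atan2(dx, -dy) measures from the upward vertical, clockwise positive.
        return Math.Atan2(dx, -dy) * 180.0 / Math.PI;
    }
}
```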

Figure 1 

A still frame from an annotated dataset in which a nose point, an orientation point and 4 whiskers are being tracked. This figure shows how altering the T Value changes the position along the whisker shaft against which the Whisker Angle is measured: (a) T Value = 0, (b) T Value = 0.5, (c) T Value = 1.

Whisker Protraction/Retraction Velocities

Every whisk cycle begins with a protraction (forward movement of the whiskers), followed by a retraction (backward movement of the whiskers). The velocities of protraction and retraction are closely linked to behaviour, and can reach speeds of up to 1000 degrees/second []. During contact with an object, retraction velocity is thought to be reduced, which tends to lead to longer contact durations []. Protraction velocities are slower than retraction velocities, and until recently it was believed that retraction was a passive process without muscular activation. This has been confirmed for foveal whisking, but retractions during exploratory whisking have been found to be under active control [].

Once the angle has been found, individual protraction/retraction cycles can be identified: by smoothing the angle signal, locating its peaks, and then finding the nearest corresponding peaks on the unsmoothed signal, the cycles can be easily identified. The protraction and retraction velocities are measured in degrees/second.
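
A minimal sketch of this approach is shown below: smooth the angle signal with a moving average, then locate local maxima on the smoothed signal. The names and the window size are illustrative, not taken from the MWA source.

```csharp
using System;
using System.Collections.Generic;

public static class CycleDetector
{
    // Simple centred moving average to suppress frame-to-frame noise.
    public static double[] MovingAverage(double[] signal, int window)
    {
        var smoothed = new double[signal.Length];
        for (int i = 0; i < signal.Length; i++)
        {
            int start = Math.Max(0, i - window / 2);
            int end = Math.Min(signal.Length - 1, i + window / 2);
            double sum = 0;
            for (int j = start; j <= end; j++)
                sum += signal[j];
            smoothed[i] = sum / (end - start + 1);
        }
        return smoothed;
    }

    // Indices of local maxima; each peak marks the turning point between
    // a protraction and the following retraction.
    public static List<int> FindPeaks(double[] signal)
    {
        var peaks = new List<int>();
        for (int i = 1; i < signal.Length - 1; i++)
            if (signal[i] > signal[i - 1] && signal[i] >= signal[i + 1])
                peaks.Add(i);
        return peaks;
    }
}
```

Each peak found on the smoothed signal would then be snapped to the nearest peak on the unsmoothed signal, as described above.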

Whisking Frequency

The frequency at which mice whisk is closely related to certain behaviours such as locomotion and object exploration, with lower frequencies observed during locomotion and higher frequencies observed during foveal whisking against objects []. Two methods are provided for calculating frequency: the autocorrelogram [] and the Discrete Fourier Transform [] (Fig. 2). The signal used to obtain the frequency is the angle of the whisker over the duration of the video; as there are multiple ways of defining the angle, options to choose the angle method and the T Value are provided. Frequency is measured in Hertz (Hz).
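
As an illustration of the Fourier option, the sketch below estimates the dominant whisking frequency from the angle signal using a direct Discrete Fourier Transform; a production implementation would typically use an FFT, and the names here are not taken from the MWA source.

```csharp
using System;

public static class FrequencyMath
{
    // Returns the frequency (Hz) of the strongest non-DC component of the
    // angle signal, given the original frame rate in frames/second.
    public static double DominantFrequency(double[] angles, double frameRate)
    {
        int n = angles.Length;
        double bestMagnitude = 0;
        int bestBin = 1;

        for (int k = 1; k <= n / 2; k++)            // skip bin 0 (the mean offset)
        {
            double re = 0, im = 0;
            for (int t = 0; t < n; t++)
            {
                double phase = 2 * Math.PI * k * t / n;
                re += angles[t] * Math.Cos(phase);
                im -= angles[t] * Math.Sin(phase);
            }
            double magnitude = Math.Sqrt(re * re + im * im);
            if (magnitude > bestMagnitude)
            {
                bestMagnitude = magnitude;
                bestBin = k;
            }
        }

        return bestBin * frameRate / n;             // bin index -> Hz
    }
}
```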

Figure 2 

The frequency section in the analytical suite. The top graph shows the angle of the whisker over time, with the bottom graph showing the computed Discrete Fourier Transform.

Whisker Amplitude

During whisking, amplitudes vary between 10 and 100 degrees [], with much larger amplitudes observed during exploration and smaller amplitudes during foveal whisking []. Upon contact with an unexpected object, the ipsilateral whiskers cease protraction and reduce their amplitude to investigate the object further []. This enables the whiskers to make light touches upon the surface, which is thought to improve the quality of the information gathered by the whisker follicle []. As the angles are already known and the protraction/retraction cycles identified, the amplitude is calculated as the difference in angle between the beginning and end of a whisk cycle. The average amplitude, and the minimum and maximum angles, are given for each whisk cycle; amplitudes are measured in degrees.
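
A minimal sketch of this calculation is shown below, operating on one cycle of the angle signal delimited by the cycle-detection step; the names are illustrative, not from the MWA source.

```csharp
using System;

public static class AmplitudeMath
{
    // Amplitude as described above: the difference in angle between the
    // beginning and end of a whisk cycle segment (e.g. trough to the
    // following peak), plus the cycle's minimum and maximum angles.
    public static void CycleStats(double[] angles, int cycleStart, int cycleEnd,
        out double amplitude, out double minAngle, out double maxAngle)
    {
        minAngle = double.MaxValue;
        maxAngle = double.MinValue;
        for (int i = cycleStart; i <= cycleEnd; i++)
        {
            minAngle = Math.Min(minAngle, angles[i]);
            maxAngle = Math.Max(maxAngle, angles[i]);
        }
        amplitude = Math.Abs(angles[cycleEnd] - angles[cycleStart]);
    }
}
```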

Whisker Spread

Another important component of the whisking cycle is the spread of the whiskers. This spread varies over the course of a whisking cycle, with the whiskers spreading out during protraction and reducing in spread during retraction []. During tactile discrimination, whisker spread is substantially reduced in order to increase the number of whisker-surface contact points []. As each individual whisker being annotated is identifiable and the angle of each whisker is already known, calculating the spread simply involves finding the difference in angle between the whiskers. The only difficulty is identifying which side of the head each whisker is on. This can only be achieved if the center line is known, and as such this metric can only be calculated if the user has chosen to include the nose point and the orientation point. Two spread measurements are given: the maximum spread (the angle between the foremost and rearmost whiskers) and the average spread. Both are measured in degrees.
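
A minimal sketch of both measurements, for the whisker angles on one side of the head in a single frame, is shown below. The definition of average spread used here (the mean angular gap between neighbouring whiskers) is an assumption, as the text does not define it precisely, and the names are illustrative.

```csharp
using System.Collections.Generic;
using System.Linq;

public static class SpreadMath
{
    // Maximum spread: the angle between the foremost and rearmost whiskers.
    public static double MaxSpread(IList<double> whiskerAngles)
    {
        return whiskerAngles.Max() - whiskerAngles.Min();
    }

    // Average spread, assumed here to be the mean angular gap
    // between neighbouring whiskers.
    public static double AverageSpread(IList<double> whiskerAngles)
    {
        var sorted = whiskerAngles.OrderBy(a => a).ToList();
        double total = 0;
        for (int i = 1; i < sorted.Count; i++)
            total += sorted[i] - sorted[i - 1];
        return total / (sorted.Count - 1);
    }
}
```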

Head Orientation

As a rat orientates its head, its whiskers move asymmetrically in order to search the area that the head is moving towards []. It has also been found that mice re-orientate their head after a whisker has come into contact with an object, to align themselves with the object for further investigation []. Head orientation can only be calculated by the software if the user has selected to include a nose point and an orientation point (the mid-point of the head) in the annotation settings. If both of these points have been included, head orientation is calculated as the angle between the vertical and the line generated by the nose point and orientation point. Head orientation is measured in degrees.

Locomotion Velocity/Distance Travelled

Whisking behaviour is also tightly coupled with locomotion. Rats use their whiskers as collision detectors when running, ceasing whisking and positioning their whiskers far out in front of them [, ]. Due to this association between whisking and locomotion, locomotion always needs to be controlled for, to make sure that observed differences in whisking are not simply caused by changes in locomotor activity []. Nose displacement is the sum of the distances travelled between frames by the nose point. Point velocity is calculated as the change in position over time; since the distance a point has moved between frames and the frame rate of the video are both known, this is straightforward to calculate. However, if the animal in the video is moving, a point attached to the animal will appear to have a velocity even when it is not moving relative to the animal. Because of this, nose point compensation was created, allowing the user to subtract the nose point’s velocity from the other points. Distance and velocity are measured in pixels and pixels/second respectively.
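
A minimal sketch of the velocity calculation with nose point compensation is shown below; the names are illustrative, not taken from the MWA source.

```csharp
using System.Windows;

public static class LocomotionMath
{
    // Velocity of a point between two consecutive annotated frames,
    // in pixels/second, given the frame rate in frames/second.
    public static Vector Velocity(Point previous, Point current, double frameRate)
    {
        return (current - previous) * frameRate;
    }

    // Subtract the nose point's velocity so that the point's velocity is
    // expressed relative to the animal rather than the image frame.
    public static Vector NoseCompensated(Vector pointVelocity, Vector noseVelocity)
    {
        return pointVelocity - noseVelocity;
    }
}
```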

Analyse Everything

To ease the process of extracting data, an “Analyse Everything” section is also included. This compiles a list of selected measures that can be averaged over the entire video, including, but not limited to, whisking frequency, distance travelled and average whisker position. These measures can then be exported into a single Excel file. Any measures that display information on a frame-by-frame basis, such as position or current speed, must be exported individually.

Calibrating the Metrics

All metrics that depend on distance or time are calculated in pixels and seconds, respectively. However, many videos are taken using a high-speed camera, and sometimes the output video (.avi) file has a different frame rate to that of the original recording, due to differences in replay speeds. In these videos it is not possible to recover the original frame rate from the video metadata, making it impossible to derive any time-scales. In these instances, the user can manually input the original frame rate of the video into the program. Another calibration factor that needs to be taken into account is the ability to report distance in mm, m or any unit other than pixels, as videos are taken at different resolutions and distances from the animal. To solve this, an option is provided to calibrate distance. Many videos already come with an additional calibration video, allowing the user to calculate real-world size. The user can load one of these videos, specify the calibration points, and enter the real distance between them. Doing this allows the user to take measurements in m/s, mm/s or any other unit of choice, instead of pixels/s.
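
A minimal sketch of the two calibration factors described above is shown below; the names are illustrative, not taken from the MWA source.

```csharp
public class Calibration
{
    public double FrameRate { get; set; }       // original recording rate, frames/second
    public double UnitsPerPixel { get; set; }   // e.g. mm per pixel, from the calibration video

    // Derived from two user-specified calibration points a known real-world
    // distance apart (pixelDistance is the distance between them in pixels).
    public static double ComputeUnitsPerPixel(double pixelDistance, double realDistance)
    {
        return realDistance / pixelDistance;
    }

    // Convert a velocity from pixels/frame into calibrated units/second (e.g. mm/s).
    public double ToCalibratedVelocity(double pixelsPerFrame)
    {
        return pixelsPerFrame * FrameRate * UnitsPerPixel;
    }
}
```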

Shortcuts

Several shortcuts are provided to increase usability, most notably the undo and repeat commands (Ctrl+Z and Ctrl+R respectively). Undo reverts the last operation the user performed, and repeat places the selected point in the same place it was placed on the previous frame.

Quality control

The tracker has been validated against known data from previous validation studies. Example data and videos are included on the hosting website. As this tracker is a manual annotator, the user is responsible for the validity of the resulting metrics. The tracker will be able to track all videos in which the head and whiskers are visible, although confidence in the tracking points may vary depending on user experience and video quality. As with any manual annotator, it is recommended that anyone using MWA to track whole datasets validates their tracking error and inter-user reliability on their own specific datasets before use.

(2) Availability

Operating system

Windows 7/8/10

Programming language

C# v4.0

Additional system requirements

512 MB RAM, 100 MB free disk space, 2 GHz processor

Dependencies

.NET Framework 4.5

List of contributors

Brett Hewitt is the software developer and wrote the paper.

Moi Hoon Yap is the advisor on image/video processing aspects and was involved in writing the paper.

Robyn Grant is the field expert who provided the software specifications and was involved in writing the paper.

Software location

Archive

Name: Zenodo

Persistent identifier: http://doi.org/10.5281/zenodo.30967

License: GNU GPL v3.0

Publisher: Brett Hewitt

Date published: 13/09/2015

Code Repository

Name: GitHub

Identifier: https://github.com/BrettHewitt/MWA

License: GNU GPL v3.0

Date published: 13/09/2015

Homepage and Documentation

http://mwa.bretthewitt.net/

Language

English

(3) Reuse potential

The MVVM design pattern allows components to be quickly and easily added or edited. Future modular components that are likely to be added to the MWA include i) specific locomotion tracking, including foot placements and ankle angles from a side-on view; and ii) flight analysis. Behavioural flight analysis of birds and bats would require exactly the same metrics (frequency, offset, amplitude, curvature, velocity) as those used in the MWA, calculated from similar points along a body and wing shaft. While these applications are likely to be incorporated into the MWA in the short term, the software has the capacity for use in any research that requires tracking a point or curve and generating quantifiable data from it. This tracker could therefore be adapted to many research applications, perhaps including, but in no way limited to, biomechanics, projectiles, animal behaviour and robotics.

Competing Interests

The authors declare that they have no competing interests.