AR Face Database 22pt Markup

Introduction

The AR Face Database contains over 3000 mug shots of 130 individuals exhibiting various facial expressions. More information on the data set used to be available from:-

http://www2.ece.ohio-state.edu/~aleix/ARdatabase.html

Example Images

Example images, showing 13 expressions recorded in two separate sessions for one individual from the AR face database

(26 example images: m-016-01.jpg through m-016-26.jpg)

Manual Landmarking

To enable detailed testing and model building, the AR face images have been manually labelled with 22 facial feature points on each face. The 22 points chosen are consistent across all images. The landmark scheme is shown below:-

Markup

(Diagram of the 22-point landmark scheme)

It is intended that these marked-up points will be used for a variety of purposes, such as model building and algorithm testing.

Points Data

The 22 point markup is currently available for expressions 01, 02, 03 and 05. We are working on the markup for the remaining expressions. This data is zipped up into the following file:-

ar_face_22pt_markup.zip

The filenames of the AR face images have the following format:-

X-Y-Z.raw

Here:-

X is the sex of the subject ('m' for male, 'w' for female)
Y is the three-digit subject number (e.g. 016)
Z is the two-digit expression/recording number (01 to 26, covering both sessions)

The image files and point files have corresponding names, e.g. "m-016-06.raw" has points in file "m-016-06.pts".
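
As a minimal sketch, assuming the sex/subject/expression pattern seen in the example filenames above (the function names here are illustrative, not part of the data set), a filename can be parsed and mapped to its points file in Python as follows:-

import re

# Matches AR face filenames such as "m-016-06.raw" or "m-016-06.jpg"
FILENAME_RE = re.compile(r"^([mw])-(\d{3})-(\d{2})\.(raw|jpg)$")

def parse_ar_filename(name):
    """Split an AR face filename into (sex, subject_no, expression_no)."""
    match = FILENAME_RE.match(name)
    if match is None:
        raise ValueError("not an AR face filename: " + name)
    sex, subject, expression = match.groups()[:3]
    return sex, int(subject), int(expression)

def points_filename(image_name):
    """Derive the points filename, e.g. "m-016-06.raw" -> "m-016-06.pts"."""
    return image_name.rsplit(".", 1)[0] + ".pts"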

The points file format is as follows:-

version: 1          (This points file version no. can be ignored)
n_points: 22        (The number of labelled points on the image)
{
xxxx  yyyy
.....
}

For each point, xxxx is the x co-ordinate and yyyy is the y co-ordinate, both measured from the top-left corner of the image.
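
A minimal reader for this format might look as follows (an illustrative Python sketch, not part of the distribution; it assumes one point per line between the braces, as shown above):-

def read_pts(path):
    """Read a .pts annotation file and return a list of (x, y) tuples."""
    points = []
    n_points = None
    in_points = False
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.startswith("n_points:"):
                n_points = int(line.split(":")[1])   # expected point count
            elif line == "{":
                in_points = True                     # start of point list
            elif line == "}":
                in_points = False                    # end of point list
            elif in_points and line:
                x, y = line.split()
                points.append((float(x), float(y)))
    if n_points is not None and len(points) != n_points:
        raise ValueError("point count mismatch in " + path)
    return points

For example, read_pts("m-016-06.pts") would return the 22 (x, y) pairs for image "m-016-06.raw". Note that the "version:" header line is simply ignored, as noted above.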

All points files contain 22 points, each corresponding to a specific feature on the face (see diagram above).

The annotation has been funded by the FGNET project.