SAMM Dataset of Micro-Facial Movements

[Image: an example of a posed contempt micro-expression] [Image: an example of a posed sad micro-expression]
*Images shown are not representative of the SAMM dataset.

Many established facial expression datasets have been created with relative ease; however, creating a micro-movement dataset in which participants exhibit real, spontaneous micro-movements while watching emotional stimuli is not so straightforward. Because it is difficult to create an environment where micro-movements occur naturally, we attempted to keep our participants comfortable and focused on the emotional stimuli being shown. A high-stakes element, in the form of a monetary incentive, was introduced to increase the likelihood of suppression and therefore of micro-facial movements.

The Spontaneous Actions and Micro-Movements (SAMM) dataset was created to address the limitations of existing micro-expression datasets, which lack demographic diversity, ground-truth Facial Action Coding System (FACS) coding and spontaneous movements. The dataset contains 159 spontaneous micro-facial movements obtained through our emotional inducement experiment. We collected micro-movements from 32 participants from a diverse demographic, including 13 different ethnicities, a mean age of 33.24 years (standard deviation: 11.32, ages between 19 and 57) and an even gender split of 17 male and 16 female participants.

If you use this dataset, please cite:

A. K. Davison, C. Lansley, N. Costen, K. Tan, and M. H. Yap, "SAMM: A Spontaneous Micro-Facial Movement Dataset," IEEE Transactions on Affective Computing, vol. PP, no. 99, pp. 1-1, doi: 10.1109/TAFFC.2016.2573832.

For access to the dataset, please fill in the Licence Agreement and send it to M.Yap(at)mmu.ac.uk. Once approved, we will send you access information to download the dataset from the link below. Please be aware that it can take up to a week to process your request.

Licence Agreement

SAMM Dataset