Abstract

Faces provide not only cues to an individual's identity, age, gender, and ethnicity, but also insight into their mental states. The ability to identify the mental states of others is known as Theory of Mind. Here we present results from a study aimed at extending our understanding of the temporal dynamics of expression recognition beyond the basic emotions, using short presentation times ranging from 12.5 to 100 ms. We measured the effect of variations in presentation time on identification accuracy for 36 different facial expressions of mental states based on the Reading the Mind in the Eyes test (Baron-Cohen et al., 2001) and compared these results to those for corresponding stimuli from the McGill Face Database, a new set of images depicting mental states portrayed by professional actors. Our results show that subjects can identify facial expressions of complex mental states at very brief presentation times. The kind of cognition involved in this rapid, correct identification suggests a fast, automatic, Type-1 form of cognition.

Schmidtmann, G., Jordan, M., Loong, J.T., Logan, A.J., Carbon, C.C., & Gold, I. Temporal processing of facial expressions of mental states. bioRxiv 602375; doi: https://doi.org/10.1101/602375 [PDF]
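If the stimuli were shown on an 80 Hz display (an assumption made purely for illustration), 12.5 ms corresponds to a single monitor frame. The minimal Python sketch below makes the duration-to-frame arithmetic explicit; the refresh rate and the intermediate durations are illustrative assumptions, since only the 12.5-100 ms range is stated in the abstract.

# Minimal sketch: how brief presentation times map onto whole monitor frames.
# The 80 Hz refresh rate and the intermediate durations are assumptions for
# illustration; only the 12.5-100 ms range is stated in the abstract.
REFRESH_HZ = 80
FRAME_MS = 1000.0 / REFRESH_HZ               # one frame lasts 12.5 ms at 80 Hz

for duration_ms in (12.5, 25.0, 50.0, 100.0):
    n_frames = round(duration_ms / FRAME_MS)  # nearest whole number of frames
    print(f"{duration_ms:5.1f} ms  ->  {n_frames} frame(s) at {REFRESH_HZ} Hz")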

Posted by Gunnar Schmidtmann

Schmidtmann, G., Jennings, B. J., Sandra, D. A., Pollock, J., & Gold, I. (2019). The McGill Face Database: validation and insights into the recognition of facial expressions of complex mental states. bioRxiv, 586453. https://doi.org/10.1101/586453 [PDF]

The McGill Face Database: validation and insights into the recognition of facial expressions of complex mental states

Current databases of facial expressions of mental states typically represent only a small subset of expressions, usually covering the basic emotions (fear, disgust, surprise, happiness, sadness, and anger). To overcome these limitations, we introduce a new database of pictures of facial expressions reflecting the richness of mental states. Ninety-three expressions of mental states were interpreted by two professional actors, and high-quality pictures were taken under controlled conditions in front and side view. The database was validated with two different experiments (N=65). First, a four-alternative forced-choice paradigm was employed to test the ability of participants to correctly select a term associated with each expression. In a second experiment, we employed a paradigm that did not rely on any semantic information: the task was to locate each face within a two-dimensional space of valence and arousal (mental state space) using a "point-and-click" paradigm. Results from both experiments demonstrate that subjects can reliably recognize a great diversity of emotional states from facial expressions. Interestingly, while subjects' performance was better for front-view images, the advantage over the side view was not dramatic. To our knowledge, this is the first demonstration of the high degree of accuracy human viewers exhibit when identifying complex mental states from only partially visible facial features. The McGill Face Database provides a wide range of facial expressions that can be linked to mental state terms and can be accurately characterized in terms of arousal and valence.
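As an illustration of how a "point-and-click" response in such a two-dimensional valence-arousal space can be turned into coordinates, here is a minimal Python sketch. The screen geometry, the [-1, 1] axis range, and the function name are assumptions made for illustration, not details of the actual experiment.

# Minimal sketch: map a mouse click inside a square response area onto
# (valence, arousal) coordinates in [-1, 1]. Geometry and axis ranges are
# illustrative assumptions, not the values used in the study.
def click_to_valence_arousal(x_px, y_px, area_left, area_top, area_size):
    u = (x_px - area_left) / area_size   # normalise click to [0, 1] horizontally
    v = (y_px - area_top) / area_size    # ... and vertically
    valence = 2.0 * u - 1.0              # left-right position -> valence
    arousal = 1.0 - 2.0 * v              # screen y grows downwards, so flip
    return valence, arousal

# Example: a click near the upper-right corner of a 600 x 600 px response area
print(click_to_valence_arousal(950, 180, area_left=400, area_top=100, area_size=600))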

Posted by Gunnar Schmidtmann

Ingo Fruend (York University, Toronto) and I demonstrate that only a small fraction of biologically relevant shapes can be represented by Radial Frequency (RF) patterns and that this small fraction is perceptually distinct from the general class of all possible planar shapes. In this paper we derive a general method to compute the distance of a given shape's outline from the set of RF patterns, allowing us to scan large numbers of object outlines automatically. This analysis shows that only 1 to 6% of naturally smooth outlines can be exactly represented by RF patterns. In addition, we present results from visual search experiments, which revealed that searching for an RF pattern among non-RF patterns (and vice versa) is efficient, whereas searching for an RF pattern among other RF patterns is inefficient.

Our results suggest that RF patterns represent only a small and restricted subset of possible planar shapes and that results obtained with this special class of stimuli cannot simply be expected to generalise to arbitrary planar shapes or to shape representation in general.

Schmidtmann, G., & Fruend, I. (2019). Radial frequency patterns describe a small and perceptually distinct subset of all possible planar shapes. Vision Research, 154, 122–130.  [PDF]
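For readers unfamiliar with the stimulus class: an RF pattern is typically constructed by sinusoidally modulating the radius of a circle as a function of polar angle, r(θ) = r0 · (1 + A · sin(f·θ + φ)). The minimal Python sketch below generates such a contour; the parameter values are illustrative and are not those used in the paper.

# Minimal sketch of a radial frequency (RF) pattern outline:
#   r(theta) = r0 * (1 + A * sin(f * theta + phi))
# Parameter values are illustrative only.
import numpy as np

def rf_contour(r0=1.0, amplitude=0.1, frequency=5, phase=0.0, n_points=360):
    """Return the x, y coordinates of an RF pattern contour."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    r = r0 * (1.0 + amplitude * np.sin(frequency * theta + phase))
    return r * np.cos(theta), r * np.sin(theta)

x, y = rf_contour(frequency=5, amplitude=0.1)   # a gently modulated five-lobed shape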

Posted by Gunnar Schmidtmann

Ania Zolubak, a PhD candidate in Dr Garcia-Suarez's lab, presented a poster at the European Conference on Visual Perception (ECVP 2018) in Trieste.

Scale-invariance for radial frequency patterns in peripheral vision.

Zolubak, A. B., Schmidtmann, G., Garcia-Suarez, L. 

Radial frequency (RF) patterns are sinusoidally modulated contours. Previous studies have shown that RF shape discrimination (RF vs. circle) is scale-invariant, i.e. performance is independent of radius when the patterns are presented centrally.
This study aims to investigate scale-invariance in peripheral vision (0–20° nasal visual field, radius 1°, RF = 6, SF = 1 or 5 cpd) by scaling the radii according to the Cortical Magnification Factor (CMF) and fractions of it (MF1 = 1/2, MF2 = 1/4, MF3 = 1/8); see the sketch after the abstract.
Results show that performance remains constant with eccentricity for CMF, MF1, and MF2, and, for two of the four observers (N=4), also for MF3. However, thresholds for MF2 were on average twice as high, and for MF3 four times as high, as those for CMF and MF1.
The scale-invariance found for the larger stimuli indicates the involvement of global shape processing in the periphery. The higher, yet constant, thresholds for smaller patterns suggest that the resolvability of the contours limits peripheral performance and may elicit processing by low-level mechanisms.
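As a rough illustration of this kind of M-scaling, the Python sketch below computes stimulus radii that compensate, fully or by a fraction, for cortical magnification using the common approximation M(E) = M0 / (1 + E/E2). The E2 value and the way the fractional factors are applied here are assumptions made for illustration; the poster's exact scaling procedure is not described in this summary.

# Minimal sketch: scale a stimulus radius with eccentricity using the
# approximation M(E) = M0 / (1 + E / E2). E2 and the treatment of the
# fractional factors (1/2, 1/4, 1/8) are illustrative assumptions.
E2_DEG = 0.77          # assumed eccentricity (deg) at which M has halved
R_FOVEAL_DEG = 1.0     # foveal stimulus radius of 1 deg, as in the poster

def scaled_radius(ecc_deg, fraction=1.0, r0=R_FOVEAL_DEG, e2=E2_DEG):
    """Radius at eccentricity ecc_deg; fraction=1.0 is full CMF scaling."""
    full_scale = 1.0 + ecc_deg / e2                   # inverse of M(E) / M(0)
    return r0 * (1.0 + fraction * (full_scale - 1.0))

# Example: radii at 10 deg eccentricity for full and fractional scaling
print([round(scaled_radius(10.0, f), 2) for f in (1.0, 0.5, 0.25, 0.125)])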

Posted by Gunnar Schmidtmann

The influence of face identity noise on face recognition in healthy subjects and patients with mild traumatic brain injury - an equivalent noise approach.

Schmidtmann, G., Wehbé, F., Sandra, D.A., Farivar, R.

McGill Vision Research, Department of Ophthalmology, McGill University

 

Poster presented at ECVP 2017. [PDF]

Posted by Gunnar Schmidtmann