affectnet (31003 files)
anger/image0000006.jpg 383.58kB
anger/image0000007.jpg 80.47kB
anger/image0000012.jpg 146.93kB
anger/image0000035.jpg 317.19kB
anger/image0000060.jpg 49.94kB
anger/image0000061.jpg 63.92kB
anger/image0000066.jpg 102.32kB
anger/image0000068.jpg 50.13kB
anger/image0000106.jpg 431.82kB
anger/image0000132.jpg 389.00kB
anger/image0000138.jpg 66.02kB
anger/image0000182.jpg 32.22kB
anger/image0000195.jpg 76.04kB
anger/image0000213.jpg 724.55kB
anger/image0000228.jpg 220.77kB
anger/image0000294.jpg 456.54kB
anger/image0000333.jpg 611.45kB
anger/image0000343.jpg 281.05kB
anger/image0000346.jpg 83.93kB
anger/image0000356.jpg 892.11kB
anger/image0000368.jpg 27.29kB
anger/image0000371.jpg 147.54kB
anger/image0000374.jpg 223.18kB
anger/image0000390.jpg 175.92kB
anger/image0000399.jpg 66.93kB
anger/image0000400.jpg 40.34kB
anger/image0000407.jpg 385.66kB
anger/image0000415.jpg 140.00kB
anger/image0000416.jpg 58.95kB
anger/image0000441.jpg 619.31kB
anger/image0000450.jpg 210.95kB
anger/image0000463.jpg 21.90kB
anger/image0000470.jpg 537.76kB
anger/image0000511.jpg 136.41kB
anger/image0000513.jpg 62.37kB
anger/image0000546.jpg 293.00kB
anger/image0000593.jpg 485.85kB
anger/image0000602.jpg 113.78kB
anger/image0000643.jpg 493.58kB
anger/image0000690.jpg 323.73kB
anger/image0000697.jpg 47.28kB
anger/image0000722.jpg 364.14kB
anger/image0000723.jpg 62.76kB
anger/image0000724.jpg 119.90kB
anger/image0000758.jpg 253.25kB
anger/image0000776.jpg 228.47kB
anger/image0000825.jpg 402.87kB
anger/image0000908.jpg 76.50kB
anger/image0000923.jpg 292.60kB
(listing truncated; only the first 50 of 31,003 files are shown)
Type: Dataset
Tags: Facial expression
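
The listing above suggests a folder-per-class layout (e.g. anger/image0000006.jpg), with one directory per expression category. Assuming the other categories follow the same convention, the following is a minimal loading sketch using torchvision's ImageFolder; the root path, image size, and loader parameters are placeholders, not part of the dataset's documentation.

import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Resize to a common input size and convert HWC uint8 -> CHW float in [0, 1].
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Root directory of the extracted dataset (hypothetical path).
dataset = datasets.ImageFolder("affectnet/", transform=transform)
loader = DataLoader(dataset, batch_size=64, shuffle=True, num_workers=4)

print(dataset.classes)             # folder names become class labels, e.g. ['anger', ...]
images, labels = next(iter(loader))
print(images.shape, labels.shape)  # e.g. torch.Size([64, 3, 224, 224]), torch.Size([64])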

Bibtex:
@article{mollahosseini2017affectnet,
title= {AffectNet},
journal= {},
author= {Ali Mollahosseini and Behzad Hasani and Mohammad H. Mahoor},
year= {},
url= {http://mohammadmahoor.com/wp-content/uploads/2017/08/AffectNet_oneColumn-2.pdf},
abstract= {Automated affective computing in the wild setting is a challenging problem in computer vision. Existing annotated databases of facial expressions in the wild are small and mostly cover discrete emotions (aka the categorical model). There are very limited annotated facial databases for affective computing in the continuous dimensional model (e.g., valence and arousal). To meet this need, we collected, annotated, and prepared for public distribution a new database of facial emotions in the wild (called AffectNet). AffectNet contains more than 1,000,000 facial images from the Internet by querying three major search engines using 1250 emotion related keywords in six different languages. About half of the retrieved images were manually annotated for the presence of seven discrete facial expressions and the intensity of valence and arousal. AffectNet is by far the largest database of facial expression, valence, and arousal in the wild enabling research in automated facial expression recognition in two different emotion models. Two baseline deep neural networks are used to classify images in the categorical model and predict the intensity of valence and arousal. Various evaluation metrics show that our deep neural network baselines can perform better than conventional machine learning methods and off-the-shelf facial expression recognition systems.},
keywords= {Facial expression},
terms= {},
license= {},
superseded= {}
}
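
The abstract mentions baseline deep networks for the categorical (discrete expression) model. Those baselines are not reproduced here; below is only an illustrative sketch of fine-tuning a stock ResNet-18 on the expression folders loaded above, with the model choice, hyperparameters, and variable names (dataset, loader) all assumed rather than taken from the paper.

import torch
import torch.nn as nn
from torchvision import models

# Hedged sketch: NOT the paper's baseline, just a generic fine-tuning setup.
num_classes = len(dataset.classes)          # `dataset` from the ImageFolder sketch above
model = models.resnet18(weights="IMAGENET1K_V1")   # requires torchvision >= 0.13
model.fc = nn.Linear(model.fc.in_features, num_classes)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:               # `loader` from the sketch above
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()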