Tiny Images Dataset
Antonio Torralba and Rob Fergus and William T Freeman

Tiny_Images_Dataset (7 files)
README.md 1.17kB
README.txt 1.17kB
data/tiny_code.zip 155.68kB
data/tiny_images.bin 243.62GB
data/tiny_index.mat 7.32MB
data/tiny_metadata.bin 60.90GB
data/tinygist80million.bin 121.81GB
Type: Dataset
Tags:
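
The listing above includes the raw image file data/tiny_images.bin. The Python sketch below is one way to pull a single image out of it; it is an illustration, not the authors' loader. It assumes each image is stored as 32 x 32 x 3 uint8 values (3,072 bytes) in MATLAB column-major order, with images packed back to back, which is consistent with the listed 243.62 GB for 79,302,017 images. The MATLAB reader shipped in data/tiny_code.zip is the authoritative reference for the layout; the function name read_tiny_image here is made up for the example.

```python
# Minimal sketch for reading one 32x32 RGB image out of tiny_images.bin.
# Assumptions (not stated on this page): images are stored consecutively as
# 32*32*3 = 3072 uint8 values each, in MATLAB column-major order; the listed
# file size (79,302,017 * 3072 bytes ~= 243.62 GB) is consistent with this.
import numpy as np

BYTES_PER_IMAGE = 32 * 32 * 3  # 3072 bytes per image

def read_tiny_image(path, index):
    """Return image `index` (0-based) as a (32, 32, 3) uint8 RGB array."""
    with open(path, "rb") as f:
        f.seek(index * BYTES_PER_IMAGE)
        buf = f.read(BYTES_PER_IMAGE)
    if len(buf) != BYTES_PER_IMAGE:
        raise IndexError(f"image {index} is past the end of {path}")
    # Fortran-order (column-major) reshape reproduces MATLAB's 32x32x3 indexing.
    return np.frombuffer(buf, dtype=np.uint8).reshape((32, 32, 3), order="F")

if __name__ == "__main__":
    img = read_tiny_image("data/tiny_images.bin", 0)
    print(img.shape, img.dtype)  # (32, 32, 3) uint8
```
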

Bibtex:
@article{torralba2008tinyimages,
title= {Tiny Images Dataset},
journal= {IEEE Transactions on Pattern Analysis and Machine Intelligence},
author= {Antonio Torralba and Rob Fergus and William T. Freeman},
year= {2008},
url= {},
abstract= {With the advent of the Internet, billions of images are now freely available online and constitute a dense sampling of the visual world. Using a variety of non-parametric methods, we explore this world with the aid of a large dataset of 79,302,017 images collected from the Internet. Motivated by psychophysical results showing the remarkable tolerance of the human visual system to degradations in image resolution, the images in the dataset are stored as 32 x 32 color images. Each image is loosely labeled with one of the 75,062 non-abstract nouns in English, as listed in the WordNet lexical database. Hence the image database gives a comprehensive coverage of all object categories and scenes. The semantic information from WordNet can be used in conjunction with nearest-neighbor methods to perform object classification over a range of semantic levels, minimizing the effects of labeling noise. For certain classes that are particularly prevalent in the dataset, such as people, we are able to demonstrate a recognition performance comparable to class-specific Viola-Jones style detectors.},
keywords= {},
terms= {},
license= {},
superseded= {}
}
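
The abstract above describes classifying images with nearest-neighbor methods over this collection. The sketch below is a minimal, hedged illustration of such a lookup over the precomputed GIST descriptors in data/tinygist80million.bin, not the authors' code. It assumes each descriptor is a 384-dimensional float32 vector (1,536 bytes per image, which matches the listed 121.81 GB for 79,302,017 images) and scans the file in chunks with brute-force squared-L2 distance; the function name, chunk size, and descriptor layout are assumptions for the example. A query vector would need to be a GIST descriptor computed with the same parameters as the stored ones.

```python
# Hedged sketch of a nearest-neighbor lookup over the GIST descriptors.
# Assumption: each descriptor is 384 float32 values (1536 bytes), consistent
# with the listed size (79,302,017 * 1536 bytes ~= 121.81 GB).
import numpy as np

GIST_DIM = 384
N_IMAGES = 79_302_017

def nearest_neighbors(gist_path, query, k=10, chunk=100_000):
    """Brute-force k-NN over the GIST file, scanned in chunks.

    `query` is a (384,) float32 vector; returns (indices, squared distances).
    """
    gist = np.memmap(gist_path, dtype=np.float32, mode="r",
                     shape=(N_IMAGES, GIST_DIM))
    best_idx = np.empty(0, dtype=np.int64)
    best_d2 = np.empty(0, dtype=np.float32)
    for start in range(0, N_IMAGES, chunk):
        block = np.asarray(gist[start:start + chunk])
        d2 = ((block - query) ** 2).sum(axis=1)   # squared L2 to the query
        idx = np.argsort(d2)[:k] + start          # best candidates in this chunk
        best_idx = np.concatenate([best_idx, idx])
        best_d2 = np.concatenate([best_d2, d2[idx - start]])
        keep = np.argsort(best_d2)[:k]            # merge with the running best k
        best_idx, best_d2 = best_idx[keep], best_d2[keep]
    return best_idx, best_d2
```

The chunked scan avoids loading the full 121.81 GB file into memory; for repeated queries at this scale, an approximate nearest-neighbor index would be the more practical choice.
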
