GPT-J-6B: A 6 Billion Parameter Autoregressive Language Model
Wang, Ben and Komatsuzaki, Aran

folder  GPT-J-6B-weights (6 files)
file    GPT-J-6B-weights_meta.sqlite    20.48 kB
file    GPT-J-6B-weights_meta.xml       0.67 kB
file    step_383500_slim.tar.zstd       9.41 GB
Type: Dataset
Tags:
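
The checkpoint file step_383500_slim.tar.zstd listed above is a Zstandard-compressed tar archive. A minimal extraction sketch in Python, assuming the third-party zstandard package is installed (pip install zstandard); the output directory name is illustrative:

import tarfile
import zstandard

# File name taken from the listing above.
archive_path = "step_383500_slim.tar.zstd"

with open(archive_path, "rb") as compressed:
    # Stream-decompress the zstd layer and read the inner tar directly,
    # avoiding an intermediate .tar file on disk.
    dctx = zstandard.ZstdDecompressor()
    with dctx.stream_reader(compressed) as reader:
        with tarfile.open(fileobj=reader, mode="r|") as tar:
            tar.extractall("GPT-J-6B-weights")  # illustrative output directory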

Bibtex:
@misc{gpt-j,
  author       = {Wang, Ben and Komatsuzaki, Aran},
  title        = {{GPT-J-6B}: A 6 Billion Parameter Autoregressive Language Model},
  howpublished = {\url{https://github.com/kingoflolz/mesh-transformer-jax}},
  year         = {2021},
  month        = {May},
  license      = {Apache-2.0},
  url          = {https://github.com/kingoflolz/mesh-transformer-jax}
}
