GPT-J-6B: A 6 Billion Parameter Autoregressive Language Model
Wang, Ben and Komatsuzaki, Aran

GPT-J-6B-weights (6 files)
GPT-J-6B-weights_meta.sqlite 20.48kB
GPT-J-6B-weights_meta.xml 0.67kB
step_383500_slim.tar.zstd 9.41GB
Type: Dataset
Tags:

Bibtex:
@misc{gpt-j,
  author       = {Wang, Ben and Komatsuzaki, Aran},
  title        = {GPT-J-6B: A 6 Billion Parameter Autoregressive Language Model},
  howpublished = {\url{https://github.com/kingoflolz/mesh-transformer-jax}},
  year         = {2021},
  month        = {May},
  license      = {Apache-2.0},
  url          = {https://github.com/kingoflolz/mesh-transformer-jax}
}