LLaMA Weights
Facebook

LLaMA/ (26 files)
  tokenizer_checklist.chk      0.05 kB
  tokenizer.model              499.72 kB
  llama.sh                     1.93 kB
  7B/params.json               0.10 kB
  7B/consolidated.00.pth       13.48 GB
  7B/checklist.chk             0.10 kB
  13B/params.json              0.10 kB
  13B/consolidated.00.pth      13.02 GB
  13B/consolidated.01.pth      13.02 GB
  13B/checklist.chk            0.15 kB
  30B/params.json              0.10 kB
  30B/consolidated.00.pth      16.27 GB
  30B/consolidated.01.pth      16.27 GB
  30B/consolidated.02.pth      16.27 GB
  30B/consolidated.03.pth      16.27 GB
  30B/checklist.chk            0.26 kB
  65B/params.json              0.10 kB
  65B/consolidated.00.pth      16.32 GB
  65B/consolidated.01.pth      16.32 GB
  65B/consolidated.02.pth      16.32 GB
  65B/consolidated.03.pth      16.32 GB
  65B/consolidated.04.pth      16.32 GB
  65B/consolidated.05.pth      16.32 GB
  65B/consolidated.06.pth      16.32 GB
  65B/consolidated.07.pth      16.32 GB
  65B/checklist.chk            0.48 kB
Type: Dataset
Tags: chatgpt nlp llama

Bibtex:
@misc{llama_weights,
  title    = {LLaMA Weights},
  author   = {Facebook},
  url      = {https://github.com/Elyah2035/llama-dl},
  abstract = {https://github.com/facebookresearch/llama
              https://github.com/Elyah2035/llama-dl},
  keywords = {chatgpt nlp llama}
}

