GPT-NeoX-20B
#27 opened 3 months ago by Shubham1611

How to use this with Ollama
#26 opened 3 months ago by Pawankumar9413

Upload FlaxGPTNeoXForCausalLM
1 reply · #24 opened over 1 year ago by heegyu

I have been asked to put ketchup in pie and take vitamin S!
#21 opened almost 2 years ago by sreeparna

Max context length/input token length
#20 opened almost 2 years ago by gsaivinay

Is it possible to train this model on a commercially available cloud machine?
1 reply · #19 opened almost 2 years ago by Walexum

<Response [422]>
#18 opened almost 2 years ago by skrishna

The generated results from the Inference API and the webpage are very different! Is the model called from the API the same as the one called from the webpage?
#17 opened almost 2 years ago by zouhanyi

Fine-Tuning GPT-NeoX-20B using Hugging Face Transformers
1 reply · #16 opened about 2 years ago by Dulanjaya

Unusual behaviour with inference using the transformers library
1 reply · #15 opened about 2 years ago by vmajor