
This seems like a great step. I've been able to run Stable Diffusion locally, but with an older GPU, none of the LLMs will run for me since I don't have enough VRAM.

Oddly, I don't see a VRAM requirement listed. Does anyone know the minimum?



> with an older GPU none of the LLMs will run for me since I don’t have enough VRAM.

I think you can run Pygmalion 6B on an 8GB GPU using DeepSpeed.

It's very underwhelming if you expect something like ChatGPT, though.
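
For reference, here's roughly what that setup looks like with Hugging Face transformers plus DeepSpeed's inference engine. This is just a sketch, not something I've verified on an 8GB card: the PygmalionAI/pygmalion-6b model id, prompt, and generation settings are assumptions, and since 6B parameters in fp16 is already ~12GB of weights, actually fitting in 8GB of VRAM would need DeepSpeed's ZeRO offload or quantization on top of this.

    import torch
    import deepspeed
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Assumed Hugging Face repo id for Pygmalion 6B (a GPT-J-based model).
    model_name = "PygmalionAI/pygmalion-6b"

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name, torch_dtype=torch.float16
    )

    # Wrap the model in DeepSpeed's inference engine. fp16 halves the
    # ~24GB fp32 footprint to ~12GB; fitting under 8GB of VRAM would
    # still require ZeRO CPU offload or quantization, not shown here.
    engine = deepspeed.init_inference(
        model,
        mp_size=1,                        # single GPU, no tensor parallelism
        dtype=torch.float16,
        replace_with_kernel_inject=True,  # use DeepSpeed's fused kernels
    )

    prompt = "You are a friendly chatbot. User: Hello!"
    inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
    outputs = engine.module.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))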




