this post was submitted on 02 Jul 2023
10 points (100.0% liked)
LocalLLaMA
2249 readers
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
founded 1 year ago
How do I use ooba with ROCm? I looked at the Python file where you can pick AMD during install, and it just says "AMD not supported" and exits. I guess it just doesn't update webui.py when I update ooba? I also heard somewhere that llama.cpp with CLBlast wouldn't work with ooba, or am I wrong? And is koboldcpp worth a shot? I hear some people have success with it.
I can recommend kobold; it's a lot simpler to set up than ooba and usually runs faster too.
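If it helps, a minimal sketch of a koboldcpp setup with CLBlast (which is how it talks to AMD GPUs via OpenCL). The model path and layer count below are placeholders, not specific recommendations:

```shell
# Clone and build koboldcpp with CLBlast support enabled
git clone https://github.com/LostRuins/koboldcpp
cd koboldcpp
make LLAMA_CLBLAST=1

# Launch with a ggml model; --useclblast takes <platform_id> <device_id>,
# and --gpulayers offloads that many layers to the GPU (tune for your VRAM)
python koboldcpp.py /path/to/model.ggml.bin --useclblast 0 0 --gpulayers 24
```

Once it's up, it serves a web UI and API on localhost.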
I will try that once I'm home! Thanks for the suggestions. Can I also use kobold in SillyTavern? IIRC there was an option for KoboldAI or something; is that koboldcpp, or what does that option do?
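For reference, koboldcpp exposes a KoboldAI-compatible API, so in SillyTavern you can select the KoboldAI API type and point it at koboldcpp's address. A sketch, assuming koboldcpp's default port:

```shell
# koboldcpp listens on port 5001 by default; SillyTavern's KoboldAI
# API URL field would then be:
#   http://localhost:5001
# You can verify the server is up before connecting:
curl http://localhost:5001
```

So yes, the "KoboldAI" option is the one that works with koboldcpp.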
EDIT: I got it working and it's wonderful, thank you for suggesting this :) I had some difficulties setting it up, especially with opencl-mesa, since I had to install opencl-amd and then find out the device ID and so on, but once it was working it's great!
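For anyone else stuck on the device-ID step, a sketch of how it can be found, assuming the `clinfo` utility is installed (the indices in the example are illustrative, not fixed values):

```shell
# List OpenCL platforms and devices compactly; the indices map to
# koboldcpp's --useclblast <platform_id> <device_id> arguments
clinfo -l

# Example: if the AMD GPU shows up as platform 0, device 0:
python koboldcpp.py /path/to/model.ggml.bin --useclblast 0 0
```

If `clinfo` lists no devices, the OpenCL ICD (e.g. opencl-amd instead of opencl-mesa, as above) is the first thing to check.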