Ollama
- have used this in a side project, but only on localhost; i don't know how to deploy it on a server and connect it to a website.
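a minimal sketch of how a website backend could talk to a deployed Ollama server: Ollama exposes an HTTP API (POST `/api/generate`), so the website just needs to reach that port. the host URL and model name below are placeholders, not the real deployment:

```python
import json
import urllib.request

# assumption: Ollama's default port is 11434; replace localhost with the server's address
OLLAMA_URL = "http://localhost:11434"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    # Ollama's /api/generate endpoint; stream=False returns one JSON object
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(model: str, prompt: str) -> str:
    # only works when an Ollama server is actually running at OLLAMA_URL
    with urllib.request.urlopen(build_generate_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

the website's backend would call `ask(...)` per user question; the frontend never talks to Ollama directly.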
ollama 2
- found an open source ai model (uncensored ftw)
- link to the ollama repo: https://github.com/jmorganca/ollama
Pros:
- understands general questions and answers them without beating around the bush
- answers specific questions, but that could also be a coincidence
Cons:
- tried running the ai model on my pi, but 4 gigs of ram wasn't enough to run it.
- it even maxed out the swap partition.
- where will i run the model?
- is mr. mort's server powerful enough?
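rough back-of-napkin math for why the pi choked: a quantized model needs about params × bits / 8 bytes just for the weights, before the kv cache and os overhead. the 4-bit figure and the ~7b model size are assumptions, not something the notes confirm:

```python
def weights_gb(params_billion: float, bits: int = 4) -> float:
    # rough weight size in GB: parameters (billions) * bits-per-weight / 8 bits-per-byte
    return params_billion * bits / 8

# assuming the model was ~7B at 4-bit: ~3.5 GB for weights alone,
# which leaves almost nothing on a 4 GB pi once the OS is running
print(round(weights_gb(7), 2))    # 3.5
# a 2.7B model at 4-bit is ~1.35 GB, in the ballpark of a ~1.6 GB download
print(round(weights_gb(2.7), 2))  # 1.35
```

so the server question comes down to free ram: it needs comfortably more than the weight size, not exactly it.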
phi
- i found a 2.7b model that is 1.6 gigs, which will probably work on a pi and possibly on a server
example responses from the model:
response time is under a second and the model is accurate. this might be the model that i will use.
cons:
- talks too much
- doesn't know what to say
- provides false information
- doesn't understand general questions