
Work with ollama #155 (Draft)

ketsapiwiq wants to merge 4 commits into joonspk-research:main from ketsapiwiq:llama

Conversation

@ketsapiwiq

No description provided.

@sjdthree

Hi -- just trying this now.

..., line 78, in <module>
    import ollama
ModuleNotFoundError: No module named 'ollama'

Just need to add `ollama` to requirements.txt.

Second error:

    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: model 'nomic-embed-text' not found, try pulling it first

Need to run:
ollama pull nomic-embed-text

and likewise:
ollama pull phi3:latest

but now I'm getting this error:
File "~/GitHub/generative_agents/reverie/backend_server/persona/prompt_template/run_gpt_prompt.py", line 380, in __func_clean_up
    duration = int(k[1].split(",")[0].strip())
IndexError: list index out of range

Any ideas?
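For what it's worth, that crash is a parsing assumption in __func_clean_up: the code splits the model's reply and indexes `k[1]`, so any reply that lacks the expected delimiter leaves `k` with a single element and raises IndexError. A minimal defensive sketch of the idea (the marker and the helper name here are hypothetical, not the repo's actual code):

```python
def parse_duration(raw, marker=":"):
    """Defensive version of `int(k[1].split(",")[0].strip())`.

    The original line assumes the model reply contains `marker`
    followed by a comma-separated tail; anything else crashes.
    Here we fall back to None instead of raising.
    """
    k = raw.split(marker)
    if len(k) < 2:
        return None  # reply didn't contain the marker at all
    head = k[1].split(",")[0].strip()
    return int(head) if head.isdigit() else None
```

A fallback like this (or a retry of the LLM call) is what the rewritten clean-up functions in the other fork effectively do; local models drift from the expected output format far more often than GPT-3.5 did.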

@sjdthree

> but now I'm getting this error: File "~/GitHub/generative_agents/reverie/backend_server/persona/prompt_template/run_gpt_prompt.py", line 380, in __func_clean_up duration = int(k[1].split(",")[0].strip()) IndexError: list index out of range
>
> Any ideas?

I restarted the backend and frontend servers and executed a `run 10`, and it ran through the steps but ended with the same list index out of range error.

@sjdthree

Hi @ketsapiwiq, it looks like this repo: https://github.com/drudilorenzo/generative_agents
has a ton of fixes and is more up to date. Can I request you post your PR to that repo?

@sjdthree

> Hi @ketsapiwiq it looks like this repo: https://github.com/drudilorenzo/generative_agents has a ton of fixes and is more up to date. Can I request you post your PR to that repo?

Specifically the fix-and-improve branch, which looks like the default.

@ketsapiwiq
Author

drudilorenzo#1
The fork you sent me doesn't have an ollama config, so it would need some manual merging.

@sjdthree

> drudilorenzo#1 This fork you sent me doesn't have an ollama config so it would need some manual merging.

Yes, it's only fixes on the original repo, which I understand is no longer updated. @drudilorenzo appears to have fixed a bunch in the repo I sent.

After looking more closely at the original repo, it looks like there were some unfinished functions like __func_clean_up, and those appear to be completely rewritten in @drudilorenzo's repo. That is the source of many posted issues in the original repo.

I hope it's not too heavy of a manual merge.

@ketsapiwiq
Author

ketsapiwiq commented May 20, 2024

New PR: drudilorenzo#2
The ketsapiwiq:drudi-llama branch should have the embedding part, and there should be a WIP merge of PR #1 and drudilorenzo#1 to allow another OpenAI-compatible provider (ollama).
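For context on the embedding part: Ollama serves embeddings over a local HTTP endpoint, so swapping it in mostly means pointing the request at localhost. A standard-library-only sketch of the request shape (the endpoint and payload follow Ollama's documented /api/embeddings route; newer versions also expose /api/embed with an "input" field, so adjust to your install):

```python
import json

# Default local Ollama endpoint (assumption: standard port 11434).
OLLAMA_EMBED_URL = "http://localhost:11434/api/embeddings"

def build_embedding_request(text, model="nomic-embed-text"):
    """Build (url, body) for an Ollama embedding call.

    Sending is left out so the sketch stays self-contained; in practice
    you would POST `body` with urllib.request, or just use the `ollama`
    client library's ollama.embeddings(model=..., prompt=...) call.
    """
    body = json.dumps({"model": model, "prompt": text}).encode("utf-8")
    return OLLAMA_EMBED_URL, body
```

The model name must already be pulled locally (`ollama pull nomic-embed-text`), which is exactly the ResponseError reported earlier in this thread.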

