Replies: 9 comments 26 replies
- Could you please refer to this issue?
- Another issue comes to mind: locally deployed inference servers generally don't support concurrent requests, so local inference may really need a separate adaptation.
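A minimal sketch of the concern above: the client can still issue requests concurrently, and a server that handles only one generation at a time simply processes them one after another. `query_llm` here is a hypothetical placeholder, not the project's actual client code.

```python
from concurrent.futures import ThreadPoolExecutor

def query_llm(prompt: str) -> str:
    # Placeholder for a real call to an OpenAI-compatible endpoint
    # (e.g. POST http://127.0.0.1:11434/v1/chat/completions).
    # A local server that serializes generation will queue concurrent
    # requests rather than run them in parallel.
    return f"response to: {prompt}"

def query_batch(prompts: list[str]) -> list[str]:
    # Issue the requests concurrently; results come back in input order.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(query_llm, prompts))
```

Even if the server is strictly sequential, this client-side pattern is harmless: throughput just degrades to one request at a time.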
- Thanks. I tried deploying llama-3.1:8b locally with ollama; this is one of the supported models you mentioned. After starting the model, running `python main.py llm_client.base_url=http://127.0.0.1:11434/v1` returns the following error: Refer to the format of a trivial design above. Be very creative and give
- Adding the model name made it run :) Could you please explain model selection: are there requirements on model size, and can only the listed models be used? For example, for Qwen only Qwen2-72B is listed; are there other suitable Qwen models on https://ollama.com/library? Ideally under 14B.
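The fix described above ("adding the model name") might look like the following. This is a hedged sketch: the `llm_client.model` override key is an assumption based on the `llm_client.base_url` override shown earlier; check the repo's `cfg/` directory for the actual key name.

```shell
# Start the model with Ollama, then point the client at its
# OpenAI-compatible endpoint and name the model explicitly.
# `llm_client.model` is an assumed override key; verify it in cfg/.
ollama run llama3.1:8b &
python main.py \
  llm_client.base_url=http://127.0.0.1:11434/v1 \
  llm_client.model=llama3.1:8b
```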
- Also, when running main.py directly to solve the TSP problem, is the result below as expected? [] Average for 20: 3.8658526882735593
- What is the known optimal value for this problem? Using qwen2.5, the solution I got is noticeably better: [] Average for 20: 3.9038538902268956
- Hello, the example uses the tsp_aco problem. When solving other problems such as tsp_gls, is it enough to just change the problem type, or do additional parameters need to be added?
- Which config file specifies whether the problem maximizes or minimizes the objective function? Setting obj_type in cfg/problem does not seem to take effect.
- In the dpp_ga example, to optimize the crossover and mutation functions at the same time, should func_name in cfg list both function names, crossover and mutation? And in prompts, should the information for both functions go into the desc, signature, and func files together, or does each function get its own set of those three files? What is the naming convention?
- Hello, can we possibly run ReEvo using local LLM models via Ollama? Thank you.