
Erupt-AI use langchain4j#342

Merged
erupts merged 11 commits into develop from erupt-ai-langchain4j
Mar 4, 2026
Conversation

@erupts (Owner) commented Mar 4, 2026

Function Call changed to @Tool-annotated methods

@erupts erupts merged commit 673a0f8 into develop Mar 4, 2026
1 check failed
@erupts erupts deleted the erupt-ai-langchain4j branch March 4, 2026 11:21
@erupts erupts requested a review from Copilot March 4, 2026 11:22
Copilot AI left a comment

Pull request overview

This PR migrates the erupt-ai module from a custom Function Call implementation to LangChain4j tools (@Tool / ToolExecutionRequest) and refactors the LLM integration accordingly, while also doing minor template/docs tweaks.

Changes:

  • Replaced custom AI Function Call framework (AiFunctionManager, AiFunctionCall, OpenAI streaming POJOs) with LangChain4j tool registration/execution (AiToolboxManager, @Tool).
  • Refactored LLM core + vendor adapters (OpenAI-compatible + Claude/Gemini/Ollama) to use LangChain4j chat/streaming models and new SSE streaming behavior.
  • Renamed chat entities (Chat → AiChat, ChatMessage → AiChatMessage) and adjusted controllers/services to use the new models.

Reviewed changes

Copilot reviewed 42 out of 44 changed files in this pull request and generated 9 comments.

Show a summary per file
File Description
erupt-tpl-frame/vue/README.md Adds Vue render README stub
erupt-tpl-frame/react/README.md Adds React render README stub
erupt-tpl-frame/angular/README.md Adds Angular render README + reference link
erupt-core/src/main/java/xyz/erupt/core/util/DateUtil.java Adjusts ISO_8601 format constant
erupt-core/src/main/java/xyz/erupt/core/service/EruptCoreService.java Comment formatting only
erupt-core/src/main/java/xyz/erupt/core/controller/EruptDataController.java Line-ending/formatting change + method rename (onChange → onchange)
erupt-ai/src/main/java/xyz/erupt/ai/tool/EruptAiToolbox.java New LangChain4j @Tool toolbox methods (schema/user/data/module list)
erupt-ai/src/main/java/xyz/erupt/ai/tool/AiToolboxManager.java New toolbox scanner + tool invocation dispatcher
erupt-ai/src/main/java/xyz/erupt/ai/service/LLMService.java Switches message model to LangChain4j + integrates tool execution + new SSE chunking
erupt-ai/src/main/java/xyz/erupt/ai/model/AiChat.java Renames chat entity (Chat → AiChat) and drill link to AiChatMessage
erupt-ai/src/main/java/xyz/erupt/ai/model/AiChatMessage.java Renames chat message entity + token type update
erupt-ai/src/main/java/xyz/erupt/ai/llm/OpenRouter.java Updates OpenAI-compatible endpoint building (chatApiPoint)
erupt-ai/src/main/java/xyz/erupt/ai/llm/Qwen.java Updates OpenAI-compatible endpoint building (chatApiPoint)
erupt-ai/src/main/java/xyz/erupt/ai/llm/GLM.java Updates OpenAI-compatible endpoint building (chatApiPoint)
erupt-ai/src/main/java/xyz/erupt/ai/llm/Doubao.java Adds new OpenAI-compatible provider adapter
erupt-ai/src/main/java/xyz/erupt/ai/llm/Ollama.java Replaces OpenAI-compatible HTTP impl with LangChain4j Ollama models
erupt-ai/src/main/java/xyz/erupt/ai/llm/Gemini.java Replaces OpenAI-compatible HTTP impl with LangChain4j Gemini models
erupt-ai/src/main/java/xyz/erupt/ai/llm/Claude.java Replaces OpenAI-compatible HTTP impl with LangChain4j Anthropic models
erupt-ai/src/main/java/xyz/erupt/ai/core/SseListener.java Refactors streaming listener payload to LangChain4j AiMessage + TokenUsage
erupt-ai/src/main/java/xyz/erupt/ai/core/OpenAi.java Replaces custom OkHttp OpenAI client with LangChain4j OpenAI models
erupt-ai/src/main/java/xyz/erupt/ai/core/LlmRequest.java Changes numeric config types (Float → Double)
erupt-ai/src/main/java/xyz/erupt/ai/core/LlmConfig.java Changes numeric config types (Float → Double)
erupt-ai/src/main/java/xyz/erupt/ai/core/LlmCore.java Adds LangChain4j chat/streaming integration + tool specification attachment
erupt-ai/src/main/java/xyz/erupt/ai/controller/ChatController.java Switches controllers to AiChat / AiChatMessage
erupt-ai/src/main/java/xyz/erupt/ai/controller/McpController.java Switches MCP tool listing/calls to LangChain4j tools
erupt-ai/src/main/java/xyz/erupt/ai/annotation/AiToolbox.java Replaces old parameter annotation with type-level toolbox marker
erupt-ai/src/main/java/xyz/erupt/ai/config/AiProp.java Adds SSE chunk size + delay config
erupt-ai/pom.xml Replaces OkHttp dependency with LangChain4j modules
erupt-ai-web/src/App.vue Adds error callback to messages fetch
erupt-ai/src/main/java/xyz/erupt/ai/pojo/ChatUsage.java Deletes old OpenAI response POJO
erupt-ai/src/main/java/xyz/erupt/ai/pojo/ChatCompletionStreamResponse.java Deletes old OpenAI response POJO
erupt-ai/src/main/java/xyz/erupt/ai/pojo/ChatCompletionResponse.java Deletes old OpenAI response POJO
erupt-ai/src/main/java/xyz/erupt/ai/pojo/ChatCompletionMessage.java Deletes old OpenAI request/role model
erupt-ai/src/main/java/xyz/erupt/ai/pojo/ChatCompletion.java Deletes old OpenAI request POJO
erupt-ai/src/main/java/xyz/erupt/ai/constants/MessageRole.java Deletes old role enum
erupt-ai/src/main/java/xyz/erupt/ai/call/AiFunctionManager.java Deletes custom function-call manager
erupt-ai/src/main/java/xyz/erupt/ai/call/AiFunctionCall.java Deletes custom function-call interface
erupt-ai/src/main/java/xyz/erupt/ai/call/ParamPromptTemplate.java Deletes function-call param prompting helper
erupt-ai/src/main/java/xyz/erupt/ai/call/impl/EruptUserInfo.java Deletes old function-call implementation
erupt-ai/src/main/java/xyz/erupt/ai/call/impl/EruptSchema.java Deletes old function-call implementation
erupt-ai/src/main/java/xyz/erupt/ai/call/impl/EruptModuleInfo.java Deletes old function-call implementation
erupt-ai/src/main/java/xyz/erupt/ai/call/impl/EruptList.java Deletes old function-call implementation
erupt-ai/src/main/java/xyz/erupt/ai/call/impl/EruptDataQuery.java Deletes old function-call implementation


Comment on lines +60 to +71:

```java
    chatMessages.add(0, SystemMessage.from(EruptSpringUtil.getBean(AiProp.class).getSystemPrompt()));
    return chatModel.chat(chatMessages).aiMessage().text();
}

public void chatSse(LlmRequest llmRequest, List<ChatMessage> chatMessages, Consumer<SseListener> listener) {
    StreamingChatModel streamingChatModel = this.buildStreamingChatModel(llmRequest, chatMessages, listener);
    chatMessages.add(0, SystemMessage.from(EruptSpringUtil.getBean(AiProp.class).getSystemPrompt()));
    List<ToolSpecification> specs = new ArrayList<>();
    for (Method method : AiToolboxManager.getAiMethodMap().values()) {
        specs.add(ToolSpecifications.toolSpecificationFrom(method));
    }
    ChatRequest request = ChatRequest.builder().messages(chatMessages).toolSpecifications(specs).build();
```
Copilot AI Mar 4, 2026

LlmCore.chat() mutates the provided chatMessages list by inserting the system prompt at index 0. If the same list is reused for subsequent calls (e.g., after tool execution), this will accumulate duplicate system prompts and change message ordering. Prefer creating a new list (or only prepending the system prompt when it is not already present) to avoid side effects.

Suggested change:

```diff
-chatMessages.add(0, SystemMessage.from(EruptSpringUtil.getBean(AiProp.class).getSystemPrompt()));
-return chatModel.chat(chatMessages).aiMessage().text();
-}
-public void chatSse(LlmRequest llmRequest, List<ChatMessage> chatMessages, Consumer<SseListener> listener) {
-    StreamingChatModel streamingChatModel = this.buildStreamingChatModel(llmRequest, chatMessages, listener);
-    chatMessages.add(0, SystemMessage.from(EruptSpringUtil.getBean(AiProp.class).getSystemPrompt()));
-    List<ToolSpecification> specs = new ArrayList<>();
-    for (Method method : AiToolboxManager.getAiMethodMap().values()) {
-        specs.add(ToolSpecifications.toolSpecificationFrom(method));
-    }
-    ChatRequest request = ChatRequest.builder().messages(chatMessages).toolSpecifications(specs).build();
+List<ChatMessage> messagesWithSystem = new ArrayList<>(chatMessages.size() + 1);
+messagesWithSystem.add(SystemMessage.from(EruptSpringUtil.getBean(AiProp.class).getSystemPrompt()));
+messagesWithSystem.addAll(chatMessages);
+return chatModel.chat(messagesWithSystem).aiMessage().text();
+}
+public void chatSse(LlmRequest llmRequest, List<ChatMessage> chatMessages, Consumer<SseListener> listener) {
+    StreamingChatModel streamingChatModel = this.buildStreamingChatModel(llmRequest, chatMessages, listener);
+    List<ChatMessage> messagesWithSystem = new ArrayList<>(chatMessages.size() + 1);
+    messagesWithSystem.add(SystemMessage.from(EruptSpringUtil.getBean(AiProp.class).getSystemPrompt()));
+    messagesWithSystem.addAll(chatMessages);
+    List<ToolSpecification> specs = new ArrayList<>();
+    for (Method method : AiToolboxManager.getAiMethodMap().values()) {
+        specs.add(ToolSpecifications.toolSpecificationFrom(method));
+    }
+    ChatRequest request = ChatRequest.builder().messages(messagesWithSystem).toolSpecifications(specs).build();
```

```java
        .name(code)
        .arguments(GsonFactory.getGson().toJson(params))
        .build();
return (String) AiToolboxManager.invoke(request);
```
Copilot AI Mar 4, 2026

mcpCall() casts the invoked tool result to String. If any @Tool method returns a non-String (e.g., an object or list), this will throw ClassCastException and fail the MCP request. Consider returning String.valueOf(AiToolboxManager.invoke(request)) or enforcing a String-only return contract when registering tools.

Suggested change:

```diff
-return (String) AiToolboxManager.invoke(request);
+return String.valueOf(AiToolboxManager.invoke(request));
```

Comment on lines +49 to +52:

```java
@Tool("Query erupt model data")
public String eruptDataQuery(@P("HQL (Hibernate Query Language)") String hql) {
    List<?> result = eruptDao.getEntityManager().createQuery(hql).getResultList();
    return GsonFactory.getGson().toJson(result);
```
Copilot AI Mar 4, 2026

eruptDataQuery executes arbitrary HQL received from the model/client and returns the full result list without limits. This enables data exfiltration and potentially expensive queries (cartesian products / full table scans) if tool calling is enabled. Consider restricting to a safe query subset, enforcing tenant/permission checks, and applying maxResults/timeouts (or removing this tool from MCP/LLM exposure by default).
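The mitigations the reviewer lists can be sketched as a pre-flight guard on the HQL string plus a result cap applied before the query runs. A minimal stdlib-only illustration (the `MAX_ROWS` value and the keyword blocklist are assumptions for the sketch, not part of this PR):

```java
import java.util.Locale;
import java.util.Set;

public class HqlGuard {

    // Assumed cap on rows returned to the model; tune per deployment.
    public static final int MAX_ROWS = 200;

    // Keywords that indicate a mutating or deliberately expensive statement we refuse to run.
    private static final Set<String> FORBIDDEN = Set.of("update", "delete", "insert", "join fetch");

    /** Returns true only for statements that look like a plain read-only query. */
    public static boolean isReadOnly(String hql) {
        String normalized = hql.trim().toLowerCase(Locale.ROOT);
        if (!(normalized.startsWith("select") || normalized.startsWith("from"))) {
            return false;
        }
        for (String kw : FORBIDDEN) {
            if (normalized.contains(kw)) {
                return false;
            }
        }
        return true;
    }
}
```

In `eruptDataQuery`, such a guard would sit before `createQuery(hql)`, alongside `query.setMaxResults(HqlGuard.MAX_ROWS)` and a JPA query timeout hint, so a rejected or oversized query never reaches the database. A blocklist is only a first line of defense; tenant/permission checks as the reviewer suggests remain necessary.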

Comment on lines +38 to +45:

```java
if (StringUtils.isNotBlank(request.arguments())) {
    JsonObject jsonObject = GsonFactory.getGson().fromJson(request.arguments(), JsonObject.class);
    for (int i = 0; i < method.getParameters().length; i++) {
        String paramName = method.getParameters()[i].getName();
        if (jsonObject.has(paramName)) {
            args[i] = GsonFactory.getGson().fromJson(jsonObject.get(paramName), method.getGenericParameterTypes()[i]);
        }
    }
```
Copilot AI Mar 4, 2026

AiToolboxManager.invoke() extracts argument values using reflection parameter names (method.getParameters()[i].getName()). LangChain4j tool specs typically use @P values as the JSON argument keys, so parameters annotated with @P (e.g. "Erupt Name") will not be found and args will stay null. Use the @P value as the lookup key (fallback to reflection name) and keep the same key naming strategy used when building the ToolSpecification.
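The lookup-key strategy the reviewer describes can be shown with plain reflection. Here `@P` is a local stand-in for LangChain4j's annotation and a `Map` stands in for the parsed Gson `JsonObject`; whether the real tool spec keys arguments by `@P` value is the review's premise, not something this sketch verifies:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.lang.reflect.Parameter;
import java.util.Map;

public class ToolArgBinder {

    // Local stand-in for dev.langchain4j.agent.tool.P, for illustration only.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.PARAMETER)
    public @interface P { String value(); }

    // Hypothetical tool method: the lookup key should be "hql", not the reflection name.
    public static String eruptDataQuery(@P("hql") String hql) { return hql; }

    /** Resolve the JSON key for a parameter: @P value first, reflection name as fallback. */
    public static String argKey(Parameter parameter) {
        P p = parameter.getAnnotation(P.class);
        return (p != null && !p.value().isEmpty()) ? p.value() : parameter.getName();
    }

    /** Bind arguments from a parsed-JSON map using the same key strategy as the tool spec. */
    public static Object[] bind(Method method, Map<String, Object> json) {
        Parameter[] params = method.getParameters();
        Object[] args = new Object[params.length];
        for (int i = 0; i < params.length; i++) {
            args[i] = json.get(argKey(params[i]));
        }
        return args;
    }

    /** Demo: bind the example method from a map keyed by the @P value. */
    public static Object demoBind() {
        try {
            Method m = ToolArgBinder.class.getMethod("eruptDataQuery", String.class);
            return bind(m, Map.of("hql", "select 1"))[0];
        } catch (ReflectiveOperationException e) {
            return null;
        }
    }
}
```

The important property is that the key resolution (`argKey`) is shared between spec construction and invocation, so whichever naming convention the spec advertises is also the one the binder looks up. Note that relying on `parameter.getName()` as a fallback requires compiling with `-parameters`, otherwise the names degrade to `arg0`, `arg1`, ….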

Comment on lines +57 to +60:

```java
for (Method method : clazz.getDeclaredMethods()) {
    if (method.isAnnotationPresent(Tool.class)) {
        aiMethodMap.put(method.getName(), method);
    }
```
Copilot AI Mar 4, 2026

aiMethodMap uses method.getName() as the key. If two @AiToolbox classes expose @Tool methods with the same name (or overloaded methods), later entries will overwrite earlier ones and tool execution will call the wrong method. Consider using a unique key (e.g., fully qualified "ClassName#methodName" or LangChain4j ToolSpecification name) and returning that as the tool name to the model/MCP.
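A collision-resistant registry key is straightforward to derive from the declaring class. A sketch with two hypothetical toolbox classes exposing a same-named method (all class and method names here are illustrative, not from the PR):

```java
import java.lang.reflect.Method;
import java.util.LinkedHashMap;
import java.util.Map;

public class ToolKeyDemo {

    /** Collision-resistant tool key: SimpleClassName#methodName. */
    public static String toolKey(Method method) {
        return method.getDeclaringClass().getSimpleName() + "#" + method.getName();
    }

    // Two hypothetical toolbox classes that would collide under a name-only key.
    public static class UserToolbox { public String list() { return "users"; } }
    public static class DataToolbox { public String list() { return "rows"; } }

    /** Register declared methods of each toolbox; same-named methods no longer overwrite. */
    public static Map<String, Method> register(Class<?>... toolboxes) {
        Map<String, Method> map = new LinkedHashMap<>();
        for (Class<?> clazz : toolboxes) {
            for (Method m : clazz.getDeclaredMethods()) {
                map.put(toolKey(m), m);
            }
        }
        return map;
    }
}
```

If the key also doubles as the tool name advertised to the model, bear in mind that some providers restrict tool-name characters, so a separator like `_` may be safer than `#` in that role.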

Comment on lines +66 to +71:

```java
chatMessages.add(0, SystemMessage.from(EruptSpringUtil.getBean(AiProp.class).getSystemPrompt()));
List<ToolSpecification> specs = new ArrayList<>();
for (Method method : AiToolboxManager.getAiMethodMap().values()) {
    specs.add(ToolSpecifications.toolSpecificationFrom(method));
}
ChatRequest request = ChatRequest.builder().messages(chatMessages).toolSpecifications(specs).build();
```
Copilot AI Mar 4, 2026

LlmCore.chatSse() always attaches toolSpecifications, even when erupt.ai.enableFunctionCall is false. This still advertises tools to the model and can cause tool execution requests that the rest of the pipeline won’t execute. Gate toolSpecifications creation on AiProp#isEnableFunctionCall (or pass a flag in) so disabling function calls fully disables tool usage.

Suggested change:

```diff
-chatMessages.add(0, SystemMessage.from(EruptSpringUtil.getBean(AiProp.class).getSystemPrompt()));
-List<ToolSpecification> specs = new ArrayList<>();
-for (Method method : AiToolboxManager.getAiMethodMap().values()) {
-    specs.add(ToolSpecifications.toolSpecificationFrom(method));
-}
-ChatRequest request = ChatRequest.builder().messages(chatMessages).toolSpecifications(specs).build();
+AiProp aiProp = EruptSpringUtil.getBean(AiProp.class);
+chatMessages.add(0, SystemMessage.from(aiProp.getSystemPrompt()));
+ChatRequest.ChatRequestBuilder requestBuilder = ChatRequest.builder().messages(chatMessages);
+if (aiProp.isEnableFunctionCall()) {
+    List<ToolSpecification> specs = new ArrayList<>();
+    for (Method method : AiToolboxManager.getAiMethodMap().values()) {
+        specs.add(ToolSpecifications.toolSpecificationFrom(method));
+    }
+    requestBuilder.toolSpecifications(specs);
+}
+ChatRequest request = requestBuilder.build();
```

Comment on lines +119 to +138:

```java
for (ToolExecutionRequest request : it.getAiMessage().toolExecutionRequests()) {
    try {
        Object rtn = AiToolboxManager.invoke(request);
        if (null != rtn) {
            functionCallRtn.add(rtn.toString());
        }
    } catch (Exception e) {
        log.error("Function call error", e);
        this.stopSse(emitter, chatMessage, llmModal, e.toString());
    }
}
if (functionCallRtn.isEmpty()) {
    message = "Completed !";
} else {
    for (String s : functionCallRtn) {
        chatMessages.add(AiMessage.from(s));
    }
    message = llm.chat(llmRequest, chatMessages);
    this.sendSseMessage(emitter, message);
}
```
Copilot AI Mar 4, 2026

In the tool-execution error path, stopSse() completes the emitter but execution continues (no return/break) and the code later merges/persists messages and completes the emitter again. After calling stopSse(), the callback should immediately abort further processing to avoid duplicate DB writes and SSE lifecycle errors.
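The fix is a control-flow change rather than an API change: after `stopSse(...)`, the callback must return. A stdlib-only sketch with stand-in methods for the real emitter and persistence calls in LLMService:

```java
public class SseAbortDemo {

    public static boolean stopped = false;
    public static int persisted = 0;

    // Stand-in for stopSse(emitter, chatMessage, llmModal, error): completes the emitter.
    static void stopSse(String error) { stopped = true; }

    // Stand-in for the merge/persist + emitter.complete() tail of the callback.
    static void persistAndComplete(String message) { persisted++; }

    /** Process one streamed completion; abort immediately when a tool call fails. */
    public static void onCompletion(boolean toolFails) {
        stopped = false;
        persisted = 0;
        try {
            if (toolFails) {
                throw new IllegalStateException("tool invocation failed");
            }
        } catch (Exception e) {
            stopSse(e.toString());
            return; // the missing early exit: skip persistence and the second complete()
        }
        persistAndComplete("ok");
    }
}
```

Without the `return`, the error path would fall through to `persistAndComplete`, producing exactly the duplicate DB writes and double-completion the reviewer describes.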

Comment on lines +115 to +137:

```java
String message = it.getAiMessage().text();
if (aiProp.isEnableFunctionCall()) {
    if (it.getAiMessage().hasToolExecutionRequests()) {
        List<String> functionCallRtn = new ArrayList<>();
        for (ToolExecutionRequest request : it.getAiMessage().toolExecutionRequests()) {
            try {
                Object rtn = AiToolboxManager.invoke(request);
                if (null != rtn) {
                    functionCallRtn.add(rtn.toString());
                }
            } catch (Exception e) {
                log.error("Function call error", e);
                this.stopSse(emitter, chatMessage, llmModal, e.toString());
            }
        }
        if (functionCallRtn.isEmpty()) {
            message = "Completed !";
        } else {
            for (String s : functionCallRtn) {
                chatMessages.add(AiMessage.from(s));
            }
            message = llm.chat(llmRequest, chatMessages);
            this.sendSseMessage(emitter, message);
```
Copilot AI Mar 4, 2026

Tool execution results are appended back into the conversation as AiMessage (chatMessages.add(AiMessage.from(s))) and then sent to the model via llm.chat(...). LangChain4j tool calling expects tool results to be returned as tool-result messages associated with the original ToolExecutionRequest (often including request.id), otherwise the model may ignore the results or repeatedly request the same tool. Use the appropriate LangChain4j message type for tool execution results and include the tool name/request id when adding them to chatMessages.
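In LangChain4j, tool results are typically wrapped as `ToolExecutionResultMessage` instances carrying the originating request's id and tool name, rather than generic `AiMessage`s. A stdlib-only sketch of that pairing, with records standing in for the real LangChain4j types (names here are illustrative stand-ins):

```java
import java.util.ArrayList;
import java.util.List;

public class ToolResultPairing {

    // Minimal stand-ins for LangChain4j's ToolExecutionRequest / ToolExecutionResultMessage.
    public record ToolRequest(String id, String name, String arguments) {}
    public record ToolResult(String id, String toolName, String text) {}

    /** Pair a result with the id and name of the request that produced it. */
    public static ToolResult resultFor(ToolRequest request, String text) {
        return new ToolResult(request.id(), request.name(), text);
    }

    /** Execute requests and collect correctly attributed results for the follow-up chat call. */
    public static List<ToolResult> executeAll(List<ToolRequest> requests) {
        List<ToolResult> results = new ArrayList<>();
        for (ToolRequest r : requests) {
            // A real implementation would invoke the tool here; we attribute a placeholder.
            results.add(resultFor(r, "result-of-" + r.name()));
        }
        return results;
    }

    /** Demo: a single request whose result keeps its id and tool name. */
    public static ToolResult demo() {
        return executeAll(List.of(new ToolRequest("call_1", "eruptDataQuery", "{}"))).get(0);
    }
}
```

Because each result retains the request id, the model can match results to its own tool calls on the next turn instead of treating them as ordinary assistant text.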

Comment on lines +156 to +163:

```java
if (null != p) {
    description = p.value();
    required.add(parameter.getName());
}
McpTool.SchemaProperties schema = new McpTool.SchemaProperties();
schema.setType(McpUtil.toMcp(parameter.getType()));
schema.setDescription(description);
mcpTool.getInputSchema().getProperties().put(parameter.getName(), schema);
```
Copilot AI Mar 4, 2026

MCP tool schemas are built using reflection parameter names (parameter.getName()) for both the JSON property keys and the required list, while @P is used as a description. If @P is intended to define the tool argument name (as in LangChain4j), MCP clients will send the wrong keys and AiToolboxManager.invoke() won’t be able to bind arguments. Use @P value as the schema property key/required entry (fallback to parameter.getName() when @P is absent), and keep it consistent with tool invocation argument parsing.

Suggested change:

```diff
-if (null != p) {
-    description = p.value();
-    required.add(parameter.getName());
-}
-McpTool.SchemaProperties schema = new McpTool.SchemaProperties();
-schema.setType(McpUtil.toMcp(parameter.getType()));
-schema.setDescription(description);
-mcpTool.getInputSchema().getProperties().put(parameter.getName(), schema);
+String argName = parameter.getName();
+if (null != p) {
+    description = p.value();
+    if (description != null && !description.isEmpty()) {
+        argName = description;
+    }
+    required.add(argName);
+}
+McpTool.SchemaProperties schema = new McpTool.SchemaProperties();
+schema.setType(McpUtil.toMcp(parameter.getType()));
+schema.setDescription(description);
+mcpTool.getInputSchema().getProperties().put(argName, schema);
```
