This is an experimental project to evaluate LLM capabilities to analyze OpenAM configuration, spot potential issues, and suggest improvements.
The project fetches the configuration, generates a prompt, and sends it to an LLM for analysis.
Requirements: JDK 17 or newer
- Adjust the configuration settings defined in the `application.yml` file according to your environment
- Build the project with the command `./mvnw clean package`
- Run the project with the command `java -jar ./target/openam-ai-analyzer-*.jar --realm=root`
| Flag | Description |
|---|---|
| `--analyze` | `modules`: authentication modules analysis (default); `flows`: authentication chains analysis |
| `--realm` | Sets the realm for analysis (default: `root`) |
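For example, the flags above can be combined like this (the jar name comes from the build step; `myrealm` is a placeholder realm name):

```shell
# Analyze authentication modules in the root realm (the defaults)
java -jar ./target/openam-ai-analyzer-*.jar

# Analyze authentication chains in a specific realm
java -jar ./target/openam-ai-analyzer-*.jar --analyze=flows --realm=myrealm
```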
| Option | Description |
|---|---|
| `spring.ai.openai.base_url` | Base URL of the LLM API |
| `spring.ai.openai.api-key` | Model API key |
| `spring.ai.openai.chat.options.model` | LLM model name |
| `spring.ai.openai.chat.options.temperature` | Temperature; the lower the temperature, the more deterministic the response from the LLM |
| `prompt.system` | General system prompt |
| `prompt.modules.user` | User prompt for analyzing authentication modules |
| `prompt.modules.task` | LLM task prompt for analyzing authentication modules |
| `prompt.flows.user` | User prompt for analyzing authentication chains |
| `prompt.flows.task` | LLM task prompt for analyzing authentication chains |
| `openam.url` | OpenAM URL |
| `openam.login` | OpenAM account login |
| `openam.password` | OpenAM account password |
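Tying the options together, a minimal `application.yml` might look like the sketch below. All values are placeholders, and the prompt texts are hypothetical; only the property names come from the table above.

```yaml
spring:
  ai:
    openai:
      base_url: https://api.openai.com      # placeholder LLM API base URL
      api-key: ${OPENAI_API_KEY}            # API key, e.g. taken from an environment variable
      chat:
        options:
          model: gpt-4o                     # placeholder model name
          temperature: 0.2                  # lower values give more deterministic analysis

openam:
  url: https://openam.example.com/openam    # placeholder OpenAM URL
  login: amadmin                            # placeholder account login
  password: ${OPENAM_PASSWORD}              # placeholder account password

prompt:
  system: "You are an OpenAM configuration reviewer."        # hypothetical prompt text
  modules:
    user: "Here is the authentication module configuration."
    task: "Spot potential issues and suggest improvements."
  flows:
    user: "Here is the authentication chain configuration."
    task: "Spot potential issues and suggest improvements."
```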