Adding files to deploy FinanceAgent application on ROCm vLLM #1890

Open
artem-astafev wants to merge 36 commits into main
Conversation

artem-astafev
Contributor

Description

Adding files to deploy the FinanceAgent application on ROCm vLLM.

Issues

The FinanceAgent application needs to support deployment on ROCm vLLM.

Type of change

  • New feature (non-breaking change which adds new functionality)
Dependencies

The https://github.com/opea-project/GenAIComps repo is needed to build the CI images.
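For illustration, a sketch of how the dependency is typically pulled in when building the images locally; the agent image build command is the one used by the test script, and the proxy build args are optional:

```bash
# Clone the GenAIComps repo needed to build the CI images
git clone https://github.com/opea-project/GenAIComps.git
cd GenAIComps

# Example: build the agent image used by FinanceAgent
# (same command the test script runs; proxy build args are optional)
docker build -t opea/agent:latest -f comps/agent/src/Dockerfile . \
    --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy
```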

Tests

Testing was performed manually and by running the script FinanceAgent/tests/test_compose_on_vllm_rocm.sh. This script is similar to the one that runs FinanceAgent testing on an Intel Gaudi HPU.
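For reference, a minimal sketch of running the test locally; the environment variables below are assumptions (the ROCm script mirrors the Gaudi one, which typically expects a HuggingFace token and a work directory):

```bash
# Assumed prerequisites: Docker with ROCm GPU access and a HuggingFace token.
export HF_TOKEN="<your-HuggingFace-token>"   # hypothetical variable name
export WORKDIR="$PWD"                        # hypothetical work directory

cd FinanceAgent/tests
bash test_compose_on_vllm_rocm.sh
```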

Signed-off-by: Artem Astafev <[email protected]>

github-actions bot commented Apr 28, 2025

Dependency Review

✅ No vulnerabilities or license issues found.

Scanned Files

None

@artem-astafev artem-astafev marked this pull request as draft April 29, 2025 03:20
@artem-astafev artem-astafev marked this pull request as ready for review April 29, 2025 04:19
@artem-astafev artem-astafev requested a review from ashahba as a code owner May 7, 2025 10:01
@artem-astafev
Contributor Author

Hi @chensuyue, could you please help with running the FinanceAgent/tests/test_compose_on_vllm_rocm.sh test script for FinanceAgent?

The test is in the GitHub Actions queue, but I can't figure out why it isn't being scheduled on a ROCm agent.

Thank you in advance.

@chensuyue
Collaborator

Hi @chensuyue, could you please help with running the FinanceAgent/tests/test_compose_on_vllm_rocm.sh test script for FinanceAgent?

The test is in the GitHub Actions queue, but I can't figure out why it isn't being scheduled on a ROCm agent.

Thank you in advance.

Please rename test_compose_on_vllm_rocm.sh to test_compose_vllm_on_rocm.sh; the item after _on_ is treated as the node label, and CI can only recognize rocm, not vllm_rocm.
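For reference, a minimal sketch of the rename (the commit message below is hypothetical):

```bash
# Rename so CI parses the suffix after "_on_" as the node label ("rocm")
cd FinanceAgent/tests
git mv test_compose_on_vllm_rocm.sh test_compose_vllm_on_rocm.sh
git commit -s -m "Rename ROCm vLLM test script so CI schedules it on the rocm label"
```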

2. [Generate a HuggingFace Access Token](#generate-a-huggingface-access-token)
3. [Deploy the Services Using Docker Compose](#deploy-the-services-using-docker-compose)
4. [Check the Deployment Status](#check-the-deployment-status)
5. [Test the Pipeline](#test-the-pipeline)
Collaborator

Reference cannot be redirected:
3. [Deploy the Services Using Docker Compose](#deploy-the-services-using-docker-compose)
4. [Check the Deployment Status](#check-the-deployment-status)

@ZePan110
Collaborator

ZePan110 commented Jun 10, 2025

Need to fix the shellcheck errors and warnings.

find -name '*.sh' | xargs shellcheck --severity=error

In ./docker_compose/intel/hpu/gaudi/launch_vllm.sh line 1:
# Copyright (C) 2025 Intel Corporation
^-- SC2148: Tips depend on target shell and yours is unknown. Add a shebang.


In ./docker_compose/intel/hpu/gaudi/launch_dataprep.sh line 1:
# Copyright (C) 2025 Intel Corporation
^-- SC2148: Tips depend on target shell and yours is unknown. Add a shebang.


In ./docker_compose/intel/hpu/gaudi/launch_agents.sh line 1:

^-- SC2148: Tips depend on target shell and yours is unknown. Add a shebang.
find -name '*.sh' | xargs shellcheck --severity=warning

In ./docker_compose/amd/gpu/rocm/launch_vllm.sh line 1:
# Copyright (C) 2025 Intel Corporation
^-- SC2148: Tips depend on target shell and yours is unknown. Add a shebang.


In ./docker_compose/amd/gpu/rocm/launch_dataprep.sh line 1:
# Copyright (C) 2025 2025 Advanced Micro Devices, Inc.
^-- SC2148: Tips depend on target shell and yours is unknown. Add a shebang.


In ./docker_compose/amd/gpu/rocm/launch_dataprep.sh line 4:
export host_ip=${ip_address}
               ^-----------^ SC2154: ip_address is referenced but not assigned.


In ./docker_compose/amd/gpu/rocm/launch_dataprep.sh line 9:
export LLM_MODEL=$model
                 ^----^ SC2154: model is referenced but not assigned.


In ./docker_compose/amd/gpu/rocm/launch_dataprep.sh line 10:
export LLM_ENDPOINT="http://${ip_address}:${vllm_port}"
                                          ^----------^ SC2154: vllm_port is referenced but not assigned.


In ./docker_compose/amd/gpu/rocm/launch_agents.sh line 1:
# Copyright (C) 2025 Advanced Micro Devices, Inc.
^-- SC2148: Tips depend on target shell and yours is unknown. Add a shebang.


In ./docker_compose/amd/gpu/rocm/launch_agents.sh line 4:
export ip_address=$(hostname -I | awk '{print $1}')
       ^--------^ SC2155: Declare and assign separately to avoid masking return values.


In ./docker_compose/intel/hpu/gaudi/launch_vllm.sh line 1:
# Copyright (C) 2025 Intel Corporation
^-- SC2148: Tips depend on target shell and yours is unknown. Add a shebang.


In ./docker_compose/intel/hpu/gaudi/launch_dataprep.sh line 1:
# Copyright (C) 2025 Intel Corporation
^-- SC2148: Tips depend on target shell and yours is unknown. Add a shebang.


In ./docker_compose/intel/hpu/gaudi/launch_dataprep.sh line 4:
export host_ip=${ip_address}
               ^-----------^ SC2154: ip_address is referenced but not assigned.


In ./docker_compose/intel/hpu/gaudi/launch_dataprep.sh line 9:
export LLM_MODEL=$model
                 ^----^ SC2154: model is referenced but not assigned.


In ./docker_compose/intel/hpu/gaudi/launch_dataprep.sh line 10:
export LLM_ENDPOINT="http://${ip_address}:${vllm_port}"
                                          ^----------^ SC2154: vllm_port is referenced but not assigned.


In ./docker_compose/intel/hpu/gaudi/launch_agents.sh line 1:

^-- SC2148: Tips depend on target shell and yours is unknown. Add a shebang.


In ./docker_compose/intel/hpu/gaudi/launch_agents.sh line 5:
export ip_address=$(hostname -I | awk '{print $1}')
       ^--------^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 6:
export WORKPATH=$(dirname "$PWD")
       ^------^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 9:
export ip_address=$(hostname -I | awk '{print $1}')
       ^--------^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 21:
vllm_volume=${HF_CACHE_DIR}
^---------^ SC2034: vllm_volume appears unused. Verify use (or export if used externally).


In ./tests/test_compose_vllm_on_rocm.sh line 54:
    docker build -t opea/agent:latest -f comps/agent/src/Dockerfile . --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy
                                                                                              ^----------^ SC2154: https_proxy is referenced but not assigned.
                                                                                                                                  ^---------^ SC2154: http_proxy is referenced but not assigned.


In ./tests/test_compose_vllm_on_rocm.sh line 106:
    local CONTENT=$(python3 $WORKPATH/tests/test_redis_finance.py --port $DATAPREP_PORT --test_option ingest)
          ^-----^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 107:
    local EXIT_CODE=$(validate "$CONTENT" "200" "dataprep-redis-finance")
          ^-------^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 117:
    local CONTENT=$(python3 $WORKPATH/tests/test_redis_finance.py --port $DATAPREP_PORT --test_option get)
          ^-----^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 118:
    local EXIT_CODE=$(validate "$CONTENT" "Request successful" "dataprep-redis-finance")
          ^-------^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 147:
    local CONTENT=$(python3 $WORKDIR/GenAIExamples/FinanceAgent/tests/test.py --prompt "$prompt" --agent_role "worker" --ext_port $agent_port)
          ^-----^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 149:
    local EXIT_CODE=$(validate "$CONTENT" "15" "finqa-agent-endpoint")
          ^-------^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 161:
    local CONTENT=$(python3 $WORKDIR/GenAIExamples/AgentQnA/tests/test.py --prompt "$prompt" --agent_role "worker" --ext_port $agent_port --tool_choice "get_current_date" --tool_choice "get_share_performance")
          ^-----^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 162:
    local EXIT_CODE=$(validate "$CONTENT" "Johnson" "research-agent-endpoint")
          ^-------^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 174:
    local CONTENT=$(python3 $WORKDIR/GenAIExamples/FinanceAgent/tests/test.py --agent_role "supervisor" --ext_port $agent_port --stream)
          ^-----^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 176:
    local EXIT_CODE=$(validate "$CONTENT" "test completed with success" "supervisor-agent-endpoint")
          ^-------^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 185:
    local CONTENT=$(python3 $WORKDIR/GenAIExamples/FinanceAgent/tests/test.py --agent_role "supervisor" --ext_port $agent_port --multi-turn --stream)
          ^-----^ SC2155: Declare and assign separately to avoid masking return values.


In ./tests/test_compose_vllm_on_rocm.sh line 187:
    local EXIT_CODE=$(validate "$CONTENT" "test completed with success" "supervisor-agent-endpoint")
          ^-------^ SC2155: Declare and assign separately to avoid masking return values.
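For reference, a minimal sketch of the kind of changes that clear these findings, using launch_dataprep.sh as an example; the default values below are placeholders, not the script's real settings:

```bash
#!/usr/bin/env bash
# Copyright (C) 2025 Advanced Micro Devices, Inc.
# SC2148: the shebang above tells shellcheck which shell to target.

# SC2154: assign variables before referencing them (placeholder defaults shown).
ip_address=${ip_address:-$(hostname -I | awk '{print $1}')}
model=${model:-"<your-model-id>"}      # placeholder model id
vllm_port=${vllm_port:-8086}           # placeholder port

# SC2155: declare/export and assign separately so a failing command's
# exit status is not masked by the export itself.
host_ip=${ip_address}
export host_ip
export LLM_MODEL=$model
export LLM_ENDPOINT="http://${ip_address}:${vllm_port}"
```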

@CICD-at-OPEA
Collaborator

This PR is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

@chensuyue chensuyue removed the Stale label Jul 14, 2025
@chensuyue
Collaborator

Please fix the shellcheck issues in the CI test, @artem-astafev.
