Commit 6023a20

Merge pull request #170 from vajain-rhods/fix_rocm_build

Fix vLLM ROCm build

2 parents: af807bb + ff614c0

File tree

2 files changed: 1 addition, 3 deletions

Dockerfile.rocm.ubi (1 addition, 1 deletion)

@@ -210,7 +210,7 @@ RUN --mount=type=bind,from=build_amdsmi,src=/install,target=/install/amdsmi/ \
     --mount=type=bind,from=build_vllm,src=/workspace/dist,target=/install/vllm/ \
     --mount=type=cache,target=/root/.cache/pip \
     --mount=type=cache,target=/root/.cache/uv \
-    export version="$(awk -F. '{print $1"."$2}' <<< $ROCM_VERSION)" && \
+    export version="$(awk -F. '{print $1"."$2}' <<< ${ROCM_VERSION})" && \
     uv pip install \
         --index-strategy=unsafe-best-match \
         --extra-index-url "https://download.pytorch.org/whl/nightly/rocm${version}" \
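The changed line derives the major.minor ROCm version that names the PyTorch nightly wheel index. A minimal sketch of that extraction, using a hypothetical sample value for ROCM_VERSION (in the Dockerfile it comes from a build argument); a pipe stands in for the bash-only here-string so the snippet also runs under plain sh:

```shell
# Hypothetical sample value; in the Dockerfile, ROCM_VERSION is a build arg.
ROCM_VERSION="6.2.4"
# awk splits on '.' and keeps only major.minor (6.2.4 -> 6.2); the Dockerfile
# feeds the variable in via a here-string, a pipe is equivalent here.
version="$(printf '%s\n' "${ROCM_VERSION}" | awk -F. '{print $1"."$2}')"
# The result selects the matching nightly index, e.g. .../whl/nightly/rocm6.2
echo "rocm${version}"   # prints "rocm6.2"
```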

requirements/rocm.txt (0 additions, 2 deletions)

@@ -4,8 +4,6 @@
 numba == 0.60.0; python_version == '3.9' # v0.61 doesn't support Python 3.9. Required for N-gram speculative decoding
 numba == 0.61.2; python_version > '3.9'
 
-numba == 0.60.0 # v0.61 doesn't support Python 3.9. Required for N-gram speculative decoding.
-
 # Dependencies for AMD GPUs
 awscli
 boto3
