This repository was archived by the owner on May 11, 2025. It is now read-only.

Commit 8f774d5: the final piece (#760)

1 parent 45c1a72

4 files changed: +38 −6 lines changed

README.md

Lines changed: 12 additions & 2 deletions

```diff
@@ -1,10 +1,20 @@
 # News: The vLLM project has fully adopted AutoAWQ
 
-AutoAWQ is being deprecated and archived because of this. It is no secret that maintaining a project such as AutoAWQ that has 2+ million downloads, 7000+ models on Huggingface, and 2.1k stars is hard for a solo developer who is doing this in their free time.
+It is no secret that maintaining a project such as AutoAWQ that has 2+ million downloads, 7000+ models on Huggingface, and 2.1k stars is hard for a solo developer who is doing this in their free time.
 
-- vLLM Compressor now supports AutoAWQ: https://github.com/vllm-project/llm-compressor
+Important Notice:
+- AutoAWQ is officially deprecated and will no longer be maintained.
+- The last tested configuration used Torch 2.6.0 and Transformers 4.51.3.
+- If future versions of Transformers break AutoAWQ compatibility, please report the issue to the Transformers project.
+
+Alternative:
+- AutoAWQ has been adopted by the vLLM Project: https://github.com/vllm-project/llm-compressor
 - MLX-LM now supports AWQ for Mac devices: http://github.com/ml-explore/mlx-lm
 
+For further inquiries, feel free to reach out:
+- X: https://x.com/casper_hansen_
+- LinkedIn: https://www.linkedin.com/in/casper-hansen-804005170/
+
 # AutoAWQ
 
 <p align="center">
```

awq/__init__.py

Lines changed: 24 additions & 2 deletions

```diff
@@ -1,2 +1,24 @@
-__version__ = "0.2.8"
-from awq.models.auto import AutoAWQForCausalLM
+import warnings
+
+warnings.simplefilter("default", DeprecationWarning)
+
+_FINAL_DEV_MESSAGE = """
+I have left this message as the final dev message to help you transition.
+
+Important Notice:
+- AutoAWQ is officially deprecated and will no longer be maintained.
+- The last tested configuration used Torch 2.6.0 and Transformers 4.51.3.
+- If future versions of Transformers break AutoAWQ compatibility, please report the issue to the Transformers project.
+
+Alternative:
+- AutoAWQ has been adopted by the vLLM Project: https://github.com/vllm-project/llm-compressor
+
+For further inquiries, feel free to reach out:
+- X: https://x.com/casper_hansen_
+- LinkedIn: https://www.linkedin.com/in/casper-hansen-804005170/
+"""
+
+warnings.warn(_FINAL_DEV_MESSAGE, category=DeprecationWarning, stacklevel=1)
+
+__version__ = "0.2.9"
+from awq.models.auto import AutoAWQForCausalLM
```
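The `simplefilter("default", DeprecationWarning)` call matters here: Python hides `DeprecationWarning` by default outside `__main__`, so without it the farewell message would be silently dropped on import. A standalone sketch of the pattern (the `_NOTICE` text is a shortened stand-in, not the module's actual message):

```python
import warnings

# Re-enable DeprecationWarning, which Python suppresses by default
# outside of code run directly in __main__.
warnings.simplefilter("default", DeprecationWarning)

_NOTICE = "AutoAWQ is officially deprecated and will no longer be maintained."

# Capture the warning to show it is actually emitted with the right category.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # record every warning during the block
    warnings.warn(_NOTICE, category=DeprecationWarning, stacklevel=1)

print(caught[0].category.__name__)  # DeprecationWarning
print(str(caught[0].message))
```

Downstream code that wants to silence the notice can do so explicitly with `warnings.filterwarnings("ignore", category=DeprecationWarning, module="awq")`.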

scripts/download_wheels.sh

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,7 +1,7 @@
 #!/bin/bash
 
 # Set variables
-AWQ_VERSION="0.2.8"
+AWQ_VERSION="0.2.9"
 RELEASE_URL="https://github.com/casper-hansen/AutoAWQ/archive/refs/tags/v${AWQ_VERSION}.tar.gz"
 
 # Create a directory to download the wheels
```
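Because `RELEASE_URL` interpolates `AWQ_VERSION`, bumping the one variable is enough to retarget the download at the new tag. A minimal sketch using the same variable names as the script:

```shell
#!/bin/bash
# Same variables as scripts/download_wheels.sh, after the version bump.
AWQ_VERSION="0.2.9"
RELEASE_URL="https://github.com/casper-hansen/AutoAWQ/archive/refs/tags/v${AWQ_VERSION}.tar.gz"

echo "${RELEASE_URL}"
```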

setup.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -2,7 +2,7 @@
 from pathlib import Path
 from setuptools import setup, find_packages
 
-AUTOAWQ_VERSION = "0.2.8"
+AUTOAWQ_VERSION = "0.2.9"
 
 common_setup_kwargs = {
     "version": AUTOAWQ_VERSION,
```
0 commit comments