Upgrade dependencies for dependabot fixes #484
Conversation
Walkthrough
Upgrades many Python dependencies in requirements.txt.

Sequence Diagram(s)
Skipped — changes are dependency updates and a single import redirect; no meaningful new control flow to diagram.

Estimated code review effort
🎯 3 (Moderate) | ⏱️ ~20 minutes
Pre-merge checks and finishing touches
✅ Passed checks (3 passed)
📜 Recent review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
🔇 Additional comments (5)
Actionable comments posted: 2
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
requirements.txt (4 hunks)
🧰 Additional context used
🪛 OSV Scanner (2.2.4)
requirements.txt
[HIGH] 53-53: transformers 4.48.0: undefined
(PYSEC-2025-40)
[HIGH] 53-53: transformers 4.48.0: Transformers is vulnerable to ReDoS attack through its DonutProcessor class
[HIGH] 53-53: transformers 4.48.0: Hugging Face Transformers vulnerable to Regular Expression Denial of Service (ReDoS) in the AdamWeightDecay optimizer
[HIGH] 53-53: transformers 4.48.0: Hugging Face Transformers is vulnerable to ReDoS through its MarianTokenizer
[HIGH] 53-53: transformers 4.48.0: Hugging Face Transformers Regular Expression Denial of Service (ReDoS) vulnerability
[HIGH] 53-53: transformers 4.48.0: Transformers Regular Expression Denial of Service (ReDoS) vulnerability
[HIGH] 53-53: transformers 4.48.0: Transformers vulnerable to ReDoS attack through its get_imports() function
[HIGH] 53-53: transformers 4.48.0: Transformers's Improper Input Validation vulnerability can be exploited through username injection
[HIGH] 53-53: transformers 4.48.0: Transformers's ReDoS vulnerability in get_configuration_file can lead to catastrophic backtracking
[HIGH] 53-53: transformers 4.48.0: Hugging Face Transformers Regular Expression Denial of Service
[HIGH] 53-53: transformers 4.48.0: Hugging Face Transformers library has Regular Expression Denial of Service
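For readers unfamiliar with the vulnerability class flagged above, here is a minimal Python illustration of catastrophic regex backtracking. This is a generic textbook pattern, not the actual regex used in transformers:

```python
import re

# Classic ReDoS-prone shape: a nested quantifier over an overlapping group.
# On a non-matching input like "a" * 30 + "b", the engine explores
# exponentially many ways to split the run of "a"s before failing, so
# matching time explodes; the linear rewrite r"^a+$" does not backtrack.
VULNERABLE = re.compile(r"^(a+)+$")
SAFE = re.compile(r"^a+$")

def matches(pattern: re.Pattern, s: str) -> bool:
    """Return True if the pattern matches the whole string."""
    return pattern.match(s) is not None
```

Both patterns accept the same language; only the backtracking behavior differs, which is why such bugs pass functional tests and surface only as denial of service.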
🔇 Additional comments (2)
requirements.txt (2)
9-9: LGTM: Minor/patch dependency updates.
Lines 9, 49, 51, and 72 are safe minor and patch version updates:
- langsmith: 0.3.3→0.4.42
- authlib: 1.6.3→1.6.5
- requests: 2.32.3→2.32.5
- Pillow: 10.0.1→10.3.0
These are appropriate dependabot-driven security updates with low risk of breaking changes.
Also applies to: 49-49, 51-51, 72-72
54-55: Test torch 2.8.0 with sentence-transformers 4.0.2 in your environment.
sentence-transformers requires PyTorch 1.11.0+, which covers PyTorch 2.x. However, verify this combination works in your environment by running quick inference tests, as version mismatches between Sentence Transformers, Transformers, and PyTorch can break functionality or cause performance issues. torch 2.8.0 dropped support for older GPUs (Maxwell, Pascal, Volta), so confirm your target hardware is compatible.
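A quick hardware sanity check along these lines might look like the sketch below. The helper names are hypothetical, and the sm_75 (Turing) floor is inferred from the note that Maxwell, Pascal, and Volta support was dropped:

```python
# Hypothetical helper: decide whether a GPU's CUDA compute capability is
# still supported by torch 2.8.0 wheels. Volta is sm_70, so dropping
# Maxwell/Pascal/Volta implies an assumed minimum of sm_75 (Turing).
def cc_supported_by_torch_2_8(major: int, minor: int) -> bool:
    return (major, minor) >= (7, 5)

def check_local_gpus() -> list:
    """Report (device name, supported?) for each visible CUDA device."""
    try:
        import torch
    except ImportError:
        return []  # torch not installed; nothing to check
    if not torch.cuda.is_available():
        return []
    results = []
    for i in range(torch.cuda.device_count()):
        major, minor = torch.cuda.get_device_capability(i)
        name = torch.cuda.get_device_name(i)
        results.append((name, cc_supported_by_torch_2_8(major, minor)))
    return results

if __name__ == "__main__":
    for name, ok in check_local_gpus():
        print(f"{name}: {'supported' if ok else 'NOT supported by torch 2.8.0'}")
```

Running this on the target hardware before merging gives an early signal that the upgraded wheel will actually use the available GPUs.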
Actionable comments posted: 1
♻️ Duplicate comments (2)
requirements.txt (2)
30-36: Verify langchain code migration and deprecated import fix.
The langchain ecosystem major version upgrades (langchain 1.0.5, langchain-core 1.0.4, langgraph 1.0.3) are correctly pinned, but this PR must include corresponding code changes. A past review identified a deprecated import that will break with these versions:
- Required fix: app/modules/intelligence/tools/code_query_tools/code_analysis.py:569 — change `from langchain.tools import StructuredTool` to `from langchain_core.tools import StructuredTool` (langchain 1.0.5 removes deprecated APIs).

Confirm that this code fix has been applied in this PR branch.
```bash
#!/bin/bash
# Verify the deprecated import has been fixed
grep -n "from langchain.tools import StructuredTool" app/modules/intelligence/tools/code_query_tools/code_analysis.py
# Expected: No output (import should be from langchain_core.tools)

# Verify the correct import is in place
grep -n "from langchain_core.tools import StructuredTool" app/modules/intelligence/tools/code_query_tools/code_analysis.py
# Expected: One match at line 569
```
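If this check should run in CI across the whole tree rather than one file, a stdlib-only sketch (hypothetical helper, same check as the grep above) could be:

```python
# Hypothetical CI guard: scan Python sources for the deprecated
# `from langchain.tools import StructuredTool` import, which breaks under
# langchain 1.x (the replacement lives in langchain_core.tools).
import re
from pathlib import Path

DEPRECATED = re.compile(r"^\s*from\s+langchain\.tools\s+import\s+StructuredTool\b")

def find_deprecated_imports(root: str) -> list:
    """Return (path, line_number) pairs for each deprecated import found."""
    hits = []
    for path in Path(root).rglob("*.py"):
        lines = path.read_text(encoding="utf-8").splitlines()
        for lineno, line in enumerate(lines, 1):
            if DEPRECATED.match(line):
                hits.append((str(path), lineno))
    return hits
```

An empty result means the migration is complete for this particular import; failing the build on a non-empty result prevents the deprecated form from creeping back in.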
54-54: 🔴 Critical: Upgrade transformers to resolve HIGH severity ReDoS vulnerabilities.
The current pin `transformers>=4.48.0` is incompatible with the PR objective ("Upgrade dependencies for dependabot fixes"). Static analysis flags 11 HIGH severity vulnerabilities in transformers 4.48.0, including:
- CVE-2025-6051: ReDoS in EnglishNormalizer
- CVE-2025-6921: ReDoS in AdamWeightDecay
- CVE-2025-3777: Improper URL validation in image_utils.py
- Multiple ReDoS attacks (DonutProcessor, MarianTokenizer, get_configuration_file)
A prior review already identified this issue and recommended upgrading to >=4.53.0 (or >=4.54.1 for latest stable).
Apply this diff:

```diff
-transformers>=4.48.0
+transformers>=4.53.0
```

Verify torch 2.8.0 compatibility with transformers >=4.53.0 before merging.
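The recommended floor can also be enforced mechanically. A stdlib-only sketch (helper names are hypothetical; PATCHED_MIN reflects the >=4.53.0 recommendation above):

```python
# Minimal sketch: check that a version floor pinned in requirements.txt
# meets the patched minimum recommended in the review.
import re

PATCHED_MIN = (4, 53, 0)  # reviewer-recommended floor for the ReDoS fixes

def parse_floor(req_line: str):
    """Extract the version tuple from a `name>=X.Y.Z` or `name==X.Y.Z` pin."""
    m = re.match(r"\s*[\w.-]+\s*[>=]=\s*(\d+(?:\.\d+)*)", req_line)
    if m is None:
        return None
    return tuple(int(part) for part in m.group(1).split("."))

def pin_is_patched(req_line: str, minimum=PATCHED_MIN) -> bool:
    """True if the pin's lower bound is at or above the patched minimum."""
    floor = parse_floor(req_line)
    return floor is not None and floor >= minimum
```

A check like this can run in CI against the changed requirements.txt so a later downgrade of the pin fails the build instead of silently reintroducing the vulnerable range.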
🔇 Additional comments (2)
requirements.txt (2)
55-55: Verify torch 2.8.0 compatibility with sentence-transformers and transformers.
The torch version is upgraded from 2.5.1 to 2.8.0. Verify that it is compatible with:
- sentence-transformers==4.0.2
- transformers>=4.53.0 (after the security fix above)

Ensure no regressions by running the test suite with the updated dependency stack.
3-3: Minor/patch version upgrades look good.
The following upgrades are minor/patch version bumps and align with the PR objective:
- cryptography: 42.0.8 → 44.0.1
- langsmith: 0.3.3 → 0.4.42 (coordinated with langchain major upgrade)
- aiohttp: 3.11.9 → 3.12.14
- authlib: 1.6.3 → 1.6.5
- requests: 2.32.3 → 2.32.5
- Pillow: 10.0.1 → 10.3.0
These are low-risk security/stability updates. No concerns.
Also applies to: 9-9, 29-29, 50-50, 52-52, 73-73