Conversation

@vinikjkkj (Collaborator)

No description provided.

@whiskeysockets-bot (Contributor)

Thanks for opening this pull request and contributing to the project!

The next step is for the maintainers to review your changes. If everything looks good, it will be approved and merged into the main branch.

In the meantime, anyone in the community is encouraged to test this pull request and provide feedback.

✅ How to confirm it works

If you’ve tested this PR, please comment below with:

Tested and working ✅

This helps us speed up the review and merge process.

📦 To test this PR locally:

# NPM
npm install @whiskeysockets/baileys@WhiskeySockets/Baileys#improve-caches

# Yarn (v2+)
yarn add @whiskeysockets/baileys@WhiskeySockets/Baileys#improve-caches

# PNPM
pnpm add @whiskeysockets/baileys@WhiskeySockets/Baileys#improve-caches

If you encounter any issues or have feedback, feel free to comment as well.

@jlucaso1 (Collaborator) left a comment

nice catch

@jlucaso1 jlucaso1 requested a review from purpshell October 8, 2025 20:14
@gusquadri (Contributor)

I don't think it makes sense to remove the LRU cache; it already had a TTL, and since node-cache doesn't support refreshing the TTL on get, it will wrongly try to migrate/store the entry again even when it's being used constantly.

@vinikjkkj (Collaborator, Author)

> I don't think it makes sense to remove the LRU cache; it already had a TTL, and since node-cache doesn't support refreshing the TTL on get, it will wrongly try to migrate/store the entry again even when it's being used constantly.

On big projects, memory can grow too fast with a 7-day TTL. We can refresh the TTL manually by re-adding the key; what do you think?
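The "re-adding the key" idea can be sketched as follows. This is a minimal illustration, not Baileys code: the class name and shape are hypothetical, standing in for a TTL-only cache (like node-cache) wrapped so that every successful `get` re-inserts the entry and thereby resets its expiry.

```typescript
// Minimal sketch (illustrative, not the actual Baileys cache):
// refresh an entry's TTL on read by re-adding the key, so entries
// that are used constantly never expire, while idle ones still do.
class RefreshingTTLCache<V> {
	private store = new Map<string, { value: V; expiresAt: number }>()

	constructor(private ttlMs: number) {}

	set(key: string, value: V): void {
		this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs })
	}

	get(key: string): V | undefined {
		const entry = this.store.get(key)
		if (!entry) {
			return undefined
		}
		if (Date.now() > entry.expiresAt) {
			// entry expired while idle: drop it
			this.store.delete(key)
			return undefined
		}
		// "re-add the key" to push the expiry forward
		this.set(key, entry.value)
		return entry.value
	}
}
```

With node-cache itself, re-calling `set` with the same key on each read would achieve the same effect as the `get` above.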

@gusquadri (Contributor)

> I don't think it makes sense to remove the LRU cache; it already had a TTL, and since node-cache doesn't support refreshing the TTL on get, it will wrongly try to migrate/store the entry again even when it's being used constantly.

> On big projects, memory can grow too fast with a 7-day TTL. We can refresh the TTL manually by re-adding the key; what do you think?

I think that would be a good approach too, or we could just decrease the TTL.

@purpshell (Member) left a comment

Let's not keep jumping back and forth between dependencies.

Is there a fundamental issue with lru-cache itself? Why node-cache?

@wis-dev commented Oct 14, 2025

> Let's not keep jumping back and forth between dependencies.

> Is there a fundamental issue with lru-cache itself? Why node-cache?

I think lru-cache is the best option. If you have memory leaks, read the "Storage Bounds Safety" section of the lru-cache docs.
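The point of an LRU bound is that memory stays capped by entry count regardless of how long the TTL is. A minimal sketch of the eviction idea (illustrative only; the real lru-cache enforces this via its `max` option, with far more machinery): a `Map` preserves insertion order, so moving a key to the end on read and evicting from the front keeps at most `max` entries alive.

```typescript
// Illustrative sketch of LRU bounding (not the lru-cache library itself):
// reads move a key to the most-recently-used position; writes evict the
// least-recently-used key once the cap is exceeded.
class BoundedLRU<V> {
	private store = new Map<string, V>()

	constructor(private max: number) {}

	get(key: string): V | undefined {
		const value = this.store.get(key)
		if (value === undefined) {
			return undefined
		}
		// re-insert to mark as most recently used
		this.store.delete(key)
		this.store.set(key, value)
		return value
	}

	set(key: string, value: V): void {
		if (this.store.has(key)) {
			this.store.delete(key)
		}
		this.store.set(key, value)
		if (this.store.size > this.max) {
			// evict least-recently-used entry (first in insertion order)
			const oldest = this.store.keys().next().value as string
			this.store.delete(oldest)
		}
	}
}
```

Unlike a pure-TTL cache, this keeps memory bounded even if every entry is written within its TTL window; TTL and LRU bounds address different failure modes and lru-cache supports combining both.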

@github-actions (Contributor)

github-actions bot commented Nov 3, 2025

This PR is stale because it has been open for 14 days with no activity. Remove the stale label or comment, or it will be closed in 14 days.

@github-actions github-actions bot added the Stale label Nov 3, 2025
Salientekill pushed a commit to Salientekill/Baileys that referenced this pull request Nov 3, 2025
…to avoid memory leaks

- Remove lru-cache dependency causing memory leaks
- Reduce cache TTLs to prevent memory buildup
- Add NodeCache for lid-mapping with 1h TTL
- Add message retry manager utilities
- Update libsignal with memory optimizations

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
@purpshell purpshell force-pushed the master branch 2 times, most recently from bd1c658 to f46e8b1 Compare November 21, 2025 15:35

7 participants