@graemenail
Member

Description

This PR reinstates a shape check on the entries of the transformer cache. It is the minimal change required to address the issue highlighted in #881.

#881 was reported to increase memory usage in normal operation. In that PR, the tensor shape was added to the cache key to improve retrieval. However, this change produced many more cache entries, and because the cache is a member of the Transformer, the lifetime of those entries was tied to that of the Transformer, so they persisted far longer than needed.
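To illustrate the intended behaviour, here is a minimal sketch (the names `Tensor`, `TransformerCache`, and `lookup` are illustrative, not the actual Marian code): the cache keeps a single entry per key and verifies the stored tensor's shape at retrieval time, rebuilding on mismatch, rather than folding the shape into the key.

```cpp
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

// Minimal sketch, assuming a tensor type that exposes its shape.
struct Tensor {
  std::vector<int> shape;
  // ... payload omitted ...
};

class TransformerCache {
public:
  // One entry per textual key; the stored shape is checked on retrieval and
  // the entry is rebuilt on mismatch, so entries for stale shapes never pile up.
  Tensor& lookup(const std::string& key,
                 const std::vector<int>& shape,
                 const std::function<Tensor()>& build) {
    auto it = cache_.find(key);
    if (it == cache_.end() || it->second.shape != shape) {
      it = cache_.insert_or_assign(key, build()).first;  // recompute and overwrite
    }
    return it->second;
  }

private:
  std::unordered_map<std::string, Tensor> cache_;
};
```

Keying by name alone bounds the number of live entries by the number of cached call sites, whereas keying by (name, shape), as in #881, lets entries for every shape ever seen accumulate for the Transformer's lifetime.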

Related: #881

List of changes:

  • Check the tensor shape, rather than its elements, during transformer cache retrieval

Added dependencies: none

How to test

  • Passes CI tests
  • @emjotde has a benchmark that should be checked

Checklist

  • I have tested the code manually
  • I have run regression tests
  • I have read and followed CONTRIBUTING.md
  • I have updated CHANGELOG.md

@graemenail force-pushed the fix-transformer-cache branch from a817edd to d186d94 on August 24, 2022, 13:40