
Conversation

@MrCreosote (Member)

Adds an LRU cache for tokens

@MrCreosote requested a review from briehl October 6, 2025 01:25

codecov bot commented Oct 6, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 98.87%. Comparing base (6096294) to head (fa5f80e).
⚠️ Report is 4 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main       #6      +/-   ##
==========================================
+ Coverage   96.26%   98.87%   +2.60%     
==========================================
  Files           4        4              
  Lines         107      177      +70     
==========================================
+ Hits          103      175      +72     
+ Misses          4        2       -2     

@briehl (Member) left a comment

A couple comments and suggestions.

base_url - the base url for the authentication service, for example
https://kbase.us/services/auth
cache_max_size - the maximum size of the token and user caches.
@briehl (Member) commented Oct 9, 2025

What happens if it goes over? I get it's an LRU cache but it's worth putting the delete policy in the docstring.

@MrCreosote (Member, Author) replied

Fixed
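
For reference, a sketch of what the expanded docstring might read like; the exact wording in the merged commit is not quoted in this thread:

    base_url - the base url for the authentication service, for example
        https://kbase.us/services/auth
    cache_max_size - the maximum size of the token and user caches. If a
        cache grows beyond this size, the least recently used entries are
        evicted first to make room for new ones.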

self._token_url = base_url + "api/V2/token"
self._me_url = base_url + "api/V2/me"
if cache_max_size < 1:
raise ValueError("cache_max_size must be > 0")
@briehl (Member) commented

FWIW, cacheout.LRUCache is fine with 0. That just removes them when they age out.
But it really doesn't matter, IMO.

@MrCreosote (Member, Author) replied

yeah, I don't want to allow infinite cache size
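
For context, a minimal sketch of the two behaviors under discussion, using cacheout's maxsize and ttl constructor arguments (the sizes and TTL values here are illustrative):

    from cacheout import LRUCache

    # maxsize=0 removes the size bound entirely: entries are only dropped
    # when their TTL expires, so the cache can grow without limit between
    # expirations. This is the case the constructor check rules out.
    unbounded = LRUCache(maxsize=0, ttl=300)

    # maxsize >= 1 enforces the bound: once the cache is full, the least
    # recently used entry is evicted to make room for each new one.
    bounded = LRUCache(maxsize=10_000, ttl=300)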

if tk:
return tk
if on_cache_miss:
on_cache_miss()
@briehl (Member) commented

What's the use case for on_cache_miss? Show a warning or something? Or testing?

@MrCreosote (Member, Author) replied

Right now just testing, but could be used for stats or something
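
As an illustration of the stats idea, a hypothetical miss counter wired through the hook; the client object and the get_token entry point are assumed names, and only the on_cache_miss parameter itself appears in the PR:

    # Illustrative only: count cache misses via the on_cache_miss hook.
    misses = 0

    def count_miss():
        global misses
        misses += 1

    async def fetch_token(client, token):
        # Assumed async entry point; the hook fires only when the token
        # is not served from the cache.
        return await client.get_token(token, on_cache_miss=count_miss)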



@pytest.mark.asyncio
async def test_get_token_cache_evict_on_time(auth_users):
@briehl (Member) commented

The duplicate test code here embedded in a single test makes my brain itch.
It might work better with the sync assertion pieces in a separate function.
But, admittedly, I don't have strong feelings.

@MrCreosote (Member, Author) replied

yeah, I feel you and went back and forth on this myself. In the end I decided I wanted to make it very clear that the sync and async versions of the code need to be tested equivalently, and to make it obvious that that's happening
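
For reference, the refactor under discussion would look roughly like the sketch below; every name in it is illustrative, and the real test's fixtures and assertions are not reproduced here:

    import pytest

    # Shared assertions so the sync and async paths are verified
    # identically (the split briehl suggests, not what was merged).
    def _check_evicted_on_time(token_before, token_after, miss_count):
        assert token_before == token_after  # same token data both times
        assert miss_count == 2              # second lookup re-fetched

    def test_get_token_cache_evict_on_time_sync(auth_users):
        ...  # drive the sync client, then call _check_evicted_on_time

    @pytest.mark.asyncio
    async def test_get_token_cache_evict_on_time(auth_users):
        ...  # drive the async client, then call _check_evicted_on_time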

@MrCreosote merged commit 543b8b3 into main Oct 9, 2025
5 checks passed