
fix(event cache): wait for the initial previous-batch token, when storage's enabled #4724

Merged (1 commit) on Feb 27, 2025

Conversation

bnjbvr (Member) commented Feb 26, 2025

If, during back-pagination, we lazy-load a chunk from storage and see that there's no previous chunk, we might conclude that we've reached the start of the timeline and that we're done. This isn't true if that chunk was the first default chunk and we never waited long enough to receive the initial gap from sync. This patch fixes that, and includes a regression test reproducing the error.

Part of #3280.
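The faulty decision can be sketched as a small state check. This is an illustrative simplification, not the SDK's actual code: the names `next_step`, `is_first_default_chunk`, and `waited_for_initial_prev_token` are hypothetical stand-ins for the real event-cache state.

```rust
// Hypothetical, simplified sketch of the back-pagination decision described
// in this PR. Field and function names are illustrative, not the SDK's.

#[derive(Debug, PartialEq)]
enum PaginationOutcome {
    /// We truly reached the start of the timeline.
    ReachedStart,
    /// We must first wait for the initial previous-batch token from sync.
    WaitForInitialToken,
    /// There is a previous chunk to lazy-load from storage.
    LoadPreviousChunk,
}

fn next_step(
    has_previous_chunk: bool,
    is_first_default_chunk: bool,
    waited_for_initial_prev_token: bool,
) -> PaginationOutcome {
    if has_previous_chunk {
        PaginationOutcome::LoadPreviousChunk
    } else if is_first_default_chunk && !waited_for_initial_prev_token {
        // Before the fix, this case was wrongly reported as `ReachedStart`:
        // no previous chunk in storage does not mean no gap will ever arrive,
        // if we never waited for the initial token from sync.
        PaginationOutcome::WaitForInitialToken
    } else {
        PaginationOutcome::ReachedStart
    }
}
```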

@bnjbvr bnjbvr requested a review from a team as a code owner February 26, 2025 12:30
@bnjbvr bnjbvr requested review from poljar and removed request for a team February 26, 2025 12:30

codecov bot commented Feb 26, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 86.13%. Comparing base (54ab46d) to head (9c1e119).
Report is 4 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #4724      +/-   ##
==========================================
- Coverage   86.13%   86.13%   -0.01%     
==========================================
  Files         291      291              
  Lines       34300    34304       +4     
==========================================
+ Hits        29546    29549       +3     
- Misses       4754     4755       +1     


Hywan (Member) left a comment

Thanks for the test. There's one thing I don't understand, but since it's documented and tested, I approve this PR.

self.propagate_changes().await?;

// If we've never waited for an initial previous-batch token, and we now have at
Hywan (Member):

Why is it here? I feel like this is not the correct place, but it's a gut feeling.

bnjbvr (Member, Author):

This is a central location to do it, since we'd need to do it every time the linked chunk has been modified / every time a gap may have been added. That happens during sync and when resolving a gap, so this is likely the right place.

I'm tempted to rethink the wait-for-a-prev-batch-token mechanism, because 1. it spreads a lot of code everywhere, and 2. it's likely not used very often.
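The "central choke point" argument can be illustrated with a minimal sketch using std primitives. This is an assumption-laden toy, not the SDK's implementation: `TokenWaiter` and its methods are hypothetical names. The point is that every code path mutating the linked chunk funnels through one propagation step, which wakes any back-pagination task waiting for the initial previous-batch token.

```rust
use std::sync::{Arc, Condvar, Mutex};

// Hypothetical sketch: one propagation choke point wakes waiters whenever a
// change to the linked chunk may have produced the initial prev-batch token.
struct TokenWaiter {
    token: Mutex<Option<String>>,
    cond: Condvar,
}

impl TokenWaiter {
    fn new() -> Arc<Self> {
        Arc::new(Self { token: Mutex::new(None), cond: Condvar::new() })
    }

    /// Called from the single central location after the linked chunk changed
    /// (during sync, or after resolving a gap).
    fn propagate_changes(&self, maybe_new_token: Option<String>) {
        if let Some(tok) = maybe_new_token {
            *self.token.lock().unwrap() = Some(tok);
            // All mutating call sites go through here, so waiters don't need
            // per-call-site notification code.
            self.cond.notify_all();
        }
    }

    /// Back-pagination blocks here instead of wrongly concluding
    /// "start of timeline".
    fn wait_for_initial_token(&self) -> String {
        let guard = self.token.lock().unwrap();
        let guard = self.cond.wait_while(guard, |t| t.is_none()).unwrap();
        guard.clone().unwrap()
    }
}
```

The design trade-off the comments discuss is visible here: the waiting logic is simple at the choke point, but every producer of tokens must remember to route through it, which is why rethinking the mechanism is tempting.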

Hywan (Member):

Thanks for the explanation. Makes sense to me.

@bnjbvr bnjbvr force-pushed the bnjbvr/persisted-event-cache-waits-for-token branch from aa2ed02 to 9c1e119 Compare February 27, 2025 09:13
@bnjbvr bnjbvr merged commit 4742aa2 into main Feb 27, 2025
41 checks passed
@bnjbvr bnjbvr deleted the bnjbvr/persisted-event-cache-waits-for-token branch February 27, 2025 09:28