
Create tests to measure performance #42

Open
troyhunt opened this issue Apr 28, 2023 · 2 comments
@troyhunt
Contributor

This is related to #41, but I thought I'd break it out into its own issue as it's a discrete unit of work. Without tests, it's very hard to tell whether even a minor tweak to a regex (or similar) is detrimental to performance. I'm not sure of the best way to do this, but we need at least a rough order-of-magnitude test suite that can pick up on this sort of thing. The change referenced in the aforementioned issue has hurt perf by nearly 100x (33 emails per second in the 300s report versus 3,091 before the change), so it shouldn't be hard to detect something that varies perf by that much. Hopefully we can then put together a more finely tuned measurement model and keep chipping away at the best perf stats we've achieved.
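A rough order-of-magnitude check like the one described above could be sketched as a throughput measurement: run the extraction regex over a fixed batch of lines for a fixed wall-clock budget and report lines per second. This is a minimal sketch in Python, assuming a hypothetical stand-in pattern (`EMAIL_RE`); the project's actual regex and harness are not shown in this issue.

```python
import re
import time

# Hypothetical email pattern -- a stand-in for the project's real regex,
# which is not shown in this issue.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def measure_throughput(lines, seconds=1.0):
    """Return the approximate number of lines scanned per second.

    Repeatedly scans `lines` with EMAIL_RE until `seconds` of wall-clock
    time have elapsed, then divides lines processed by elapsed time.
    """
    processed = 0
    start = time.perf_counter()
    deadline = start + seconds
    while time.perf_counter() < deadline:
        for line in lines:
            EMAIL_RE.findall(line)
            processed += 1
    return processed / (time.perf_counter() - start)

if __name__ == "__main__":
    # Synthetic input; real runs would use a representative sample file.
    sample = [f"user{i}@example.com:hunter{i}" for i in range(1000)]
    print(f"{measure_throughput(sample):,.0f} lines/sec")
```

A regression gate could then compare this number against a stored baseline and fail if throughput drops by more than an order of magnitude, which would comfortably catch a ~100x slowdown like the one above.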

@jaimevisser

It would be interesting to have some lifelike dataset to run tests against.
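Short of a real dataset, a deterministic synthetic one keeps benchmark runs comparable across commits. This is a sketch under the assumption that the input resembles `email:password` lines; all values are fabricated, and the helper name is hypothetical.

```python
import random
import string

def synthetic_lines(n, seed=42):
    """Generate n deterministic, fabricated 'email:password' lines.

    A fixed seed makes successive benchmark runs use identical input,
    so throughput numbers are comparable between commits.
    """
    rng = random.Random(seed)
    domains = ["example.com", "example.org", "mail.test"]
    alphabet = string.ascii_lowercase + string.digits
    lines = []
    for _ in range(n):
        local = "".join(rng.choices(alphabet, k=rng.randint(5, 12)))
        password = "".join(rng.choices(string.ascii_letters + string.digits,
                                       k=rng.randint(8, 16)))
        lines.append(f"{local}@{rng.choice(domains)}:{password}")
    return lines
```

Because the generator is seeded, the same corpus can be rebuilt on any machine without shipping a data file alongside the tests.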

@troyhunt
Contributor Author

troyhunt commented May 1, 2023

I actually generated a set of sample data back in #15 which I've now added to the readme. For brevity, it's here: https://mega.nz/file/Ls8U1ADK#c1We1C_CZi44P0k3OB8YpNVN7HMM3gE_4-fH06E454c
