Utility for checking whether a metric passes the threshold?
See original GitHub issue

When I first started using the library, I was a little surprised to see it didn’t implement a canonical helper for determining whether a metric passed the Core Web Vitals thresholds.
While our documentation on web.dev captures what the thresholds are, I can imagine there is value in a developer not needing to hard-code threshold values themselves (e.g. imagine if they got them subtly wrong).
I currently implement custom scoring logic in the WIP web vitals extension, but thought I’d throw this idea over the fence to see what you thought.
I’ll defer on the API shape if you think this would be helpful, but you could loosely imagine…
getCLS((result) => logCurrentValue('cls', result.value, result.passesThreshold, result.isFinal));
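For illustration, here is a minimal sketch of what such a helper could look like. The function name and shape are hypothetical rather than part of the web-vitals API, but the threshold values are the documented web.dev "good" limits (CLS ≤ 0.1, FID ≤ 100 ms, LCP ≤ 2,500 ms):

// Hypothetical sketch, not part of the web-vitals API.
// "Good" thresholds as documented on web.dev.
const GOOD_THRESHOLDS = {
  CLS: 0.1,  // cumulative layout shift score (unitless)
  FID: 100,  // first input delay, in milliseconds
  LCP: 2500, // largest contentful paint, in milliseconds
};

// Returns true if the metric's current value is within the "good" range.
function passesThreshold(name, value) {
  if (!(name in GOOD_THRESHOLDS)) throw new Error(`Unknown metric: ${name}`);
  return value <= GOOD_THRESHOLDS[name];
}

Keeping the thresholds in one vetted table like this is the main benefit the author is after: consumers compare against a single source of truth instead of hard-coding values at each call site.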
Issue Analytics
- State:
- Created 3 years ago
- Comments: 6 (6 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
How about a threshold set as an optional value for each metric? In the README, you could recommend an optimal value for developers to benchmark against, while still giving them the opportunity to change it when they prefer.
We could see this as a budget for those metrics, where the developer can customize the value they are pushing for.
This is an interesting idea! If it’s possible to add this without increasing the file size too much, I agree it’d be a nice addition. (It may take a slight refactor since so much of the reporting code is shared by all metrics).
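To make the budget idea from the first comment concrete, here is a rough consumer-side sketch using the getCLS callback style from the issue. The function name and the budget parameter are hypothetical, not the library's API; the defaults simply mirror the web.dev "good" values:

import { getCLS } from 'web-vitals';

// Hypothetical defaults mirroring the web.dev "good" thresholds.
const DEFAULT_BUDGETS = { CLS: 0.1, FID: 100, LCP: 2500 };

// Report a metric against either the default budget or a custom one.
function reportAgainstBudget(metric, budget = DEFAULT_BUDGETS[metric.name]) {
  const passes = metric.value <= budget;
  console.log(`${metric.name}: ${metric.value} (budget ${budget}) ${passes ? 'pass' : 'fail'}`);
}

// Using the documented "good" CLS budget:
getCLS((metric) => reportAgainstBudget(metric));

// Pushing for a custom, looser budget instead:
getCLS((metric) => reportAgainstBudget(metric, 0.25));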