Metric is an object that records data points and calculates statistical values. Developers use metrics to track system behavior across multiple merit runs and make data-driven assertions about performance, accuracy, or other measurable properties.
Using Metric enables:
- Recording assertion results automatically as True/False values
- Calculating statistics (mean, std, percentiles) on collected data
- Tracking metrics at different scopes (session, suite, case)
- Composing metrics via dependency injection for hierarchical analysis
- Generating quality reports with measurable insights
Basic Usage
The most common pattern is to define a metric as a generator function, yield a Metric instance, and then use the metrics() context manager to automatically track assertion results.
Recording assertions with selected metrics
When you use with metrics(metric1, metric2):, any assertions inside that block are automatically recorded:
- Passing assertions record True
- Failing assertions record False
- The recorded bool values are available in metric1.raw_values and metric2.raw_values
Metric Properties
The Metric class provides statistical calculations on demand:
- Basic stats: len, sum, min, max, mean, median
- Variability: variance, std, pvariance, pstd
- Percentiles: p25, p50, p75, p90, p95, p99, or percentiles for p1-p99
- Confidence intervals: ci_90, ci_95, ci_99
- Distributions: counter (frequency counts), distribution (proportions)
- Raw data: raw_values (all recorded values)
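A sketch of how a few of these properties could be computed with Python's standard library; the property names follow the list above, but the implementation is an assumption, not the framework's actual code:

```python
import statistics
from collections import Counter

class Metric:
    """Illustrative sketch: on-demand statistics over recorded values."""
    def __init__(self):
        self.raw_values = []

    def record(self, value):
        self.raw_values.append(value)

    @property
    def mean(self):
        return statistics.fmean(self.raw_values)

    @property
    def std(self):
        return statistics.stdev(self.raw_values)  # sample standard deviation

    @property
    def p95(self):
        # quantiles(n=100) returns the 99 cut points p1..p99
        return statistics.quantiles(self.raw_values, n=100)[94]

    @property
    def counter(self):
        return Counter(self.raw_values)  # frequency counts

    @property
    def distribution(self):
        total = len(self.raw_values)
        return {v: count / total for v, count in self.counter.items()}

latency = Metric()
for value in (12, 15, 11, 14, 90, 13, 12, 15, 11, 14):
    latency.record(value)

print(latency.mean)              # 20.7
print(latency.distribution[12])  # 0.2
```

Computing statistics lazily in properties keeps record() cheap; the cost is paid only when a report actually asks for a value.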
Recording Data Manually
While metrics() automatically records assertion results, you can also record data manually:
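For example, recording timing data by hand; the record method name follows the sketches in this doc and is an assumption about the API:

```python
import time

class Metric:
    """Illustrative stand-in: manual recording of arbitrary values."""
    def __init__(self):
        self.raw_values = []

    def record(self, value):
        self.raw_values.append(value)

latency_ms = Metric()
for _ in range(3):
    start = time.perf_counter()
    sum(range(10_000))                                    # work under measurement
    latency_ms.record((time.perf_counter() - start) * 1000)

print(len(latency_ms.raw_values))  # 3
```

Manual recording is how non-boolean data (latencies, token counts, scores) reaches the same statistical machinery that assertion results use.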
Scopes: Session, Suite, Case
Metrics can be scoped to different lifecycle levels. This enables tracking both local statistics (per merit case) and global statistics (across all merits).
- "session": One metric instance for the entire merit run (default)
- "suite": One instance per merit file/module
- "case": New instance for each parametrized merit case
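One way such scoping could work internally, sketched as a registry that keys metric instances by scope; the MetricRegistry name and keying scheme are assumptions for illustration, not the framework's internals:

```python
class Metric:
    def __init__(self):
        self.raw_values = []

class MetricRegistry:
    """Sketch: hand out one Metric instance per (name, scope key)."""
    def __init__(self):
        self._instances = {}

    def get(self, name, scope, suite=None, case=None):
        if scope == "session":
            key = (name, "session")            # one instance for the whole run
        elif scope == "suite":
            key = (name, "suite", suite)       # one instance per merit file/module
        else:  # "case"
            key = (name, "case", suite, case)  # new instance per merit case
        return self._instances.setdefault(key, Metric())

registry = MetricRegistry()
a = registry.get("accuracy", "session", suite="s1", case="c1")
b = registry.get("accuracy", "session", suite="s2", case="c2")
c = registry.get("accuracy", "case", suite="s1", case="c1")
d = registry.get("accuracy", "case", suite="s1", case="c2")

print(a is b)  # True  (session scope: shared across the run)
print(c is d)  # False (case scope: fresh instance per case)
```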
Composite Metrics via Dependency Injection
Metrics can depend on other metrics, enabling hierarchical analysis. This pattern is useful for tracking components separately while calculating aggregate statistics.
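A sketch of the parent-child idea, with the parent receiving its children through the constructor; all names here are illustrative assumptions:

```python
class Metric:
    """Illustrative sketch: a metric that can aggregate injected children."""
    def __init__(self, children=()):
        self.children = list(children)
        self._own_values = []

    def record(self, value):
        self._own_values.append(value)

    @property
    def raw_values(self):
        # Parent view: its own values plus everything the children recorded
        values = list(self._own_values)
        for child in self.children:
            values.extend(child.raw_values)
        return values

retrieval = Metric()
generation = Metric()
overall = Metric(children=[retrieval, generation])  # dependency injection

retrieval.record(True)
retrieval.record(False)
generation.record(True)

print(len(overall.raw_values))      # 3
print(sum(overall.raw_values) / 3)  # aggregate pass rate
```

Each component metric keeps its own statistics, while the parent reads through to all of them for the aggregate view.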
Recommendations
1. Use metrics() for automatic assertion tracking
The metrics() context manager automatically records assertion results, eliminating manual bookkeeping.
Don't record assertion outcomes by hand; ad-hoc bookkeeping is error-prone and easy to forget.
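As an illustration, a minimal sketch of the kind of ad-hoc bookkeeping this recommendation warns against, in contrast to the metrics() pattern shown earlier:

```python
# Anti-pattern: hand-rolled result tracking that metrics() makes unnecessary.
results = []
for expected, actual in [(1, 1), (2, 2), (2, 3)]:
    outcome = expected == actual
    results.append(outcome)  # manual recording, easy to forget
pass_rate = sum(results) / len(results)
print(round(pass_rate, 3))  # 0.667
```

A bare list like this gives none of the statistical properties, scoping, or report integration a Metric provides.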
2. Scope metrics appropriately for your analysis needs
Choose scope based on what you're measuring. Use case-level metrics for per-merit statistics and session-level metrics for aggregate analysis.
3. Build composite metrics for hierarchical insights
Use dependency injection to create parent-child metric relationships. This enables drilling down from aggregate metrics to specific failure modes.
4. Yield final values for report generation
After the metric completes, you can yield a final calculated value that will be captured in MetricResult for reports.
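A sketch of the generator protocol this implies: the first yield hands the Metric to the framework, and a second yield supplies the final value. The driver lines below stand in for framework internals and for the capture into MetricResult, which are not shown here:

```python
class Metric:
    def __init__(self):
        self.raw_values = []

    def record(self, value):
        self.raw_values.append(value)

def accuracy_metric():
    m = Metric()
    yield m                                      # handed to the framework
    yield sum(m.raw_values) / len(m.raw_values)  # final value for the report

# Hypothetical driver standing in for the framework:
gen = accuracy_metric()
metric = next(gen)                  # obtain the Metric instance
for outcome in (True, True, False, True):
    metric.record(outcome)          # assertion results land here
final_value = next(gen)             # resume to capture the final value

print(final_value)  # 0.75
```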
5. Use raw_values for custom calculations
Access raw_values to perform calculations beyond the built-in statistical properties.
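For instance, computing a statistic the property list above doesn't cover, such as the most frequent value; the Metric class here is the same illustrative stand-in used in the earlier sketches:

```python
class Metric:
    """Illustrative stand-in exposing raw recorded values."""
    def __init__(self):
        self.raw_values = []

    def record(self, value):
        self.raw_values.append(value)

scores = Metric()
for value in (3, 7, 7, 2, 9, 7):
    scores.record(value)

# Custom calculation over the raw data: the most frequent value
mode = max(set(scores.raw_values), key=scores.raw_values.count)
print(mode)  # 7
```

Because raw_values is a plain list, anything from NumPy to a one-off comprehension can consume it directly.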