Jest Runtime Tracking with Timed Tests
Runtime tracking is crucial for identifying performance bottlenecks in your tests and ensuring your application remains responsive. This challenge asks you to implement a Jest reporter that tracks the execution time of each test case and provides a summary report at the end of the test run, highlighting slow tests. This will help you optimize your tests and, by extension, your application's performance.
Problem Description
You need to create a custom Jest reporter that measures and reports the execution time of each test case within a Jest test suite. The reporter should:
- Track Start and End Times: Record the start and end times of each test case.
- Calculate Execution Time: Calculate the duration of each test case in milliseconds.
- Report Results: After all tests have completed, generate a summary report that includes:
  - The name of each test case.
  - The execution time of each test case in milliseconds.
  - The average execution time across all tests.
  - The slowest test case and its execution time.
- Console Output: Display the report in a clear and readable format in the Jest console.
- Integration: The reporter should seamlessly integrate with Jest's existing reporting system.
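The requirements above can be sketched as a small reporter core. The sketch below is a starting point, not a finished implementation: it uses simplified stand-in types instead of Jest's real ones so it can be read on its own, and the summarize method stands in for logic that would live in a real reporter's onRunComplete hook.

```typescript
// Simplified stand-in for Jest's per-test result type (the real one lives in
// @jest/test-result); only the fields this sketch needs.
interface TestCaseResult {
  title: string;
  duration?: number | null; // milliseconds; may be absent for instant tests
}

interface TimingEntry {
  name: string;
  duration: number;
}

class RuntimeReporter {
  private entries: TimingEntry[] = [];

  // In a real reporter, Jest invokes a hook like this after each test case.
  onTestCaseResult(result: TestCaseResult): void {
    // Missing or sub-millisecond durations are recorded as 0ms, not dropped.
    this.entries.push({ name: result.title, duration: result.duration ?? 0 });
  }

  // In a real reporter this logic would live in onRunComplete.
  summarize(): { entries: TimingEntry[]; average: number; slowest: TimingEntry | null } {
    const total = this.entries.reduce((sum, e) => sum + e.duration, 0);
    const average = this.entries.length > 0 ? total / this.entries.length : 0;
    let slowest: TimingEntry | null = null;
    for (const e of this.entries) {
      if (slowest === null || e.duration > slowest.duration) slowest = e;
    }
    return { entries: [...this.entries], average, slowest };
  }
}

// Simulated run with three hypothetical durations.
const reporter = new RuntimeReporter();
reporter.onTestCaseResult({ title: 'test1', duration: 50 });
reporter.onTestCaseResult({ title: 'test2', duration: 120 });
reporter.onTestCaseResult({ title: 'test3', duration: 30 });
const summary = reporter.summarize();
console.log('Jest Runtime Tracking Report:');
for (const e of summary.entries) console.log(`${e.name}: ${e.duration}ms`);
console.log(`Average Execution Time: ${summary.average.toFixed(2)}ms`); // → 66.67ms
if (summary.slowest !== null) {
  console.log(`Slowest Test: ${summary.slowest.name} (${summary.slowest.duration}ms)`);
}
```

Collecting results through a hook and deferring all aggregation to the end keeps the per-test overhead to a single array push, which matters for the large-suite constraint below.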
Expected Behavior:
When Jest runs with your custom reporter, it should output a summary report to the console after all tests have finished. The report should clearly display the execution time for each test, the average execution time, and the slowest test. The reporter should not interfere with the standard Jest output (pass/fail status of tests).
Edge Cases to Consider:
- Tests that pass instantly: Handle tests that complete in a very short time (e.g., less than 1ms) gracefully.
- Tests that fail: The reporter should still track and report the execution time of failed tests.
- Large test suites: Ensure the reporter's performance doesn't significantly impact the overall test run time.
- Asynchronous tests: Properly handle asynchronous tests (using async/await or Promises) to accurately measure their execution time.
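For the first edge case, one simple approach is to clamp tiny or missing durations to a readable placeholder when formatting the report. This is a sketch; the formatDuration name and the rounding policy are choices of this example, not part of the challenge.

```typescript
// Format a duration for the report; values below 1ms (or missing, which can
// happen for instantaneous tests) collapse to a "<1ms" placeholder.
function formatDuration(ms: number | null | undefined): string {
  if (ms == null || ms < 1) return '<1ms';
  return `${Math.round(ms)}ms`;
}

console.log(formatDuration(0.2));   // → <1ms
console.log(formatDuration(null));  // → <1ms
console.log(formatDuration(120.4)); // → 120ms
```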
Examples
Example 1:
Input: A Jest test suite 'test.spec.ts' containing three tests:
test('test1', () => { /* ... */ });
test('test2', async () => { /* ... */ });
test('test3', () => { /* ... */ });
Output:
Jest Runtime Tracking Report:
test1: 50ms
test2: 120ms
test3: 30ms
Average Execution Time: 66.67ms
Slowest Test: test2 (120ms)
Explanation: The reporter accurately tracks the execution time of each test and provides a summary report with the average and slowest test.
Example 2:
Input: A Jest test suite 'test.spec.ts' containing one test that fails:
test('test1', () => { throw new Error('Test failed'); });
Output:
Jest Runtime Tracking Report:
test1: 80ms
Average Execution Time: 80ms
Slowest Test: test1 (80ms)
Explanation: The reporter correctly tracks the execution time of the failing test and includes it in the report.
Constraints
- Language: TypeScript
- Jest Version: Compatible with Jest v27 or higher.
- Performance: The reporter's overhead should be minimal (ideally less than 5% of the total test run time).
- Output Format: The report should be easily readable in the console.
- No External Dependencies: Avoid using external libraries beyond those typically available in a Jest environment.
Notes
- You'll need to create a Jest reporter class. Custom reporters are plain classes registered via the reporters option in your Jest configuration; they implement lifecycle hooks rather than extending jest-runner's TestRunner.
- Use the onTestStart and onTestResult hooks (or onTestCaseResult for per-test-case granularity) to track test execution times; each test result also carries a duration field in milliseconds.
- Consider using console.table for a more visually appealing report.
- Think about how to handle asynchronous tests correctly to ensure accurate timing. Reporter hooks may return promises, and Jest will await them.
- Reporter type definitions are available in the @jest/reporters and @jest/types packages. Refer to the Jest documentation for more details on creating custom reporters.
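To wire the reporter into a test run, it is registered in the Jest configuration. The snippet below is an illustrative config fragment, not part of the challenge: the reporter path is a hypothetical example, and on Jest 27 you may need Config.InitialOptions from @jest/types instead of the Config type exported by newer versions of the jest package.

```typescript
// jest.config.ts — registering the custom reporter (illustrative only).
// Keeping 'default' in the list preserves Jest's standard pass/fail output;
// the second entry points at the reporter module (path is a placeholder,
// adjust it to your project layout).
import type { Config } from 'jest';

const config: Config = {
  reporters: ['default', '<rootDir>/runtime-reporter.js'],
};

export default config;
```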