/setup-tests
Automatically detect your project type and bootstrap a complete testing infrastructure with framework setup, configuration, example tests, and CI integration.
Quick Start
```
/agileflow:setup-tests FRAMEWORK=auto COVERAGE=yes
```
Purpose
This command creates a professional testing foundation by:
- Auto-detecting your project language and framework
- Installing appropriate testing framework dependencies
- Creating test configuration files
- Generating example unit, integration, and E2E tests
- Adding test scripts to your build configuration
- Integrating tests into CI/CD pipeline
- Creating testing documentation and best practices
- Running tests to verify the setup works
Parameters
| Parameter | Required | Default | Description |
|---|---|---|---|
| FRAMEWORK | No | auto | auto, jest, mocha, pytest, rspec, go-test, cargo-test |
| COVERAGE | No | yes | Enable coverage reporting: yes or no |
| E2E | No | no | Include E2E tests: yes or no |
Examples
Auto-Detect and Setup
```
/agileflow:setup-tests
```
Detects Node.js/Python/Ruby/Go/Rust and installs the appropriate framework.
Force Specific Framework
```
/agileflow:setup-tests FRAMEWORK=jest COVERAGE=yes
```
Uses Jest for testing with coverage enabled.
Include E2E Tests
```
/agileflow:setup-tests E2E=yes
```
Sets up unit, integration, AND end-to-end tests (Playwright for web apps).
Python Project
```
/agileflow:setup-tests FRAMEWORK=pytest
```
Installs pytest with coverage and creates example tests.
Project Detection
The command automatically detects your tech stack:
| Tech Stack | Framework | Detected By |
|---|---|---|
| Node.js | Jest | package.json |
| Node.js (older) | Mocha | package.json |
| Python | pytest | requirements.txt, pyproject.toml |
| Ruby | RSpec | Gemfile |
| Go | go test | go.mod |
| Rust | cargo test | Cargo.toml |
| Java | JUnit | pom.xml, build.gradle |
| .NET | xUnit/NUnit | *.csproj |
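The detection logic can be sketched as a lookup over marker files. This is an illustrative sketch, not the command's actual implementation; the `MARKERS` mapping and `detectFramework` name are assumptions based on the table above:

```typescript
import { existsSync } from 'fs';
import { join } from 'path';

// Marker files mapped to frameworks, mirroring the detection table above.
const MARKERS: Array<[string, string]> = [
  ['package.json', 'jest'],
  ['requirements.txt', 'pytest'],
  ['pyproject.toml', 'pytest'],
  ['Gemfile', 'rspec'],
  ['go.mod', 'go-test'],
  ['Cargo.toml', 'cargo-test'],
];

// Returns the framework for the first marker file found in the project root,
// or null when no known marker is present.
export function detectFramework(root: string): string | null {
  for (const [file, framework] of MARKERS) {
    if (existsSync(join(root, file))) {
      return framework;
    }
  }
  return null;
}
```

In practice the first match wins, so ordering matters when a project contains more than one marker file.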
Output Files
The command creates a complete testing setup:
Configuration Files
Jest (jest.config.js):
```js
module.exports = {
  preset: 'ts-jest',
  testEnvironment: 'node',
  coverageDirectory: 'coverage',
  collectCoverageFrom: [
    'src/**/*.{js,ts}',
    '!src/**/*.d.ts',
  ],
  coverageThreshold: {
    global: {
      branches: 70,
      functions: 70,
      lines: 70,
      statements: 70
    }
  }
};
```
pytest (pytest.ini):
```ini
[pytest]
testpaths = tests
python_files = test_*.py *_test.py
addopts = --cov=src --cov-report=html --cov-report=term
```
RSpec (.rspec):
```
--format documentation
--require spec_helper
```
Directory Structure
Creates organized test directories:
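A typical layout, assuming Jest with E2E enabled (exact directories vary by framework and flags; this sketch matches the example tests shown below):

```
tests/
├── unit/          # Isolated function/class tests
├── components/    # UI component tests (React projects)
├── integration/   # API and multi-module tests
└── e2e/           # Playwright specs (when E2E=yes)
```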
Example Tests
Unit Test (tests/unit/example.test.ts):
```ts
describe('Example Test Suite', () => {
  it('should pass this example test', () => {
    expect(true).toBe(true);
  });

  it('should test basic math', () => {
    expect(2 + 2).toBe(4);
  });
});
```
Component Test (tests/components/Button.test.tsx):
```tsx
import { render, screen, fireEvent } from '@testing-library/react';
import Button from '@/components/Button';

describe('Button Component', () => {
  it('renders with text', () => {
    render(<Button>Click Me</Button>);
    expect(screen.getByText('Click Me')).toBeInTheDocument();
  });

  it('calls onClick when clicked', () => {
    const handleClick = jest.fn();
    render(<Button onClick={handleClick}>Click</Button>);
    fireEvent.click(screen.getByText('Click'));
    expect(handleClick).toHaveBeenCalledTimes(1);
  });
});
```
Integration Test (tests/integration/api.test.ts):
```ts
import request from 'supertest';
import app from '@/app';

describe('API Integration Tests', () => {
  it('GET / should return 200', async () => {
    const response = await request(app).get('/');
    expect(response.status).toBe(200);
  });

  it('POST /api/users should create user', async () => {
    const response = await request(app)
      .post('/api/users')
      .send({ name: 'Test User', email: 'test@example.com' });
    expect(response.status).toBe(201);
    expect(response.body).toHaveProperty('id');
  });
});
```
E2E Test (tests/e2e/login.spec.ts):
```ts
import { test, expect } from '@playwright/test';

test('user can log in', async ({ page }) => {
  await page.goto('http://localhost:3000/login');
  await page.fill('input[name="email"]', 'test@example.com');
  await page.fill('input[name="password"]', 'password123');
  await page.click('button[type="submit"]');
  await expect(page).toHaveURL('http://localhost:3000/dashboard');
});
```
NPM Scripts
Updated package.json:
```json
{
  "scripts": {
    "test": "jest",
    "test:watch": "jest --watch",
    "test:coverage": "jest --coverage",
    "test:unit": "jest tests/unit",
    "test:integration": "jest tests/integration",
    "test:e2e": "playwright test"
  }
}
```
CI Integration
Adds a test job to .github/workflows/ci.yml:
```yaml
test:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - name: Setup Node
      uses: actions/setup-node@v4
      with:
        node-version: '20'
        cache: 'npm'
    - name: Install dependencies
      run: npm ci
    - name: Run tests
      run: npm test -- --coverage
    - name: Upload coverage
      uses: codecov/codecov-action@v3
      with:
        files: ./coverage/lcov.info
```
Documentation
Creates docs/02-practices/testing.md:
```markdown
# Testing Guide

## Running Tests

npm test               # Run all tests
npm run test:watch     # Watch mode
npm run test:coverage  # With coverage report
npm run test:unit      # Unit tests only

## Writing Tests

### Unit Tests
- Test individual functions/classes in isolation
- Mock external dependencies
- Fast (<10ms per test)

### Integration Tests
- Test multiple components together
- Use real dependencies when possible
- Medium speed (<100ms per test)

### E2E Tests
- Test full user flows
- Run against real app
- Slow (seconds per test)

## Coverage Requirements
- Minimum 70% coverage (enforced in CI)
- New code should be 90%+ covered
- Critical paths require 100% coverage

## Best Practices
- Use descriptive test names (Given/When/Then)
- One assertion per test when possible
- Avoid test interdependence
- Use factories/fixtures for test data
```
Test Types
Unit Tests
- What: Test individual functions/classes in isolation
- Speed: Fast (<10ms each)
- Mocking: Yes, mock dependencies
- Coverage: Most code should have unit tests
Example: Test a utility function that calculates user age.
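For instance, an age-calculation utility reduces to a pure function that is trivial to assert against. The `calculateAge` helper below is hypothetical (not part of the generated setup); in the generated suite, the assertions in the comments would live inside a jest `describe`/`it` block in tests/unit:

```typescript
// Hypothetical utility under test: counts only completed years,
// subtracting one if this year's birthday hasn't happened yet.
function calculateAge(birthDate: Date, now: Date): number {
  const hadBirthday =
    now.getMonth() > birthDate.getMonth() ||
    (now.getMonth() === birthDate.getMonth() && now.getDate() >= birthDate.getDate());
  return now.getFullYear() - birthDate.getFullYear() - (hadBirthday ? 0 : 1);
}

// In tests/unit/age.test.ts the jest spec would assert, for example:
//   expect(calculateAge(new Date(1990, 5, 15), new Date(2024, 5, 14))).toBe(33);
//   expect(calculateAge(new Date(1990, 5, 15), new Date(2024, 5, 15))).toBe(34);
```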
Integration Tests
- What: Test multiple components working together
- Speed: Medium (<100ms each)
- Mocking: Minimal, use real components
- Coverage: Critical workflows and API routes
Example: Test that authentication flow works end-to-end with database.
E2E Tests (Optional)
- What: Test complete user flows in real application
- Speed: Slow (seconds per test)
- Mocking: None, real application
- Coverage: Critical user journeys only
Example: User signup → login → view dashboard flow.
Coverage Thresholds
Default thresholds (reasonable, not perfectionist):
- Branches: 70% - All decision paths covered
- Functions: 70% - All functions called
- Lines: 70% - All lines executed
- Statements: 70% - All statements run
Why not 100%?
- 100% coverage doesn't guarantee 100% correctness
- Some code is hard to test (UI, edge cases)
- Diminishing returns after 70-80%
- Focus on critical paths instead
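Critical paths can be held to a stricter bar without raising the global numbers: Jest's `coverageThreshold` accepts per-path or glob keys alongside `global`. The `./src/payments/` path below is an illustrative assumption:

```js
// jest.config.js fragment: global floor at 70%, full coverage for one critical module.
coverageThreshold: {
  global: { branches: 70, functions: 70, lines: 70, statements: 70 },
  './src/payments/': { branches: 100, functions: 100, lines: 100, statements: 100 }
}
```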
Running Tests
During Development
```
npm run test:watch
```
Auto-runs tests when files change.
Before Commit
```
npm test
npm run test:coverage
```
Full test suite with coverage check.
In CI
Automatically runs on every push:
```
npm test -- --coverage --ci
```
Workflow
The setup follows these steps:
1. Detect Language/Runtime
   - Looks for package.json, Gemfile, go.mod, etc.
   - Determines appropriate framework
2. Check Existing Setup
   - Scans for test directories (test/, tests/, __tests__/)
   - Checks for test config files
   - Detects CI configuration
3. Show Setup Plan

   ```
   Will install:
   - jest, @types/jest, ts-jest
   - @testing-library/react

   Will create:
   - jest.config.js
   - tests/ directory structure
   - Example tests

   Will update:
   - package.json (test scripts)
   - .github/workflows/ci.yml (test job)

   Proceed? (YES/NO)
   ```
4. Install Dependencies
   - npm install, pip install, bundle install, etc.
5. Create Configuration
   - Framework-specific config files
6. Generate Examples
   - Unit test example
   - Integration test example
   - E2E test example (if requested)
7. Update Scripts and CI
   - Add test commands
   - Integrate with existing workflow
8. Run Tests
   - Verify setup works
   - Show coverage report
9. Create Documentation
   - Testing guide
   - Best practices
   - Coverage expectations
Next Steps
After setup completes:
1. Run Tests Locally

   ```
   npm test
   npm run test:watch
   ```
2. Write First Tests
   - Start with unit tests for utilities
   - Add integration tests for APIs
   - Later add E2E tests for critical flows
3. Enable Coverage Checks
   - CI requires minimum coverage
   - Merge blocked if coverage drops
4. Monitor Coverage Trends
   - Use codecov.io or similar
   - Track coverage over time
   - Celebrate improvements
Best Practices
- Write Tests Early - TDD or alongside features
- Keep Tests Fast - Mock slow operations
- Use Descriptive Names - Test name explains what it tests
- One Assert Per Test - When possible, makes failures clear
- DRY in Tests - Use fixtures, factories, setup/teardown
- Test Behavior - Not implementation details
- Avoid Flakiness - No timing issues, random failures
- Clean Up - Reset state between tests
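The factory/fixture practice above can be sketched as a small builder that supplies valid defaults and lets each test override only what it cares about. The `User` shape and `buildUser` name are hypothetical; adapt the fields to your models:

```typescript
// Hypothetical model shape for illustration.
interface User {
  id: number;
  name: string;
  email: string;
}

let nextId = 1;

// Builds a valid user; tests override only the fields under test.
export function buildUser(overrides: Partial<User> = {}): User {
  const id = nextId++;
  return {
    id,
    name: 'Test User',
    email: `user${id}@example.com`,
    ...overrides,
  };
}
```

A test then reads `buildUser({ name: 'Alice' })` instead of repeating a full object literal, which keeps fixtures DRY and failures focused on the overridden field.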
Related Commands
- /ci-setup - Set up CI/CD workflow
- /setup-deployment - Configure deployment
- /packages - Manage test dependencies