# Security Best Practices Standards for Jest
This document outlines security best practices for writing Jest tests. Following these guidelines reduces the risk of introducing vulnerabilities into your codebase, especially within test environments that are often overlooked in traditional security audits. It covers common pitfalls and secure testing patterns, explaining WHY certain practices matter for overall security and how to implement them effectively within Jest.
## 1. General Security Principles for Jest Tests
Security in testing isn't just about finding bugs; it's about preventing them by writing tests that don't inadvertently introduce vulnerabilities. These are the foundational principles:
* **Principle of Least Privilege:** Tests should only have the permissions they absolutely need. Avoid running tests with elevated privileges unless necessary.
* **Input Validation:** Test inputs, _especially_ those coming from external sources (mocks, fixtures), should be rigorously validated to prevent test poisoning or unexpected behavior.
* **Secure Secrets Management:** Never hardcode sensitive information (API keys, passwords) in tests. Use environment variables or secure configuration management techniques.
* **Dependency Hygiene:** Keep Jest and its dependencies up to date to patch known vulnerabilities. Regularly scan dependencies for security issues.
* **Test Isolation:** Ensure tests are isolated from each other. A compromised test should not affect other tests or the system under test. A configuration sketch follows this list.
* **Defense in Depth:** Security should be layered. Don't rely on a single security measure; implement multiple safeguards.
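Several of these principles can be reinforced directly in Jest's configuration. The following is a minimal, illustrative "jest.config.js" sketch; the option values are suggestions, not requirements:
"""javascript
// jest.config.js
module.exports = {
  testEnvironment: 'node', // Avoid jsdom unless the code under test actually needs a DOM.
  clearMocks: true,        // Clear mock call history between tests so state cannot leak.
  resetMocks: true,        // Reset mock implementations between tests.
  restoreMocks: true,      // Restore spied-on originals after each test (test isolation).
  testTimeout: 5000,       // Fail runaway tests instead of letting them hang (defense in depth).
};
"""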
## 2. Avoiding Common Vulnerabilities in Jest
Jest tests, like any other code, can be susceptible to vulnerabilities. Here's how to address some common ones:
### 2.1. Test Poisoning
Test poisoning occurs when malicious or unexpected input to tests causes them to fail or behave erratically, potentially masking real bugs or even destabilizing the test environment.
**Do This:**
* **Explicitly Define Valid Input Ranges:** Always define the expected range and format of test inputs.
* **Use Mocking Frameworks Safely:** When mocking external dependencies, define the exact expected input and output to prevent unexpected behavior with malformed data. Validate the incoming parameters in the mock implementation where possible.
* **Sanitize and Validate Data:** Before using data in assertions, sanitize it as you would in production code.
**Don't Do This:**
* **Use Unvalidated Input:** Don't use external or generated data directly in tests without validation.
* **Assume Input Type:** Don't assume the type or format of input data without explicit checks.
* **Allow Side Effects in Mocking:** Avoid mocks with side effects that could destabilize the test suite. A test should only impact the system under test, nothing else.
**Example:**
"""javascript
// Vulnerable code: Assuming 'id' is always a number
test('Deletes a user with the given id', async () => {
const id = request.params.id; // Assumes a mocked 'request' object and a jest.fn() 'deleteUser' are provided by the test setup.
await deleteUser(id);
expect(deleteUser).toHaveBeenCalledWith(id);
});
// Secure code: Validating the 'id'
test('Deletes a user with the given id (validated)', async () => {
const id = request.params.id;
if (!/^\d+$/.test(id)) {
throw new Error('Invalid ID format'); // Or fail the test gracefully. This prevents attempting to delete a user with an invalid id.
}
await deleteUser(id);
expect(deleteUser).toHaveBeenCalledWith(id);
});
// Secure code with mock validation: Validating parameter sent to a mocked function
test('Creates a new user with valid input', async () => {
const mockCreateUser = jest.fn();
const userData = { name: 'John Doe', email: 'john.doe@example.com' };
// Validate data before sending it to the function
if (typeof userData.name !== 'string' || userData.name.length === 0) {
throw new Error('Name must be a non-empty string')
}
if (typeof userData.email !== 'string' || !userData.email.includes('@')) {
throw new Error('Invalid email address')
}
mockCreateUser.mockImplementation((user) => {
// Parameter to the mock is also validated to prevent vulnerabilities.
if (typeof user.name !== 'string' || user.name.length === 0) {
throw new Error('Invalid user name passed to mock');
}
return Promise.resolve({ id: 123, ...user });
});
const newUser = await mockCreateUser(userData);
expect(mockCreateUser).toHaveBeenCalledWith(userData); // Validating that the mocked function was called.
expect(newUser.name).toBe(userData.name);
});
"""
**Why:** Validating inputs prevents unexpected data from compromising the test environment or leading to false positives/negatives. Mock validation ensures that your mock behaves as expected when fed invalid data.
### 2.2. Information Disclosure
Tests can inadvertently leak sensitive information, such as API keys, database passwords, or internal system details.
**Do This:**
* **Use Environment Variables:** Store sensitive information in environment variables and access them in tests using "process.env".
* **Avoid Logging Secrets:** Do not log sensitive data to the console or store it in test reports.
* **Use Secure Fixtures:** If using fixtures, ensure they don't contain real sensitive data. Use synthetic data instead.
* **Redact Sensitive Data in Snapshots:** When using snapshot testing, redact sensitive data from snapshots using custom serializers.
**Don't Do This:**
* **Hardcode Secrets:** Never hardcode API keys, passwords, or other sensitive information directly in test files.
* **Commit Secrets to Repositories:** Make sure environment variable files and any files containing secrets are added to ".gitignore".
* **Expose Sensitive Data in Logs:** Avoid logging responses or requests containing sensitive information.
**Example:**
"""javascript
// Vulnerable code: Hardcoding API key
test('Fetches data from API', async () => {
const apiKey = 'YOUR_API_KEY'; // DON'T DO THIS!
const data = await fetchData(apiKey);
expect(data).toBeDefined();
});
// Secure code: Using environment variables
test('Fetches data from API using environment variables', async () => {
const apiKey = process.env.API_KEY;
if (!apiKey) {
throw new Error('API_KEY environment variable not set');
}
const data = await fetchData(apiKey);
expect(data).toBeDefined();
});
// jest.config.js (or package.json jest configuration)
// setupFilesAfterEnv: ['<rootDir>/src/setupTests.js'], // Optional: for global environment setup
// Secure secrets management using "dotenv":
// setupTests.js -- this file must be configured in the jest.config.js
require('dotenv').config({ path: '.env.test' }); // Load env variables in a test environment
//.env.test (Example)
//API_KEY=secure_test_api_key
//Redacting sensitive data in Jest Snapshot Serializers
//In jest.config.js (or package.json jest configuration), add (or update) the snapshotSerializers array:
// snapshotSerializers: ["<rootDir>/src/redactSecrets.js"],
//redactSecrets.js
exports.test = (val) => val && val.hasOwnProperty('authorization'); // Check if the property to redact exists
exports.print = (val, serialize) => {
const redactedValue = '[REDACTED]';
const newObj = { ...val, authorization: redactedValue };
return serialize(newObj);
};
"""
**Why:** Preventing information disclosure protects sensitive data from being exposed and potentially exploited. Using environment variables and secure configuration management ensures that secrets are not stored in code repositories. Snapshot redaction ensures credentials and other secrets aren't persisted in snapshots.
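As an alternative to registering the serializer globally through "snapshotSerializers", the same module can be registered from a setup file with "expect.addSnapshotSerializer". A minimal sketch reusing the "redactSecrets.js" module above (the relative path is an assumption):
"""javascript
// setupTests.js (loaded via setupFilesAfterEnv in jest.config.js)
const redactSecrets = require('./redactSecrets');

// Register the redacting serializer for every test file that loads this setup module.
expect.addSnapshotSerializer(redactSecrets);
"""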
### 2.3. Denial of Service (DoS)
Malicious tests could consume excessive resources (CPU, memory, disk space), leading to a denial of service. This is especially important in CI/CD environments.
**Do This:**
* **Limit Test Data Size:** Avoid using excessively large datasets in tests to prevent memory exhaustion.
* **Set Test Timeouts:** Configure Jest's "testTimeout" option to prevent tests from running indefinitely.
* **Control Parallelism:** Limit the number of tests running in parallel to avoid overloading the system. Adjust the "maxWorkers" configuration.
* **Monitor Resource Usage:** Monitor resource usage during test execution to identify tests that consume excessive resources. A monitoring sketch appears at the end of this section.
**Don't Do This:**
* **Unbounded Loops:** Don't write tests with infinite loops or unbounded recursion.
* **Excessive Memory Allocation:** Avoid allocating large amounts of memory unnecessarily.
* **Uncontrolled External Requests:** Avoid making a large number of unchecked external requests in tests.
**Example:**
"""javascript
// Vulnerable code: Unbounded loop (simulated)
test('Processes a large dataset', () => {
const largeDataset = generateLargeDataset(); // Huge dataset
largeDataset.forEach(item => {
processItem(item); // Could be slow or inefficient
});
}, 60000); // Even with a timeout, this is problematic for large datasets
// Secure code: Limiting data size and using timeouts
test('Processes a sample dataset', () => {
const sampleDataset = generateSampleDataset(100); // Limited dataset
sampleDataset.forEach(item => {
processItem(item);
});
}, 5000); // Timeout for excessive processing time
// jest.config.js
module.exports = {
testTimeout: 5000, // Global timeout for all tests (milliseconds)
maxWorkers: 4, // Limit the number of parallel test workers
};
"""
**Why:** Protecting against DoS attacks ensures that tests don't consume excessive resources or destabilize the test environment. Timeouts, data size limits, and controlled parallelism help mitigate these risks.
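For the "Monitor Resource Usage" point above, a lightweight option is to record heap growth per test from a setup file. The sketch below is illustrative; tune the threshold and reporting mechanism to your suite:
"""javascript
// setupTests.js (loaded via setupFilesAfterEnv)
let heapBefore;

beforeEach(() => {
  heapBefore = process.memoryUsage().heapUsed;
});

afterEach(() => {
  const growthMb = (process.memoryUsage().heapUsed - heapBefore) / (1024 * 1024);
  if (growthMb > 100) { // Illustrative threshold.
    console.warn('Heap grew by ' + growthMb.toFixed(1) + ' MB during this test');
  }
});
"""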
### 2.4. Code Injection
If a test manipulates strings that are later interpreted as code (e.g., using "eval" or "Function"), it can be vulnerable to code injection attacks. While uncommon in Jest, improper mocking can create this scenario.
**Do This:**
* **Avoid "eval" and "Function":** Never use "eval" or "Function" to dynamically execute code in tests unless absolutely necessary.
* **Sanitize Untrusted Data:** If you must use "eval" or "Function", sanitize any untrusted data before passing it to these functions.
* **Careful Mocking:** Ensure that mocked return values do not include any executable code.
**Don't Do This:**
* **Dynamically Construct Code:** Avoid dynamically constructing code strings from user-controlled input.
* **Execute Untrusted Data:** Never execute untrusted data as code.
**Example:**
"""javascript
// Vulnerable code: Using eval with potentially malicious input
test('Processes data with a dynamic function', () => {
const maliciousInput = '}; console.log("Hacked!"); //'; // Example of malicious input
const funcString = 'function process(data) { return ' + maliciousInput + '}';
eval(funcString); // POTENTIAL CODE INJECTION
const result = process({ value: 10 });
expect(result).toBe(10);
});
// Secure code: Avoiding eval and using safe alternatives (if possible)
test('Processes data with a safe calculation', () => {
const inputData = { value: 10 };
const result = calculateResult(inputData.value); // Assuming calculateResult performs validation and is safe
expect(result).toBeGreaterThan(0);
});
function calculateResult(value) {
if (typeof value !== 'number') {
throw new Error('Invalid input: Value must be a number');
}
return value * 2; // Simple and safe operation without dynamic code execution
}
"""
**Why:** Preventing code injection attacks ensures that malicious code cannot be executed within the test environment. Avoid using "eval" and always sanitize input if dynamic code execution is unavoidable. Keeping executable code out of mocked return values closes the same gap on the mocking side.
### 2.5. Dependency Vulnerabilities
Jest projects often rely on numerous dependencies, including testing libraries, mocking tools, and assertion frameworks. These dependencies may contain security vulnerabilities that can be exploited. One common attack vector is through transitive dependencies - dependencies of your direct dependencies.
**Do This:**
* **Regularly Audit Dependencies:** Use tools like "npm audit" or "yarn audit" to identify known vulnerabilities in your project's dependencies. Set up automated dependency scanning in your CI/CD pipeline using tools like Snyk or GitHub Dependabot. A CI gate sketch appears at the end of this section.
* **Keep Dependencies Up-to-Date:** Update your project's dependencies regularly to patch known vulnerabilities. Use semantic versioning (semver) to manage dependency updates and minimize the risk of introducing breaking changes.
* **Use Specific Dependency Versions:** Avoid using wildcard or range-based version specifications in your "package.json" file. Pin specific dependency versions to ensure that your project uses a consistent, known-good set of dependencies.
* **Review Dependency Licenses:** Ensure that the licenses of your project's dependencies are compatible with your project's licensing terms and policies. Be aware of any restrictions or obligations associated with the use of open-source dependencies.
* **Minimize Transitive Dependencies:** Reduce the number of transitive dependencies in your project by using direct dependencies that have fewer dependencies themselves. Consider using lighter-weight alternatives to reduce your project's dependency footprint.
* **Use a Software Bill of Materials (SBOM):** Generate and maintain an SBOM for your project to provide a comprehensive inventory of all your dependencies, including direct and transitive dependencies. Use the SBOM to track and manage dependency vulnerabilities and license compliance.
**Don't Do This:**
* **Ignore Dependency Audits:** Don't ignore warnings or errors reported by dependency auditing tools. Address identified vulnerabilities promptly by updating dependencies or applying patches.
* **Use Outdated Dependencies:** Don't continue using outdated versions of dependencies that have known security vulnerabilities.
* **Blindly Update Dependencies:** Don't blindly update all dependencies to the latest versions without testing and verifying compatibility. Thoroughly test your project after updating dependencies to ensure that no regressions or breaking changes are introduced.
* **Use Unverified Dependencies:** Don't use dependencies from untrusted sources or repositories. Only use dependencies that have been scanned for vulnerabilities and verified to be safe.
**Example:**
"""bash
# Run npm audit to identify vulnerabilities in your project's dependencies
npm audit
# Run yarn audit to identify vulnerabilities in your project's dependencies
yarn audit
# Update a specific dependency to the latest version
npm update <package-name>
# Install a patched version of a specific dependency
npm install <package-name>@<patched-version>
# Generate an SBOM (CycloneDX format)
npm install -g @cyclonedx/bom
cyclonedx-bom  # Writes a CycloneDX-format SBOM
"""
**Why:** Regularly auditing, updating, and managing dependencies is crucial for maintaining a secure Jest testing environment. Addressing dependency vulnerabilities proactively prevents potential exploits that could compromise your tests and expose sensitive data.
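One way to enforce the audit in CI without extra services is a small Node script that fails the build on high-severity findings. This is a sketch: the script path is hypothetical and the field layout is assumed from npm's JSON audit output, so adjust it for your npm version.
"""javascript
// scripts/audit-gate.js (hypothetical path) -- run with "node scripts/audit-gate.js" in CI
const { execSync } = require('child_process');

let report;
try {
  report = execSync('npm audit --json', { encoding: 'utf8' });
} catch (err) {
  // npm audit exits non-zero when it finds vulnerabilities; the JSON report is still on stdout.
  report = err.stdout;
}

// Severity counts assumed to live under metadata.vulnerabilities in the JSON report.
const counts = JSON.parse(report).metadata.vulnerabilities;
const blocking = (counts.high || 0) + (counts.critical || 0);

if (blocking > 0) {
  console.error('Dependency audit failed: ' + blocking + ' high/critical findings.');
  process.exit(1);
}
console.log('Dependency audit passed.');
"""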
## 3. Secure Coding Patterns in Jest
Using secure coding patterns in Jest tests helps to create more robust and reliable tests, and reduces the risk of introducing vulnerabilities.
### 3.1. Mocking Strategies
Mocking external dependencies is a common practice in Jest to isolate the system under test and ensure repeatable test results. However, improper mocking can introduce security vulnerabilities.
**Do This:**
* **Validate Mocked Methods:** Ensure mocks return expected data types and structures.
* **Scope Mocks:** Only mock what's necessary for the current test. Avoid global mocks that could affect other tests unexpectedly.
* **Use Mock Implementations:** Use Jest's "mockImplementation" or "mockResolvedValue" to define the exact behavior of mocked functions, including error handling. Consider using "spyOn" to patch a method and assert the calls if you want to run real code. A "spyOn" sketch appears at the end of this section.
**Don't Do This:**
* **Overshadow Modules:** Don't mock entire modules unless necessary, as this can mask underlying issues with the system under test.
* **Return Non-Deterministic Output:** Avoid mocks that return random or non-deterministic data, as this can lead to flaky tests and hide vulnerabilities.
* **Ignore Error Conditions:** Don't create mocks that always succeed, ignoring potential error conditions in the real system. This can lead to false positives if the code isn't correctly handling errors.
**Example:**
"""javascript
// Vulnerable code: Mocking without validation
jest.mock('./externalService', () => ({
fetchData: jest.fn(() => Promise.resolve({ data: 'Some data' })), // No validation of return type.
}));
// Secure code: Mocking with validation
jest.mock('./externalService', () => ({
fetchData: jest.fn(() => {
const data = { data: 'Some data' };
if (typeof data.data !== 'string') {
throw new Error('Invalid data type');
}
return Promise.resolve(data);
}),
}));
//Scoping mocks:
describe('User creation', () => {
it('Creates a user successfully', async () => {
const createUser = jest.fn(() => Promise.resolve({ id: 1, name: 'Test User' }));
// "createUser" is scoped to this test case and does not affect other test cases.
const user = await createUser({ name: 'Test User' });
expect(user).toEqual({ id: 1, name: 'Test User' });
});
it('Handles errors when user creation fails', async () => {
const createUser = jest.fn(() => Promise.reject(new Error('User creation failed')));
// "createUser" is scoped to this test case and does not affect other test cases.
await expect(createUser({ name: 'Test User' })).rejects.toThrow('User creation failed');
});
});
"""
**Why:** Secure mocking prevents tests from being misled by unexpected behavior from mocked dependencies. Validating mock outputs and scoping mocks ensure test reliability and reduce the risk of masking vulnerabilities.
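The "spyOn" approach mentioned above lets the real implementation run while still asserting and restoring the call. A minimal sketch, with "userService" as a hypothetical module:
"""javascript
const userService = require('./userService'); // Hypothetical module exposing a real fetchProfile method.

test('Fetches a profile through the real service method', async () => {
  const spy = jest.spyOn(userService, 'fetchProfile');

  const profile = await userService.fetchProfile(42);

  expect(spy).toHaveBeenCalledWith(42);
  expect(typeof profile.name).toBe('string'); // Validate the shape, not just that something returned.

  spy.mockRestore(); // Restore the original method so other tests are unaffected.
});
"""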
### 3.2. Assertion Strategies
The way assertions are written can impact test effectiveness and security. Overly permissive or poorly written assertions can mask errors.
**Do This:**
* **Precise Assertions:** Use precise assertions that check for specific values or states, rather than generic assertions that could pass even with errors.
* **Error Handling Assertions:** Test both success and failure scenarios and assert that errors are handled correctly.
* **Boundary Condition Assertions:** Test boundary conditions and edge cases to ensure that the system handles unusual inputs correctly. A "test.each" sketch appears at the end of this section.
**Don't Do This:**
* **Tautological Assertions:** Avoid assertions that always pass regardless of the system's behavior (e.g., "expect(true).toBe(true)").
* **Ignoring Errors:** Don't ignore errors or exceptions during test execution. Always assert that expected errors are thrown.
* **Asserting Unimportant Details:** Don't focus assertions on trivial implementation details. Check functionality rather than exact data format.
**Example:**
"""javascript
// Vulnerable code: Generic assertion
test('Processes data', async () => {
const result = await processData();
expect(result).toBeDefined(); // Too generic; doesn't check the actual data
});
// Secure code: Precise assertion
test('Processes data and returns the correct value', async () => {
const result = await processData();
expect(result).toBe(123); // Checks for a specific, expected value
});
test('Handles invalid input gracefully', async () => {
await expect(processData(null)).rejects.toThrow('Invalid input'); // Assertion of a thrown error in a promise rejection
});
test('Handles invalid input gracefully (sync)', () => {
expect(() => processDataSync(null)).toThrow('Invalid input'); // Assertion of a thrown error in synchronous code
});
"""
**Why:** Precise assertions ensure that tests accurately verify the system's behavior. Error handling assertions help identify cases where errors are not handled correctly, potentially leading to vulnerabilities.
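For boundary conditions, "test.each" keeps the edge cases explicit and easy to extend. A sketch, assuming a simple "clamp(value, min, max)" helper defined purely for illustration:
"""javascript
// Illustrative helper under test.
function clamp(value, min, max) {
  if (typeof value !== 'number' || Number.isNaN(value)) {
    throw new Error('Invalid input');
  }
  return Math.min(Math.max(value, min), max);
}

test.each([
  [-1, 0],    // just below the lower bound
  [0, 0],     // exactly on the lower bound
  [100, 100], // exactly on the upper bound
  [101, 100], // just above the upper bound
])('clamp(%i, 0, 100) returns %i', (input, expected) => {
  expect(clamp(input, 0, 100)).toBe(expected);
});

test('clamp rejects non-numeric input', () => {
  expect(() => clamp('10', 0, 100)).toThrow('Invalid input');
});
"""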
### 3.3. Test Data Management
How test data is created and managed is critical to the reliability and security of tests.
**Do This:**
* **Data Generation:** If you need sample user data, use a library such as "@faker-js/faker" to generate it.
* **Test-Specific Data:** Create data that is specific to the test being run.
* **Sanitize Data:** Sanitize any imported dataset, just as you would production input, so the data itself cannot introduce vulnerabilities into the test suite.
**Don't Do This:**
* **Use Real User PII:** Don't use real personally identifiable information (PII) in test data; generate synthetic values instead.
* **Leave Behind "Backdoors":** Avoid creating test-specific endpoints or bypasses; they can accidentally ship in your production code.
**Example:**
"""javascript
const { faker } = require('@faker-js/faker');
test('can POST a valid user', async () => {
const newUser = {
firstName: faker.person.firstName(),
lastName: faker.person.lastName(),
email: faker.internet.email(),
}
const result = await postNewUser(newUser);
expect(newUser.email).toContain('@');
});
"""
**Why:** Generated, test-specific data keeps tests independent of external data sources, so they are less likely to break. Keeping real PII out of the suite means a leak through the testing infrastructure cannot expose actual user data.
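Seeding the generator makes the synthetic data deterministic across runs, which avoids flaky tests while still keeping real PII out of the suite. A short sketch:
"""javascript
const { faker } = require('@faker-js/faker');

beforeEach(() => {
  // Same seed, same generated values: failures reproduce locally.
  faker.seed(42);
});

test('generates a deterministic synthetic user', () => {
  const user = {
    firstName: faker.person.firstName(),
    lastName: faker.person.lastName(),
    email: faker.internet.email(),
  };
  expect(user.email).toContain('@');
});
"""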
## 4. Test Environment Security
Securing the test environment is as crucial as securing the production environment. A compromised test environment can lead to vulnerabilities being introduced into the codebase.
### 4.1. Isolation and Sandboxing
Ensure that tests are isolated from each other and the external environment to prevent interference and security breaches.
**Do This:**
* **Use Containerization:** Run tests in isolated containers (e.g., Docker) to prevent them from affecting the host system.
* **Virtualization:** Use virtualization technologies (e.g., VMs) to create separate test environments for different projects or test suites.
* **Mock External Dependencies:** Mock external dependencies (e.g., databases, APIs) to prevent tests from interacting with real systems.
**Don't Do This:**
* **Run Tests Directly on Production Systems:** Don't run tests directly on production systems, as this can lead to data corruption or service interruptions.
* **Share Test Environments:** Don't share test environments between different projects or teams, as this can lead to conflicts and security vulnerabilities.
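Jest's "globalSetup" and "globalTeardown" hooks are one way to wire an isolated, disposable resource (for example, an in-memory test database) into the run. A minimal sketch; the module paths and environment variable are hypothetical:
"""javascript
// jest.config.js
module.exports = {
  globalSetup: '<rootDir>/test/globalSetup.js',       // hypothetical path
  globalTeardown: '<rootDir>/test/globalTeardown.js', // hypothetical path
};

// test/globalSetup.js
module.exports = async () => {
  // Start an isolated test resource here (e.g., an in-memory database)
  // and expose its connection details through environment variables.
  process.env.TEST_DB_URI = 'memory://isolated-test-db'; // illustrative value
};

// test/globalTeardown.js
module.exports = async () => {
  // Stop the isolated resource so nothing outlives the test run.
};
"""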
### 4.2. Access Control
Restrict access to the test environment and test data to authorized personnel only.
**Do This:**
* **Use Role-Based Access Control (RBAC):** Implement RBAC to control access to test resources based on user roles and responsibilities.
* **Multi-Factor Authentication (MFA):** Enforce MFA for access to the test environment to prevent unauthorized access.
* **Regularly Review Access Permissions:** Review access permissions regularly to ensure that they are still appropriate and that no unauthorized users have access to the test environment.
**Don't Do This:**
* **Use Default Credentials:** Don't use default credentials for the test environment, as these are easily compromised.
* **Grant Excessive Permissions:** Don't grant excessive permissions to users, as this can increase the risk of unauthorized access and data breaches.
### 4.3. Monitoring and Logging
Implement monitoring and logging to detect and respond to security incidents in the test environment.
**Do This:**
* **Implement Security Monitoring:** Implement security monitoring to detect suspicious activity in the test environment, such as unauthorized access attempts or data exfiltration.
* **Centralized Logging:** Configure centralized logging to collect and analyze logs from all components of the test environment. A custom reporter sketch appears at the end of this section.
* **Security Incident Response Plan:** Develop a security incident response plan to guide the response to security incidents in the test environment.
**Don't Do This:**
* **Disable Logging:** Don't disable logging in the test environment, as this can make it difficult to detect and respond to security incidents.
* **Ignore Security Alerts:** Don't ignore security alerts from monitoring systems, as these may indicate a security incident.
**Example:**
"""bash
#Example of restricting access to the test databases:
# Create a dedicated user for tests with limited privileges
CREATE USER 'test_user'@'localhost' IDENTIFIED BY 'test_password';
# Grant only the necessary permissions
GRANT SELECT, INSERT, UPDATE, DELETE ON test_database.* TO 'test_user'@'localhost';
# Revoke unnecessary privileges to further limit access
REVOKE ALL PRIVILEGES ON test_database.* FROM 'test_user'@'localhost';
FLUSH PRIVILEGES;
"""
**Why:** Securing the test environment prevents attackers from using it as an entry point into the codebase. Access restrictions, monitoring, and isolation together provide layered protection for both the tests and the systems they touch.
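For the centralized-logging point in section 4.3, a custom Jest reporter can forward run summaries to your log pipeline. A minimal sketch; the logging destination is a stand-in:
"""javascript
// securityAuditReporter.js -- enable via reporters: ['default', '<rootDir>/securityAuditReporter.js']
class SecurityAuditReporter {
  onRunComplete(testContexts, results) {
    const summary = {
      numTotalTests: results.numTotalTests,
      numFailedTests: results.numFailedTests,
      startTime: results.startTime,
    };
    // Replace console with a call to your centralized logging system.
    console.log('[test-audit] ' + JSON.stringify(summary));
  }
}

module.exports = SecurityAuditReporter;
"""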
By following these guidelines, you can create more secure and reliable Jest tests that help to protect your codebase from vulnerabilities. Remember that security is an ongoing process, and it requires continuous vigilance and improvement. These standards should be regularly reviewed and updated to reflect the latest security threats and best practices.
# Component Design Standards for Jest This document outlines the coding standards for designing components in Jest. These standards aim to promote reusable, maintainable, and testable components, taking into account modern Jest practices. ## 1. Component Structure and Organization ### 1.1. Standard: Single Responsibility Principle (SRP) **Do This:** Design components that have a single, well-defined responsibility. A component should do one thing and do it well. **Don't Do This:** Create "god components" that handle multiple unrelated tasks. **Why:** Components adhering to SRP are easier to understand, test, and reuse. Changes to one component are less likely to impact other parts of the application. **Example:** """jsx // Good: Button component only handles button-related logic function Button({ onClick, children, disabled }) { return ( <button onClick={onClick} disabled={disabled}> {children} </button> ); } // Bad: Component handling button logic and authentication function MultiPurposeComponent({ onClick, children, disabled, authenticate }) { const handleClick = () => { if (authenticate()) { onClick(); } } return ( <button onClick={handleClick} disabled={disabled}> {children} </button> ); } """ ### 1.2. Standard: Component Composition **Do This:** Favor composition over inheritance. Build complex components by composing smaller, reusable components. **Don't Do This:** Create deep inheritance hierarchies that are difficult to understand and maintain. **Why:** Composition allows for greater flexibility and reusability. It avoids the rigid structure imposed by inheritance. **Example:** """jsx // Good: Composing a Layout component with other components function Layout({ children }) { return ( <Header /> {children} <Footer /> ); } // Bad: Inheritance leading to complex component hierarchy class BaseComponent extends React.Component { // ... common logic } class ExtendedComponent extends BaseComponent { // ... more logic } """ ### 1.3. Standard: Component Naming Conventions **Do This:** Use descriptive and consistent naming conventions for components and their props. Aim for clarity and readability. Use PascalCase for components. **Don't Do This:** Use vague or ambiguous names. Avoid abbreviations or acronyms unless they are widely understood within the team. **Why:** Clear naming conventions improve the readability and maintainability of the codebase. **Example:** """jsx // Good: function UserProfileCard({ user, onEdit }) { return ( {user.name} <button onClick={onEdit}>Edit</button> ); } // Bad: function UPC({ u, oe }) { return ( {u.name} <button onClick={oe}>Edit</button> ); } """ ### 1.4. Standard: Directory Structure **Do This:** Organize components into a logical directory structure. Consider grouping related components into modules or folders. Utilize an "index.js" file to expose components from a module. **Don't Do This:** Create a flat directory structure with all components in a single folder. **Why:** A well-organized directory structure improves navigation and maintainability, especially in larger projects. **Example:** """ src/ components/ Button/ Button.jsx Button.test.jsx index.js Card/ Card.jsx Card.test.jsx index.js """ "src/components/Button/index.js": """javascript export { default as Button } from './Button'; """ ## 2. Component Props and Data Flow ### 2.1. Standard: Explicit Prop Types **Do This:** Use prop types to define the expected type, shape, and requirements for each prop. Use TypeScript or PropTypes for explicit type checking. 
**Don't Do This:** Rely on implicit type inference or skip prop type validation altogether. **Why:** Prop types catch errors early, improve code readability, and provide a clear contract for how components should be used. TypeScript is generally preferred for larger projects. **Example (TypeScript):** """tsx interface Props { name: string; age?: number; // Optional onClick: () => void; } function UserCard({ name, age, onClick }: Props) { return ( {name} {age && <p>Age: {age}</p>} <button onClick={onClick}>Click Me</button> ); } """ **Example (PropTypes - Less preferred):** """javascript import PropTypes from 'prop-types'; function UserCard({ name, age, onClick }) { return ( {name} {age && <p>Age: {age}</p>} <button onClick={onClick}>Click Me</button> ); } UserCard.propTypes = { name: PropTypes.string.isRequired, age: PropTypes.number, onClick: PropTypes.func.isRequired, }; """ ### 2.2. Standard: Immutable Data **Do This:** Treat component props as immutable data. Avoid modifying props directly within the component. If data needs to be modified, create a local copy or use state. **Don't Do This:** Mutate props directly, as this can lead to unpredictable behavior and rendering issues. **Why:** Immutable data improves performance, simplifies debugging, and prevents unintended side effects. **Example:** """jsx // Good: Creating a local copy of the prop function NameDisplay({ name }) { const [localName, setLocalName] = React.useState(name); const handleChange = (e) => { setLocalName(e.target.value); }; return ( <input type="text" value={localName} onChange={handleChange} /> <p>Original Name: {name}</p> ); } // Bad: Mutating the prop directly function NameDisplay({ name }) { const handleChange = (e) => { name = e.target.value; // Avoid this! }; return <input type="text" onChange={handleChange} />; } """ ### 2.3. Standard: Controlled vs. Uncontrolled Components **Do This:** Decide whether a component should be controlled or uncontrolled based on its use case. Use controlled components when you need tight control over user input and data flow. Use Uncontrolled components for simpler form elements where you don't need to manage every change. **Don't Do This:** Mix and match controlled and uncontrolled component patterns within the same form or UI element without a clear understanding of the consequences. **Why:** Understanding the controlled vs uncontrolled pattern is critical for managing data flow in React. **Example (Controlled):** """jsx function ControlledInput() { const [value, setValue] = React.useState(''); const handleChange = (e) => { setValue(e.target.value); }; return <input type="text" value={value} onChange={handleChange} />; } """ **Example (Uncontrolled):** """jsx function UncontrolledInput() { const inputRef = React.useRef(null); const handleSubmit = () => { alert("Value: ${inputRef.current.value}"); }; return ( <input type="text" ref={inputRef} /> <button onClick={handleSubmit}>Submit</button> ); } """ ### 2.4. Standard: Avoiding Prop Drilling **Do This:** Avoid passing props through multiple layers of components that don't need them ("prop drilling"). Use context, state management libraries (Redux, Zustand, Jotai), or component composition to provide data where it's needed. **Don't Do This:** Create deeply nested prop chains that make it difficult to track the flow of data. **Why:** Prop drilling increases complexity and reduces the maintainability of the codebase. 
**Example:** """jsx // Good: Using Context import React, { createContext, useContext, useState } from 'react'; const ThemeContext = createContext(); function ThemeProvider({ children }) { const [theme, setTheme] = useState('light'); const toggleTheme = () => { setTheme(theme === 'light' ? 'dark' : 'light'); }; return ( <ThemeContext.Provider value={{ theme, toggleTheme }}> {children} </ThemeContext.Provider> ); } function ThemedComponent() { const { theme, toggleTheme } = useContext(ThemeContext); return ( Current theme: {theme} <button onClick={toggleTheme}>Toggle Theme</button> ); } function App() { return ( <ThemeProvider> <ThemedComponent /> </ThemeProvider> ); } // Bad: Prop Drilling function App() { const [theme, setTheme] = useState('light'); return <Layout theme={theme} setTheme={setTheme} />; } function Layout({ theme, setTheme }) { return <Content theme={theme} setTheme={setTheme} />; } function Content({ theme, setTheme }) { return <ThemedComponent theme={theme} setTheme={setTheme} />; } function ThemedComponent({ theme, setTheme }) { return ( Current theme: {theme} <button onClick={() => setTheme(theme === 'light' ? 'dark' : 'light')}> Toggle Theme </button> ); } """ ## 3. Component State Management ### 3.1. Standard: Local State vs. Global State **Do This:** Use local state for component-specific data that doesn't need to be shared across the application. Use global state management (Context, Redux, etc.) for data that needs to be accessed and modified by multiple components. **Don't Do This:** Overuse global state for data that could be managed locally. Avoid managing local state with global tools. **Why:** Proper state management improves performance and reduces complexity. Understanding when to use each approach is essential. **Example (Local State):** """jsx function Counter() { const [count, setCount] = React.useState(0); return ( Count: {count} <button onClick={() => setCount(count + 1)}>Increment</button> ); } """ **Example (Global State with Context):** """jsx import React, { createContext, useContext, useState } from 'react'; const CountContext = createContext(); function CountProvider({ children }) { const [count, setCount] = useState(0); return ( <CountContext.Provider value={{ count, setCount }}> {children} </CountContext.Provider> ); } function Counter() { const { count, setCount } = useContext(CountContext); return ( Count: {count} <button onClick={() => setCount(count + 1)}>Increment</button> ); } function DisplayCount() { const { count } = useContext(CountContext); return <p>Current Count: {count}</p> } function App() { return ( <CountProvider> <Counter /> <DisplayCount /> </CountProvider> ); } """ ### 3.2. Standard: State Updates **Do This:** When updating state based on the previous state, use the functional form of "setState" to avoid race conditions and ensure that you're working with the correct previous state. Prefer "useState" hook whenever possible. **Don't Do This:** Directly modify the state object. Rely on the implicit behavior of "setState". **Why:** Using the functional form of "useState" or older "setState" guarantees that you are always working with the most up-to-date state. 
**Example:** """jsx // Good: Using the functional form of useState function Counter() { const [count, setCount] = React.useState(0); const increment = () => { setCount((prevCount) => prevCount + 1); }; return ( Count: {count} <button onClick={increment}>Increment</button> ); } """ ### 3.3 Standard: Minimizing State Variables **Do This:** Aim to derive state where possible instead of storing redundant information in multiple state variables. Calculate values using functions or memoization rather than duplicating data within the component’s state. **Don't Do This:** Create multiple state variables if a single variable and computed properties can achieve the same result. Avoid redundancy by keeping the component state lean and focused on the essential data. **Why:** Reducing the number of state variables simplifies the component logic and makes it easier to manage updates. **Example:** """jsx // Good: Deriving full name from first and last names function UserProfile({ firstName, lastName }) { const fullName = "${firstName} ${lastName}"; return ( Full Name: {fullName} ); } // Bad: Storing full name as a separate state function UserProfile({ firstName, lastName }) { const [fullName, setFullName] = React.useState("${firstName} ${lastName}"); React.useEffect(() => { setFullName("${firstName} ${lastName}"); }, [firstName, lastName]); return ( Full Name: {fullName} ); } """ ## 4. Component Rendering and Performance ### 4.1. Standard: Memoization **Do This:** Use "React.memo" to prevent unnecessary re-renders of pure functional components. Use "useMemo" and "useCallback" hooks to memoize expensive calculations and function references. **Don't Do This:** Overuse memoization, as it adds overhead. Only memoize components or values that are likely to cause performance bottlenecks. **Why:** Memoization can significantly improve performance by reducing the number of re-renders. **Example:** """jsx // Memoizing a pure functional component const MyComponent = React.memo(function MyComponent({ data }) { console.log('Rendering MyComponent'); return <div>{data.value}</div>; }); // Memoizing a value function MyComponentContainer({ data }) { const expensiveValue = React.useMemo(() => { // Perform an expensive calculation return data.value * 2; }, [data.value]); return <div>{expensiveValue}</div>; } //Memoizing a callback function MyComponentContainer({ onClick }) { const memoizedCallback = React.useCallback(() => { onClick(); }, [onClick]); return <button onClick={memoizedCallback}>Click me</button> } """ ### 4.2. Standard: Virtualization **Do This:** Use virtualization techniques (e.g., "react-window", "react-virtualized") to efficiently render large lists or tables. **Don't Do This:** Render all items in a large list at once, as this can lead to poor performance. **Why:** Virtualization only renders the visible items, improving performance for large datasets. **Example (react-window):** """jsx import { FixedSizeList } from 'react-window'; function Row({ index, style }) { return ( Row {index + 1} ); } function MyList() { return ( <FixedSizeList height={500} width={300} itemSize={50} itemCount={1000} > {Row} </FixedSizeList> ); } """ ### 4.3. Standard: Code Splitting **Do This:** Use code splitting to break down large bundles into smaller chunks that can be loaded on demand. Employ "React.lazy" and "Suspense" for component-level code splitting. **Don't Do This:** Load all code upfront, as this can increase initial load time. 
**Why:** Code splitting improves initial load time by only loading the code that is needed for the current page or component. **Example:** """jsx import React, { Suspense } from 'react'; const MyComponent = React.lazy(() => import('./MyComponent')); function App() { return ( <Suspense fallback={<div>Loading...</div>}> <MyComponent /> </Suspense> ); } """ ### 4.4. Standard: Avoiding Inline Styles and Functions **Do This:** Define styles in CSS files or use CSS-in-JS libraries. Define event handlers outside of the render method or use "useCallback" to memoize them. **Don't Do This:** Use inline styles and functions, as they can cause unnecessary re-renders and make it difficult to maintain styles. **Why:** Using external styles and memoizing event handlers improves performance and makes it easier to maintain styles. Defining event handlers within the render method creates a new function on every render, preventing "React.memo" from working correctly. **Example:** """jsx // Good: External styles and memoized event handler import styles from './MyComponent.module.css'; function MyComponent({ onClick }) { const handleClick = React.useCallback(() => { onClick(); }, [onClick]); return <button className={styles.button} onClick={handleClick}>Click Me</button>; } // Bad: Inline style and function function MyComponent({ onClick }) { return ( <button style={{ backgroundColor: 'blue', color: 'white' }} onClick={() => onClick()} > Click Me </button> ); } """ ## 5. Testing and Maintainability ### 5.1. Standard: Unit Testing **Do This:** Write unit tests for all components, focusing on testing the behavior of individual components in isolation. Use Jest and React Testing Library for testing. **Don't Do This:** Skip unit tests, as this can lead to regressions and make it difficult to maintain the codebase. **Why:** Unit tests verify that components are working correctly and provide a safety net for future changes. **Example:** """jsx // MyComponent.jsx function MyComponent({ message }) { return <div>{message}</div>; } export default MyComponent; // MyComponent.test.jsx import React from 'react'; import { render, screen } from '@testing-library/react'; import MyComponent from './MyComponent'; test('renders message', () => { render(<MyComponent message="Hello, world!" />); const messageElement = screen.getByText(/Hello, world!/i); expect(messageElement).toBeInTheDocument(); }); """ ### 5.2. Standard: Component Documentation **Do This:** Add detailed documentation comments (JSDoc style, or similar) to your React components, ensuring that each prop is thoroughly described. Additionally, keep the README.md file in the component's directory updated with usage examples and any specific considerations. **Don't Do This:** Neglect documenting component props and usage, assuming that other developers will understand the component's purpose and functionality without guidance. **Why:** Clear documentation is helpful internally with team members and when opensourcing. **Example:** """jsx /** * A simple button component. * * @param {Object} props - The component props. * @param {string} props.text - The text to display on the button. * @param {Function} props.onClick - The function to call when the button is clicked. * @param {boolean} [props.disabled=false] - Whether the button is disabled. * @returns {JSX.Element} A button element. */ function MyButton({ text, onClick, disabled = false }) { return ( <button onClick={onClick} disabled={disabled}> {text} </button> ); } export default MyButton; """ ### 5.3. 
Standard: Accessibility **Do This:** Make sure components are accessible to users with disabilities. Use semantic HTML, ARIA attributes, and keyboard navigation. Test components with accessibility testing tools. **Don't Do This:** Ignore accessibility considerations, as this can exclude users with disabilities. **Why:** Accessibility ensures that everyone can use the application, regardless of their abilities. **Example:** """jsx // Good: Using semantic HTML and ALT text function ImageWithAlt({ src, alt }) { return <img src={src} alt={alt} />; } // Bad: Missing ALT text function ImageWithoutAlt({ src }) { return <img src={src} alt="" />; // Avoid this } """ ### 5.4. Standard: Error Handling **Do This:** Implement robust error handling in components. Use try-catch blocks to catch errors, display error messages to the user, and log errors for debugging. **Don't Do This:** Ignore errors or allow them to crash the application. **Why:** Error handling prevents crashes and provides a better user experience. **Example:** """jsx function ApiComponent({ url }) { const [data, setData] = React.useState(null); const [error, setError] = React.useState(null); React.useEffect(() => { async function fetchData() { try { const response = await fetch(url); const json = await response.json(); setData(json); } catch (e) { setError(e); } } fetchData(); }, [url]); if (error) { return <div>Error: {error.message}</div>; } if (!data) { return <div>Loading...</div>; } return <div>Data: {JSON.stringify(data)}</div>; } """ These standards provide a solid foundation for designing components in Jest. By following these guidelines, developers can create reusable, maintainable, and testable components that contribute to the overall quality and success of the project.
# Deployment and DevOps Standards for Jest This document outlines the coding standards for Jest, focusing specifically on deployment and DevOps practices. These standards are designed to ensure maintainable, performant, and reliable tests in continuous integration/continuous deployment (CI/CD) pipelines and production environments. Adhering to these guidelines will enable teams to leverage Jest effectively in modern software development lifecycles. ## 1. Build Processes and CI/CD Integration ### 1.1. Standards for Build Configuration * **Do This:** Define explicit Jest configuration within your "package.json" or a dedicated "jest.config.js" file. * **Don't Do This:** Rely on implicit or undocumented default configurations. * **Why:** Explicit configuration ensures consistent behavior across different environments and allows for fine-tuning test execution. """json // package.json { "name": "my-project", "version": "1.0.0", "scripts": { "test": "jest --config jest.config.js" }, "devDependencies": { "jest": "^29.0.0" } } """ """javascript // jest.config.js module.exports = { testEnvironment: 'node', verbose: true, collectCoverage: true, coverageReporters: ['lcov', 'text'], }; """ ### 1.2. Standards for CI/CD Pipeline Integration * **Do This:** Integrate Jest tests into your CI/CD pipeline as a mandatory build step. * **Don't Do This:** Allow code to be merged without passing all Jest tests. * **Why:** Automated testing ensures that regressions are caught early, preventing broken code from reaching production. **Example (GitHub Actions):** """yaml # .github/workflows/ci.yml name: CI on: push: branches: [ main ] pull_request: branches: [ main ] jobs: build: runs-on: ubuntu-latest steps: - uses: actions/checkout@v3 - name: Use Node.js 18 uses: actions/setup-node@v3 with: node-version: 18 - run: npm ci - run: npm test - name: Upload coverage to Codecov uses: codecov/codecov-action@v3 with: token: ${{ secrets.CODECOV_TOKEN }} # Optional fail_ci_if_error: true """ ### 1.3. Standards for Parallel Test Execution * **Do This:** Utilize Jest's parallel test execution capabilities by adjusting the "maxWorkers" configuration. * **Don't Do This:** Run tests serially in CI/CD, especially for large test suites. * **Why:** Parallel execution significantly reduces test execution time, speeding up the CI/CD process. """javascript // jest.config.js module.exports = { testEnvironment: 'node', maxWorkers: '50%', // Use 50% of available CPU cores // OR // maxWorkers: 4, // Use 4 CPU cores }; """ In a containerized environment, "maxWorkers" often defaults to the number of vCPUs allocated to the container. Ensure that your container has adequate resources. ### 1.4. Standards for Caching * **Do This:** Enable Jest's caching mechanism to speed up subsequent test runs. Configure the "cacheDirectory" option. * **Don't Do This:** Disable caching without a valid reason, especially in CI environments after initial setup. * **Why:** Caching avoids re-transforming unchanged files, improving performance drastically on repeated runs. Note that cache invalidation is automatically handled based on file changes and dependency updates. """javascript // jest.config.js module.exports = { cacheDirectory: '.jest/cache', // Other configurations }; """ * **CI Considerations:** Store the cache between CI runs, e.g., using GitHub Actions' caching feature. 
"""yaml steps: - uses: actions/checkout@v3 - uses: actions/setup-node@v3 with: node-version: 18 - uses: actions/cache@v3 with: path: ~/.npm key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }} restore-keys: | ${{ runner.os }}-node- - run: npm ci """ ### 1.5. Standards for Test Reporting * **Do This:** Generate comprehensive test reports, including coverage reports, in formats suitable for CI/CD systems (e.g., JUnit XML, LCOV). * **Don't Do This:** Rely only on console output for test results. * **Why:** Structured reports facilitate analysis of test failures, coverage gaps, and long-term trends. """javascript // jest.config.js module.exports = { testEnvironment: 'node', reporters: ['default', 'jest-junit'], coverageReporters: ['lcov', 'text', 'cobertura'], coverageDirectory: '<rootDir>/coverage', // Configure jest-junit reporter "jest-junit": { "outputDirectory": "test-results/junit", "outputName": "junit.xml", } }; """ * Install "jest-junit": "npm install --save-dev jest-junit" **Integration with CI/CD:** CI systems like Jenkins or CircleCI can interpret these reports to provide visual dashboards. GitHub Actions can use the "test-results" action. """yaml - name: Upload test results uses: actions/upload-artifact@v3 if: always() with: name: test-results path: test-results/junit retention-days: 5 """ ### 1.6. Standards for Environment Variables * **Do This:** Use environment variables to configure Jest for different environments (e.g., CI, development, production). * **Don't Do This:** Hardcode environment-specific settings in your Jest configuration. * **Why:** Environment variables provide a flexible and secure way to adapt Jest’s behavior to various deployment contexts. """javascript // jest.config.js module.exports = { testEnvironment: 'node', testMatch: process.env.CI ? ['**/__tests__/**/*.[jt]s?(x)', '**/?(*.)+(spec|test).[jt]s?(x)'] : ['<rootDir>/src/**/*.test.js'], // conditionally change based on env variable }; """ * **CI/CD Example:** Inside a CI/CD pipeline configuration: """yaml jobs: test: runs-on: ubuntu-latest steps: - uses: actions/checkout@v3 - name: Run tests run: npm test env: CI: true # Set the CI environment variable """ ## 2. Production Considerations ### 2.1. Standards for Test Data Management * **Do This:** Use realistic but sanitized test data for integration and end-to-end tests. * **Don't Do This:** Use sensitive production data directly in tests. * **Why:** Protect real user data and avoid unintended side effects in production systems. **Example:** Mocking a database query with predefined data: """javascript // __mocks__/db.js const mockUsers = [ { id: 1, name: 'Test User 1' }, { id: 2, name: 'Test User 2' }, ]; module.exports = { getUsers: jest.fn(() => Promise.resolve(mockUsers)), }; """ """javascript // user.service.test.js jest.mock('./db'); import { getUsers } from './user.service'; import * as db from './db'; // Import the mocked module describe('UserService', () => { it('should return a list of users', async () => { const users = await getUsers(); expect(db.getUsers).toHaveBeenCalled(); // Verify that the mocked function was called expect(users).toEqual([ { id: 1, name: 'Test User 1' }, { id: 2, name: 'Test User 2' }, ]); }); }); """ ### 2.2. Standards for Environment Isolation * **Do This:** Ensure tests run in isolated environments to prevent interference with production systems. Use technologies like Docker or VMs. * **Don't Do This:** Run tests directly against production databases or APIs. 
* **Why:** Avoid data corruption, performance degradation, and security vulnerabilities. **Example (Docker):** Create a Dockerfile to define a test environment. """dockerfile # Dockerfile FROM node:18 WORKDIR /app COPY package*.json ./ RUN npm install COPY . . CMD ["npm", "test"] """ * Build and run the container in your CI/CD pipeline. """bash docker build -t my-test-env . docker run my-test-env """ ### 2.3. Standards for Feature Flags * **Do This:** Use feature flags to enable or disable new features in production, allowing for A/B testing and gradual rollout. Write tests that cover both states of the feature flag. * **Don't Do This:** Deploy code without feature flags for critical features. * **Why:** Feature flags provide a safety net, reducing the risk associated with deploying new code. **Example:** """javascript // feature-flags.js const flags = { newFeatureEnabled: process.env.NEW_FEATURE_ENABLED === 'true', }; export default flags; """ """javascript // component.test.js import MyComponent from './component'; import featureFlags from './feature-flags'; jest.mock('./feature-flags', () => ({ __esModule: true, // This is important for mocking ES modules default: { newFeatureEnabled: false, // Default mock value }, })); describe('MyComponent', () => { it('should render the old UI when the new feature is disabled', () => { featureFlags.newFeatureEnabled = false; const { container } = render(<MyComponent />); expect(container).toHaveTextContent('Old UI'); }); it('should render the new UI when the new feature is enabled', () => { // Update the mock feature flag value using jest.requireActual if needed. jest.resetModules(); jest.doMock('./feature-flags', () => ({ __esModule: true, default: { newFeatureEnabled: true, }, })); // Explicit mock const featureFlags = require('./feature-flags').default; // Re-require to get the fresh mock const { container } = render(<MyComponent />); expect(container).toHaveTextContent('New UI'); // Make assertions based on the new feature being enabled }); }); """ **Note:** "jest.resetModules()" and "jest.doMock" are essential here to ensure that the "featureFlags" module is re-evaluated with the updated mock value for the second test case. Also, make sure you mock the ES module correctly using "__esModule: true". ### 2.4. Standards for Performance Monitoring * **Do This:** Monitor the performance of your tests in CI/CD. Track metrics like test execution time, memory usage, and CPU utilization. * **Don't Do This:** Ignore performance regressions in your test suite. * **Why:** Performance bottlenecks in tests can impact the overall CI/CD pipeline and deployment speed. **Example:** Using "jest-circus" (the default runner): Use "jest.retryTimes()" with caution, as it can mask underlying issues. Address flaky tests instead of relying solely on retries. Measure test durations directly within your tests (less robust but sometimes necessary): """javascript // my.test.js describe('My Test Suite', () => { it('My Test', async () => { const start = performance.now(); // Perform test logic await new Promise(resolve => setTimeout(resolve, 500)); // Simulate some work const end = performance.now(); const duration = end - start; console.log("Test duration: ${duration}ms"); // Log duration for analysis expect(duration).toBeLessThan(1000); // Optional: Add an assertion }); }); """ Also, you should invest to custom reporters to report more detailed information for advanced statistics. ### 2.5. 
Standards for Rollback Strategies * **Do This:** Implement rollback strategies in case of failed deployments. This might involve reverting to a previous version or using feature flags to disable problematic features. * **Don't Do This:** Deploy changes without a clear plan for handling failures. * **Why:** Rollback strategies minimize the impact of deployment errors and ensure business continuity. **Example:** If a test fails after a deployment (e.g., a smoke test), automatically trigger a rollback in your CI/CD pipeline. ### 2.6. Standards for Smoke Tests * **Do This:** Implement smoke tests to quickly verify the basic functionality of your application after deployment. These tests should be fast and cover critical user flows. * **Don't Do This:** Rely solely on comprehensive integration tests for post-deployment verification, as they can be slow and may not catch critical issues quickly. * **Why:** Smoke tests provide early warning of deployment problems, allowing for quick intervention. """javascript // smoke.test.js describe('Smoke Tests', () => { it('should load the homepage', async () => { const response = await fetch('https://example.com'); expect(response.status).toBe(200); }); it('should allow users to log in', async () => { // Simulate a login attempt const response = await fetch('https://example.com/login', { method: 'POST', body: JSON.stringify({ username: 'test', password: 'password' }), }); expect(response.status).toBe(200); // Add more specific assertions based on the login response }); }); """ ## 3. Modern Approaches and Patterns ### 3.1. Snapshot Testing with Caution * **Do This:** Use snapshot testing to detect unexpected UI changes. Treat snapshots as code and review them carefully. * **Don't Do This:** Blindly update snapshots without understanding the underlying changes. * **Why:** Snapshot testing can be a powerful tool, but it requires careful review to avoid masking regressions. ### 3.2. Contract Testing * **Do This:** Implement contract tests to ensure that APIs and services adhere to agreed-upon contracts. Use tools like Pact or similar libraries. * **Don't Do This:** Neglect contract testing in microservices architectures. * **Why:** Contract testing prevents integration issues caused by API changes. ### 3.3. Property-Based Testing * **Do This:** Explore property-based testing to generate a wide range of inputs for your tests automatically. Use libraries like fast-check. * **Don't Do This:** Rely solely on example-based tests, which may not cover all edge cases. * **Why:** Property-based testing increases test coverage and helps uncover unexpected bugs. ### 3.4. Monitoring and Observability for Tests * **Do This:** Integrate test runs with monitoring and observability tools. Capture metrics like test execution time, CPU usage, and memory consumption. Send traces to understand complex test flows. * **Don't Do This:** Treat tests as a black box. * **Why:** Enhanced monitoring and observability provide insights into test performance and help identify bottlenecks. ### 3.5. Use Mock Service Worker (MSW) * **Do This**: Embrace Mock Service Worker (MSW) for mocking network requests during integration tests. This allows you to intercept and mock fetch or XHR requests at the network level, providing more realistic testing scenarios. * **Don't Do This**: Over-rely on simple "jest.fn()" mocks for network requests, especially when testing complex interactions. 
### 3.4. Monitoring and Observability for Tests

* **Do This:** Integrate test runs with monitoring and observability tools. Capture metrics like test execution time, CPU usage, and memory consumption. Send traces to understand complex test flows.
* **Don't Do This:** Treat tests as a black box.
* **Why:** Enhanced monitoring and observability provide insights into test performance and help identify bottlenecks.

### 3.5. Use Mock Service Worker (MSW)

* **Do This**: Embrace Mock Service Worker (MSW) for mocking network requests during integration tests. This allows you to intercept and mock fetch or XHR requests at the network level, providing more realistic testing scenarios.
* **Don't Do This**: Over-rely on simple "jest.fn()" mocks for network requests, especially when testing complex interactions.
* **Why**: MSW provides a more reliable and maintainable way to simulate API behavior in your tests, closely mimicking real-world conditions without hitting actual endpoints during testing. This decouples your tests from backend dependencies and makes them more predictable and faster.

**Example:** In Jest, use "setupServer" from "msw/node"; "setupWorker" from "msw" is browser-only and exposes "start()"/"stop()" rather than "listen()"/"close()".

"""javascript
// src/mocks/handlers.js
import { rest } from 'msw'

export const handlers = [
  rest.get('https://api.example.com/users', (req, res, ctx) => {
    return res(
      ctx.status(200),
      ctx.json([
        { id: 1, name: 'John Doe' },
        { id: 2, name: 'Jane Smith' },
      ])
    )
  }),
]
"""

"""javascript
// src/mocks/server.js
import { setupServer } from 'msw/node'
import { handlers } from './handlers'

// This configures a request mocking server (for Node/Jest) with the given request handlers.
export const server = setupServer(...handlers)
"""

"""javascript
// src/setupTests.js
import { server } from './mocks/server'
import '@testing-library/jest-dom'

// Establish API mocking before all tests.
beforeAll(() => server.listen())

// Reset any request handlers that we may add during the tests,
// so they don't affect other tests.
afterEach(() => server.resetHandlers())

// Clean up after the tests are finished.
afterAll(() => server.close())
"""

"""javascript
// component.test.js
import React from 'react';
import { render, screen, waitFor } from '@testing-library/react';
import UserList from './UserList'; // Assuming your component is named UserList

describe('UserList', () => {
  it('fetches and displays a list of users', async () => {
    render(<UserList />);

    // Wait for the data to load
    await waitFor(() => screen.getByText('John Doe')); // Or any other user's name

    // Assert that the users are displayed
    expect(screen.getByText('John Doe')).toBeInTheDocument();
    expect(screen.getByText('Jane Smith')).toBeInTheDocument();
  });
});
"""

## 4. Security best practices specific to Jest

### 4.1. Dependencies security checks

* **Do This:** Implement automated dependency security audits in your CI/CD pipeline using tools like "npm audit", "yarn audit" or "snyk".
* **Don't Do This:** Ignore security vulnerabilities reported by dependency audits.
* **Why:** Ensure that your project's dependencies don't contain known security vulnerabilities.

"""yaml
# GitHub Actions example
steps:
  - name: Run npm audit
    run: npm audit --audit-level high
"""

### 4.2. Input Sanitization in Tests

* **Do This:** Sanitize any external input used in your tests, especially when simulating user input or API responses.
* **Don't Do This:** Directly use untrusted input without validation, which could lead to injection attacks.
* **Why:** Prevent potential security exploits that might be triggered during testing, reflecting real-world vulnerabilities.

### 4.3. Avoid Exposing Sensitive Information

* **Do This:** Store sensitive information like API keys and credentials securely using environment variables or secrets management tools. Pass these secrets to your tests through environment variables.
* **Don't Do This:** Hardcode sensitive information directly in your test files or configuration.
* **Why:** Protect sensitive data from unauthorized access, especially in shared environments. A small sketch of consuming such a secret follows below.
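As a small illustration of passing secrets via environment variables, the hedged sketch below skips, rather than fails, when the secret is absent. "TEST_API_TOKEN" is an illustrative variable name; inject it through your CI secret store rather than committing it.

"""javascript
// authenticated.test.js -- illustrative pattern for consuming a secret from the environment.
const apiToken = process.env.TEST_API_TOKEN; // injected by CI, never hardcoded

// Skip the suite locally when the secret is not available.
const describeIfToken = apiToken ? describe : describe.skip;

describeIfToken('authenticated API calls', () => {
  it('builds an Authorization header from the environment', () => {
    const headers = { Authorization: `Bearer ${apiToken}` };
    expect(headers.Authorization).toMatch(/^Bearer .+/);
  });
});
"""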
* **Why:** "testResultsProcessor" is typically used for custom reporting or data transformation during testing and should not be included in production environments. ### 4.5. Evaluate Custom Reporters Carefully * **Do This:** When using custom reporters, ensure they are from trusted sources and thoroughly review their code. * **Don't Do This:** Blindly install and use custom reporters without understanding their functionality and potential security implications. * **Why:** Malicious custom reporters could potentially compromise your test environment or expose sensitive data. ### 4.6. Isolate test environment * **Do This:** Run your tests in an isolated environment, such as a Docker container, to protect against external interferences and unwanted side effects. Limit the container's access to network resources to only what is necessary. * **Don't Do This:** Give a tests container too much privileges as a compromised test environment could be used as a pivot point to attack other systems. * **Why:** Isolate tests to achieve more reliable, predictable and secure tests. ## 5. Conclusion By adhering to these deployment and DevOps standards, development teams can ensure that Jest tests are reliable, efficient, and secure. This will lead to faster CI/CD pipelines, reduced risk of production issues, and higher-quality software. Regularly review and update these standards to stay current with the latest Jest features and best practices.
# State Management Standards for Jest This document provides coding standards for managing state within Jest tests. These standards ensure tests are predictable, maintainable, and avoid common pitfalls associated with stateful testing. We'll cover approaches suitable for various state management patterns, including those used in React (Redux, Zustand, Context), Vue (Vuex, Pinia), and other JavaScript frameworks, focusing on the latest Jest features and best practices. ## 1. General Principles ### 1.1. Stateless Tests **Rule:** Tests should ideally be stateless to prevent interference and ensure consistent results. Each test should be independent and not rely on the state left by previous tests. **Do This:** * Reset the state before each test or suite. * Avoid modifying global variables directly within tests. * Use mock implementations or spies to control the behavior of dependencies. **Don't Do This:** * Rely on the order of test execution. * Persist state between tests without proper cleanup. **Why:** Stateful tests are prone to creating flaky tests, where the results can vary depending on environment and order of execution. This makes debugging difficult and reduces confidence in the test suite. ### 1.2. Clear State Initialization **Rule:** Explicitly define the initial state within your tests or mock implementations if the system under test relies on it. **Do This:** * Provide default values when mocking state. * Use "beforeEach" or "beforeAll" hooks to initialize state before running tests. * Document the state initialization process. **Don't Do This:** * Rely on implicit or undefined state. * Leave state initialization ambiguous or undocumented. **Why:** Clearity in state setup avoids assumptions that can lead to errors and inconsistencies. Documented initialization helps new developers understand the dependencies within the tested system. ### 1.3. Controlled Mutations **Rule:** When state needs to be mutated during a test, do so in a controlled and predictable manner. **Do This:** * Use controlled mutation functions on state objects. * Make assertions on the expected state after interaction. * Where applicable, test the reducer/state-altering function in isolation. **Don't Do This:** * Mutate state directly without proper tracking. * Make assumptions about the state after actions without explicitly asserting it. **Why:** Controlled mutations are crucial for verifying the system's behavior following state transition. Assertions help ensure the transition is correct. ## 2. Managing Application State ### 2.1. Mocking State Management Libraries (Redux, Zustand, Pinia etc.) **Rule**: When testing components or modules that interact with state management libraries, mock the library to control the state in a test environment and decouple your tests from the complexities of the actual state management setup. **Do This:** * Use "jest.mock()" to replace the actual state management instances with mock implementations. * Create getter and setter mock functions for the state within the mock implementation. * Implement "dispatch" or "update" function mocks that simulate state changes. **Don't Do This:** * Test the state management library itself. Focus on testing the component's behavior in response to the state changes. * Import the real store/state directly into your tests, which tightly couples the test to the real implementation. **Why:** Mocking decouples the test from the actual state management implementation, making tests faster, more resilient to changes, and easier to understand. 
**Example (Redux):** """javascript // myComponent.test.js import React from 'react'; import { render, screen, fireEvent } from '@testing-library/react'; import { Provider } from 'react-redux'; import configureStore from 'redux-mock-store'; // Install redux-mock-store! import MyComponent from './MyComponent'; // Mock the redux store const mockStore = configureStore([]); describe('MyComponent', () => { it('should update state on button click', () => { const initialState = { myReducer: { count: 0, } }; const store = mockStore(initialState); render( <Provider store={store}> <MyComponent /> </Provider> ); const button = screen.getByText('Increment'); fireEvent.click(button); const actions = store.getActions(); expect(actions[0].type).toEqual('INCREMENT_COUNT'); }); }); //MyComponent import React from 'react'; import { useDispatch, useSelector } from 'react-redux'; const MyComponent = () => { const dispatch = useDispatch(); const count = useSelector(state => state.myReducer.count); const handleIncrement = () => { dispatch({ type: 'INCREMENT_COUNT' }); }; return ( <div> <p>Count: {count}</p> <button onClick={handleIncrement}>Increment</button> </div> ); }; export default MyComponent; """ **Example (Zustand):** """javascript // myComponent.test.js import React from 'react'; import { render, screen, fireEvent } from '@testing-library/react'; import useMyStore from './myStore'; // Assuming you have a Zustand store import MyComponent from './MyComponent'; // Jest mock the store. Always do before imports! jest.mock('./myStore', () => { const mockSet = jest.fn(); // This will track calls to set return { __esModule: true, // This is important for ES module mocks! default: () => ({ count: 0, increment: mockSet, // Mock the increment method set: mockSet }), mockSet: mockSet }; }); describe('MyComponent', () => { it('should call increment when button is clicked', () => { render(<MyComponent />); const button = screen.getByText('Increment'); fireEvent.click(button); // Assert that set was called correctly expect(useMyStore().increment).toHaveBeenCalled(); }); it('should display the initial count', () => { render(<MyComponent />); expect(screen.getByText('Count: 0')).toBeInTheDocument(); }); }); // myStore.js import { create } from "zustand"; const useMyStore = create((set) => ({ count: 0, increment: () => set((state) => ({ count: state.count + 1 })), })); export default useMyStore; // MyComponent.js import React from 'react'; import useMyStore from './myStore'; const MyComponent = () => { const { count, increment } = useMyStore(); return ( <div> <p>Count: {count}</p> <button onClick={increment}>Increment</button> </div> ); }; export default MyComponent; """ **Example (Pinia):** """javascript // myComponent.test.js import { render, screen, fireEvent } from '@testing-library/react'; import { useMyStore } from './myStore'; import MyComponent from './MyComponent'; import { setActivePinia, createPinia } from 'pinia'; // Import Pinia functions // Jest mock the store. Always do before imports! 
jest.mock('./myStore', () => { const mockIncrement = jest.fn(); return { __esModule: true, // Important for ES module mocks useMyStore: () => ({ count: 0, increment: mockIncrement, $patch: jest.fn() //mock patch }), mockIncrement: mockIncrement }; }); describe('MyComponent', () => { beforeEach(() => { // setup Pinia for each test setActivePinia(createPinia()) }) it('should call increment when button is clicked', () => { render(<MyComponent />); const button = screen.getByText('Increment'); fireEvent.click(button); // Assert that increment was called on the mock store expect(useMyStore().increment).toHaveBeenCalled(); }); it('should display the initial count', () => { render(<MyComponent />); expect(screen.getByText('Count: 0')).toBeInTheDocument(); }); }); // myStore.js import { defineStore } from 'pinia' export const useMyStore = defineStore('myStore', { state: () => { return { count: 0 } }, actions: { increment() { this.count++ }, }, }) // MyComponent.js import React from 'react'; import { useMyStore } from './myStore'; const MyComponent = () => { const myStore = useMyStore(); const { count, increment } = myStore; return ( <div> <p>Count: {count}</p> <button onClick={increment}>Increment</button> </div> ); }; export default MyComponent; """ **Common Anti-Patterns:** * **Testing Implementation Details:** Testing the internal workings of the state management library rather than the component's behavior is brittle. * **Skipping Mocks:** Not mocking leads to integration tests instead of isolated unit tests, making it hard to pinpoint issues. ### 2.2. Context API **Rule**: When testing components using React's Context API, mock the context provider to control the context value within the test environment. **Do This:** * Create a mock context provider. * Provide the mock provider with a controlled value. * Wrap the component under test with the mock provider. **Don't Do This:** * Rely on the actual context implementation during testing. * Modify the global context directly in tests. **Why:** This ensures that only the necessary value is tested without dependencies on the broader implementation details. **Example:** """javascript // theme-context.js (Simplified context example) import React, { createContext, useState, useContext } from 'react'; const ThemeContext = createContext(); export const ThemeProvider = ({ children }) => { const [theme, setTheme] = useState('light'); const toggleTheme = () => { setTheme(prevTheme => (prevTheme === 'light' ? 'dark' : 'light')); }; return ( <ThemeContext.Provider value={{ theme, toggleTheme }}> {children} </ThemeContext.Provider> ); }; export const useTheme = () => useContext(ThemeContext); //ThemedComponent.js import React from 'react'; import { useTheme } from './theme-context'; const ThemedComponent = () => { const { theme, toggleTheme } = useTheme(); return ( <div style={{ backgroundColor: theme === 'light' ? 'white' : 'black', color: theme === 'light' ? 
'black' : 'white' }}> <p>Current Theme: {theme}</p> <button onClick={toggleTheme}>Toggle Theme</button> </div> ); }; export default ThemedComponent; // ThemedComponent.test.js import React from 'react'; import { render, screen, fireEvent } from '@testing-library/react'; import ThemedComponent from './ThemedComponent'; import { ThemeContext } from './theme-context'; // Mock context provider const MockThemeProvider = ({ theme, toggleTheme, children }) => ( <ThemeContext.Provider value={{ theme, toggleTheme }}> {children} </ThemeContext.Provider> ); describe('ThemedComponent', () => { it('should display the correct theme from context', () => { render( <MockThemeProvider theme="dark" toggleTheme={() => {}}> <ThemedComponent /> </MockThemeProvider> ); expect(screen.getByText('Current Theme: dark')).toBeInTheDocument(); }); it('should call toggleTheme when the button is clicked', () => { const toggleThemeMock = jest.fn(); render( <MockThemeProvider theme="light" toggleTheme={toggleThemeMock}> <ThemedComponent /> </MockThemeProvider> ); fireEvent.click(screen.getByText('Toggle Theme')); expect(toggleThemeMock).toHaveBeenCalledTimes(1); }); }); """ **Common Anti-Patterns:** * **Directly Importing Context:** Avoid importing the actual "ThemeProvider" in test files, which would defeat the purpose of isolation. * **Ignoring Context:** Neglecting to mock the context can lead to tests unknowingly relying on default or global context value. ### 2.3. State Machines (XState, etc.) **Rule**: When your application uses state machines, test the state transitions and side effects of each transition. **Do This:** * Instantiate the state machine with a mock context. * Send events to the state machine. * Assert that the state machine transitions to the expected state. * Assert that any side effects are triggered (through mocked functions). **Don't Do This:** * Test the state machine library itself. Test *your* machine definition. * Ignore testing side effects that depend on the state machine. **Why:** State machines are deterministic; ensuring correctness requires explicitly verifying transitions and side effects. **Example (XState):** """javascript // bookingMachine.js (Simplified XState example) import { createMachine, assign } from 'xstate'; const bookingMachine = createMachine({ id: 'booking', initial: 'idle', context: { passengers: 0, }, states: { idle: { on: { ADD_PASSENGER: { actions: assign({ passengers: (context) => context.passengers + 1 }) } } }, // ... more states } }); export default bookingMachine; // bookingMachine.test.js import bookingMachine from './bookingMachine'; import { Interpreter } from 'xstate'; describe('bookingMachine', () => { it('should increment passengers when ADD_PASSENGER event is sent', () => { const service = new Interpreter(bookingMachine.withContext({ passengers: 0 })); //Provide a context for our machine to use to set our initial passengers value to 0 service.start(); service.send({ type: 'ADD_PASSENGER' }); expect(service.state.context.passengers).toBe(1); service.send({ type: 'ADD_PASSENGER' }); expect(service.state.context.passengers).toBe(2); }); }); """ **Explanation:** 1. **(bookingMachine.js)**: Define the state machine using "createMachine" from XState. This machine manages passengers. 2. **(bookingMachine.test.js)**: * Import the state machine definition. * Create an "Interpreter" instance and start it. * Send events to the state machine using "service.send()". 
* Assert the state changes using "expect(service.state.value).toBe(...)" and context changes with "expect(service.state.context.passengers)" ## 3. Dealing with Global State ### 3.1. Isolating Global State Modifications **Rule**: Modifications to global state should be isolated and reverted after each test to prevent cross-test contamination. **Do This:** * Use "jest.spyOn" and "mockRestore" to restore original implementations. * Store previous values of global variables and restore them in "afterEach" or "afterAll" blocks. **Don't Do This:** * Leave global state modifications without reverting them. * Rely on default browser or Node.js environments to handle global state. **Why:** Modifying globals can create unpredictable behavior that is difficult to trace. **Example:** """javascript // moduleThatModifiesGlobal.js export function modifyGlobal(newValue) { global.myGlobalVariable = newValue; } // moduleThatReadsGlobal.js export function readGlobal() { return global.myGlobalVariable; } // moduleThatModifiesGlobal.test.js import { modifyGlobal } from './moduleThatModifiesGlobal'; import { readGlobal } from './moduleThatReadsGlobal'; describe('Global State Modification', () => { const originalValue = global.myGlobalVariable; // Store original value afterEach(() => { global.myGlobalVariable = originalValue; // Restore value after each test }); it('should modify global state', () => { modifyGlobal('new value'); expect(readGlobal()).toBe('new value'); }); it('should not affect other tests', () => { expect(readGlobal()).toBe(originalValue); //Make sure it is the original value }); }); """ ### 3.2. Mocking Browser APIs **Rule:** When testing code that relies on browser APIs (e.g., "window", "document", "localStorage"), mock these objects to ensure tests are not affected by the actual browser environment. **Do This:** * Use "jest.spyOn" to mock methods and properties of browser APIs. * Create mock implementations for complex objects like "localStorage". * Restore mocks after each test using "mockRestore". **Don't Do This:** * Run tests directly in a browser environment without mocking. * Hardcode browser-specific values in tests without mocking. **Why:** Using mocks will make the tests more predictable regardless of the environment they are running in. **Example:** """javascript // moduleUsingLocalStorage.js export function saveToLocalStorage(key, value) { localStorage.setItem(key, value); } export function getFromLocalStorage(key) { return localStorage.getItem(key); } // moduleUsingLocalStorage.test.js import { saveToLocalStorage, getFromLocalStorage } from './moduleUsingLocalStorage'; describe('LocalStorage', () => { let localStorageMock; beforeEach(() => { localStorageMock = { getItem: jest.fn(), setItem: jest.fn(), clear: jest.fn(), }; global.localStorage = localStorageMock; }); afterEach(() => { global.localStorage = undefined; }); it('should save to localStorage', () => { saveToLocalStorage('myKey', 'myValue'); expect(localStorageMock.setItem).toHaveBeenCalledWith('myKey', 'myValue'); }); it('should get from localStorage', () => { localStorageMock.getItem.mockReturnValue('myValue'); const value = getFromLocalStorage('myKey'); expect(localStorageMock.getItem).toHaveBeenCalledWith('myKey'); expect(value).toBe('myValue'); }); }); """ ## 4. Asynchronous State Updates ### 4.1. Waiting for Updates **Rule:** Use Jest's asynchronous testing tools to properly handle and wait for state updates triggered by asynchronous operations to ensure updates are complete before assertions are made. 
**Do This:** * Use "async/await" with "Promise" based updates * Use "waitFor" or "findBy*" methods from "@testing-library/react" which handle re-renders. * Utilize "jest.advanceTimersByTime" if timers are involved in state transition. **Don't Do This:** * Avoid using "setTimeout" with arbitrary wait times; instead, use methods that wait for specific conditions to be met. * Forget to await asynchronous updates before making assertions, which can lead to false positives or negatives. **Why:** Async operations, such as fetching data from an API, can cause components and their state to update after an initial render. Waiting for these updates ensures your assertions are accurate and reliable. **Example with "async/await":** """javascript // Component triggering asynchronous state update const AsyncComponent = () => { const [data, setData] = React.useState(null); React.useEffect(() => { const fetchData = async () => { const result = await Promise.resolve({ message: "Hello Async" }); setData(result); }; fetchData(); }, []); return <div>{data ? data.message : 'Loading...'}</div>; }; // Test with async/await import { render, screen, waitFor } from '@testing-library/react'; it('should update state asynchronously', async () => { render(<AsyncComponent />); // Wait for the 'Loading...' text to disappear await waitFor(() => screen.getByText('Hello Async')); // Now assert that the updated state is rendered expect(screen.getByText('Hello Async')).toBeInTheDocument(); }); """ ### 4.2. Mocking Asynchronous Functions **Rule:** When testing asynchronous state updates, mock the asynchronous functions (e.g., API calls) to control the outcome and timing of the updates. **Do This:** * Use "jest.spyOn" to mock functions that cause asynchronous updates. * Use "mockResolvedValue" or "mockRejectedValue" to control the promise's resolution or rejection. * Use "async/await" in the test to handle the asynchronous nature of the mock. **Don't Do This:** * Call actual APIs during tests, leading to slow and unreliable tests. * Forget to handle promise rejections, which can cause unhandled promise rejection errors. **Why:** It is important to control the behavior so when assertions are made, you know the system under test matches your expectations. **Example:** """javascript //Component fetching data which affects state const DataFetchingComponent = () => { const [data, setData] = React.useState(null); React.useEffect(() => { const fetchData = async () => { const response = await fetch('/api/data'); const result = await response.json(); setData(result); }; fetchData(); }, []); return <div>{data ? data.message : 'Loading...'}</div>; }; // Mocking an API call it('should fetch and display data', async () => { const mockData = { message: 'Mocked Data' }; //Before render, mock the value being returned. global.fetch = jest.fn(() => Promise.resolve({ json: () => Promise.resolve(mockData), }) ); render(<DataFetchingComponent />); // Wait for the data to load and the component to update await waitFor(() => expect(screen.getByText('Mocked Data')).toBeInTheDocument()); }); """ ## 5. Performance Considerations ### 5.1. Avoiding Excessive Renders **Rule:** Optimize test performance by minimizing the number of renders triggered during tests, especially for complex components. **Do This:** * Use "React.memo" or "useMemo" to prevent unnecessary re-renders. * Write tests to only trigger updates in the specific parts of the component being tested. 
* Use "act" from React Testing Library to batch multiple state updates when necessary **Don't Do This:** * Trigger broad or unnecessary state updates. * Ignore performance warnings or logs related to excessive renders. **Why:** Reducing the number of renders will speed up the running the tests. ### 5.2. Minimize Test Setup **Rule:** Keep test setup minimal to reduce the overall execution time of test suites. **Do This:** * Only mock the state and dependencies required for each test. * Use the same setup and teardown for similar tests * Refactor redundant setups. **Don't Do This:** * Create overly complex or unnecessary setups. * Repeat setup code across multiple tests without refactoring. **Why:** Minimizing the setup means less time consumed before the actual testing. ## 6. Security Considerations ### 6.1. Sensitive Information **Rule:** Avoid storing any sensitive or production-specific data in the jest environment, and specifically avoid committing it to your source control (e.g., using ".gitignore"). **Do This:** * Use placeholders or mock values for sensitive data (API keys, passwords, IDs). * Ensure test data are not related to any actual production data. * Use environment variables or protected config files external to source control where necessary. **Don't Do This:** * Put secrets in plain text in test files. * Commit test files to your source repository that may expose such data. **Why:** Prevents accidental leaking of credentials or production-specific information. This also makes test data consistent and prevents tests failing due to changes in production information. ### 6.2. Input Validation **Rule:** Test for appropriate input sanitization and validation for any component/function that takes user-provided, or external data to prevent injection or cross-site scripting (XSS) issues. **Do This:** * Check behavior with invalid, malicious, and edge-case inputs. * Test for proper encoding or escaping of output to prevent XSS. **Example:** """javascript // Component accepting user input const UserInputComponent = ({ onSubmit }) => { const [userInput, setUserInput] = React.useState(''); const handleSubmit = () => { onSubmit(userInput); // Submit user input }; return ( <div> <input value={userInput} onChange={e => setUserInput(e.target.value)} /> <button onClick={handleSubmit}>Submit</button> </div> ); }; // Test for input sanitization it('should sanitize user input', () => { const onSubmit = jest.fn(); render(<UserInputComponent onSubmit={onSubmit} />); const maliciousInput = '<script>alert("XSS");</script>'; fireEvent.change(screen.getByRole('textbox'), { target: { value: maliciousInput } }); fireEvent.click(screen.getByRole('button')); expect(onSubmit).toHaveBeenCalledWith(expect.not.stringContaining('<script>')); // Ensure input is sanitized }); """ ## Conclusion Following these state management standards for Jest tests ensures robust, reliable, and maintainable test suites. These standards contribute to high-quality software and enable developers to work with confidence. Adhering to these standards and continuously refining them based on project-specific requirements will lead to better testing practices and contribute to the overall success of your team's projects.
# Tooling and Ecosystem Standards for Jest This document outlines the coding standards and best practices specifically related to the tooling and ecosystem surrounding Jest. Following these standards will ensure maintainable, performant, and reliable Jest tests within your projects. ## 1. Recommended Libraries and Tools Choosing the right tools within the Jest ecosystem significantly improves developer experience, test reliability, and code quality. ### 1.1 Testing Libraries * **"@testing-library/react" (or similar for other frameworks):** Prioritize user-centric testing instead of implementation details. This reduces brittle tests and promotes refactoring confidence. * **Why:** Tests focus on how users interact with components, making them less likely to break due to internal code changes. This also encourages better accessibility practices. * **Do This:** Use "screen.getByRole", "screen.getByText", and other query methods from "@testing-library/react" whenever possible. * **Don't Do This:** Access component instances directly or rely on internal state. """javascript // Good: Testing user interaction import { render, screen } from '@testing-library/react'; import userEvent from '@testing-library/user-event'; import MyComponent from './MyComponent'; test('increments counter on button click', async () => { render(<MyComponent />); const button = screen.getByRole('button', { name: 'Increment' }); await userEvent.click(button); expect(screen.getByText('Count: 1')).toBeInTheDocument(); }); // Bad: Testing internal state import { render, screen } from '@testing-library/react'; import MyComponent from './MyComponent'; test('increments counter on button click (BAD)', () => { const { container } = render(<MyComponent />); const button = container.querySelector('button'); // Direct DOM access button.click(); // This test is fragile because it relies on implementation details. // It breaks if the internal state or structure of the component changes. }); """ * **"jest-dom":** Provides custom Jest matchers that make assertions about the DOM more readable and intuitive. * **Why:** Improves test clarity and reduces boilerplate. * **Do This:** Use matchers like "toBeVisible", "toHaveTextContent", "toHaveClass", and "toBeInTheDocument". * **Don't Do This:** Rely on generic Jest matchers (e.g., "toBe(true)") when more specific "jest-dom" matchers are available. """javascript import { render, screen } from '@testing-library/react'; import MyComponent from './MyComponent'; import '@testing-library/jest-dom'; test('element is visible', () => { render(<MyComponent />); const element = screen.getByText('Hello'); expect(element).toBeVisible(); }); """ * **"msw" (Mock Service Worker):** Mock network requests at the network level within the browser. * **Why:** Provides realistic mocking scenarios that are less brittle than mocking individual functions. It enables testing the full request/response lifecycle. * **Do This:** Create a service worker mock that intercepts API calls and returns predefined responses. * **Dont Do This:** Use "jest.fn()" to mock "fetch" or "XMLHttpRequest" if complex API interactions are involved. 
"""javascript // Example using MSW to mock a GET request import { rest } from 'msw' import { setupServer } from 'msw/node' //Or msw/browser if in browser context import { render, screen, waitFor } from '@testing-library/react'; import MyComponentThatFetchesData from './MyComponentThatFetchesData'; const handlers = [ rest.get('/api/data', (req, res, ctx) => { return res( ctx.status(200), ctx.json({ message: 'Mocked data' }) ) }), ] const server = setupServer(...handlers) beforeAll(() => server.listen()) afterEach(() => server.resetHandlers()) afterAll(() => server.close()) test('fetches and displays data', async () => { render(<MyComponentThatFetchesData />); await waitFor(() => screen.getByText('Mocked data')); expect(screen.getByText('Mocked data')).toBeInTheDocument(); }); """ * **"jest-mock-extended":** Provides type-safe mocks for TypeScript projects, avoiding "any" types in mocks * **Why:** Makes mocking easier and safer in TypeScript. * **Do This:** Use the factory functions provided to create mocks. * **Don't Do This:** Cast to "any" to get around type errors when mocking classes or interfaces. """typescript // Example using jest-mock-extended with TypeScript import { mock } from 'jest-mock-extended'; import { MyService } from './MyService'; describe('MyComponent', () => { it('calls the service', () => { const mockService = mock<MyService>(); mockService.getData.mockReturnValue(Promise.resolve('mocked data')); //... perform test expect(mockService.getData).toHaveBeenCalled(); }); }); """ ### 1.2 Assertion Libraries * **"expect" (built-in to Jest):** Use the built-in "expect" for basic assertions. Extend with "jest-dom" matchers where appropriate. * **Why:** "expect" is core to Jest and provides a solid foundation for assertions. * **Do This:** Use "expect.toBe", "expect.toEqual", "expect.toHaveBeenCalled", "expect.toThrow", etc. * **Don't Do This:** Use other assertion libraries unless they provide very specific functionality not covered by "expect". """javascript test('adds 1 + 2 to equal 3', () => { expect(1 + 2).toBe(3); }); test('object assignment', () => { const data = {one: 1}; data['two'] = 2; expect(data).toEqual({one: 1, two: 2}); }); """ ### 1.3 Utilities and Helpers * **"identity-obj-proxy":** A proxy for CSS modules that allows you to import them in tests without actually loading the CSS. * **Why:** Speeds up tests and avoids the need to compile CSS during testing. * **Do This:** Configure Jest to use "identity-obj-proxy" for CSS modules. * **Don't Do This:** Attempt complicated CSS mocking or compilation within your tests unless component styling is what you're explicitly testing. * **"cross-fetch" (or "node-fetch"):** Use a consistent "fetch" API in both browser and Node.js environments when testing code that makes HTTP requests. * **Why:** Ensures tests run consistently regardless of the environment. * **Do This:** Install and import "cross-fetch" or "node-fetch" and use it instead of the native "fetch" in your tests. * **Don't Do This:** Rely on the availability of the native "fetch" API, especially in Node.js environments. ### 1.4 Jest Extensions and Plugins * **Jest Visual Studio Code Extension:** Provides in-editor test running, debugging, and code coverage visualization. Facilitates a faster feedback loop during test development. * **Why:** Increase developer productivity by enabling direct test execution and observation within the IDE. * **Do This:** Install the Jest extension and configure settings to match the project (e.g., Jest path, test patterns). 
* **Don't Do This:** Rely solely on command-line testing, as it's often slower and less convenient. Configure the extension properly to automatically find tests. ## 2. Configuration and Setup Proper Jest configuration is crucial for optimal performance and accurate test results. ### 2.1 "jest.config.js" (or "jest.config.ts") * **"testEnvironment":** Set the appropriate test environment (e.g., "jsdom" for browser-like environments, "node" for Node.js). * **Why:** Ensures that your tests have access to the expected global objects and APIs. * **Do This:** Explicitly set the "testEnvironment" in your "jest.config.js" based on your target environment. * For React applications use "jsdom". """javascript // jest.config.js module.exports = { testEnvironment: 'jsdom', }; """ * **"transform":** Use Babel or other transforms to transpile your code for compatibility with the Node.js environment used by Jest. * **Why:** Allows you to write tests using the latest JavaScript syntax and features. * **Do This:** Configure "transform" to use "babel-jest" or other appropriate transformers for your project. * **Don't Do This:** Skip transpilation, as this can lead to syntax errors and incorrect test results. * **"moduleNameMapper":** Map module aliases to their actual paths, especially when using webpack or other module bundlers. * **Why:** Resolves module import errors during testing. * **Do This:** Define module aliases in "moduleNameMapper" that mirror your project's module resolution configuration. * **Don't Do This:** Neglect to configure module name mapping, which can lead to "module not found" errors during tests. """javascript // jest.config.js module.exports = { moduleNameMapper: { '^@components/(.*)$': '<rootDir>/src/components/$1', '^@utils/(.*)$': '<rootDir>/src/utils/$1', }, }; """ * **"setupFilesAfterEnv":** Use this option to specify setup files that run after the testing environment has been set up. This is the place to import libraries like "@testing-library/jest-dom". * **Why:** Provides a centralized location to configure the testing environment before each test suite runs. * **Do This:** Import necessary libraries and configure global mocks or stubs in "setupFilesAfterEnv". * **Don't Do This:** Perform environment setup directly within your test files, as this can lead to duplication and inconsistencies. """javascript // jest.config.js module.exports = { setupFilesAfterEnv: ['<rootDir>/src/setupTests.js'], }; // src/setupTests.js import '@testing-library/jest-dom/extend-expect'; // Extend expect with jest-dom matchers """ * **"collectCoverageFrom":** Specify the files and directories for which code coverage should be collected. * **Why:** Ensures accurate and comprehensive code coverage reporting. * **Do This:** Include all relevant source files in "collectCoverageFrom", excluding test files and other non-essential files. * **Don't Do This:** Collect coverage from test files or exclude important source files, which can lead to misleading coverage results. """javascript // jest.config.js module.exports = { collectCoverageFrom: [ 'src/**/*.{js,jsx,ts,tsx}', '!src/**/*.d.ts', '!src/index.tsx', '!src/reportWebVitals.ts', '!src/setupTests.ts', ], }; """ ### 2.2 ".babelrc" (or "babel.config.js") * Configure Babel to support the JavaScript syntax used in your codebase. * **Why:** Enables Jest to understand modern JavaScript syntax. * **Do This:** Use presets like "@babel/preset-env" and "@babel/preset-react" to support the latest JavaScript features and React syntax. 
* **Don't Do This:** Use outdated Babel configurations that may not support the latest JavaScript syntax.

### 2.3 Ignoring Files

* **".test.js", ".spec.js", "__tests__":** Ensure that test files use consistent naming and are placed in predictable locations.
* **Why:** Facilitates easy test discovery and organization.
* **Do This:** Adopt a consistent naming convention for test files (e.g., ".test.js", ".spec.js") and place them either next to the source files they test or in a dedicated "__tests__" directory.
* **Don't Do This:** Use inconsistent naming conventions or scatter test files throughout the codebase.

### 2.4 CI/CD Integration

* **Running Tests in CI:** Integrate Jest into your CI/CD pipeline to automatically run tests on every commit or pull request.
* **Why:** Ensures that code changes are tested before being merged into the main branch.
* **Do This:** Configure your CI/CD system to run the "jest" command (or "npm test" if you have a "test" script defined in your "package.json").
* **Don't Do This:** Skip running tests in CI/CD, as this increases the risk of introducing bugs into the main codebase.
* **Code Coverage Reporting:** Integrate code coverage reporting into your CI/CD pipeline to track code coverage metrics over time.
* **Why:** Helps identify areas of the codebase that are not adequately tested.
* **Do This:** Use a code coverage reporting tool (e.g., Coveralls, Codecov) to collect and visualize code coverage metrics.
* **Don't Do This:** Ignore code coverage metrics, as this can lead to a false sense of security about the quality of your tests.

## 3. Mocking Strategies

* **Explicit Mocking:** Use "jest.mock()" to mock modules to control dependencies.
* **Why:** Prevents dependencies from running their real code during tests (such as APIs). Mocks can simulate various outcomes or errors.
* **Do This:** Mock modules using "jest.mock()" and provide mock implementations using "jest.fn()".
* **Don't Do This:** Modify real module behavior in tests, as changes could affect other tests or introduce unintended global side-effects.

"""javascript
// fileToTest.js
export const externalFunction = () => {
  // some code
};

// fileToTest.test.js
// jest.mock() calls are hoisted to the top of the file, so declare the mock at
// module level rather than inside the test body (or use jest.doMock there instead).
jest.mock('./fileToTest', () => ({
  externalFunction: jest.fn(() => 'mockedValue')
}));

const { externalFunction } = require('./fileToTest');

test('should call the mocked module function', () => {
  expect(externalFunction()).toBe('mockedValue');
});
"""

## 4. Performance Optimization

Jest offers several features and techniques for optimizing test performance.

### 4.1 "--watchAll=false" in CI/CD

* Run tests without watching for file changes in CI/CD environments.
* **Why:** Improves test execution time in CI/CD pipelines.
* **Do This:** Use the "--watchAll=false" flag when running Jest in CI/CD.
* **Don't Do This:** Run Jest in watch mode in CI/CD environments, as this is unnecessary and can slow down the pipeline.

### 4.2 "--findRelatedTests"

* Use the "--findRelatedTests" flag to run only the tests that are related to the changed files.
* **Why:** Reduces test execution time by running only the necessary tests.
* **Do This:** Integrate "--findRelatedTests" into your development workflow to quickly run tests related to your current changes.
* **Don't Do This:** Run all tests every time you make a change, potentially leading to longer feedback cycles if not necessary.

### 4.3 "testResultsProcessor"

* Utilize a "testResultsProcessor" to customize test output and reporting.
* **Why:** A results processor lets you post-process extensive test results into a more succinct output when full details are less crucial. It reduces the load on the console, can facilitate generating custom reports, and can improve subsequent test runs by analyzing and caching results.
* **Do This:** Set up a "testResultsProcessor" in "jest.config.js" to process the test results; a minimal sketch of such a processor follows the configuration below.

"""javascript
// jest.config.js
module.exports = {
  testResultsProcessor: './testResultsProcessor.js',
};
"""
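A minimal sketch of the processor module referenced above is shown next; the summary shape and the "test-summary.json" file name are arbitrary choices, and the only hard requirement is that the function returns the results object it receives.

"""javascript
// testResultsProcessor.js -- illustrative results processor.
const fs = require('fs');

module.exports = (results) => {
  const summary = {
    numTotalTests: results.numTotalTests,
    numFailedTests: results.numFailedTests,
    failedSuites: results.testResults
      .filter((suite) => suite.numFailingTests > 0)
      .map((suite) => suite.testFilePath),
  };
  fs.writeFileSync('test-summary.json', JSON.stringify(summary, null, 2));
  return results; // Jest expects the results object back
};
"""

As noted in the deployment standards earlier, keep such processors out of production builds and review them like any other code.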
## 5. Security Considerations

Securing your Jest tests is crucial to prevent vulnerabilities and ensure the integrity of your testing process.

### 5.1 Avoid Exposing Sensitive Information

* **Environment Variables:** Do not hardcode sensitive information (e.g., API keys, passwords) directly into your test code. Use environment variables instead.
* **Why:** Prevents accidental exposure of sensitive information in your codebase.
* **Do This:** Store sensitive information in environment variables and access them using "process.env".
* **Don't Do This:** Commit sensitive information directly into your code repository.
* **Mocking APIs:** When mocking APIs, ensure that you are not inadvertently exposing sensitive data in your mock responses.
* **Why:** Prevents sensitive information from being leaked in test results or logs.
* **Do This:** Carefully review your mock responses to ensure that they do not contain any sensitive data.
* **Don't Do This:** Use real API responses directly as mock responses, as this may expose sensitive data.

### 5.2 Dependency Management

* **Regularly Update Dependencies:** Keep your Jest dependencies up to date to patch security vulnerabilities.
* **Why:** Ensures that you are using the latest versions of Jest and its related libraries, which may include security fixes.
* **Do This:** Use a dependency management tool (e.g., "npm", "yarn") to regularly update your dependencies.
* **Don't Do This:** Use outdated versions of Jest and its related libraries, as this may expose you to known security vulnerabilities.
* Utilize a tool such as "npm audit" to check dependencies for vulnerabilities.
* **Why:** Proactively identifies and addresses security risks.
* **Do This:** Run "npm audit" periodically and implement recommended fixes.

### 5.3 Prevent Test Pollution and Ensure Isolation

* **"resetMocks", "restoreMocks" or "clearMocks":** Use Jest's mock reset helpers ("jest.resetAllMocks()", "jest.restoreAllMocks()", or "jest.clearAllMocks()"), or the equivalent "resetMocks", "restoreMocks", and "clearMocks" config options, to reset mock state between tests. Choose the appropriate one based on your needs: "resetAllMocks()" clears recorded calls and removes any mock implementations, "restoreAllMocks()" restores the original implementations of spied-on methods, and "clearAllMocks()" only clears recorded calls and instances while keeping the implementations.
* **Why:** Prevents test pollution and ensures that each test runs in isolation.
* **Do This:** Call one of these functions in a "beforeEach" or "afterEach" block to reset mock state before or after each test.
* **Don't Do This:** Rely on implicit mock state, as this can lead to flaky tests and incorrect results.
"""javascript describe('MyComponent', () => { beforeEach(() => { jest.clearAllMocks(); // or jest.resetAllMocks() or jest.restoreAllMocks() }); it('calls the mock function', () => { const myFunction = jest.fn(); myFunction(); expect(myFunction).toHaveBeenCalled(); }); it('does not call the mock function (because it was cleared)', () => { const myFunction = jest.fn(); expect(myFunction).not.toHaveBeenCalled(); }); }); """ ## 6. Ecosystem Integrations * **Editor Integration:** Integrate Jest with your editor or IDE using plugins. * **Why:** Provides real-time feedback and improves the developer experience. * **Do This:** Install the official Jest extension for your editor. * **Linting Integration:** Use ESLint with Jest-specific rules. * **Why:** Prevent common testing errors. * **Do This:** Add "eslint-plugin-jest" to your project and configure ESLint to use the recommended rules. ## 7. Documentation and Communication * **Well-Documented Tests:** Write clear and concise comments to explain the purpose of your tests. * **Why:** Makes it easier for other developers (and your future self) to understand and maintain your tests. * **Do This:** Add comments to explain the purpose of each test, especially when the test logic is complex. * **Don't Do This:** Write tests without any comments, as this can make it difficult to understand the test logic. * **Consistent Communication:** Establish clear communication channels for discussing Jest-related issues and best practices within your team. * **Why:** Promotes knowledge sharing and ensures that everyone is following the same coding standards. * **Do This:** Use a dedicated Slack channel, email list, or other communication channel for discussing Jest-related topics. * **Don't Do This:** Keep Jest-related knowledge siloed within individual developers, as this can lead to inconsistencies and inefficiencies. By adhering to these coding standards and best practices, you can ensure that your Jest tests are maintainable, performant, secure, and reliable. This will lead to higher-quality code and a more efficient development process.
# API Integration Standards for Jest This document outlines the coding standards for integrating Jest tests with APIs, covering best practices for maintainability, performance, and reliability. These standards are crucial for ensuring that our API integrations are robust and well-tested. ## 1. General Principles ### Do This * **Isolate API Interactions:** Use mocks and stubs to isolate your components/modules under test from the actual API. This helps avoid test flakiness, dependency on network availability, and unwanted side effects. * **Test API Contracts:** Verify that the data sent to and received from the API adheres to the expected contract (request/response schemas). * **Utilize Environment Variables:** Configure API endpoints, authentication tokens, and other environment-specific settings via environment variables to keep sensitive credentials out of code. * **Provide Meaningful Test Names:** Create descriptive test names that clearly convey what is being tested. For instance, "fetches user data successfully" is better than "test1". * **Focus on Integration Points:** Prioritize testing the integration points between your application logic and the API (e.g., data mapping, error handling). * **Keep Tests Independent:** Write independent tests that don't rely on shared state or execution order. This ensures that tests can be run in any order without unexpected failures. * **Clean Up Resources:** If your tests create resources on the API (e.g., new users), clean them up after the test to avoid polluting the environment. ### Don't Do This * **Directly Call APIs in Tests (Without Mocking):** Avoid making live API calls in your tests unless you are specifically writing end-to-end tests. This introduces external dependencies and potential flakiness. * **Hardcode Sensitive Information:** Never hardcode API keys, passwords, or other sensitive information directly into your test code. * **Ignore API Contract Testing:** Neglecting to test API contracts can lead to integration issues, unexpected errors, and data corruption when the API changes. * **Write Tests with Side Effects:** Tests should not have side effects that impact other tests or the application state. * **Rely on Global State:** Avoid using global variables or shared state in your tests, as this can lead to dependencies and unpredictable behavior. ### Why These Standards Matter * **Maintainability:** Clear, well-structured tests that isolate API interactions are easier to understand, modify, and debug. * **Performance:** By mocking API calls, tests run faster and avoid network latency. * **Reliability:** Isolated tests are less likely to fail due to external factors like network outages or API downtime. * **Security:** Using environment variables prevents sensitive information from being exposed in code. ## 2. Mocking Strategies ### Do This * **Use "jest.mock()" for Module Mocks:** Mock entire modules using "jest.mock()" to simulate API interactions. * **Use "jest.spyOn()" for Specific Function Mocks:** Spy on specific functions using "jest.spyOn()" to observe their behavior and control their return values. * **Use "mockResolvedValue()" and "mockRejectedValue()" for Promises:** Mock promise resolutions and rejections with "mockResolvedValue()" and "mockRejectedValue()" for asynchronous API calls. * **Utilize "axios-mock-adapter" for Axios:** For projects using Axios, "axios-mock-adapter" is a good way to intercept and mock API requests. 
* **Leverage "fetch-mock" for Fetch API:** Use "fetch-mock" to mock calls made via the Fetch API. * **Prefer Asynchronous Mocking:** When mocking asynchronous API calls, use "async/await" to ensure that the mocks are properly resolved before assertions are made. * **Create Mock Data:** Define a suite of sample API responses to test different scenarios. ### Don't Do This * **Over-mocking:** Avoid mocking more than necessary. Focus on mocking only the external API dependencies. * **Ignoring Error Cases:** Neglecting to test error cases (e.g., API timeouts, server errors) can lead to incomplete test coverage. * **Using Generic Mocks:** Avoid using generic, catch-all mocks that don't accurately represent the API behavior. ### Why These Standards Matter * **Isolate Unit Tests:** Mocking external dependencies allows focused testing of individual units of code without external interference. * **Improve Test Speed:** Mocked tests execute faster as they don't depend on network latency. * **Enable Comprehensive Testing:** Mocking allows simulating a wide range of scenarios, including error conditions, corner cases, and unusual API responses. ### Code Examples #### Example 1: Mocking an Axios API Call """javascript // api-client.js import axios from 'axios'; const API_ENDPOINT = process.env.API_ENDPOINT || 'https://example.com/api'; export const fetchUserData = async (userId) => { try { const response = await axios.get("${API_ENDPOINT}/users/${userId}"); return response.data; } catch (error) { throw new Error(error.response?.data?.message || 'Failed to fetch user data'); } }; // user.test.js import { fetchUserData } from './api-client'; import axios from 'axios'; // Import axios here import MockAdapter from 'axios-mock-adapter'; describe('fetchUserData', () => { let mock; beforeEach(() => { mock = new MockAdapter(axios); }); afterEach(() => { mock.restore(); }); it('fetches user data successfully', async () => { const mockUserData = { id: 1, name: 'John Doe' }; mock.onGet("${process.env.API_ENDPOINT || 'https://example.com/api'}/users/1").reply(200, mockUserData); const userData = await fetchUserData(1); expect(userData).toEqual(mockUserData); }); it('throws an error when the API call fails', async () => { mock.onGet("${process.env.API_ENDPOINT || 'https://example.com/api'}/users/1").reply(500, { message: 'Internal Server Error' }); await expect(fetchUserData(1)).rejects.toThrow('Internal Server Error'); }); }); """ #### Example 2: Mocking a Fetch API Call """javascript // data-fetcher.js export const fetchData = async (url) => { const response = await fetch(url); if (!response.ok) { throw new Error("HTTP error! status: ${response.status}"); } return await response.json(); }; // data-fetcher.test.js import { fetchData } from './data-fetcher'; import fetchMock from 'fetch-mock'; describe('fetchData', () => { afterEach(() => { fetchMock.restore(); }); it('fetches data successfully', async () => { const mockData = { id: 1, name: 'Test Data' }; fetchMock.get('https://example.com/data', mockData); const data = await fetchData('https://example.com/data'); expect(data).toEqual(mockData); }); it('throws an error when the fetch fails', async () => { fetchMock.get('https://example.com/data', { status: 500, body: {message: 'Internal Server Error'} }); await expect(fetchData('https://example.com/data')).rejects.toThrow('HTTP error! 
status: 500'); }); }); """ #### Example 3: Mocking a Module with "jest.mock" """javascript // api.js export const getUser = async (id) => { const response = await fetch("https://api.example.com/users/${id}"); return response.json(); }; // component.js import { getUser } from './api'; export const UserComponent = async ({ userId }) => { const user = await getUser(userId); return "<div>${user.name}</div>"; }; // component.test.js import { UserComponent } from './component'; import * as api from './api'; jest.mock('./api', () => ({ getUser: jest.fn() })); describe('UserComponent', () => { it('renders user name', async () => { const mockUser = { id: 1, name: 'John Doe' }; api.getUser.mockResolvedValue(mockUser); const component = await UserComponent({ userId: 1 }); expect(component).toContain('John Doe'); expect(api.getUser).toHaveBeenCalledWith(1); }); }); """ ## 3. API Test Strategies ### Do This * **Contract Testing:** Use tools like Pact or custom validation logic to ensure that your application adheres to the expected API contract. Verify both request and response schemas. * **End-to-End (E2E) Tests:** Include a selection of end-to-end tests using tools like Cypress or Playwright to validate the entire API integration flow in a real environment. * **Error Handling Tests:** Thoroughly test how your application handles API errors, such as timeouts, 500 errors, and invalid responses. * **Data Validation:** Validate both the data sent to the API and the data received from the API. Ensure that the data conforms to the expected format and constraints. Consider using libraries like "joi" or "yup" for schema validation. * **Rate Limiting Tests:** If the API uses rate limiting, simulate scenarios where the rate limit is exceeded and verify that your application handles it gracefully. ### Don't Do This * **Neglecting Error Cases:** Failing to test error-handling scenarios can lead to unexpected failures in production. * **Ignoring Edge Cases:** Overlooking edge cases (e.g., null values, empty strings, large numbers) can lead to vulnerabilities and incorrect behavior. * **Testing Live Data Without Authorization:** Avoid running tests against live production data without proper authorization and access controls. ### Why These Standards Matter * **End-to-End Reliability:** E2E tests assure total system functionality and highlight discrepancies that integration tests may overlook. * **Resilience:** Proper error management prevents application failures and provides a satisfying user experience even in adverse circumstances. * **Data Integrity:** Verifying both input and output data ensures consistency and protects against corrupt data from breaking the system. 
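In addition to the worked examples below, the data-validation guidance above mentions "joi" or "yup"; a minimal hedged sketch with "yup" might look like this (the schema fields are assumptions):

"""javascript
// user-schema.test.js -- illustrative response-shape validation with yup.
import * as yup from 'yup';

const userSchema = yup.object({
  id: yup.number().required(),
  name: yup.string().required(),
});

test('a well-formed response satisfies the schema', async () => {
  const response = { id: 1, name: 'John Doe' }; // stand-in for a mocked API response
  await expect(userSchema.validate(response, { strict: true })).resolves.toEqual(response);
});

test('a malformed response is rejected', async () => {
  const malformed = { id: 'not-a-number' }; // wrong type and missing name
  await expect(userSchema.validate(malformed, { strict: true })).rejects.toThrow();
});
"""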
### Example 1: Contract Testing with a simple validation function """javascript // api-contract.js (Simple example) export const validateUserSchema = (data) => { if (!data || typeof data !== 'object') { return false; } if (typeof data.id !== 'number' || typeof data.name !== 'string') { return false; } return true; // Very basic validation }; // api-consumer.test.js import { fetchUserData } from './api-client'; import { validateUserSchema } from './api-contract'; describe('API Consumer', () => { it('fetches user data and validates the schema', async () => { const mockUserData = { id: 123, name: 'Test User' }; jest.spyOn(global, 'fetch').mockResolvedValue({ ok: true, json: jest.fn().mockResolvedValue(mockUserData) }); const userData = await fetchUserData(123); expect(validateUserSchema(userData)).toBe(true); // Assumes fetched data follows schema }); it('handles API errors and validates the schema failure', async () => { jest.spyOn(global, 'fetch').mockRejectedValue(new Error('API error')); await expect(fetchUserData(123)).rejects.toThrow('API error'); }); }); """ ### Example 2: Testing API Error Handling """javascript // api-client.js (Simulated API client) export const fetchData = async (url) => { const response = await fetch(url); if (!response.ok) { throw new Error("API Error: ${response.status}"); } return response.json(); }; // component-using-api.test.js import { fetchData } from './api-client'; describe('Component using API', () => { it('handles successful API response', async () => { const mockData = { message: 'Success!' }; jest.spyOn(global, 'fetch').mockResolvedValue({ ok: true, json: jest.fn().mockResolvedValue(mockData), }); const data = await fetchData('https://api.example.com'); expect(data).toEqual(mockData); }); it('handles API error response', async () => { jest.spyOn(global, 'fetch').mockResolvedValue({ ok: false, status: 500, }); await expect(fetchData('https://api.example.com')).rejects.toThrow('API Error: 500'); }); it('handles network error', async () => { jest.spyOn(global, 'fetch').mockRejectedValue(new Error('Network Error')); await expect(fetchData('https://api.example.com')).rejects.toThrow('Network Error'); }); }); """ ## 4. Environment Configuration ### Do This * **Use Environment Variables:** Store API endpoints, authentication tokens, and other environment-specific settings in environment variables. * **Load Environment Variables:** Use a library like "dotenv" to load environment variables from a ".env" file during development and testing. * **Set Different Environment Variables for Different Environments:** Configure separate environment variables for development, testing, staging, and production environments. * **Mock Environment Variables:** Use "process.env" within tests or libraries like "mock-env" for more controlled mocking of environment variables during testing. ### Don't Do This * **Hardcoding API Keys:** Never hardcode API keys or other sensitive information directly into your test code or application code. * **Committing ".env" Files:** Avoid committing ".env" files to source control, as they may contain sensitive information. * **Using the Same Environment Variables for All Environments:** Using the same environment variables across all environments can lead to configuration issues and security vulnerabilities. ### Why These Standards Matter * **Security:** Using environment variables will prevent exposing sensitive credentials in code. 
* **Configuration Flexibility:** Modifying application configuration across different environments becomes easier using environment variables.
* **Reproducibility:** Environment variables improve test environment stability and aid repeatability.

### Code Examples

#### Example: Configuring API Endpoint Via Environment Variables

"""javascript
// api-client.js
import axios from 'axios';

const API_ENDPOINT = process.env.API_ENDPOINT || 'https://default.example.com/api'; // Add default endpoint
const API_KEY = process.env.API_KEY; // API key

export const fetchData = async () => {
  try {
    const response = await axios.get("${API_ENDPOINT}/data", {
      headers: { 'X-API-Key': API_KEY }
    });
    return response.data;
  } catch (error) {
    throw new Error(error.response?.data?.message || 'Failed to fetch data');
  }
}

// api-client.test.js
// Because api-client.js reads API_ENDPOINT and API_KEY at import time, the test sets
// process.env first, calls jest.resetModules(), and then requires the modules fresh.
// (A helper such as "mock-env" can wrap this pattern; here process.env is set directly.)
describe('API Client', () => {
  const OLD_ENV = process.env;

  beforeEach(() => {
    jest.resetModules();
    process.env = { ...OLD_ENV, API_ENDPOINT: 'https://test.example.com/api', API_KEY: 'test-api-key' };
  });

  afterEach(() => {
    process.env = OLD_ENV;
    jest.restoreAllMocks();
  });

  it('fetches data from the configured API endpoint', async () => {
    // Require axios and the client after resetModules so they share the same module registry,
    // and mock axios (the client uses axios, not fetch).
    const axios = require('axios');
    jest.spyOn(axios, 'get').mockResolvedValue({ data: { message: 'Test Data' } });

    const { fetchData } = require('./api-client');
    const data = await fetchData();

    expect(axios.get).toHaveBeenCalledWith('https://test.example.com/api/data', {
      headers: { 'X-API-Key': 'test-api-key' }
    });
    expect(data).toEqual({ message: 'Test Data' });
  });
});
"""

This example showcases how to use environment variables for sensitive configuration and how to control them within Jest tests for isolated, repeatable runs. By adhering to these standards, we can build reliable, maintainable, and secure API integrations tested with Jest. These standards guide developers in creating robust, high-quality tests and assist AI coding assistants to follow industry best practices.