# Security Best Practices Standards for Monorepo
This document outlines security best practices for Monorepo development. It serves as a guide for developers to write secure code, protect against common vulnerabilities, and implement secure coding patterns within a Monorepo architecture. Adherence to these standards is crucial for maintaining the integrity, confidentiality, and availability of applications built with Monorepo.
## 1. Input Validation and Sanitization
### Standard
All external inputs MUST be validated and sanitized before processing within any module/package in the Monorepo. This includes inputs from users, databases, APIs, and other sources.
* **Do This:** Use input validation libraries specific to your technology stack (e.g., "validator.js" for Node.js) and define strict validation rules. Sanitize inputs to remove potentially malicious characters or code.
* **Don't Do This:** Trust that input is safe or rely solely on client-side validation.
### Why It Matters
Failing to validate and sanitize inputs can lead to various vulnerabilities, including SQL injection, cross-site scripting (XSS), command injection, and path traversal attacks.
### Code Examples
**Node.js (Express.js) with "validator.js":**
"""javascript
const express = require('express');
const { body, validationResult } = require('express-validator');
const validator = require('validator');
const app = express();
app.use(express.json());
app.post('/user', [
// Validate and sanitize the request body
body('email').isEmail().normalizeEmail(),
body('password').isLength({ min: 8 }).trim().escape(),
body('username').matches(/^[a-zA-Z0-9]+$/).trim().escape(),
], (req, res) => {
const errors = validationResult(req);
if (!errors.isEmpty()) {
return res.status(400).json({ errors: errors.array() });
}
// Input is validated and sanitized, proceed with processing
const { email, password, username } = req.body;
console.log("Creating user with validated data: Email=${email}, Username=${username}");
res.send('User created successfully');
});
const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
console.log("Server is running on port ${PORT}");
});
"""
**Explanation:**
* "express-validator" middleware is used to validate and sanitize request body parameters.
* "isEmail()": Ensures the email is in a valid format.
* "normalizeEmail()": Normalizes the email address (e.g., converts to lowercase).
* "isLength({ min: 8 })": Requires the password to be at least 8 characters long.
* "trim()": Removes whitespace from the beginning and end of the input.
* "escape()": Escapes HTML characters to prevent XSS attacks.
* "matches(/^[a-zA-Z0-9]+$/)": Ensures the username contains only alphanumeric characters.
* Error handling is implemented using "validationResult".
### Anti-Patterns
"""javascript
// Anti-pattern: Directly using user input without validation or sanitization
app.get('/items/:id', (req, res) => {
const itemId = req.params.id; // NO VALIDATION
// Potentially vulnerable query
db.query("SELECT * FROM items WHERE id = ${itemId}", (err, result) => {
if (err) {
console.error(err);
return res.status(500).send('Database error');
}
res.json(result);
});
});
"""
**Explanation:**
The above code is susceptible to SQL injection because "itemId" from the request parameters is directly used in the SQL query without any validation or sanitization. An attacker could potentially inject malicious SQL code into the "itemId" parameter, leading to unauthorized data access or modification.
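The corrected pattern is sketched below. It assumes a MySQL-style driver (e.g., "mysql2") whose "query" method accepts placeholder values and the same "db" client as in the anti-pattern above; adjust the placeholder syntax for your database client.
"""javascript
// Safer version: validate the input and use a parameterized query
app.get('/items/:id', (req, res) => {
  const itemId = Number.parseInt(req.params.id, 10);
  if (Number.isNaN(itemId) || itemId < 1) {
    return res.status(400).send('Invalid item id');
  }
  // The driver substitutes the placeholder safely; no string concatenation
  db.query('SELECT * FROM items WHERE id = ?', [itemId], (err, result) => {
    if (err) {
      console.error(err);
      return res.status(500).send('Database error');
    }
    res.json(result);
  });
});
"""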
## 2. Authentication and Authorization
### Standard
Robust authentication and authorization mechanisms MUST be implemented to protect resources within the Monorepo.
* **Do This:** Use strong password hashing algorithms (e.g., bcrypt), multi-factor authentication (MFA) where possible, and role-based access control (RBAC). Utilize established libraries and frameworks for authentication and authorization.
* **Don't Do This:** Store passwords in plaintext, use weak or outdated hashing algorithms, or rely solely on client-side authorization. Avoid hardcoding credentials.
### Why It Matters
Proper authentication and authorization prevent unauthorized access to sensitive data and functionality.
### Code Examples
**Node.js (Express.js) with JWT and bcrypt:**
"""javascript
const express = require('express');
const bcrypt = require('bcrypt');
const jwt = require('jsonwebtoken');
const app = express();
app.use(express.json());
// In-memory user storage (in a real application, use a database)
const users = [];
// Registration
app.post('/register', async (req, res) => {
try {
const hashedPassword = await bcrypt.hash(req.body.password, 10);
const user = { name: req.body.name, password: hashedPassword };
users.push(user);
res.status(201).send('User registered');
} catch {
res.status(500).send();
}
});
// Login
app.post('/login', async (req, res) => {
const user = users.find(user => user.name === req.body.name);
if (user == null) {
return res.status(400).send('Cannot find user');
}
try {
if (await bcrypt.compare(req.body.password, user.password)) {
// Generate JWT token
const accessToken = jwt.sign({ name: user.name }, 'YOUR_SECRET_KEY', { expiresIn: '15m' }); // REPLACE WITH SECURE, ENVIRONMENT-SPECIFIC KEY
res.json({ accessToken: accessToken });
} else {
res.status(401).send('Not Allowed');
}
} catch {
res.status(500).send();
}
});
// Middleware to authenticate JWT token
function authenticateToken(req, res, next) {
const authHeader = req.headers['authorization'];
const token = authHeader && authHeader.split(' ')[1];
if (token == null) return res.sendStatus(401);
jwt.verify(token, 'YOUR_SECRET_KEY', (err, user) => { // REPLACE WITH SECURE, ENVIRONMENT-SPECIFIC KEY
if (err) return res.sendStatus(403);
req.user = user;
next();
});
}
// Secured route
app.get('/protected', authenticateToken, (req, res) => {
res.json({ message: `Welcome, ${req.user.name}!` });
});
const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
console.log("Server is running on port ${PORT}");
});
"""
**Explanation:**
* **bcrypt:** Used to securely hash passwords before storing them. A salt is automatically generated and included in the hash. A work factor of 10 is used, offering a reasonable balance between security and performance.
* **JWT (JSON Web Tokens):** Used to create access tokens for authenticated users allowing access to protected routes.
* **"authenticateToken" middleware:** Verifies the JWT token and protects sensitive routes.
* **Important:** Replace "'YOUR_SECRET_KEY'" with a strong, randomly generated secret key stored securely in an environment variable.
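The standard above also calls for role-based access control (RBAC). The following is a minimal sketch layered on top of the "authenticateToken" middleware; the "role" claim in the JWT payload and the route shown are illustrative assumptions, not part of the example above.
"""javascript
// Minimal RBAC sketch: assumes the JWT payload carries a "role" claim
function authorizeRoles(...allowedRoles) {
  return (req, res, next) => {
    if (!req.user || !allowedRoles.includes(req.user.role)) {
      return res.sendStatus(403); // Authenticated but not authorized
    }
    next();
  };
}

// Hypothetical route that only admins may reach
app.delete('/users/:id', authenticateToken, authorizeRoles('admin'), (req, res) => {
  res.send(`User ${req.params.id} deleted`);
});
"""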
### Anti-Patterns
"""javascript
// Anti-pattern: Storing passwords in plaintext
app.post('/register', (req, res) => {
const password = req.body.password; // Storing plaintext password
//...
});
// Anti-pattern: Weak authentication mechanism
app.get('/admin', (req, res) => {
const isAdmin = req.query.admin === 'true'; // Insecure authentication
if (isAdmin) {
// Allows access
}
});
"""
Storing passwords in plaintext is a critical security vulnerability, and gating access on a query parameter is trivially bypassed by anyone who edits the URL.
## 3. Secrets Management
### Standard
Secrets (API keys, database passwords, encryption keys, etc.) MUST be stored securely and never hardcoded in the source code of Monorepo packages.
* **Do This:** Use a dedicated secrets management solution (e.g., HashiCorp Vault, AWS Secrets Manager), environment variables, or encrypted configuration files. Apply the principle of least privilege to secret access. Rotate secrets regularly.
* **Don't Do This:** Hardcode secrets directly into the code, check them into version control, or log them.
### Why It Matters
Exposing secrets can lead to unauthorized access to systems, data breaches, and other severe security incidents.
### Code Examples
**Node.js with Environment Variables (using "dotenv"):**
"""javascript
require('dotenv').config(); // Load environment variables from .env file
const express = require('express');
const app = express();
const apiKey = process.env.API_KEY; // Access secret from environment variable
const dbPassword = process.env.DB_PASSWORD;
app.get('/data', (req, res) => {
// Use apiKey and dbPassword securely
console.log("Using API Key: ${apiKey.substring(0,4)}..."); // Logs the first four characters of the API key; better than logging the entire value
// ... your code to access date from db
res.send('Data retrieved');
});
const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
console.log("Server is running on port ${PORT}");
});
"""
**Explanation:**
* ".env" file (which should be added to ".gitignore") stores sensitive configuration data.
* "dotenv" package loads the environment variables from the ".env" file into "process.env".
* Secrets are accessed using "process.env.VARIABLE_NAME".
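For secrets that should not live in environment variables at all, a dedicated secrets manager can be queried at startup. The sketch below uses the AWS SDK v3 Secrets Manager client; the secret name "my-app/db-password" and the region fallback are placeholder assumptions.
"""javascript
// Sketch: fetching a secret from AWS Secrets Manager at startup (AWS SDK v3)
const { SecretsManagerClient, GetSecretValueCommand } = require('@aws-sdk/client-secrets-manager');

async function loadDbPassword() {
  const client = new SecretsManagerClient({ region: process.env.AWS_REGION || 'us-east-1' });
  const response = await client.send(
    new GetSecretValueCommand({ SecretId: 'my-app/db-password' }) // hypothetical secret name
  );
  return response.SecretString; // Never log this value
}

loadDbPassword()
  .then(() => console.log('Database credentials loaded'))
  .catch((err) => {
    console.error('Failed to load secret:', err.message);
    process.exit(1);
  });
"""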
### Anti-Patterns
"""javascript
// Anti-pattern: Hardcoding API key
const apiKey = 'YOUR_API_KEY'; // DO NOT DO THIS
// Anti-pattern: Logging secrets
console.log('Database password:', dbPassword); // DO NOT DO THIS
"""
Hardcoding secrets directly within source code is a critical error and exposes the application to significant risk. Logging Secrets allows them to be captured and exposed.
## 4. Dependency Management
### Standard
Monorepo dependency management MUST be carefully controlled to avoid vulnerabilities introduced through third-party libraries.
* **Do This:** Use a package manager (e.g., npm, yarn, pnpm) with lockfiles to ensure consistent dependency versions. Regularly audit dependencies for known vulnerabilities using tools like "npm audit" or "yarn audit" and address identified issues promptly. Prefer well-maintained and reputable libraries. Use a dependency management tool like Dependabot, Snyk, or GitHub's automated security updates.
* **Don't Do This:** Use outdated or unmaintained libraries, ignore security audit warnings, or blindly update dependencies without testing.
### Why It Matters
Third-party libraries can contain vulnerabilities that can be exploited to compromise the entire application.
### Code Examples
**Using npm audit:**
"""bash
npm audit
npm audit fix # attempts to automatically fix vulnerabilities
"""
**Using yarn audit:**
"""bash
yarn audit
# Note: Yarn classic has no "audit fix" command; upgrade the affected packages
# (e.g. "yarn upgrade <package>") or pin patched versions via "resolutions" in package.json
"""
**Explanation:**
* "npm audit" and "yarn audit" scan the project's dependencies for known vulnerabilities and provide recommendations for remediation.
* "npm audit fix" and "yarn audit fix" attempt to automatically update vulnerable dependencies to patched versions (use with caution - test thoroughly!).
### Anti-Patterns
"""javascript
// Anti-pattern: Using an outdated library with known vulnerabilities
const someOutdatedLibrary = require('some-outdated-library'); // Likely has known security issues
// Anti-pattern: Ignoring audit warnings
// After running 'npm audit' and seeing warnings, ignoring them and proceeding
"""
Ignoring security audits leaves your application exposed to known exploits in outdated or vulnerable libraries. Continuously monitor dependency health.
## 5. Error Handling and Logging
### Standard
Proper error handling and logging MUST be implemented to provide visibility into application behavior while avoiding exposing sensitive information.
* **Do This:** Implement structured logging using a logging library (e.g., Winston, Morgan). Log essential events (e.g., authentication attempts, authorization failures, unexpected errors). Avoid logging sensitive data (e.g., passwords, API keys, personally identifiable information (PII)). Implement centralized logging.
* **Don't Do This:** Log error details directly to users, ignore errors, or use "console.log" for production logging.
### Why It Matters
Poor error handling can expose internal system details to attackers, while inadequate logging hinders incident response and security investigations.
### Code Examples
**Node.js with Winston:**
"""javascript
const express = require('express');
const winston = require('winston');
const app = express();
// Configure Winston logger
const logger = winston.createLogger({
level: 'info',
format: winston.format.json(),
transports: [
new winston.transports.Console(),
new winston.transports.File({ filename: 'error.log', level: 'error' }),
new winston.transports.File({ filename: 'combined.log' }),
],
});
app.get('/error', (req, res) => {
try {
throw new Error('Simulated error');
} catch (error) {
logger.error({ message: 'Error occurred', error: error.message, stack: error.stack });
res.status(500).send('Internal server error'); // Display generic message to user
}
});
app.get('/info', (req, res) => {
logger.info({ message: 'Accessing info endpoint' });
res.send('Info endpoint accessed');
});
const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
logger.info("Server is running on port ${PORT}");
console.log("Server is running on port ${PORT}");
});
"""
**Explanation:**
* Winston is used for structured logging.
* Logs are written to the console and to files.
* Different log levels are used (e.g., "info", "error").
* Error details are logged, but a generic error message is shown to the user. Avoid showing stack traces directly to the client.
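Beyond per-route try/catch blocks, Express supports a centralized error-handling middleware (a handler with four arguments) that logs the full error while returning only a generic message. A minimal sketch, reusing the "logger" configured above:
"""javascript
// Centralized error handler: must be registered after all routes
app.use((err, req, res, next) => {
  logger.error({
    message: 'Unhandled error',
    error: err.message,
    stack: err.stack,
    path: req.path,
  });
  res.status(500).send('Internal server error'); // Generic message only
});
"""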
### Anti-Patterns
"""javascript
// Anti-pattern: Exposing error details to users
app.get('/error', (req, res) => {
try {
throw new Error('Detailed error message');
} catch (error) {
res.status(500).send(error.message); // Exposes internal error message
}
});
// Anti-pattern: Using console.log for production logging
console.log('User logged in'); // Inadequate for production, lacks context and control
"""
It is bad practice to expose detailed error messages or internal system details to end users, as doing so may allow threat actors to infer or discover further exploitable information.
## 6. Data Encryption
### Standard
Sensitive data MUST be encrypted both in transit and at rest.
* **Do This:** Use HTTPS for all network communication. Use encryption libraries (e.g., OpenSSL) to encrypt data stored in databases or files. Consider field-level encryption for highly sensitive data. Use a key management service to manage encryption keys.
* **Don't Do This:** Store sensitive data in plaintext, use outdated encryption algorithms, or hardcode encryption keys.
### Why It Matters
Encryption protects data from unauthorized access, even if a system is compromised.
### Code Examples
**Node.js with HTTPS and "crypto" library:**
"""javascript
const express = require('express');
const https = require('https');
const fs = require('fs');
const crypto = require('crypto'); // Import the crypto module
const app = express();
// Generate a secure encryption key (in production, load it from a key management service)
const encryptionKey = crypto.randomBytes(32);
function encrypt(text) {
const iv = crypto.randomBytes(16); // Generate a fresh IV for every encryption
const cipher = crypto.createCipheriv('aes-256-cbc', encryptionKey, iv);
let encrypted = cipher.update(text);
encrypted = Buffer.concat([encrypted, cipher.final()]);
return { iv: iv.toString('hex'), encryptedData: encrypted.toString('hex') };
}
function decrypt(text, iv) {
let iv_buf = Buffer.from(iv, 'hex');
let encryptedText = Buffer.from(text, 'hex');
const decipher = crypto.createDecipheriv('aes-256-cbc', Buffer.from(encryptionKey), iv_buf);
let decrypted = decipher.update(encryptedText);
decrypted = Buffer.concat([decrypted, decipher.final()]);
return decrypted.toString();
}
app.get('/encrypt/:data', (req, res) => {
const data = req.params.data;
const encryptedData = encrypt(data);
res.json(encryptedData);
});
app.get('/decrypt/:encryptedData/:iv', (req, res) => {
const encryptedData = req.params.encryptedData;
const iv = req.params.iv;
const decryptedData = decrypt(encryptedData, iv);
res.send(decryptedData);
});
// HTTPS configuration
const privateKey = fs.readFileSync('sslcert/key.pem', 'utf8');
const certificate = fs.readFileSync('sslcert/cert.pem', 'utf8');
const credentials = {key: privateKey, cert: certificate};
const httpsServer = https.createServer(credentials, app);
const PORT = process.env.PORT || 443;
httpsServer.listen(PORT, () => {
console.log("HTTPS server listening on port ${PORT}");
});
"""
**Explanation:**
* "crypto" module is used for encryption.
* AES-256-CBC algorithm is used.
* The encryption key is generated randomly (in production, manage it with a key management service), and a fresh IV is generated for each encryption.
* HTTPS is used to secure communication.
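For new code, an authenticated mode such as AES-256-GCM is generally preferable to CBC because it also detects tampering. A hedged sketch using the same "crypto" module (key handling remains illustrative):
"""javascript
// Sketch: authenticated encryption with AES-256-GCM (crypto is already required above)
function encryptGcm(text, key) {
  const iv = crypto.randomBytes(12); // 96-bit IV is the recommended size for GCM
  const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  const encrypted = Buffer.concat([cipher.update(text, 'utf8'), cipher.final()]);
  return {
    iv: iv.toString('hex'),
    authTag: cipher.getAuthTag().toString('hex'),
    encryptedData: encrypted.toString('hex'),
  };
}

function decryptGcm({ iv, authTag, encryptedData }, key) {
  const decipher = crypto.createDecipheriv('aes-256-gcm', key, Buffer.from(iv, 'hex'));
  decipher.setAuthTag(Buffer.from(authTag, 'hex'));
  const decrypted = Buffer.concat([
    decipher.update(Buffer.from(encryptedData, 'hex')),
    decipher.final(), // Throws if the ciphertext or auth tag was tampered with
  ]);
  return decrypted.toString('utf8');
}
"""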
### Anti-Patterns
"""javascript
// Anti-pattern: Storing sensitive data in plaintext
// Anti-pattern: Using hardcoded encryption key
const encryptionKey = 'HardcodedKey'; // DO NOT DO THIS
"""
Storing sensitive data without encryption or hardcoding the encryption key presents a considerable vulnerability, as it makes the data accessible to unauthorized parties upon system compromise.
## 7. Cross-Site Scripting (XSS) Prevention
### Standard
Prevent XSS vulnerabilities by properly escaping output and using appropriate security headers.
* **Do This:** Use template engines with automatic escaping (e.g., Handlebars, Mustache, or JSX with React), sanitize user input before displaying it, and set the "Content-Security-Policy" (CSP) header to restrict the sources from which resources can be loaded. The legacy "X-XSS-Protection" header can still be set for older browsers, but modern browsers have removed this filter, so treat CSP and output escaping as the primary defenses.
* **Don't Do This:** Directly insert user input into HTML without escaping or sanitization.
### Why It Matters
XSS allows attackers to inject malicious scripts into web pages viewed by other users.
### Code Examples
**Node.js (Express.js) with escaping and CSP header:**
"""javascript
const express = require('express');
const hbs = require('hbs'); // Using Handlebars for templating with automatic escaping
const app = express();
app.set('view engine', 'hbs'); // set up handlebars view engine
app.use(express.urlencoded({ extended: true }));
app.use((req, res, next) => {
res.setHeader("Content-Security-Policy", "default-src 'self'; script-src 'self' 'unsafe-inline'; style-src 'self' 'unsafe-inline'");
res.setHeader("X-XSS-Protection", "1; mode=block");
next();
});
app.get('/', (req, res) => {
res.render('index', { title: 'XSS Example', userInput: '' });
});
app.post('/submit', (req, res) => {
const userInput = req.body.userInput;
res.render('index', { title: 'XSS Example', userInput: userInput });
});
const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
console.log("Server is running on port ${PORT}");
});
"""
**index.hbs:**
"""html
<h1>{{title}}</h1>
<form action="/submit" method="POST">
<label for="userInput">Enter some text:</label>
<input type="text" id="userInput" name="userInput">
<button type="submit">Submit</button>
</form>
<p>You entered: {{userInput}}</p>
"""
**Explanation:**
* Handlebars template engine is used, which automatically escapes output to prevent XSS.
* CSP header is set to restrict the sources of content.
* "X-XSS-Protection" header is set to enable the browser's XSS filter in blocking mode.
### Anti-Patterns
"""javascript
// Anti-pattern: Directly inserting user input without escaping
app.get('/display', (req, res) => {
const userInput = req.query.input;
res.send("${userInput}"); // Vulnerable to XSS
});
"""
Directly embedding user input onto a web page without proper encoding creates a serious security risk. The lack of proper escaping allows malicious scripts contained within the user input to execute within the user's browser.
## 8. Cross-Site Request Forgery (CSRF) Prevention
### Standard
Protect against CSRF attacks by using anti-CSRF tokens.
* **Do This:** Generate and validate CSRF tokens for all state-changing requests (e.g., POST, PUT, DELETE). Use a library or framework that provides built-in CSRF protection (e.g., the "csurf" middleware in Express.js; note that "csurf" has been deprecated upstream, so prefer a maintained alternative such as "csrf-csrf" for new code). Implement the "SameSite" cookie attribute: set "SameSite" to either "Strict" or "Lax" for session cookies.
* **Don't Do This:** Rely solely on "GET" requests for state-changing operations or disable CSRF protection.
### Why It Matters
CSRF allows attackers to perform actions on behalf of legitimate users without their knowledge.
### Code Examples
**Node.js (Express.js) with "csurf":**
"""javascript
const express = require('express');
const cookieParser = require('cookie-parser');
const csrf = require('csurf');
const app = express();
// Middleware
app.use(cookieParser());
const csrfProtection = csrf({ cookie: true });
app.use(express.urlencoded({ extended: false }));
app.get('/form', csrfProtection, (req, res) => {
// Render a form that includes the CSRF token as a hidden field
res.send(`
<form action="/process" method="POST">
<input type="hidden" name="_csrf" value="${req.csrfToken()}">
<button type="submit">Submit</button>
</form>
`);
});
app.post('/process', csrfProtection, (req, res) => {
res.send('Form is processed!');
});
const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
console.log("Server is running on port ${PORT}");
});
"""
**Explanation:**
* "csurf" middleware is used to generate and validate CSRF tokens.
* A hidden input field contains the CSRF token, which is submitted with the form.
* The server verifies the CSRF token before processing the request.
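The "SameSite" attribute mentioned in the standard is set on the cookie itself. A sketch using Express's "res.cookie" (the route, cookie name, and value are illustrative assumptions):
"""javascript
// Sketch: issuing a session cookie with CSRF-hardening attributes
app.get('/login-success', (req, res) => {
  res.cookie('sessionId', 'opaque-session-token', { // hypothetical cookie name/value
    httpOnly: true,          // Not readable from JavaScript
    secure: true,            // Only sent over HTTPS
    sameSite: 'strict',      // Not sent on cross-site requests
    maxAge: 15 * 60 * 1000,  // 15 minutes
  });
  res.send('Logged in');
});
"""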
### Anti-Patterns
"""javascript
// Anti-pattern: Disabling CSRF protection
app.post('/transfer', (req, res) => {
// No CSRF protection
//...
});
"""
Code that lacks proper CSRF protection is vulnerable to malicious requests made on behalf of an authenticated user without their consent.
## 9. Denial of Service (DoS) Prevention
### Standard
Implement measures to mitigate DoS attacks.
* **Do This:** Limit request rates, implement timeouts, use rate limiting middleware (e.g., "express-rate-limit"), use a content delivery network (CDN), and protect against Slowloris attacks.
* **Don't Do This:** Allow unlimited requests or ignore potential DoS vulnerabilities.
### Why It Matters
DoS attacks can disrupt service availability and prevent legitimate users from accessing the application.
### Code Examples
**Node.js (Express.js) with "express-rate-limit":**
"""javascript
const express = require('express');
const rateLimit = require('express-rate-limit');
const app = express();
const limiter = rateLimit({
windowMs: 15 * 60 * 1000, // 15 minutes
max: 100, // Limit each IP to 100 requests per windowMs
message: 'Too many requests from this IP, please try again after 15 minutes',
standardHeaders: true, // Return rate limit info in the "RateLimit-*" headers
legacyHeaders: false, // Disable the "X-RateLimit-*" headers
});
app.use(limiter);
app.get('/', (req, res) => {
res.send('Welcome to rate limited app!');
});
const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
console.log("Server is running on port ${PORT}");
});
"""
**Explanation:**
* "express-rate-limit" middleware is used to limit the number of requests from each IP address.
* The rate limit is configured to allow 100 requests per 15 minutes.
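Rate limiting can be combined with server-level timeouts, which help against slow-request attacks such as Slowloris. A sketch using the Node.js HTTP server returned by "app.listen" (the timeout values are illustrative):
"""javascript
// Sketch: tightening timeouts on the underlying Node.js HTTP server
const server = app.listen(PORT, () => {
  console.log(`Server is running on port ${PORT}`);
});
server.headersTimeout = 10_000;   // Max time allowed to receive the request headers
server.requestTimeout = 30_000;   // Max time allowed to receive the entire request
server.keepAliveTimeout = 5_000;  // Close idle keep-alive connections quickly
"""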
### Anti-Patterns
"""javascript
// Anti-pattern: Allowing unlimited requests
app.get('/unprotected', (req, res) => {
// No rate limiting
res.send('Unprotected route');
});
"""
Allowing unlimited requests exposes the service to high traffic volume and increases the vulnerability to denial-of-service attacks.
## 10. Server-Side Request Forgery (SSRF) Prevention
### Standard
Prevent SSRF vulnerabilities by validating and sanitizing outbound requests.
* **Do This:** Whitelist allowed domains or IP addresses, validate URLs, disable URL redirection, and use secure protocols (HTTPS) for outbound requests. Avoid using user-supplied data directly in outbound requests.
* **Don't Do This:** Allow unrestricted outbound requests or trust user-supplied URLs.
### Why It Matters
SSRF allows attackers to make requests to internal resources or arbitrary external endpoints from the server, bypassing security controls.
### Code Examples
**Node.js with URL validation:**
"""javascript
const express = require('express');
const { URL } = require('url');
const https = require('https'); // Use HTTPS for outbound requests
const app = express();
app.use(express.json());
const allowedHosts = ['api.example.com', 'secure.example.org']; // Whitelist
app.post('/proxy', async (req, res) => {
try {
const targetUrl = req.body.url;
// URL validation
const url = new URL(targetUrl);
if (!allowedHosts.includes(url.hostname)) {
return res.status(400).send('Invalid target URL');
}
// Make outbound request (HTTPS)
https.get(targetUrl, (response) => {
let data = '';
response.on('data', (chunk) => {
data += chunk;
});
response.on('end', () => {
res.send(data);
});
}).on('error', (err) => {
console.error('Error making request:', err);
res.status(500).send('Error making outbound request');
});
} catch (error) {
console.error('Invalid URL:', error);
res.status(400).send('Invalid URL');
}
});
const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
console.log("Server is running on port ${PORT}");
});
"""
**Explanation:**
* The code validates the target URL against a whitelist of allowed hosts.
* It uses "https.get" to ensure secure communication.
### Anti-Patterns
"""javascript
// Anti-pattern: Using user-supplied URL directly
app.get('/proxy', (req, res) => {
const targetUrl = req.query.url; // User-supplied URL
//...
https.get(targetUrl, (response) => {
response.pipe(res); // Vulnerable to SSRF
});
});
"""
Directly using user-provided URLs in outbound requests, without validating the target host, makes the service vulnerable to Server-Side Request Forgery. Limit outbound requests to validated, whitelisted URLs to prevent this vulnerability.
By adhering to these security best practices, Monorepo developers can build more secure and reliable applications. These standards should be reviewed and updated regularly to stay ahead of emerging threats and vulnerabilities. Remember to implement security as a continuous process, not a one-time fix.
# Core Architecture Standards for Monorepo This document outlines the core architectural standards for Monorepo projects. It focuses on fundamental patterns, project structure, and organization principles specifically within the Monorepo context. Adhering to these standards ensures maintainability, scalability, and a consistent development experience across all projects within the repository. ## 1. Fundamental Architectural Patterns Monorepos often benefit from a modular architecture. This allows for independent development, testing, and deployment of different parts of the system. The choice of architectural pattern depends on the specific needs of the project, but we encourage: * **Modular Monolith:** A single deployable unit composed of loosely coupled modules. This is a good starting point for many projects as it offers simplicity while still promoting modularity. * **Microservices within a Monorepo:** Smaller, independently deployable services residing within the same repository. This allows for independent scaling and development cycles but introduces more complexity in terms of deployment and inter-service communication. * **Layered Architecture:** A common and effective approach for organizing code into distinct layers (e.g., presentation, business logic, data access). This promotes separation of concerns and makes the codebase easier to understand and maintain. **Do This:** * Choose an architectural pattern that aligns with the project's complexity and scalability requirements. * Clearly define module boundaries and dependencies. * Strive for loose coupling between modules / services. **Don't Do This:** * Create a tightly coupled monolith without clear modules. This makes the codebase difficult to reason about and maintain. * Implement microservices prematurely without considering the added complexity. * Ignore architectural principles, especially if the project grows. **Why:** Choosing an architecture at the beginning of the project is important to prevent future refactoring. Selecting one appropriate for your project early will allow for easier scaling in the future and easier team collaboration. ## 2. Monorepo Project Structure and Organization A well-defined project structure is critical for navigating and managing Monorepo projects. We recommend the following structure: """ monorepo-root/ ├── apps/ # User facing applications │ ├── web-app/ │ │ ├── src/ │ │ ├── package.json │ │ └── tsconfig.json │ ├── mobile-app/ │ │ ├── src/ │ │ ├── package.json │ │ └── tsconfig.json ├── packages/ # Reusable libraries and components │ ├── ui-library/ │ │ ├── src/ │ │ ├── package.json │ │ └── tsconfig.json │ ├── utils/ │ │ ├── src/ │ │ ├── package.json │ │ └── tsconfig.json ├── tools/ # Build scripts, code generators, and other utilities │ ├── build/ │ ├── codegen/ ├── docs/ # Documentation for the monorepo and its projects ├── .eslintrc.js # Root ESLint configuration ├── .prettierrc.js # Root Prettier configuration ├── tsconfig.base.json # Base TypeScript configuration └── package.json # Root package.json (for tooling and scripts) """ * **"apps/"**: Contains user-facing applications (e.g., web apps, mobile apps, CLI tools). * **"packages/"**: Contains reusable libraries and components that can be shared across multiple applications. * **"tools/"**: Contains build scripts, code generators, and other utilities for the Monorepo. * **"docs/"**: Holds documentation for the monorepo itself and for individual packages/applications. Consider tools like Docusaurus or Storybook for document generation. 
* **Root Configuration Files:** Centralized configuration for linting, formatting, and TypeScript. **Do This:** * Organize code into clear and well-defined packages. * Use a consistent naming convention for packages and applications. * Keep shared libraries in the "packages/" directory. * Utilize shared configuration files at the root level. **Don't Do This:** * Scatter code across the repository without a clear structure. * Create overly large packages that are difficult to maintain. * Duplicate configuration files across multiple packages. **Why:** A clear and consistent structure is essential for navigation and maintainability, especially as the Monorepo grows in size and complexity. Using a standardized structure across multiple projects also facilitates onboarding new developers as they will quickly understand where to find code. ## 3. Dependency Management Managing dependencies within a Monorepo can be challenging. We recommend using a tool like "pnpm", "Yarn", or "npm" workspaces to simplify dependency management and avoid duplication. PNPM is often favored due to its efficient disk space usage and speedier installations. **Do This:** * Use a workspace-aware package manager (e.g., "pnpm", "Yarn", "npm"). * Declare dependencies explicitly in each package's "package.json" file. * Use version ranges that allow for minor and patch updates, but pin major versions to avoid breaking changes. * Leverage tools like "Dependabot" or "Renovate" to automate dependency updates. **Don't Do This:** * Rely on implicit dependencies between packages. * Install dependencies globally. * Use wildcard version ranges (e.g., "*"). **Example ("packages/ui-library/package.json"):** """json { "name": "@my-monorepo/ui-library", "version": "1.0.0", "dependencies": { "react": "^18.2.0", "@emotion/react": "^11.11.1", "@emotion/styled": "^11.11.0" }, "devDependencies": { "@types/react": "^18.2.15" }, "peerDependencies": { "next": ">=13.0.0" } } """ **Why:** Proper dependency management prevents version conflicts, improves build times, and reduces the overall size of the Monorepo. "peerDependencies" are critical to declare and ensure that components built are compatible with different versions of the host application. ## 4. Code Sharing and Reusability One of the key benefits of a Monorepo is the ability to easily share code between different projects. **Do This:** * Create reusable libraries and components in the "packages/" directory. * Use a consistent API design for shared libraries. * Write thorough documentation for shared components. * Utilize tools such as Bit (bit.dev) or Nx to manage and share components. **Don't Do This:** * Duplicate code across multiple projects. * Create overly specific components that are difficult to reuse. * Neglect documentation for shared libraries. **Example ("packages/utils/src/index.ts"):** """typescript export function formatDate(date: Date): string { return new Intl.DateTimeFormat('en-US').format(date); } export function isValidEmail(email: string): boolean { const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/; return emailRegex.test(email); } """ """typescript // Usage in "apps/web-app/src/components/UserComponent.tsx" import { formatDate, isValidEmail } from '@my-monorepo/utils'; function UserComponent({ user }: { user: any }) { const formattedDate = formatDate(new Date(user.createdAt)); const isValid = isValidEmail(user.email); return ( <div> <p>Created At: {formattedDate}</p> <p>Email Valid: {isValid ? 
'Yes' : 'No'}</p> </div> ); } export default UserComponent; """ **Why:** Promotes code reuse and reduce redundant code, keeps applications lightweight, and reduces the likelihood of bugs. ## 5. Tooling and Automation Monorepos often require specialized tooling and automation to manage their complexity. Consider the following: * **Build Systems:** Tools like Nx, Turborepo, or Bazel can help you optimize build times by only rebuilding affected packages. * **Linting and Formatting:** Use ESLint and Prettier to enforce consistent code style across the Monorepo. * **Code Generation:** Use code generators to automate repetitive tasks and reduce boilerplate code. * **Testing:** Integrate testing frameworks (e.g., Jest, Mocha, Cypress) to ensure the quality of your code. * **CI/CD:** Implement a robust CI/CD pipeline to automate builds, tests, and deployments, potentially using tools like Github Actions, CircleCI or Jenkins * **Dependency Graph Visualization:** Tools such as Madge or dep-graph are useful in visualizing dependencies between packages. **Do This:** * Choose a build system that supports incremental builds and dependency analysis. * Automate linting and formatting to enforce code style. * Use code generators to reduce boilerplate. * Implement comprehensive testing. * Automate builds, tests, and deployments with CI/CD. * Include scripts to run tests and linters for commits. **Don't Do This:** * Manually run builds and tests. * Ignore linting and formatting errors. * Skip testing. * Neglect CI/CD. **Example ("tools/codegen/generate-component.js" - Simplified Example):** """javascript const fs = require('fs'); function generateComponent(name) { const componentCode = " import React from 'react'; interface ${name}Props { // Add props here } const ${name}: React.FC<${name}Props> = ({/* Props Go Here */}) => { return ( <div> {/* Component Content Here */} </div> ); }; export default ${name}; "; fs.writeFileSync("./packages/ui-library/src/components/${name}.tsx", componentCode); console.log("Component ${name} generated successfully!"); } const componentName = process.argv[2]; if (!componentName) { console.error('Please provide a component name.'); process.exit(1); } generateComponent(componentName); """ **Why:** Automates development tasks, speeds up the build process, and improve the overall quality of code. Build scripts should be repeatable. ## 6. Communication and Collaboration Effective communication and collaboration are crucial for successful Monorepo development, especially with large teams. **Do This:** * Establish clear communication channels (e.g., Slack, Discord). * Use code reviews to ensure code quality and knowledge sharing. * Document architectural decisions and coding standards. * Conduct regular team meetings to discuss progress and challenges. * Use Architecture Decision Records (ADRs) to keep a high-level log of important decisions that affect the project architecture and direction for future maintainers. **Don't Do This:** * Work in isolation without communicating with other team members. * Skip code reviews. * Neglect documentation. * Keep crucial design or engineering decisions locked inside a single person's mind. **Why:** Promotes knowledge sharing, reduces the risk of errors, and ensures that everyone is aligned on the project's goals and direction. ## 7. Versioning and Release Management Managing versions and releases within a Monorepo requires careful planning. 
We recommend using a tool like [Lerna](https://github.com/lerna/lerna) or [Changesets](https://github.com/changesets/changesets) to automate the release process. Changesets is generally preferred for its simplicity and ease of integration with CI/CD. **Do This:** * Use semantic versioning (SemVer) for all packages. * Automate release management with tools like Changesets or Lerna. * Generate changelogs automatically for each release. * Use conventional commits to automate version bumping and changelog generation. * Consider using git tags to mark releases. **Don't Do This:** * Manually manage versions and releases. * Forget to update changelogs. * Use inconsistent versioning schemes. **Example (Using Changesets):** 1. **Create a Changeset:** "pnpm changeset" 2. **Describe the changes:** (e.g., "Fixed a bug in the formatDate function") 3. **Commit the Changeset:** (e.g., ".changeset/fix-format-date.md") 4. **Run release:** In your CI/CD pipeline, run "pnpm changeset version" and "pnpm publish" **Why:** Simplifies the release process, reduces the risk of errors, and provides clear visibility into the changes included in each release. ## 8. Security Best Practices Security should be a primary concern in any Monorepo project. **Do This:** * Regularly scan dependencies for vulnerabilities using tools like "npm audit" or "snyk". * Implement secure coding practices (e.g., input validation, output encoding). * Use a static analysis tool such as SonarQube or Semgrep to check for code vulnerabilities. * Store secrets securely using environment variables or a dedicated secret management solution (e.g., HashiCorp Vault). * Follow the principle of least privilege when granting access to resources. * Have a clear incident response plan in place. **Don't Do This:** * Ignore security vulnerabilities. * Store secrets in code. * Grant unnecessary privileges. * Fail to monitor for security incidents. **Why:** Protects the Monorepo and its users from security threats. A vulnerability in a shared library might impact all applications in the monorepo. ## 9. Performance Optimization Monorepos can become slow if not optimized. Consider these approaches: * **Code Splitting:** Break up large applications into smaller chunks that can be loaded on demand. * **Tree Shaking:** Remove unused code from dependencies to reduce bundle sizes. * **Caching:** Implement caching strategies to avoid unnecessary computations. * **Lazy Loading:** Load components or modules only when they are needed. * **Optimize build times:** Use incremental builds and parallel execution. * **Profile application regularly:** Identify and resolve any bottlenecks. **Do This:** * Use code splitting to reduce initial load times. * Enable tree shaking to remove unused code. * Implement caching strategies. * Use lazy loading for non-critical components. * Profile and optimize performance regularly. **Don't Do This:** * Load the entire application at once. * Include unnecessary dependencies. * Ignore performance bottlenecks. **Why:** Improves the user experience and reduces resource consumption. A slow initial startup time can impact overall user satisfaction. ## 10. Documentation Standards Comprehensive documentation is essential for understanding and maintaining the Monorepo and its components. **Do This:** * Document all public APIs. * Provide clear examples of how to use shared libraries and components. * Use a consistent documentation style. * Automate documentation generation whenever possible. * Keep documentation up-to-date. 
* Place documentation close to the code it describes (e.g., using JSDoc comments or Markdown files in the same directory). **Don't Do This:** * Neglect documentation. * Write unclear or incomplete documentation. * Let documentation become outdated. **Example (Using JSDoc):** """typescript /** * Formats a date object into a human-readable string. * * @param {Date} date - The date object to format. * @returns {string} The formatted date string. */ export function formatDate(date: Date): string { return new Intl.DateTimeFormat('en-US').format(date); } """ **Why:** Makes it easier for developers to understand and use the Monorepo's components and keeps new engineers informed and productive. Well-documented architectural decisions are critical to prevent misunderstandings.
# Component Design Standards for Monorepo This document outlines the coding standards for component design within a Monorepo architecture. It focuses on creating reusable, maintainable, and performant components, tailored for the intricacies of a monorepo environment. These guidelines are designed to be used by both developers and AI coding assistants to ensure code consistency and quality. ## 1. Component Modularity and Reusability ### 1.1. Standard: Encapsulation and Abstraction **Standard:** Components should be encapsulated with a well-defined public API and hidden internal implementation details. Use abstraction to provide a simplified interface to complex functionalities. **Why:** This promotes reusability, reduces dependencies, and allows for internal changes without impacting dependent components. In a Monorepo, breaking changes can have widespread effects, making encapsulation crucial. **Do This:** * Define clear interfaces using TypeScript/JavaScript. * Use private/protected members to hide implementation details. * Favor composition over inheritance to promote flexibility. **Don't Do This:** * Expose internal state or logic directly. * Create overly complex inheritance hierarchies. * Create components tightly coupled to specific application contexts. **Code Example (TypeScript):** """typescript // packages/ui-library/src/components/Button/Button.tsx import React, { ReactNode } from 'react'; import styles from './Button.module.css'; interface ButtonProps { children: ReactNode; onClick: () => void; variant?: 'primary' | 'secondary'; } const Button: React.FC<ButtonProps> = ({ children, onClick, variant = 'primary' }) => { const buttonClass = variant === 'primary' ? styles.primaryButton : styles.secondaryButton; return ( <button className={"${styles.button} ${buttonClass}"} onClick={onClick}> {children} </button> ); }; export default Button; //Button.module.css (CSS Modules) .button { padding: 10px 20px; border: none; border-radius: 5px; cursor: pointer; font-size: 16px; } .primaryButton { background-color: #007bff; color: white; } .secondaryButton { background-color: #6c757d; color: white; } """ **Anti-Patterns:** * **God Components:** Components that implement too much logic or have too many responsibilities. These are hard to reuse and maintain. ### 1.2. Standard: Single Responsibility Principle (SRP) **Standard:** Each component should have one, and only one, reason to change. **Why:** Components with a single responsibility are easier to understand, test, and reuse. Changing one aspect of the component doesn't necessarily break other parts of the system. Reduces the blast radius of changes within the Monorepo. **Do This:** * Decompose complex components into smaller, more focused components. * Use composition to combine these smaller components. **Don't Do This:** * Add unrelated functionality to an existing component. * Create monolithic components that handle multiple different tasks. 
**Code Example (JavaScript/React):** """javascript // packages/ui-library/src/components/Input/Input.jsx import React from 'react'; import styles from './Input.module.css'; //Example using CSS Modules interface InputProps { label: string; value: string; onChange: (event: React.ChangeEvent<HTMLInputElement>) => void; type?: string; errorMessage?: string; } const Input: React.FC<InputProps> = ({ label, value, onChange, type = "text", errorMessage }) => { return ( <div className={styles.inputContainer}> <label htmlFor={label} className={styles.inputLabel}>{label}</label> <input type={type} id={label} value={value} onChange={onChange} className={styles.inputField} /> {errorMessage && <div className={styles.errorMessage}>{errorMessage}</div>} </div> ); }; export default Input; // Input.module.css .inputContainer { display: flex; flex-direction: column; margin-bottom: 10px; } .inputLabel { margin-bottom: 5px; font-weight: bold; } .inputField { padding: 8px; border: 1px solid #ccc; border-radius: 4px; font-size: 16px; } .errorMessage { color: red; font-size: 12px; } """ **Explanation:** The "Input" component handles only the rendering and management of a single input field. Error messages and labels are controlled within the component. ### 1.3. Standard: Versioning and Semantic Versioning **Standard:** All components should be versioned using Semantic Versioning (SemVer). Major versions should indicate breaking changes. **Why:** SemVer allows developers to manage dependencies and understand the impact of updates. In a Monorepo, this is even more critical as changes in one component can affect multiple applications. Automated dependency updates and change management tools rely on accurate versioning. **Do This:** * Use "npm version", "yarn version", or "pnpm version" to manage versions. * Follow SemVer principles for versioning. * Publish components with clearly defined versions. Use a tool like "changesets" or similar for managing release and versioning. **Don't Do This:** * Make breaking changes without bumping the major version. * Publish components without a version number. * Ignore SemVer best practices. **Code Example (package.json - Using changesets to manage versioning):** """json // packages/ui-library/package.json { "name": "@my-monorepo/ui-library", "version": "1.2.3", "description": "A UI library for my monorepo.", "main": "dist/index.js", "module": "dist/index.esm.js", "types": "dist/index.d.ts", "scripts": { "build": "rollup -c", "test": "jest", "lint": "eslint src --ext .ts,.tsx", "prepare": "npm run build", "version": "changeset version", //Uses changesets cli "release": "npm publish"//Uses changesets cli }, "dependencies": { //Example of using react and styled components. "react": "^18.0.0", "styled-components": "^5.0.0" }, "devDependencies": { "@changesets/cli": "^2.26.2", "@rollup/plugin-commonjs": "^25.0.7", "@rollup/plugin-node-resolve": "^15.2.3", "@rollup/plugin-typescript": "^11.1.5", "@types/react": "^18.0.0", "@types/styled-components": "^5.0.0", "rollup": "^2.79.1", "rollup-plugin-peer-deps-external": "^2.2.4", "typescript": "^4.9.5" }, "peerDependencies": { "react": "^18.0.0", "styled-components": "^5.0.0" }, "files": [ "dist" ], "publishConfig": { "access": "public" } } """ **Explanation:** The "changeset version" command, when part of the version lifecycle hook, automates the SemVer bumping process based on changeset files, which explicitly declare versions. Peer dependencies clearly define the React and Styled-Components versions required by the library. 
This explicit declaration creates a clear contract. ## 2. Component API Design ### 2.1. Standard: Explicit Props **Standard:** Components should accept data and behavior as explicit props, rather than relying on implicit state or context. **Why:** Explicit props make components easier to understand and reason about. They improve testability and reduce side effects. In a Monorepo, this is important for ensuring components are predictable and behave consistently across different applications. **Do This:** * Define all required props with TypeScript/JavaScript. * Use default props for optional values. * Document all props clearly. **Don't Do This:** * Rely on global state or context for component behavior unless absolutely required. **Code Example (TypeScript/React):** """typescript // packages/ui-library/src/components/Avatar/Avatar.tsx import React from 'react'; import styles from './Avatar.module.css'; interface AvatarProps { imageUrl: string; size?: 'small' | 'medium' | 'large'; altText?: string; } const Avatar: React.FC<AvatarProps> = ({ imageUrl, size = 'medium', altText = "User Avatar" }) => { let avatarSizeClass = styles.mediumAvatar; if (size === 'small') { avatarSizeClass = styles.smallAvatar; } else if (size === 'large') { avatarSizeClass = styles.largeAvatar; } return ( <img src={imageUrl} alt={altText} className={"${styles.avatar} ${avatarSizeClass}"} /> ); }; export default Avatar; //Avatar.module.css .avatar { border-radius: 50%; } .smallAvatar { width: 30px; height: 30px; } .mediumAvatar { width: 50px; height: 50px; } .largeAvatar { width: 80px; height: 80px; } """ **Explanation:** The "Avatar" component defines the "imageUrl", "size", and "altText" properties explicitly. The default value for size is set to "medium". ### 2.2. Standard: Event Handling **Standard:** Components should emit events using well-defined event handler props. **Why:** Allows parent components to react to events triggered by child components. Helps keep components decoupled and reusable. Standardized event handling makes it easier to track component interactions within the Monorepo. **Do This:** * Use descriptive event handler names (e.g., "onInputChange", "onSubmit"). * Pass necessary data as arguments to the event handler. * Create interfaces for event handler payloads. **Don't Do This:** * Directly modify the state of parent components. * Use generic event handlers without clear purpose. **Code Example (TypeScript/React):** """typescript // packages/ui-library/src/components/SearchInput/SearchInput.tsx import React, { useState, ChangeEvent } from 'react'; import styles from './SearchInput.module.css'; interface SearchInputProps { onSearch: (query: string) => void; placeholder?: string; } const SearchInput: React.FC<SearchInputProps> = ({ onSearch, placeholder = "Search..." 
}) => { const [searchTerm, setSearchTerm] = useState(''); const handleInputChange = (event: ChangeEvent<HTMLInputElement>) => { const newSearchTerm = event.target.value; setSearchTerm(newSearchTerm); onSearch(newSearchTerm); //Calls callback on every input change }; return ( <input type="text" placeholder={placeholder} value={searchTerm} onChange={handleInputChange} className={styles.searchInput} /> ); }; export default SearchInput; //SearchInput.module.css .searchInput { padding: 8px 12px; border: 1px solid #ccc; border-radius: 4px; font-size: 14px; width: 200px; /* Adjust width as needed */ outline: none; /* Removes the default focus outline */ } /* Style for when the input is focused (optional) */ .searchInput:focus { border-color: #007bff; /* Highlight the border on focus */ box-shadow: 0 0 5px rgba(0,123,255,0.5); /* Add a subtle shadow */ } """ **Explanation:** The "SearchInput" component has an "onSearch" prop that's a function taking the search query as an argument. Whenever text is entered into the input field, the "onSearch" function is called. The "placeholder" defaults to "Search...". ## 3. Styling and Theming ### 3.1. Standard: CSS Modules or Styled Components **Standard:** Use CSS Modules or Styled Components for component styling. **Why:** These techniques provide component-level styling, reducing the risk of style conflicts. They also improve maintainability and reusability. In a Monorepo, where multiple teams might be working on different applications, modular styling is essential. **Do This:** * Choose either CSS Modules or Styled Components and stick to it within a component library. * Use descriptive class names or style names. * Avoid global CSS styles that can conflict with other components. **Don't Do This:** * Use inline styles excessively. * Use global CSS classes without proper scoping or naming conventions. **Code Example (Styled Components):** """typescript // packages/ui-library/src/components/Alert/Alert.tsx import React, { ReactNode } from 'react'; import styled from 'styled-components'; interface AlertProps { children: ReactNode; type?: 'success' | 'warning' | 'error'; } const AlertContainer = styled.div<Pick<AlertProps, "type">>" padding: 10px; border-radius: 5px; margin-bottom: 10px; background-color: ${(props) => { switch (props.type) { case 'success': return '#d4edda'; case 'warning': return '#fff3cd'; case 'error': return '#f8d7da'; default: return '#e2e3e5'; } }}; color: ${(props) => { switch (props.type) { case 'success': return '#155724'; case 'warning': return '#856404'; case 'error': return '#721c24'; default: return '#000'; } }}; "; const Alert: React.FC<AlertProps> = ({ children, type = 'success' }) => { return ( <AlertContainer type={type}> {children} </AlertContainer> ); }; export default Alert; """ **Explanation:** This uses Styled Components to create a styled "AlertContainer" div. The background color and text color are dynamically set based on the "type" prop. ### 3.2. Standard: Theming Support **Standard:** Create components that are theme-aware, supporting light and dark themes, or other customized styles. **Why:** Increases the flexibility and reusability of components across different applications and user preferences. Theming provides a consistent user experience. **Do This:** * Use CSS Variables or Styled Components' theming capabilities. * Provide a default theme and allow applications to override it. **Don't Do This:** * Hardcode colors and styles within components. 
**Code Example (Styled Components with Theming):** """typescript // packages/ui-library/src/components/TextInput/TextInput.tsx import React, { useState, ChangeEvent, useContext } from 'react'; import styled from 'styled-components'; import { ThemeContext } from './ThemeProvider'; // Assuming you have a ThemeProvider interface TextInputProps { placeholder?: string; } const StyledInput = styled.input" padding: 8px 12px; border: 1px solid ${(props) => props.theme.borderColor}; border-radius: 4px; font-size: 14px; width: 200px; outline: none; background-color: ${(props) => props.theme.backgroundColor}; color: ${(props) => props.theme.textColor}; &:focus { border-color: ${(props) => props.theme.primaryColor}; box-shadow: 0 0 5px rgba(0,123,255,0.5); } "; const TextInput: React.FC<TextInputProps> = ({ placeholder = "Enter Text" }) => { const theme = useContext(ThemeContext); //Accessing the theme return ( <StyledInput placeholder={placeholder} theme={theme}/> //Pass theme as props ); }; export default TextInput; // packages/ui-library/src/components/ThemeProvider.tsx import React, { createContext, useState, useContext, ReactNode } from 'react'; import { ThemeProvider as StyledThemeProvider } from 'styled-components'; // Define the theme interface interface Theme { primaryColor: string; backgroundColor: string; textColor: string; borderColor: string; } // Define default themes const lightTheme: Theme = { primaryColor: '#007bff', backgroundColor: '#ffffff', textColor: '#333333', borderColor: '#cccccc' }; const darkTheme: Theme = { primaryColor: '#00aaff', // A slightly brighter shade for dark mode backgroundColor: '#333333', textColor: '#ffffff', borderColor: '#555555' }; // Create a context for the theme interface ThemeContextType { theme: Theme; toggleTheme: () => void; } const ThemeContext = createContext<ThemeContextType>({ theme: lightTheme, // Default theme toggleTheme: () => {} // Dummy function to avoid null checks }); // Create a ThemeProvider component interface ThemeProviderProps { children: ReactNode; } const ThemeProvider: React.FC<ThemeProviderProps> = ({ children }) => { const [currentTheme, setCurrentTheme] = useState<Theme>(lightTheme); // Function to toggle between light and dark themes const toggleTheme = () => { setCurrentTheme(currentTheme === lightTheme ? darkTheme : lightTheme); }; return ( <ThemeContext.Provider value={{ theme: currentTheme, toggleTheme }}> <StyledThemeProvider theme={currentTheme}> {children} </StyledThemeProvider> </ThemeContext.Provider> ); }; // Custom hook to use the theme const useTheme = () => useContext(ThemeContext); export { ThemeProvider, useTheme, ThemeContext }; export type { Theme }; """ **Explanation:** 1. **Theme Definition:** The "Theme" interface defines the properties for the theme, such as "primaryColor", "backgroundColor", "textColor", and "borderColor". 2. **Default Themes:** "lightTheme" and "darkTheme" are defined as default themes. 3. **ThemeContext:** "ThemeContext" is created to provide the theme to the components. 4. **ThemeProvider:** The "ThemeProvider" component manages the current theme state and provides a "toggleTheme" function to switch between themes. It utilizes Styled Components' "ThemeProvider" to pass down the theme. 5. **useTheme Hook:** A custom "useTheme" hook is provided to easily access the theme and "toggleTheme" function in components. 6. **Styled Components Integration:** Styled Components are used to create the styled TextInput. 
## 4. Component Testing

### 4.1. Standard: Unit Tests

**Standard:** Write unit tests for all components, focusing on testing their public API and behavior.

**Why:** Ensures components function correctly and reduces the risk of regressions. Unit tests are fast and provide detailed feedback. In a Monorepo, component tests ensure that changes in one component don't break other parts of the system. Especially crucial when one component is utilized in multiple applications/packages.

**Do This:**

* Use a testing framework like Jest or Mocha.
* Write tests for all possible states and inputs.
* Use mocks and stubs to isolate components during testing.
* Aim for high test coverage.

**Don't Do This:**

* Skip writing tests for complex components.
* Write brittle tests that are tightly coupled to implementation details.

**Code Example (Jest/React Testing Library):**

"""typescript
// packages/ui-library/src/components/Counter/Counter.tsx
import React, { useState } from 'react';
import styles from './Counter.module.css';

interface CounterProps {
  initialValue?: number;
}

const Counter: React.FC<CounterProps> = ({ initialValue = 0 }) => {
  const [count, setCount] = useState(initialValue);

  const increment = () => {
    setCount(count + 1);
  };

  const decrement = () => {
    setCount(count - 1);
  };

  return (
    <div className={styles.counterContainer}>
      <button onClick={decrement} className={styles.counterButton}>-</button>
      <span className={styles.counterValue}>{count}</span>
      <button onClick={increment} className={styles.counterButton}>+</button>
    </div>
  );
};

export default Counter;

//Counter.module.css
.counterContainer {
  display: flex;
  align-items: center;
}

.counterButton {
  padding: 5px 10px;
  margin: 0 5px;
  font-size: 16px;
  cursor: pointer;
}

.counterValue {
  font-size: 18px;
  margin: 0 10px;
}

// packages/ui-library/src/components/Counter/Counter.test.tsx
import React from 'react';
import { render, screen, fireEvent } from '@testing-library/react';
import Counter from './Counter';

describe('Counter Component', () => {
  test('renders initial value correctly', () => {
    render(<Counter initialValue={5} />);
    const countElement = screen.getByText('5');
    expect(countElement).toBeInTheDocument();
  });

  test('increments count when increment button is clicked', () => {
    render(<Counter initialValue={0} />);
    const incrementButton = screen.getByText('+');
    fireEvent.click(incrementButton);
    const countElement = screen.getByText('1');
    expect(countElement).toBeInTheDocument();
  });

  test('decrements count when decrement button is clicked', () => {
    render(<Counter initialValue={10} />);
    const decrementButton = screen.getByText('-');
    fireEvent.click(decrementButton);
    const countElement = screen.getByText('9');
    expect(countElement).toBeInTheDocument();
  });
});
"""

**Explanation:** The "Counter.test.tsx" file uses React Testing Library to test the "Counter" component. It verifies that the initial value is rendered correctly and that the increment and decrement buttons work as expected.

### 4.2. Standard: Component Storybook or Similar Documentation

**Standard:** Use a component documentation tool like Storybook to showcase the different states and variations of each component visually.

**Why:** Provides a living style guide and documentation for components. Facilitates communication and collaboration between designers and developers. Helps ensure visual consistency across the Monorepo. Crucial for component discovery and understanding of purpose.
**Do This:** * Create stories for all components, covering different props and states. * Use addons to enhance Storybook functionality (e.g., accessibility checks). * Keep stories up-to-date as components evolve. **Don't Do This:** * Treat Storybook as an afterthought. * Create incomplete or outdated stories. **Code Example (Storybook Story):** """typescript // packages/ui-library/src/components/Button/Button.stories.tsx import React from 'react'; import { Story, Meta } from '@storybook/react'; import Button from './Button'; export default { title: 'Components/Button', component: Button, argTypes: { variant: { control: { type: 'select', options: ['primary', 'secondary'] }, }, onClick: { action: 'clicked' }, }, } as Meta; const Template: Story = (args) => <Button {...args} />; export const Primary = Template.bind({}); Primary.args = { children: 'Primary Button', variant: 'primary', }; export const Secondary = Template.bind({}); Secondary.args = { children: 'Secondary Button', variant: 'secondary', }; """ **Explanation:** This Storybook story defines two variations of the "Button" component: "Primary" and "Secondary". The user can interact with these stories in the Storybook UI. ## 5. Performance Considerations ### 5.1. Standard: Minimize Re-renders **Standard:** Optimize components to minimize unnecessary re-renders. **Why:** Re-renders can be performance bottlenecks, especially in complex applications. Careful optimization is essential in a Monorepo where components are shared across multiple applications. **Do This:** * Use "React.memo" for functional components that receive the same props. * Implement "shouldComponentUpdate" or "PureComponent" for class components. * Use immutable data structures. **Don't Do This:** * Rely on default React behavior for all components without considering performance. **Code Example (React.memo):** """typescript // packages/ui-library/src/components/DisplayValue/DisplayValue.tsx import React from 'react'; import styles from './DisplayValue.module.css'; interface DisplayValueProps { value: string; } const DisplayValue: React.FC<DisplayValueProps> = ({ value }) => { console.log("DisplayValue rendered with value: ${value}"); return <div className={styles.displayValue}>{value}</div>; }; export default React.memo(DisplayValue); //DisplayValue.module.css .displayValue { font-size: 20px; font-weight: bold; color: #333; /* or any color that suits your design */ padding: 10px; /* some padding to give it space */ border: 1px solid #ccc; /* optional: a subtle border */ border-radius: 5px; /* optional: rounded corners for a softer look */ background-color: #f9f9f9; /* optional: a very light background */ text-align: center; /* centers the text */ } """ **Explanation:** "React.memo" memoizes the "DisplayValue" component, preventing re-renders if the "value" prop hasn't changed. ### 5.2. Standard: Code Splitting **Standard:** Implement code splitting to reduce the initial load time of applications. **Why:** Splitting code into smaller chunks allows the browser to download only the code that's needed initially, improving performance. In a Monorepo, this is essential because the codebase can be very large. **Do This:** * Use dynamic imports ("import()") to load components on demand. * Use tools like Webpack or Rollup to configure code splitting. * Identify chunks that can be loaded lazily. **Don't Do This:** * Load all components upfront, even if they aren't needed immediately. 
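Route-based splitting, one of the Do-This items above, can be sketched with React Router (assuming "react-router-dom" v6); the route structure and page components here are hypothetical, and the simpler dynamic-import example follows below.

"""typescript
// Hypothetical packages/app/src/AppRoutes.tsx
import React, { lazy, Suspense } from 'react';
import { BrowserRouter, Routes, Route } from 'react-router-dom';

// Each page becomes its own chunk and is only fetched when its route is visited
const HomePage = lazy(() => import('./pages/HomePage'));
const SettingsPage = lazy(() => import('./pages/SettingsPage'));

const AppRoutes: React.FC = () => (
  <BrowserRouter>
    <Suspense fallback={<div>Loading...</div>}>
      <Routes>
        <Route path="/" element={<HomePage />} />
        <Route path="/settings" element={<SettingsPage />} />
      </Routes>
    </Suspense>
  </BrowserRouter>
);

export default AppRoutes;
"""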
**Code Example (Dynamic Import):** """typescript // packages/app/src/App.tsx import React, { lazy, Suspense } from 'react'; const LazyLoadedComponent = lazy(() => import('@my-monorepo/ui-library/MyComponent')); const App = () => { return ( <Suspense fallback={<div>Loading...</div>}> <LazyLoadedComponent /> </Suspense> ); }; export default App; """ **Explanation:** The "@my-monorepo/ui-library/MyComponent" component is loaded lazily using "lazy" and "Suspense". This means that the component's code will only be downloaded when it's actually needed. ## 6. Accessibility ### 6.1. Standard: ARIA Attributes **Standard:** Utilize ARIA attributes to enhance the accessibility of components. **Why:** ARIA attributes provide semantic information to assistive technologies, making components more accessible to users with disabilities. **Do This:** * Use ARIA attributes to describe the role, state, and properties of elements. * Provide clear and concise labels for interactive elements. * Test components with screen readers. **Don't Do This:** * Use ARIA attributes incorrectly or unnecessarily. * Rely solely on ARIA attributes without providing proper semantic HTML. **Code Example (ARIA Attributes):** """typescript // packages/ui-library/src/components/ToggleSwitch/ToggleSwitch.tsx import React from 'react'; import styles from './ToggleSwitch.module.css'; interface ToggleSwitchProps { checked: boolean; onChange: (checked: boolean) => void; label?: string; } const ToggleSwitch: React.FC<ToggleSwitchProps> = ({ checked, onChange, label = "Enable" }) => { return ( <div className={styles.toggleContainer}> <label className={styles.switch}> <input type="checkbox" checked={checked} onChange={(e) => onChange(e.target.checked)} role="switch" aria-checked={checked} aria-label={label} /> <span className={"${styles.slider} ${styles.round}"}></span> </label> </div> ); }; export default ToggleSwitch; //ToggleSwitch.module.css .toggleContainer { display: flex; align-items: center; } .switch { position: relative; display: inline-block; width: 60px; height: 34px; } /* Hide default HTML checkbox */ .switch input { opacity: 0; width: 0; height: 0; } /* The slider */ .slider { position: absolute; cursor: pointer; top: 0; left: 0; right: 0; bottom: 0; background-color: #ccc; transition: .4s; } .slider:before { position: absolute; content: ""; height: 26px; width: 26px; left: 4px; bottom: 4px; background-color: white; transition: .4s; } input:checked + .slider { background-color: #2196F3; } input:focus + .slider { box-shadow: 0 0 1px #2196F3; } input:checked + .slider:before { transform: translateX(26px); } /* Rounded sliders */ .slider.round { border-radius: 34px; } .slider.round:before { border-radius: 50%; } """ **Explanation:** The "ToggleSwitch" component uses "role="switch"" and "aria-checked" to provide semantic information about the toggle switch to assistive technologies. "aria-label" provides text for screenreaders. ### 6.2. Standard: Keyboard Navigation **Standard:** Ensure that all interactive components are accessible via keyboard navigation. **Why:** Users who cannot use a mouse rely on keyboard navigation to interact with web applications. **Do This:** * Use proper HTML elements (e.g., "<button>", "<a>") that support keyboard navigation by default. * Use the "tabindex" attribute to control the focus order. * Provide visual focus indicators. **Don't Do This:** * Remove the focus outline without providing an alternative visual indicator. ## 7. Monorepo Specific Considerations ### 7.1. 
Standard: Dependency Management **Standard:** Strictly control dependencies between packages in the Monorepo. Avoid circular dependencies. **Why:** Circular dependencies can lead to build issues, runtime errors, and increased complexity. **Do This:** * Use a tool like "madge" or "depcheck" to detect circular dependencies. * Refactor code to remove circular dependencies. * Clearly define the public API of each package using TypeScript definition files. **Don't Do This:** * Introduce circular dependencies between packages willy-nilly. * Ignore dependency management best practices. ### 7.2. Standard: Build Tooling **Standard:** Use a build system that can efficiently build and test only the affected packages when changes are made. Tools like Nx and Turborepo can be very helpful. **Why:** Helps to keep build times low, improving developer productivity. **Do This:** - Use tools designed for monorepos like Nx or Turborepo to intelligently build and cache task outputs. - Clearly define the dependencies between packages in your build configuration. - Leverage caching mechanisms to avoid rebuilding unchanged packages. **Don't Do This:** - Build all packages from scratch every time, as this is inefficient. - Neglect to configure your tooling properly to track and optimize your build process. This document provides a comprehensive set of coding standards for component design within a Monorepo architecture. Adhering to these standards will help create reusable, maintainable, and performant components that can be shared across multiple applications. These are living documents and continuous feedback should be incorporated to refine/improve this document as new practices emerge.
# State Management Standards for Monorepo This document outlines the standards for state management within our monorepo. Effective state management is crucial for maintainability, performance, and scalability across our applications and libraries. These standards aim to provide a consistent approach to handling application state, data flow, and reactivity within the monorepo. ## 1. Principles of State Management in a Monorepo A monorepo architecture introduces unique challenges and opportunities regarding state management. Due to code sharing and potential inter-dependencies between projects, a unified and well-defined state management strategy becomes paramount. * **Standard:** Utilize a predictable and unidirectional data flow. * **Why:** Ensures that changes to state are traceable and debuggable, preventing unintended side effects across the monorepo. * **Do This:** Favor architectures like Flux, Redux, or their modern counterparts with clear data flow patterns. * **Don't Do This:** Avoid directly mutating state across different components or services without a defined flow. * **Standard:** Favor immutable data structures. * **Why:** Simplifies debugging, allows for easy change detection, and improves performance by enabling shallow comparisons. * **Do This:** Use libraries like Immutable.js, Immer, or native JavaScript with spread operators to create new, immutable state objects. * **Don't Do This:** Directly modify state objects, as this can lead to unpredictable behavior and difficult-to-trace bugs. * **Standard:** Separate stateful logic from presentational components. * **Why:** Enhances reusability, testability, and maintainability by isolating state-specific code. * **Do This:** Implement the Container/Presentational pattern or use hooks to separate data fetching and state manipulation from UI rendering. * **Don't Do This:** Embed complex state logic directly within UI components. * **Standard:** Define clear boundaries for state domains. * **Why:** Prevents components and services from accidentally modifying state that they shouldn't have access to. * **Do This:** Use techniques like context providers or scoped state management solutions to isolate state to specific parts of the application. * **Don't Do This:** Allow global, shared state to be modified from anywhere in the codebase without clear ownership or access controls. * **Standard:** Handle side effects carefully. * **Why:** Side effects (API calls, DOM manipulations, etc.) can introduce complexity and make state updates less predictable. * **Do This:** Isolate side effects within dedicated modules or using middleware/thunks in state management libraries. * **Don't Do This:** Perform side effects directly within reducers or component render functions. ## 2. Choosing a State Management Library Selecting the right state management library is critical. The choice depends on the project's complexity, team familiarity, and performance requirements. The monorepo should adopt a limited set of preferred libraries to promote consistency. * **Preferred Libraries:** For React-based applications, consider Zustand, Recoil, Jotai, or Redux Toolkit. For Vue-based applications, consider Pinia or Vuex. (These are leading contenders as of late 2024/early 2025.) * **Zustand:** A small, fast, and scalable bearbones state-management solution using simplified flux principles. * **Recoil:** A state management library for React that lets you create data-flow graphs. Particularly suited to complex dependencies. 
Can require more boilerplate than Zustand. * **Jotai:** Primitive and flexible state management based on an atomic model. * **Redux Toolkit:** An opinionated, batteries-included toolset for efficient Redux development, simplifying configuration and reducing boilerplate. Often combined now with RTK Query for data fetching. * **Pinia:** The recommended state management solution for Vue 3, offering a simpler and more intuitive API compared to Vuex. * **Vuex:** The official state management library for Vue, suitable for complex applications requiring centralized state management. * **Standard:** Justify the choice of state management library in the project's README. * **Why:** Provides context for other developers and helps maintain consistency across the monorepo. * **Do This:** Document the reasons for selecting a specific library, considering factors like team expertise, project complexity, and performance requirements. * **Don't Do This:** Choose a library arbitrarily without properly evaluating its suitability for the project. ## 3. Zustand State Management Examples Zustand is a minimalist and flexible state management solution suitable for many projects within a monorepo. ### 3.1 Core Implementation * **Standard:** Create a store using "create" from Zustand. * **Standard:** Define state and actions within the store function. """javascript // packages/my-app/src/store/myStore.js import { create } from 'zustand'; const useMyStore = create((set) => ({ count: 0, increment: () => set((state) => ({ count: state.count + 1 })), decrement: () => set((state) => ({ count: state.count - 1 })), reset: () => set({ count: 0 }), // Example with async action fetchData: async () => { const response = await fetch('/api/data'); // Replace with real API endpoint const data = await response.json(); set({ data: data }); // Assumes you add "data" to the initial state. }, })); export default useMyStore; """ * **Why:** Provides a simple and efficient way to manage state using hooks. * **Do This:** Use functional updates to ensure immutability. * **Don't Do This:** Mutate the state directly. ### 3.2 Using the Store in Components * **Standard:** Use the custom hook "useMyStore" to access state and actions within components. """javascript // packages/my-app/src/components/MyComponent.js import React from 'react'; import useMyStore from '../store/myStore'; function MyComponent() { const { count, increment, decrement, reset, fetchData } = useMyStore(); return ( <div> <p>Count: {count}</p> <button onClick={increment}>Increment</button> <button onClick={decrement}>Decrement</button> <button onClick={reset}>Reset</button> <button onClick={fetchData}>Fetch Data</button> </div> ); } export default MyComponent; """ * **Why:** Simplifies component logic and promotes reusability. ### 3.3. Middleware and Persistence * Zustand uses middleware for advanced functionality like persistence. """javascript // packages/my-app/src/store/myStore.js import { create } from 'zustand'; import { persist } from 'zustand/middleware' const useMyStore = create(persist( (set, get) => ({ count: 0, increment: () => set({ count: get().count + 1 }), decrement: () => set({ count: get().count - 1 }), }), { name: 'my-store', // unique name getStorage: () => localStorage, // (optional) default localStorage } )) export default useMyStore; """ * The "persist" middleware automatically saves the state to local storage. * **Why:** Enables easy persistence of state across sessions. ## 4. 
Recoil State Management Examples

Recoil offers a different approach based on atoms and selectors, suitable for complex dependency graphs.

### 4.1 Core Implementation

* **Standard:** Define atoms for state and selectors for derived state.

"""javascript
// packages/my-app/src/recoil/atoms.js
import { atom } from 'recoil';

export const countState = atom({
  key: 'countState',
  default: 0,
});

// packages/my-app/src/recoil/selectors.js
import { selector } from 'recoil';
import { countState } from './atoms';

export const doubledCountState = selector({
  key: 'doubledCountState',
  get: ({ get }) => {
    const count = get(countState);
    return count * 2;
  },
});
"""

* **Why:** Provides a flexible and efficient way to manage complex state dependencies.
* **Do This:** Use unique keys for atoms and selectors.
* **Don't Do This:** Use generic keys that might conflict with other parts of the application.

### 4.2 Using Recoil in Components

* **Standard:** Use "useRecoilState" and "useRecoilValue" hooks to access Recoil state and derived values.

"""javascript
// packages/my-app/src/components/MyComponent.js
import React from 'react';
import { useRecoilState, useRecoilValue } from 'recoil';
import { countState } from '../recoil/atoms';
import { doubledCountState } from '../recoil/selectors';

function MyComponent() {
  const [count, setCount] = useRecoilState(countState);
  const doubledCount = useRecoilValue(doubledCountState);

  return (
    <div>
      <p>Count: {count}</p>
      <p>Doubled Count: {doubledCount}</p>
      <button onClick={() => setCount(count + 1)}>Increment</button>
    </div>
  );
}

export default MyComponent;
"""

* **Why:** Simplifies component logic and promotes reusability.

### 4.3 Asynchronous Selectors for Data Fetching

Recoil excels with asynchronous data fetching.

"""javascript
import { selector } from 'recoil';

export const asyncDataState = selector({
  key: 'asyncDataState',
  get: async () => {
    const response = await fetch('/api/data'); // Replace with a real API endpoint
    const data = await response.json();
    return data;
  },
});
"""

* "useRecoilValue" is used to access the data in components; wrap the component tree in "Suspense" (or read the value with "useRecoilValueLoadable") to handle the pending state.

## 5. Redux Toolkit Examples

Redux Toolkit simplifies Redux development with opinionated defaults and utility functions. RTK Query is the recommended approach to data fetching with Redux.

### 5.1 Core Implementation

* **Standard:** Configure a Redux store using "configureStore" from Redux Toolkit.
* **Standard:** Define reducers using "createSlice".

"""javascript
// packages/my-app/src/store/store.js
import { configureStore } from '@reduxjs/toolkit';
import counterReducer from './counterSlice';

export const store = configureStore({
  reducer: {
    counter: counterReducer,
  },
});

// packages/my-app/src/store/counterSlice.js
import { createSlice } from '@reduxjs/toolkit';

export const counterSlice = createSlice({
  name: 'counter',
  initialState: {
    value: 0,
  },
  reducers: {
    increment: (state) => {
      state.value += 1;
    },
    decrement: (state) => {
      state.value -= 1;
    },
    incrementByAmount: (state, action) => {
      state.value += action.payload;
    },
  },
});

export const { increment, decrement, incrementByAmount } = counterSlice.actions;
export default counterSlice.reducer;
"""

* **Why:** Provides a simplified and efficient way to manage Redux state.
* **Do This:** Use "createSlice" to automatically generate action creators and reducer logic.
* **Don't Do This:** Write manual action creators and reducers, as this can lead to boilerplate and errors.
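In TypeScript packages it can also help to export typed hooks next to the store, so components do not have to repeat the state type. This is a minimal sketch (the "hooks.ts" file name is illustrative, not an existing convention in this repo); component usage is covered in the next section.

"""typescript
// packages/my-app/src/store/hooks.ts (illustrative file name)
import { useDispatch, useSelector, TypedUseSelectorHook } from 'react-redux';
import { store } from './store';

// Infer the state and dispatch types directly from the configured store
export type RootState = ReturnType<typeof store.getState>;
export type AppDispatch = typeof store.dispatch;

// Pre-typed hooks to use throughout the app instead of the plain ones
export const useAppDispatch = () => useDispatch<AppDispatch>();
export const useAppSelector: TypedUseSelectorHook<RootState> = useSelector;
"""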
### 5.2 Using Redux in Components * **Standard:** Use "useSelector" and "useDispatch" hooks from "react-redux" to access state and dispatch actions within components. """javascript // packages/my-app/src/components/MyComponent.js import React from 'react'; import { useSelector, useDispatch } from 'react-redux'; import { increment, decrement, incrementByAmount } from '../store/counterSlice'; function MyComponent() { const count = useSelector((state) => state.counter.value); const dispatch = useDispatch(); return ( <div> <p>Count: {count}</p> <button onClick={() => dispatch(increment())}>Increment</button> <button onClick={() => dispatch(decrement())}>Decrement</button> <button onClick={() => dispatch(incrementByAmount(5))}>Increment by 5</button> </div> ); } export default MyComponent; """ * **Why:** Simplifies component logic and promotes reusability. ### 5.3 RTK Query for Data Fetching RTK Query simplifies data fetching in Redux applications. """javascript // packages/my-app/src/services/api.js import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react' export const api = createApi({ baseQuery: fetchBaseQuery({ baseUrl: '/' }), // Adjust base URL as needed. Consider using env vars. endpoints: (builder) => ({ getData: builder.query({ query: () => "data", // Actual endpoint }), }), }); export const { useGetDataQuery } = api; // In store.js: import { configureStore } from '@reduxjs/toolkit'; import { api } from './services/api'; export const store = configureStore({ reducer: { [api.reducerPath]: api.reducer, }, middleware: (getDefaultMiddleware) => getDefaultMiddleware().concat(api.middleware), }); //In a component: import { useGetDataQuery } from '../services/api'; function MyComponent() { const { data, error, isLoading } = useGetDataQuery(); if (isLoading) return <div>Loading...</div>; if (error) return <div>Error: {error.message}</div>; return ( <div> {data.map(item => ( <div key={item.id}>{item.name}</div> ))} </div> ); } """ * **Why:** Provides a streamlined and efficient way to fetch and cache data using Redux. * **Do This:** Define API endpoints using "createApi". * **Don't Do This:** Manually fetch data and manage loading states and errors, as RTK Query handles this automatically. ## 6. Vue.js State Management with Pinia Pinia is the recommended state management solution for Vue 3. ### 6.1 Core Implementation * **Standard**: Define stores using "defineStore" from Pinia. """javascript // packages/my-app/src/stores/counter.js import { defineStore } from 'pinia' export const useCounterStore = defineStore('counter', { state: () => ({ count: 0, }), getters: { doubleCount: (state) => state.count * 2, }, actions: { increment() { this.count++ }, decrement() { this.count-- }, async fetchData() { // Example of making an API call, adapt to your needs const response = await fetch('/api/data') const data = await response.json() // Assign the fetched data to a state variable this.count = data.count; // Adapt based on actual returned data } }, }) """ * **Why**: Provides a modular and scalable approach to managing state in Vue.js applications. * **Do This**: Utilize actions for mutations and getters for derived data. Avoid directly mutating outside of actions. * **Don't Do This**: Use "mapState", "mapGetters", and "mapActions" (Vuex syntax) in Pinia. Use the "use" composable hook instead. ### 6.2 Using Pinia in Components * **Standard**: Use the "useCounterStore" custom hook to access state, getters, and actions within components via the composable "use" pattern. 
"""vue // packages/my-app/src/components/MyComponent.vue <template> <p>Count: {{ counter.count }}</p> <p>Double Count: {{ counter.doubleCount }}</p> <button @click="counter.increment">Increment</button> <button @click="counter.decrement">Decrement</button> <button @click="counter.fetchData">Fetch Data</button> </template> <script setup> import { useCounterStore } from '../stores/counter' const counter = useCounterStore() </script> """ * **Why**: Provides a clear way to access store properties directly in the template and simplifies component logic. The "setup" script handles all state management. ## 7. Guidelines for Sharing State Across Packages Sharing state across packages within the monorepo needs careful consideration. * **Standard:** Avoid sharing mutable state directly between packages. * **Why:** Can lead to tight coupling and difficult-to-debug issues. * **Do This:** Use events, messages, or shared APIs to communicate state changes between packages. * **Don't Do This:** Directly import and modify state from one package into another. * **Standard:** Define shared state contracts using TypeScript interfaces. * **Why:** Ensures that state is transferred consistently and predictably between packages. * **Do This:** Create a shared "types" package to define interfaces for state objects. * **Don't Do This:** Use dynamic or untyped data structures for shared state. * **Standard:** Consider using a shared state management solution if multiple packages need to access the same state. * **Why:** Provides a centralized and consistent way to manage shared state. * **Do This:** Use a shared Redux store, Zustand store, or Recoil graph if necessary. ## 8. Testing State Management Testing state management logic is critical for ensuring application correctness. * **Standard:** Write unit tests for reducers, actions, and selectors. * **Why:** Ensures that state updates are predictable and correct. * **Do This:** Use testing libraries like Jest or Mocha to write unit tests. * **Don't Do This:** Skip testing state management logic, as this can lead to subtle bugs. * **Standard:** Write integration tests for components that interact with state. * **Why:** Ensures that components correctly dispatch actions and render state. * **Do This:** Use testing libraries like React Testing Library or Vue Test Utils to write integration tests. * **Don't Do This:** Rely solely on manual testing to verify state management. * **Standard:** Mock API calls when testing state management logic. * **Why:** Prevents tests from depending on external services and makes them more reliable. * **Do This:** Use mocking libraries like Mock Service Worker (MSW) or Nock to intercept and mock API calls. ## 9. Anti-Patterns and Mistakes to Avoid * **Over-reliance on Global State:** Avoid storing purely local component state in the global state management solution. Performance will suffer. * **Direct State Mutation:** Always ensure immutability. * **Ignoring Asynchronous Actions:** Handle async operations correctly, especially API calls. Use RTK Query, thunks, or comparable patterns. * **Lack of Testing:** State management logic is often complex and requires thorough testing. * **Unnecessary Complexity:** Choose the simplest state management solution that meets the project's needs. Don't automatically reach for Redux when Zustand will do. * **Tight Coupling:** Avoid creating tight dependencies between components and the state management implementation. 
* **Neglecting Performance:** Be aware of performance implications, especially when dealing with large state objects.
* **Not Using TypeScript:** TypeScript prevents many problems when refactoring and when reasoning about the data structures shared across the monorepo. Use it!
* **Magic Strings:** Use constants instead of string literals for action types and other repeated identifiers.

By adhering to these standards, we can ensure a consistent, maintainable, and scalable approach to state management across our monorepo. This document should be used as a reference for all development teams and integrated into code review processes. Continuously updating these standards as the ecosystem evolves is crucial for maintaining high-quality code.
# Performance Optimization Standards for Monorepo This document outlines coding standards and best practices for performance optimization within a Monorepo environment. These standards are designed to improve application speed, responsiveness, and resource utilization. Following these guidelines will result in more maintainable, scalable, and performant applications. ## 1. Architectural Considerations for Performance ### 1.1. Strategic Module Decomposition **Goal:** Minimize the impact of changes and builds across the entire repository and optimize for parallel build execution. * **Do This:** * Divide the Monorepo into cohesive, independent modules (libraries, applications, shared components). * Consider the "blast radius" of changes. Modifications to one module should ideally have minimal or no impact on unrelated modules. * Ensure well-defined public APIs for modules that need to interact. * **Don't Do This:** * Create a monolithic module containing everything. * Establish circular dependencies between modules. * Expose internal implementation details through public APIs. **Why:** Poor module decomposition leads to unnecessary rebuilds, increased testing burden, and difficulty in isolating performance bottlenecks. A well-structured Monorepo facilitates parallel builds, targeted testing, and independent deployments, all of which contribute to faster development cycles and improved performance. **Example:** """ monorepo/ ├── apps/ │ ├── web-app/ # Independent web application │ │ ├── src/ │ │ └── package.json │ ├── mobile-app/ # Independent mobile application │ │ ├── src/ │ │ └── package.json ├── libs/ │ ├── ui-components/ # Reusable UI components │ │ ├── src/ │ │ └── package.json │ ├── data-access/ # Data fetching and caching logic │ │ ├── src/ │ │ └── package.json └── tools/ └── scripts/ # Utility scripts (e.g., build, test) """ ### 1.2. Dependency Management **Goal:** Reduce build times and runtime overhead by minimizing unnecessary dependencies. * **Do This:** * Declare dependencies accurately (e.g., using "devDependencies" for build-time dependencies). * Utilize dependency analysis tools (like "npm audit", "yarn audit") to identify and mitigate security vulnerabilities and outdated packages. * Keep dependencies up to date to benefit from performance improvements and security patches. * Use tools like "pnpm" or "yarn" with workspace functionality for optimal dependency sharing and installation speed * **Don't Do This:** * Include unnecessary dependencies in your modules. * Rely on transitive dependencies without declaring them explicitly. **Why:** Excessive or poorly managed dependencies increase build times, bundle sizes, and potentially introduce security vulnerabilities. Explicitly managing dependencies ensures that each module only includes what it truly needs, optimizing overall performance. **Example (package.json):** """json { "name": "@my-monorepo/ui-components", "version": "1.0.0", "dependencies": { "@emotion/react": "^11.11.1", "@emotion/styled": "^11.11.0", "@mui/material": "^5.14.18" }, "devDependencies": { "@types/react": "^18.2.33", "@types/styled-components": "^5.1.29", "typescript": "^5.2.2" } } """ ### 1.3. Build System Optimization **Goal:** Minimize build times and optimize for incremental builds. * **Do This:** * Use a modern build system tailored for Monorepos, such as Nx, Bazel, or Turborepo. * Configure the build system to leverage caching and incremental builds. * Define clear build targets and dependencies within the build configuration. 
* Use parallel execution where appropriate to speed up build processes. * Profile your builds regularly to identify bottlenecks. * **Don't Do This:** * Use generic build tools that don't understand Monorepo structures. * Disable caching or incremental builds. * Create complex build scripts that are difficult to maintain. **Why:** Optimized build processes significantly reduce development time and improve developer productivity. Caching and incremental builds ensure that only necessary code is rebuilt, leading to substantial performance gains. A modern build system designed for Monorepos understands the relationships between modules and can optimize the build process accordingly. **Example (Nx "nx.json"):** """json { "tasksRunnerOptions": { "default": { "runner": "nx-cloud", "options": { "cacheableOperations": ["build", "lint", "test", "e2e"], "accessToken": "YOUR_NX_CLOUD_TOKEN" } } }, "affected": { "defaultBase": "main" }, "namedInputs": { "default": ["{projectRoot}/**/*", "sharedGlobals"], "production": [ "default", "!{projectRoot}/**/?(*.)+(spec|test).[jt]s?(x)?(.snap)", "!{projectRoot}/tsconfig.spec.json", "!{projectRoot}/jest.config.[jt]s", "!{projectRoot}/.eslintrc.json" ], "sharedGlobals": [] } } """ ### 1.4. Code Sharing and Reusability **Goal:** Avoid code duplication and promote efficient use of resources. * **Do This:** * Identify common functionality across modules and extract it into shared libraries. * Use a design system or component library for consistent UI elements. * Employ code generation techniques to reduce boilerplate code. * **Don't Do This:** * Duplicate code across multiple modules. * Create tightly coupled components that are difficult to reuse. **Why:** Code duplication increases maintenance costs and potential performance issues. Sharing code reduces the overall codebase size, promotes consistency, and simplifies updates. Using a component library improves rendering performance by reducing the amount of unique CSS and JavaScript that needs to be loaded. **Example:** Move common utility functions to a shared library. """typescript // libs/utils/src/index.ts export function formatCurrency(amount: number, currencyCode: string = 'USD'): string { return new Intl.NumberFormat('en-US', { style: 'currency', currency: currencyCode, }).format(amount); } // apps/web-app/src/components/Product.tsx import { formatCurrency } from '@my-monorepo/utils'; function Product({ price }: { price: number }) { return <div>Price: {formatCurrency(price)}</div>; } """ ## 2. Coding Practices for Performance ### 2.1. Lazy Loading and Code Splitting **Goal:** Reduce initial load times by loading code only when it is needed. * **Do This:** * Implement lazy loading for modules that are not immediately required. * Use code splitting to break large bundles into smaller chunks. * Consider route-based code splitting for single-page applications. * **Don't Do This:** * Load all code upfront. * Create excessively large bundles that take a long time to download and parse. **Why:** Initial load time is critical for user experience. Lazy loading and code splitting significantly improve startup performance by deferring the loading of non-essential code. **Example (React with "React.lazy"):** """jsx import React, { lazy, Suspense } from 'react'; const AnalyticsDashboard = lazy(() => import('./AnalyticsDashboard')); // Lazy-loaded component function App() { return ( <div> {/* ... other components ... 
*/} <Suspense fallback={<div>Loading...</div>}> <AnalyticsDashboard /> </Suspense> </div> ); } """ ### 2.2. Efficient Data Structures and Algorithms **Goal:** Optimize runtime performance by choosing appropriate data structures and algorithms. * **Do This:** * Select data structures based on access patterns (e.g., use a Set for membership tests, a Map for key-value lookups). * Use efficient algorithms for common operations (e.g., sorting, searching). * Consider the time and space complexity of your algorithms. * **Don't Do This:** * Use inefficient data structures or algorithms. * Perform unnecessary computations. **Why:** The choice of data structures and algorithms significantly impacts application performance. Choosing the right tools for the job can lead to dramatic improvements in speed and resource utilization. **Example:** """javascript // Efficiently check if an element exists in an array. Use a Set instead of an array for repeated lookups. const myArray = ['a', 'b', 'c', 'd', 'e']; const mySet = new Set(myArray); // Bad: Linear time complexity // myArray.includes('c'); // Good: Near-constant time complexity mySet.has('c'); """ ### 2.3. Memory Management **Goal:** Prevent memory leaks and optimize memory usage. * **Do This:** * Avoid creating unnecessary objects. * Release resources when they are no longer needed (e.g., event listeners, timers). * Use techniques like object pooling to reuse objects. * Be mindful of closures and their potential to capture large amounts of data. * Use tools like the Chrome DevTools memory profiler to identify memory leaks. * When possible, leverage technologies with automatic garbage collection. * **Don't Do This:** * Create large numbers of temporary objects. * Forget to release resources. * Store large amounts of data in memory unnecessarily. **Why:** Memory leaks and excessive memory usage can lead to performance degradation and application crashes. Proper memory management ensures that applications run smoothly and efficiently. **Example:** Removing event listeners to prevent memory leaks. """javascript class MyComponent { constructor() { this.handleClick = this.handleClick.bind(this); } componentDidMount() { window.addEventListener('click', this.handleClick); } componentWillUnmount() { window.removeEventListener('click', this.handleClick); // Remove the event listener } handleClick() { console.log('Clicked!'); } } """ ### 2.4. Minimize DOM Manipulation **Goal:** Reduce the performance overhead associated with updating the Document Object Model (DOM). * **Do This:** * Batch DOM updates. * Use virtual DOM techniques (e.g., React, Vue). * Avoid direct DOM manipulation where possible. * Use efficient selectors (e.g., avoid complex CSS selectors). * **Don't Do This:** * Perform frequent DOM updates. * Use inefficient DOM manipulation methods. **Why:** DOM manipulation is an expensive operation. Minimizing the number of DOM updates improves rendering performance and reduces layout thrashing. Virtual DOM techniques allow you to efficiently update the DOM by comparing the current state with the desired state and only making necessary changes. 
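For code that manipulates the DOM directly, outside a virtual-DOM framework, updates can be batched into a single insertion with a "DocumentFragment". Below is a small framework-free sketch (the helper function is illustrative); the React example that follows shows the equivalent idea with batched state updates.

"""typescript
// Append many list items with one DOM insertion instead of one per item
function renderItems(list: HTMLUListElement, items: string[]): void {
  const fragment = document.createDocumentFragment();
  for (const text of items) {
    const li = document.createElement('li');
    li.textContent = text;
    fragment.appendChild(li); // built off-DOM, so no reflow happens here
  }
  list.appendChild(fragment); // a single insertion and a single layout pass
}
"""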
**Example (React):** """jsx import React, { useState } from 'react'; function MyComponent() { const [items, setItems] = useState(['item1', 'item2', 'item3']); const addItem = () => { // Bad: Multiple state updates trigger multiple re-renders // setItems([...items, 'newItem1']); // setItems([...items, 'newItem2']); // Good: Batch updates into a single state update setItems(prevItems => [...prevItems, 'newItem1', 'newItem2']); }; return ( <div> <ul> {items.map(item => ( <li key={item}>{item}</li> ))} </ul> <button onClick={addItem}>Add Items</button> </div> ); } """ ### 2.5. Caching Strategies **Goal:** Reduce the need to repeatedly fetch or compute the same data. * **Do This:** * Implement caching at different levels (e.g., browser caching, server-side caching, in-memory caching). * Use appropriate cache invalidation strategies (e.g., time-based expiration, event-based invalidation). * Leverage Content Delivery Networks (CDNs) for static assets. * **Don't Do This:** * Cache data indefinitely without invalidation. * Cache sensitive data inappropriately. **Why:** Caching can dramatically improve application performance by reducing the load on servers and databases. Properly invalidating caches is crucial to ensure that users see the latest data. **Example (Browser caching using "Cache-Control" headers):** """javascript // Server-side code (e.g., Node.js with Express) app.get('/api/data', (req, res) => { // Set the Cache-Control header res.set('Cache-Control', 'public, max-age=3600'); // Cache for 1 hour // ... fetch and send data ... }); """ ## 3. Technology-Specific Considerations ### 3.1. JavaScript/TypeScript * **Do This:** * Use modern JavaScript features (e.g., "async/await", "const/let") for improved readability and performance. * Use TypeScript's type system to catch errors early and improve code maintainability. * Use "Array.map", "Array.filter", and "Array.reduce" instead of "for" loops where appropriate for more concise and potentially faster code. * Use "import" and "export" for modular code that utilizes tree shaking * **Don't Do This:** * Use legacy JavaScript features that are less performant or more difficult to understand. * Ignore TypeScript's type checking. ### 3.2. React * **Do This:** * Use "React.memo" to prevent unnecessary re-renders of pure components. * Use "useCallback" and "useMemo" to memoize functions and values. * Use keys effectively when rendering lists. * Profile your components using the React Profiler to identify performance bottlenecks. * Use code splitting and lazy loading with "React.lazy" and "Suspense". * **Don't Do This:** * Rely solely on "shouldComponentUpdate" for preventing re-renders (use "React.memo" instead). * Create new objects or functions inside render methods. ### 3.3. Node.js * **Do This:** * Use asynchronous operations and event loops effectively. * Optimize database queries. * Use connection pooling to reduce database connection overhead. * Use caching mechanisms (e.g., Redis, Memcached). * Profile your application using tools like Clinic.js to identify performance bottlenecks. * **Don't Do This:** * Perform blocking operations in the main event loop. ## 4. Profiling and Monitoring ### 4.1. Performance Audits * Perform regular performance audits using tools like Lighthouse, WebPageTest, or Chrome DevTools to identify areas for improvement. ### 4.2. Monitoring * Implement monitoring solutions to track key performance indicators (KPIs) such as response time, error rate, and resource utilization. 
Use these KPIs to proactively identify and address performance issues. * Consider using tools like Prometheus, Grafana, or Datadog for advanced monitoring and alerting. ## 5. Continuous Improvement * **Do This:** Regularly review and update these standards to reflect the latest best practices and technology advancements. Encourage developers to propose improvements and share their knowledge. By adhering to these performance optimization standards, development teams can build high-performing Monorepo applications that deliver excellent user experiences and are easy to maintain. Remember that performance optimization is an ongoing process that requires continuous monitoring, analysis, and refinement.
# Testing Methodologies Standards for Monorepo This document outlines the testing methodology standards for our monorepo. It aims to guide developers in creating robust, reliable, and maintainable code. These standards are designed to enhance maintainability, improve developer velocity, and ensure code quality across the entire monorepo. This document serves as a reference for developers and a context for AI-assisted coding tools. ## 1. Introduction to Monorepo Testing Testing in a monorepo architecture presents unique challenges and opportunities compared to traditional, multi-repo setups. Centralized code necessitates a holistic testing strategy that accounts for inter-package dependencies and potential ripple effects of changes. The goal is to maintain high confidence in code correctness, stability, and performance with efficient and effective testing methodologies. ### 1.1. Key Principles * **Test Pyramid:** Implement a test strategy that follows the test pyramid, emphasizing unit tests, followed by integration tests, and then end-to-end tests. * **Test Automation:** Automate testing at all levels to ensure consistent and repeatable results. * **Parallel Execution:** Leverage monorepo tooling to parallelize test execution across packages to reduce overall testing time. * **Isolation:** Isolate tests to prevent interference from external systems or other packages. Provide appropriate mocking and stubbing. * **Code Coverage:** Aim for high code coverage to identify untested code paths, but prioritize meaningful tests over simply achieving a coverage percentage. * **Continuous Integration/Continuous Deployment (CI/CD):** Integrate testing into a CI/CD pipeline to automatically run tests on every commit. * **Contract Testing:** Utilize contract testing to verify interactions between services or modules. ### 1.2. Monorepo Specific Considerations * **Dependency Management:** Pay close attention to inter-package dependencies when designing tests. Changes in one package can affect others, so tests must account for potential ripple effects. * **Scoped Testing:** Implement mechanisms for running tests selectively (e.g., only tests in changed packages and their dependents). * **Shared Tooling:** Leverage shared testing infrastructure and utilities to maintain consistency and reduce duplication. (e.g., shared Jest configurations, custom matchers, testing libraries). * **Impact Analysis:** Use tooling to analyze the impact of changes before running tests, optimizing which tests need to be executed. ## 2. Unit Testing Unit tests verify the functionality of individual units of code (e.g., functions, classes, components) in isolation. They are the foundation of a robust testing strategy. ### 2.1. Standards * **Do This:** * Write unit tests for all non-trivial code. * Focus on testing the public API of modules and components. * Use mocking and stubbing to isolate units of code from their dependencies. * Write tests that are fast, reliable, and easy to understand. * Use descriptive test names that clearly indicate what is being tested. * Follow the Arrange-Act-Assert (AAA) pattern. * **Don't Do This:** * Skip unit tests for "simple" code. Even simple code can have subtle bugs. * Write unit tests that test implementation details. These tests are brittle and prone to breaking when the implementation changes. * Over-mock or over-stub, which can lead to tests that don't accurately reflect the behavior of the system. * Write slow or unreliable unit tests. 
These tests will slow down the development process and erode confidence. * Use vague or ambiguous test names. ### 2.2. Code Examples (JavaScript/TypeScript) """typescript // example.ts export function add(a: number, b: number): number { return a + b; } export function greet(name: string): string { if (!name) { throw new Error("Name cannot be empty"); } return "Hello, ${name}!"; } """ """typescript // example.test.ts (using Jest) import { add, greet } from './example'; describe('add', () => { it('should add two numbers correctly', () => { // Arrange const a = 2; const b = 3; // Act const result = add(a, b); // Assert expect(result).toBe(5); }); }); describe('greet', () => { it('should greet a person with their name', () => { expect(greet('Alice')).toBe('Hello, Alice!'); }); it('should throw an error if the name is empty', () => { expect(() => greet('')).toThrowError("Name cannot be empty"); }); }); """ ### 2.3. Anti-Patterns * **Testing implementation details:** Testing private methods or internal state. * **Over-mocking:** Mocking excessively can make the tests less effective in identifying real bugs. ### 2.4. Technology-Specific Details * Use Jest, Mocha, or Jasmine for JavaScript/TypeScript testing. Jest is recommended for React applications. * Use appropriate assertion libraries (e.g., Chai, Jest's built-in assertions). * Configure test runners to run in parallel and watch mode. * Use code coverage tools to measure the effectiveness of unit tests. Istanbul (nyc) integrates well with Jest. Configure "nyc" to exclude test files and generated code. * Use mocking libraries like "jest.mock" or "sinon" strategically only when necessary to isolate the unit under test. ## 3. Integration Testing Integration tests verify the interactions between different units of code or modules. They provide confidence that the system works correctly as a whole, bridging the gap between unit and end-to-end (E2E) tests. ### 3.1. Standards * **Do This:** * Write integration tests that verify the interactions between different modules or services within the monorepo. * Focus on testing the flow of data through the system. * Use real dependencies or lightweight test doubles. * Write tests that are more comprehensive than unit tests but faster than E2E tests. * Ensure that integration tests clean up any test data after they run. * **Don't Do This:** * Write integration tests that are too broad, testing too many components at once. * Use mocks for everything. Integration tests should verify real interactions. * Neglect to clean up test data. This can lead to tests that fail intermittently or pollute the environment. ### 3.2. 
Code Examples (Node.js/TypeScript)

"""typescript
// user-service.ts
import { add } from './math-service'; // Assuming math-service is another module

export class UserService {
  createUser(firstName: string, lastName: string): string {
    const userId = add(firstName.length, lastName.length);
    return `user-${userId}`;
  }
}
"""

"""typescript
// math-service.ts
export function add(a: number, b: number): number {
  return a + b;
}
"""

"""typescript
// user-service.test.ts (using Jest)
import { UserService } from './user-service';
import * as mathService from './math-service';

describe('UserService', () => {
  it('should create a user with a generated ID based on math-service', () => {
    const userService = new UserService();

    // Spy on the real "add" implementation so the cross-module interaction can be asserted
    const addSpy = jest.spyOn(mathService, 'add');

    const userId = userService.createUser('John', 'Doe');

    expect(userId).toBe('user-7'); // 'John'.length + 'Doe'.length
    expect(addSpy).toHaveBeenCalledWith('John'.length, 'Doe'.length);
  });
});
"""

### 3.3. Anti-Patterns

* **Testing through the UI:** Integration tests should focus on backend interactions, not UI components.
* **Not using a test database:** Use a separate database for testing to avoid affecting production data.
* **Relying on external services:** Mock external services or use test doubles (e.g., using "nock" to intercept HTTP requests).

### 3.4. Technology-Specific Details

* Use tools like Supertest for testing HTTP endpoints in Node.js.
* Use dependency injection to make it easier to replace dependencies with test doubles.
* Consider using Docker Compose to set up test environments with multiple services.

## 4. End-to-End (E2E) Testing

E2E tests simulate real user interactions with the application. They provide the highest level of confidence that the system works correctly from end-to-end. These are significantly slower than unit and integration tests but critical for verifying the overall system behavior.

### 4.1. Standards

* **Do This:**
  * Write E2E tests that cover the most critical user flows.
  * Use real browsers or headless browser environments (e.g., Playwright, Cypress, Puppeteer).
  * Set up the test environment automatically before each test run.
  * Clean up the test environment after each test run.
  * Write tests that are reliable and repeatable.
* **Don't Do This:**
  * Write too many E2E tests. Focus on the most critical user flows.
  * Write E2E tests that are brittle or flaky.
  * Run E2E tests too frequently. Ideally within the CI/CD pipeline on merges/releases or nightly builds.

### 4.2.
Code Examples (Playwright - Typescript Preferred) """typescript // playwright.config.ts import { defineConfig, devices } from '@playwright/test'; export default defineConfig({ testDir: './tests', fullyParallel: true, reporter: 'html', use: { baseURL: 'http://localhost:3000', trace: 'on-first-retry', }, projects: [ { name: 'chromium', use: { ...devices['Desktop Chrome'] }, }, ], }); """ """typescript // tests/example.spec.ts import { test, expect } from '@playwright/test'; test('should navigate to the about page', async ({ page }) => { await page.goto('/'); await page.getByRole('link', { name: 'About' }).click(); await expect(page).toHaveURL(/.*about/); await expect(page.locator('h1')).toContainText('About Us'); }); test('should allow a user to log in', async ({ page }) => { await page.goto('/login'); await page.fill('input[name="username"]', 'testuser'); await page.fill('input[name="password"]', 'password123'); await page.click('button[type="submit"]'); await page.waitForURL('/dashboard'); // Or any URL after login await expect(page.locator('#dashboard-title')).toContainText('Dashboard'); }); """ ### 4.3. Anti-Patterns * **Relying on the UI for setup:** Whenever possible, use APIs for test setup and teardown rather than the UI. This makes tests faster and more reliable. * **Not waiting for elements to load:** Use explicit waits to ensure that elements are fully loaded before interacting with them. ### 4.4. Technology-Specific Details * Use Playwright, Cypress, or Puppeteer for E2E testing. Playwright is currently favored for its speed, reliability, and multi-browser support. * Use Docker to create consistent test environments. * Use environment variables to configure tests for different environments (e.g., staging, production). * Implement retries to reduce flakiness in E2E tests. Playwright and Cypress have built-in retry mechanisms. * Integrate visual regression testing to catch unexpected UI changes. Tools like Percy or Applitools can be used. ## 5. Monorepo Testing Strategies Adapting testing strategies to the monorepo context requires optimizing test execution and understanding interdependencies. ### 5.1. Selective Test Execution Only run the tests that are affected by the changes in a commit. Utilize tooling that can identify changed packages and their dependencies to select the appropriate tests. * **Do This:** * Use tools that automatically determine which packages have changed. * Configure your CI/CD system to only run tests for changed packages and their dependents. * Create a dependency graph of packages in the monorepo. * **Don't Do This:** * Run all tests for every commit. This is inefficient and slows down the development process. ### 5.2. Parallelization Run tests in parallel across multiple agents to reduce the overall testing time. Modern monorepo tools support parallel test execution. * **Do This:** * Configure test runners to run tests in parallel. * Use a CI/CD system that can distribute tests across multiple agents. * Allocate sufficient resources to your CI/CD agents to handle the parallel test load. * **Don't Do This:** * Run tests sequentially. This is slow and inefficient. ### 5.3. Code Coverage Across Packages Aggregate code coverage data across all packages in the monorepo to provide a comprehensive view of code coverage. * **Do This:** * Configure code coverage tools to generate reports for each package. * Aggregate the reports into a single dashboard to provide a complete view of code coverage. 
## 5. Monorepo Testing Strategies

Adapting testing strategies to the monorepo context requires optimizing test execution and understanding interdependencies.

### 5.1. Selective Test Execution

Only run the tests that are affected by the changes in a commit. Utilize tooling that can identify changed packages and their dependencies to select the appropriate tests.

* **Do This:**
    * Use tools that automatically determine which packages have changed.
    * Configure your CI/CD system to only run tests for changed packages and their dependents.
    * Create a dependency graph of packages in the monorepo.
* **Don't Do This:**
    * Run all tests for every commit. This is inefficient and slows down the development process.

### 5.2. Parallelization

Run tests in parallel across multiple agents to reduce the overall testing time. Modern monorepo tools support parallel test execution.

* **Do This:**
    * Configure test runners to run tests in parallel.
    * Use a CI/CD system that can distribute tests across multiple agents.
    * Allocate sufficient resources to your CI/CD agents to handle the parallel test load.
* **Don't Do This:**
    * Run tests sequentially. This is slow and inefficient.

### 5.3. Code Coverage Across Packages

Aggregate code coverage data across all packages in the monorepo to provide a comprehensive view of code coverage.

* **Do This:**
    * Configure code coverage tools to generate reports for each package.
    * Aggregate the reports into a single dashboard to provide a complete view of code coverage.
    * Set code coverage thresholds to ensure that all packages are adequately tested (see the sketch after this list).
* **Don't Do This:**
    * Ignore code coverage. This makes it difficult to identify untested code paths.
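The following is a minimal per-package Jest configuration sketch along those lines. The reporter list and the threshold values are illustrative assumptions; align them with whatever your coverage dashboard and team policy require.

"""typescript
// jest.config.ts (per package)
import type { Config } from 'jest';

const config: Config = {
  collectCoverage: true,
  // "lcov" output per package is easy to merge into a single cross-package dashboard.
  coverageReporters: ['text-summary', 'lcov'],
  coverageDirectory: '<rootDir>/coverage',
  // Fail the test run if this package drops below the agreed thresholds (values are illustrative).
  coverageThreshold: {
    global: {
      branches: 80,
      functions: 80,
      lines: 80,
      statements: 80,
    },
  },
};

export default config;
"""

Emitting "lcov" per package makes it straightforward to aggregate the reports into one monorepo-wide view (for example, in SonarQube).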
### 5.4. Example: Leveraging Nx for Affected Tests

Nx provides excellent support for running affected tests.

"""json
// nx.json
{
  "tasksRunnerOptions": {
    "default": {
      "runner": "nx-cloud",
      "options": {
        "cacheableOperations": ["build", "lint", "test", "e2e"],
        "accessToken": "YOUR_NX_CLOUD_ACCESS_TOKEN"
      }
    }
  },
  "targetDefaults": {
    "test": {
      "inputs": ["default", "{workspaceRoot}/jest.preset.js"],
      "cache": true
    }
  }
}
"""

To run tests affected by a commit:

"""bash
nx affected:test --base=main --head=HEAD
"""

## 6. Contract Testing

Contract testing is a specialized testing technique that verifies the interactions between services, ensuring that they adhere to a defined contract. This is especially relevant when different teams own different parts of the monorepo that interact via APIs.

### 6.1. Standards

* **Do This:**
    * Define clear contracts between services or modules with well-defined inputs and outputs.
    * Implement contract tests that verify that each service adheres to its contract.
    * Use tools like Pact or Spring Cloud Contract to simplify the process of writing and running contract tests.
* **Don't Do This:**
    * Assume that services will always interact correctly. Contract tests are crucial for preventing integration issues.
    * Neglect to update contract tests when contracts change.
    * Skip contract testing when changes are isolated to one service. The other side of the contract *must* also be tested.

### 6.2. Example (Pact with JavaScript)

A *consumer* project that consumes information from a *provider* project over an API:

"""javascript
// Consumer: consumer.test.js
const path = require('path');
const { Pact } = require('@pact-foundation/pact');
const { fetchProviderData } = require('./consumer'); // This is the code under test

describe('Pact Verification', () => {
  const provider = new Pact({
    consumer: 'MyConsumer',
    provider: 'MyProvider',
    port: 1234, // Port the mock service will run on
    dir: path.resolve(process.cwd(), 'pacts'), // Directory to save pact files
    log: path.resolve(process.cwd(), 'logs', 'pact.log'),
    logLevel: 'info',
    specVersion: 2,
  });

  beforeAll(async () => { await provider.setup(); });
  afterEach(async () => { await provider.verify(); });
  afterAll(async () => { await provider.finalize(); });

  describe('When a call to retrieve data from the provider is made', () => {
    beforeEach(() => {
      return provider.addInteraction({
        state: 'Provider has some data',
        uponReceiving: 'a request for the data',
        withRequest: {
          method: 'GET',
          path: '/data',
        },
        willRespondWith: {
          status: 200,
          headers: {
            'Content-Type': 'application/json',
          },
          body: {
            message: 'Hello, Consumer!',
          },
        },
      });
    });

    it('should return the correct data', async () => {
      const data = await fetchProviderData('http://localhost:1234');
      expect(data.message).toEqual('Hello, Consumer!');
    });
  });
});
"""

"""javascript
// Provider: provider.test.js (using Pact CLI or library to verify pacts)
const { Verifier } = require('@pact-foundation/pact');
const path = require('path');

describe('Pact Verification', () => {
  it('should validate the expectations of the Consumer', () => {
    const opts = {
      providerBaseUrl: 'http://localhost:3000', // Where the provider is running
      pactUrls: [
        path.resolve(__dirname, '../pacts/myconsumer-myprovider.json'), // Path to pact file
      ],
      publishVerificationResult: true,
      providerVersion: '1.0.0',
    };

    return new Verifier(opts).verifyProvider().then((output) => {
      console.log('Pact Verification Complete!');
      console.log(output);
    });
  });
});
"""

### 6.3. Technology-Specific Details

* Utilize Pact for contract testing in polyglot environments.
* Spring Cloud Contract is a great option for Java-based microservices.
* Clearly define the responsibilities of consumers and providers in the contract.
* Automate the process of verifying contracts in the CI/CD pipeline.

## 7. Performance Testing

Performance testing is vital for ensuring that applications within the monorepo remain responsive and scalable. Because performance issues in one package can affect others, this is especially important in a monorepo.

### 7.1. Standards

* **Do This:**
    * Conduct load, stress, and soak tests to identify bottlenecks and performance degradation.
    * Use tools like JMeter, Gatling, or k6 for performance testing.
    * Define key performance indicators (KPIs) like response time, throughput, and error rate (a thresholds sketch follows the example below).
    * Establish performance baselines to measure improvements and regressions.
* **Don't Do This:**
    * Neglect performance testing until late in the development cycle.
    * Rely solely on manual performance evaluations.
    * Ignore the impact of database queries and inefficient algorithms on performance.

### 7.2. Example using k6

"""javascript
import http from 'k6/http';
import { sleep } from 'k6';

export const options = {
  vus: 10,
  duration: '10s',
};

export default function () {
  http.get('http://localhost:3000/api/data');
  sleep(1);
}
"""
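To connect this example to the KPI and baseline guidance in 7.1, the k6 "options" block can also declare thresholds that fail the run when a baseline is breached. This is a sketch; the endpoint and the specific limits are illustrative assumptions.

"""javascript
import http from 'k6/http';
import { check, sleep } from 'k6';

export const options = {
  vus: 10,
  duration: '10s',
  // Fail the run if the KPIs regress past these (illustrative) baselines.
  thresholds: {
    http_req_duration: ['p(95)<500'], // 95% of requests must complete below 500ms
    http_req_failed: ['rate<0.01'],   // Less than 1% of requests may fail
  },
};

export default function () {
  const res = http.get('http://localhost:3000/api/data');
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1);
}
"""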
### 7.3. Considerations for Monorepos

* Isolate specific packages or APIs for testing.
* Use monorepo-aware CI/CD tools.
* Monitor resource consumption across the monorepo.

## 8. Security Testing

Security testing identifies vulnerabilities in the code and ensures that the application is protected against attacks.

### 8.1. Standards

* **Do This:**
    * Perform static analysis to identify potential security vulnerabilities in the code.
    * Conduct dynamic analysis to test the application for vulnerabilities during runtime.
    * Use tools like SonarQube, Snyk, or OWASP ZAP to automate security testing.
    * Follow secure coding practices to prevent common vulnerabilities like SQL injection, cross-site scripting (XSS), and cross-site request forgery (CSRF).
    * Conduct regular penetration testing to identify weaknesses in the application's security.
* **Don't Do This:**
    * Ignore security vulnerabilities. Even seemingly minor vulnerabilities can be exploited by attackers.
    * Rely solely on automated security testing. Manual code reviews and penetration testing are also important.

### 8.2. Technology-Specific Details

* Use ESLint with security-related rules to identify potential vulnerabilities in JavaScript/TypeScript code.
* Use npm audit or yarn audit to identify vulnerabilities in dependencies.
* Use tools like Snyk to automatically fix vulnerabilities in dependencies.
* Follow the OWASP Top 10 guidelines to prevent common web application vulnerabilities.

## 9. Documentation

Clear documentation is crucial for maintainability and knowledge sharing within the codebase.

### 9.1. Standards

* **Do This:**
    * Document the purpose and functionality of test cases.
    * Document the integration and end-to-end testing environments.
* **Don't Do This:**
    * Overlook the importance of keeping test documentation up to date.
    * Skip documenting even the most straightforward-looking tests.

## 10. Conclusion

These testing methodology standards are designed to promote high-quality code within our monorepo. By adhering to these guidelines, developers can build robust, reliable, and maintainable applications that meet the needs of our users. This document should be reviewed and updated regularly to reflect the latest best practices and technologies. Remember that testing is an integral part of the development process and should be considered at every stage of the software lifecycle.