# Core Architecture Standards for Monorepo
This document outlines the core architectural standards for Monorepo projects. It focuses on fundamental patterns, project structure, and organization principles specifically within the Monorepo context. Adhering to these standards ensures maintainability, scalability, and a consistent development experience across all projects within the repository.
## 1. Fundamental Architectural Patterns
Monorepos often benefit from a modular architecture. This allows for independent development, testing, and deployment of different parts of the system. The choice of architectural pattern depends on the specific needs of the project, but we encourage:
* **Modular Monolith:** A single deployable unit composed of loosely coupled modules. This is a good starting point for many projects as it offers simplicity while still promoting modularity.
* **Microservices within a Monorepo:** Smaller, independently deployable services residing within the same repository. This allows for independent scaling and development cycles but introduces more complexity in terms of deployment and inter-service communication.
* **Layered Architecture:** A common and effective approach for organizing code into distinct layers (e.g., presentation, business logic, data access). This promotes separation of concerns and makes the codebase easier to understand and maintain.
**Do This:**
* Choose an architectural pattern that aligns with the project's complexity and scalability requirements.
* Clearly define module boundaries and dependencies.
* Strive for loose coupling between modules / services.
**Don't Do This:**
* Create a tightly coupled monolith without clear modules. This makes the codebase difficult to reason about and maintain.
* Implement microservices prematurely without considering the added complexity.
* Ignore architectural principles, especially if the project grows.
**Why:** Choosing an appropriate architecture early prevents costly refactoring later, makes future scaling easier, and simplifies team collaboration.
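As a concrete illustration of the layered approach above, here is a minimal sketch (all names are hypothetical, not from any specific library) showing how a business-logic layer can depend only on a data-access interface, so storage details can change without touching the logic:

```typescript
// Data-access layer contract: the only thing upper layers know about.
interface UserRepository {
  findName(id: number): string | undefined;
}

// Concrete data-access implementation (in-memory stand-in for a database).
const inMemoryRepo: UserRepository = {
  findName: (id: number) => new Map([[1, 'Ada'], [2, 'Grace']]).get(id),
};

// Business-logic layer: pure, testable, unaware of storage details.
function greetUser(repo: UserRepository, id: number): string {
  const name = repo.findName(id);
  return name !== undefined ? `Hello, ${name}!` : 'Unknown user';
}

// Presentation layer would render the result of greetUser(...).
console.log(greetUser(inMemoryRepo, 1)); // "Hello, Ada!"
```

Swapping the in-memory repository for a real database client requires no change to `greetUser`, which is the point of the layering.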
## 2. Monorepo Project Structure and Organization
A well-defined project structure is critical for navigating and managing Monorepo projects. We recommend the following structure:
"""
monorepo-root/
├── apps/ # User facing applications
│ ├── web-app/
│ │ ├── src/
│ │ ├── package.json
│ │ └── tsconfig.json
│ ├── mobile-app/
│ │ ├── src/
│ │ ├── package.json
│ │ └── tsconfig.json
├── packages/ # Reusable libraries and components
│ ├── ui-library/
│ │ ├── src/
│ │ ├── package.json
│ │ └── tsconfig.json
│ ├── utils/
│ │ ├── src/
│ │ ├── package.json
│ │ └── tsconfig.json
├── tools/ # Build scripts, code generators, and other utilities
│ ├── build/
│ ├── codegen/
├── docs/ # Documentation for the monorepo and its projects
├── .eslintrc.js # Root ESLint configuration
├── .prettierrc.js # Root Prettier configuration
├── tsconfig.base.json # Base TypeScript configuration
└── package.json # Root package.json (for tooling and scripts)
"""
* **"apps/"**: Contains user-facing applications (e.g., web apps, mobile apps, CLI tools).
* **"packages/"**: Contains reusable libraries and components that can be shared across multiple applications.
* **"tools/"**: Contains build scripts, code generators, and other utilities for the Monorepo.
* **"docs/"**: Holds documentation for the monorepo itself and for individual packages/applications. Consider tools like Docusaurus or Storybook for document generation.
* **Root Configuration Files:** Centralized configuration for linting, formatting, and TypeScript.
**Do This:**
* Organize code into clear and well-defined packages.
* Use a consistent naming convention for packages and applications.
* Keep shared libraries in the `packages/` directory.
* Utilize shared configuration files at the root level.
**Don't Do This:**
* Scatter code across the repository without a clear structure.
* Create overly large packages that are difficult to maintain.
* Duplicate configuration files across multiple packages.
**Why:** A clear and consistent structure is essential for navigation and maintainability, especially as the Monorepo grows in size and complexity. Using a standardized structure across multiple projects also facilitates onboarding new developers as they will quickly understand where to find code.
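For example, with npm or Yarn workspaces (pnpm uses a separate `pnpm-workspace.yaml` instead, shown in the next section), the root `package.json` might look roughly like this sketch (names and scripts are illustrative):

```json
{
  "name": "my-monorepo",
  "private": true,
  "workspaces": [
    "apps/*",
    "packages/*"
  ],
  "scripts": {
    "lint": "eslint .",
    "format": "prettier --write ."
  }
}
```

Marking the root package `private` prevents it from being published by accident; only the packages under `packages/` are meant to be released.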
## 3. Dependency Management
Managing dependencies within a Monorepo can be challenging. We recommend using workspaces from `pnpm`, `Yarn`, or `npm` to simplify dependency management and avoid duplication. `pnpm` is often favored for its efficient disk-space usage and faster installs.
**Do This:**
* Use a workspace-aware package manager (e.g., `pnpm`, `Yarn`, `npm`).
* Declare dependencies explicitly in each package's `package.json` file.
* Use version ranges that allow minor and patch updates, but pin major versions to avoid breaking changes.
* Leverage tools like Dependabot or Renovate to automate dependency updates.
**Don't Do This:**
* Rely on implicit dependencies between packages.
* Install dependencies globally.
* Use wildcard version ranges (e.g., `*`).
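If you adopt pnpm, the workspace layout from Section 2 is declared in a `pnpm-workspace.yaml` at the repository root — a minimal sketch:

```yaml
# pnpm-workspace.yaml
packages:
  - 'apps/*'
  - 'packages/*'
```

With this in place, `pnpm install` at the root links all workspace packages together, and cross-package dependencies resolve locally instead of from the registry.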
**Example ("packages/ui-library/package.json"):**
"""json
{
"name": "@my-monorepo/ui-library",
"version": "1.0.0",
"dependencies": {
"react": "^18.2.0",
"@emotion/react": "^11.11.1",
"@emotion/styled": "^11.11.0"
},
"devDependencies": {
"@types/react": "^18.2.15"
},
"peerDependencies": {
"next": ">=13.0.0"
}
}
"""
**Why:** Proper dependency management prevents version conflicts, improves build times, and reduces the overall size of the monorepo. Declaring `peerDependencies` is critical to ensure that shared components remain compatible with the versions of their dependencies provided by the host application.
## 4. Code Sharing and Reusability
One of the key benefits of a Monorepo is the ability to easily share code between different projects.
**Do This:**
* Create reusable libraries and components in the `packages/` directory.
* Use a consistent API design for shared libraries.
* Write thorough documentation for shared components.
* Utilize tools such as Bit (bit.dev) or Nx to manage and share components.
**Don't Do This:**
* Duplicate code across multiple projects.
* Create overly specific components that are difficult to reuse.
* Neglect documentation for shared libraries.
**Example ("packages/utils/src/index.ts"):**
"""typescript
export function formatDate(date: Date): string {
return new Intl.DateTimeFormat('en-US').format(date);
}
export function isValidEmail(email: string): boolean {
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
return emailRegex.test(email);
}
"""
"""typescript
// Usage in "apps/web-app/src/components/UserComponent.tsx"
import { formatDate, isValidEmail } from '@my-monorepo/utils';
function UserComponent({ user }: { user: any }) {
const formattedDate = formatDate(new Date(user.createdAt));
const isValid = isValidEmail(user.email);
return (
<p>Created At: {formattedDate}</p>
<p>Email Valid: {isValid ? 'Yes' : 'No'}</p>
);
}
export default UserComponent;
"""
**Why:** Promotes code reuse, reduces redundant code, keeps applications lightweight, and reduces the likelihood of bugs.
## 5. Tooling and Automation
Monorepos often require specialized tooling and automation to manage their complexity. Consider the following:
* **Build Systems:** Tools like Nx, Turborepo, or Bazel can help you optimize build times by only rebuilding affected packages.
* **Linting and Formatting:** Use ESLint and Prettier to enforce consistent code style across the Monorepo.
* **Code Generation:** Use code generators to automate repetitive tasks and reduce boilerplate code.
* **Testing:** Integrate testing frameworks (e.g., Jest, Mocha, Cypress) to ensure the quality of your code.
* **CI/CD:** Implement a robust CI/CD pipeline to automate builds, tests, and deployments, using tools such as GitHub Actions, CircleCI, or Jenkins.
* **Dependency Graph Visualization:** Tools such as Madge or Nx's built-in project graph are useful for visualizing dependencies between packages.
**Do This:**
* Choose a build system that supports incremental builds and dependency analysis.
* Automate linting and formatting to enforce code style.
* Use code generators to reduce boilerplate.
* Implement comprehensive testing.
* Automate builds, tests, and deployments with CI/CD.
* Run tests and linters on each commit (e.g., via Git pre-commit hooks).
**Don't Do This:**
* Manually run builds and tests.
* Ignore linting and formatting errors.
* Skip testing.
* Neglect CI/CD.
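If you choose Turborepo as the build system, incremental builds and dependency-aware task ordering are configured in a root `turbo.json`. A hedged sketch (this uses the v2 `tasks` key; older Turborepo versions used `pipeline`):

```json
{
  "$schema": "https://turbo.build/schema.json",
  "tasks": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**"]
    },
    "test": {
      "dependsOn": ["build"]
    },
    "lint": {}
  }
}
```

Here `"^build"` means "build this package's workspace dependencies first", and declaring `outputs` lets the tool cache and skip unchanged builds.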
**Example ("tools/codegen/generate-component.js" - Simplified Example):**
"""javascript
const fs = require('fs');
function generateComponent(name) {
const componentCode = "
import React from 'react';
interface ${name}Props {
// Add props here
}
const ${name}: React.FC<${name}Props> = ({/* Props Go Here */}) => {
return (
{/* Component Content Here */}
);
};
export default ${name};
";
fs.writeFileSync("./packages/ui-library/src/components/${name}.tsx", componentCode);
console.log("Component ${name} generated successfully!");
}
const componentName = process.argv[2];
if (!componentName) {
console.error('Please provide a component name.');
process.exit(1);
}
generateComponent(componentName);
"""
**Why:** Automates development tasks, speeds up the build process, and improves overall code quality. Build scripts should be repeatable.
## 6. Communication and Collaboration
Effective communication and collaboration are crucial for successful Monorepo development, especially with large teams.
**Do This:**
* Establish clear communication channels (e.g., Slack, Discord).
* Use code reviews to ensure code quality and knowledge sharing.
* Document architectural decisions and coding standards.
* Conduct regular team meetings to discuss progress and challenges.
* Use Architecture Decision Records (ADRs) to keep a high-level log of important decisions that affect the project architecture and direction for future maintainers.
**Don't Do This:**
* Work in isolation without communicating with other team members.
* Skip code reviews.
* Neglect documentation.
* Keep crucial design or engineering decisions locked inside a single person's mind.
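An ADR can be as simple as one Markdown file per decision. A minimal template in the common Nygard style (the file name is illustrative):

```markdown
<!-- docs/adr/0001-use-modular-monolith.md -->
# ADR 0001: Use a Modular Monolith

## Status
Accepted

## Context
What forces, constraints, and requirements led to this decision?

## Decision
What we decided, stated in full sentences.

## Consequences
What becomes easier or harder as a result of this decision?
```

Keeping ADRs in `docs/` under version control means the reasoning travels with the code and survives team turnover.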
**Why:** Promotes knowledge sharing, reduces the risk of errors, and ensures that everyone is aligned on the project's goals and direction.
## 7. Versioning and Release Management
Managing versions and releases within a Monorepo requires careful planning. We recommend using a tool like [Lerna](https://github.com/lerna/lerna) or [Changesets](https://github.com/changesets/changesets) to automate the release process. Changesets is generally preferred for its simplicity and ease of integration with CI/CD.
**Do This:**
* Use semantic versioning (SemVer) for all packages.
* Automate release management with tools like Changesets or Lerna.
* Generate changelogs automatically for each release.
* Use conventional commits to automate version bumping and changelog generation.
* Consider using git tags to mark releases.
**Don't Do This:**
* Manually manage versions and releases.
* Forget to update changelogs.
* Use inconsistent versioning schemes.
**Example (Using Changesets):**
1. **Create a changeset:** `pnpm changeset`
2. **Describe the changes:** e.g., "Fixed a bug in the formatDate function"
3. **Commit the changeset:** e.g., `.changeset/fix-format-date.md`
4. **Run the release:** in your CI/CD pipeline, run `pnpm changeset version` followed by `pnpm changeset publish`
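The committed changeset from step 3 is a small Markdown file with YAML front matter. A sketch of what `.changeset/fix-format-date.md` might contain (the package name follows the earlier examples):

```markdown
---
"@my-monorepo/utils": patch
---

Fixed a bug in the formatDate function
```

The front matter names each affected package and the SemVer bump it needs (`patch`, `minor`, or `major`); the body becomes the changelog entry.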
**Why:** Simplifies the release process, reduces the risk of errors, and provides clear visibility into the changes included in each release.
## 8. Security Best Practices
Security should be a primary concern in any Monorepo project.
**Do This:**
* Regularly scan dependencies for vulnerabilities using tools like `npm audit` or Snyk.
* Implement secure coding practices (e.g., input validation, output encoding).
* Use a static analysis tool such as SonarQube or Semgrep to check for code vulnerabilities.
* Store secrets securely using environment variables or a dedicated secret management solution (e.g., HashiCorp Vault).
* Follow the principle of least privilege when granting access to resources.
* Have a clear incident response plan in place.
**Don't Do This:**
* Ignore security vulnerabilities.
* Store secrets in code.
* Grant unnecessary privileges.
* Fail to monitor for security incidents.
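Dependency scanning is easy to automate in CI. A hedged sketch of a GitHub Actions workflow (assuming pnpm; action versions and audit level may need adjusting for your setup):

```yaml
# .github/workflows/audit.yml
name: dependency-audit
on:
  push:
  schedule:
    - cron: '0 6 * * 1' # additionally scan weekly, Monday 06:00 UTC
jobs:
  audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pnpm/action-setup@v4
      - run: pnpm install --frozen-lockfile
      - run: pnpm audit --audit-level high
```

Failing the build on high-severity advisories turns vulnerability scanning from an occasional chore into a gate every change must pass.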
**Why:** Protects the Monorepo and its users from security threats. A vulnerability in a shared library might impact all applications in the monorepo.
## 9. Performance Optimization
Monorepos can become slow if not optimized. Consider these approaches:
* **Code Splitting:** Break up large applications into smaller chunks that can be loaded on demand.
* **Tree Shaking:** Remove unused code from dependencies to reduce bundle sizes.
* **Caching:** Implement caching strategies to avoid unnecessary computations.
* **Lazy Loading:** Load components or modules only when they are needed.
* **Optimize build times:** Use incremental builds and parallel execution.
* **Profile application regularly:** Identify and resolve any bottlenecks.
**Do This:**
* Use code splitting to reduce initial load times.
* Enable tree shaking to remove unused code.
* Implement caching strategies.
* Use lazy loading for non-critical components.
* Profile and optimize performance regularly.
**Don't Do This:**
* Load the entire application at once.
* Include unnecessary dependencies.
* Ignore performance bottlenecks.
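To make the caching point concrete, here is a small TypeScript sketch (the `memoize` helper is illustrative, not from any library mentioned above) that caches the results of an expensive pure function:

```typescript
// Memoize an expensive pure function: repeated calls with the same
// argument are served from a cache instead of recomputed.
function memoize<A, R>(fn: (arg: A) => R) {
  const cache = new Map<A, R>();
  let underlyingCalls = 0;
  const memoized = (arg: A): R => {
    if (!cache.has(arg)) {
      underlyingCalls += 1; // only counts real computations
      cache.set(arg, fn(arg));
    }
    return cache.get(arg)!;
  };
  // Expose a counter so callers can observe cache effectiveness.
  return Object.assign(memoized, { calls: () => underlyingCalls });
}

const slowSquare = (n: number): number => n * n; // stand-in for expensive work
const fastSquare = memoize(slowSquare);

fastSquare(4); // computes
fastSquare(4); // served from cache
console.log(fastSquare(4), fastSquare.calls()); // 16 1
```

The same idea underlies build caches and HTTP response caches: pay for the computation once, reuse the result while the input is unchanged.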
**Why:** Improves the user experience and reduces resource consumption. A slow initial startup time can impact overall user satisfaction.
## 10. Documentation Standards
Comprehensive documentation is essential for understanding and maintaining the Monorepo and its components.
**Do This:**
* Document all public APIs.
* Provide clear examples of how to use shared libraries and components.
* Use a consistent documentation style.
* Automate documentation generation whenever possible.
* Keep documentation up-to-date.
* Place documentation close to the code it describes (e.g., using JSDoc comments or Markdown files in the same directory).
**Don't Do This:**
* Neglect documentation.
* Write unclear or incomplete documentation.
* Let documentation become outdated.
**Example (Using JSDoc):**
"""typescript
/**
* Formats a date object into a human-readable string.
*
* @param {Date} date - The date object to format.
* @returns {string} The formatted date string.
*/
export function formatDate(date: Date): string {
return new Intl.DateTimeFormat('en-US').format(date);
}
"""
**Why:** Makes it easier for developers to understand and use the Monorepo's components and keeps new engineers informed and productive. Well-documented architectural decisions are critical to prevent misunderstandings.
# Using .clinerules with Cline

*danielsogl — Created Mar 6, 2025*

This guide explains how to effectively use `.clinerules` with Cline, the AI-powered coding assistant. The `.clinerules` file is a powerful configuration file that helps Cline understand your project's requirements, coding standards, and constraints. When placed in your project's root directory, it automatically guides Cline's behavior and ensures consistency across your codebase.

Place the `.clinerules` file in your project's root directory. Cline automatically detects and follows these rules for all files within the project.

```yaml
# Project Overview
project:
  name: 'Your Project Name'
  description: 'Brief project description'
  stack:
    - technology: 'Framework/Language'
      version: 'X.Y.Z'
    - technology: 'Database'
      version: 'X.Y.Z'
```

```yaml
# Code Standards
standards:
  style:
    - 'Use consistent indentation (2 spaces)'
    - 'Follow language-specific naming conventions'
  documentation:
    - 'Include JSDoc comments for all functions'
    - 'Maintain up-to-date README files'
  testing:
    - 'Write unit tests for all new features'
    - 'Maintain minimum 80% code coverage'
```

```yaml
# Security Guidelines
security:
  authentication:
    - 'Implement proper token validation'
    - 'Use environment variables for secrets'
  dataProtection:
    - 'Sanitize all user inputs'
    - 'Implement proper error handling'
```

Best practices for writing rules:

* Be specific
* Maintain organization
* Update the rules regularly

```yaml
# Common Patterns Example
patterns:
  components:
    - pattern: 'Use functional components by default'
    - pattern: 'Implement error boundaries for component trees'
  stateManagement:
    - pattern: 'Use React Query for server state'
    - pattern: 'Implement proper loading states'
```

Commit the rules: keep `.clinerules` in version control for team collaboration.

Troubleshooting topics to keep in mind: rules not being applied, conflicting rules, and performance considerations.

```yaml
# Basic .clinerules Example
project:
  name: 'Web Application'
  type: 'Next.js Frontend'
standards:
  - 'Use TypeScript for all new code'
  - 'Follow React best practices'
  - 'Implement proper error handling'
testing:
  unit:
    - 'Jest for unit tests'
    - 'React Testing Library for components'
  e2e:
    - 'Cypress for end-to-end testing'
documentation:
  required:
    - 'README.md in each major directory'
    - 'JSDoc comments for public APIs'
    - 'Changelog updates for all changes'
```

```yaml
# Advanced .clinerules Example
project:
  name: 'Enterprise Application'
compliance:
  - 'GDPR requirements'
  - 'WCAG 2.1 AA accessibility'
architecture:
  patterns:
    - 'Clean Architecture principles'
    - 'Domain-Driven Design concepts'
security:
  requirements:
    - 'OAuth 2.0 authentication'
    - 'Rate limiting on all APIs'
    - 'Input validation with Zod'
```
# Component Design Standards for Monorepo This document outlines the coding standards for component design within a Monorepo architecture. It focuses on creating reusable, maintainable, and performant components, tailored for the intricacies of a monorepo environment. These guidelines are designed to be used by both developers and AI coding assistants to ensure code consistency and quality. ## 1. Component Modularity and Reusability ### 1.1. Standard: Encapsulation and Abstraction **Standard:** Components should be encapsulated with a well-defined public API and hidden internal implementation details. Use abstraction to provide a simplified interface to complex functionalities. **Why:** This promotes reusability, reduces dependencies, and allows for internal changes without impacting dependent components. In a Monorepo, breaking changes can have widespread effects, making encapsulation crucial. **Do This:** * Define clear interfaces using TypeScript/JavaScript. * Use private/protected members to hide implementation details. * Favor composition over inheritance to promote flexibility. **Don't Do This:** * Expose internal state or logic directly. * Create overly complex inheritance hierarchies. * Create components tightly coupled to specific application contexts. **Code Example (TypeScript):** """typescript // packages/ui-library/src/components/Button/Button.tsx import React, { ReactNode } from 'react'; import styles from './Button.module.css'; interface ButtonProps { children: ReactNode; onClick: () => void; variant?: 'primary' | 'secondary'; } const Button: React.FC<ButtonProps> = ({ children, onClick, variant = 'primary' }) => { const buttonClass = variant === 'primary' ? 
styles.primaryButton : styles.secondaryButton; return ( <button className={"${styles.button} ${buttonClass}"} onClick={onClick}> {children} </button> ); }; export default Button; //Button.module.css (CSS Modules) .button { padding: 10px 20px; border: none; border-radius: 5px; cursor: pointer; font-size: 16px; } .primaryButton { background-color: #007bff; color: white; } .secondaryButton { background-color: #6c757d; color: white; } """ **Anti-Patterns:** * **God Components:** Components that implement too much logic or have too many responsibilities. These are hard to reuse and maintain. ### 1.2. Standard: Single Responsibility Principle (SRP) **Standard:** Each component should have one, and only one, reason to change. **Why:** Components with a single responsibility are easier to understand, test, and reuse. Changing one aspect of the component doesn't necessarily break other parts of the system. Reduces the blast radius of changes within the Monorepo. **Do This:** * Decompose complex components into smaller, more focused components. * Use composition to combine these smaller components. **Don't Do This:** * Add unrelated functionality to an existing component. * Create monolithic components that handle multiple different tasks. 
**Code Example (JavaScript/React):** """javascript // packages/ui-library/src/components/Input/Input.jsx import React from 'react'; import styles from './Input.module.css'; //Example using CSS Modules interface InputProps { label: string; value: string; onChange: (event: React.ChangeEvent<HTMLInputElement>) => void; type?: string; errorMessage?: string; } const Input: React.FC<InputProps> = ({ label, value, onChange, type = "text", errorMessage }) => { return ( <div className={styles.inputContainer}> <label htmlFor={label} className={styles.inputLabel}>{label}</label> <input type={type} id={label} value={value} onChange={onChange} className={styles.inputField} /> {errorMessage && <div className={styles.errorMessage}>{errorMessage}</div>} </div> ); }; export default Input; // Input.module.css .inputContainer { display: flex; flex-direction: column; margin-bottom: 10px; } .inputLabel { margin-bottom: 5px; font-weight: bold; } .inputField { padding: 8px; border: 1px solid #ccc; border-radius: 4px; font-size: 16px; } .errorMessage { color: red; font-size: 12px; } """ **Explanation:** The "Input" component handles only the rendering and management of a single input field. Error messages and labels are controlled within the component. ### 1.3. Standard: Versioning and Semantic Versioning **Standard:** All components should be versioned using Semantic Versioning (SemVer). Major versions should indicate breaking changes. **Why:** SemVer allows developers to manage dependencies and understand the impact of updates. In a Monorepo, this is even more critical as changes in one component can affect multiple applications. Automated dependency updates and change management tools rely on accurate versioning. **Do This:** * Use "npm version", "yarn version", or "pnpm version" to manage versions. * Follow SemVer principles for versioning. * Publish components with clearly defined versions. Use a tool like "changesets" or similar for managing release and versioning. 
**Don't Do This:** * Make breaking changes without bumping the major version. * Publish components without a version number. * Ignore SemVer best practices. **Code Example (package.json - Using changesets to manage versioning):** """json // packages/ui-library/package.json { "name": "@my-monorepo/ui-library", "version": "1.2.3", "description": "A UI library for my monorepo.", "main": "dist/index.js", "module": "dist/index.esm.js", "types": "dist/index.d.ts", "scripts": { "build": "rollup -c", "test": "jest", "lint": "eslint src --ext .ts,.tsx", "prepare": "npm run build", "version": "changeset version", //Uses changesets cli "release": "npm publish"//Uses changesets cli }, "dependencies": { //Example of using react and styled components. "react": "^18.0.0", "styled-components": "^5.0.0" }, "devDependencies": { "@changesets/cli": "^2.26.2", "@rollup/plugin-commonjs": "^25.0.7", "@rollup/plugin-node-resolve": "^15.2.3", "@rollup/plugin-typescript": "^11.1.5", "@types/react": "^18.0.0", "@types/styled-components": "^5.0.0", "rollup": "^2.79.1", "rollup-plugin-peer-deps-external": "^2.2.4", "typescript": "^4.9.5" }, "peerDependencies": { "react": "^18.0.0", "styled-components": "^5.0.0" }, "files": [ "dist" ], "publishConfig": { "access": "public" } } """ **Explanation:** The "changeset version" command, when part of the version lifecycle hook, automates the SemVer bumping process based on changeset files, which explicitly declare versions. Peer dependencies clearly define the React and Styled-Components versions required by the library. This explicit declaration creates a clear contract. ## 2. Component API Design ### 2.1. Standard: Explicit Props **Standard:** Components should accept data and behavior as explicit props, rather than relying on implicit state or context. **Why:** Explicit props make components easier to understand and reason about. They improve testability and reduce side effects. 
In a Monorepo, this is important for ensuring components are predictable and behave consistently across different applications. **Do This:** * Define all required props with TypeScript/JavaScript. * Use default props for optional values. * Document all props clearly. **Don't Do This:** * Rely on global state or context for component behavior unless absolutely required. **Code Example (TypeScript/React):** """typescript // packages/ui-library/src/components/Avatar/Avatar.tsx import React from 'react'; import styles from './Avatar.module.css'; interface AvatarProps { imageUrl: string; size?: 'small' | 'medium' | 'large'; altText?: string; } const Avatar: React.FC<AvatarProps> = ({ imageUrl, size = 'medium', altText = "User Avatar" }) => { let avatarSizeClass = styles.mediumAvatar; if (size === 'small') { avatarSizeClass = styles.smallAvatar; } else if (size === 'large') { avatarSizeClass = styles.largeAvatar; } return ( <img src={imageUrl} alt={altText} className={"${styles.avatar} ${avatarSizeClass}"} /> ); }; export default Avatar; //Avatar.module.css .avatar { border-radius: 50%; } .smallAvatar { width: 30px; height: 30px; } .mediumAvatar { width: 50px; height: 50px; } .largeAvatar { width: 80px; height: 80px; } """ **Explanation:** The "Avatar" component defines the "imageUrl", "size", and "altText" properties explicitly. The default value for size is set to "medium". ### 2.2. Standard: Event Handling **Standard:** Components should emit events using well-defined event handler props. **Why:** Allows parent components to react to events triggered by child components. Helps keep components decoupled and reusable. Standardized event handling makes it easier to track component interactions within the Monorepo. **Do This:** * Use descriptive event handler names (e.g., "onInputChange", "onSubmit"). * Pass necessary data as arguments to the event handler. * Create interfaces for event handler payloads. **Don't Do This:** * Directly modify the state of parent components. 
* Use generic event handlers without clear purpose. **Code Example (TypeScript/React):** """typescript // packages/ui-library/src/components/SearchInput/SearchInput.tsx import React, { useState, ChangeEvent } from 'react'; import styles from './SearchInput.module.css'; interface SearchInputProps { onSearch: (query: string) => void; placeholder?: string; } const SearchInput: React.FC<SearchInputProps> = ({ onSearch, placeholder = "Search..." }) => { const [searchTerm, setSearchTerm] = useState(''); const handleInputChange = (event: ChangeEvent<HTMLInputElement>) => { const newSearchTerm = event.target.value; setSearchTerm(newSearchTerm); onSearch(newSearchTerm); //Calls callback on every input change }; return ( <input type="text" placeholder={placeholder} value={searchTerm} onChange={handleInputChange} className={styles.searchInput} /> ); }; export default SearchInput; //SearchInput.module.css .searchInput { padding: 8px 12px; border: 1px solid #ccc; border-radius: 4px; font-size: 14px; width: 200px; /* Adjust width as needed */ outline: none; /* Removes the default focus outline */ } /* Style for when the input is focused (optional) */ .searchInput:focus { border-color: #007bff; /* Highlight the border on focus */ box-shadow: 0 0 5px rgba(0,123,255,0.5); /* Add a subtle shadow */ } """ **Explanation:** The "SearchInput" component has an "onSearch" prop that's a function taking the search query as an argument. Whenever text is entered into the input field, the "onSearch" function is called. The "placeholder" defaults to "Search...". ## 3. Styling and Theming ### 3.1. Standard: CSS Modules or Styled Components **Standard:** Use CSS Modules or Styled Components for component styling. **Why:** These techniques provide component-level styling, reducing the risk of style conflicts. They also improve maintainability and reusability. In a Monorepo, where multiple teams might be working on different applications, modular styling is essential. 
**Do This:** * Choose either CSS Modules or Styled Components and stick to it within a component library. * Use descriptive class names or style names. * Avoid global CSS styles that can conflict with other components. **Don't Do This:** * Use inline styles excessively. * Use global CSS classes without proper scoping or naming conventions. **Code Example (Styled Components):** """typescript // packages/ui-library/src/components/Alert/Alert.tsx import React, { ReactNode } from 'react'; import styled from 'styled-components'; interface AlertProps { children: ReactNode; type?: 'success' | 'warning' | 'error'; } const AlertContainer = styled.div<Pick<AlertProps, "type">>" padding: 10px; border-radius: 5px; margin-bottom: 10px; background-color: ${(props) => { switch (props.type) { case 'success': return '#d4edda'; case 'warning': return '#fff3cd'; case 'error': return '#f8d7da'; default: return '#e2e3e5'; } }}; color: ${(props) => { switch (props.type) { case 'success': return '#155724'; case 'warning': return '#856404'; case 'error': return '#721c24'; default: return '#000'; } }}; "; const Alert: React.FC<AlertProps> = ({ children, type = 'success' }) => { return ( <AlertContainer type={type}> {children} </AlertContainer> ); }; export default Alert; """ **Explanation:** This uses Styled Components to create a styled "AlertContainer" div. The background color and text color are dynamically set based on the "type" prop. ### 3.2. Standard: Theming Support **Standard:** Create components that are theme-aware, supporting light and dark themes, or other customized styles. **Why:** Increases the flexibility and reusability of components across different applications and user preferences. Theming provides a consistent user experience. **Do This:** * Use CSS Variables or Styled Components' theming capabilities. * Provide a default theme and allow applications to override it. **Don't Do This:** * Hardcode colors and styles within components. 
**Code Example (Styled Components with Theming):**

```typescript
// packages/ui-library/src/components/TextInput/TextInput.tsx
import React, { useContext } from 'react';
import styled from 'styled-components';
import { ThemeContext } from '../ThemeProvider'; // Assuming you have a ThemeProvider

interface TextInputProps {
  placeholder?: string;
}

const StyledInput = styled.input`
  padding: 8px 12px;
  border: 1px solid ${(props) => props.theme.borderColor};
  border-radius: 4px;
  font-size: 14px;
  width: 200px;
  outline: none;
  background-color: ${(props) => props.theme.backgroundColor};
  color: ${(props) => props.theme.textColor};

  &:focus {
    border-color: ${(props) => props.theme.primaryColor};
    box-shadow: 0 0 5px rgba(0, 123, 255, 0.5);
  }
`;

const TextInput: React.FC<TextInputProps> = ({ placeholder = "Enter Text" }) => {
  const { theme } = useContext(ThemeContext); // Access the current theme
  return <StyledInput placeholder={placeholder} theme={theme} />; // Pass theme as a prop
};

export default TextInput;
```

```typescript
// packages/ui-library/src/components/ThemeProvider.tsx
import React, { createContext, useState, useContext, ReactNode } from 'react';
import { ThemeProvider as StyledThemeProvider } from 'styled-components';

// Define the theme interface
interface Theme {
  primaryColor: string;
  backgroundColor: string;
  textColor: string;
  borderColor: string;
}

// Define default themes
const lightTheme: Theme = {
  primaryColor: '#007bff',
  backgroundColor: '#ffffff',
  textColor: '#333333',
  borderColor: '#cccccc'
};

const darkTheme: Theme = {
  primaryColor: '#00aaff', // A slightly brighter shade for dark mode
  backgroundColor: '#333333',
  textColor: '#ffffff',
  borderColor: '#555555'
};

// Create a context for the theme
interface ThemeContextType {
  theme: Theme;
  toggleTheme: () => void;
}

const ThemeContext = createContext<ThemeContextType>({
  theme: lightTheme, // Default theme
  toggleTheme: () => {} // Dummy function to avoid null checks
});

// Create a ThemeProvider component
interface ThemeProviderProps {
  children: ReactNode;
}

const ThemeProvider: React.FC<ThemeProviderProps> = ({ children }) => {
  const [currentTheme, setCurrentTheme] = useState<Theme>(lightTheme);

  // Toggle between light and dark themes
  const toggleTheme = () => {
    setCurrentTheme(currentTheme === lightTheme ? darkTheme : lightTheme);
  };

  return (
    <ThemeContext.Provider value={{ theme: currentTheme, toggleTheme }}>
      <StyledThemeProvider theme={currentTheme}>
        {children}
      </StyledThemeProvider>
    </ThemeContext.Provider>
  );
};

// Custom hook to use the theme
const useTheme = () => useContext(ThemeContext);

export { ThemeProvider, useTheme, ThemeContext };
export type { Theme };
```

**Explanation:**

1. **Theme Definition:** The `Theme` interface defines the properties for the theme, such as `primaryColor`, `backgroundColor`, `textColor`, and `borderColor`.
2. **Default Themes:** `lightTheme` and `darkTheme` are defined as default themes.
3. **ThemeContext:** `ThemeContext` is created to provide the theme to components.
4. **ThemeProvider:** The `ThemeProvider` component manages the current theme state and provides a `toggleTheme` function to switch between themes. It wraps Styled Components' own `ThemeProvider` to pass the theme down.
5. **useTheme Hook:** A custom `useTheme` hook is provided to easily access the theme and `toggleTheme` function in components.
6. **Styled Components Integration:** Styled Components are used to create the styled `TextInput`. The theme is accessed inside the template literal via interpolations such as `${(props) => props.theme.borderColor}`.

## 4. Component Testing

### 4.1. Standard: Unit Tests

**Standard:** Write unit tests for all components, focusing on testing their public API and behavior.

**Why:** Ensures components function correctly and reduces the risk of regressions. Unit tests are fast and provide detailed feedback. In a Monorepo, component tests ensure that changes in one component don't break other parts of the system.
This is especially crucial when one component is utilized in multiple applications/packages.

**Do This:**
* Use a testing framework like Jest or Mocha.
* Write tests for all possible states and inputs.
* Use mocks and stubs to isolate components during testing.
* Aim for high test coverage.

**Don't Do This:**
* Skip writing tests for complex components.
* Write brittle tests that are tightly coupled to implementation details.

**Code Example (Jest/React Testing Library):**

```typescript
// packages/ui-library/src/components/Counter/Counter.tsx
import React, { useState } from 'react';
import styles from './Counter.module.css';

interface CounterProps {
  initialValue?: number;
}

const Counter: React.FC<CounterProps> = ({ initialValue = 0 }) => {
  const [count, setCount] = useState(initialValue);

  const increment = () => setCount(count + 1);
  const decrement = () => setCount(count - 1);

  return (
    <div className={styles.counterContainer}>
      <button onClick={decrement} className={styles.counterButton}>-</button>
      <span className={styles.counterValue}>{count}</span>
      <button onClick={increment} className={styles.counterButton}>+</button>
    </div>
  );
};

export default Counter;
```

```css
/* packages/ui-library/src/components/Counter/Counter.module.css */
.counterContainer { display: flex; align-items: center; }
.counterButton { padding: 5px 10px; margin: 0 5px; font-size: 16px; cursor: pointer; }
.counterValue { font-size: 18px; margin: 0 10px; }
```

```typescript
// packages/ui-library/src/components/Counter/Counter.test.tsx
import React from 'react';
import { render, screen, fireEvent } from '@testing-library/react';
import Counter from './Counter';

describe('Counter Component', () => {
  test('renders initial value correctly', () => {
    render(<Counter initialValue={5} />);
    expect(screen.getByText('5')).toBeInTheDocument();
  });

  test('increments count when increment button is clicked', () => {
    render(<Counter initialValue={0} />);
    fireEvent.click(screen.getByText('+'));
    expect(screen.getByText('1')).toBeInTheDocument();
  });

  test('decrements count when decrement button is clicked', () => {
    render(<Counter initialValue={10} />);
    fireEvent.click(screen.getByText('-'));
    expect(screen.getByText('9')).toBeInTheDocument();
  });
});
```

**Explanation:** The `Counter.test.tsx` file uses React Testing Library to test the `Counter` component. It verifies that the initial value is rendered correctly and that the increment and decrement buttons work as expected.

### 4.2. Standard: Component Storybook or Similar Documentation

**Standard:** Use a component documentation tool like Storybook to showcase the different states and variations of each component visually.

**Why:** Provides a living style guide and documentation for components. Facilitates communication and collaboration between designers and developers. Helps ensure visual consistency across the Monorepo. Crucial for component discovery and understanding of purpose.

**Do This:**
* Create stories for all components, covering different props and states.
* Use addons to enhance Storybook functionality (e.g., accessibility checks).
* Keep stories up-to-date as components evolve.

**Don't Do This:**
* Treat Storybook as an afterthought.
* Create incomplete or outdated stories.
**Code Example (Storybook Story):**

```typescript
// packages/ui-library/src/components/Button/Button.stories.tsx
import React from 'react';
import { Story, Meta } from '@storybook/react';
import Button from './Button';

export default {
  title: 'Components/Button',
  component: Button,
  argTypes: {
    variant: {
      control: { type: 'select', options: ['primary', 'secondary'] },
    },
    onClick: { action: 'clicked' },
  },
} as Meta;

const Template: Story = (args) => <Button {...args} />;

export const Primary = Template.bind({});
Primary.args = {
  children: 'Primary Button',
  variant: 'primary',
};

export const Secondary = Template.bind({});
Secondary.args = {
  children: 'Secondary Button',
  variant: 'secondary',
};
```

**Explanation:** This Storybook story defines two variations of the `Button` component: `Primary` and `Secondary`. Users can interact with these stories in the Storybook UI.

## 5. Performance Considerations

### 5.1. Standard: Minimize Re-renders

**Standard:** Optimize components to minimize unnecessary re-renders.

**Why:** Re-renders can be performance bottlenecks, especially in complex applications. Careful optimization is essential in a Monorepo where components are shared across multiple applications.

**Do This:**
* Use `React.memo` for functional components that receive the same props.
* Implement `shouldComponentUpdate` or extend `PureComponent` for class components.
* Use immutable data structures.

**Don't Do This:**
* Rely on default React behavior for all components without considering performance.
**Code Example (React.memo):**

```typescript
// packages/ui-library/src/components/DisplayValue/DisplayValue.tsx
import React from 'react';
import styles from './DisplayValue.module.css';

interface DisplayValueProps {
  value: string;
}

const DisplayValue: React.FC<DisplayValueProps> = ({ value }) => {
  console.log(`DisplayValue rendered with value: ${value}`);
  return <div className={styles.displayValue}>{value}</div>;
};

export default React.memo(DisplayValue);
```

```css
/* DisplayValue.module.css */
.displayValue {
  font-size: 20px;
  font-weight: bold;
  color: #333; /* or any color that suits your design */
  padding: 10px; /* some padding to give it space */
  border: 1px solid #ccc; /* optional: a subtle border */
  border-radius: 5px; /* optional: rounded corners for a softer look */
  background-color: #f9f9f9; /* optional: a very light background */
  text-align: center; /* centers the text */
}
```

**Explanation:** `React.memo` memoizes the `DisplayValue` component, preventing re-renders if the `value` prop hasn't changed.

### 5.2. Standard: Code Splitting

**Standard:** Implement code splitting to reduce the initial load time of applications.

**Why:** Splitting code into smaller chunks allows the browser to download only the code that's needed initially, improving performance. In a Monorepo, this is essential because the codebase can be very large.

**Do This:**
* Use dynamic imports (`import()`) to load components on demand.
* Use tools like Webpack or Rollup to configure code splitting.
* Identify chunks that can be loaded lazily.

**Don't Do This:**
* Load all components upfront, even if they aren't needed immediately.
**Code Example (Dynamic Import):**

```typescript
// packages/app/src/App.tsx
import React, { lazy, Suspense } from 'react';

const LazyLoadedComponent = lazy(() => import('@my-monorepo/ui-library/MyComponent'));

const App = () => {
  return (
    <Suspense fallback={<div>Loading...</div>}>
      <LazyLoadedComponent />
    </Suspense>
  );
};

export default App;
```

**Explanation:** The `@my-monorepo/ui-library/MyComponent` component is loaded lazily using `lazy` and `Suspense`. This means that the component's code will only be downloaded when it's actually needed.

## 6. Accessibility

### 6.1. Standard: ARIA Attributes

**Standard:** Utilize ARIA attributes to enhance the accessibility of components.

**Why:** ARIA attributes provide semantic information to assistive technologies, making components more accessible to users with disabilities.

**Do This:**
* Use ARIA attributes to describe the role, state, and properties of elements.
* Provide clear and concise labels for interactive elements.
* Test components with screen readers.

**Don't Do This:**
* Use ARIA attributes incorrectly or unnecessarily.
* Rely solely on ARIA attributes without providing proper semantic HTML.
**Code Example (ARIA Attributes):**

```typescript
// packages/ui-library/src/components/ToggleSwitch/ToggleSwitch.tsx
import React from 'react';
import styles from './ToggleSwitch.module.css';

interface ToggleSwitchProps {
  checked: boolean;
  onChange: (checked: boolean) => void;
  label?: string;
}

const ToggleSwitch: React.FC<ToggleSwitchProps> = ({ checked, onChange, label = "Enable" }) => {
  return (
    <div className={styles.toggleContainer}>
      <label className={styles.switch}>
        <input
          type="checkbox"
          checked={checked}
          onChange={(e) => onChange(e.target.checked)}
          role="switch"
          aria-checked={checked}
          aria-label={label}
        />
        <span className={`${styles.slider} ${styles.round}`}></span>
      </label>
    </div>
  );
};

export default ToggleSwitch;
```

```css
/* ToggleSwitch.module.css */
.toggleContainer { display: flex; align-items: center; }

.switch {
  position: relative;
  display: inline-block;
  width: 60px;
  height: 34px;
}

/* Hide the default HTML checkbox */
.switch input {
  opacity: 0;
  width: 0;
  height: 0;
}

/* The slider */
.slider {
  position: absolute;
  cursor: pointer;
  top: 0;
  left: 0;
  right: 0;
  bottom: 0;
  background-color: #ccc;
  transition: .4s;
}

.slider:before {
  position: absolute;
  content: "";
  height: 26px;
  width: 26px;
  left: 4px;
  bottom: 4px;
  background-color: white;
  transition: .4s;
}

input:checked + .slider { background-color: #2196F3; }
input:focus + .slider { box-shadow: 0 0 1px #2196F3; }
input:checked + .slider:before { transform: translateX(26px); }

/* Rounded sliders */
.slider.round { border-radius: 34px; }
.slider.round:before { border-radius: 50%; }
```

**Explanation:** The `ToggleSwitch` component uses `role="switch"` and `aria-checked` to provide semantic information about the toggle switch to assistive technologies. `aria-label` provides accessible text for screen readers.

### 6.2. Standard: Keyboard Navigation

**Standard:** Ensure that all interactive components are accessible via keyboard navigation.
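Keyboard behavior is easiest to get right when it is factored into small, testable helpers. The sketch below (a hypothetical roving-tabindex helper, not part of the component library above) computes which item should receive focus next from a key press:

```typescript
// Hypothetical helper for a roving-tabindex widget (e.g., a toolbar or menu).
// Given the currently focused index, the pressed key, and the item count,
// it returns the index that should receive focus next.
type NavKey = 'ArrowRight' | 'ArrowLeft' | 'Home' | 'End';

export function nextFocusIndex(current: number, key: NavKey, count: number): number {
  if (count === 0) return -1; // nothing to focus
  switch (key) {
    case 'ArrowRight':
      return (current + 1) % count;         // wrap around to the first item
    case 'ArrowLeft':
      return (current - 1 + count) % count; // wrap around to the last item
    case 'Home':
      return 0;
    case 'End':
      return count - 1;
  }
}
```

A component would call such a helper from its `onKeyDown` handler, keep `tabindex="0"` on the active item and `tabindex="-1"` on the rest, and call `.focus()` on the newly active element.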
**Why:** Users who cannot use a mouse rely on keyboard navigation to interact with web applications.

**Do This:**
* Use proper HTML elements (e.g., `<button>`, `<a>`) that support keyboard navigation by default.
* Use the `tabindex` attribute to control the focus order.
* Provide visual focus indicators.

**Don't Do This:**
* Remove the focus outline without providing an alternative visual indicator.

## 7. Monorepo Specific Considerations

### 7.1. Standard: Dependency Management

**Standard:** Strictly control dependencies between packages in the Monorepo. Avoid circular dependencies.

**Why:** Circular dependencies can lead to build issues, runtime errors, and increased complexity.

**Do This:**
* Use a tool like `madge` or `depcheck` to detect circular dependencies.
* Refactor code to remove circular dependencies.
* Clearly define the public API of each package using TypeScript definition files.

**Don't Do This:**
* Introduce circular dependencies between packages.
* Ignore dependency management best practices.

### 7.2. Standard: Build Tooling

**Standard:** Use a build system that can efficiently build and test only the affected packages when changes are made. Tools like Nx and Turborepo can be very helpful.

**Why:** Helps to keep build times low, improving developer productivity.

**Do This:**
* Use tools designed for monorepos, like Nx or Turborepo, to intelligently build and cache task outputs.
* Clearly define the dependencies between packages in your build configuration.
* Leverage caching mechanisms to avoid rebuilding unchanged packages.

**Don't Do This:**
* Build all packages from scratch every time, as this is inefficient.
* Neglect to configure your tooling properly to track and optimize your build process.

This document provides a comprehensive set of coding standards for component design within a Monorepo architecture.
Adhering to these standards will help create reusable, maintainable, and performant components that can be shared across multiple applications. This is a living document; continuous feedback should be incorporated to refine and improve it as new practices emerge.
# State Management Standards for Monorepo

This document outlines the standards for state management within our monorepo. Effective state management is crucial for maintainability, performance, and scalability across our applications and libraries. These standards aim to provide a consistent approach to handling application state, data flow, and reactivity within the monorepo.

## 1. Principles of State Management in a Monorepo

A monorepo architecture introduces unique challenges and opportunities regarding state management. Because of code sharing and potential inter-dependencies between projects, a unified and well-defined state management strategy becomes paramount.

* **Standard:** Utilize a predictable and unidirectional data flow.
  * **Why:** Ensures that changes to state are traceable and debuggable, preventing unintended side effects across the monorepo.
  * **Do This:** Favor architectures like Flux, Redux, or their modern counterparts with clear data flow patterns.
  * **Don't Do This:** Directly mutate state across different components or services without a defined flow.
* **Standard:** Favor immutable data structures.
  * **Why:** Simplifies debugging, allows for easy change detection, and improves performance by enabling shallow comparisons.
  * **Do This:** Use libraries like Immutable.js, Immer, or native JavaScript with spread operators to create new, immutable state objects.
  * **Don't Do This:** Directly modify state objects, as this can lead to unpredictable behavior and difficult-to-trace bugs.
* **Standard:** Separate stateful logic from presentational components.
  * **Why:** Enhances reusability, testability, and maintainability by isolating state-specific code.
  * **Do This:** Implement the Container/Presentational pattern or use hooks to separate data fetching and state manipulation from UI rendering.
  * **Don't Do This:** Embed complex state logic directly within UI components.
* **Standard:** Define clear boundaries for state domains.
  * **Why:** Prevents components and services from accidentally modifying state that they shouldn't have access to.
  * **Do This:** Use techniques like context providers or scoped state management solutions to isolate state to specific parts of the application.
  * **Don't Do This:** Allow global, shared state to be modified from anywhere in the codebase without clear ownership or access controls.
* **Standard:** Handle side effects carefully.
  * **Why:** Side effects (API calls, DOM manipulations, etc.) can introduce complexity and make state updates less predictable.
  * **Do This:** Isolate side effects within dedicated modules or use middleware/thunks in state management libraries.
  * **Don't Do This:** Perform side effects directly within reducers or component render functions.

## 2. Choosing a State Management Library

Selecting the right state management library is critical. The choice depends on the project's complexity, team familiarity, and performance requirements. The monorepo should adopt a limited set of preferred libraries to promote consistency.

* **Preferred Libraries:** For React-based applications, consider Zustand, Recoil, Jotai, or Redux Toolkit. For Vue-based applications, consider Pinia or Vuex. (These are leading contenders as of late 2024/early 2025.)
  * **Zustand:** A small, fast, and scalable bare-bones state-management solution using simplified Flux principles.
  * **Recoil:** A state management library for React that lets you create data-flow graphs. Particularly suited to complex dependencies, though it can require more boilerplate than Zustand.
  * **Jotai:** Primitive and flexible state management based on an atomic model.
  * **Redux Toolkit:** An opinionated, batteries-included toolset for efficient Redux development, simplifying configuration and reducing boilerplate. Often combined with RTK Query for data fetching.
  * **Pinia:** The recommended state management solution for Vue 3, offering a simpler and more intuitive API compared to Vuex.
  * **Vuex:** The official state management library for Vue, suitable for complex applications requiring centralized state management.

* **Standard:** Justify the choice of state management library in the project's README.
  * **Why:** Provides context for other developers and helps maintain consistency across the monorepo.
  * **Do This:** Document the reasons for selecting a specific library, considering factors like team expertise, project complexity, and performance requirements.
  * **Don't Do This:** Choose a library arbitrarily without properly evaluating its suitability for the project.

## 3. Zustand State Management Examples

Zustand is a minimalist and flexible state management solution suitable for many projects within a monorepo.

### 3.1 Core Implementation

* **Standard:** Create a store using `create` from Zustand.
* **Standard:** Define state and actions within the store function.

```javascript
// packages/my-app/src/store/myStore.js
import { create } from 'zustand';

const useMyStore = create((set) => ({
  count: 0,
  data: null, // Populated by fetchData below
  increment: () => set((state) => ({ count: state.count + 1 })),
  decrement: () => set((state) => ({ count: state.count - 1 })),
  reset: () => set({ count: 0 }),
  // Example with an async action
  fetchData: async () => {
    const response = await fetch('/api/data'); // Replace with a real API endpoint
    const data = await response.json();
    set({ data });
  },
}));

export default useMyStore;
```

* **Why:** Provides a simple and efficient way to manage state using hooks.
* **Do This:** Use functional updates to ensure immutability.
* **Don't Do This:** Mutate the state directly.

### 3.2 Using the Store in Components

* **Standard:** Use the custom hook `useMyStore` to access state and actions within components.
```javascript
// packages/my-app/src/components/MyComponent.js
import React from 'react';
import useMyStore from '../store/myStore';

function MyComponent() {
  const { count, increment, decrement, reset, fetchData } = useMyStore();

  return (
    <div>
      <p>Count: {count}</p>
      <button onClick={increment}>Increment</button>
      <button onClick={decrement}>Decrement</button>
      <button onClick={reset}>Reset</button>
      <button onClick={fetchData}>Fetch Data</button>
    </div>
  );
}

export default MyComponent;
```

* **Why:** Simplifies component logic and promotes reusability.

### 3.3. Middleware and Persistence

Zustand uses middleware for advanced functionality like persistence.

```javascript
// packages/my-app/src/store/myStore.js
import { create } from 'zustand';
import { persist } from 'zustand/middleware';

const useMyStore = create(persist(
  (set, get) => ({
    count: 0,
    increment: () => set({ count: get().count + 1 }),
    decrement: () => set({ count: get().count - 1 }),
  }),
  {
    name: 'my-store', // unique name
    getStorage: () => localStorage, // (optional) defaults to localStorage
  }
));

export default useMyStore;
```

* The `persist` middleware automatically saves the state to local storage.
* **Why:** Enables easy persistence of state across sessions.

## 4. Recoil State Management Examples

Recoil offers a different approach based on atoms and selectors, suitable for complex dependency graphs.

### 4.1 Core Implementation

* **Standard:** Define atoms for state and selectors for derived state.

```javascript
// packages/my-app/src/recoil/atoms.js
import { atom } from 'recoil';

export const countState = atom({
  key: 'countState',
  default: 0,
});
```

```javascript
// packages/my-app/src/recoil/selectors.js
import { selector } from 'recoil';
import { countState } from './atoms';

export const doubledCountState = selector({
  key: 'doubledCountState',
  get: ({ get }) => {
    const count = get(countState);
    return count * 2;
  },
});
```

* **Why:** Provides a flexible and efficient way to manage complex state dependencies.
* **Do This:** Use unique keys for atoms and selectors.
* **Don't Do This:** Use generic keys that might conflict with other parts of the application.

### 4.2 Using Recoil in Components

* **Standard:** Use `useRecoilState` and `useRecoilValue` hooks to access Recoil state and derived values.

```javascript
// packages/my-app/src/components/MyComponent.js
import React from 'react';
import { useRecoilState, useRecoilValue } from 'recoil';
import { countState } from '../recoil/atoms';
import { doubledCountState } from '../recoil/selectors';

function MyComponent() {
  const [count, setCount] = useRecoilState(countState);
  const doubledCount = useRecoilValue(doubledCountState);

  return (
    <div>
      <p>Count: {count}</p>
      <p>Doubled Count: {doubledCount}</p>
      <button onClick={() => setCount(count + 1)}>Increment</button>
    </div>
  );
}

export default MyComponent;
```

* **Why:** Simplifies component logic and promotes reusability.

### 4.3 Asynchronous Selectors for Data Fetching

Recoil excels with asynchronous data fetching.

```javascript
import { selector } from 'recoil';

export const asyncDataState = selector({
  key: 'asyncDataState',
  get: async () => {
    const response = await fetch('/api/data'); // Replace with a real API endpoint
    const data = await response.json();
    return data;
  },
});
```

* `useRecoilValue` is used to access the data in components.

## 5. Redux Toolkit Examples

Redux Toolkit simplifies Redux development with opinionated defaults and utility functions. RTK Query is the recommended approach to data fetching with Redux.

### 5.1 Core Implementation

* **Standard:** Configure a Redux store using `configureStore` from Redux Toolkit.
* **Standard:** Define reducers using `createSlice`.
```javascript
// packages/my-app/src/store/store.js
import { configureStore } from '@reduxjs/toolkit';
import counterReducer from './counterSlice';

export const store = configureStore({
  reducer: {
    counter: counterReducer,
  },
});
```

```javascript
// packages/my-app/src/store/counterSlice.js
import { createSlice } from '@reduxjs/toolkit';

export const counterSlice = createSlice({
  name: 'counter',
  initialState: {
    value: 0,
  },
  reducers: {
    increment: (state) => {
      state.value += 1;
    },
    decrement: (state) => {
      state.value -= 1;
    },
    incrementByAmount: (state, action) => {
      state.value += action.payload;
    },
  },
});

export const { increment, decrement, incrementByAmount } = counterSlice.actions;
export default counterSlice.reducer;
```

* **Why:** Provides a simplified and efficient way to manage Redux state.
* **Do This:** Use `createSlice` to automatically generate action creators and reducer logic.
* **Don't Do This:** Write manual action creators and reducers, as this can lead to boilerplate and errors.

### 5.2 Using Redux in Components

* **Standard:** Use `useSelector` and `useDispatch` hooks from `react-redux` to access state and dispatch actions within components.

```javascript
// packages/my-app/src/components/MyComponent.js
import React from 'react';
import { useSelector, useDispatch } from 'react-redux';
import { increment, decrement, incrementByAmount } from '../store/counterSlice';

function MyComponent() {
  const count = useSelector((state) => state.counter.value);
  const dispatch = useDispatch();

  return (
    <div>
      <p>Count: {count}</p>
      <button onClick={() => dispatch(increment())}>Increment</button>
      <button onClick={() => dispatch(decrement())}>Decrement</button>
      <button onClick={() => dispatch(incrementByAmount(5))}>Increment by 5</button>
    </div>
  );
}

export default MyComponent;
```

* **Why:** Simplifies component logic and promotes reusability.

### 5.3 RTK Query for Data Fetching

RTK Query simplifies data fetching in Redux applications.
```javascript
// packages/my-app/src/services/api.js
import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react';

export const api = createApi({
  baseQuery: fetchBaseQuery({ baseUrl: '/' }), // Adjust base URL as needed; consider using env vars
  endpoints: (builder) => ({
    getData: builder.query({
      query: () => 'data', // Actual endpoint
    }),
  }),
});

export const { useGetDataQuery } = api;
```

```javascript
// In store.js:
import { configureStore } from '@reduxjs/toolkit';
import { api } from './services/api';

export const store = configureStore({
  reducer: {
    [api.reducerPath]: api.reducer,
  },
  middleware: (getDefaultMiddleware) =>
    getDefaultMiddleware().concat(api.middleware),
});
```

```javascript
// In a component:
import React from 'react';
import { useGetDataQuery } from '../services/api';

function MyComponent() {
  const { data, error, isLoading } = useGetDataQuery();

  if (isLoading) return <div>Loading...</div>;
  if (error) return <div>Error: {error.message}</div>;

  return (
    <div>
      {data.map((item) => (
        <div key={item.id}>{item.name}</div>
      ))}
    </div>
  );
}
```

* **Why:** Provides a streamlined and efficient way to fetch and cache data using Redux.
* **Do This:** Define API endpoints using `createApi`.
* **Don't Do This:** Manually fetch data and manage loading states and errors, as RTK Query handles this automatically.

## 6. Vue.js State Management with Pinia

Pinia is the recommended state management solution for Vue 3.

### 6.1 Core Implementation

* **Standard:** Define stores using `defineStore` from Pinia.
```javascript
// packages/my-app/src/stores/counter.js
import { defineStore } from 'pinia';

export const useCounterStore = defineStore('counter', {
  state: () => ({
    count: 0,
  }),
  getters: {
    doubleCount: (state) => state.count * 2,
  },
  actions: {
    increment() {
      this.count++;
    },
    decrement() {
      this.count--;
    },
    async fetchData() {
      // Example of making an API call; adapt to your needs
      const response = await fetch('/api/data');
      const data = await response.json();
      // Assign the fetched data to a state variable
      this.count = data.count; // Adapt based on the actual returned data
    },
  },
});
```

* **Why:** Provides a modular and scalable approach to managing state in Vue.js applications.
* **Do This:** Utilize actions for mutations and getters for derived data. Avoid mutating state directly outside of actions.
* **Don't Do This:** Use `mapState`, `mapGetters`, and `mapActions` (Vuex syntax) in Pinia. Use the `use*` composable pattern instead.

### 6.2 Using Pinia in Components

* **Standard:** Use the `useCounterStore` custom hook to access state, getters, and actions within components via the composable `use*` pattern.

```vue
<!-- packages/my-app/src/components/MyComponent.vue -->
<template>
  <p>Count: {{ counter.count }}</p>
  <p>Double Count: {{ counter.doubleCount }}</p>
  <button @click="counter.increment">Increment</button>
  <button @click="counter.decrement">Decrement</button>
  <button @click="counter.fetchData">Fetch Data</button>
</template>

<script setup>
import { useCounterStore } from '../stores/counter'

const counter = useCounterStore()
</script>
```

* **Why:** Provides a clear way to access store properties directly in the template and simplifies component logic. The `setup` script handles all state management.

## 7. Guidelines for Sharing State Across Packages

Sharing state across packages within the monorepo needs careful consideration.

* **Standard:** Avoid sharing mutable state directly between packages.
  * **Why:** Can lead to tight coupling and difficult-to-debug issues.
  * **Do This:** Use events, messages, or shared APIs to communicate state changes between packages.
  * **Don't Do This:** Directly import and modify state from one package into another.
* **Standard:** Define shared state contracts using TypeScript interfaces.
  * **Why:** Ensures that state is transferred consistently and predictably between packages.
  * **Do This:** Create a shared `types` package to define interfaces for state objects.
  * **Don't Do This:** Use dynamic or untyped data structures for shared state.
* **Standard:** Consider using a shared state management solution if multiple packages need to access the same state.
  * **Why:** Provides a centralized and consistent way to manage shared state.
  * **Do This:** Use a shared Redux store, Zustand store, or Recoil graph if necessary.

## 8. Testing State Management

Testing state management logic is critical for ensuring application correctness.

* **Standard:** Write unit tests for reducers, actions, and selectors.
  * **Why:** Ensures that state updates are predictable and correct.
  * **Do This:** Use testing libraries like Jest or Mocha to write unit tests.
  * **Don't Do This:** Skip testing state management logic, as this can lead to subtle bugs.
* **Standard:** Write integration tests for components that interact with state.
  * **Why:** Ensures that components correctly dispatch actions and render state.
  * **Do This:** Use testing libraries like React Testing Library or Vue Test Utils to write integration tests.
  * **Don't Do This:** Rely solely on manual testing to verify state management.
* **Standard:** Mock API calls when testing state management logic.
  * **Why:** Prevents tests from depending on external services and makes them more reliable.
  * **Do This:** Use mocking libraries like Mock Service Worker (MSW) or Nock to intercept and mock API calls.

## 9. Anti-Patterns and Mistakes to Avoid

* **Over-reliance on Global State:** Avoid storing purely local component state in the global state management solution.
Performance will suffer.
* **Direct State Mutation:** Always ensure immutability.
* **Ignoring Asynchronous Actions:** Handle async operations correctly, especially API calls. Use RTK Query, thunks, or comparable patterns.
* **Lack of Testing:** State management logic is often complex and requires thorough testing.
* **Unnecessary Complexity:** Choose the simplest state management solution that meets the project's needs. Don't automatically reach for Redux when Zustand will do.
* **Tight Coupling:** Avoid creating tight dependencies between components and the state management implementation.
* **Neglecting Performance:** Be aware of performance implications, especially when dealing with large state objects.
* **Not Using TypeScript:** TypeScript can prevent many problems when refactoring and understanding the data structures across the monorepo. Use it!
* **Magic Strings:** Use constants instead of string literals for action types and other related items.

By adhering to these standards, we can ensure a consistent, maintainable, and scalable approach to state management across our monorepo. This document should be used as a reference for all development teams and integrated into code review processes. Continuously updating these standards as the ecosystem evolves is crucial for maintaining high-quality code.
# Performance Optimization Standards for Monorepo

This document outlines coding standards and best practices for performance optimization within a Monorepo environment. These standards are designed to improve application speed, responsiveness, and resource utilization. Following these guidelines will result in more maintainable, scalable, and performant applications.

## 1. Architectural Considerations for Performance

### 1.1. Strategic Module Decomposition

**Goal:** Minimize the impact of changes and builds across the entire repository and optimize for parallel build execution.

* **Do This:**
  * Divide the Monorepo into cohesive, independent modules (libraries, applications, shared components).
  * Consider the "blast radius" of changes. Modifications to one module should ideally have minimal or no impact on unrelated modules.
  * Ensure well-defined public APIs for modules that need to interact.
* **Don't Do This:**
  * Create a monolithic module containing everything.
  * Establish circular dependencies between modules.
  * Expose internal implementation details through public APIs.

**Why:** Poor module decomposition leads to unnecessary rebuilds, increased testing burden, and difficulty in isolating performance bottlenecks. A well-structured Monorepo facilitates parallel builds, targeted testing, and independent deployments, all of which contribute to faster development cycles and improved performance.

**Example:**

```
monorepo/
├── apps/
│   ├── web-app/          # Independent web application
│   │   ├── src/
│   │   └── package.json
│   └── mobile-app/       # Independent mobile application
│       ├── src/
│       └── package.json
├── libs/
│   ├── ui-components/    # Reusable UI components
│   │   ├── src/
│   │   └── package.json
│   └── data-access/      # Data fetching and caching logic
│       ├── src/
│       └── package.json
└── tools/
    └── scripts/          # Utility scripts (e.g., build, test)
```

### 1.2. Dependency Management

**Goal:** Reduce build times and runtime overhead by minimizing unnecessary dependencies.
* **Do This:**
    * Declare dependencies accurately (e.g., using `devDependencies` for build-time dependencies).
    * Utilize dependency analysis tools (like `npm audit`, `yarn audit`) to identify and mitigate security vulnerabilities and outdated packages.
    * Keep dependencies up to date to benefit from performance improvements and security patches.
    * Use tools like `pnpm` or `yarn` with workspace functionality for optimal dependency sharing and installation speed.
* **Don't Do This:**
    * Include unnecessary dependencies in your modules.
    * Rely on transitive dependencies without declaring them explicitly.

**Why:** Excessive or poorly managed dependencies increase build times, bundle sizes, and potentially introduce security vulnerabilities. Explicitly managing dependencies ensures that each module only includes what it truly needs, optimizing overall performance.

**Example (`package.json`):**

```json
{
  "name": "@my-monorepo/ui-components",
  "version": "1.0.0",
  "dependencies": {
    "@emotion/react": "^11.11.1",
    "@emotion/styled": "^11.11.0",
    "@mui/material": "^5.14.18"
  },
  "devDependencies": {
    "@types/react": "^18.2.33",
    "@types/styled-components": "^5.1.29",
    "typescript": "^5.2.2"
  }
}
```

### 1.3. Build System Optimization

**Goal:** Minimize build times and optimize for incremental builds.

* **Do This:**
    * Use a modern build system tailored for Monorepos, such as Nx, Bazel, or Turborepo.
    * Configure the build system to leverage caching and incremental builds.
    * Define clear build targets and dependencies within the build configuration.
    * Use parallel execution where appropriate to speed up build processes.
    * Profile your builds regularly to identify bottlenecks.
* **Don't Do This:**
    * Use generic build tools that don't understand Monorepo structures.
    * Disable caching or incremental builds.
    * Create complex build scripts that are difficult to maintain.

**Why:** Optimized build processes significantly reduce development time and improve developer productivity.
Caching and incremental builds ensure that only necessary code is rebuilt, leading to substantial performance gains. A modern build system designed for Monorepos understands the relationships between modules and can optimize the build process accordingly.

**Example (Nx `nx.json`):**

```json
{
  "tasksRunnerOptions": {
    "default": {
      "runner": "nx-cloud",
      "options": {
        "cacheableOperations": ["build", "lint", "test", "e2e"],
        "accessToken": "YOUR_NX_CLOUD_TOKEN"
      }
    }
  },
  "affected": {
    "defaultBase": "main"
  },
  "namedInputs": {
    "default": ["{projectRoot}/**/*", "sharedGlobals"],
    "production": [
      "default",
      "!{projectRoot}/**/?(*.)+(spec|test).[jt]s?(x)?(.snap)",
      "!{projectRoot}/tsconfig.spec.json",
      "!{projectRoot}/jest.config.[jt]s",
      "!{projectRoot}/.eslintrc.json"
    ],
    "sharedGlobals": []
  }
}
```

### 1.4. Code Sharing and Reusability

**Goal:** Avoid code duplication and promote efficient use of resources.

* **Do This:**
    * Identify common functionality across modules and extract it into shared libraries.
    * Use a design system or component library for consistent UI elements.
    * Employ code generation techniques to reduce boilerplate code.
* **Don't Do This:**
    * Duplicate code across multiple modules.
    * Create tightly coupled components that are difficult to reuse.

**Why:** Code duplication increases maintenance costs and potential performance issues. Sharing code reduces the overall codebase size, promotes consistency, and simplifies updates. Using a component library improves rendering performance by reducing the amount of unique CSS and JavaScript that needs to be loaded.

**Example:** Move common utility functions to a shared library.
"""typescript // libs/utils/src/index.ts export function formatCurrency(amount: number, currencyCode: string = 'USD'): string { return new Intl.NumberFormat('en-US', { style: 'currency', currency: currencyCode, }).format(amount); } // apps/web-app/src/components/Product.tsx import { formatCurrency } from '@my-monorepo/utils'; function Product({ price }: { price: number }) { return <div>Price: {formatCurrency(price)}</div>; } """ ## 2. Coding Practices for Performance ### 2.1. Lazy Loading and Code Splitting **Goal:** Reduce initial load times by loading code only when it is needed. * **Do This:** * Implement lazy loading for modules that are not immediately required. * Use code splitting to break large bundles into smaller chunks. * Consider route-based code splitting for single-page applications. * **Don't Do This:** * Load all code upfront. * Create excessively large bundles that take a long time to download and parse. **Why:** Initial load time is critical for user experience. Lazy loading and code splitting significantly improve startup performance by deferring the loading of non-essential code. **Example (React with "React.lazy"):** """jsx import React, { lazy, Suspense } from 'react'; const AnalyticsDashboard = lazy(() => import('./AnalyticsDashboard')); // Lazy-loaded component function App() { return ( <div> {/* ... other components ... */} <Suspense fallback={<div>Loading...</div>}> <AnalyticsDashboard /> </Suspense> </div> ); } """ ### 2.2. Efficient Data Structures and Algorithms **Goal:** Optimize runtime performance by choosing appropriate data structures and algorithms. * **Do This:** * Select data structures based on access patterns (e.g., use a Set for membership tests, a Map for key-value lookups). * Use efficient algorithms for common operations (e.g., sorting, searching). * Consider the time and space complexity of your algorithms. * **Don't Do This:** * Use inefficient data structures or algorithms. * Perform unnecessary computations. 
**Why:** The choice of data structures and algorithms significantly impacts application performance. Choosing the right tools for the job can lead to dramatic improvements in speed and resource utilization.

**Example:**

```javascript
// Efficiently check if an element exists in a collection.
// Use a Set instead of an array for repeated lookups.
const myArray = ['a', 'b', 'c', 'd', 'e'];
const mySet = new Set(myArray);

// Bad: Linear time complexity
// myArray.includes('c');

// Good: Near-constant time complexity
mySet.has('c');
```

### 2.3. Memory Management

**Goal:** Prevent memory leaks and optimize memory usage.

* **Do This:**
    * Avoid creating unnecessary objects.
    * Release resources when they are no longer needed (e.g., event listeners, timers).
    * Use techniques like object pooling to reuse objects.
    * Be mindful of closures and their potential to capture large amounts of data.
    * Use tools like the Chrome DevTools memory profiler to identify memory leaks.
    * When possible, leverage technologies with automatic garbage collection.
* **Don't Do This:**
    * Create large numbers of temporary objects.
    * Forget to release resources.
    * Store large amounts of data in memory unnecessarily.

**Why:** Memory leaks and excessive memory usage can lead to performance degradation and application crashes. Proper memory management ensures that applications run smoothly and efficiently.

**Example:** Removing event listeners to prevent memory leaks.

```javascript
import React from 'react';

class MyComponent extends React.Component {
  constructor(props) {
    super(props);
    this.handleClick = this.handleClick.bind(this);
  }

  componentDidMount() {
    window.addEventListener('click', this.handleClick);
  }

  componentWillUnmount() {
    window.removeEventListener('click', this.handleClick); // Remove the event listener
  }

  handleClick() {
    console.log('Clicked!');
  }

  render() {
    return null;
  }
}
```

### 2.4. Minimize DOM Manipulation

**Goal:** Reduce the performance overhead associated with updating the Document Object Model (DOM).

* **Do This:**
    * Batch DOM updates.
    * Use virtual DOM techniques (e.g., React, Vue).
    * Avoid direct DOM manipulation where possible.
    * Use efficient selectors (e.g., avoid complex CSS selectors).
* **Don't Do This:**
    * Perform frequent DOM updates.
    * Use inefficient DOM manipulation methods.

**Why:** DOM manipulation is an expensive operation. Minimizing the number of DOM updates improves rendering performance and reduces layout thrashing. Virtual DOM techniques allow you to efficiently update the DOM by comparing the current state with the desired state and only making necessary changes.

**Example (React):**

```jsx
import React, { useState } from 'react';

function MyComponent() {
  const [items, setItems] = useState(['item1', 'item2', 'item3']);

  const addItem = () => {
    // Bad: separate updates based on the stale `items` value would drop one
    // of the new entries (and, before React 18's automatic batching, could
    // also trigger multiple re-renders)
    // setItems([...items, 'newItem1']);
    // setItems([...items, 'newItem2']);

    // Good: batch updates into a single functional state update
    setItems(prevItems => [...prevItems, 'newItem1', 'newItem2']);
  };

  return (
    <div>
      <ul>
        {items.map(item => (
          <li key={item}>{item}</li>
        ))}
      </ul>
      <button onClick={addItem}>Add Items</button>
    </div>
  );
}
```

### 2.5. Caching Strategies

**Goal:** Reduce the need to repeatedly fetch or compute the same data.

* **Do This:**
    * Implement caching at different levels (e.g., browser caching, server-side caching, in-memory caching).
    * Use appropriate cache invalidation strategies (e.g., time-based expiration, event-based invalidation).
    * Leverage Content Delivery Networks (CDNs) for static assets.
* **Don't Do This:**
    * Cache data indefinitely without invalidation.
    * Cache sensitive data inappropriately.

**Why:** Caching can dramatically improve application performance by reducing the load on servers and databases. Properly invalidating caches is crucial to ensure that users see the latest data.
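For in-memory caching with time-based invalidation, a minimal sketch may help (the `TtlCache` name and API are illustrative, not a library; a production cache would also bound its size, e.g. with LRU eviction):

```typescript
// Minimal in-memory cache with time-based (TTL) invalidation.
// Expired entries are invalidated lazily on read.
class TtlCache<V> {
  private entries = new Map<string, { value: V; expiresAt: number }>();

  constructor(private ttlMs: number) {}

  get(key: string): V | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (Date.now() >= entry.expiresAt) {
      this.entries.delete(key); // expired: drop the stale entry
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V): void {
    this.entries.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}

// Usage: cache expensive lookups for one minute.
const cache = new TtlCache<string>(60_000);
cache.set('greeting', 'hello');
```

A caller that misses the cache (gets `undefined`) falls through to the real fetch and re-populates the entry, so stale data lives at most one TTL.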
**Example (Browser caching using `Cache-Control` headers):**

```javascript
// Server-side code (e.g., Node.js with Express)
app.get('/api/data', (req, res) => {
  // Set the Cache-Control header
  res.set('Cache-Control', 'public, max-age=3600'); // Cache for 1 hour
  // ... fetch and send data ...
});
```

## 3. Technology-Specific Considerations

### 3.1. JavaScript/TypeScript

* **Do This:**
    * Use modern JavaScript features (e.g., `async/await`, `const`/`let`) for improved readability and performance.
    * Use TypeScript's type system to catch errors early and improve code maintainability.
    * Use `Array.map`, `Array.filter`, and `Array.reduce` instead of `for` loops where appropriate for more concise and potentially faster code.
    * Use `import` and `export` for modular code that benefits from tree shaking.
* **Don't Do This:**
    * Use legacy JavaScript features that are less performant or more difficult to understand.
    * Ignore TypeScript's type checking.

### 3.2. React

* **Do This:**
    * Use `React.memo` to prevent unnecessary re-renders of pure components.
    * Use `useCallback` and `useMemo` to memoize functions and values.
    * Use keys effectively when rendering lists.
    * Profile your components using the React Profiler to identify performance bottlenecks.
    * Use code splitting and lazy loading with `React.lazy` and `Suspense`.
* **Don't Do This:**
    * Rely solely on `shouldComponentUpdate` for preventing re-renders (use `React.memo` instead).
    * Create new objects or functions inside render methods.

### 3.3. Node.js

* **Do This:**
    * Use asynchronous operations and event loops effectively.
    * Optimize database queries.
    * Use connection pooling to reduce database connection overhead.
    * Use caching mechanisms (e.g., Redis, Memcached).
    * Profile your application using tools like Clinic.js to identify performance bottlenecks.
* **Don't Do This:**
    * Perform blocking operations in the main event loop.

## 4. Profiling and Monitoring

### 4.1. Performance Audits

* Perform regular performance audits using tools like Lighthouse, WebPageTest, or Chrome DevTools to identify areas for improvement.

### 4.2. Monitoring

* Implement monitoring solutions to track key performance indicators (KPIs) such as response time, error rate, and resource utilization. Use these KPIs to proactively identify and address performance issues.
* Consider using tools like Prometheus, Grafana, or Datadog for advanced monitoring and alerting.

## 5. Continuous Improvement

* **Do This:** Regularly review and update these standards to reflect the latest best practices and technology advancements. Encourage developers to propose improvements and share their knowledge.

By adhering to these performance optimization standards, development teams can build high-performing Monorepo applications that deliver excellent user experiences and are easy to maintain. Remember that performance optimization is an ongoing process that requires continuous monitoring, analysis, and refinement.
# Testing Methodologies Standards for Monorepo

This document outlines the testing methodology standards for our monorepo. It aims to guide developers in creating robust, reliable, and maintainable code. These standards are designed to enhance maintainability, improve developer velocity, and ensure code quality across the entire monorepo. This document serves as a reference for developers and a context for AI-assisted coding tools.

## 1. Introduction to Monorepo Testing

Testing in a monorepo architecture presents unique challenges and opportunities compared to traditional, multi-repo setups. Centralized code necessitates a holistic testing strategy that accounts for inter-package dependencies and potential ripple effects of changes. The goal is to maintain high confidence in code correctness, stability, and performance with efficient and effective testing methodologies.

### 1.1. Key Principles

* **Test Pyramid:** Implement a test strategy that follows the test pyramid, emphasizing unit tests, followed by integration tests, and then end-to-end tests.
* **Test Automation:** Automate testing at all levels to ensure consistent and repeatable results.
* **Parallel Execution:** Leverage monorepo tooling to parallelize test execution across packages to reduce overall testing time.
* **Isolation:** Isolate tests to prevent interference from external systems or other packages. Provide appropriate mocking and stubbing.
* **Code Coverage:** Aim for high code coverage to identify untested code paths, but prioritize meaningful tests over simply achieving a coverage percentage.
* **Continuous Integration/Continuous Deployment (CI/CD):** Integrate testing into a CI/CD pipeline to automatically run tests on every commit.
* **Contract Testing:** Utilize contract testing to verify interactions between services or modules.

### 1.2. Monorepo Specific Considerations

* **Dependency Management:** Pay close attention to inter-package dependencies when designing tests.
Changes in one package can affect others, so tests must account for potential ripple effects.
* **Scoped Testing:** Implement mechanisms for running tests selectively (e.g., only tests in changed packages and their dependents).
* **Shared Tooling:** Leverage shared testing infrastructure and utilities to maintain consistency and reduce duplication (e.g., shared Jest configurations, custom matchers, testing libraries).
* **Impact Analysis:** Use tooling to analyze the impact of changes before running tests, optimizing which tests need to be executed.

## 2. Unit Testing

Unit tests verify the functionality of individual units of code (e.g., functions, classes, components) in isolation. They are the foundation of a robust testing strategy.

### 2.1. Standards

* **Do This:**
    * Write unit tests for all non-trivial code.
    * Focus on testing the public API of modules and components.
    * Use mocking and stubbing to isolate units of code from their dependencies.
    * Write tests that are fast, reliable, and easy to understand.
    * Use descriptive test names that clearly indicate what is being tested.
    * Follow the Arrange-Act-Assert (AAA) pattern.
* **Don't Do This:**
    * Skip unit tests for "simple" code. Even simple code can have subtle bugs.
    * Write unit tests that test implementation details. These tests are brittle and prone to breaking when the implementation changes.
    * Over-mock or over-stub, which can lead to tests that don't accurately reflect the behavior of the system.
    * Write slow or unreliable unit tests. These tests will slow down the development process and erode confidence.
    * Use vague or ambiguous test names.

### 2.2. Code Examples (JavaScript/TypeScript)

```typescript
// example.ts
export function add(a: number, b: number): number {
  return a + b;
}

export function greet(name: string): string {
  if (!name) {
    throw new Error("Name cannot be empty");
  }
  return `Hello, ${name}!`;
}
```

```typescript
// example.test.ts (using Jest)
import { add, greet } from './example';

describe('add', () => {
  it('should add two numbers correctly', () => {
    // Arrange
    const a = 2;
    const b = 3;

    // Act
    const result = add(a, b);

    // Assert
    expect(result).toBe(5);
  });
});

describe('greet', () => {
  it('should greet a person with their name', () => {
    expect(greet('Alice')).toBe('Hello, Alice!');
  });

  it('should throw an error if the name is empty', () => {
    expect(() => greet('')).toThrowError("Name cannot be empty");
  });
});
```

### 2.3. Anti-Patterns

* **Testing implementation details:** Testing private methods or internal state.
* **Over-mocking:** Mocking excessively can make the tests less effective in identifying real bugs.

### 2.4. Technology-Specific Details

* Use Jest, Mocha, or Jasmine for JavaScript/TypeScript testing. Jest is recommended for React applications.
* Use appropriate assertion libraries (e.g., Chai, Jest's built-in assertions).
* Configure test runners to run in parallel and watch mode.
* Use code coverage tools to measure the effectiveness of unit tests. Istanbul (`nyc`) integrates well with Jest. Configure `nyc` to exclude test files and generated code.
* Use mocking libraries like `jest.mock` or `sinon` strategically, only when necessary to isolate the unit under test.

## 3. Integration Testing

Integration tests verify the interactions between different units of code or modules. They provide confidence that the system works correctly as a whole, bridging the gap between unit and end-to-end (E2E) tests.

### 3.1. Standards

* **Do This:**
    * Write integration tests that verify the interactions between different modules or services within the monorepo.
    * Focus on testing the flow of data through the system.
    * Use real dependencies or lightweight test doubles.
    * Write tests that are more comprehensive than unit tests but faster than E2E tests.
    * Ensure that integration tests clean up any test data after they run.
* **Don't Do This:**
    * Write integration tests that are too broad, testing too many components at once.
    * Use mocks for everything. Integration tests should verify real interactions.
    * Neglect to clean up test data. This can lead to tests that fail intermittently or pollute the environment.

### 3.2. Code Examples (Node.js/TypeScript with Express)

```typescript
// user-service.ts
import { add } from './math-service'; // Assuming math-service is another module

export class UserService {
  createUser(firstName: string, lastName: string): string {
    const userId = add(firstName.length, lastName.length);
    return `user-${userId}`;
  }
}
```

```typescript
// math-service.ts
export function add(a: number, b: number): number {
  return a + b;
}
```

```typescript
// user-service.test.ts (using Jest)
import { UserService } from './user-service';
import * as mathService from './math-service';

describe('UserService', () => {
  it('should create a user with a generated ID based on math-service', () => {
    const userService = new UserService();

    // Mock the specific function, which allows testing the service independently
    jest.spyOn(mathService, 'add').mockReturnValue(10);

    const userId = userService.createUser('John', 'Doe');

    expect(userId).toBe('user-10');
    expect(mathService.add).toHaveBeenCalledWith('John'.length, 'Doe'.length);
  });
});
```

### 3.3. Anti-Patterns

* **Testing through the UI:** Integration tests should focus on backend interactions, not UI components.
* **Not using a test database:** Use a separate database for testing to avoid affecting production data.
* **Relying on external services:** Mock external services or use test doubles (e.g., using `nock` to intercept HTTP requests).

### 3.4. Technology-Specific Details

* Use tools like Supertest for testing HTTP endpoints in Node.js.
* Use dependency injection to make it easier to replace dependencies with test doubles.
* Consider using Docker Compose to set up test environments with multiple services.

## 4. End-to-End (E2E) Testing

E2E tests simulate real user interactions with the application. They provide the highest level of confidence that the system works correctly from end to end. These are significantly slower than unit and integration tests but critical for verifying the overall system behavior.

### 4.1. Standards

* **Do This:**
    * Write E2E tests that cover the most critical user flows.
    * Use real browsers or headless browser environments (e.g., Playwright, Cypress, Puppeteer).
    * Set up the test environment automatically before each test run.
    * Clean up the test environment after each test run.
    * Write tests that are reliable and repeatable.
* **Don't Do This:**
    * Write too many E2E tests. Focus on the most critical user flows.
    * Write E2E tests that are brittle or flaky.
    * Run E2E tests too frequently. Ideally, run them within the CI/CD pipeline on merges/releases or nightly builds.

### 4.2. Code Examples (Playwright - TypeScript Preferred)

```typescript
// playwright.config.ts
import { defineConfig, devices } from '@playwright/test';

export default defineConfig({
  testDir: './tests',
  fullyParallel: true,
  reporter: 'html',
  use: {
    baseURL: 'http://localhost:3000',
    trace: 'on-first-retry',
  },
  projects: [
    {
      name: 'chromium',
      use: { ...devices['Desktop Chrome'] },
    },
  ],
});
```

```typescript
// tests/example.spec.ts
import { test, expect } from '@playwright/test';

test('should navigate to the about page', async ({ page }) => {
  await page.goto('/');
  await page.getByRole('link', { name: 'About' }).click();
  await expect(page).toHaveURL(/.*about/);
  await expect(page.locator('h1')).toContainText('About Us');
});

test('should allow a user to log in', async ({ page }) => {
  await page.goto('/login');
  await page.fill('input[name="username"]', 'testuser');
  await page.fill('input[name="password"]', 'password123');
  await page.click('button[type="submit"]');
  await page.waitForURL('/dashboard'); // Or any URL after login
  await expect(page.locator('#dashboard-title')).toContainText('Dashboard');
});
```

### 4.3. Anti-Patterns

* **Relying on the UI for setup:** Whenever possible, use APIs for test setup and teardown rather than the UI. This makes tests faster and more reliable.
* **Not waiting for elements to load:** Use explicit waits to ensure that elements are fully loaded before interacting with them.

### 4.4. Technology-Specific Details

* Use Playwright, Cypress, or Puppeteer for E2E testing. Playwright is currently favored for its speed, reliability, and multi-browser support.
* Use Docker to create consistent test environments.
* Use environment variables to configure tests for different environments (e.g., staging, production).
* Implement retries to reduce flakiness in E2E tests. Playwright and Cypress have built-in retry mechanisms.
* Integrate visual regression testing to catch unexpected UI changes. Tools like Percy or Applitools can be used.

## 5. Monorepo Testing Strategies

Adapting testing strategies to the monorepo context requires optimizing test execution and understanding interdependencies.

### 5.1. Selective Test Execution

Only run the tests that are affected by the changes in a commit. Utilize tooling that can identify changed packages and their dependencies to select the appropriate tests.

* **Do This:**
    * Use tools that automatically determine which packages have changed.
    * Configure your CI/CD system to only run tests for changed packages and their dependents.
    * Create a dependency graph of packages in the monorepo.
* **Don't Do This:**
    * Run all tests for every commit. This is inefficient and slows down the development process.

### 5.2. Parallelization

Run tests in parallel across multiple agents to reduce the overall testing time. Modern monorepo tools support parallel test execution.

* **Do This:**
    * Configure test runners to run tests in parallel.
    * Use a CI/CD system that can distribute tests across multiple agents.
    * Allocate sufficient resources to your CI/CD agents to handle the parallel test load.
* **Don't Do This:**
    * Run tests sequentially. This is slow and inefficient.

### 5.3. Code Coverage Across Packages

Aggregate code coverage data across all packages in the monorepo to provide a comprehensive view of code coverage.

* **Do This:**
    * Configure code coverage tools to generate reports for each package.
    * Aggregate the reports into a single dashboard to provide a complete view of code coverage.
    * Set code coverage thresholds to ensure that all packages are adequately tested.
* **Don't Do This:**
    * Ignore code coverage. This makes it difficult to identify untested code paths.

### 5.4. Example: Leveraging Nx for Affected Tests

Nx provides excellent support for running affected tests.
"""json // nx.json { "tasksRunnerOptions": { "default": { "runner": "nx-cloud", "options": { "cacheableOperations": ["build", "lint", "test", "e2e"], "accessToken": "YOUR_NX_CLOUD_ACCESS_TOKEN" } } }, "targetDefaults": { "test": { "inputs": ["default", "{workspaceRoot}/jest.preset.js"], "cache": true } } } """ To run tests affected by a commit: """bash nx affected:test --base=main --head=HEAD """ ## 6. Contract Testing Contract testing is a specialized testing technique that verifies the interactions between services, ensuring that they adhere to a defined contract. This is especially relevant when dealing with different teams owning different parts of the monorepo that interact via APIs. ### 6.1. Standards * **Do This:** * Define clear contracts between services or modules with well-defined inputs and outputs. * Implement contract tests that verify that each service adheres to its contract. * Use tools like Pact or Spring Cloud Contract to simplify the process of writing and running contract tests. * **Don't Do This:** * Assume that services will always interact correctly. Contract tests are crucial for preventing integration issues. * Neglect to update contract tests when contracts change. * Skip contract testing when changes are isolated to one service. The other side of the contract *must* also be tested. 
### 6.2. Example (Pact with JavaScript)

A *consumer* project wanting to consume information from the *provider* project using an API:

```javascript
// Consumer: consumer.test.js
const path = require('path');
const { Pact } = require('@pact-foundation/pact');
const { fetchProviderData } = require('./consumer'); // This is the code under test

describe('Pact Verification', () => {
  const provider = new Pact({
    consumer: 'MyConsumer',
    provider: 'MyProvider',
    port: 1234, // Port the mock service will run on
    dir: path.resolve(process.cwd(), 'pacts'), // Directory to save pact files
    log: path.resolve(process.cwd(), 'logs', 'pact.log'),
    logLevel: 'info',
    spec: 2,
  });

  beforeAll(async () => {
    await provider.setup();
  });

  afterEach(async () => {
    await provider.verify();
  });

  afterAll(async () => {
    await provider.finalize();
  });

  describe('When a call to retrieve data from the provider is made', () => {
    beforeEach(() => {
      provider.addInteraction({
        state: 'Provider has some data',
        uponReceiving: 'a request for the data',
        withRequest: {
          method: 'GET',
          path: '/data',
        },
        willRespondWith: {
          status: 200,
          headers: {
            'Content-Type': 'application/json',
          },
          body: {
            message: 'Hello, Consumer!',
          },
        },
      });
    });

    it('should return the correct data', async () => {
      const data = await fetchProviderData('http://localhost:1234');
      expect(data.message).toEqual('Hello, Consumer!');
    });
  });
});
```

```javascript
// Provider: provider.test.js (using Pact CLI or library to verify pacts)
const { Verifier } = require('@pact-foundation/pact');
const path = require('path');

describe('Pact Verification', () => {
  it('should validate the expectations of the Consumer', () => {
    const opts = {
      providerBaseUrl: 'http://localhost:3000', // Where the provider is running
      pactUrls: [
        path.resolve(__dirname, '../pacts/myconsumer-myprovider.json'), // Path to pact file
      ],
      publishVerificationResult: true,
      providerVersion: '1.0.0',
    };

    return new Verifier(opts).verifyProvider().then(output => {
      console.log('Pact Verification Complete!');
      console.log(output);
    });
  });
});
```

### 6.3. Technology-Specific Details

* Utilize Pact for contract testing in polyglot environments.
* Spring Cloud Contract is a great option for Java-based microservices.
* Clearly define the responsibilities of consumers and providers in the contract.
* Automate the process of verifying contracts in the CI/CD pipeline.

## 7. Performance Testing

Performance testing is vital for ensuring applications within the monorepo remain responsive and scalable. In a monorepo, performance issues in one package can potentially affect others, making this crucial.

### 7.1. Standards

* **Do This:**
    * Conduct load, stress, and soak tests to identify bottlenecks and performance degradation.
    * Use tools like JMeter, Gatling, or k6 for performance testing.
    * Define key performance indicators (KPIs) like response time, throughput, and error rate.
    * Establish performance baselines to measure improvements and regressions.
* **Don't Do This:**
    * Neglect performance testing until late in the development cycle.
    * Rely solely on manual performance evaluations.
    * Ignore the impact of database queries and inefficient algorithms on performance.

### 7.2. Example Using k6

```javascript
import http from 'k6/http';
import { sleep } from 'k6';

export const options = {
  vus: 10,
  duration: '10s',
};

export default function () {
  http.get('http://localhost:3000/api/data');
  sleep(1);
}
```

### 7.3. Considerations for Monorepos

* Isolate specific packages or APIs for testing.
* Use monorepo-aware CI/CD tools.
* Monitor resource consumption across the monorepo.

## 8. Security Testing

Security testing identifies vulnerabilities in the code and ensures that the application is protected against attacks.

### 8.1. Standards

* **Do This:**
    * Perform static analysis to identify potential security vulnerabilities in the code.
    * Conduct dynamic analysis to test the application for vulnerabilities during runtime.
    * Use tools like SonarQube, Snyk, or OWASP ZAP to automate security testing.
    * Follow secure coding practices to prevent common vulnerabilities like SQL injection, cross-site scripting (XSS), and cross-site request forgery (CSRF).
    * Conduct regular penetration testing to identify weaknesses in the application's security.
* **Don't Do This:**
    * Ignore security vulnerabilities. Even seemingly minor vulnerabilities can be exploited by attackers.
    * Rely solely on automated security testing. Manual code reviews and penetration testing are also important.

### 8.2. Technology-Specific Details

* Use ESLint with security-related rules to identify potential vulnerabilities in JavaScript/TypeScript code.
* Use `npm audit` or `yarn audit` to identify vulnerabilities in dependencies.
* Use tools like Snyk to automatically fix vulnerabilities in dependencies.
* Follow the OWASP Top 10 guidelines to prevent common web application vulnerabilities.

## 9. Documentation

Clear documentation is crucial for maintainability and knowledge sharing within the codebase.

### 9.1. Standards

* **Do This:**
    * Document the purpose and functionality of test cases.
    * Document the integration and end-to-end testing environments.
* **Don't Do This:**
    * Overlook the importance of keeping test documentation up to date.
    * Skip documenting even the most straightforward-looking tests.

## 10. Conclusion

These testing methodology standards are designed to promote high-quality code within our monorepo. By adhering to these guidelines, developers can build robust, reliable, and maintainable applications that meet the needs of our users. This document should be reviewed and updated regularly to reflect the latest best practices and technologies. Remember that testing is an integral part of the development process and should be considered at every stage of the software lifecycle.
# API Integration Standards for Monorepo

This document outlines the coding standards for API integration within our Monorepo. It is designed to ensure consistency, maintainability, performance, and security across all services and applications that interact with external and internal APIs.

## 1. Architectural Patterns for API Integration

### 1.1. API Gateway Pattern

**Standard:** Implement an API Gateway to centralize API routing, authentication, authorization, and rate limiting.

**Why:** Centralizes cross-cutting concerns, simplifies client-side code, improves security, and enables easier management of API versions and policies. It specifically helps in a Monorepo where different packages may need different API access configurations.

**Do This:** Use an API Gateway like Ambassador, Kong, or an internal solution built on technologies like Envoy or Traefik to manage and secure all inbound API traffic.

**Don't Do This:** Expose backend services directly to the internet or allow individual services to handle authentication and authorization independently.

**Example (Ambassador Configuration):**

```yaml
# api-gateway/ambassador/mapping.yaml
apiVersion: getambassador.io/v3alpha1
kind: Mapping
metadata:
  name: product-service-mapping
spec:
  prefix: /products/
  service: product-service:8080
  rewrite: /
  circuit_breakers:
    - max_connections: 100
      max_pending_requests: 50
      max_requests: 1000
      max_retries: 3
```

### 1.2. Backend for Frontend (BFF) Pattern

**Standard:** Utilize the BFF pattern for client applications (e.g., web, mobile) that require specific data aggregation or transformation.

**Why:** Decouples backend services from the specific needs of each client application, reduces over-fetching of data, and improves the user experience. In a Monorepo, BFFs can be part of the UI packages, adapting shared services for specific interfaces.

**Do This:** Create dedicated BFF services tailored to the data requirements of each client application.
**Don't Do This:** Force client applications to directly consume backend services that provide generic or overly complex data structures.

**Example (Node.js BFF):**

```javascript
// packages/web-app/bff/product-bff.js
const axios = require('axios');

async function getProductDetails(productId) {
  try {
    const productResponse = await axios.get(`/api/products/${productId}`);
    const reviewResponse = await axios.get(`/api/reviews/${productId}`);

    const productData = productResponse.data;
    const reviewData = reviewResponse.data;

    return {
      ...productData,
      reviews: reviewData,
    };
  } catch (error) {
    console.error('Error fetching product details:', error);
    throw error;
  }
}

module.exports = { getProductDetails };
```

### 1.3. Asynchronous Communication

**Standard:** Implement asynchronous communication using message queues (e.g., Kafka, RabbitMQ) for tasks that don't require immediate responses.

**Why:** Improves system resilience, enables scalability, and allows services to function independently. This is especially important in a Monorepo to avoid tight coupling between services.

**Do This:** Use message queues for event-driven architectures, background processing, and tasks that can tolerate eventual consistency.

**Don't Do This:** Rely solely on synchronous API calls for all interactions between services.
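Topic exchanges route each event by matching a binding pattern against the message's routing key (`*` matches exactly one dot-separated word, `#` matches zero or more). That matching rule can be sketched in plain JavaScript; this is a simplified illustration of the semantics, not the broker's actual implementation:

```javascript
// Simplified sketch of AMQP-style topic matching:
// '*' matches exactly one dot-separated word, '#' matches zero or more words.
function topicMatches(pattern, routingKey) {
  const p = pattern.split('.');
  const k = routingKey.split('.');

  function match(i, j) {
    if (i === p.length) return j === k.length;
    if (p[i] === '#') {
      // '#' may consume zero or more of the remaining words
      for (let skip = j; skip <= k.length; skip++) {
        if (match(i + 1, skip)) return true;
      }
      return false;
    }
    if (j === k.length) return false;
    return (p[i] === '*' || p[i] === k[j]) && match(i + 1, j + 1);
  }

  return match(0, 0);
}
```

With a binding like `order.*`, a consumer receives `order.created` and `order.updated` but not deeper keys such as `order.created.eu`; a binding of `order.#` receives all of them.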
**Example (RabbitMQ Producer):**

```javascript
// packages/order-service/producer.js
const amqp = require('amqplib');

async function publishOrderEvent(order) {
  try {
    const connection = await amqp.connect('amqp://rabbitmq:5672');
    const channel = await connection.createChannel();

    const exchange = 'order_events';
    await channel.assertExchange(exchange, 'topic', { durable: false });

    const routingKey = 'order.created';
    channel.publish(exchange, routingKey, Buffer.from(JSON.stringify(order)));
    console.log(`Published order event: ${JSON.stringify(order)}`);

    // Give the broker time to receive the message before closing.
    setTimeout(() => connection.close(), 500);
  } catch (error) {
    console.error('Error publishing message:', error);
  }
}

module.exports = { publishOrderEvent };
```

## 2. API Design and Implementation Standards

### 2.1. RESTful Principles

**Standard:** Adhere to RESTful principles for designing APIs.

**Why:** Promotes consistency, discoverability, and interoperability.

**Do This:**

* Use standard HTTP methods (GET, POST, PUT, DELETE, PATCH).
* Use nouns instead of verbs in endpoint paths (e.g., `/products` instead of `/getProducts`).
* Return appropriate HTTP status codes to indicate success or failure.
* Implement HATEOAS (Hypermedia as the Engine of Application State) where applicable.

**Don't Do This:**

* Create overly complex or inconsistent endpoint structures.
* Use HTTP methods incorrectly (e.g., using GET to create resources).
* Ignore HTTP status codes.
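HATEOAS, listed above, means each representation carries links to related resources so clients can navigate the API instead of hard-coding URL patterns. A minimal sketch (the `_links` shape loosely follows HAL conventions; the link relations and helper name here are illustrative):

```javascript
// Hypothetical helper: decorate a product representation with hypermedia links.
function withLinks(product, baseUrl) {
  return {
    ...product,
    _links: {
      self: { href: `${baseUrl}/products/${product.id}` },
      reviews: { href: `${baseUrl}/products/${product.id}/reviews` },
    },
  };
}

const rep = withLinks({ id: 7, name: 'Product A' }, 'https://api.example.com');
```

A client can then follow `rep._links.reviews.href` to fetch related data without knowing the server's URL layout in advance.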
**Example (RESTful API Endpoint):**

```javascript
// packages/product-service/controllers/product-controller.js
const express = require('express');
const router = express.Router();
const productService = require('../services/product-service');

// GET /products
router.get('/', async (req, res) => {
  try {
    const products = await productService.getAllProducts();
    res.status(200).json(products);
  } catch (error) {
    console.error('Error fetching products:', error);
    res.status(500).json({ error: 'Failed to fetch products' });
  }
});

// GET /products/:id
router.get('/:id', async (req, res) => {
  try {
    const productId = req.params.id;
    const product = await productService.getProductById(productId);
    if (!product) {
      return res.status(404).json({ error: 'Product not found' });
    }
    res.status(200).json(product);
  } catch (error) {
    console.error('Error fetching product:', error);
    res.status(500).json({ error: 'Failed to fetch product' });
  }
});

module.exports = router;
```

### 2.2. API Versioning

**Standard:** Implement API versioning using the URI path, headers, or content negotiation.

**Why:** Enables backwards compatibility and allows for updates without breaking existing clients.

**Do This:**

* Use a versioning strategy that is consistent across all APIs.
* Clearly document the versioning strategy and the differences between versions.
* Deprecate older versions gracefully.

**Don't Do This:**

* Make breaking changes without introducing a new API version.
* Support multiple versions indefinitely.
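Besides path-based versioning, a version can be negotiated from a request header. A rough sketch of that approach, assuming a custom `Accept-Version` header and hypothetical per-version handlers (the field renames between versions are invented for illustration):

```javascript
// Hypothetical per-version handlers: v2 renames fields relative to v1.
const handlers = {
  '1': (id) => ({ id, name: 'Product A' }),
  '2': (id) => ({ id, title: 'Product A', currency: 'USD' }),
};

// Pick the requested version, falling back to a default for unknown values.
function resolveVersion(headers, fallback = '1') {
  const requested = headers['accept-version'];
  return handlers[requested] ? requested : fallback;
}

const version = resolveVersion({ 'accept-version': '2' });
const product = handlers[version](42);
```

Header-based negotiation keeps URLs stable across versions, at the cost of being less visible in logs and browser tooling than a `/api/v1/...` path.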
**Example (URI Path Versioning):**

```javascript
// packages/product-service/routes/v1/product-routes.js
const express = require('express');
const router = express.Router();
const productController = require('../../controllers/product-controller');

router.get('/', productController.getAllProducts);
router.get('/:id', productController.getProductById);

module.exports = router;

// packages/product-service/index.js
const express = require('express');
const app = express();
const v1Routes = require('./routes/v1/product-routes');

app.use('/api/v1/products', v1Routes);

app.listen(3000, () => {
  console.log('Product service listening on port 3000');
});
```

### 2.3. Data Serialization

**Standard:** Use JSON for data serialization.

**Why:** JSON is widely supported, human-readable, and efficient for data transfer.

**Do This:** Serialize all API requests and responses using the JSON format. Use a consistent naming convention (e.g., camelCase or snake_case).

**Don't Do This:** Use XML or other less common data formats unless there is a specific requirement.

**Example (JSON Serialization):**

```javascript
// packages/product-service/services/product-service.js
async function getAllProducts() {
  return [
    { id: 1, name: 'Product A', price: 29.99 },
    { id: 2, name: 'Product B', price: 49.99 },
  ];
}
```

### 2.4. Error Handling

**Standard:** Implement consistent error handling and reporting.

**Why:** Provides clear and informative error messages to clients and enables effective debugging.

**Do This:**

* Return standard HTTP error codes with descriptive error messages.
* Include error codes and messages.
* Log errors server-side.

**Don't Do This:**

* Return generic error messages that provide no useful information.
* Expose sensitive information in error messages.
* Swallow errors without logging them.
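One way to make the status-code and error-code convention concrete is a small custom error type that error-handling middleware can read `statusCode` and `errorCode` from. The class name and fields below are illustrative, not a prescribed API:

```javascript
// Hypothetical application error carrying an HTTP status and a stable error code.
class AppError extends Error {
  constructor(message, statusCode = 500, errorCode = 'INTERNAL_ERROR') {
    super(message);
    this.statusCode = statusCode;
    this.errorCode = errorCode;
  }
}

// Services throw rich errors; middleware maps them to HTTP responses.
const notFound = new AppError('Product not found', 404, 'PRODUCT_NOT_FOUND');
```

Throwing `AppError` from service code lets a single middleware translate domain failures into consistent HTTP responses, while unexpected errors default to a generic 500.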
**Example (Error Handling):**

```javascript
// packages/product-service/middleware/error-handler.js
function errorHandler(err, req, res, next) {
  console.error(err.stack);

  const statusCode = err.statusCode || 500;
  const message = err.message || 'Internal Server Error';
  const errorCode = err.errorCode || 'INTERNAL_ERROR';

  res.status(statusCode).json({
    error: {
      code: errorCode,
      message: message,
    },
  });
}

module.exports = errorHandler;
```

## 3. Security Standards

### 3.1. Authentication and Authorization

**Standard:** Implement robust authentication and authorization mechanisms.

**Why:** Protects APIs from unauthorized access and data breaches.

**Do This:**

* Use industry-standard authentication protocols like OAuth 2.0 or JWT.
* Implement role-based access control (RBAC) to restrict access to sensitive resources.
* Validate all incoming requests for proper authentication and authorization.

**Don't Do This:**

* Store passwords in plain text.
* Grant excessive permissions to users or services.
* Bypass authentication or authorization checks.

**Example (JWT Authentication):**

```javascript
// packages/auth-service/middleware/auth-middleware.js
const jwt = require('jsonwebtoken');

function authenticateToken(req, res, next) {
  const authHeader = req.headers['authorization'];
  const token = authHeader && authHeader.split(' ')[1];

  if (token == null) {
    return res.sendStatus(401); // Unauthorized
  }

  jwt.verify(token, process.env.JWT_SECRET, (err, user) => {
    if (err) {
      return res.sendStatus(403); // Forbidden
    }
    req.user = user;
    next();
  });
}

module.exports = authenticateToken;
```

### 3.2. Input Validation

**Standard:** Validate all incoming data to prevent injection attacks and other security vulnerabilities.

**Why:** Protects APIs from malicious input that could compromise the system.

**Do This:**

* Use a validation library (e.g., Joi, express-validator) to define and enforce input validation rules.
* Sanitize user input to remove potentially harmful characters.
* Reject requests with invalid input.

**Don't Do This:**

* Trust user input without validation.
* Rely solely on client-side validation.

**Example (Input Validation with Joi):**

```javascript
// packages/product-service/middleware/validation-middleware.js
const Joi = require('joi');

const productSchema = Joi.object({
  name: Joi.string().required(),
  price: Joi.number().positive().required(),
  description: Joi.string().max(200),
});

function validateProduct(req, res, next) {
  const { error } = productSchema.validate(req.body);
  if (error) {
    return res.status(400).json({ error: error.details[0].message });
  }
  next();
}

module.exports = validateProduct;
```

### 3.3. Rate Limiting

**Standard:** Implement rate limiting to prevent abuse and protect APIs from denial-of-service attacks.

**Why:** Ensures fair usage of APIs and protects against malicious traffic. This is essential in Monorepos with shared deployments.

**Do This:**

* Implement rate limiting at the API Gateway level.
* Use a rate limiting algorithm (e.g., Token Bucket, Leaky Bucket) to control the number of requests per user or IP address.
* Return a 429 (Too Many Requests) error code when the rate limit is exceeded.

**Don't Do This:**

* Allow unlimited access to APIs without rate limiting.
* Use overly restrictive rate limits that impact legitimate users.

**Example (Rate Limiting with Redis):**

```javascript
// packages/api-gateway/middleware/rate-limit-middleware.js
const redis = require('redis');
const { RateLimiterRedis } = require('rate-limiter-flexible');

const redisClient = redis.createClient({
  host: 'redis',
  port: 6379,
});

const rateLimiter = new RateLimiterRedis({
  storeClient: redisClient,
  keyPrefix: 'rate_limit',
  points: 10, // 10 requests
  duration: 60, // per 60 seconds
});

async function rateLimitMiddleware(req, res, next) {
  try {
    await rateLimiter.consume(req.ip);
    next();
  } catch (rejRes) {
    res.status(429).json({ error: 'Too Many Requests' });
  }
}

module.exports = rateLimitMiddleware;
```

## 4. Performance Optimization

### 4.1. Caching

**Standard:** Implement caching to reduce latency and improve API response times.

**Why:** Reduces the load on backend services and improves the user experience.

**Do This:**

* Use a caching strategy (e.g., HTTP caching, in-memory caching, distributed caching).
* Cache frequently accessed data that doesn't change often.
* Use an external cache like Redis or Memcached for distributed caching.

**Don't Do This:**

* Cache sensitive data without proper security measures.
* Cache data indefinitely without invalidation.

**Example (Redis Caching):**

```javascript
// packages/product-service/services/product-service.js
const redis = require('redis');

const client = redis.createClient({
  host: 'redis',
  port: 6379,
});

async function getProductById(productId) {
  const cacheKey = `product:${productId}`;

  return new Promise((resolve, reject) => {
    client.get(cacheKey, async (err, cachedProduct) => {
      if (err) {
        console.error('Error fetching from cache:', err);
      }
      if (cachedProduct) {
        resolve(JSON.parse(cachedProduct));
        return;
      }

      // Fetch from the database if not cached
      const product = await fetchProductFromDatabase(productId);
      if (product) {
        client.setex(cacheKey, 3600, JSON.stringify(product)); // Cache for 1 hour
        resolve(product);
      } else {
        resolve(null);
      }
    });
  });
}

async function fetchProductFromDatabase(productId) {
  // ... your database logic here
  return { id: productId, name: 'Example Product', price: 19.99 }; // Placeholder
}
```

### 4.2. Connection Pooling

**Standard:** Use connection pooling to reuse database connections and reduce connection overhead.

**Why:** Improves database performance and reduces resource consumption.

**Do This:** Configure connection pooling in database clients.

**Don't Do This:** Create new database connections for every request.
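The idea behind pooling can be sketched independently of any database driver: hand out at most `max` resources and recycle released ones instead of opening a new connection per request. This toy illustration (not production code) omits waiting, validation, and eviction, which real pools provide:

```javascript
// Toy resource pool: hands out up to `max` resources and recycles released ones.
class SimplePool {
  constructor(factory, max = 5) {
    this.factory = factory;
    this.max = max;
    this.created = 0;
    this.idle = [];
  }

  acquire() {
    if (this.idle.length > 0) return this.idle.pop(); // reuse a released resource
    if (this.created < this.max) {
      this.created += 1;
      return this.factory(); // open a new resource, up to the cap
    }
    // Real pools queue the caller here instead of failing immediately.
    throw new Error('Pool exhausted');
  }

  release(resource) {
    this.idle.push(resource);
  }
}

let connections = 0;
const pool = new SimplePool(() => ({ id: ++connections }), 2);

const a = pool.acquire();
pool.release(a);
const b = pool.acquire(); // reuses `a` instead of opening a new connection
```

The `pool` block in the Sequelize configuration that follows plays exactly this role, with `max`/`min` bounding the pool size and `acquire`/`idle` controlling the waiting and eviction this sketch leaves out.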
**Example (Connection Pooling with Sequelize):**

```javascript
// packages/product-service/config/database.js
const { Sequelize } = require('sequelize');

const sequelize = new Sequelize('database', 'user', 'password', {
  host: 'localhost',
  dialect: 'postgres',
  pool: {
    max: 5,
    min: 0,
    acquire: 30000,
    idle: 10000,
  },
});

module.exports = sequelize;
```

### 4.3. Compression

**Standard:** Enable compression to reduce the size of API responses.

**Why:** Reduces network bandwidth usage and improves response times.

**Do This:** Use middleware like `compression` for Express.js to compress API responses.

**Don't Do This:** Skip compression for APIs serving large responses.

**Example (Compression Middleware):**

```javascript
// packages/product-service/index.js
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression()); // Enable gzip compression

// ... other middleware and routes
```

## 5. Monitoring and Logging

### 5.1. API Monitoring

**Standard:** Implement API monitoring to track performance, availability, and errors.

**Why:** Provides visibility into API health and helps identify and resolve issues quickly.

**Do This:**

* Use a monitoring tool (e.g., Prometheus, Grafana, Datadog) to collect and analyze API metrics.
* Monitor key metrics like response time, error rate, and request volume.
* Set up alerts to notify when thresholds are exceeded.

**Don't Do This:**

* Ignore API performance and errors.
* Fail to set up monitoring and alerting.

### 5.2. Logging

**Standard:** Implement comprehensive logging to track API requests, responses, and errors.

**Why:** Provides valuable information for debugging, auditing, and security analysis.

**Do This:**

* Use a logging library (e.g., Winston, Bunyan) to log API events.
* Include relevant information in log messages (e.g., request ID, user ID, timestamp, request details, response details).
* Centralize logs in a log management system (e.g., Elasticsearch, Splunk).
**Don't Do This:**

* Log sensitive information (e.g., passwords, credit card numbers).
* Fail to log errors and exceptions.
* Write logs to local files without a proper log management system.

**Example (Winston Logging):**

```javascript
// packages/product-service/config/logger.js
const winston = require('winston');

const logger = winston.createLogger({
  level: 'info',
  format: winston.format.json(),
  defaultMeta: { service: 'product-service' },
  transports: [
    new winston.transports.Console({
      format: winston.format.simple(),
    }),
    // new winston.transports.File({ filename: 'error.log', level: 'error' }),
    // new winston.transports.File({ filename: 'combined.log' }),
  ],
});

module.exports = logger;

// Usage:
// logger.info('Fetching product details', { productId: 123 });
// logger.error('Failed to fetch product', { productId: 123, error: err.message });
```

## 6. Monorepo-Specific Considerations

Since we are in a monorepo, these API integration standards must address specific concerns:

* **Dependency Management:** Use consistent versioning and dependency management across all packages. Tools like `npm`, `yarn`, or `pnpm` with workspaces are crucial for managing shared dependencies.
* **Code Sharing:** Extract common API integration logic into shared libraries or utility packages that can be reused across multiple services. This avoids code duplication and ensures consistent behavior.

```javascript
// packages/shared-utils/api-client.js
import axios from 'axios';

const apiClient = axios.create({
  baseURL: process.env.API_BASE_URL,
  timeout: 5000,
  headers: {
    'Content-Type': 'application/json',
  },
});

apiClient.interceptors.request.use(
  (config) => {
    // Add authentication headers, logging, etc.
    return config;
  },
  (error) => {
    return Promise.reject(error);
  }
);

// Example usage of the shared API client
export const getProducts = async () => {
  const response = await apiClient.get('/products');
  return response.data;
};
```

* **Build and Deployment:** Optimize build and deployment processes to account for the monorepo structure. Use tools that can identify and build only the packages that have changed. Deployment strategies must consider the impact of changes on other services.
* **Testing:** Implement thorough integration tests within each package, plus end-to-end tests that span multiple packages to verify proper API integration between services.
* **Documentation:** Document API contracts clearly within the Monorepo. Consider using tools like Swagger/OpenAPI to generate API documentation automatically. This documentation should be easily accessible to all teams working within the Monorepo.
* **Inter-Service Communication:** Define clear protocols (gRPC, REST) and data contracts for communication between services within the Monorepo. Use code generation where possible to enforce adherence to these contracts.
* **Service Discovery:** If services within the monorepo need to dynamically discover each other's locations (e.g., in a microservices architecture), integrate a service discovery mechanism like Consul or etcd so that a service can look up the addresses of internal services.

By adhering to these API integration standards, we can ensure that our Monorepo remains maintainable, scalable, and secure as it evolves. This documented agreement will help developers create better code, and help AI assistants suggest the right implementation for API integration tasks.