# API Integration Standards for Supabase
This document outlines the coding standards and best practices for integrating Supabase with backend services and external APIs. It focuses on maintainability, performance, security, and leveraging the latest Supabase features. This guide is intended for developers working with Supabase and aims to establish a common understanding of how to build robust and scalable applications.
## 1. General Principles
* **Principle of Least Privilege:** Only grant the necessary permissions to API keys or service accounts accessing Supabase. This minimizes the impact of potential security breaches.
* **Idempotency:** Design API integrations to be idempotent: multiple identical requests have the same effect as a single request. This is critical for handling retries safely and ensuring data consistency (see the sketch after this list).
* **Error Handling:** Implement robust error handling to gracefully manage failures in API calls. Log errors with sufficient detail for debugging and provide informative feedback to the user when appropriate.
* **Rate Limiting & Circuit Breakers:** Enforce rate limiting on API requests to prevent abuse and ensure the availability of your Supabase instance and integrated services. Implement circuit breakers to prevent cascading failures when external APIs become unavailable.
* **Observability:** Implement monitoring and logging to track the performance and health of API integrations. Use metrics to identify bottlenecks and proactively address issues.
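As a concrete illustration of idempotency, an upsert keyed on a client-supplied identifier makes retried writes safe. A minimal sketch with supabase-js — the `payments` table and `idempotency_key` column are hypothetical and assume a unique constraint on `idempotency_key`:
```javascript
const { createClient } = require('@supabase/supabase-js');

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_SERVICE_ROLE_KEY);

// Retrying this call is safe: `onConflict` turns a duplicate insert into an
// update of the same row instead of creating a second row.
async function recordPayment(requestId, amount) {
  const { error } = await supabase
    .from('payments') // hypothetical table
    .upsert({ idempotency_key: requestId, amount }, { onConflict: 'idempotency_key' });
  if (error) throw error;
}
```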
## 2. Authentication and Authorization
### 2.1. Using JWTs for Secure Communication
**Do This:** Use JWTs (JSON Web Tokens) for securely authenticating and authorizing requests to your Supabase instance from external APIs or backend services.
**Why:** JWTs provide a standardized way to verify the identity of the caller and ensure that they have the necessary permissions to access the requested resources.
**Code Example (Generating a JWT from your backend - Node.js):**
"""javascript
const jwt = require('jsonwebtoken');
// Replace with your Supabase JWT secret
const SUPABASE_JWT_SECRET = process.env.SUPABASE_JWT_SECRET;
// Data to include in the JWT payload
const payload = {
sub: 'service-account', // Subject (e.g., service account ID)
role: 'service_role', // Role (e.g., service_role, authenticated)
custom_claim: 'some-value' //Add custom claims as neeeded
};
// JWT options (expiration time)
const options = {
expiresIn: '1h' // Token expires in 1 hour
};
// Sign the JWT
const token = jwt.sign(payload, SUPABASE_JWT_SECRET, options);
console.log('Generated JWT:', token);
//Example of making a request to supabase using node-fetch and the jwt.
const fetch = require('node-fetch');
async function postData(url = '', data = {}) {
const response = await fetch(url, {
method: 'POST', // *GET, POST, PUT, DELETE, etc.
headers: {
'Content-Type': 'application/json',
'Authorization': "Bearer ${token}" // Include the JWT in the Authorization header
},
body: JSON.stringify(data) // body data type must match "Content-Type" header
});
return await response.json(); // parses JSON response into native JavaScript objects
}
postData("${process.env.SUPABASE_URL}/rest/v1/your_table", { key1: 'value1', key2: 'value2' })
.then(data => {
console.log(data); // JSON data parsed by "response.json()"
});
"""
**Important notes on Node.js (and similar backend) implementations:**
* **Secret Management:** NEVER hardcode your `SUPABASE_JWT_SECRET` directly into your code. Always use environment variables, and manage secrets with a dedicated secret management solution (e.g., HashiCorp Vault, AWS Secrets Manager, GCP Secret Manager).
* **Token Expiration:** Choose an appropriate expiration time for your JWTs. Shorter expiration times improve security but may require more frequent token refreshes; longer expiration times reduce the need for refreshes but widen the window of opportunity for compromised tokens. Balance this trade-off carefully by adjusting `expiresIn` in the options object.
* **Role-Based Access Control (RBAC):** Leverage Supabase's RBAC features to define fine-grained permissions based on the `role` claim in the JWT. Refer to the Supabase documentation for details on setting up RBAC rules.
* **Custom Claims:** Use custom claims to carry information that the default role claim cannot express; RLS policies can then read them, as in the sketch below.
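For example, an RLS policy can read a custom claim through the `auth.jwt()` helper. A hedged sketch — the `orders` table is hypothetical, and the claim name mirrors the payload above:
```sql
-- Allow reads only when the JWT carries the expected custom claim
create policy "Tokens with the custom claim can read orders"
on orders
for select
to authenticated
using ( (select auth.jwt()) ->> 'custom_claim' = 'some-value' );
```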
**Don't Do This:**
* Hardcoding your Supabase JWT secret in your code.
* Using the `anon` key for backend-to-Supabase communication. The `anon` key is intended for public, client-side use and only receives the access your RLS policies grant to anonymous users; trusted backends should authenticate with a signed JWT or the `service_role` key as appropriate.
* Storing sensitive information directly in JWTs. JWTs are signed, not encrypted, so anyone holding a token can decode its payload (see the sketch below).
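To see why, note that `jwt.decode()` from the same `jsonwebtoken` package reads a token's payload without any secret:
```javascript
// Decoding requires no secret — only *verification* does.
const decoded = jwt.decode(token);
console.log(decoded); // { sub: 'service-account', role: 'service_role', custom_claim: 'some-value', ... }
```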
### 2.2. Service Role Credentials
**Do This:** When server-side code in a trusted environment interacts with Supabase without user context (e.g., batch processing, scheduled tasks), use the `service_role` key.
**Why:** The `service_role` key bypasses Row Level Security (RLS), allowing full database access. This is necessary for many backend operations, but it *must* be used with extreme caution.
**Code Example (using service_role in Node.js):**
"""javascript
import { createClient } from '@supabase/supabase-js'
// Initialize Supabase client with service_role key
const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_SERVICE_ROLE_KEY, {
auth: {
autoRefreshToken: false,
persistSession: false
}
})
async function runBackendTask() {
try {
// Example: Insert data bypassing RLS
const { data, error } = await supabase
.from('my_table')
.insert([{ column1: 'value1', column2: 'value2' }])
if (error) {
console.error('Error inserting data:', error)
} else {
console.log('Data inserted successfully:', data)
}
} catch (error) {
console.error('An unexpected error occurred:', error)
}
}
runBackendTask();
"""
**Don't Do This:**
* Use the `service_role` key in client-side code. This would expose it to unauthorized users and grant them unrestricted access to your database.
* Store the `service_role` key directly in your code. Use environment variables or a secure secret management system.
* Grant the `service_role` key to services or applications that don't require full database access. Consider creating a dedicated Postgres role with more limited permissions (see the sketch below).
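A hedged sketch of such a narrower role using plain Postgres grants — the role and table names are illustrative:
```sql
-- A dedicated role that can only read one table, instead of service_role's full access
create role reporting_service nologin;
grant usage on schema public to reporting_service;
grant select on public.my_table to reporting_service;
```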
### 2.3. Row Level Security (RLS) Integration
**Do This:** Design your database schema and RLS policies to work seamlessly with your API integrations.
**Why:** RLS ensures that users can only access the data they are authorized to see, even when accessing the database through an API. Enforcing RLS is critical for preventing data breaches and protecting sensitive information.
**Code Example (Supabase SQL Editor - RLS Policy):**
"""sql
-- Enable RLS on the 'profiles' table
ALTER TABLE profiles ENABLE ROW LEVEL SECURITY;
-- Create a policy that allows users to select their own profile
CREATE POLICY "Users can select their own profile."
ON profiles
FOR SELECT
USING (auth.uid() = id);
-- Create a policy that allows users to update their own profile
CREATE POLICY "Users can update their own profile."
ON profiles
FOR UPDATE
USING (auth.uid() = id)
WITH CHECK (auth.uid() = id);
"""
**Explanation:**
* `ALTER TABLE profiles ENABLE ROW LEVEL SECURITY;` enables RLS for the `profiles` table.
* `CREATE POLICY ... ON profiles ...` defines a policy that restricts access to the `profiles` table.
* `auth.uid()` is a Supabase function that returns the current user's ID.
* `USING (auth.uid() = id)` specifies the condition a row must satisfy to be selectable (here, the user's ID must match the `id` column in the `profiles` table).
* `WITH CHECK (auth.uid() = id)` specifies an additional condition that must be met when updating a row (here, users can only update their own profile).
**Don't Do This:**
* Disable RLS entirely for API integrations.
* Create overly permissive RLS policies that grant unnecessary access to data.
* Fail to test your RLS policies thoroughly to ensure they are working as expected (see the sketch below for one way to exercise them).
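One way to test a policy from the SQL editor is to impersonate a user inside a transaction. A hedged sketch, relying on `auth.uid()` reading the `sub` field of `request.jwt.claims` (replace the placeholder UUID with a real user ID):
```sql
begin;
-- Pretend to be an authenticated user for the rest of this transaction
set local role authenticated;
select set_config('request.jwt.claims', '{"sub": "00000000-0000-0000-0000-000000000000"}', true);

-- Should return only the impersonated user's row
select * from profiles;
rollback;
```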
## 3. Function-as-a-Service (FaaS) Integration
### 3.1. Using Supabase Edge Functions
**Do This:** For simple API integrations that require server-side logic, consider using Supabase Edge Functions.
**Why:** Edge Functions allow you to execute code close to your users, reducing latency and improving performance. They also simplify deployment and management compared to traditional backend services. Supabase handles the infrastructure and scaling for you.
**Code Example (Supabase Edge Function - TypeScript):**
"""typescript
// Supabase Edge Function (e.g., "supabase/functions/hello-world/index.ts")
import { serve } from 'https://deno.land/std@0.168.0/http/server.ts'
serve(async (req) => {
const { name } = await req.json()
const data = {
message: "Hello, ${name}! This is Supabase Edge Functions.",
}
return new Response(
JSON.stringify(data),
{ headers: { 'Content-Type': 'application/json' } }
)
})
"""
**Deployment:** Deploy this function via the Supabase CLI: `supabase functions deploy hello-world`.
**Explanation:**
* The function reads a `name` parameter from the JSON request body.
* It constructs a JSON response with a greeting message.
* It sets the `Content-Type` header to `application/json`.
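Once deployed, the function can be called from a client with the supabase-js `functions.invoke` helper — a short sketch, assuming a configured `supabase` client:
```javascript
const { data, error } = await supabase.functions.invoke('hello-world', {
  body: { name: 'World' },
});
if (!error) console.log(data.message); // "Hello, World! This is Supabase Edge Functions."
```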
**Example of making a request from within another Edge Function, using the Supabase client:**
```typescript
import { createClient } from 'npm:@supabase/supabase-js@2';

const supabaseUrl = Deno.env.get('SUPABASE_URL') ?? '';
const supabaseKey = Deno.env.get('SUPABASE_ANON_KEY') ?? '';
const supabase = createClient(supabaseUrl, supabaseKey, {
  global: {
    headers: { Authorization: `Bearer ${supabaseKey}` },
  },
});

Deno.serve(async (_req) => {
  const { data, error } = await supabase.from('your_table').select('*');

  if (error) {
    console.error(error);
    return new Response(JSON.stringify({ error: error.message }), {
      status: 500,
      headers: { 'Content-Type': 'application/json' },
    });
  }

  return new Response(JSON.stringify({ data }), {
    headers: { 'Content-Type': 'application/json' },
  });
});
```
**Don't Do This:**
* Use Edge Functions for long-running or computationally intensive tasks. Edge Functions have execution time limits.
* Store sensitive data directly in Edge Functions. Use environment variables or a secure secret management system (for Edge Functions, secrets can be set with `supabase secrets set`).
* Over-complicate your Edge Functions. Keep them focused on specific tasks.
### 3.2. Integrating with External FaaS Providers (e.g., AWS Lambda, Google Cloud Functions)
**Do This:** If you require more complex server-side logic or need to integrate with existing FaaS infrastructure, integrate Supabase with external FaaS providers.
**Why:** External FaaS providers (e.g., AWS Lambda, Google Cloud Functions, Azure Functions) offer greater flexibility and scalability for complex server-side tasks.
**Implementation:**
1. **Create a FaaS Function:** Develop your function in your chosen FaaS provider's environment (e.g., Node.js for AWS Lambda).
2. **Secure Credentials:** Securely store your Supabase credentials (URL, service_role key) in the FaaS environment variables.
3. **Invoke the Function:** Call the FaaS function from your Supabase client-side application or from a Supabase Edge Function.
**Code Example (AWS Lambda - Node.js):**
"""javascript
// AWS Lambda function
const { createClient } = require('@supabase/supabase-js');
exports.handler = async (event) => {
const supabaseUrl = process.env.SUPABASE_URL;
const supabaseKey = process.env.SUPABASE_SERVICE_ROLE_KEY;
const supabase = createClient(supabaseUrl, supabaseKey);
try {
// Example: Insert data into Supabase
const { data, error } = await supabase
.from('my_table')
.insert([{ column1: event.column1, column2: event.column2 }]);
if (error) {
console.error('Error inserting data:', error);
return { statusCode: 500, body: JSON.stringify({ error: error.message }) };
}
return { statusCode: 200, body: JSON.stringify({ data }) };
} catch (error) {
console.error('An unexpected error occurred:', error);
return { statusCode: 500, body: JSON.stringify({ error: error.message }) };
}
};
"""
**Don't Do This:**
* Expose your Supabase credentials directly in your client-side code.
* Create overly complex FaaS functions. Keep them focused on specific tasks.
* Fail to monitor and log your FaaS functions to ensure they are performing as expected.
## 4. Database Webhooks
### 4.1. Triggering Webhooks on Database Events
**Do This:** Use Supabase Database Webhooks to trigger actions in external systems when data changes in your Supabase database.
**Why:** Webhooks provide a real-time mechanism for integrating Supabase with other applications and services. They allow you to react to database events (e.g., inserts, updates, deletes) in a timely manner.
**Implementation:**
1. **Configure Webhooks:** Configure webhooks in the Supabase Dashboard (Database → Webhooks). Specify the target URL and the events that should trigger the webhook.
2. **Handle Webhook Payloads:** Implement the backend logic to handle the webhook payloads. The payload will contain information about the database event that triggered the webhook.
**Code Example (SQL trigger):** Database Webhooks are implemented as triggers that call `supabase_functions.http_request`. A hedged sketch of the kind of trigger the Dashboard generates — verify the exact generated code against your own project:
```sql
create trigger my_webhook
after insert or update or delete on public.my_table
for each row
execute function supabase_functions.http_request(
  'https://example.com/webhook-endpoint',
  'POST',
  '{"Content-Type":"application/json"}',
  '{}',
  '1000'
);
```
**Code Example (Handling Webhook Payload - Node.js with Express):**
```javascript
const express = require('express');
const app = express();
const port = 3000;

// Middleware to parse JSON request bodies
app.use(express.json());

app.post('/webhook-endpoint', (req, res) => {
  const webhookPayload = req.body;
  console.log('Received webhook payload:', webhookPayload);

  // Process the webhook payload here (e.g., update an external system)
  // Important: verify the webhook signature for security (see Section 4.2)
  res.status(200).send('Webhook received successfully');
});

app.listen(port, () => {
  console.log(`Webhook handler listening on port ${port}`);
});
```
**Don't Do This:**
* Create overly complex webhook handlers. Keep them focused on specific tasks.
* Fail to handle errors gracefully in your webhook handlers.
* Expose sensitive data in your webhook payloads.
* Forget to verify the webhook signature (see Section 4.2).
### 4.2. Securing Webhooks
**Do This:** Implement security measures to protect your webhooks from unauthorized access.
**Why:** Webhooks can be a potential security vulnerability if not properly secured. Attackers could potentially trigger malicious actions by sending fake webhook requests. Here are a few important aspects:
* **Verify Webhook Signatures:** Use a shared secret to sign your webhook requests. Verify the signature on the receiving end to ensure that the request is legitimate. Supabase allows you to define a secret when configuring the webhook, which can be used to generate a HMAC (Hash-based Message Authentication Code) signature.
* **Use HTTPS:** Always use HTTPS for your webhook endpoints to encrypt the data in transit.
* **Rate Limiting:** Implement rate limiting on your webhook endpoints to prevent denial-of-service attacks.
* **Input Validation:** Validate the data in your webhook payloads to prevent injection attacks.
**Code Example (Verifying Webhook Signature - Node.js):**
"""javascript
const crypto = require('crypto');
function verifyWebhookSignature(req, secret) {
const signature = req.headers['x-supabase-signature'];
const body = JSON.stringify(req.body); // Important: needs to be stringified
console.log(req.body)
const hmac = crypto.createHmac('sha256', secret);
hmac.update(body);
const expectedSignature = hmac.digest('hex');
return signature === expectedSignature;
}
app.post('/webhook-endpoint', (req, res) => {
const secret = process.env.WEBHOOK_SECRET; // Store your secret securely!
if (!verifyWebhookSignature(req, secret)) {
console.warn('Unverified request!')
return res.status(401).send('Unauthorized');
}
const webhookPayload = req.body;
// ... process the webhook payload
});
"""
## 5. Realtime API
### 5.1. Leveraging Supabase Realtime Subscriptions
**Do This:** Utilize Supabase's Realtime API for building real-time features such as live updates, chat applications, and collaborative editing.
**Why:** Supabase Realtime provides a scalable and reliable way to push data changes to clients in real-time. It eliminates the need for polling and reduces latency, resulting in a more responsive user experience.
**Code Example (Client-Side Subscription - JavaScript):**
"""javascript
import { createClient } from '@supabase/supabase-js'
const supabaseUrl = 'YOUR_SUPABASE_URL'
const supabaseKey = 'YOUR_SUPABASE_ANON_KEY'
const supabase = createClient(supabaseUrl, supabaseKey)
supabase
.channel('any')
.on('postgres_changes', { event: '*', schema: 'public', table: 'messages' }, payload => {
console.log('Change received!', payload)
// Update your UI with the new data
})
.subscribe()
"""
**Explanation:**
* `supabase.channel('any')` creates a realtime channel.
* `.on('postgres_changes', ...)` subscribes to database changes on the specified schema and table.
* `event: '*'` subscribes to all events (insert, update, delete). You can specify individual events if desired.
* `payload` contains the data that was changed in the database.
**Don't Do This:**
* Over-subscribe to realtime events. Only subscribe to the events you need.
* Perform computationally intensive tasks in your realtime event handlers. Offload these tasks to a backend service or Edge Function.
* Expose sensitive data in your realtime payloads.
### 5.2. Scaling Realtime Applications
**Do This:** Design your realtime applications to be horizontally scalable.
**Why:** As your user base grows, you will need to scale your realtime infrastructure to handle the increased load.
**Strategies for Scaling:**
* **Connection Pooling:** Use a connection pool to manage database connections more efficiently.
* **Load Balancing:** Distribute realtime connections across multiple Supabase Realtime servers.
* **Caching:** Cache frequently accessed data to reduce database load.
* **Database Optimization:** Optimize your database queries to improve performance.
**Code Example (using connection pooling in Node.js):**
"""javascript
const { Pool } = require('pg');
const pool = new Pool({
user: process.env.DB_USER,
host: process.env.DB_HOST,
database: process.env.DB_NAME,
password: process.env.DB_PASSWORD,
port: process.env.DB_PORT,
max: 20, // Maximum number of connections in the pool
idleTimeoutMillis: 30000, // How long a client is allowed to remain idle before being closed
connectionTimeoutMillis: 2000, // How long to wait for a connection before timing out
});
//Use pool to make db queries
pool.query('SELECT NOW()', (err, res) => {
if (err) {
console.error('Error executing query', err);
} else {
console.log('Connected to the database');
}
});
//Example of querying within asynchronous function
async function testQuery(){
try {
const client = await pool.connect()
const result = await client.query('SELECT * FROM your_table');
client.release() //Return the client back to the pool!
return result.rows;
} catch (err) {
console.error(err);
}
}
"""
## 6. Testing
* **Unit Tests:** Write unit tests for individual components of your API integrations. This will help you to identify and fix bugs early in the development process.
* **Integration Tests:** Write integration tests to verify that your API integrations work correctly with other applications and services (see the sketch after this list).
* **End-to-End Tests:** Write end-to-end tests to verify that your entire system is working correctly, including the client-side application, the backend services, and the database.
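As one concrete example, an integration test can assert that RLS actually hides data from anonymous clients. A hedged Jest sketch, reusing the `profiles` table from Section 2.3 against a disposable project or a local `supabase start` instance:
```javascript
const { createClient } = require('@supabase/supabase-js');

test('anon client cannot read RLS-protected profiles', async () => {
  const anon = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_ANON_KEY);
  const { data, error } = await anon.from('profiles').select('*');

  // With the policies from Section 2.3, an unauthenticated request
  // should see no rows at all (RLS filters them out silently).
  expect(error).toBeNull();
  expect(data).toHaveLength(0);
});
```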
## 7. Documentation
* **API Documentation:** Document your API endpoints using a standard format such as OpenAPI (Swagger).
* **Code Comments:** Add comments to your code to explain what it does.
* **README Files:** Create README files for your projects to explain how to set up and run them.
* **Architecture Diagrams:** Use visual aids to explain complicated workflows.
## 8. Monitoring and Logging
* **Centralized Logging:** Use a centralized logging system to collect logs from all of your applications and services.
* **Metrics:** Collect metrics to track the performance of your API integrations.
* **Alerting:** Set up alerts to notify you when errors occur or when performance degrades. Tools like Prometheus and Grafana can be integrated for efficient metrics and monitoring.
This document provides a comprehensive overview of the coding standards and best practices for API integration with Supabase. By following these guidelines, you can build robust, scalable, and secure applications that leverage the power of Supabase. Stay up to date with the latest Supabase features and best practices by regularly consulting the official Supabase documentation and community resources, and check which CLI version you are running with `supabase --version`.
# Using .clinerules with Cline
This guide explains how to effectively use `.clinerules` with Cline, the AI-powered coding assistant. The `.clinerules` file is a powerful configuration file that helps Cline understand your project's requirements, coding standards, and constraints. When placed in your project's root directory, it automatically guides Cline's behavior and ensures consistency across your codebase.
Place the `.clinerules` file in your project's root directory. Cline automatically detects and follows these rules for all files within the project.
```yaml
# Project Overview
project:
  name: 'Your Project Name'
  description: 'Brief project description'
  stack:
    - technology: 'Framework/Language'
      version: 'X.Y.Z'
    - technology: 'Database'
      version: 'X.Y.Z'

# Code Standards
standards:
  style:
    - 'Use consistent indentation (2 spaces)'
    - 'Follow language-specific naming conventions'
  documentation:
    - 'Include JSDoc comments for all functions'
    - 'Maintain up-to-date README files'
  testing:
    - 'Write unit tests for all new features'
    - 'Maintain minimum 80% code coverage'

# Security Guidelines
security:
  authentication:
    - 'Implement proper token validation'
    - 'Use environment variables for secrets'
  dataProtection:
    - 'Sanitize all user inputs'
    - 'Implement proper error handling'
```
Tips for writing effective rules: be specific, maintain organization, and update the rules regularly.
```yaml
# Common Patterns Example
patterns:
  components:
    - pattern: 'Use functional components by default'
    - pattern: 'Implement error boundaries for component trees'
  stateManagement:
    - pattern: 'Use React Query for server state'
    - pattern: 'Implement proper loading states'
```
Commit the `.clinerules` file to version control so the whole team works from the same rules. Common troubleshooting topics include rules not being applied, conflicting rules, and performance considerations.
```yaml
# Basic .clinerules Example
project:
  name: 'Web Application'
  type: 'Next.js Frontend'
standards:
  - 'Use TypeScript for all new code'
  - 'Follow React best practices'
  - 'Implement proper error handling'
testing:
  unit:
    - 'Jest for unit tests'
    - 'React Testing Library for components'
  e2e:
    - 'Cypress for end-to-end testing'
documentation:
  required:
    - 'README.md in each major directory'
    - 'JSDoc comments for public APIs'
    - 'Changelog updates for all changes'
```
```yaml
# Advanced .clinerules Example
project:
  name: 'Enterprise Application'
compliance:
  - 'GDPR requirements'
  - 'WCAG 2.1 AA accessibility'
architecture:
  patterns:
    - 'Clean Architecture principles'
    - 'Domain-Driven Design concepts'
security:
  requirements:
    - 'OAuth 2.0 authentication'
    - 'Rate limiting on all APIs'
    - 'Input validation with Zod'
```
# Database: Create functions

You're a Supabase Postgres expert in writing database functions. Generate **high-quality PostgreSQL functions** that adhere to the following best practices:

## General Guidelines

1. **Default to `SECURITY INVOKER`:**
   - Functions should run with the permissions of the user invoking the function, ensuring safer access control.
   - Use `SECURITY DEFINER` only when explicitly required and explain the rationale.
2. **Set the `search_path` Configuration Parameter:**
   - Always set `search_path` to an empty string (`set search_path = '';`).
   - This avoids unexpected behavior and security risks caused by resolving object references in untrusted or unintended schemas.
   - Use fully qualified names (e.g., `schema_name.table_name`) for all database objects referenced within the function.
3. **Adhere to SQL Standards and Validation:**
   - Ensure all queries within the function are valid PostgreSQL SQL queries and compatible with the specified context (i.e., Supabase).

## Best Practices

1. **Minimize Side Effects:**
   - Prefer functions that return results over those that modify data unless they serve a specific purpose (e.g., triggers).
2. **Use Explicit Typing:**
   - Clearly specify input and output types, avoiding ambiguous or loosely typed parameters.
3. **Default to Immutable or Stable Functions:**
   - Where possible, declare functions as `IMMUTABLE` or `STABLE` to allow better optimization by PostgreSQL. Use `VOLATILE` only if the function modifies data or has side effects.
4. **Triggers (if Applicable):**
   - If the function is used as a trigger, include a valid `CREATE TRIGGER` statement that attaches the function to the desired table and event (e.g., `BEFORE INSERT`).

## Example Templates

### Simple Function with `SECURITY INVOKER`

```sql
create or replace function my_schema.hello_world()
returns text
language plpgsql
security invoker
set search_path = ''
as $$
begin
  return 'hello world';
end;
$$;
```

### Function with Parameters and Fully Qualified Object Names

```sql
create or replace function public.calculate_total_price(order_id bigint)
returns numeric
language plpgsql
security invoker
set search_path = ''
as $$
declare
  total numeric;
begin
  select sum(price * quantity)
  into total
  from public.order_items
  where order_id = calculate_total_price.order_id;

  return total;
end;
$$;
```

### Function as a Trigger

```sql
create or replace function my_schema.update_updated_at()
returns trigger
language plpgsql
security invoker
set search_path = ''
as $$
begin
  -- Update the "updated_at" column on row modification
  new.updated_at := now();
  return new;
end;
$$;

create trigger update_updated_at_trigger
before update on my_schema.my_table
for each row
execute function my_schema.update_updated_at();
```

### Function with Error Handling

```sql
create or replace function my_schema.safe_divide(numerator numeric, denominator numeric)
returns numeric
language plpgsql
security invoker
set search_path = ''
as $$
begin
  if denominator = 0 then
    raise exception 'Division by zero is not allowed';
  end if;

  return numerator / denominator;
end;
$$;
```

### Immutable Function for Better Optimization

```sql
create or replace function my_schema.full_name(first_name text, last_name text)
returns text
language sql
security invoker
set search_path = ''
immutable
as $$
select first_name || ' ' || last_name;
$$;
```
# Database: Create RLS policies

You're a Supabase Postgres expert in writing row level security policies. Your purpose is to generate a policy with the constraints given by the user. You should first retrieve schema information to write policies for, usually the `public` schema.

The output should use the following instructions:

- The generated SQL must be valid SQL.
- You can use only CREATE POLICY or ALTER POLICY queries, no other queries are allowed.
- Always use double apostrophe in SQL strings (eg. 'Night''s watch').
- You can add short explanations to your messages.
- The result should be valid markdown. The SQL code should be wrapped in a fenced code block with the sql language tag.
- Always use `auth.uid()` instead of `current_user`.
- SELECT policies should always have USING but not WITH CHECK.
- INSERT policies should always have WITH CHECK but not USING.
- UPDATE policies should always have WITH CHECK and most often have USING.
- DELETE policies should always have USING but not WITH CHECK.
- Don't use `FOR ALL`. Instead separate into 4 separate policies for select, insert, update, and delete.
- The policy name should be short but detailed text explaining the policy, enclosed in double quotes.
- Always put explanations as separate text. Never use inline SQL comments.
- If the user asks for something that's not related to SQL policies, explain to the user that you can only help with policies.
- Discourage `RESTRICTIVE` policies and encourage `PERMISSIVE` policies, and explain why.

The output should look like this:

```sql
create policy "Users can insert their own books."
on books
for insert
to authenticated
with check ( (select auth.uid()) = author_id );
```

Since you are running in a Supabase environment, take note of these Supabase-specific additions below.

## Authenticated and unauthenticated roles

Supabase maps every request to one of the roles:

- `anon`: an unauthenticated request (the user is not logged in)
- `authenticated`: an authenticated request (the user is logged in)

These are actually [Postgres Roles](/docs/guides/database/postgres/roles). You can use these roles within your Policies using the `TO` clause:

```sql
create policy "Profiles are viewable by everyone"
on profiles
for select
to authenticated, anon
using ( true );

-- OR

create policy "Public profiles are viewable only by authenticated users"
on profiles
for select
to authenticated
using ( true );
```

Note that `for ...` must be added after the table but before the roles. `to ...` must be added after `for ...`:

### Incorrect

```sql
create policy "Public profiles are viewable only by authenticated users"
on profiles
to authenticated
for select
using ( true );
```

### Correct

```sql
create policy "Public profiles are viewable only by authenticated users"
on profiles
for select
to authenticated
using ( true );
```

## Multiple operations

PostgreSQL policies do not support specifying multiple operations in a single FOR clause. You need to create separate policies for each operation.
### Incorrect

```sql
create policy "Profiles can be created and deleted by any user"
on profiles
for insert, delete -- cannot create a policy on multiple operators
to authenticated
with check ( true )
using ( true );
```

### Correct

```sql
create policy "Profiles can be created by any user"
on profiles
for insert
to authenticated
with check ( true );

create policy "Profiles can be deleted by any user"
on profiles
for delete
to authenticated
using ( true );
```

## Helper functions

Supabase provides some helper functions that make it easier to write Policies.

### `auth.uid()`

Returns the ID of the user making the request.

### `auth.jwt()`

Returns the JWT of the user making the request. Anything that you store in the user's `raw_app_meta_data` column or the `raw_user_meta_data` column will be accessible using this function. It's important to know the distinction between these two:

- `raw_user_meta_data` - can be updated by the authenticated user using the `supabase.auth.update()` function. It is not a good place to store authorization data.
- `raw_app_meta_data` - cannot be updated by the user, so it's a good place to store authorization data.

The `auth.jwt()` function is extremely versatile. For example, if you store some team data inside `app_metadata`, you can use it to determine whether a particular user belongs to a team. For example, if this was an array of IDs:

```sql
create policy "User is in team"
on my_table
to authenticated
using ( team_id in (select auth.jwt() -> 'app_metadata' -> 'teams') );
```

### MFA

The `auth.jwt()` function can be used to check for [Multi-Factor Authentication](/docs/guides/auth/auth-mfa#enforce-rules-for-mfa-logins). For example, you could restrict a user from updating their profile unless they have at least 2 levels of authentication (Assurance Level 2):

```sql
create policy "Restrict updates."
on profiles
as restrictive
for update
to authenticated
using ( (select auth.jwt()->>'aal') = 'aal2' );
```

## RLS performance recommendations

Every authorization system has an impact on performance. While row level security is powerful, the performance impact is important to keep in mind. This is especially true for queries that scan every row in a table - like many `select` operations, including those using limit, offset, and ordering.

Based on a series of [tests](https://github.com/GaryAustin1/RLS-Performance), we have a few recommendations for RLS:

### Add indexes

Make sure you've added [indexes](/docs/guides/database/postgres/indexes) on any columns used within the Policies which are not already indexed (or primary keys). For a Policy like this:

```sql
create policy "Users can access their own records"
on test_table
to authenticated
using ( (select auth.uid()) = user_id );
```

You can add an index like:

```sql
create index userid on test_table using btree (user_id);
```

### Call functions with `select`

You can use a `select` statement to improve policies that use functions. For example, instead of this:

```sql
create policy "Users can access their own records"
on test_table
to authenticated
using ( auth.uid() = user_id );
```

You can do:

```sql
create policy "Users can access their own records"
on test_table
to authenticated
using ( (select auth.uid()) = user_id );
```

This method works well for JWT functions like `auth.uid()` and `auth.jwt()` as well as `security definer` Functions. Wrapping the function causes an `initPlan` to be run by the Postgres optimizer, which allows it to "cache" the results per-statement, rather than calling the function on each row.
Caution: You can only use this technique if the results of the query or function do not change based on the row data.

### Minimize joins

You can often rewrite your Policies to avoid joins between the source and the target table. Instead, try to organize your policy to fetch all the relevant data from the target table into an array or set, then you can use an `IN` or `ANY` operation in your filter.

For example, this is an example of a slow policy which joins the source `test_table` to the target `team_user`:

```sql
create policy "Users can access records belonging to their teams"
on test_table
to authenticated
using (
  (select auth.uid()) in (
    select user_id
    from team_user
    where team_user.team_id = team_id -- joins to the source "test_table.team_id"
  )
);
```

We can rewrite this to avoid the join, and instead select the filter criteria into a set:

```sql
create policy "Users can access records belonging to their teams"
on test_table
to authenticated
using (
  team_id in (
    select team_id
    from team_user
    where user_id = (select auth.uid()) -- no join
  )
);
```

### Specify roles in your policies

Always specify the role your policy applies to, using the `TO` operator. For example, instead of this query:

```sql
create policy "Users can access their own records"
on rls_test
using ( auth.uid() = user_id );
```

Use:

```sql
create policy "Users can access their own records"
on rls_test
to authenticated
using ( (select auth.uid()) = user_id );
```

This prevents the policy `( (select auth.uid()) = user_id )` from running for any `anon` users, since the execution stops at the `to authenticated` step.
# Database: Create migration

You are a Postgres Expert who loves creating secure database schemas. This project uses the migrations provided by the Supabase CLI.

## Creating a migration file

Given the context of the user's message, create a database migration file inside the folder `supabase/migrations/`. The file MUST be named in the format `YYYYMMDDHHmmss_short_description.sql` with proper casing for months, minutes, and seconds in UTC time:

1. `YYYY` - Four digits for the year (e.g., `2024`).
2. `MM` - Two digits for the month (01 to 12).
3. `DD` - Two digits for the day of the month (01 to 31).
4. `HH` - Two digits for the hour in 24-hour format (00 to 23).
5. `mm` - Two digits for the minute (00 to 59).
6. `ss` - Two digits for the second (00 to 59).
7. Add an appropriate description for the migration.

For example:

```
20240906123045_create_profiles.sql
```

## SQL Guidelines

Write Postgres-compatible SQL code for Supabase migration files that:

- Includes a header comment with metadata about the migration, such as the purpose, affected tables/columns, and any special considerations.
- Includes thorough comments explaining the purpose and expected behavior of each migration step.
- Writes all SQL in lowercase.
- Adds copious comments for any destructive SQL commands, including truncating, dropping, or column alterations.
- When creating a new table, enables Row Level Security (RLS) even if the table is intended for public access.
- When creating RLS policies:
  - Ensures the policies cover all relevant access scenarios (e.g., select, insert, update, delete) based on the table's purpose and data sensitivity.
  - If the table is intended for public access, the policy can simply return `true`.
  - Keeps RLS policies granular: one policy per operation (`select`, `insert`, etc.) and per Supabase role (`anon` and `authenticated`). DO NOT combine policies even if the functionality is the same for both roles.
- Includes comments explaining the rationale and intended behavior of each security policy.

The generated SQL code should be production-ready, well-documented, and aligned with Supabase's best practices.
# Postgres SQL Style Guide

## General

- Use lowercase for SQL reserved words to maintain consistency and readability.
- Employ consistent, descriptive identifiers for tables, columns, and other database objects.
- Use white space and indentation to enhance the readability of your code.
- Store dates in ISO 8601 format (`yyyy-mm-ddThh:mm:ss.sssss`).
- Include comments for complex logic, using `/* ... */` for block comments and `--` for line comments.

## Naming Conventions

- Avoid SQL reserved words and ensure names are unique and under 63 characters.
- Use snake_case for tables and columns.
- Prefer plurals for table names.
- Prefer singular names for columns.

## Tables

- Avoid prefixes like `tbl_` and ensure no table name matches any of its column names.
- Always add an `id` column of type `identity generated always` unless otherwise specified.
- Create all tables in the `public` schema unless otherwise specified.
- Always add the schema to SQL queries for clarity.
- Always add a comment to describe what the table does. The comment can be up to 1024 characters.

## Columns

- Use singular names and avoid generic names like `id`.
- For references to foreign tables, use the singular of the table name with the `_id` suffix. For example `user_id` to reference the `users` table.
- Always use lowercase except in cases involving acronyms or when readability would be enhanced by an exception.

#### Examples:

```sql
create table books (
  id bigint generated always as identity primary key,
  title text not null,
  author_id bigint references authors (id)
);
comment on table books is 'A list of all the books in the library.';
```

## Queries

- When the query is shorter, keep it on just a few lines. As it gets larger, start adding newlines for readability.
- Add spaces for readability.

Smaller queries:

```sql
select *
from employees
where end_date is null;

update employees
set end_date = '2023-12-31'
where employee_id = 1001;
```

Larger queries:

```sql
select
  first_name,
  last_name
from
  employees
where
  start_date between '2021-01-01' and '2021-12-31'
and
  status = 'employed';
```

### Joins and Subqueries

- Format joins and subqueries for clarity, aligning them with related SQL clauses.
- Prefer full table names when referencing tables. This helps for readability.

```sql
select
  employees.employee_name,
  departments.department_name
from
  employees
join
  departments on employees.department_id = departments.department_id
where
  employees.start_date > '2022-01-01';
```

## Aliases

- Use meaningful aliases that reflect the data or transformation applied, and always include the `as` keyword for clarity.

```sql
select count(*) as total_employees
from employees
where end_date is null;
```

## Complex queries and CTEs

- If a query is extremely complex, prefer a CTE.
- Make sure the CTE is clear and linear. Prefer readability over performance.
- Add comments to each block.

```sql
with department_employees as (
  -- Get all employees and their departments
  select
    employees.department_id,
    employees.first_name,
    employees.last_name,
    departments.department_name
  from
    employees
  join
    departments on employees.department_id = departments.department_id
),
employee_counts as (
  -- Count how many employees in each department
  select
    department_name,
    count(*) as num_employees
  from
    department_employees
  group by
    department_name
)
select
  department_name,
  num_employees
from
  employee_counts
order by
  department_name;
```
# Writing Supabase Edge Functions

You're an expert in writing TypeScript and the Deno JavaScript runtime. Generate **high-quality Supabase Edge Functions** that adhere to the following best practices:

## Guidelines

1. Try to use Web APIs and Deno's core APIs instead of external dependencies (e.g., use fetch instead of Axios, use the WebSockets API instead of node-ws).
2. If you are reusing utility methods between Edge Functions, add them to `supabase/functions/_shared` and import using a relative path. Do NOT have cross dependencies between Edge Functions.
3. Do NOT use bare specifiers when importing dependencies. If you need to use an external dependency, make sure it's prefixed with either `npm:` or `jsr:`. For example, `@supabase/supabase-js` should be written as `npm:@supabase/supabase-js`.
4. For external imports, always define a version. For example, `npm:express` should be written as `npm:express@4.18.2`.
5. For external dependencies, importing via `npm:` and `jsr:` is preferred. Minimize the use of imports from `deno.land/x`, `esm.sh`, and `unpkg.com`. If you have a package from one of those CDNs, you can replace the CDN hostname with the `npm:` specifier.
6. You can also use Node built-in APIs. You will need to import them using the `node:` specifier. For example, to import Node process: `import process from "node:process"`. Use Node APIs when you find gaps in Deno APIs.
7. Do NOT use `import { serve } from "https://deno.land/std@0.168.0/http/server.ts"`. Instead use the built-in `Deno.serve`.
8. The following environment variables (i.e., secrets) are pre-populated in both local and hosted Supabase environments. Users don't need to manually set them:
   - SUPABASE_URL
   - SUPABASE_ANON_KEY
   - SUPABASE_SERVICE_ROLE_KEY
   - SUPABASE_DB_URL
9. To set other environment variables (i.e., secrets), users can put them in an env file and run `supabase secrets set --env-file path/to/env-file`.
10. A single Edge Function can handle multiple routes. It is recommended to use a library like Express or Hono to handle the routes as it's easier for developers to understand and maintain. Each route must be prefixed with `/function-name` so they are routed correctly.
11. File write operations are ONLY permitted on the `/tmp` directory. You can use either Deno or Node File APIs.
12. Use the `EdgeRuntime.waitUntil(promise)` static method to run long-running tasks in the background without blocking response to a request. Do NOT assume it is available in the request / execution context.
## Example Templates

### Simple Hello World Function

```tsx
interface reqPayload {
  name: string
}

console.info('server started')

Deno.serve(async (req: Request) => {
  const { name }: reqPayload = await req.json()
  const data = {
    message: `Hello ${name} from foo!`,
  }

  return new Response(JSON.stringify(data), {
    headers: { 'Content-Type': 'application/json', Connection: 'keep-alive' },
  })
})
```

### Example Function using Node built-in API

```tsx
import { randomBytes } from 'node:crypto'
import { createServer } from 'node:http'
import process from 'node:process'

const generateRandomString = (length) => {
  const buffer = randomBytes(length)
  return buffer.toString('hex')
}

const randomString = generateRandomString(10)
console.log(randomString)

const server = createServer((req, res) => {
  const message = `Hello`
  res.end(message)
})

server.listen(9999)
```

### Using npm packages in Functions

```tsx
import express from 'npm:express@4.18.2'

const app = express()

app.get(/(.*)/, (req, res) => {
  res.send('Welcome to Supabase')
})

app.listen(8000)
```

### Generate embeddings using the built-in Supabase.ai API

```tsx
const model = new Supabase.ai.Session('gte-small')

Deno.serve(async (req: Request) => {
  const params = new URL(req.url).searchParams
  const input = params.get('text')
  const output = await model.run(input, { mean_pool: true, normalize: true })

  return new Response(JSON.stringify(output), {
    headers: {
      'Content-Type': 'application/json',
      Connection: 'keep-alive',
    },
  })
})
```