Supabase has become one of the fastest-growing database platforms for indie developers and startup teams. The combination of PostgreSQL under the hood, a clean dashboard, instant REST APIs, and built-in auth makes it easy to get a backend running quickly.
But once your app is in production and your team grows, the original setup isn't always enough. Your ops manager wants to pull revenue numbers. Your product manager wants to understand feature adoption. Your founder wants to know how many users signed up yesterday by country.
None of them know SQL. And you didn't build an admin panel for those queries.
This guide covers how to query your Supabase database in plain English — including practical SQL examples, the limits of manual approaches, and how AI for Database makes natural language querying available to your whole team.
How Supabase Stores Your Data
Supabase is PostgreSQL with extras: realtime subscriptions, edge functions, object storage, and auth. Your application data lives in standard Postgres tables. Any tool that can connect to a Postgres database over SSL can query your Supabase data.
The built-in Supabase SQL editor is genuinely useful for developers. You can write queries directly in the browser and see results as a table. But it requires knowing SQL, knowing your schema, and having database access — which rules out most of your team.
Supabase also exposes a JavaScript client that lets you query data like this:
```javascript
const { data, error } = await supabase
  .from('orders')
  .select('*, users(name, email)')
  .gte('created_at', '2026-01-01')
  .order('created_at', { ascending: false })
  .limit(100);
```

That works well in application code. It doesn't help your non-technical colleagues who want answers on a Tuesday afternoon.
The Access Gap in Most Supabase Setups
The problem with most Supabase setups is an access gap: the database has all the data, but only one or two people can actually get to it.
Consider a typical scenario: your company is a SaaS product with 40 employees. Revenue, user growth, feature usage, and subscription data all live in Supabase. When the sales lead wants to know "which plan has the highest 90-day retention?", the request goes to a developer, sits in their queue behind actual product work, and comes back as a CSV or a screenshot hours or days later.
This bottleneck is entirely avoidable. The data doesn't need to be hard to reach — you just need the right layer on top of it.
Writing SQL Queries for Common Supabase Use Cases
For developers and analysts who do know SQL, here are the queries you'll write most often against a typical Supabase-hosted SaaS database.
User growth by week:
```sql
SELECT
  DATE_TRUNC('week', created_at) AS week,
  COUNT(*) AS new_users,
  SUM(COUNT(*)) OVER (ORDER BY DATE_TRUNC('week', created_at)) AS cumulative_users
FROM auth.users
GROUP BY week
ORDER BY week;
```

Note: Supabase auth users live in the auth schema (auth.users), not the public schema. Your app likely also has a public.profiles table that mirrors or extends user data — adjust the query to match your setup.
Revenue by plan (assuming a subscriptions table):
```sql
SELECT
  plan_name,
  COUNT(*) AS active_subscribers,
  SUM(monthly_amount) AS total_mrr,
  ROUND(AVG(monthly_amount), 2) AS avg_mrr_per_subscriber
FROM subscriptions
WHERE status = 'active'
GROUP BY plan_name
ORDER BY total_mrr DESC;
```

Feature adoption — how many users have triggered a specific in-app event:
```sql
SELECT
  feature_name,
  COUNT(DISTINCT user_id) AS unique_users,
  COUNT(*) AS total_events,
  ROUND(
    COUNT(DISTINCT user_id) * 100.0 / (
      SELECT COUNT(*) FROM auth.users
      WHERE created_at < NOW() - INTERVAL '7 days'
    ),
  1) AS adoption_pct
FROM user_events
WHERE created_at > NOW() - INTERVAL '30 days'
GROUP BY feature_name
ORDER BY unique_users DESC;
```

These queries are accurate and straightforward — but they require knowing your table names, understanding Supabase-specific schema quirks like the auth schema prefix, and having SQL proficiency plus database credentials.
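The retention question from the earlier scenario can be sketched the same way. This is a simplified example, not a full cohort analysis; it assumes the subscriptions table has started_at, status, and plan_name columns — adjust the names to your schema:

```sql
-- 90-day retention by plan: of subscriptions started more than 90 days ago,
-- what share is still active today?
SELECT
  plan_name,
  COUNT(*) AS cohort_size,
  COUNT(*) FILTER (WHERE status = 'active') AS still_active,
  ROUND(COUNT(*) FILTER (WHERE status = 'active') * 100.0 / COUNT(*), 1) AS retention_pct
FROM subscriptions
WHERE started_at < NOW() - INTERVAL '90 days'
GROUP BY plan_name
ORDER BY retention_pct DESC;
```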
Connecting AI for Database to Your Supabase Instance
AI for Database connects to Supabase the same way any Postgres client does — via the connection string in your Supabase project settings.
Go to your Supabase dashboard → Settings → Database → Connection string. You'll see something like:
```
postgresql://postgres:[password]@db.[project-ref].supabase.co:5432/postgres
```

Paste that into AI for Database when setting up your connection. The platform reads your schema automatically and uses it to understand your data model when translating plain English into SQL.
Once connected, your ops lead can type "Show me new signups by country this week" and get a table. Your product manager can ask "Which features are most used by enterprise customers?" and see a chart. The SQL is written and run automatically in the background.
One thing worth noting: AI for Database handles the auth schema correctly once it reads your Supabase setup — a common friction point when generic tools try to query Supabase. The authentication and profile tables get joined properly.
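A join across the two schemas is how questions like "how many users signed up yesterday by country?" get answered. As a sketch, assuming the common Supabase convention of a public.profiles table keyed on the auth user's uuid, with a country column (adjust to your setup):

```sql
-- New signups yesterday, broken down by country.
-- Joins Supabase's auth.users to a typical public.profiles table.
SELECT
  p.country,
  COUNT(*) AS signups
FROM auth.users u
JOIN public.profiles p ON p.id = u.id
WHERE u.created_at >= CURRENT_DATE - INTERVAL '1 day'
  AND u.created_at < CURRENT_DATE
GROUP BY p.country
ORDER BY signups DESC;
```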
Building a Self-Refreshing Dashboard on Top of Supabase
One of the most practical things you can do with Supabase data and AI for Database is build a dashboard that refreshes itself automatically.
Instead of exporting to CSV every Monday, build dashboards directly from your live database. Typical tiles for a SaaS running on Supabase:

- Weekly signups and cumulative user growth
- MRR and active subscribers by plan
- Feature adoption over the last 30 days
- Retention by plan
Each tile in the dashboard is a saved natural-language query that runs on a schedule. When it's Monday morning, the numbers are already current — no one needs to pull anything manually, no one gets a Slack ping asking for data.
This is meaningfully different from Supabase's built-in dashboard, which shows infrastructure metrics (database size, API request counts). AI for Database shows you business metrics derived from your application data.
Setting Up Alerts on Your Supabase Data
Supabase has realtime subscriptions built in, which are great for application-level event handling. But business-level monitoring — "alert me when signups drop below 20 in a day" — requires a different approach.
With AI for Database's action workflows, you describe the condition in plain English:
"When the count of new users created today falls below 20 by 6pm, send me an email"
The system converts this to a scheduled check against your database:
```sql
SELECT COUNT(*) AS signups_today
FROM auth.users
WHERE created_at >= DATE_TRUNC('day', NOW());
```

When the result meets the threshold condition, the alert fires. No Supabase edge functions to write, no external cron service to set up, no database triggers to manage.
You can set up similar alerts for:

- a spike in subscription cancellations
- daily or weekly revenue dipping below a target
- a key feature's usage dropping week over week
What Natural Language Queries Handle Well (and What They Don't)
Natural language querying works well for:

- counts, sums, averages, and other aggregations
- filtering and grouping ("active subscribers by plan")
- trends over time ("signups by week this quarter")
- ad-hoc questions that would otherwise interrupt a developer

It's less suited for:

- multi-step analyses that depend on business logic not visible in the schema
- queries against ambiguously named tables or columns
- anything that writes or migrates data
For Supabase specifically, the auth.users vs public.profiles distinction can trip up generic tools. AI for Database reads the full schema including the auth schema and handles joins across both correctly.
Row Level Security and Permissions
Supabase's Row Level Security (RLS) policies are enforced at the database level based on which role is connected. This matters for AI for Database.
If you connect using a service_role key or a Postgres superuser, RLS policies are bypassed and all rows are visible. This is usually what you want for internal analytics and reporting.
If you want RLS to apply — for example, if you're building a customer-facing analytics feature — connect with a non-admin Postgres role and configure your policies accordingly.
A sensible setup for internal use:
```sql
-- Create a read-only analytics role in Supabase.
-- LOGIN is required so the role can actually be used to connect.
CREATE ROLE analytics_reader WITH LOGIN PASSWORD 'choose-a-strong-password';
GRANT CONNECT ON DATABASE postgres TO analytics_reader;
GRANT USAGE ON SCHEMA public TO analytics_reader;
GRANT USAGE ON SCHEMA auth TO analytics_reader;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO analytics_reader;
GRANT SELECT ON auth.users TO analytics_reader;
```

Then use those credentials in AI for Database. Your team gets full read access to application data, with no write permissions and without bypassing the security logic your application relies on.
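One gap worth closing: GRANT SELECT ON ALL TABLES only covers tables that exist at the time you run it. To keep the analytics role working as your schema grows, you can also set default privileges (run this as the role that creates your tables, typically postgres on Supabase):

```sql
-- Make tables created in the future readable too, so the analytics role
-- doesn't silently lose access when new tables are added.
ALTER DEFAULT PRIVILEGES IN SCHEMA public
  GRANT SELECT ON TABLES TO analytics_reader;
```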