Every few weeks, a developer posts on Reddit: "I connected ChatGPT to my database and it's incredible." The replies flood in asking how. It's a genuinely useful thing to build, but most implementations stop at the fun part and skip the work that would make the result usable for a whole team.
This guide walks through the real options for connecting a database to ChatGPT, what each approach costs in time and complexity, and where the limits are so you can make an informed decision about what your team actually needs.
Why People Want to Talk to Their Database
The appeal is obvious. SQL is expressive but demanding. Even experienced engineers mistype queries, forget column names, or look up JOIN syntax for the third time in a month. For non-technical teammates (product managers, sales, ops), SQL might as well be Mandarin.
Natural language queries promise to cut through that. Ask "how many users signed up last week from Germany?" and get an answer. No query editor, no syntax errors, no waiting for an engineer.
The question is how to actually build that, and whether ChatGPT is the right tool to center it around.
Option 1: Paste Your Schema Into ChatGPT and Ask It to Write SQL
This is where most people start. Copy your table definitions into a ChatGPT conversation, describe what you want, and ask it to write the query.
```sql
-- Paste this schema into ChatGPT:
CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  email TEXT NOT NULL,
  country TEXT,
  created_at TIMESTAMP DEFAULT NOW(),
  plan TEXT
);

CREATE TABLE subscriptions (
  id SERIAL PRIMARY KEY,
  user_id INTEGER REFERENCES users(id),
  started_at TIMESTAMP,
  cancelled_at TIMESTAMP,
  mrr NUMERIC
);
```

Then ask: "Write a query to show MRR by country for users who signed up in the last 90 days."
ChatGPT will give you something reasonable. Then you paste it into your database client, run it, and check the results.
What works: It's fast for one-off queries. The output is usually correct for straightforward requests.
What breaks down:
- ChatGPT never touches your data. You still copy every query into a client, run it, and sanity-check the results yourself.
- The conversation has no idea when your schema changes, so yesterday's chat quietly goes stale.
- None of it helps a teammate who can't run SQL in the first place.

This approach works for a developer who wants a faster query-writing assistant. It's not a solution for the broader team.
Option 2: Build a ChatGPT Plugin or Custom GPT with Database Access
OpenAI's Custom GPT feature and the older plugin system let you connect external data sources. You can build an action that calls an API endpoint, which in turn queries your database and returns results.
The architecture looks like this:

User → ChatGPT → Your API endpoint → Database → Response

Building this requires:
- A hosted API endpoint that accepts a query and returns results
- Authentication, so the endpoint isn't open to the whole internet
- Validation that keeps a generated `DELETE FROM users` from ever reaching production
- An OpenAPI description of the action so ChatGPT knows how to call it

A minimal FastAPI example:
```python
import os

import psycopg2
from fastapi import FastAPI, Depends

app = FastAPI()

def get_db():
    # Yield-style dependency so the connection is closed after the request
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    try:
        yield conn
    finally:
        conn.close()

@app.post("/query")
def run_query(payload: dict, db=Depends(get_db)):
    sql = payload.get("sql")
    # IMPORTANT: validate and sanitize the SQL before running
    cur = db.cursor()
    cur.execute(sql)
    rows = cur.fetchall()
    columns = [desc[0] for desc in cur.description]
    return {"columns": columns, "rows": rows}
```

What works: You get a proper chatbot interface. Non-technical users can ask questions in plain English. ChatGPT generates the SQL, your API runs it, and the results come back in the conversation.
What breaks down:
- The endpoint executes whatever SQL the model sends it, so that validation layer is doing all of the security work.
- Query results flow through OpenAI's servers, which your compliance team may have opinions about.
- There's no per-user permissioning, no query history, and nothing persists between conversations.

This is a reasonable prototype. It's not a product.
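One way to approach the validation step flagged in the endpoint above is a conservative allow-list: reject anything that isn't a single read-only statement. The sketch below is illustrative, not a complete defense; in production you'd also want a real SQL parser and, more importantly, a database role that can only read.

```python
import re

# Keywords that should never reach the database from a chatbot.
FORBIDDEN = re.compile(
    r"\b(INSERT|UPDATE|DELETE|DROP|ALTER|TRUNCATE|GRANT|CREATE)\b",
    re.IGNORECASE,
)

def is_safe_select(sql: str) -> bool:
    """Allow only a single read-only SELECT (or WITH ... SELECT).

    A conservative sketch: pair it with a read-only DB role rather
    than relying on string checks alone.
    """
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # reject multi-statement payloads
        return False
    if FORBIDDEN.search(stripped):
        return False
    return stripped.upper().startswith(("SELECT", "WITH"))
```

Rejected queries can be returned to ChatGPT as an error message, which usually prompts it to rewrite the query as a plain SELECT.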
Option 3: Use the OpenAI API Directly to Build NL-to-SQL
Instead of ChatGPT as the interface, you use the OpenAI API as the intelligence layer inside your own application. Your app accepts a natural language question, sends it to GPT-4 with your schema as context, gets SQL back, runs it against your database, and returns results.
```python
import os

import openai
import psycopg2

schema_context = """
Tables:
- users(id, email, country, created_at, plan)
- subscriptions(id, user_id, started_at, cancelled_at, mrr)
- events(id, user_id, event_name, properties, created_at)
"""

def natural_language_to_sql(question: str) -> str:
    response = openai.chat.completions.create(
        model="gpt-4o",
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a SQL expert. Given the schema below, write a "
                    "PostgreSQL query to answer the user's question. "
                    f"Return only SQL, nothing else.\n\n{schema_context}"
                ),
            },
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content.strip()

def run_query(sql: str):
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchall(), [desc[0] for desc in cur.description]

# Usage:
sql = natural_language_to_sql("Show me the top 10 countries by MRR this month")
rows, columns = run_query(sql)
```

This gives you full control. You can build whatever UI you want, store query history, add user permissions, handle errors, and extend the system however you need.
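One practical wrinkle: even with a "return only SQL, nothing else" instruction, models sometimes wrap their answer in a Markdown code fence, which would fail the moment you execute it. A small post-processing helper (the function name here is my own) keeps the pipeline robust:

```python
import re

def extract_sql(raw: str) -> str:
    """Strip a Markdown code fence if the model added one,
    e.g. ```sql ... ``` wrapped around the query."""
    match = re.search(r"```(?:sql)?\s*(.*?)```", raw, re.DOTALL | re.IGNORECASE)
    if match:
        return match.group(1).strip()
    return raw.strip()
```

Calling `extract_sql` on the model's response before `run_query` handles both fenced and bare output the same way.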
What works: Complete control. You can build exactly what your team needs.
What breaks down:
- You're now building and operating an entire application: UI, auth, hosting, and error handling are all on you.
- The model can still produce wrong or expensive SQL, so you need the same validation and read-only safeguards as in Option 2.
- The schema context is hand-maintained, which means every migration is also a prompt update.
The Gap All These Approaches Share
Here's what none of these options give you out of the box:
- Dashboards that persist and refresh on a schedule
- Alerting when a metric crosses a threshold
- Per-user permissions and access controls
- Shared query history the rest of the team can build on

The ChatGPT approaches are good at the "ask a question, get an answer" loop. They're not designed for the ongoing data workflows that teams actually rely on.
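To make the gap concrete: even a modest workflow like "alert Slack when daily signups fall below a floor" means writing and operating the glue yourself. A rough sketch of that DIY glue, where the threshold and the Slack webhook URL are placeholders you'd supply:

```python
import json
import urllib.request

THRESHOLD = 50  # hypothetical minimum acceptable daily signups

def build_alert(signups: int, threshold: int = THRESHOLD):
    """Return a Slack message payload if signups dip below the
    threshold, else None. Kept pure so it's easy to test."""
    if signups >= threshold:
        return None
    return {"text": f"Daily signups dropped to {signups} (threshold {threshold})"}

def send_alert(payload: dict, webhook_url: str):
    """POST the payload to a Slack incoming webhook."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

# Wiring this up still means running the signup query yourself
# (e.g. with psycopg2) and scheduling the script with cron or a
# worker process: all infrastructure you own and operate.
```

None of this is hard individually, but it's exactly the kind of operational surface area that multiplies with every new metric the team wants watched.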
What a Purpose-Built Tool Looks Like
Tools like AI for Database handle the full picture. You connect your database directly (PostgreSQL, MySQL, MongoDB, Supabase, BigQuery, and others), and then you can:
Ask questions in plain English and get instant results. The same NL-to-SQL translation, but with your actual schema already loaded, result formatting handled, and no copy-pasting required.
Build dashboards from natural language queries. Instead of writing SQL and wiring it to a charting library, you describe what you want to see: "Show daily signups by plan for the last 30 days." The dashboard builds itself and refreshes on a schedule.
Set up action workflows. Define a condition in plain English ("when daily MRR drops more than 10% compared to yesterday") and specify what should happen: a Slack message, an email, a webhook call. No stored procedures, no cron jobs, no DBA involvement.
The difference between this and the ChatGPT approaches above isn't the intelligence; it's the infrastructure. A purpose-built tool wraps the AI in a system that's designed for teams: permissions, persistence, scheduling, alerting, and an interface that non-technical users can actually navigate.
Which Approach Is Right for You?
| Scenario | Best approach |
| --- | --- |
| You need quick SQL help for yourself | Paste schema into ChatGPT |
| You're prototyping for a demo | Build a Custom GPT |
| You're building a custom internal product | OpenAI API + your own infrastructure |
| Your whole team needs database access | Purpose-built tool like AI for Database |
The ChatGPT approaches are excellent starting points. They prove the concept quickly and give you something to show stakeholders. But if the goal is to actually eliminate the bottleneck, so your sales team can check their numbers, your PM can pull activation data, and your ops manager can monitor key metrics without waiting for an engineer, you need something that's built for that use case from the start.
The Bottom Line
Connecting a database to ChatGPT is genuinely useful and entirely possible. The DIY path (pasting schemas, building Custom GPTs, or using the API directly) gets you to "ask a question, get SQL" quickly. What it doesn't get you is the infrastructure for a team: dashboards, scheduling, alerting, and access controls.
If you're a developer who wants a smarter query assistant, the ChatGPT approaches are worth exploring. If you want your whole organisation to have self-serve data access, try AI for Database free at aifordatabase.com; it handles the pieces that the DIY approaches leave out.