Gentic Data — Documentation
Everything you need to give your AI agent a cloud database — import CSVs, query with SQL, insert and sync records.
1. Getting Started
Sign Up & Get Your API Key
Before you can use Gentic Data, you need an API key to authenticate your requests.
- Go to gentic.co/data and create an account.
- Create an organization from your dashboard. API keys, billing, and database data are all scoped to the organization.
- Go to the API Keys section in your dashboard.
- Click Create API Key. Give it a name (e.g. "Claude Code" or "n8n production").
- Copy the key immediately — it starts with gentic_ and is only shown once.
Anyone on your team can generate their own key, and they'll all share the same database and tables.
Keep your key secure. Treat it like a password. Don't commit it to version control or share it publicly.
2. Connecting to the MCP Server
Gentic Data uses the Model Context Protocol (MCP) — an open standard for connecting AI agents to external tools. You connect once, and your agent gets access to all Gentic Data tools automatically.
Server URL: https://mcp.gentic.co/data
Claude Code (CLI)
Option A — OAuth (no API key needed):
claude mcp add gentic-data --transport http https://mcp.gentic.co/data
Claude Code will prompt you to authenticate on first use.
Option B — API key:
claude mcp add gentic-data --transport http https://mcp.gentic.co/data --header "Authorization: Bearer <your-api-key>"
Replace <your-api-key> with a key from your dashboard.
Claude Web / Claude Desktop / Claude Mobile
- Go to Settings → Connectors
- Click Add custom connector at the bottom of the page
- Enter the server URL: https://mcp.gentic.co/data
- Click Add — you'll be redirected to sign in and authorize access via OAuth
Custom connectors require a Claude Pro, Max, Team, or Enterprise plan.
ChatGPT
- Go to Settings → Apps & Connectors → Advanced settings and enable developer mode
- Go to Settings → Connectors → Create
- Set the connector name to Gentic Data and the URL to: https://mcp.gentic.co/data
- You'll see Gentic's tools listed if the connection is successful
- In a new chat, click + near the message bar, select More, and choose Gentic Data
ChatGPT Apps are available on Plus, Pro, Team, Business, Enterprise, and Education plans.
n8n
- Add an MCP Client node to your workflow
- Set the server URL to https://mcp.gentic.co/data
- Add your API key as a Bearer token in the authentication settings
Any Other MCP Client
Gentic Data uses standard Streamable HTTP transport. Any client that supports MCP can connect using the server URL and either OAuth or API key Bearer token authentication.
Agent Skills (Recommended)
For the best results, pair the MCP server with the Gentic agent skill. The MCP server gives your agent access to the tools; the skill teaches it the optimal workflow order — checking tables, importing CSVs, exploring schemas, querying data, and syncing records.
1. Add the MCP server (if you haven't already):
claude mcp add gentic-data --transport http https://mcp.gentic.co/data --header "Authorization: Bearer <your-api-key>"
2. Install the agent skill:
npx skills add gentic-co/agent-skills
Or add the skill directly via URL:
https://gentic.co/data/SKILL.md
- MCP server — provides tool access (query, import, insert, sync, etc.)
- Agent skill — teaches the optimal workflow order: check tables → import data → explore schema → query → sync
Works with Claude Code, Cursor, Copilot, and 40+ other agents.
3. List Database Tables
Tool: list_database_tables
Cost: Free
List all tables in your Gentic Data database with row counts. This is the best starting point to see what data you have available.
Parameters
This tool takes no parameters.
Example prompts
Overview:
"What tables do I have in my database?"
After import:
"Show me all my tables and how many rows each has."
4. Create Table from CSV
Tool: create_table_from_csv
Cost: Free
Create a new table in your database by importing data from any publicly accessible CSV URL. Google Sheets and Google Drive URLs are automatically converted to CSV export format.
Parameters
| Parameter | Required | Default | Description |
|---|---|---|---|
| table_name | Yes | — | Name for the new table (e.g. "sales_data", "customers"). Only letters, numbers, and underscores. |
| csv_url | Yes | — | URL to a publicly accessible CSV file. Supports S3, HTTPS, Google Sheets, and Google Drive URLs. |
Supported CSV sources
- HTTPS — any publicly accessible CSV URL
- Google Sheets — paste the sharing URL; auto-converted to CSV export
- Google Drive — paste the file URL; auto-converted to direct download
- S3 — public S3 bucket URLs
Key behaviors
- Maximum file size: 100 MB
- Table names must match [a-zA-Z0-9_] — no spaces or special characters
- Column types are auto-detected from the CSV data
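The table-name rule above is easy to check before calling the tool. A minimal sketch in plain Python (a hypothetical helper, not part of the Gentic API):

```python
import re

# Table names may contain only letters, digits, and underscores,
# per the create_table_from_csv rules.
TABLE_NAME_RE = re.compile(r"^[A-Za-z0-9_]+$")

def is_valid_table_name(name: str) -> bool:
    """Return True if `name` is acceptable as a Gentic Data table name."""
    return bool(TABLE_NAME_RE.match(name))
```

Validating up front saves a round trip when an agent derives table names from file names or sheet titles, which often contain spaces or hyphens.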
Example prompts
From a URL:
"Import this CSV as a table called sales_data: https://example.com/data.csv"
From Google Sheets:
"Create a table called customers from this Google Sheet: https://docs.google.com/spreadsheets/d/abc123/edit"
5. Explore Table Data
sample_table
Get a preview of a table with sample records and column information. Use this to understand the table structure before running analysis queries.
| Parameter | Required | Default | Description |
|---|---|---|---|
| table_name | Yes | — | Name of the table to sample |
| sample_size | No | 5 | Number of sample records to return (1–20) |
get_table_schema
Get detailed schema information for a table including column names, data types, and nullability.
| Parameter | Required | Default | Description |
|---|---|---|---|
| table_name | Yes | — | Name of the table |
Example prompts
Preview data:
"Show me 10 sample rows from the sales_data table."
Check schema:
"What columns and data types does the customers table have?"
Pro tip
Always use sample_table before running complex queries. It helps your agent understand the column names, data types, and value formats — leading to more accurate SQL generation.
6. Query Data with SQL
Tool: query_data
Cost: Free
Execute SQL SELECT queries on your database for data analysis. Supports full DuckDB SQL including CTEs (WITH clauses), window functions, aggregations, joins, and more. Write operations are blocked for safety.
Parameters
| Parameter | Required | Default | Description |
|---|---|---|---|
| sql_query | Yes | — | SQL SELECT query to execute (e.g. "SELECT * FROM sales WHERE date > '2024-01-01' LIMIT 10") |
Security
Only SELECT and WITH statements are allowed. The following are blocked:
- Write operations: INSERT, UPDATE, DELETE, DROP, ALTER, CREATE, TRUNCATE
- Dangerous functions: read_csv_auto, read_parquet, read_json, read_text, read_blob, glob, http_get
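The read-only policy boils down to accepting statements that begin with SELECT or WITH and rejecting everything else. A rough first-token illustration of that idea in Python (this is not Gentic's actual validator, which also blocks the functions listed above inside the query body):

```python
def is_read_only(sql: str) -> bool:
    """Crude check: only statements whose first keyword is SELECT or
    WITH pass. Illustration only; the real service additionally
    rejects dangerous functions such as read_csv_auto anywhere in
    the query text."""
    tokens = sql.strip().split(None, 1)
    return bool(tokens) and tokens[0].upper() in ("SELECT", "WITH")
```

A first-token check alone would miss multi-statement tricks and embedded function calls, which is why the service inspects the full query rather than just its opening keyword.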
Example prompts
Simple query:
"Show me the top 10 customers by total spend."
Analytics:
"What's the average order value by month for the last 12 months?"
Cross-table join:
"Join the orders and customers tables and show me which customer segments have the highest retention rate."
Pro tip
Your agent writes the SQL for you — just describe what you want in natural language. DuckDB supports advanced analytics features like window functions, PIVOT, UNPIVOT, and approximate aggregation. For best results, let your agent use sample_table first to understand the data structure.
7. Update & Sync Tables
Three tools for keeping your data up to date from CSV sources. Choose the right one based on your use case.
update_table_from_csv
Update an existing table from a CSV URL. Supports three modes:
- replace — drop and recreate the table with new data
- append — add all rows from the CSV (may create duplicates)
- upsert — add only rows that don't already exist (based on unique_column)
| Parameter | Required | Default | Description |
|---|---|---|---|
| table_name | Yes | — | Name of the existing table to update |
| csv_url | Yes | — | CSV URL with the new data (supports Google Sheets, Drive, S3, HTTPS) |
| mode | No | "replace" | Update mode: "replace", "append", or "upsert" |
| unique_column | No* | — | Column to check for uniqueness (required when mode="upsert") |
sync_table_from_csv
The recommended tool when you need to both update existing records AND add new ones in a single operation. Best for "sync", "update", or "refresh" workflows.
| Parameter | Required | Default | Description |
|---|---|---|---|
| table_name | Yes | — | Name of the existing table to sync |
| csv_url | Yes | — | CSV URL with the data to sync |
| unique_column | Yes | — | Column to match records (e.g. "id", "email", "video_url") |
batch_update_table_from_csv
Update only existing records from a CSV — ignores new records entirely. Use sync_table_from_csv if you need both updates and inserts.
| Parameter | Required | Default | Description |
|---|---|---|---|
| table_name | Yes | — | Name of the existing table to update |
| csv_url | Yes | — | CSV URL with updated data |
| unique_column | Yes | — | Column to match records (e.g. "id", "email"). Only matching records are updated. |
Which tool to use?
| Scenario | Tool |
|---|---|
| Replace all data with fresh CSV | update_table_from_csv (mode: "replace") |
| Add new rows only (no duplicates) | update_table_from_csv (mode: "upsert") |
| Update existing + add new records | sync_table_from_csv |
| Update existing only (ignore new) | batch_update_table_from_csv |
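The differences between these behaviors are easiest to see on plain data. Here is a hypothetical in-memory model, with lists of dicts standing in for the table and the incoming CSV (illustration only; the real tools operate on database tables):

```python
def apply_update(table, incoming, mode, unique_column=None):
    """Model the four CSV update behaviors on lists of dicts."""
    if mode == "replace":
        return list(incoming)                       # drop and recreate
    if mode == "append":
        return table + list(incoming)               # may create duplicates
    existing = {row[unique_column] for row in table}
    if mode == "upsert":                            # add new rows only
        return table + [r for r in incoming
                        if r[unique_column] not in existing]
    if mode == "sync":                              # update existing + add new
        by_key = {r[unique_column]: r for r in incoming}
        merged = [by_key.pop(row[unique_column], row) for row in table]
        return merged + list(by_key.values())
    if mode == "batch_update":                      # update existing only
        by_key = {r[unique_column]: r for r in incoming}
        return [by_key.get(row[unique_column], row) for row in table]
    raise ValueError(f"unknown mode: {mode}")
```

For example, given a table with ids 1 and 2 and an incoming CSV with ids 2 and 3: upsert adds only id 3, sync updates id 2 and adds id 3, and batch_update updates id 2 and ignores id 3.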
Example prompts
Full refresh:
"Replace the sales_data table with this updated CSV: https://example.com/sales-march.csv"
Sync:
"Sync the customers table from this Google Sheet, matching on the email column: https://docs.google.com/spreadsheets/d/abc123/edit"
8. Insert Records
Two tools for inserting records directly — without a CSV file.
insert_record
Insert a single record into a table with duplicate prevention. Use sample_table first to see the table structure and required columns.
| Parameter | Required | Default | Description |
|---|---|---|---|
| table_name | Yes | — | Name of the table to insert into |
| record_data | Yes | — | Record as key-value pairs (e.g. {"id": 123, "name": "John", "email": "john@example.com"}) |
| unique_column | Yes | — | Column to check for duplicates (e.g. "id", "email"). Insert is rejected if value already exists. |
batch_insert_records
Insert multiple records in one batch operation with duplicate prevention. All records must have the same columns. Maximum 1,000 records per call.
| Parameter | Required | Default | Description |
|---|---|---|---|
| table_name | Yes | — | Name of the table to insert into |
| records | Yes | — | Array of records to insert (all must have the same columns) |
| unique_column | Yes | — | Column to check for duplicates. Records with existing values are skipped. |
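Note the asymmetry: insert_record rejects a duplicate, while batch_insert_records skips it and continues. A sketch of the skip behavior in plain Python (a hypothetical helper, not the actual service code; how the service handles duplicates within the same batch is an assumption here):

```python
MAX_BATCH = 1000  # per-call limit from the docs

def plan_batch_insert(existing_values, records, unique_column):
    """Split a batch into (to_insert, skipped): records whose
    unique-column value already exists are skipped. Repeats within
    the batch itself are also skipped after the first occurrence
    (an assumption; the docs only specify existing-value skips)."""
    if len(records) > MAX_BATCH:
        raise ValueError(f"batch exceeds {MAX_BATCH} records")
    seen = set(existing_values)
    to_insert, skipped = [], []
    for rec in records:
        key = rec[unique_column]
        (skipped if key in seen else to_insert).append(rec)
        seen.add(key)
    return to_insert, skipped
```
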
Example prompts
Single record:
"Add a new customer to the customers table: name is Jane Smith, email is jane@example.com, plan is pro. Use email as the unique column."
Batch insert:
"Insert these 5 product records into the inventory table, using sku as the unique column: [list of products]"
Pro tip
Use batch_insert_records instead of calling insert_record in a loop — it's much faster for multiple records. If you have data in a CSV file, use create_table_from_csv or update_table_from_csv instead.
9. Tool Reference
| Tool | Description | Cost |
|---|---|---|
| list_database_tables | List all tables in your database with row counts. | Free |
| create_table_from_csv | Create a new table by importing from any CSV URL (S3, HTTPS, Google Sheets, Google Drive). | Free |
| sample_table | Preview a table with sample records (1–20) and column information. | Free |
| get_table_schema | Get detailed schema: column names, data types, and nullability. | Free |
| query_data | Execute SQL SELECT queries with full DuckDB SQL support (CTEs, window functions, joins). Read-only. | Free |
| update_table_from_csv | Update an existing table from CSV. Modes: replace, append, or upsert. | Free |
| sync_table_from_csv | Sync a table with CSV data — updates existing records AND adds new ones in one operation. | Free |
| batch_update_table_from_csv | Batch update only existing records from CSV. Ignores new records. | Free |
| insert_record | Insert a single record with duplicate prevention via unique column check. | Free |
| batch_insert_records | Insert up to 1,000 records in one batch with duplicate prevention. | Free |
10. Pricing
All Gentic Data tools are completely free. No subscriptions, no seat licenses, and no per-call charges — only a standard rate limit applies.
| Action | Cost |
|---|---|
| List database tables | Free |
| Create table from CSV | Free |
| Sample table data | Free |
| Get table schema | Free |
| SQL queries (SELECT) | Free |
| Update/sync table from CSV | Free |
| Insert records (single or batch) | Free |
How it works
Every organization gets its own cloud database powered by DuckDB via MotherDuck. Your data is scoped to your organization and is never shared. All 10 data tools are included at no cost — import CSVs, run SQL queries, insert and sync records freely.
Rate limit: 100 requests per minute per API key. Need higher limits? Contact us.
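Bursty agents and workflows can trip the 100 requests/minute ceiling. A minimal client-side sliding-window limiter you could wrap around your own calls (your code, not something Gentic provides):

```python
import time
from collections import deque

class RateLimiter:
    """Allow at most `limit` calls per `window` seconds (sliding window).

    `clock` is injectable so the limiter can be tested without sleeping.
    """
    def __init__(self, limit=100, window=60.0, clock=time.monotonic):
        self.limit, self.window, self.clock = limit, window, clock
        self.calls = deque()  # timestamps of recent calls

    def try_acquire(self) -> bool:
        now = self.clock()
        # Evict timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) < self.limit:
            self.calls.append(now)
            return True
        return False
```

Call try_acquire() before each request; on False, back off briefly and retry rather than letting the server reject the call.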