Gentic Data — Documentation

Everything you need to give your AI agent a cloud database — import CSVs, query with SQL, insert and sync records.

1. Getting Started

Sign Up & Get Your API Key

Before you can use Gentic Data, you need an API key to authenticate your requests.

  1. Go to gentic.co/data and create an account.
  2. Create an organization from your dashboard. API keys, billing, and database data are all scoped to the organization.
  3. Go to the API Keys section in your dashboard.
  4. Click Create API Key. Give it a name (e.g. "Claude Code" or "n8n production").
  5. Copy the key immediately — it starts with gentic_ and is only shown once.

Each team member can generate their own API key; all keys in an organization share the same database and tables.

Keep your key secure. Treat it like a password. Don't commit it to version control or share it publicly.

2. Connecting to the MCP Server

Gentic Data uses the Model Context Protocol (MCP) — an open standard for connecting AI agents to external tools. You connect once, and your agent gets access to all Gentic Data tools automatically.

Server URL: https://mcp.gentic.co/data

Claude Code (CLI)

Option A — OAuth (no API key needed):

claude mcp add gentic-data --transport http https://mcp.gentic.co/data

Claude Code will prompt you to authenticate on first use.

Option B — API key:

claude mcp add gentic-data --transport http https://mcp.gentic.co/data --header "Authorization: Bearer <your-api-key>"

Replace <your-api-key> with a key from your dashboard.

Claude Web / Claude Desktop / Claude Mobile

  1. Go to Settings → Connectors
  2. Click Add custom connector at the bottom of the page
  3. Enter the server URL: https://mcp.gentic.co/data
  4. Click Add — you'll be redirected to sign in and authorize access via OAuth

Custom connectors require a Claude Pro, Max, Team, or Enterprise plan.

ChatGPT

  1. Go to Settings → Apps & Connectors → Advanced settings and enable developer mode
  2. Go to Settings → Connectors → Create
  3. Set the connector name to Gentic Data and the URL to: https://mcp.gentic.co/data
  4. You'll see Gentic's tools listed if the connection is successful
  5. In a new chat, click + near the message bar, select More, and choose Gentic Data

ChatGPT Apps are available on Plus, Pro, Team, Business, Enterprise, and Education plans.

n8n

  1. Add an MCP Client node to your workflow
  2. Set the server URL to https://mcp.gentic.co/data
  3. Add your API key as a Bearer token in the authentication settings

Any Other MCP Client

Gentic Data uses standard Streamable HTTP transport. Any client that supports MCP can connect using the server URL and either OAuth or API key Bearer token authentication.
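For clients without built-in MCP support, the underlying protocol is JSON-RPC 2.0 over HTTP. Below is a minimal sketch of what a tools/list request looks like; the header and payload shapes follow the MCP Streamable HTTP spec, and nothing here is Gentic-specific beyond the server URL and the Bearer key.

```python
import json

MCP_URL = "https://mcp.gentic.co/data"  # server URL from this page

def build_tools_list_request(api_key: str) -> dict:
    """Build a JSON-RPC 2.0 'tools/list' request for a Streamable HTTP MCP server.

    Returns the URL, headers, and body; actually sending it is left to
    whatever HTTP client you use.
    """
    return {
        "url": MCP_URL,
        "headers": {
            "Content-Type": "application/json",
            # Streamable HTTP servers may answer with JSON or an SSE stream.
            "Accept": "application/json, text/event-stream",
            "Authorization": f"Bearer {api_key}",
        },
        "body": json.dumps({
            "jsonrpc": "2.0",
            "id": 1,
            "method": "tools/list",
        }),
    }

req = build_tools_list_request("gentic_example_key")
print(req["body"])
```

A successful response lists the ten tools documented in the sections below.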

Agent Skills (Recommended)

For the best results, pair the MCP server with the Gentic agent skill. The MCP server gives your agent access to the tools; the skill teaches it the optimal workflow order — checking tables, importing CSVs, exploring schemas, querying data, and syncing records.

1. Add the MCP server (if you haven't already):

claude mcp add gentic-data --transport http https://mcp.gentic.co/data --header "Authorization: Bearer <your-api-key>"

2. Install the agent skill:

npx skills add gentic-co/agent-skills

Or add the skill directly via URL:

https://gentic.co/data/SKILL.md

  • MCP server — provides tool access (query, import, insert, sync, etc.)
  • Agent skill — teaches the optimal workflow order: check tables → import data → explore schema → query → sync

Works with Claude Code, Cursor, Copilot, and 40+ other agents.

3. List Database Tables

Tool: list_database_tables

Cost: Free

List all tables in your Gentic Data database with row counts. This is the best starting point to see what data you have available.

Parameters

This tool takes no parameters.

Example prompts

Overview:

"What tables do I have in my database?"

After import:

"Show me all my tables and how many rows each has."

4. Create Table from CSV

Tool: create_table_from_csv

Cost: Free

Create a new table in your database by importing data from any publicly accessible CSV URL. Google Sheets and Google Drive URLs are automatically converted to CSV export format.

Parameters

Parameter | Required | Default | Description
table_name | Yes | - | Name for the new table (e.g. "sales_data", "customers"). Only letters, numbers, and underscores.
csv_url | Yes | - | URL to a publicly accessible CSV file. Supports S3, HTTPS, Google Sheets, and Google Drive URLs.

Supported CSV sources

  • HTTPS — any publicly accessible CSV URL
  • Google Sheets — paste the sharing URL; auto-converted to CSV export
  • Google Drive — paste the file URL; auto-converted to direct download
  • S3 — public S3 bucket URLs
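The Google Sheets and Drive conversion happens server-side, but conceptually it is a simple URL rewrite. The sketch below shows the kind of transformation presumably applied; the exact rules are an assumption, not documented behavior.

```python
import re

def to_csv_export_url(url: str) -> str:
    """Rewrite Google Sheets/Drive sharing links to direct CSV/download URLs.

    Illustrative only: the real conversion is done by the Gentic importer.
    Other URLs (HTTPS, S3) pass through unchanged.
    """
    m = re.match(r"https://docs\.google\.com/spreadsheets/d/([\w-]+)", url)
    if m:
        return f"https://docs.google.com/spreadsheets/d/{m.group(1)}/export?format=csv"
    m = re.match(r"https://drive\.google\.com/file/d/([\w-]+)", url)
    if m:
        return f"https://drive.google.com/uc?export=download&id={m.group(1)}"
    return url

print(to_csv_export_url("https://docs.google.com/spreadsheets/d/abc123/edit"))
```

In practice you can paste the sharing URL as-is and let the tool handle it.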

Key behaviors

  • Maximum file size: 100 MB
  • Table names must match [a-zA-Z0-9_] — no spaces or special characters
  • Column types are auto-detected from the CSV data

Example prompts

From a URL:

"Import this CSV as a table called sales_data: https://example.com/data.csv"

From Google Sheets:

"Create a table called customers from this Google Sheet: https://docs.google.com/spreadsheets/d/abc123/edit"

5. Explore Table Data

sample_table

Get a preview of a table with sample records and column information. Use this to understand the table structure before running analysis queries.

Parameter | Required | Default | Description
table_name | Yes | - | Name of the table to sample
sample_size | No | 5 | Number of sample records to return (1–20)

get_table_schema

Get detailed schema information for a table including column names, data types, and nullability.

Parameter | Required | Default | Description
table_name | Yes | - | Name of the table

Example prompts

Preview data:

"Show me 10 sample rows from the sales_data table."

Check schema:

"What columns and data types does the customers table have?"

Pro tip

Always use sample_table before running complex queries. It helps your agent understand the column names, data types, and value formats — leading to more accurate SQL generation.

6. Query Data with SQL

Tool: query_data

Cost: Free

Execute SQL SELECT queries on your database for data analysis. Supports full DuckDB SQL including CTEs (WITH clauses), window functions, aggregations, joins, and more. Write operations are blocked for safety.

Parameters

Parameter | Required | Default | Description
sql_query | Yes | - | SQL SELECT query to execute (e.g. "SELECT * FROM sales WHERE date > '2024-01-01' LIMIT 10")

Security

Only SELECT and WITH statements are allowed. The following are blocked:

  • Write operations: INSERT, UPDATE, DELETE, DROP, ALTER, CREATE, TRUNCATE
  • Dangerous functions: read_csv_auto, read_parquet, read_json, read_text, read_blob, glob, http_get
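The actual validation runs server-side, but the policy can be illustrated with a naive client-side check. This is a sketch of the rules listed above, not the real implementation.

```python
import re

BLOCKED_KEYWORDS = {
    "INSERT", "UPDATE", "DELETE", "DROP", "ALTER", "CREATE", "TRUNCATE",
}
BLOCKED_FUNCTIONS = {
    "read_csv_auto", "read_parquet", "read_json", "read_text",
    "read_blob", "glob", "http_get",
}

def is_query_allowed(sql: str) -> bool:
    """Mirror the documented policy: the statement must start with SELECT
    or WITH, and must not contain blocked keywords or functions."""
    stripped = sql.strip()
    if not re.match(r"(?i)^(SELECT|WITH)\b", stripped):
        return False
    upper = stripped.upper()
    if any(re.search(rf"\b{kw}\b", upper) for kw in BLOCKED_KEYWORDS):
        return False
    lower = stripped.lower()
    return not any(re.search(rf"\b{fn}\b", lower) for fn in BLOCKED_FUNCTIONS)

print(is_query_allowed("SELECT * FROM sales LIMIT 10"))  # True
print(is_query_allowed("DROP TABLE sales"))              # False
```

Word-boundary matching matters here: a column named "global_sales" should not trip the glob check, for example.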

Example prompts

Simple query:

"Show me the top 10 customers by total spend."

Analytics:

"What's the average order value by month for the last 12 months?"

Cross-table join:

"Join the orders and customers tables and show me which customer segments have the highest retention rate."

Pro tip

Your agent writes the SQL for you — just describe what you want in natural language. DuckDB supports advanced analytics features like window functions, PIVOT, UNPIVOT, and approximate aggregation. For best results, let your agent use sample_table first to understand the data structure.

7. Update & Sync Tables

Three tools for keeping your data up to date from CSV sources. Choose the right one based on your use case.

update_table_from_csv

Update an existing table from a CSV URL. Supports three modes:

  • replace — drop and recreate the table with new data
  • append — add all rows from the CSV (may create duplicates)
  • upsert — add only rows that don't already exist (based on unique_column)

Parameter | Required | Default | Description
table_name | Yes | - | Name of the existing table to update
csv_url | Yes | - | CSV URL with the new data (supports Google Sheets, Drive, S3, HTTPS)
mode | No | "replace" | Update mode: "replace", "append", or "upsert"
unique_column | No* | - | Column to check for uniqueness (required when mode="upsert")

sync_table_from_csv

The recommended tool when you need to both update existing records AND add new ones in a single operation. Best for "sync", "update", or "refresh" workflows.

Parameter | Required | Default | Description
table_name | Yes | - | Name of the existing table to sync
csv_url | Yes | - | CSV URL with the data to sync
unique_column | Yes | - | Column to match records (e.g. "id", "email", "video_url")

batch_update_table_from_csv

Update only existing records from a CSV — ignores new records entirely. Use sync_table_from_csv if you need both updates and inserts.

Parameter | Required | Default | Description
table_name | Yes | - | Name of the existing table to update
csv_url | Yes | - | CSV URL with updated data
unique_column | Yes | - | Column to match records (e.g. "id", "email"). Only matching records are updated.

Which tool to use?

Scenario | Tool
Replace all data with fresh CSV | update_table_from_csv (mode: "replace")
Add new rows only (no duplicates) | update_table_from_csv (mode: "upsert")
Update existing + add new records | sync_table_from_csv
Update existing only (ignore new) | batch_update_table_from_csv
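The four paths can be illustrated with in-memory semantics. apply_mode below is a hypothetical helper that mimics, on lists of dicts, what each tool does to your table; the real tools read CSVs server-side.

```python
def apply_mode(existing, incoming, mode, unique_column=None):
    """Illustrate the semantics of replace/append/upsert/sync/batch_update."""
    if mode == "replace":
        return list(incoming)                 # drop everything, keep new data
    if mode == "append":
        return existing + list(incoming)      # may create duplicates
    have = {row[unique_column] for row in existing}
    if mode == "upsert":                      # add only rows whose key is new
        return existing + [r for r in incoming if r[unique_column] not in have]
    updates = {r[unique_column]: r for r in incoming}
    if mode == "sync":                        # update matching rows AND add new ones
        merged = [updates.get(r[unique_column], r) for r in existing]
        return merged + [r for r in incoming if r[unique_column] not in have]
    if mode == "batch_update":                # update matching rows, ignore new ones
        return [updates.get(r[unique_column], r) for r in existing]
    raise ValueError(f"unknown mode: {mode}")

existing = [{"id": 1, "name": "old"}, {"id": 2, "name": "keep"}]
incoming = [{"id": 1, "name": "new"}, {"id": 3, "name": "added"}]
print(apply_mode(existing, incoming, "sync", "id"))
```

Running the same inputs through "upsert" would keep the old row for id 1 and only add id 3, which is the key difference between the two modes.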

Example prompts

Full refresh:

"Replace the sales_data table with this updated CSV: https://example.com/sales-march.csv"

Sync:

"Sync the customers table from this Google Sheet, matching on the email column: https://docs.google.com/spreadsheets/d/abc123/edit"

8. Insert Records

Two tools for inserting records directly — without a CSV file.

insert_record

Insert a single record into a table with duplicate prevention. Use sample_table first to see the table structure and required columns.

Parameter | Required | Default | Description
table_name | Yes | - | Name of the table to insert into
record_data | Yes | - | Record as key-value pairs (e.g. {"id": 123, "name": "John", "email": "john@example.com"})
unique_column | Yes | - | Column to check for duplicates (e.g. "id", "email"). Insert is rejected if value already exists.

batch_insert_records

Insert multiple records in one batch operation with duplicate prevention. All records must have the same columns. Maximum 1,000 records per call.

Parameter | Required | Default | Description
table_name | Yes | - | Name of the table to insert into
records | Yes | - | Array of records to insert (all must have the same columns)
unique_column | Yes | - | Column to check for duplicates. Records with existing values are skipped.
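Because each call caps at 1,000 records, a larger dataset needs to be split client-side before calling the tool. chunk_records below is a hypothetical helper, not part of the API; it also enforces the same-columns rule.

```python
def chunk_records(records, batch_size=1000):
    """Split a record list into batches within the 1,000-record cap.

    Raises ValueError if records don't all share the same columns,
    matching the batch_insert_records requirement.
    """
    columns = set(records[0]) if records else set()
    for r in records:
        if set(r) != columns:
            raise ValueError("all records must have the same columns")
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]

batches = chunk_records([{"sku": f"SKU-{i}", "qty": 1} for i in range(2500)])
print([len(b) for b in batches])  # → [1000, 1000, 500]
```

Each batch then becomes one batch_insert_records call, with the same unique_column across all of them.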

Example prompts

Single record:

"Add a new customer to the customers table: name is Jane Smith, email is jane@example.com, plan is pro. Use email as the unique column."

Batch insert:

"Insert these 5 product records into the inventory table, using sku as the unique column: [list of products]"

Pro tip

Use batch_insert_records instead of calling insert_record in a loop — it's much faster for multiple records. If you have data in a CSV file, use create_table_from_csv or update_table_from_csv instead.

9. Tool Reference

Tool | Description | Cost
list_database_tables | List all tables in your database with row counts. | Free
create_table_from_csv | Create a new table by importing from any CSV URL (S3, HTTPS, Google Sheets, Google Drive). | Free
sample_table | Preview a table with sample records (1–20) and column information. | Free
get_table_schema | Get detailed schema: column names, data types, and nullability. | Free
query_data | Execute SQL SELECT queries with full DuckDB SQL support (CTEs, window functions, joins). Read-only. | Free
update_table_from_csv | Update an existing table from CSV. Modes: replace, append, or upsert. | Free
sync_table_from_csv | Sync a table with CSV data — updates existing records AND adds new ones in one operation. | Free
batch_update_table_from_csv | Batch update only existing records from CSV. Ignores new records. | Free
insert_record | Insert a single record with duplicate prevention via unique column check. | Free
batch_insert_records | Insert up to 1,000 records in one batch with duplicate prevention. | Free

10. Pricing

All Gentic Data tools are completely free. No subscriptions. No seat licenses. No per-call usage fees — only the per-key rate limit noted below.

Action | Cost
List database tables | Free
Create table from CSV | Free
Sample table data | Free
Get table schema | Free
SQL queries (SELECT) | Free
Update/sync table from CSV | Free
Insert records (single or batch) | Free

How it works

Every organization gets its own cloud database powered by DuckDB via MotherDuck. Your data is scoped to your organization and is never shared. All 10 data tools are included at no cost — import CSVs, run SQL queries, insert and sync records freely.

Rate limit: 100 requests per minute per API key. Need higher limits? Contact us.
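If a workflow makes many calls in a tight loop, simple client-side pacing keeps it under the limit. RateLimiter below is a minimal sketch of a hypothetical helper, not part of any SDK; the server enforces the real limit.

```python
import time

class RateLimiter:
    """Space out calls to stay under a per-minute request limit."""

    def __init__(self, max_per_minute=100):
        self.min_interval = 60.0 / max_per_minute  # seconds between calls
        self.last_call = 0.0

    def wait(self):
        """Sleep just long enough to respect the minimum interval."""
        now = time.monotonic()
        delay = self.min_interval - (now - self.last_call)
        if delay > 0:
            time.sleep(delay)
        self.last_call = time.monotonic()

limiter = RateLimiter()
limiter.wait()  # call before each request
```

At 100 requests per minute this paces calls 0.6 seconds apart, which is usually invisible inside an agent workflow.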