πŸ€–
FlockMTL · OpenAI · NL→SQL · GeoJSON

Ask the Map Anything β€”
AI Handles the SQL

GeoQ's AI agent translates plain-English questions into optimised SQL, executes them against your registered geospatial sources, and returns summarised results with automatic map rendering.

Try AI Chat β†’ See Full Web Platform
πŸ‘€
Show me all buildings taller than 50 m within 500 m of the river in the current map view.
πŸ€–
Found 34 buildings matching your query. Rendering on the map now.
SELECT b.name, b.height, b.geom FROM features b JOIN rivers r ON ST_DWithin(b.geom, r.geom, 500) WHERE b.height > 50
βœ“ 34 features Β· GeoJSON Β· 0.28s
πŸ‘€
Now filter to only those built after 2010 and colour them by height.
πŸ€–
Refined to 18 buildings. Applied graduated colour ramp (yellow β†’ red) by height. Tallest: Riverside Tower at 112 m.
... AND year_built > 2010 ORDER BY height DESC
βœ“ 18 features Β· map updated

NL β†’ SQL β†’ GeoJSON in One Round-Trip

πŸ’¬
Your Question
Plain English
β†’
🧠
Schema Context
Table + column introspection
β†’
πŸ€–
FlockMTL / OpenAI
llm_complete β†’ SQL
β†’
πŸ¦†
DuckDB Execute
query_arrow()
β†’
πŸ—ΊοΈ
GeoJSON + Summary
Map update + prose
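The NL β†’ SQL step above can be sketched as a single FlockMTL call inside DuckDB. This is an illustrative sketch, not GeoQ's exact prompt: the model name, schema snippet, and question are examples, and the argument structs follow FlockMTL's documented call style.

```sql
-- Sketch: turn a question into SQL with llm_complete, assuming FlockMTL
-- is loaded and a provider secret is configured. Schema text is injected
-- from introspection; everything shown here is an example value.
SELECT llm_complete(
    {'model_name': 'gpt-4o-mini'},
    {'prompt': 'Schema: features(name VARCHAR, height DOUBLE, year_built INT, geom GEOMETRY). '
               || 'Write one DuckDB SQL query answering: buildings taller than 50 m '
               || 'built after 2010. Return SQL only.'}
) AS generated_sql;
```

The returned string is then executed (e.g. via DuckDB's query_arrow()) and the geometry column serialised to GeoJSON for the map.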

More Than Just SQL Generation

🧠
Schema-Aware Prompting
The agent introspects your registered sources β€” column names, geometry types, bounding boxes β€” and injects that context into every prompt for accurate SQL generation.
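Column introspection of this kind can be done with a standard catalog query; the table name below is an example:

```sql
-- Fetch column names and types to inject into the LLM prompt
SELECT column_name, data_type
FROM information_schema.columns
WHERE table_name = 'features';
```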
πŸ”
Semantic Filtering
Use llm_filter to semantically filter rows that traditional WHERE clauses can't reach β€” e.g. "features that look residential" from a free-text description column.
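A hedged sketch of such a semantic filter, with illustrative table and column names; the argument structs follow FlockMTL's documented style:

```sql
-- Keep only rows whose free-text description reads as residential
SELECT name, description
FROM features
WHERE llm_filter(
    {'model_name': 'gpt-4o-mini'},
    {'prompt': 'Does this description sound residential?'},
    {'description': description}
);
```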
πŸ“
Result Summarisation
After query execution, llm_reduce compresses thousands of rows into a concise analyst-friendly answer β€” counts, extremes, patterns, and anomalies called out automatically.
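As a sketch, llm_reduce runs as an aggregate over the result rows; table and column names here are examples:

```sql
-- Compress a result set into one analyst-friendly summary string
SELECT llm_reduce(
    {'model_name': 'gpt-4o-mini'},
    {'prompt': 'Summarise these query results for an analyst: counts, extremes, anomalies.'},
    {'row': name || ': ' || height || ' m'}
) AS summary
FROM tall_buildings;
```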
πŸ”’
Semantic Embeddings
llm_embedding powers vector similarity search over metadata β€” find datasets "similar to flood risk maps" from a catalog with hundreds of entries.
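An illustrative similarity search over a catalog table, assuming entries were embedded at registration time; depending on the FlockMTL version, the embedding may need casting to a fixed-size array before array_cosine_similarity applies:

```sql
-- Rank catalog entries by similarity to an embedded question
SELECT title,
       array_cosine_similarity(
           embedding,
           llm_embedding({'model_name': 'text-embedding-3-small'},
                         {'text': 'flood risk maps'})
       ) AS score
FROM dataset_catalog
ORDER BY score DESC
LIMIT 5;
```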
πŸ“Š
Relevance Reranking
llm_rerank orders multi-source results by semantic relevance to your question β€” putting the most important features at the top of the map and result table.
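A hedged sketch of reranking merged candidates; the exact return shape of llm_rerank varies by FlockMTL version, and all names here are illustrative:

```sql
-- Reorder multi-source hits by semantic relevance to the question
SELECT llm_rerank(
    {'model_name': 'gpt-4o-mini'},
    {'prompt': 'Rank by relevance to: buildings at flood risk near the river'},
    {'feature': name || ': ' || description}
) AS ranked
FROM candidate_features;
```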
🏠
Local & Private LLMs
FlockMTL supports Ollama for fully local inference β€” no data leaves your environment. Switch between OpenAI, Anthropic, and Ollama via a single CREATE SECRET statement.
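Provider switching via DuckDB's secrets manager looks roughly like this; the TYPE names follow FlockMTL's documentation, and the key and URL are placeholders:

```sql
-- Hosted inference via OpenAI
CREATE SECRET (TYPE OPENAI, API_KEY 'sk-...');

-- Fully local inference via Ollama: no data leaves your environment
CREATE SECRET (TYPE OLLAMA, API_URL '127.0.0.1:11434');
```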

Chat Endpoint

// POST /api/chat
{
  "message": "Which census tracts have population density above 10,000/kmΒ²?",
  "bbox": [-122.5, 37.7, -122.3, 37.9],
  "session_id": "abc123"
}

// Response
{
  "summary": "Found 8 census tracts with density above 10,000/kmΒ². Densest: Mission District at 18,420/kmΒ².",
  "sql": "SELECT geoid, pop_density, geom FROM features WHERE pop_density > 10000",
  "geojson": { "type": "FeatureCollection", "features": [ ... ] },
  "result_type": "geojson",
  "row_count": 8,
  "session_id": "abc123"
}

Let AI Do the SQL

Set OPENAI_API_KEY, launch geoq-web, and start chatting with your data.

Get Started β†’ ← Back to Home