Chinna AI Chat
Conversational AI workspace for reasoning, writing, coding, document support, multimodal workflows, OCR routing, and future paid model access.
Open app →

A complete AI workspace for chatting with Chinna, generating applications, managing Supabase-backed projects, routing APIs, publishing previews, and connecting future automations under one domain.
{
  "project": "Project-M",
  "interface": "premium-builder-style",
  "modules": ["chat", "builder", "api", "storage", "mcp", "automation"],
  "legal_pages": true,
  "responsive": true
}
Project-M is designed as a private AI operating layer: chat with Chinna, build real applications, connect APIs, manage storage, run automation routes, and prepare future operations-controlled workflows from one domain ecosystem.
Each service is built as a distinct capability while keeping consistent navigation, visual design, routing, and deployment structure.
Conversational AI workspace for reasoning, writing, coding, document support, multimodal workflows, OCR routing, and future paid model access.
Open app →

Builder interface for generating Vite + React apps from prompts with live project creation, previews, publish routes, logs, and Supabase-backed state.
Open builder →

Routing surface for local and paid AI models, including code generation, image-aware workflows, OCR models, embeddings, and provider expansion.
Open AI gateway →

Backend API layer for project creation, health checks, generated files, build logs, deployment events, model wiring, and future developer access.
Open API →

Authentication, object storage, REST APIs, database tables, edge functions, and project-scoped backend capabilities through Supabase Kong.
Open storage →

Tool gateway for future agent workflows, integrations, file operations, builder actions, automation triggers, and structured tool discovery.
Open MCP →

Every subdomain exists for a reason. The goal is not a random set of pages, but a structured AI product system.
The chat application acts as the user-facing AI surface. It supports conversation memory, local model routing, paid model support, image workflows, future document reading, and the Chinna identity layer.
The builder is the prompt-to-app workspace. It is designed to convert long prompts into project records, generated source files, preview routes, logs, publish URLs, and app-management screens.
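The prompt-to-project step above can be pictured as a single API call. The sketch below is illustrative only: the endpoint path, field names, and template identifier are assumptions, not the builder's actual contract.

```python
import json

# Hypothetical sketch of a prompt-to-app request to the Project-M builder.
# BUILDER_URL, the "template" id, and the option names are placeholders for
# illustration; the real builder API may differ.
BUILDER_URL = "https://api.example.com/v1/projects"  # placeholder endpoint

def build_project_request(prompt: str, template: str = "vite-react") -> dict:
    """Assemble the JSON body for a project-creation call.

    Returns a plain dict so it can be inspected or serialized before sending.
    """
    return {
        "prompt": prompt,      # long prompt describing the desired app
        "template": template,  # assumed builder template id
        "options": {
            "preview": True,   # request a live preview route
            "logs": True,      # request build logs for the project record
        },
    }

if __name__ == "__main__":
    body = build_project_request("A kanban board with drag-and-drop columns")
    print(json.dumps(body, indent=2))
```

Keeping the request as a plain dict makes it easy to log or persist the project record before the build actually runs.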
Supabase and the Project-M API provide the backend core. This includes authentication, database tables, feature flags, project records, generated app metadata, storage endpoints, and secure service routes.
MCP and automation routes prepare the platform for tool-based agents, future workflows, webhook triggers, scheduled tasks, background builds, and app-to-app orchestration.
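Tool discovery in MCP is a concrete, standardized exchange: MCP messages are JSON-RPC 2.0, and a client asks a server to enumerate its tools with a `tools/list` request. The sketch below shows only the message shape; transport details (stdio or HTTP) and Project-M's actual tool catalog are out of scope.

```python
import json

# Sketch of an MCP (Model Context Protocol) tool-discovery message, the kind
# of exchange the MCP gateway above would carry. MCP uses JSON-RPC 2.0;
# "tools/list" asks a server to enumerate the tools it exposes.
def tools_list_request(request_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 tools/list request."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
        "params": {},
    })

if __name__ == "__main__":
    print(tools_list_request())
```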
The platform flow runs from idea to generated app, then on through preview, publish, and manage stages.
Each route redirects users to the correct capability while keeping the public experience clean and navigable.
Developers can sign in to the user dashboard, generate a Project-M API key, and connect their applications to Chinna AI, Project-M LLM, MCP tools, and builder services.
Generate API keys, view usage, open Builder, access AI Chat, and read integration docs from one place.
Open dashboard →

Use OpenAI-compatible endpoints with cURL, Node.js, Python, Java, JSON config, and MCP integrations.
Read docs →

Download a ready Project-M AI configuration file for app integration.
Download config →
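Because the endpoints are OpenAI-compatible, a client only needs the standard `/chat/completions` request shape plus a bearer token. The sketch below builds (but does not send) such a request; the base URL, API key, and model name are placeholders to be replaced with values from the dashboard.

```python
import json
import urllib.request

# Minimal sketch of an OpenAI-compatible chat request against Project-M.
# BASE_URL, API_KEY, and the model name are placeholders; substitute the
# real values generated from the user dashboard.
BASE_URL = "https://api.example.com/v1"  # placeholder base URL
API_KEY = "pm-your-key-here"             # placeholder Project-M API key

def chat_request(message: str, model: str = "project-m-llm") -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style /chat/completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": message}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

if __name__ == "__main__":
    req = chat_request("Hello, Chinna!")
    print(req.full_url)
```

Sending the request is then one `urllib.request.urlopen(req)` call, or the same payload can be dropped into any OpenAI-compatible SDK by pointing its base URL at the gateway.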