
Mium Key Features
Mium is built around Ontul as its data engine: every capability, from natural-language SQL to job lifecycle management and code generation, runs on it.
Natural-Language Data Analytics
Ask in plain language; agents generate and run Ontul SQL, return tables and charts, and let you export results in CSV, Markdown, Excel, PDF or PPTX. Advanced aggregations, joins, and trend analysis with no SQL knowledge required.
Multi-Agent Orchestration
A strict-JSON action protocol exposes nine actions (query, submit_batch, submit_streaming, job_status, job_logs, kill_job, list_jobs, list_history, generate_code) that the agent dispatches within a single loop per user turn. The protocol is provider-agnostic: swap LLMs without changing application logic.
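The single-loop dispatch described above can be sketched as a routing table over the nine action names. The envelope shape ({"action": ..., "params": ...}) and the handler return values are illustrative assumptions; only the action names come from the feature list.

```python
import json

# Hypothetical handlers, one per protocol action; Mium's real internals
# are not shown here. Each receives the parsed params and returns a dict.
HANDLERS = {
    "query": lambda p: {"rows": [], "sql": p.get("sql")},
    "submit_batch": lambda p: {"job_id": "job-1"},
    "submit_streaming": lambda p: {"job_id": "job-2"},
    "job_status": lambda p: {"state": "RUNNING"},
    "job_logs": lambda p: {"lines": []},
    "kill_job": lambda p: {"killed": True},
    "list_jobs": lambda p: {"jobs": []},
    "list_history": lambda p: {"history": []},
    "generate_code": lambda p: {"language": p.get("language", "java")},
}

def dispatch(raw: str) -> dict:
    """Parse one strict-JSON action emitted by the LLM and route it."""
    msg = json.loads(raw)  # strict JSON: anything malformed raises here
    action = msg["action"]
    if action not in HANDLERS:
        raise ValueError(f"unknown action: {action}")
    return HANDLERS[action](msg.get("params", {}))
```

Because the agent only ever emits one of these nine JSON envelopes, swapping the LLM provider leaves the dispatch table, and everything behind it, untouched.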
Pluggable LLM Backend
Connect the LLM you already use — Anthropic Claude or self-hosted Ollama. Credentials live envelope-encrypted in the ConnectionStore, and each user can pick a different model.
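A per-user model selection stored in the ConnectionStore might look like the fragment below. Every field name and value here is an illustrative assumption, not Mium's actual schema; the point is that the credential is held as an encrypted reference, never inline.

```json
{
  "connection": "llm-default",
  "provider": "anthropic",
  "model": "<model-id>",
  "api_key_ref": "connectionstore://llm-default/api-key"
}
```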
Ontul Job Lifecycle
Submit, monitor, log-stream, and kill long-running Ontul Batch and Streaming jobs from a chat conversation. Generate Java (Batch / Streaming / Class) and Python SDK source for jobs you want to ship into a pipeline.
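A typical lifecycle exchange, submit, check, tail logs, kill, could flow through the action protocol as the sequence below. The payload shapes and the job_id value are assumptions for illustration; only the action names are from the protocol.

```json
[
  {"action": "submit_batch", "params": {"sql": "INSERT INTO daily_rollup SELECT ..."}},
  {"action": "job_status",   "params": {"job_id": "job-42"}},
  {"action": "job_logs",     "params": {"job_id": "job-42", "tail": 100}},
  {"action": "kill_job",     "params": {"job_id": "job-42"}}
]
```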
Sovereign Operating Model
IAM, KMS, and ConnectionStore live in embedded RocksDB. Memory, Prompt, and Embedding stores live in NeorunBase. Server-rendered files live in S3-compatible storage. Masters and Workers stay stateless for application data.
IAM & Envelope Encryption
AWS-style JSON policies over users, groups, companies and organizations; access keys plus short-lived STS credentials. AES-256-GCM envelope encryption protects every sensitive payload, with a versioned KMS key set you can rotate at runtime.
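The envelope pattern behind this can be sketched as follows: a fresh data key encrypts each payload, a versioned master key wraps the data key, and rotation re-wraps only the data key while the ciphertext stays untouched. Mium uses AES-256-GCM; this standard-library-only sketch substitutes an HMAC-SHA256 keystream as a stand-in cipher, so it illustrates the key hierarchy, not production cryptography.

```python
import hashlib
import hmac
import os

def _keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher (stand-in for AES-256-GCM; NOT for production)."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hmac.new(key, nonce + block.to_bytes(4, "big"), hashlib.sha256).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

# Versioned KMS key set (illustrative: random in-memory keys).
MASTER_KEYS = {1: os.urandom(32), 2: os.urandom(32)}

def encrypt(payload: bytes, key_version: int) -> dict:
    data_key = os.urandom(32)  # fresh data key per payload
    nonce, wrap_nonce = os.urandom(16), os.urandom(16)
    return {
        "key_version": key_version,
        "wrap_nonce": wrap_nonce,
        "wrapped_key": _keystream_xor(MASTER_KEYS[key_version], wrap_nonce, data_key),
        "nonce": nonce,
        "ciphertext": _keystream_xor(data_key, nonce, payload),
    }

def decrypt(env: dict) -> bytes:
    master = MASTER_KEYS[env["key_version"]]  # version recorded in the envelope
    data_key = _keystream_xor(master, env["wrap_nonce"], env["wrapped_key"])
    return _keystream_xor(data_key, env["nonce"], env["ciphertext"])

def rotate(env: dict, new_version: int) -> dict:
    """Runtime rotation: re-wrap the data key; the payload is never re-encrypted."""
    old_master = MASTER_KEYS[env["key_version"]]
    data_key = _keystream_xor(old_master, env["wrap_nonce"], env["wrapped_key"])
    wrap_nonce = os.urandom(16)
    return {**env, "key_version": new_version, "wrap_nonce": wrap_nonce,
            "wrapped_key": _keystream_xor(MASTER_KEYS[new_version], wrap_nonce, data_key)}
```

Recording the key version inside each envelope is what makes runtime rotation cheap: old payloads remain decryptable under their recorded version until they are re-wrapped.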
High-Availability Cluster
Master-Worker topology with ZooKeeper leader election and a custom NIO control plane. Followers transparently proxy writes to the leader and self-heal via snapshot pulls; Workers parallelize LLM calls and tool execution. Netty serves the Admin UI and REST API.
Integrated Admin Console
A React-based Admin UI unifies chat, IAM and policy editing, KMS key management, node topology and metrics, real-time log tailing, and S3 export storage configuration — all in one place.
Use Cases
Business Data Analytics
Business users without SQL skills query and analyze Ontul data in natural language to derive insights.
Ontul Job Operations
Data engineers submit, monitor and kill Batch and Streaming jobs from chat, and generate SDK source to land jobs in a pipeline faster.
Reports & Exports
Hand off analysis results as charts, Markdown, Excel, PDF, or PPTX: shareable reports on demand. Server-rendered files are envelope-encrypted at rest before download.
On-Premises Sovereign AI
Run LLM-driven analytics on your own infrastructure — data never leaves your network, and operations align with the rest of the CCL stack.
Considering Mium for your data platform?
Ontul-First · Sovereign AI · On-Prem.
Start natural-language analytics and job orchestration with an AI agent platform purpose-built for Ontul.
