Getting Started

Coreai Platform Overview

🚀 Introduction

Welcome to the Coreai Platform. This documentation will help you understand the platform's capabilities and how to use it effectively.

Built to last

We aim to offer a seamless experience by packaging open-source distributions into a cohesive, user-friendly platform. You are not tied to us for life, and you remain in control of the platform.

Industrialized

Benefit from a fully industrialized deployment with automation and Infrastructure as Code (IaC) as its core principles: a full platform can be deployed and ready to host your data and use cases in a matter of minutes.

Made to measure

Make it yours: change components, tools, models and more with only a few lines of configuration. Coreai fits most deployment scenarios and can work in tandem with your existing stack.

Cloud native

Harness the power of cloud technologies even on-premise. Coreai relies on and benefits from Kubernetes to enable features such as autoscaling, self-healing and extensibility. The platform can grow with your data and needs.

Maintain ownership

Own your data and AI models from end to end, guaranteeing both integrity and security: no need to entrust the backbone of your services to a third party. Retain full control.

Open Source

Choose a mature and actively maintained solution built with state-of-the-art open-source technologies, trusted by government agencies and the public sector.


🧱 Platform Architecture

High-Level Overview

  • Modular Kubernetes-based architecture
  • Integration of data ingestion, processing, storage, visualization, and ML/AI workflows

Component Categories

Category                  Components
------------------------  ------------------------------------------------------
Ingestion & ETL           Airbyte
Workflow Orchestration    Argo Events, Argo Workflows
CI/CD & GitOps            Argo CD, Harbor
Security & Access         Keycloak, Cert-Manager, Sealed-Secret, Trust-Manager
Data Storage              MinIO, ClickHouse, Milvus, PgAdmin
API Management            Gravitee, Apache APISIX
Monitoring                Grafana, Loki, Prometheus
Visualization             Superset
Infrastructure            MetalLB, Nginx-Controller, Reflector, NFS-Provisioning
Observability & Search    OpenSearch
Model Monitoring          Langfuse
CoreAI                    LLM Backend, Portal, Model Installer

🧭 User Journey

Described below are the typical workflows for a non-technical user working with the platform. For a technical user (a developer, data engineer or data scientist), the workflow is largely the same, with the addition of access to APIs and CLIs for more automation.

๐Ÿ” Authentication & Access

  1. Log in to your user account via Keycloak
  2. Based on your current role in the organization, you are granted access to a set of components (RBAC)
  3. In the Coreai Portal you have access to your personal workspace; in the other components' UIs you can access authorized content based on your role
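Under the hood, logging in via Keycloak follows the standard OpenID Connect flow. As a rough sketch of what a CLI or script would do, the snippet below builds a request to Keycloak's per-realm token endpoint; the hostname, realm and client ID are hypothetical placeholders, not values defined by this documentation.

```python
from urllib.parse import urlencode

# Placeholder values -- substitute your deployment's hostname, realm
# and client ID (none of these are defined by this documentation).
KEYCLOAK_URL = "https://keycloak.example.com"
REALM = "coreai"
CLIENT_ID = "coreai-portal"

# Keycloak exposes a standard OIDC token endpoint per realm.
token_endpoint = f"{KEYCLOAK_URL}/realms/{REALM}/protocol/openid-connect/token"

# Resource Owner Password Credentials grant, form-encoded.
payload = urlencode({
    "grant_type": "password",
    "client_id": CLIENT_ID,
    "username": "alice",
    "password": "s3cret",
})

# POST `payload` to `token_endpoint` with any HTTP client; the JSON
# response contains an `access_token` carrying your realm roles,
# which downstream components use for RBAC decisions.
print(token_endpoint)
```

The access token is then sent as a `Bearer` header to the components that accept API calls.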

📦 Working with Data

  1. Trigger manually or schedule data ingestion using Airbyte, or run your own custom data pipeline orchestrated by Argo Workflows
  2. Your data is stored in MinIO, ClickHouse or Milvus depending on the type of data and your use case
  3. The datasets are available for browsing and for creating charts and dashboards with Superset
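For step 1, a custom pipeline is just an ordinary Argo Workflows manifest. The sketch below builds a minimal one as a plain dict; the container image and command are hypothetical placeholders for your own ingestion job.

```python
import json

# Minimal sketch of an Argo Workflows manifest for a custom ingestion
# pipeline. The image and command are placeholders, not platform APIs.
workflow = {
    "apiVersion": "argoproj.io/v1alpha1",
    "kind": "Workflow",
    "metadata": {"generateName": "ingest-"},
    "spec": {
        "entrypoint": "ingest",
        "templates": [
            {
                "name": "ingest",
                "container": {
                    "image": "registry.example.com/ingest-job:latest",
                    "command": ["python", "ingest.py"],
                },
            }
        ],
    },
}

# Serialize and submit with `argo submit` or the Argo Workflows API.
print(json.dumps(workflow, indent=2))
```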

โš™๏ธ Running Workflows

  1. Predefined events (for example a file upload or deletion) can be defined to trigger workflows using Argo Events
  2. Trigger manually or schedule your data/AI workflows with Argo Workflows
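For the scheduled case in step 2, Argo provides a `CronWorkflow` resource that wraps an ordinary workflow spec with a cron schedule. A hedged sketch, with a placeholder image and script name:

```python
import json

# Sketch of an Argo CronWorkflow: `spec.schedule` is a cron expression
# and `spec.workflowSpec` is a regular workflow spec. Image and script
# names are placeholders.
cron_workflow = {
    "apiVersion": "argoproj.io/v1alpha1",
    "kind": "CronWorkflow",
    "metadata": {"name": "nightly-etl"},
    "spec": {
        "schedule": "0 2 * * *",  # every night at 02:00
        "workflowSpec": {
            "entrypoint": "etl",
            "templates": [
                {
                    "name": "etl",
                    "container": {
                        "image": "registry.example.com/etl-job:latest",
                        "command": ["python", "etl.py"],
                    },
                }
            ],
        },
    },
}
print(json.dumps(cron_workflow, indent=2))
```

Event-driven triggers (step 1) are configured separately through Argo Events sensors, which submit a workflow like this one when their event source fires.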

📊 Monitoring & Observability

  1. Monitor your pipelines and platform usage with Grafana dashboards (Prometheus & Loki)
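Grafana dashboards are backed by PromQL queries, and you can also hit the Prometheus HTTP API directly. A sketch of building such a query; the service hostname is a placeholder for your deployment's Prometheus endpoint.

```python
from urllib.parse import urlencode

# Placeholder hostname -- substitute your deployment's Prometheus service.
PROMETHEUS_URL = "http://prometheus.monitoring.svc:9090"

# PromQL: per-pod CPU usage rate over the last 5 minutes.
query = "sum(rate(container_cpu_usage_seconds_total[5m])) by (pod)"

# Prometheus exposes instant queries at /api/v1/query.
url = f"{PROMETHEUS_URL}/api/v1/query?{urlencode({'query': query})}"

# GET `url` with any HTTP client; the JSON response carries the series
# under `data.result`.
print(url)
```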

🧪 ML & AI Workflows

  1. Deploy and serve models with KubeAI
  2. Monitor models with Langfuse
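KubeAI serves models declaratively through a `Model` custom resource. The sketch below builds one as a dict; the field names follow KubeAI's Model CRD as we understand it and should be checked against the KubeAI version you deploy, and the model name, URL and resource profile are placeholders.

```python
import json

# Hedged sketch of a KubeAI `Model` custom resource. Verify field
# names against your KubeAI version; all values are placeholders.
model = {
    "apiVersion": "kubeai.org/v1",
    "kind": "Model",
    "metadata": {"name": "llama-3.1-8b-instruct"},
    "spec": {
        "features": ["TextGeneration"],
        "url": "hf://meta-llama/Llama-3.1-8B-Instruct",
        "engine": "VLLM",
        "resourceProfile": "nvidia-gpu-l4:1",
    },
}
print(json.dumps(model, indent=2))
```

Once applied, KubeAI exposes the model behind an OpenAI-compatible endpoint, and inference traffic can then be traced in Langfuse.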

🆘 Support & Troubleshooting

  • You'll find answers to the most common questions and issues in the FAQ section of this documentation.
  • If you encounter a bug, you're more than welcome to report it through one of our contact points.
  • If you require further assistance, our dedicated team will help you.
