coreai-llm-backend

Description

coreai-llm-backend is a backend service for managing and serving large language models (LLMs). It provides APIs for model inference, management, and integration with other systems. The backend is designed for scalable, secure, and efficient LLM operations in enterprise environments.

Usage and Functionality

Integration Method

To enable coreai-llm-backend, add the following block to your main configuration file:

coreai_llm_backend = {
  enabled      = true
  version      = "latest"
  namespace    = "coreai-llm-backend"
  component    = "dp-components"
  release_name = "coreai-llm-backend"
  ingress      = true
  url_prefix   = "coreai-llm-backend"
}

Note: coreai-llm-backend provides RESTful APIs for LLM inference and model management. Refer to the Swagger documentation for endpoint details.
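As a hedged illustration of calling the REST API, the sketch below composes an inference request against the service exposed under the configured url_prefix. The host name, the endpoint path /v1/inference, and the payload fields are assumptions for illustration only; the actual paths and schemas are defined in the Swagger documentation.

```python
import json

# Assumption: the ingress exposes the service under the configured url_prefix.
# Replace the host with your cluster's ingress address.
BASE_URL = "https://your-cluster.example.com/coreai-llm-backend"


def build_inference_request(prompt: str, model: str = "default", max_tokens: int = 256):
    """Compose a hypothetical inference request.

    The endpoint path and payload fields below are assumptions;
    consult the Swagger documentation for the real API contract.
    """
    url = f"{BASE_URL}/v1/inference"  # assumed path, not confirmed by the docs
    payload = {"model": model, "prompt": prompt, "max_tokens": max_tokens}
    return url, json.dumps(payload)


url, body = build_inference_request("Summarize this document.")
print(url)
```

The request body can then be sent with any HTTP client (for example, Python's urllib or the requests library), with authentication headers as required by your deployment.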

API / Swagger

API documentation

Releases

Date | Version | Chart | Description

Official documentation

coreai-llm-backend documentation