mlflow_assistant.core
Core functionality for MLflow Assistant.
This subpackage contains the core modules for managing connections, workflows, and interactions with the MLflow Tracking Server.
cli
Command-line interface (CLI) for MLflow Assistant.
This module provides the CLI entry points for interacting with the MLflow Assistant, allowing users to manage connections, workflows, and other operations via the command line.
connection
MLflow connection module for handling connections to MLflow Tracking Server.
This module provides functionality to connect to both local and remote MLflow Tracking Servers using environment variables or direct configuration.
MLflowConnection(tracking_uri=None, client_factory=None)
MLflow connection class to handle connections to MLflow Tracking Server.
This class provides functionality to connect to both local and remote MLflow Tracking Servers.
Initialize MLflow connection.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| tracking_uri | str \| None | URI of the MLflow Tracking Server. If None, the URI is read from the environment. | None |
| client_factory | Any | Callable used to create the MlflowClient instance. Defaults to MlflowClient. | None |
Source code in src/mlflow_assistant/core/connection.py
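As a rough sketch of the constructor contract described above (explicit argument first, then the environment; an injectable factory in place of MlflowClient). ConnectionSketch and DummyClient are hypothetical stand-ins so the example runs without MLflow installed; MLFLOW_TRACKING_URI is MLflow's standard environment variable.

```python
import os
from typing import Any, Callable, Optional


class DummyClient:
    """Stand-in for mlflow.MlflowClient so the sketch runs without MLflow."""

    def __init__(self, tracking_uri: Optional[str] = None):
        self.tracking_uri = tracking_uri


class ConnectionSketch:
    """Hypothetical mirror of the documented constructor behaviour."""

    def __init__(self, tracking_uri: Optional[str] = None,
                 client_factory: Optional[Callable[..., Any]] = None):
        # An explicit argument wins; otherwise fall back to the environment.
        self.tracking_uri = tracking_uri or os.environ.get("MLFLOW_TRACKING_URI")
        # An injectable factory keeps the class testable without a live server.
        self.client_factory = client_factory or DummyClient


conn = ConnectionSketch(tracking_uri="http://localhost:5000")
print(conn.tracking_uri)  # http://localhost:5000
```

Injecting client_factory like this is what lets tests substitute a fake client for a real MlflowClient.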
connect()
Connect to MLflow Tracking Server.
Returns:
Tuple[bool, str]: (success, message)
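The (success, message) return contract can be sketched as follows. ConnectSketch and FakeClient are illustrative stand-ins, not the actual implementation; a real connect() would also verify that the server is reachable.

```python
from typing import Tuple


class ConnectSketch:
    """Hypothetical illustration of connect()'s (success, message) contract."""

    def __init__(self, tracking_uri: str, client_factory):
        self.tracking_uri = tracking_uri
        self._factory = client_factory
        self.client = None

    def connect(self) -> Tuple[bool, str]:
        try:
            self.client = self._factory(tracking_uri=self.tracking_uri)
            return True, f"Connected to MLflow at {self.tracking_uri}"
        except Exception as exc:  # surface any factory failure as a message
            return False, f"Connection failed: {exc}"


class FakeClient:
    """Stand-in client so the sketch runs without a tracking server."""

    def __init__(self, tracking_uri=None):
        self.tracking_uri = tracking_uri


ok, msg = ConnectSketch("http://localhost:5000", FakeClient).connect()
print(ok, msg)  # True Connected to MLflow at http://localhost:5000
```

Returning a tuple instead of raising lets CLI callers report failures without wrapping every call in try/except.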
get_client()
Get MLflow client instance.
Returns:
MlflowClient: MLflow client instance.
Raises:
MLflowConnectionError: If not connected to MLflow Tracking Server.
get_connection_info()
Get connection information.
Returns:
Dict[str, Any]: Connection information.
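A minimal sketch of the get_client()/get_connection_info() contracts described above. MLflowConnectionError matches the documented exception name; ClientAccessSketch and the keys of the returned dict are illustrative assumptions, not the actual implementation.

```python
from typing import Any, Dict, Optional


class MLflowConnectionError(Exception):
    """Mirrors the documented exception raised before a connection exists."""


class ClientAccessSketch:
    """Hypothetical sketch of the client-access contract."""

    def __init__(self, tracking_uri: str):
        self.tracking_uri = tracking_uri
        self._client: Optional[object] = None

    def connect(self) -> None:
        self._client = object()  # stand-in for a real MlflowClient

    def get_client(self):
        # Guard: accessing the client before connect() is a usage error.
        if self._client is None:
            raise MLflowConnectionError("Not connected to MLflow Tracking Server")
        return self._client

    def get_connection_info(self) -> Dict[str, Any]:
        return {
            "tracking_uri": self.tracking_uri,
            "connected": self._client is not None,
        }


conn = ClientAccessSketch("http://localhost:5000")
print(conn.get_connection_info()["connected"])  # False
```

Raising a dedicated MLflowConnectionError (rather than returning None) makes the connect-before-use requirement explicit at the call site.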
core
Core utilities and functionality for MLflow Assistant.
This module provides foundational classes, functions, and utilities used across the MLflow Assistant project, including shared logic for managing workflows and interactions with the MLflow Tracking Server.
provider
Provider integrations for MLflow Assistant.
This module defines the interfaces and implementations for integrating with various large language model (LLM) providers, such as OpenAI and Ollama.
workflow
Workflow management for LangGraph in MLflow Assistant.
This module provides functionality for defining, managing, and executing LangGraph workflows, and integrates them with MLflow for tracking and managing machine learning workflows.