mlflow_assistant
¶
MLflow Assistant: Interact with MLflow using LLMs.
cli
¶
CLI modules for MLflow Assistant.
commands
¶
CLI commands for MLflow Assistant.
This module contains the main CLI commands for interacting with MLflow using natural language queries through various AI providers.
cli(verbose)
¶
MLflow Assistant: Interact with MLflow using LLMs.
This CLI tool helps you interact with MLflow using natural language.
Source code in src/mlflow_assistant/cli/commands.py
mock_process_query(query, provider_config, verbose=False)
¶
Mock function that simulates the query processing workflow.
This will be replaced with the actual implementation later.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
query | str | The user's query | required |
provider_config | dict[str, Any] | The AI provider configuration | required |
verbose | bool | Whether to show verbose output | False |

Returns:

Type | Description |
---|---|
dict[str, Any] | Dictionary with mock response information |
Source code in src/mlflow_assistant/cli/commands.py
setup()
¶
start(verbose)
¶
Start an interactive chat session with MLflow Assistant.
This opens an interactive chat session where you can ask questions about your MLflow experiments, models, and data. Type /bye to exit the session.
Examples of questions you can ask:

- What are my best performing models for classification?
- Show me details of experiment 'customer_churn'
- Compare runs abc123 and def456
- Which hyperparameters should I try next for my regression model?
Commands:

- /bye: Exit the chat session
- /help: Show help about available commands
- /clear: Clear the screen
Source code in src/mlflow_assistant/cli/commands.py
version()
¶
Show MLflow Assistant version information.
Source code in src/mlflow_assistant/cli/commands.py
setup
¶
Setup wizard for MLflow Assistant configuration.
This module provides an interactive setup wizard that guides users through configuring MLflow Assistant, including MLflow connection settings and AI provider configuration (OpenAI or Ollama).
setup_wizard()
¶
Interactive setup wizard for mlflow-assistant.
Source code in src/mlflow_assistant/cli/setup.py
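Example, a minimal sketch (the import path follows the `Source code` location above; interactively, the wizard is presumably reached through the CLI's `setup` command):

```python
# Minimal sketch: launch the interactive wizard from Python.
from mlflow_assistant.cli.setup import setup_wizard

setup_wizard()  # prompts for MLflow connection and AI provider settings
```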
validation
¶
Validation utilities for MLflow Assistant configuration.
This module provides validation functions to check MLflow connections, AI provider configurations, and overall system setup to ensure proper operation of MLflow Assistant.
validate_mlflow_uri(uri)
¶
Validate MLflow URI by attempting to connect.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
uri | str | MLflow server URI | required |

Returns:

Name | Type | Description |
---|---|---|
bool | bool | True if connection successful, False otherwise |
Source code in src/mlflow_assistant/cli/validation.py
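Example, a minimal sketch using the documented signature:

```python
# Sketch: check that an MLflow server is reachable before saving its URI.
from mlflow_assistant.cli.validation import validate_mlflow_uri

uri = "http://localhost:5000"  # hypothetical local tracking server
if validate_mlflow_uri(uri):
    print(f"Connected to MLflow at {uri}")
else:
    print(f"Could not reach MLflow at {uri}")
```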
validate_ollama_connection(uri)
¶
Validate Ollama connection and get available models.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
uri | str | Ollama server URI | required |

Returns:

Type | Description |
---|---|
tuple[bool, dict[str, Any]] | Tuple[bool, Dict[str, Any]]: (is_valid, response_data) |
Source code in src/mlflow_assistant/cli/validation.py
validate_setup(check_api_key=True)
¶
Validate that MLflow Assistant is properly configured.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
check_api_key | bool | Whether to check for API key if using OpenAI | True |

Returns:

Type | Description |
---|---|
tuple[bool, str] | Tuple[bool, str]: (is_valid, error_message) |
Source code in src/mlflow_assistant/cli/validation.py
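Example, a minimal sketch of gating a session on a valid setup:

```python
# Sketch: refuse to start if MLflow Assistant is not fully configured.
from mlflow_assistant.cli.validation import validate_setup

is_valid, error_message = validate_setup(check_api_key=True)
if not is_valid:
    raise SystemExit(f"MLflow Assistant is not configured: {error_message}")
```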
core
¶
Core functionality for MLflow Assistant.
This subpackage contains the core modules for managing connections, workflows, and interactions with the MLflow Tracking Server.
cli
¶
Command-line interface (CLI) for MLflow Assistant.
This module provides the CLI entry points for interacting with the MLflow Assistant, allowing users to manage connections, workflows, and other operations via the command line.
connection
¶
MLflow connection module for handling connections to MLflow Tracking Server.
This module provides functionality to connect to both local and remote MLflow Tracking Servers using environment variables or direct configuration.
MLflowConnection(tracking_uri=None, client_factory=None)
¶
MLflow connection class to handle connections to MLflow Tracking Server.
This class provides functionality to connect to both local and remote MLflow Tracking Servers.
Initialize MLflow connection.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
tracking_uri | str \| None | URI of the MLflow Tracking Server. If None, will try to get from environment. | None |
client_factory | Any | A callable to create the MlflowClient instance. Defaults to MlflowClient. | None |
Source code in src/mlflow_assistant/core/connection.py
connect()
¶
Connect to MLflow Tracking Server.
Returns¶
Tuple[bool, str]: (success, message)
Source code in src/mlflow_assistant/core/connection.py
get_client()
¶
Get MLflow client instance.
Returns¶
MlflowClient: MLflow client instance.
Raises¶
MLflowConnectionError: If not connected to MLflow Tracking Server.
Source code in src/mlflow_assistant/core/connection.py
get_connection_info()
¶
Get connection information.
Returns¶
Dict[str, Any]: Connection information.
Source code in src/mlflow_assistant/core/connection.py
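Example, a minimal sketch tying the three methods together:

```python
# Sketch: connect, fetch a client, and inspect the connection.
from mlflow_assistant.core.connection import MLflowConnection

conn = MLflowConnection(tracking_uri="http://localhost:5000")
success, message = conn.connect()
if success:
    client = conn.get_client()           # MlflowClient instance
    print(conn.get_connection_info())    # connection details as a dict
else:
    print(f"Connection failed: {message}")
```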
core
¶
Core utilities and functionality for MLflow Assistant.
This module provides foundational classes, functions, and utilities used across the MLflow Assistant project, including shared logic for managing workflows and interactions with the MLflow Tracking Server.
provider
¶
Provider integrations for MLflow Assistant.
This module defines the interfaces and implementations for integrating with various large language model (LLM) providers, such as OpenAI and Ollama.
workflow
¶
Workflow management for LangGraph in MLflow Assistant.
This module provides functionality for defining, managing, and executing workflows using LangGraph, enabling seamless integration with MLflow for tracking and managing machine learning workflows.
engine
¶
MLflow Assistant Engine: provides the workflow functionality used to process user queries.
definitions
¶
Constants for the MLflow Assistant engine.
processor
¶
Query processor that leverages the workflow engine for processing user queries and generating responses using an AI provider.
process_query(query, provider_config, verbose=False)
async
¶
Process a query through the MLflow Assistant workflow.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
query | str | The query to process | required |
provider_config | dict[str, Any] | AI provider configuration | required |
verbose | bool | Whether to show verbose output | False |

Returns:

Type | Description |
---|---|
dict[str, Any] | Dict containing the response |
Source code in src/mlflow_assistant/engine/processor.py
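Example, a minimal sketch of driving the async workflow from synchronous code (the provider-config keys shown are assumptions, not documented here):

```python
# Sketch: run the workflow once with asyncio.run.
import asyncio

from mlflow_assistant.engine.processor import process_query

provider_config = {"type": "ollama", "model": "llama3"}  # hypothetical keys
result = asyncio.run(
    process_query(
        "Which of my experiments has the lowest validation loss?",
        provider_config,
        verbose=True,
    )
)
print(result)  # dict containing the response
```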
tools
¶
LangGraph tools for MLflow interactions.
MLflowTools
¶
Collection of helper utilities for MLflow interactions.
format_timestamp(timestamp_ms)
staticmethod
¶
Convert a millisecond timestamp to a human-readable string.
Source code in src/mlflow_assistant/engine/tools.py
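Example, a minimal sketch (MLflow records run timestamps in milliseconds since the epoch):

```python
# Sketch: render a millisecond timestamp.
from mlflow_assistant.engine.tools import MLflowTools

print(MLflowTools.format_timestamp(1700000000000))  # human-readable date string
```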
get_model_details(model_name)
¶
Get detailed information about a specific registered model.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
model_name | str | The name of the registered model | required |

Returns:

Type | Description |
---|---|
str | A JSON string containing detailed information about the model. |
Source code in src/mlflow_assistant/engine/tools.py
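Example, a minimal sketch of consuming the JSON payload. This assumes the tool is importable as a plain callable; if it is wrapped as a LangGraph tool object, it would be invoked through the tool interface instead:

```python
# Sketch: parse the JSON string returned by the tool.
import json

from mlflow_assistant.engine.tools import get_model_details

details = json.loads(get_model_details("my-classifier"))  # hypothetical model name
print(details)
```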
get_system_info()
¶
Get information about the MLflow tracking server and system.
Returns:

Type | Description |
---|---|
str | A JSON string containing system information. |
Source code in src/mlflow_assistant/engine/tools.py
list_experiments(name_contains='', max_results=MLFLOW_MAX_RESULTS)
¶
List all experiments in the MLflow tracking server, with optional filtering.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
name_contains | str | Optional filter to only include experiments whose names contain this string | '' |
max_results | int | Maximum number of results to return (default: 100) | MLFLOW_MAX_RESULTS |

Returns:

Type | Description |
---|---|
str | A JSON string containing all experiments matching the criteria. |
Source code in src/mlflow_assistant/engine/tools.py
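Example, a minimal sketch with the same direct-call caveat as above:

```python
# Sketch: filter experiments by substring and parse the JSON result.
import json

from mlflow_assistant.engine.tools import list_experiments

experiments = json.loads(list_experiments(name_contains="churn", max_results=10))
print(experiments)
```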
list_models(name_contains='', max_results=MLFLOW_MAX_RESULTS)
¶
List all registered models in the MLflow model registry, with optional filtering.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
name_contains | str | Optional filter to only include models whose names contain this string | '' |
max_results | int | Maximum number of results to return (default: 100) | MLFLOW_MAX_RESULTS |

Returns:

Type | Description |
---|---|
str | A JSON string containing all registered models matching the criteria. |
Source code in src/mlflow_assistant/engine/tools.py
workflow
¶
Core LangGraph-based workflow engine for processing user queries and generating responses using an AI provider.
This workflow supports tool-augmented generation: tool calls are detected and executed in a loop until a final AI response is produced.
State
¶
Bases: TypedDict
State schema for the workflow engine.
create_workflow()
¶
Create and return a compiled LangGraph workflow.
Source code in src/mlflow_assistant/engine/workflow.py
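Example, a minimal sketch; the `State` keys are not documented here, so the input below is only a placeholder:

```python
# Sketch: compile the graph once and reuse it per query.
from mlflow_assistant.engine.workflow import create_workflow

workflow = create_workflow()  # compiled LangGraph graph
# result = workflow.invoke({...})  # input keys must match the State schema
```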
main
¶
Main entry point for executing the MLflow Assistant package directly.
providers
¶
Provider module for MLflow Assistant.
AIProvider
¶
Bases: ABC
Abstract base class for AI providers.
langchain_model
abstractmethod
property
¶
Get the underlying LangChain model.
__init_subclass__(**kwargs)
¶
Auto-register provider subclasses.
Source code in src/mlflow_assistant/providers/base.py
create(config)
classmethod
¶
Create an AI provider based on configuration.
Source code in src/mlflow_assistant/providers/base.py
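Example, a minimal sketch of the factory; the package-level import and the config keys are assumptions:

```python
# Sketch: build a provider from a configuration dictionary.
from mlflow_assistant.providers import AIProvider

provider = AIProvider.create({"type": "openai", "model": "gpt-4"})  # hypothetical keys
model = provider.langchain_model  # underlying LangChain model
```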
get_ollama_models(uri=DEFAULT_OLLAMA_URI)
¶
Fetch the list of available Ollama models.
Source code in src/mlflow_assistant/providers/utilities.py
verify_ollama_running(uri=DEFAULT_OLLAMA_URI)
¶
Verify if Ollama is running at the given URI.
Source code in src/mlflow_assistant/providers/utilities.py
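Example, a minimal sketch (the URI shown is Ollama's conventional default, assumed here):

```python
# Sketch: probe a local Ollama server before configuring it.
from mlflow_assistant.providers.utilities import (
    get_ollama_models,
    verify_ollama_running,
)

uri = "http://localhost:11434"  # conventional Ollama default, an assumption
if verify_ollama_running(uri):  # assumed truthy when the server is reachable
    print(get_ollama_models(uri))
```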
base
¶
Base class for AI providers.
AIProvider
¶
Bases: ABC
Abstract base class for AI providers.
langchain_model
abstractmethod
property
¶
Get the underlying LangChain model.
__init_subclass__(**kwargs)
¶
Auto-register provider subclasses.
Source code in src/mlflow_assistant/providers/base.py
create(config)
classmethod
¶
Create an AI provider based on configuration.
Source code in src/mlflow_assistant/providers/base.py
databricks_provider
¶
Databricks provider for MLflow Assistant.
DatabricksProvider(model=None, temperature=None, **kwargs)
¶
Bases: AIProvider
Databricks provider implementation.
Initialize the Databricks provider with model.
Source code in src/mlflow_assistant/providers/databricks_provider.py
definitions
¶
Constants for the MLflow Assistant providers.
ParameterKeys
¶
Keys and default parameter groupings for supported providers.
get_parameters(provider)
classmethod
¶
Return the list of parameters for the given provider name.
Source code in src/mlflow_assistant/providers/definitions.py
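Example, a minimal sketch; the provider-name string passed in is an assumption:

```python
# Sketch: look up the parameter names a provider accepts.
from mlflow_assistant.providers.definitions import ParameterKeys

print(ParameterKeys.get_parameters("openai"))  # hypothetical provider name
```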
ollama_provider
¶
Ollama provider for MLflow Assistant.
OllamaProvider(uri=None, model=None, temperature=None, **kwargs)
¶
Bases: AIProvider
Ollama provider implementation.
Initialize the Ollama provider with URI and model.
Source code in src/mlflow_assistant/providers/ollama_provider.py
openai_provider
¶
OpenAI provider for MLflow Assistant.
OpenAIProvider(api_key=None, model=OpenAIModel.GPT35.value, temperature=None, **kwargs)
¶
Bases: AIProvider
OpenAI provider implementation.
Initialize the OpenAI provider with API key and model.
Source code in src/mlflow_assistant/providers/openai_provider.py
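Example, a minimal sketch that reads the key from the environment rather than hard-coding it:

```python
# Sketch: construct the provider directly.
import os

from mlflow_assistant.providers.openai_provider import OpenAIProvider

provider = OpenAIProvider(api_key=os.environ["OPENAI_API_KEY"], temperature=0.2)
chat_model = provider.langchain_model  # underlying LangChain model
```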
utilities
¶
Providers utilities.
get_ollama_models(uri=DEFAULT_OLLAMA_URI)
¶
Fetch the list of available Ollama models.
Source code in src/mlflow_assistant/providers/utilities.py
verify_ollama_running(uri=DEFAULT_OLLAMA_URI)
¶
Verify if Ollama is running at the given URI.
Source code in src/mlflow_assistant/providers/utilities.py
utils
¶
Utility modules for MLflow Assistant.
Command
¶
Bases: Enum
Special commands for interactive chat sessions.
description
property
¶
Get the description for a command.
OllamaModel
¶
Bases: Enum
Default Ollama models supported by MLflow Assistant.
OpenAIModel
¶
Bases: Enum
OpenAI models supported by MLflow Assistant.
Provider
¶
get_mlflow_uri()
¶
Get the MLflow URI from config or environment.
Returns:

Type | Description |
---|---|
str \| None | Optional[str]: The MLflow URI or None if not configured |
Source code in src/mlflow_assistant/utils/config.py
get_provider_config()
¶
Get the AI provider configuration.
Returns:

Type | Description |
---|---|
dict[str, Any] | Dict[str, Any]: The provider configuration |
Source code in src/mlflow_assistant/utils/config.py
load_config()
¶
Load configuration from file.
Source code in src/mlflow_assistant/utils/config.py
save_config(config)
¶
Save configuration to file.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
config | dict[str, Any] | Configuration dictionary to save | required |

Returns:

Name | Type | Description |
---|---|---|
bool | bool | True if successful, False otherwise |
Source code in src/mlflow_assistant/utils/config.py
config
¶
Configuration management utilities for MLflow Assistant.
This module provides functions for loading, saving, and accessing configuration settings for MLflow Assistant, including MLflow URI and AI provider settings. Configuration is stored in YAML format in the user's home directory.
ensure_config_dir()
¶
Ensure the configuration directory exists.
get_mlflow_uri()
¶
Get the MLflow URI from config or environment.
Returns:

Type | Description |
---|---|
str \| None | Optional[str]: The MLflow URI or None if not configured |
Source code in src/mlflow_assistant/utils/config.py
get_provider_config()
¶
Get the AI provider configuration.
Returns:

Type | Description |
---|---|
dict[str, Any] | Dict[str, Any]: The provider configuration |
Source code in src/mlflow_assistant/utils/config.py
load_config()
¶
Load configuration from file.
Source code in src/mlflow_assistant/utils/config.py
save_config(config)
¶
Save configuration to file.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
config | dict[str, Any] | Configuration dictionary to save | required |

Returns:

Name | Type | Description |
---|---|---|
bool | bool | True if successful, False otherwise |
Source code in src/mlflow_assistant/utils/config.py
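Example, a minimal round trip; the key name used inside the config dict is hypothetical:

```python
# Sketch: read, tweak, and persist the YAML-backed configuration.
from mlflow_assistant.utils.config import load_config, save_config

config = load_config()
config["mlflow_uri"] = "http://localhost:5000"  # hypothetical key name
if save_config(config):
    print("Configuration saved")
```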
constants
¶
Constants and enumerations for MLflow Assistant.
This module defines configuration keys, default values, API endpoints, model definitions, and other constants used throughout MLflow Assistant. It includes enums for AI providers (OpenAI, Ollama) and their supported models.
Command
¶
Bases: Enum
Special commands for interactive chat sessions.
description
property
¶
Get the description for a command.
DatabricksModel
¶
Bases: Enum
Databricks models supported by MLflow Assistant.
OllamaModel
¶
Bases: Enum
Default Ollama models supported by MLflow Assistant.
OpenAIModel
¶
Bases: Enum
OpenAI models supported by MLflow Assistant.