# `mlflow_assistant.utils`

Utility modules for MLflow Assistant.
## `Command`

Bases: `Enum`

Special commands for interactive chat sessions.

### `description` *(property)*

Get the description for a command.
## `OllamaModel`

Bases: `Enum`

Default Ollama models supported by MLflow Assistant.
## `OpenAIModel`

Bases: `Enum`

OpenAI models supported by MLflow Assistant.
## `Provider`

Bases: `Enum`

AI providers supported by MLflow Assistant.
## `get_mlflow_uri()`

Get the MLflow URI from config or environment.

Returns:

| Type | Description |
| --- | --- |
| `Optional[str]` | The MLflow URI, or `None` if not configured |

Source code in `src/mlflow_assistant/utils/config.py`
## `get_provider_config()`

Get the AI provider configuration.

Returns:

| Type | Description |
| --- | --- |
| `dict[str, Any]` | The provider configuration |

Source code in `src/mlflow_assistant/utils/config.py`
## `load_config()`

Load configuration from file.

Source code in `src/mlflow_assistant/utils/config.py`
## `save_config(config)`

Save configuration to file.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `config` | `dict[str, Any]` | Configuration dictionary to save | *required* |

Returns:

| Type | Description |
| --- | --- |
| `bool` | `True` if successful, `False` otherwise |

Source code in `src/mlflow_assistant/utils/config.py`
## `config`

Configuration management utilities for MLflow Assistant.

This module provides functions for loading, saving, and accessing configuration settings for MLflow Assistant, including the MLflow URI and AI provider settings. Configuration is stored in YAML format in the user's home directory.
### `ensure_config_dir()`

Ensure the configuration directory exists.
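The directory-creation contract is idempotent, which a minimal sketch can illustrate; the path passed in below is an arbitrary example, not the directory the package actually uses:

```python
from pathlib import Path

def ensure_config_dir_sketch(path: Path) -> None:
    """Minimal sketch: create the configuration directory if missing.
    exist_ok=True makes repeated calls safe (idempotent)."""
    path.mkdir(parents=True, exist_ok=True)
```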
### `get_mlflow_uri()`

Get the MLflow URI from config or environment.

Returns:

| Type | Description |
| --- | --- |
| `Optional[str]` | The MLflow URI, or `None` if not configured |

Source code in `src/mlflow_assistant/utils/config.py`
### `get_provider_config()`

Get the AI provider configuration.

Returns:

| Type | Description |
| --- | --- |
| `dict[str, Any]` | The provider configuration |

Source code in `src/mlflow_assistant/utils/config.py`
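A minimal sketch of how a provider section might be pulled out of the loaded configuration; the `provider` key is an assumption for illustration, not documented above:

```python
from typing import Any

def get_provider_config_sketch(config: dict[str, Any]) -> dict[str, Any]:
    """Sketch: return the provider section of the loaded config, or an
    empty dict when no provider has been configured."""
    return config.get("provider", {})
```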
### `load_config()`

Load configuration from file.

Source code in `src/mlflow_assistant/utils/config.py`
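Given that the module docs say configuration is stored as YAML, a load sketch might look like this, assuming a missing file should yield an empty config rather than an error (the exact behavior is not documented above):

```python
from pathlib import Path
from typing import Any

def load_config_sketch(path: Path) -> dict[str, Any]:
    """Sketch of the load contract: a missing file yields an empty
    config instead of raising; otherwise the YAML file is parsed."""
    if not path.exists():
        return {}
    import yaml  # PyYAML, since the config is stored as YAML
    return yaml.safe_load(path.read_text()) or {}
```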
### `save_config(config)`

Save configuration to file.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `config` | `dict[str, Any]` | Configuration dictionary to save | *required* |

Returns:

| Type | Description |
| --- | --- |
| `bool` | `True` if successful, `False` otherwise |

Source code in `src/mlflow_assistant/utils/config.py`
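The bool-returning save contract (report success rather than raise) can be sketched as below; JSON stands in for YAML here only to keep the example dependency-free:

```python
import json
from pathlib import Path
from typing import Any

def save_config_sketch(config: dict[str, Any], path: Path) -> bool:
    """Sketch of the save contract: write the config and report success
    as a bool instead of raising on I/O failure."""
    try:
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(json.dumps(config))
        return True
    except OSError:
        return False
```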
## `constants`

Constants and enumerations for MLflow Assistant.

This module defines configuration keys, default values, API endpoints, model definitions, and other constants used throughout MLflow Assistant. It includes enums for AI providers (OpenAI, Ollama) and their supported models.
### `Command`

Bases: `Enum`

Special commands for interactive chat sessions.

#### `description` *(property)*

Get the description for a command.
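One common way to attach a `description` property to an `Enum` of chat commands, sketched with hypothetical members (`HELP`, `EXIT`) and description texts that are not taken from the source:

```python
from enum import Enum

class CommandSketch(Enum):
    """Hypothetical sketch of a chat-command enum exposing a
    description property; members and texts are illustrative only."""
    HELP = "/help"
    EXIT = "/exit"

    @property
    def description(self) -> str:
        # Map each command to its human-readable description.
        return {
            CommandSketch.HELP: "Show available commands",
            CommandSketch.EXIT: "End the chat session",
        }[self]
```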
### `DatabricksModel`

Bases: `Enum`

Databricks models supported by MLflow Assistant.
### `OllamaModel`

Bases: `Enum`

Default Ollama models supported by MLflow Assistant.
### `OpenAIModel`

Bases: `Enum`

OpenAI models supported by MLflow Assistant.
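A value-backed `Enum` is the usual shape for model lists like these, since it lets callers both enumerate and validate model names; the members below are hypothetical, as the actual supported models are not listed above:

```python
from enum import Enum

class OpenAIModelSketch(Enum):
    """Hypothetical members; the actual models are assumptions."""
    GPT_4O = "gpt-4o"
    GPT_4O_MINI = "gpt-4o-mini"

# Enumerate the supported model names...
supported = [m.value for m in OpenAIModelSketch]
# ...or validate a user-supplied name via the value constructor.
chosen = OpenAIModelSketch("gpt-4o")
```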