mlflow_assistant.providers
Provider module for MLflow Assistant.
AIProvider
Bases: ABC
Abstract base class for AI providers.
langchain_model (abstract property)
Get the underlying LangChain model.
__init_subclass__(**kwargs)
Auto-register provider subclasses.
Source code in src/mlflow_assistant/providers/base.py
create(config) (classmethod)
Create an AI provider based on configuration.
Source code in src/mlflow_assistant/providers/base.py
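For example, a minimal sketch of the factory in use. The import path follows this page's module layout, and the configuration keys below ("provider", "model", "temperature") are assumptions, not a documented schema:

```python
# Minimal sketch, assuming the package re-exports AIProvider and that the
# config dict uses "provider"/"model"/"temperature" keys (not confirmed here).
from mlflow_assistant.providers import AIProvider

config = {
    "provider": "ollama",   # assumed key naming a registered provider
    "model": "llama3",      # assumed model identifier
    "temperature": 0.0,     # assumed sampling parameter
}

provider = AIProvider.create(config)  # documented classmethod factory
llm = provider.langchain_model        # documented property: the LangChain model
```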
get_ollama_models(uri=DEFAULT_OLLAMA_URI)
Fetch the list of available Ollama models.
Source code in src/mlflow_assistant/providers/utilities.py
verify_ollama_running(uri=DEFAULT_OLLAMA_URI)
Verify if Ollama is running at the given URI.
Source code in src/mlflow_assistant/providers/utilities.py
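A hedged usage sketch for these two utilities; the return types (a truthy status and a list of model names) are inferred from the names rather than documented here:

```python
# Sketch: both calls fall back to DEFAULT_OLLAMA_URI when no URI is passed.
from mlflow_assistant.providers.utilities import (
    get_ollama_models,
    verify_ollama_running,
)

if verify_ollama_running():                          # check the default Ollama URI
    print("Available models:", get_ollama_models())
else:
    print("Ollama is not reachable at the default URI")
```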
base
Base class for AI providers.
AIProvider
Bases: ABC
Abstract base class for AI providers.
langchain_model (abstract property)
Get the underlying LangChain model.
__init_subclass__(**kwargs)
Auto-register provider subclasses.
Source code in src/mlflow_assistant/providers/base.py
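Because of this hook, a concrete provider registers itself simply by subclassing AIProvider. A rough, hypothetical sketch (the registry internals and any required class attributes are not shown on this page):

```python
# Hypothetical provider used only to illustrate the subclass hook; the actual
# registration logic lives in AIProvider.__init_subclass__ and is not
# documented on this page.
from mlflow_assistant.providers.base import AIProvider


class MyProvider(AIProvider):
    """Toy provider: defining the class triggers auto-registration."""

    def __init__(self, model=None, **kwargs):
        self._model = model

    @property
    def langchain_model(self):
        # A real provider would build and return a LangChain model here.
        raise NotImplementedError("illustrative sketch only")
```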
create(config) (classmethod)
Create an AI provider based on configuration.
Source code in src/mlflow_assistant/providers/base.py
databricks_provider
Databricks provider for MLflow Assistant.
DatabricksProvider(model=None, temperature=None, **kwargs)
Bases: AIProvider
Databricks provider implementation.
Initialize the Databricks provider with model.
Source code in src/mlflow_assistant/providers/databricks_provider.py
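A minimal sketch of constructing the provider; the endpoint name is a placeholder and workspace credentials are assumed to come from the environment:

```python
# Sketch: arguments follow the signature above; the model name is hypothetical.
from mlflow_assistant.providers.databricks_provider import DatabricksProvider

provider = DatabricksProvider(
    model="databricks-meta-llama-3-70b-instruct",  # placeholder serving endpoint
    temperature=0.1,
)
llm = provider.langchain_model  # LangChain model from the AIProvider interface
```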
definitions
Constants for the MLflow Assistant providers.
ParameterKeys
Keys and default parameter groupings for supported providers.
get_parameters(provider) (classmethod)
Return the list of parameters for the given provider name.
Source code in src/mlflow_assistant/providers/definitions.py
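For instance, a hedged call; the provider name "openai" is an assumption about the accepted values:

```python
# Sketch: only the classmethod signature get_parameters(provider) is documented.
from mlflow_assistant.providers.definitions import ParameterKeys

params = ParameterKeys.get_parameters("openai")  # assumed provider name
print(params)                                    # parameter keys for that provider
```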
ollama_provider
Ollama provider for MLflow Assistant.
OllamaProvider(uri=None, model=None, temperature=None, **kwargs)
Bases: AIProvider
Ollama provider implementation.
Initialize the Ollama provider with URI and model.
Source code in src/mlflow_assistant/providers/ollama_provider.py
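A minimal sketch matching the signature above; the URI and model name are placeholders:

```python
# Sketch: a local Ollama endpoint and a model name reported by get_ollama_models().
from mlflow_assistant.providers.ollama_provider import OllamaProvider

provider = OllamaProvider(
    uri="http://localhost:11434",  # assumed local Ollama URI
    model="llama3",                # placeholder model name
    temperature=0.2,
)
llm = provider.langchain_model
```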
openai_provider
OpenAI provider for MLflow Assistant.
OpenAIProvider(api_key=None, model=OpenAIModel.GPT35.value, temperature=None, **kwargs)
Bases: AIProvider
OpenAI provider implementation.
Initialize the OpenAI provider with API key and model.
Source code in src/mlflow_assistant/providers/openai_provider.py
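A minimal sketch matching the signature above; the key is read from the environment and the model is left at the documented default (OpenAIModel.GPT35.value):

```python
# Sketch: api_key comes from the environment; model stays at its default value.
import os

from mlflow_assistant.providers.openai_provider import OpenAIProvider

provider = OpenAIProvider(
    api_key=os.environ.get("OPENAI_API_KEY"),
    temperature=0.0,
)
llm = provider.langchain_model
```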
utilities
Providers utilities.
get_ollama_models(uri=DEFAULT_OLLAMA_URI)
Fetch the list of available Ollama models.
Source code in src/mlflow_assistant/providers/utilities.py
verify_ollama_running(uri=DEFAULT_OLLAMA_URI)
Verify if Ollama is running at the given URI.