Foundation Models API

LSFoundationModel

LSFoundationModel(
    client: LlamaStackClient,
    model_id: str,
    params: dict[str, Any] | LSModelParameters | None = None,
    system_message_text: str | None = None,
    user_message_text: str | None = None,
    context_template_text: str | None = None,
)

Bases: BaseFoundationModel[LlamaStackClient, dict[str, Any] | LSModelParameters | None]

Integration point for using any model exposed through the Llama Stack API / client.

Source code in ai4rag/rag/foundation_models/llama_stack.py
def __init__(
    self,
    client: LlamaStackClient,
    model_id: str,
    params: dict[str, Any] | LSModelParameters | None = None,
    system_message_text: str | None = None,
    user_message_text: str | None = None,
    context_template_text: str | None = None,
):

    super().__init__(
        client=client,
        model_id=model_id,
        params=params,
        system_message_text=system_message_text,
        user_message_text=user_message_text,
        context_template_text=context_template_text,
    )

Attributes

params property writable

params: LSModelParameters

Get the model's parameters.
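The `params` argument accepts either a plain dict or an `LSModelParameters` instance. A plain-dict sketch with illustrative values; the two field names shown are the attributes the `chat` implementation below actually reads:

```python
# Plain-dict form of the model parameters; the wrapper also accepts an
# LSModelParameters instance. The field names mirror the two attributes
# read in chat(): max_completion_tokens and temperature.
params = {
    "max_completion_tokens": 512,  # illustrative cap on generated tokens
    "temperature": 0.1,            # illustrative; lower = more deterministic output
}
```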

Functions

chat

chat(messages: list[MessageTyped]) -> list[MessageTyped]

Chat completion for communication with selected foundation model.

Parameters:

  • messages (list[MessageTyped]) –

    Messages to be included in the chat completion.

Returns:

  • list[MessageTyped] –

    Chat response messages from the model.

Source code in ai4rag/rag/foundation_models/llama_stack.py
def chat(self, messages: list[MessageTyped]) -> list[MessageTyped]:
    """
    Chat completion for communication with selected foundation model.

    Parameters
    ----------
    messages : list[MessageTyped]
        Messages to be included in the chat completion.

    Returns
    -------
    list[MessageTyped]
        Chat response messages from the model.
    """
    response_chat = self.client.chat.completions.create(
        model=self.model_id,
        messages=messages,
        max_completion_tokens=self.params.max_completion_tokens,
        temperature=self.params.temperature,
    )
    # Each choice wraps a generated message; return the messages so the
    # result matches the declared list[MessageTyped] return type.
    return [choice.message for choice in response_chat.choices]
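Since `chat` delegates to the OpenAI-compatible `client.chat.completions.create` endpoint, callers pass messages as role/content pairs and get the response choices back. A sketch of building the input and unwrapping the output; the `first_text` helper is illustrative (not part of this API), and the attribute access assumes the OpenAI-style response schema:

```python
# OpenAI-compatible chat messages, as accepted by chat().
messages = [
    {"role": "system", "content": "Answer concisely."},
    {"role": "user", "content": "What is retrieval-augmented generation?"},
]

def first_text(choices):
    """Return the assistant text of the first choice, or None if empty.

    Illustrative helper: each element of `choices` is assumed to carry a
    `.message.content` attribute, per the OpenAI-style response schema.
    """
    if not choices:
        return None
    return choices[0].message.content
```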