genai.extensions.langchain package¶
Extension for the LangChain library
- class genai.extensions.langchain.LangChainChatInterface[source]¶
Bases:
BaseChatModel
Class representing the LangChainChatInterface for interacting with the LangChain chat API.
Example:
from genai import Client, Credentials
from genai.extensions.langchain import LangChainChatInterface
from langchain_core.messages import HumanMessage, SystemMessage
from genai.schema import TextGenerationParameters

client = Client(credentials=Credentials.from_env())
llm = LangChainChatInterface(
    client=client,
    model_id="meta-llama/llama-3-1-70b-instruct",
    parameters=TextGenerationParameters(
        max_new_tokens=250,
    ),
)

response = llm.generate(messages=[[HumanMessage(content="Hello world!")]])
print(response)
- conversation_id: str | None¶
- get_num_tokens(text)[source]¶
Get the number of tokens present in the text.
Useful for checking if an input will fit in a model’s context window.
- Parameters:
text (str) – The string input to tokenize.
- Returns:
The integer number of tokens in the text.
- Return type:
int
- get_num_tokens_from_messages(messages)[source]¶
Get the number of tokens in the messages.
Useful for checking if an input will fit in a model’s context window.
- Parameters:
messages (list[BaseMessage]) – The message inputs to tokenize.
- Returns:
The sum of the number of tokens across the messages.
- Return type:
int
- get_token_ids(text)[source]¶
Return the ordered ids of the tokens in a text.
- Parameters:
text (str) – The string input to tokenize.
- Returns:
A list of ids corresponding to the tokens in the text, in the order they occur in the text.
- Return type:
list[int]
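The token-counting helpers above share one idea: get_token_ids tokenizes the text and get_num_tokens is the length of that list, which can be checked against a model's context window. A minimal sketch of that usage follows; DummyModel is a hypothetical stand-in for LangChainChatInterface (which requires a Client and credentials), and its whitespace tokenizer is a placeholder for the model's real tokenizer.

```python
# Hedged sketch: context-window check via get_num_tokens / get_token_ids.
# DummyModel mimics only the method names and return types documented above.
class DummyModel:
    def get_token_ids(self, text: str) -> list[int]:
        # Placeholder tokenizer: one id per whitespace-separated token.
        return [hash(tok) % 50_000 for tok in text.split()]

    def get_num_tokens(self, text: str) -> int:
        return len(self.get_token_ids(text))


CONTEXT_WINDOW = 8192  # hypothetical limit for illustration
prompt = "How many tokens does this prompt use?"

model = DummyModel()
n = model.get_num_tokens(prompt)
assert n <= CONTEXT_WINDOW  # safe to send to the model
print(n)  # 7 whitespace tokens
```

With a real LangChainChatInterface the same calls apply, but the counts come from the model's own tokenizer rather than whitespace splitting.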
- property lc_secrets: dict[str, str]¶
A map of constructor argument names to secret ids.
- For example,
{"openai_api_key": "OPENAI_API_KEY"}
- model_id: str¶
- moderations: ModerationParameters | None¶
- parameters: TextGenerationParameters | None¶
- parent_id: str | None¶
- prompt_id: str | None¶
- prompt_template_id: str | None¶
- streaming: bool | None¶
- trim_method: str | TrimMethod | None¶
- use_conversation_parameters: bool | None¶
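Among the fields above, streaming toggles chunk-by-chunk output: with streaming=True a LangChain chat model yields partial responses via its stream() method instead of returning one completed message. The generator below is a hypothetical stand-in for llm.stream(...) so the consuming loop is runnable without credentials.

```python
from collections.abc import Iterator

# Hypothetical stand-in for llm.stream(...): yields output piece by piece
# rather than returning a single completed response.
def fake_stream(prompt: str) -> Iterator[str]:
    for word in prompt.split():
        yield word + " "


chunks = []
for chunk in fake_stream("Streaming yields partial output incrementally"):
    chunks.append(chunk)  # a real consumer would render each chunk as it arrives

print("".join(chunks).strip())
```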
- pydantic model genai.extensions.langchain.LangChainEmbeddingsInterface[source]¶
Bases:
BaseModel, Embeddings
Class representing the LangChainEmbeddingsInterface for interacting with the LangChain embeddings API.
Example:
from genai import Client, Credentials
from genai.extensions.langchain import LangChainEmbeddingsInterface
from genai.text.embedding import TextEmbeddingParameters

client = Client(credentials=Credentials.from_env())
embeddings = LangChainEmbeddingsInterface(
    client=client,
    model_id="sentence-transformers/all-minilm-l6-v2",
    parameters=TextEmbeddingParameters(truncate_input_tokens=True),
)

embeddings.embed_query("Hello world!")
embeddings.embed_documents(["First document", "Second document"])
- Config:
extra: str = forbid
protected_namespaces: tuple = ()
arbitrary_types_allowed: bool = True
- field execution_options: ModelLike[CreateExecutionOptions] | None = None¶
- field model_id: str [Required]¶
- field parameters: ModelLike[TextEmbeddingParameters] | None = None¶
- async aembed_documents(texts)[source]¶
Asynchronously embed search documents.
- Parameters:
texts (List[str])
- Return type:
list[list[float]]
- async aembed_query(text)[source]¶
Asynchronously embed query text.
- Parameters:
text (str)
- Return type:
List[float]
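The async variants above are awaited like any coroutine. DummyEmbeddings below is a hypothetical stand-in for LangChainEmbeddingsInterface (which needs a Client and credentials); only the method names and return shapes mirror the documented API, and the one-element vectors are placeholders.

```python
import asyncio


class DummyEmbeddings:
    """Hypothetical stand-in: returns a 1-dimensional placeholder vector per text."""

    async def aembed_query(self, text: str) -> list[float]:
        return [float(len(text))]

    async def aembed_documents(self, texts: list[str]) -> list[list[float]]:
        # Embed all documents concurrently, preserving input order.
        return await asyncio.gather(*(self.aembed_query(t) for t in texts))


async def main() -> tuple[list[float], list[list[float]]]:
    emb = DummyEmbeddings()
    query_vec = await emb.aembed_query("Hello world!")
    doc_vecs = await emb.aembed_documents(["First document", "Second document"])
    return query_vec, doc_vecs


query_vec, doc_vecs = asyncio.run(main())
print(len(query_vec), len(doc_vecs))  # 1 2
```

With the real interface, await emb.aembed_documents(texts) batches the embedding requests the same way; only the vectors come from the embedding model instead of a placeholder.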
- class genai.extensions.langchain.LangChainInterface[source]¶
Bases:
LLM
Class representing the LangChainInterface for interacting with the LangChain LLM API.
Example:
from genai import Client, Credentials
from genai.extensions.langchain import LangChainInterface
from genai.schema import TextGenerationParameters

client = Client(credentials=Credentials.from_env())
llm = LangChainInterface(
    client=client,
    model_id="meta-llama/llama-3-1-70b-instruct",
    parameters=TextGenerationParameters(max_new_tokens=50),
)

response = llm.generate(prompts=["Hello world!"])
print(response)
- data: PromptTemplateData | None¶
- execution_options: CreateExecutionOptions | None¶
- get_num_tokens(text)[source]¶
Get the number of tokens present in the text.
Useful for checking if an input will fit in a model’s context window.
- Parameters:
text (str) – The string input to tokenize.
- Returns:
The integer number of tokens in the text.
- Return type:
int
- get_num_tokens_from_messages(messages)[source]¶
Get the number of tokens in the messages.
Useful for checking if an input will fit in a model’s context window.
- Parameters:
messages (list[BaseMessage]) – The message inputs to tokenize.
- Returns:
The sum of the number of tokens across the messages.
- Return type:
int
- get_token_ids(text)[source]¶
Return the ordered ids of the tokens in a text.
- Parameters:
text (str) – The string input to tokenize.
- Returns:
A list of ids corresponding to the tokens in the text, in the order they occur in the text.
- Return type:
list[int]
- property lc_secrets: dict[str, str]¶
A map of constructor argument names to secret ids.
- For example,
{"openai_api_key": "OPENAI_API_KEY"}
- model_id: str¶
- moderations: ModerationParameters | None¶
- parameters: TextGenerationParameters | None¶
- prompt_id: str | None¶
- streaming: bool | None¶
- genai.extensions.langchain.from_langchain_template(template)[source]¶
Convert LangChain template variables to mustache template variables.
- Parameters:
template (str)
- Return type:
str
- genai.extensions.langchain.to_langchain_template(template)[source]¶
Convert mustache template variables to LangChain template variables.
- Parameters:
template (str)
- Return type:
str
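The two converters above translate between LangChain's {variable} placeholder syntax and mustache's {{variable}} syntax. The helpers below illustrate that mapping locally; they are hypothetical stand-ins for from_langchain_template and to_langchain_template, not the library implementation (which may also handle escaping and edge cases).

```python
import re

# Hedged sketch of the placeholder mapping only (hypothetical helpers).
def langchain_to_mustache(template: str) -> str:
    # {name} -> {{name}}
    return re.sub(r"\{(\w+)\}", r"{{\1}}", template)


def mustache_to_langchain(template: str) -> str:
    # {{name}} -> {name}
    return re.sub(r"\{\{(\w+)\}\}", r"{\1}", template)


lc_template = "Answer the {question} briefly."
mustache_template = langchain_to_mustache(lc_template)
print(mustache_template)  # Answer the {{question}} briefly.

# The two conversions are inverses for simple variables.
assert mustache_to_langchain(mustache_template) == lc_template
```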
Submodules¶
- genai.extensions.langchain.chat_llm module
LangChainChatInterface: cache, callback_manager, callbacks, client, conversation_id, custom_get_token_ids, get_num_tokens(), get_num_tokens_from_messages(), get_token_ids(), is_lc_serializable(), lc_secrets, load_from_file(), metadata, model_id, moderations, parameters, parent_id, prompt_id, prompt_template_id, streaming, tags, trim_method, use_conversation_parameters, validate_data_models(), verbose
- genai.extensions.langchain.llm module
LangChainInterface: cache, callback_manager, callbacks, client, custom_get_token_ids, data, execution_options, get_num_tokens(), get_num_tokens_from_messages(), get_token_ids(), is_lc_serializable(), lc_secrets, load_from_file(), metadata, model_id, moderations, parameters, prompt_id, streaming, tags, verbose
- genai.extensions.langchain.template module
- genai.extensions.langchain.utils module