Models
The Model module is an extension of the ModelInference module with LangChain support, which enables you to get a WatsonxLLM wrapper for watsonx foundation models.
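A minimal sketch of obtaining the WatsonxLLM wrapper through the to_langchain() method listed below; the endpoint URL, API key, project ID, and model ID are placeholders, and the langchain-ibm package is assumed to be installed.

```python
# Sketch: wrap a watsonx foundation model for use with LangChain.
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

# Placeholder credentials and identifiers -- replace with your own values.
credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",
    api_key="***",
)

model = ModelInference(
    model_id="ibm/granite-13b-instruct-v2",
    credentials=credentials,
    project_id="***",
)

# to_langchain() returns a WatsonxLLM wrapper that can be used
# anywhere a LangChain LLM is expected (chains, agents, etc.).
watsonx_llm = model.to_langchain()
print(watsonx_llm.invoke("What is a foundation model?"))
```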
Modules
- ModelInference (see the usage sketch after this list)
  - ModelInference
  - ModelInference.aclose_persistent_connection()
  - ModelInference.agenerate()
  - ModelInference.chat()
  - ModelInference.chat_stream()
  - ModelInference.close_persistent_connection()
  - ModelInference.generate()
  - ModelInference.generate_text()
  - ModelInference.generate_text_stream()
  - ModelInference.get_details()
  - ModelInference.get_identifying_params()
  - ModelInference.set_api_client()
  - ModelInference.to_langchain()
  - ModelInference.tokenize()
- Enums
- Model
- ModelInference for Deployments
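For reference, a hedged sketch of a few of the ModelInference methods listed above; the credentials, project ID, and model ID are placeholders, and the chat response shape is assumed to follow the chat-completions format.

```python
# Sketch: basic text generation, chat, and tokenization with ModelInference.
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

model = ModelInference(
    model_id="ibm/granite-13b-instruct-v2",  # placeholder model ID
    credentials=Credentials(url="https://us-south.ml.cloud.ibm.com", api_key="***"),
    project_id="***",
)

# Plain text generation from a prompt string.
print(model.generate_text(prompt="Explain watsonx.ai in one sentence."))

# Chat-style generation with a list of messages.
response = model.chat(messages=[{"role": "user", "content": "Hello!"}])
print(response["choices"][0]["message"]["content"])

# Tokenize a prompt and return the individual tokens.
print(model.tokenize(prompt="Hello world", return_tokens=True))
```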