ModelInference
==============

.. _model-inference-class:

.. autoclass:: ibm_watsonx_ai.foundation_models.inference.ModelInference
   :members:
   :exclude-members:
   :undoc-members:
   :show-inheritance:

Enums
-----

.. class:: TextModels

   Bases: ``StrEnum``

   This represents a dynamically generated Enum for Foundation Models.

   **Example of getting TextModels:**

   .. code-block:: python

       # GET TextModels ENUM
       client.foundation_models.TextModels

       # PRINT dict of Enums
       client.foundation_models.TextModels.show()

   **Example Output:**

   .. code-block::

       {'GRANITE_13B_CHAT_V2': 'ibm/granite-13b-chat-v2',
       'GRANITE_13B_INSTRUCT_V2': 'ibm/granite-13b-instruct-v2',
       ...
       'LLAMA_2_13B_CHAT': 'meta-llama/llama-2-13b-chat',
       'LLAMA_2_70B_CHAT': 'meta-llama/llama-2-70b-chat',
       'LLAMA_3_70B_INSTRUCT': 'meta-llama/llama-3-70b-instruct',
       'MIXTRAL_8X7B_INSTRUCT_V01': 'mistralai/mixtral-8x7b-instruct-v01'}

   **Example of initialising ModelInference with TextModels Enum:**

   .. code-block:: python

       from ibm_watsonx_ai.foundation_models import ModelInference

       model = ModelInference(
           model_id=client.foundation_models.TextModels.GRANITE_13B_INSTRUCT_V2,
           credentials=Credentials(...),
           project_id=project_id,
       )
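For readers curious what a "dynamically generated Enum" means in practice, the sketch below is a minimal, standalone illustration (not the library's actual implementation) of how such a string-valued Enum can be built at runtime with Python's ``enum`` functional API. The ``_model_specs`` mapping is an assumption drawn from the example output above; the real ``TextModels`` is populated by the client from the service's model catalog.

.. code-block:: python

    from enum import Enum

    # Hypothetical mapping of member names to model IDs, taken from the
    # example output above; the real client builds this dynamically from
    # the catalog of models available on the service.
    _model_specs = {
        "GRANITE_13B_CHAT_V2": "ibm/granite-13b-chat-v2",
        "GRANITE_13B_INSTRUCT_V2": "ibm/granite-13b-instruct-v2",
        "LLAMA_3_70B_INSTRUCT": "meta-llama/llama-3-70b-instruct",
    }

    # Enum's functional API creates the class at runtime; mixing in `str`
    # makes each member usable anywhere a plain string is expected, which
    # is why a member can be passed directly as `model_id`.
    TextModels = Enum("TextModels", _model_specs, type=str)

    # A member's .value is the model_id string the API expects.
    print(TextModels.GRANITE_13B_INSTRUCT_V2.value)

    # A show()-like dict view can be derived from the members.
    print({m.name: m.value for m in TextModels})

Because each member is also a ``str``, code that expects a plain ``model_id`` string accepts the Enum member unchanged, which is what makes the ``ModelInference(model_id=...)`` call in the example above work.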