Prompt Template Manager#

class ibm_watson_machine_learning.foundation_models.prompts.PromptTemplateManager(credentials=None, *, project_id=None, space_id=None, verify=None, api_client=None)[source]#

Bases: WMLResource

Instantiate the prompt template manager.

Parameters:
  • credentials (dict) – Credentials for the watsonx.ai instance.

  • project_id (str) – ID of the project

  • space_id (str) – ID of the space

  • verify (bool or str, optional) – you can pass one of the following as verify: the path to a CA_BUNDLE file, the path of a directory with certificates of trusted CAs, True (the default path to the truststore will be used), or False (no verification will be made)

Note

One of these parameters is required: [‘project_id’, ‘space_id’]

Example

from ibm_watson_machine_learning.foundation_models.prompts import PromptTemplate, PromptTemplateManager
from ibm_watson_machine_learning.foundation_models.utils.enums import ModelTypes

prompt_mgr = PromptTemplateManager(
                credentials={
                    "apikey": "***",
                    "url": "https://us-south.ml.cloud.ibm.com"
                },
                project_id="*****"
                )

prompt_template = PromptTemplate(name="My prompt",
                                 model_id=ModelTypes.GRANITE_13B_CHAT_V2,
                                 input_prefix="Human:",
                                 output_prefix="Assistant:",
                                 input_text="What is {object} and how does it work?",
                                 input_variables=['object'],
                                 examples=[['What is the Stock Market?',
                                            'A stock market is a place where investors buy and sell shares of publicly traded companies.']])

stored_prompt_template = prompt_mgr.store_prompt(prompt_template)
print(stored_prompt_template.prompt_id)   # id of prompt template asset
delete_prompt(prompt_id, *, force=False)[source]#

Remove prompt template from project or space.

Parameters:
  • prompt_id (str) – Id of the prompt template that will be deleted.

  • force (bool) – If True, the prompt template is unlocked and then deleted, defaults to False.

Returns:

Status ‘SUCCESS’ if the prompt template is successfully deleted.

Return type:

str

Example

prompt_mgr.delete_prompt(prompt_id)  # delete if asset is unlocked
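prompt_mgr.delete_prompt(prompt_id, force=True)  # unlock the asset first, then delete (see the force parameter)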
get_lock(prompt_id)[source]#

Get the current locked state of a prompt template.

Parameters:

prompt_id (str) – Id of prompt template

Returns:

Information about locked state of prompt template asset.

Return type:

Dict

Example

print(prompt_mgr.get_lock(prompt_id))
list(*, limit=None)[source]#

List all available prompt templates in the DataFrame format.

Parameters:

limit (Optional[int]) – limit number of fetched records, defaults to None.

Returns:

DataFrame of fundamental properties of available prompts.

Return type:

pandas.core.frame.DataFrame

Example

prompt_mgr.list(limit=5)    # list the 5 most recently created prompt template assets

Hint

Additionally, you can sort the available prompt templates by the “LAST MODIFIED” field.

df_prompts = prompt_mgr.list()
df_prompts.sort_values("LAST MODIFIED", ascending=False)
load_prompt(prompt_id, astype=PromptTemplateFormats.PROMPTTEMPLATE, *, prompt_variables=None)[source]#

Retrieve a prompt template asset.

Parameters:
  • prompt_id (str) – Id of the prompt template to be retrieved.

  • astype (PromptTemplateFormats) – Type of return object.

  • prompt_variables (Dict[str, str]) – Dictionary of input variables and the values with which they will be replaced.

Returns:

Prompt template asset.

Return type:

PromptTemplate | str | langchain.prompts.PromptTemplate

Example

loaded_prompt_template = prompt_mgr.load_prompt(prompt_id)
loaded_prompt_template_lc = prompt_mgr.load_prompt(prompt_id, PromptTemplateFormats.LANGCHAIN)
loaded_prompt_template_string = prompt_mgr.load_prompt(prompt_id, PromptTemplateFormats.STRING)
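# Assumes the stored template defines the 'object' input variable (as in the class-level example above);
# per the prompt_variables parameter, the variable is replaced with the provided value when loading.
filled_prompt_string = prompt_mgr.load_prompt(prompt_id, PromptTemplateFormats.STRING,
                                              prompt_variables={"object": "a stock market"})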
lock(prompt_id, force=False)[source]#

Lock the prompt template if it is unlocked and the user has permission to do so.

Parameters:
  • prompt_id (str) – Id of the prompt template.

  • force (bool) – If True, the method forcefully overwrites an existing lock.

Returns:

Status ‘SUCCESS’ or the response content after an attempt to lock the prompt template.

Return type:

(str | Dict)

Example

prompt_mgr.lock(prompt_id)
store_prompt(prompt_template)[source]#

Store a new prompt template.

Parameters:

prompt_template (PromptTemplate | langchain.prompts.PromptTemplate) – PromptTemplate to be stored.

Returns:

PromptTemplate object initialized with values provided in the server response object.

Return type:

PromptTemplate
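
Example

# Store the PromptTemplate defined in the class-level example above;
# per the parameter type, a langchain.prompts.PromptTemplate could be passed the same way.
stored_prompt_template = prompt_mgr.store_prompt(prompt_template)
print(stored_prompt_template.prompt_id)   # id of the stored prompt template asset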

unlock(prompt_id)[source]#

Unlock the prompt template if it is locked and the user has permission to do so.

Parameters:

prompt_id (str) – Id of the prompt template.

Returns:

Response content after an attempt to unlock the prompt template.

Return type:

Dict

Example

prompt_mgr.unlock(prompt_id)
update_prompt(prompt_id, prompt_template)[source]#

Update prompt template data.

Parameters:
  • prompt_id (str) – Id of the prompt template to be updated.

  • prompt_template (PromptTemplate) – PromptTemplate with new data.

Returns:

metadata of the updated prompt template

Return type:

dict

Example

updated_prompt_template = PromptTemplate(name="New name")
prompt_mgr.update_prompt(prompt_id, updated_prompt_template)  # {'name': 'New name'} in metadata

Enums#

class ibm_watson_machine_learning.foundation_models.utils.enums.PromptTemplateFormats(value)[source]#

Bases: Enum

Supported formats of a loaded prompt template.

LANGCHAIN = 'langchain'#
PROMPTTEMPLATE = 'prompt'#
STRING = 'string'#
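
Example

from ibm_watson_machine_learning.foundation_models.utils.enums import PromptTemplateFormats

# A format value is passed as the astype argument of PromptTemplateManager.load_prompt,
# e.g. to load a stored prompt as a langchain.prompts.PromptTemplate.
loaded_prompt_template_lc = prompt_mgr.load_prompt(prompt_id, astype=PromptTemplateFormats.LANGCHAIN)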