Prompt#

In this Getting Started guide on Prompt and related models, we will quickly cover:

  1. What is a Prompt?

  2. Components of a Prompt

  3. How to build a Prompt

We will not deep-dive into Prompt here. If you are interested in learning more, check out the detailed guide on Prompt in the deep-dive section.

Setup#

Ensure bodhilib is installed.

[1]:
!pip -q install bodhilib

What is a Prompt?#

A ‘Prompt’ serves as the input to the Large Language Model (LLM) service, directing it to generate a response. The LLM service also returns its response as a ‘Prompt’.

Components of a Prompt#

A Prompt consists of three components: text, role, and source.

  • text is the core content of the Prompt. For an input, it is the query asked of the LLM service; for a response, it holds the text returned by the LLM service.

  • role can be system, ai, or user. It identifies the author or persona of the Prompt.

  • source can be input or output. It indicates whether the Prompt is an input to the LLM service or an output from it.

For more details on the components of a Prompt, refer to the deep-dive section.
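
To see the possible values at a glance, here is a minimal sketch that iterates over the two enums. It assumes Role and Source are regular Python enums exported by bodhilib (their printed representations later in this guide suggest they are):

from bodhilib import Role, Source

# Role identifies the author/persona; its values are 'system', 'ai' and 'user'
for role in Role:
    print(role.value)

# Source indicates the direction; its values are 'input' and 'output'
for source in Source:
    print(source.value)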

How to Build a Prompt?#

Building a Prompt is as simple as passing a string to the constructor. By default, the role is set to ‘user’ and source to ‘input’.

[2]:
# import the Prompt class from the `bodhilib` library
from bodhilib import Prompt

prompt = Prompt("What day comes after Monday?")
print(">", prompt)
> text='What day comes after Monday?' role=<Role.USER: 'user'> source=<Source.INPUT: 'input'>
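
The components are available as plain attributes on the Prompt object. A quick sketch (the attribute names follow the field names shown in the output above):

# Access the individual components of the Prompt
print(prompt.text)    # What day comes after Monday?
print(prompt.role)    # the Role.USER member
print(prompt.source)  # the Source.INPUT member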

Prompt is built on Pydantic v1 and inherits its model utility methods, so you can construct a Prompt in any of the ways Pydantic supports. For example:

[3]:
# Using exploded keyword arguments

prompt = Prompt(**{"text": "Using the exploded keyword arguments?", "role": "user", "source": "input"})
print(">", prompt)
> text='Using the exploded keyword arguments?' role=<Role.USER: 'user'> source=<Source.INPUT: 'input'>
[4]:
# Using Python objects for Role and Source
# Let's first import them from the package
from bodhilib import Role, Source

# And use them to construct the Prompt
prompt = Prompt(text="Using the python objects", role=Role.USER, source=Source.INPUT)
print(">", prompt)
> text='Using the python objects' role=<Role.USER: 'user'> source=<Source.INPUT: 'input'>
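
The standard Pydantic v1 utilities such as parse_obj, dict() and json() should work as well. A minimal sketch, assuming Prompt does not customize these methods:

# Construct from a plain dict using Pydantic's parse_obj
prompt = Prompt.parse_obj({"text": "Using Pydantic's parse_obj", "role": "user", "source": "input"})

# Serialize back to a dict or a JSON string with the standard Pydantic methods
print(prompt.dict())
print(prompt.json())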

🎉 We just created our Prompt model.

Next, let’s see how to generate a response from an LLM.