Introduction to Operators

In this notebook, we’ll explore the basics of building prompts and chaining operators. Follow along with the code and instructions to get hands-on experience. Feel free to modify and experiment with the code as you go!

pip install langtree --quiet
Note: you may need to restart the kernel to use updated packages.

(The above is required for compiling the docs)

Import the required modules

In this notebook we will walk through a full example of operator chaining! First, import the required modules.

langtree.operators: This module is where langtree’s operators are accessible.

from langtree.models.openai import OpenAIChatCompletion
from langtree.core import Buffer, Prompt
from langtree.prompting import SystemMessage, UserMessage
from langtree.operators import Parallel, Sequential, chainable

Define the model to use

Here we define the models we will be using. Each client or adapter is a kind of Operator; this will be explained later (and in the API reference). For now, you can think of an Operator as a function that can store fixed keyword arguments, which are then used later.

model1 = "gpt-3.5-turbo"
chat = OpenAIChatCompletion(model=model1)

model2 = "gpt-4"
chatgpt4 = OpenAIChatCompletion(model=model2)

All our operators map 1:1 with their underlying functionality, so in this case any parameters you can pass to the OpenAI chat completions client can also be passed to our Operator (in this case OpenAIChatCompletion).

(What this means is that we don’t necessarily need to have a fixed model type. We can choose to define it… or not!)
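Because anything the client accepts can be stored on the operator, you could also pin sampling parameters at construction time. A minimal sketch, assuming parameters like temperature are forwarded to the client unchanged:

# Hypothetical: store an extra client parameter on the operator alongside the model name
chat_creative = OpenAIChatCompletion(model="gpt-3.5-turbo", temperature=0.9)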

Next we build our prompts

In langtree, this is super simple. Just use the Prompt class we imported earlier and pass it a format string like those you’d use with langchain!

Hint: Any word wrapped in {{ word }} will automatically become a keyword argument.

system_prompt = Prompt("You are a helpful assistant")
user_prompt = Prompt("{{banana}}.{{dev}} is cool")

Then we can render the content of a prompt with the keywords substituted.

sys_content = system_prompt()
user_content = user_prompt(banana="openai", dev="com")
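Rendering gives us the content with the keywords filled in; printing it should show:

print(sys_content)   # You are a helpful assistant
print(user_content)  # openai.com is cool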

In order to use this content with a chat model, we need to convert it into Messages. A Message is essentially a dictionary with a role and a content field. For usability, we’ve added some utility Messages for System, User, Assistant, and Function.

s = SystemMessage(content=sys_content)
u = UserMessage(content=user_content)
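Under the hood, these are just the familiar role/content dictionaries; the exact repr may differ, but the data looks like this:

# s -> {'role': 'system', 'content': 'You are a helpful assistant'}
# u -> {'role': 'user', 'content': 'openai.com is cool'}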

Now we can actually use the prompts to chat! But first, we need to create some memory.

Create the Memory Buffer

In langtree, a Buffer is a simple list of fixed size. If you add more messages, it will automatically remove messages until it is the right size!

Let’s initialize the Buffer

history = Buffer(3)

Now we can add our messages to it

history += s
history += u
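At this point the buffer holds both messages. The underlying list is exposed as history.memory, which is what we will pass to the model shortly:

len(history.memory)   # 2 -- the system and user messages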

Now we are ready to chat with ChatGPT

Create the @chainable function

Previously, we would have called the chat object we created directly. Remember, it exposes the exact same args as the OpenAI completions client.

Instead, we need to create a chainable function. This one is going to take history and an operator, and then return a modified version of history.

@chainable
def call_func(history, op=None):
    # Call the operator (the model) on the current message history
    response = op(messages=history.memory)
    # Append the model's reply to the buffer as a new message
    history += UserMessage(content=response.content)
    return history

The @chainable decorator wraps the original function. To use a chainable function, we must first call it.

When we call a chainable function, it accepts only kwargs and stores them as static kwargs for the function. It then returns a copy of the function WITH the static args.

c1 = call_func(op=chat)

So now c1 has the param op statically set to the chat object we defined earlier.
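In other words, c1 is still just a callable. A sketch, assuming the bound copy can be invoked directly the same way the chain invokes it:

# Hypothetical direct call: equivalent to call_func(history, op=chat)
# updated_history = c1(history)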

Create a sequential chain

To create a sequential chain, we call Sequential and pass a list of operators to it.

Here we are going to initialize it as empty, and then add a chainable function that already has its static args instantiated.

seq = Sequential([])
seq += c1

Add to the sequential chain

To do this we can do one of two things: use the += operator or the Sequential.add method.

seq += call_func(op=chatgpt4)
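The add method mentioned above is equivalent; for example:

# Equivalent to the += above (don't run both, or the operator is added twice)
# seq.add(call_func(op=chatgpt4))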

Create a parallel chain

A parallel chain is almost identical to a sequential chain, except that it passes the same values to all of its operators.

Additionally, a parallel chain creates a deepcopy of all arguments and kwargs. This ensures that each operator is isolated from the others.

To share an object across the functions, pass it as a static kwarg when you initialize the chainable function (see the sketch at the end of this section).

p = Parallel([call_func(op=chat)])
p += call_func(op=chat)
p.add(call_func(op=chatgpt4))
<langtree.operators.base_operators.Parallel at 0x7fc2118aa820>
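Earlier we noted that objects shared across branches should be bound as static kwargs. Here is a minimal sketch of that idea (the call_and_log function and shared_responses list are hypothetical names, and it assumes static kwargs are handed to each operator as-is rather than deep-copied):

shared_responses = []

@chainable
def call_and_log(history, op=None, log=None):
    # log was bound as a static kwarg, so every branch appends to the same list
    response = op(messages=history.memory)
    log.append(response.content)
    history += UserMessage(content=response.content)
    return history

logged_call = call_and_log(op=chat, log=shared_responses)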

Adding operators together

We can also add sequential and parallel operators together. The result depends entirely on order: whichever operator type comes first in the expression determines the resulting type. For example:

seq += p

Would result in adding p as the last operator in seq, and:

p += seq

Would result in adding seq as a parallel chain in p.

Let’s see this in action:

p += seq

p(history)
[[{'role': 'system', 'content': 'You are a helpful assistant'}, {'role': 'user', 'content': 'openai.com is cool'}, {'role': 'user', 'content': 'Yes, OpenAI is a very cool company! They are at the forefront of artificial intelligence research and development. They have developed many impressive models and technologies, including GPT-3, which is a state-of-the-art language processing model. OpenAI is also known for their commitment to ethical AI and promoting responsible use of AI technologies. If you have any specific questions or interests about OpenAI, feel free to ask!'}],
 [{'role': 'system', 'content': 'You are a helpful assistant'}, {'role': 'user', 'content': 'openai.com is cool'}, {'role': 'user', 'content': "Yes, OpenAI.com is a fantastic website! It offers a wealth of knowledge and resources related to artificial intelligence and machine learning, including research papers, blog posts, and software tools. It's a great place to stay updated on the latest advancements in AI and learn more about their projects and initiatives. If you have any specific questions or need assistance with anything related to OpenAI or AI in general, I'm here to help!"}],
 [{'role': 'system', 'content': 'You are a helpful assistant'}, {'role': 'user', 'content': 'openai.com is cool'}, {'role': 'user', 'content': "I'm glad you're enjoying OpenAI! OpenAI.com is known for pioneering in artificial intelligence research. If you have any questions or if there's anything specific about it that you like or want to talk about, feel free to share!"}],
 [{'role': 'user', 'content': 'openai.com is cool'}, {'role': 'user', 'content': "Yes, OpenAI is a very cool platform. It offers a wide range of powerful AI tools and services that allow developers, researchers, and businesses to utilize cutting-edge technology. OpenAI's mission is to ensure that artificial general intelligence (AGI) benefits all of humanity, and they provide various resources, including research papers and models, to help achieve this goal."}, {'role': 'user', 'content': "I'm glad you think so! OpenAI is definitely an innovative organization that's pushing the boundaries of what's possible with artificial intelligence. Their work spans across various areas of AI, including natural language processing, machine learning, robotics, and more. They are deeply committed to ensuring that AGI can be used for the benefit of all."}]]

Recap

In this notebook, we covered importing the modules, defining our endpoints, building prompts, chaining operators, and chatting with ChatGPT.