Chat¶
Looking for example notebooks?
For example notebooks, check out examples/ai/chat on our GitHub.
The chat UI element provides an interactive chatbot interface for conversations. It can be customized with different models, including built-in AI models from popular providers or custom functions.
marimo.ui.chat
¶
chat(
    model: Callable[
        [List[ChatMessage], ChatModelConfig], object
    ],
    *,
    prompts: Optional[List[str]] = None,
    on_message: Optional[
        Callable[[List[ChatMessage]], None]
    ] = None,
    show_configuration_controls: bool = False,
    config: Optional[ChatModelConfigDict] = None,
    allow_attachments: Union[bool, List[str]] = False,
    max_height: Optional[int] = None
)
Bases: UIElement[Dict[str, Any], List[ChatMessage]]
A chatbot UI element for interactive conversations.
Define a chatbot by implementing a function that takes a list of ChatMessages and optionally a config object as input, and returns the chat response. The response can be any object, including text, plots, or marimo UI elements.
Examples:
Using a custom model:
def my_rag_model(messages, config):
    # Each message has a `content` attribute, as well as a `role`
    # attribute ("user", "system", "assistant").
    question = messages[-1].content
    docs = find_docs(question)
    prompt = template(question, docs, messages)
    response = query(prompt)
    if is_dataset(response):
        return dataset_to_chart(response)
    return response


chat = mo.ui.chat(my_rag_model)
Async functions and async generators are also supported:
The last value yielded by the async generator is treated as the model response. ui.chat does not yet support streaming responses to the frontend. Please file a GitHub issue if this is important to you: https://github.com/marimo-team/marimo/issues
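A minimal sketch of an async-generator model (stream_chunks is a hypothetical helper standing in for your own async client call):

async def my_async_model(messages, config):
    # Accumulate chunks from a hypothetical async client call;
    # the last value yielded is used as the chat response.
    response = ""
    async for chunk in stream_chunks(messages[-1].content):
        response += chunk
        yield response


chat = mo.ui.chat(my_async_model)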
Examples of using a built-in model and of including attachments are shown in the sections below.
ATTRIBUTE | TYPE | DESCRIPTION
---|---|---
value | List[ChatMessage] | The current chat history, a list of ChatMessage objects.
PARAMETER | TYPE | DESCRIPTION
---|---|---
model | Callable[[List[ChatMessage], ChatModelConfig], object] | A callable that takes in the chat history and returns a response.
prompts | Optional[List[str]] | Optional list of initial prompts to present to the user. Defaults to None.
on_message | Optional[Callable[[List[ChatMessage]], None]] | Optional callback function to handle new messages. Defaults to None.
show_configuration_controls | bool | Whether to show the configuration controls. Defaults to False.
config | Optional[ChatModelConfigDict] | Optional configuration to override the default configuration. Keys include: max_tokens, temperature, top_p, top_k, frequency_penalty, presence_penalty. Defaults to None.
allow_attachments | Union[bool, List[str]] | Allow attachments. True for any attachment type, or pass a list of MIME types. Defaults to False.
max_height | Optional[int] | Optional maximum height for the chat element. Defaults to None.
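For instance, a sketch of wiring up the on_message callback (my_model and log_messages are placeholder names; the callback receives a list of ChatMessages):

def log_messages(messages):
    # the most recent message is last in the list
    print(messages[-1].content)


chat = mo.ui.chat(my_model, on_message=log_messages)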
batch
¶
batch(**elements: UIElement[JSONType, object]) -> batch
Convert an HTML object with templated text into a UI element.
This method lets you create custom UI elements that are represented by arbitrary HTML.
Example
user_info = mo.md(
    '''
    - What's your name?: {name}
    - When were you born?: {birthday}
    '''
).batch(name=mo.ui.text(), birthday=mo.ui.date())
In this example, user_info is a UI element whose output is markdown and whose value is a dict with keys 'name' and 'birthday' (and values equal to the values of their corresponding elements).
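Once the user fills in both fields, the batched value can be read as a dict keyed by the element names (the values shown in the comment are illustrative):

user_info.value  # a dict: {"name": <text value>, "birthday": <date value>}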
PARAMETER | TYPE | DESCRIPTION
---|---|---
elements | UIElement[JSONType, object] | The UI elements to interpolate into the HTML template.
callout
¶
callout(
    kind: Literal[
        "neutral", "danger", "warn", "success", "info"
    ] = "neutral"
) -> Html
Create a callout containing this HTML element.
A callout wraps your HTML element in a raised box, emphasizing its importance. You can style the callout for different situations with the kind argument.
Examples:
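A minimal sketch (the message text is illustrative):

mo.md("**Careful:** this operation cannot be undone.").callout(kind="warn")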
form
¶
form(
    label: str = "",
    *,
    bordered: bool = True,
    loading: bool = False,
    submit_button_label: str = "Submit",
    submit_button_tooltip: Optional[str] = None,
    submit_button_disabled: bool = False,
    clear_on_submit: bool = False,
    show_clear_button: bool = False,
    clear_button_label: str = "Clear",
    clear_button_tooltip: Optional[str] = None,
    validate: Optional[
        Callable[[Optional[JSONType]], Optional[str]]
    ] = None,
    on_change: Optional[
        Callable[[Optional[T]], None]
    ] = None
) -> form[S, T]
Create a submittable form out of this UIElement.
Creates a form that gates submission of a UIElement's value until a submit button is clicked. The form's value is the value of the underlying element from the last submission.
Examples:
Convert any UIElement into a form:
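For example (a minimal sketch; any UI element can be converted this way):

prompt = mo.ui.text_area().form()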
Combine with HTML.batch to create a form made out of multiple UIElements:
form = (
    mo.md(
        '''
    **Enter your prompt.**
    {prompt}

    **Choose a random seed.**
    {seed}
    '''
    )
    .batch(
        prompt=mo.ui.text_area(),
        seed=mo.ui.number(),
    )
    .form()
)
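After the user clicks Submit, form.value holds the submitted dict of element values (it is typically None before the first submission); the values in the comment are illustrative:

form.value  # e.g. {"prompt": "...", "seed": 42} after submission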
PARAMETER | TYPE | DESCRIPTION
---|---|---
label | str | A text label for the form.
bordered | bool | Whether the form should have a border.
loading | bool | Whether the form should be in a loading state.
submit_button_label | str | The label of the submit button.
submit_button_tooltip | Optional[str] | The tooltip of the submit button.
submit_button_disabled | bool | Whether the submit button should be disabled.
clear_on_submit | bool | Whether the form should clear its contents after submitting.
show_clear_button | bool | Whether the form should show a clear button.
clear_button_label | str | The label of the clear button.
clear_button_tooltip | Optional[str] | The tooltip of the clear button.
validate | Optional[Callable[[Optional[JSONType]], Optional[str]]] | A function that takes the form's value and returns an error message if invalid, or None if the value is valid.
on_change | Optional[Callable[[Optional[T]], None]] | Optional callback to run when the form's value changes. Defaults to None.
send_message
¶
Send a message to the element rendered on the frontend from the backend.
style
¶
style(
    style: Optional[dict[str, Any]] = None, **kwargs: Any
) -> Html
Wrap an object in a styled container.
Example
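A minimal sketch (the CSS properties shown are illustrative):

mo.md("Key result").style({"background-color": "lightyellow", "padding": "0.5rem"})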
PARAMETER | TYPE | DESCRIPTION
---|---|---
style | Optional[dict[str, Any]] | An optional dict of CSS styles, keyed by property name.
**kwargs | Any | CSS styles as keyword arguments.
Basic Usage¶
Here's a simple example using a custom echo model:
import marimo as mo


def echo_model(messages, config):
    return f"Echo: {messages[-1].content}"


chat = mo.ui.chat(echo_model, prompts=["Hello", "How are you?"])
chat
Here, messages is a list of ChatMessage objects, each of which has role ("user", "assistant", or "system") and content (the message string) attributes; config is a ChatModelConfig object with various configuration parameters, which you are free to ignore.
Using a Built-in AI Model¶
You can use marimo's built-in AI models, such as OpenAI's GPT:
import marimo as mo

chat = mo.ui.chat(
    mo.ai.llm.openai(
        "gpt-4",
        system_message="You are a helpful assistant.",
    ),
    show_configuration_controls=True
)
chat
Accessing Chat History¶
You can access the chat history using the value attribute:
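chat.value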
This returns a list of ChatMessage objects, each containing role, content, and optional attachments attributes.
marimo.ai.ChatMessage
dataclass
¶
ChatMessage(
    role: Literal["user", "assistant", "system"],
    content: object,
    attachments: Optional[List[ChatAttachment]] = None,
)
A message in a chat.
Custom Model with Additional Context¶
Here's an example of a custom model that uses additional context:
import marimo as mo


def rag_model(messages, config):
    question = messages[-1].content
    docs = find_relevant_docs(question)
    context = "\n".join(docs)
    prompt = f"Context: {context}\n\nQuestion: {question}\n\nAnswer:"
    response = query_llm(prompt, config)
    return response


mo.ui.chat(rag_model)
This example demonstrates how you can implement a Retrieval-Augmented Generation (RAG) model within the chat interface.
Templated Prompts¶
You can pass sample prompts to mo.ui.chat to allow users to select from a list of predefined prompts. By including a {{var}} in the prompt, you can dynamically insert values into the prompt; a form will be generated to allow users to fill in the variables.
mo.ui.chat(
    mo.ai.llm.openai("gpt-4o"),
    prompts=[
        "What is the capital of France?",
        "What is the capital of Germany?",
        "What is the capital of {{country}}?",
    ],
)
Including Attachments¶
You can allow users to upload attachments to their messages by passing an allow_attachments parameter to mo.ui.chat.
mo.ui.chat(
    rag_model,
    allow_attachments=["image/png", "image/jpeg"],
    # or True for any attachment type
    # allow_attachments=True,
)
Built-in Models¶
marimo provides several built-in AI models that you can use with the chat UI element.
OpenAI¶
import marimo as mo

mo.ui.chat(
    mo.ai.llm.openai(
        "gpt-4o",
        system_message="You are a helpful assistant.",
        api_key="sk-proj-...",
    ),
    show_configuration_controls=True
)
marimo.ai.llm.openai
¶
openai(
    model: str,
    *,
    system_message: str = DEFAULT_SYSTEM_MESSAGE,
    api_key: Optional[str] = None,
    base_url: Optional[str] = None
)
Bases: ChatModel
OpenAI ChatModel
Args:
- model: The model to use. Can be found on the OpenAI models page
- system_message: The system message to use
- api_key: The API key to use. If not provided, the API key will be retrieved from the OPENAI_API_KEY environment variable or the user's config.
- base_url: The base URL to use
Anthropic¶
import marimo as mo

mo.ui.chat(
    mo.ai.llm.anthropic(
        "claude-3-5-sonnet-20240620",
        system_message="You are a helpful assistant.",
        api_key="sk-ant-...",
    ),
    show_configuration_controls=True
)
marimo.ai.llm.anthropic
¶
anthropic(
    model: str,
    *,
    system_message: str = DEFAULT_SYSTEM_MESSAGE,
    api_key: Optional[str] = None,
    base_url: Optional[str] = None
)
Bases: ChatModel
Anthropic ChatModel
Args:
- model: The model to use. Can be found on the Anthropic models page
- system_message: The system message to use
- api_key: The API key to use. If not provided, the API key will be retrieved from the ANTHROPIC_API_KEY environment variable or the user's config.
- base_url: The base URL to use
Google AI¶
import marimo as mo

mo.ui.chat(
    mo.ai.llm.google(
        "gemini-1.5-pro-latest",
        system_message="You are a helpful assistant.",
        api_key="AI..",
    ),
    show_configuration_controls=True
)
marimo.ai.llm.google
¶
google(
    model: str,
    *,
    system_message: str = DEFAULT_SYSTEM_MESSAGE,
    api_key: Optional[str] = None
)
Bases: ChatModel
Google AI ChatModel
Args:
- model: The model to use. Can be found on the Gemini models page
- system_message: The system message to use
- api_key: The API key to use. If not provided, the API key will be retrieved from the GOOGLE_AI_API_KEY environment variable or the user's config.
Groq¶
import marimo as mo

mo.ui.chat(
    mo.ai.llm.groq(
        "llama-3.1-70b-versatile",
        system_message="You are a helpful assistant.",
        api_key="gsk-...",
    ),
    show_configuration_controls=True
)
marimo.ai.llm.groq
¶
groq(
    model: str,
    *,
    system_message: str = DEFAULT_SYSTEM_MESSAGE,
    api_key: Optional[str] = None,
    base_url: Optional[str] = None
)
Bases: ChatModel
Groq ChatModel
Args:
- model: The model to use. Can be found on the Groq models page
- system_message: The system message to use
- api_key: The API key to use. If not provided, the API key will be retrieved from the GROQ_API_KEY environment variable or the user's config.
- base_url: The base URL to use
Types¶
Chatbots can be implemented with a function that receives a list of ChatMessage objects and a ChatModelConfig.
marimo.ai.ChatMessage
dataclass
¶
ChatMessage(
    role: Literal["user", "assistant", "system"],
    content: object,
    attachments: Optional[List[ChatAttachment]] = None,
)
A message in a chat.
marimo.ai.ChatModelConfig
dataclass
¶
ChatModelConfig(
    max_tokens: Optional[int] = None,
    temperature: Optional[float] = None,
    top_p: Optional[float] = None,
    top_k: Optional[int] = None,
    frequency_penalty: Optional[float] = None,
    presence_penalty: Optional[float] = None,
)
mo.ui.chat can be instantiated with an initial configuration by passing a dictionary conforming to this config.
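For example (my_model is a placeholder; any of the keys above may be set, and the values shown are illustrative):

mo.ui.chat(
    my_model,
    config={"temperature": 0.2, "max_tokens": 512},
)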
ChatMessages can also include attachments.
marimo.ai.ChatAttachment
dataclass
¶
Supported Model Providers¶
We support any OpenAI-compatible endpoint. If you want a specific provider added explicitly (one that doesn't abide by the standard OpenAI API format), you can file a feature request.
Normally, overriding the base_url parameter should work; for example:
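A sketch of pointing the OpenAI client at an OpenAI-compatible provider (the model name and base URL below are illustrative; check your provider's documentation for the correct values):

mo.ui.chat(
    mo.ai.llm.openai(
        "llama-3.1-70b-versatile",
        api_key="gsk-...",
        base_url="https://api.groq.com/openai/v1",
    )
)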
Note
We have added examples for Groq and Cerebras. These providers offer free API keys and are great for trying out Llama models (from Meta). You can sign up on their platforms and easily use them with marimo's AI integrations. For more information, refer to the AI completion documentation in marimo.