SetAI Card: Lets you use AI to manipulate attribute data to your liking.

Created: Feb 5, 2024 8:51 PM
Tags: SetAI card

Overview

The SetAI Card allows you to dynamically set attributes in your chatbot based on a prompt that you provide. These attributes are set in real time, at runtime, during each session with your chatbot.

The SetAI Card is a new feature that leverages Large Language Models (LLMs) and should be used WITH CAUTION in production use cases for business-critical applications, because of its potential to generate misleading or false information. As a safeguard, this card does NOT include customer data; it makes direct API calls to an LLM.

***

Adding a SetAI Card to your chatbot

You can find the SetAI Card in your chatbot cards menu, under the AI section. The step can be added anywhere in your chatbot. Once you have placed the card, you can configure it in the authoring view.


To configure your attribute, provide a description in the Prompt field of the function you'd like the AI to perform for you. You can reference attributes from your chatbot within this prompt to make it dynamic.
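For example, a prompt along these lines could extract a first name at runtime (the full_name attribute and the {{...}} placeholder syntax are hypothetical, shown only to illustrate the idea):

```
Extract the first name from the following value and return only the name,
with nothing else: {{full_name}}
```

The card's output can then be stored in an attribute (for example, a hypothetical first_name) and reused later in the flow.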


***

Configuring your Prompt (Coming soon…)

There will be three ways to configure the prompt you've provided in order to shape the output (see the sketch after this list for how these settings map onto a typical LLM API call):

  • Temperature setting - This feature enables you to control the degree of variation in your responses relative to the given prompt. A higher temperature setting leads to greater variability and creativity in responses, while a lower temperature setting yields responses that are more closely aligned with the prompt, offering precise and direct answers.
  • Max Tokens setting - This parameter determines the maximum number of tokens to be used in generating a response to your prompt. The upper limit for tokens in a single response is 512, which includes the length of your prompt and settings. Setting a higher max tokens value can lead to more extensive responses but may also increase the response time.
  • System Instructions setting - These are directives you can issue to the LLM model to guide its behavior. Assigning a 'role' to the model assists it in delivering more contextually appropriate answers. In this section, you can specify various aspects such as the length, structure, personality, tone, and language of the response. It's important to note that system instructions are integrated with the question or prompt, so ensure there is no contradiction between them.
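
These settings are not yet exposed in the card, but they correspond to standard LLM API parameters. The following is a minimal, illustrative sketch only, assuming an OpenAI-style chat completions client; the model name, client library, and example values are assumptions and do not describe the card's actual backend.

```python
# Illustrative sketch only: shows how temperature, max tokens, and system
# instructions typically behave in a direct LLM API call. The SetAI Card is
# a no-code feature; this is not its implementation.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",   # assumed model, for illustration only
    temperature=0.2,       # lower = more precise, prompt-aligned output
    max_tokens=512,        # cap on the length of the generated response
    messages=[
        # System instructions: assign a role, tone, length, and output format.
        {
            "role": "system",
            "content": "You are a data-cleaning assistant. Reply with the value only, no explanations.",
        },
        # The prompt, with a chatbot attribute already substituted in.
        {
            "role": "user",
            "content": "Extract the first name from: Jane A. Doe",
        },
    ],
)

print(response.choices[0].message.content)  # e.g. "Jane"
```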

The SetAI Card is still a new feature, and for this reason we recommend proceeding with caution when using it for serious production use cases.

Practical use cases

  • Cleaning up data (e.g., extracting a first name)
  • Formatting data (dates, timestamps, timezones, etc.)
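
As a rough illustration, prompts along these lines could cover the use cases above (the attribute names and the {{...}} placeholder syntax are hypothetical):

```
Extract the first name from: {{full_name}}

Convert {{appointment_date}} to the format YYYY-MM-DD in the {{user_timezone}}
timezone, and return only the converted date.
```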