This article explains AI Discuss, an interactive chat tile that lets respondents ask questions about their assessment results in natural language and get conversational answers grounded in their own scores and feedback. It covers setup in both the Assessment Builder and the Results Dashboard, plus prompt-writing best practices for the Individual and Cohort scopes.
AI Discuss turns a static results dashboard into an interactive conversation. Instead of only reading their feedback, respondents can ask the AI follow-up questions about their results, explore specific sections in more depth, and receive personalized, conversational answers grounded in their own scores and rating text.
AI Discuss is available in two scopes: Individual (chat operates on a single respondent's data) and Cohort (chat operates on aggregated cohort-level data, such as a 360, group, team, or iterative cohort report). Each scope is configured separately in the Assessment Builder and then surfaced through a tile on the matching Results Dashboard.
Important: AI Discuss is only available on Results Dashboards. It does not render in Word or PDF feedback reports. If you need static AI feedback inside a downloadable report, use AI Interpretations instead.
1. Overview
AI Discuss is an interactive chat tile that appears on a Results Dashboard. Respondents type questions into an Ask anything input field and receive AI-generated replies grounded in the data the assessment owner feeds into the prompt. It complements, rather than replaces, the static feedback already on the dashboard.
Setup happens in two places:
- Assessment Builder. The AI Discuss item is created here, inside an AI Individual or AI Cohort section; its AI prompt defines the AI's behavior, tone, and boundaries.
- Results Dashboard. A tile is added that surfaces the AI Discuss chat to the respondent.
AI Discuss differs from AI Interpretations in one key way: an AI Interpretation generates a one-time written analysis that is rendered into the report or dashboard at the moment of generation, while AI Discuss is a back-and-forth conversation initiated by the respondent at view time. Both can coexist on the same dashboard.
Note:
- Individual and Cohort AI Discuss configurations are separate. You cannot share a single AI Discuss prompt between an individual dashboard and a cohort dashboard.
- AI Discuss does not render in Word or PDF reports. If a customer needs the AI content in a downloadable report, use AI Interpretations instead.
2. Adding AI Discuss in the Assessment Builder
The Assessment Builder includes an AI Discuss option for both scopes. The steps below cover adding an Individual and a Cohort AI Discuss in turn.
2a. Adding an Individual AI Discuss
Use this scope when the chat should answer questions about a single respondent's own results.
- Open the Assessment Builder for the assessment you want to add AI Discuss to.
- Add a new Section and set its Section Type to Individual Report (this section is hidden from respondents).
- Click Add AI Discuss. The AI Discuss configuration window will open.
- Fill in the configuration fields (see the field reference below) and click Save.
2b. Adding a Cohort AI Discuss
Use this scope when the chat should answer questions about an aggregated cohort, such as a team, 360, or iterative report.
- Open the Assessment Builder for the assessment.
- Add a new Section and set its Section Type to Cohort Report.
- Click Add AI Discuss.
- Fill in the configuration fields and click Save.
AI Discuss Configuration Fields
The AI Discuss configuration window is similar to the AI Interpretation window. It contains the following fields:
| Field | Description |
| --- | --- |
| 1. Name | A label for the AI Discuss item, for example "AI Discuss Individual" or "AI Discuss Cohort". Used in dropdowns when adding the dashboard tile. |
| 2. AI Prompt | The instructions that govern the AI's behavior, tone, and boundaries. Supports merge strings via the Add Merge String button, using the same merge string library as AI Interpretation prompts. |
| 3. Order | Controls the position of this item within the AI Individual or AI Cohort section. |
| 4. Test against | Lets you run the prompt against an existing response (or cohort) to preview the AI's behavior before publishing. A Clear Conversation option resets the test conversation. |
| 5. Save / Delete / Cancel | Standard controls for committing, removing, or abandoning changes. |
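For example, a minimal Individual configuration might use values like these (illustrative only; the prompt text itself is covered in section 3 below):

Name: AI Discuss Individual
AI Prompt: the Individual starter prompt from section 3c
Order: 1
Test against: any completed response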
3. Writing an Effective AI Discuss Prompt
The prompt is the most important part of AI Discuss setup. It defines how the AI responds, what it talks about, and (crucially) what it refuses to talk about. Many of the principles in Writing Effective AI Interpretation Prompts apply here as well, but two principles are especially important for AI Discuss because the conversation is interactive and respondent-driven: confidentiality and brevity.
3a. Confidentiality
AI Discuss already includes built-in safeguards at the data layer. An Individual AI Discuss only ever receives the current respondent's own data, and a Cohort AI Discuss only ever receives aggregated cohort-level data, never individual responses within the cohort. The AI cannot reach data it has not been given.
As an extra layer of protection, we strongly recommend reinforcing those boundaries directly in your prompt. This guards against edge cases where a respondent tries to steer the conversation toward sensitive territory (for example, "What did Bob score?" or "Who should I let go?") and ensures the AI declines even when pushed.
Note:
- For Individual prompts: instruct the AI to discuss only the current respondent's results, and to refuse questions about other respondents even if the user asks for them by name.
- For Cohort prompts: instruct the AI to treat results as aggregated and confidential, never to identify individuals, and never to make decisions about specific people (for example, who to promote or who to terminate). The AI should redirect such questions back to cohort-level patterns and themes.
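With those instructions in place, a probing question should produce a refusal rather than a guess. Here is an illustrative Individual-scope exchange (the wording is invented for illustration; actual replies depend on your prompt and the model):

Respondent: What did Bob score?
AI: I can only discuss your own results, so I can't share anyone else's scores. Would you like to look at how you scored on that same section?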
3b. Brevity
Always instruct the AI to keep its responses brief and focused. There are three reasons to mention this in your prompt:
- The chat tile has limited space on the dashboard, so long replies look cramped or break the layout.
- Short replies generate faster, keeping the conversation feeling interactive.
- Respondents engage more when replies feel like a chat, not an essay.
In your prompt, recommend short paragraphs, bolded key terms, and bullet points where appropriate; the chat tile renders the AI's HTML formatting natively.
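For example, a reply that follows these formatting instructions might come back as HTML like the following, which the tile renders directly (content invented for illustration):

<p>Your strongest area is <b>Strategic Thinking</b>.</p>
<ul>
<li>Focus first on <b>Delegation</b>, your lowest-scoring section.</li>
</ul>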
3c. Sample Prompts
The prompts below are starting points, not finished templates. Adapt the role, the merge strings, and the tone to your assessment's domain and audience before publishing.
Starter prompt: Individual AI Discuss
You are a leadership assessment results discussion assistant.
Your role is to help {Response.FirstName} understand and reflect on their
leadership assessment results in a constructive, accurate, and supportive way.
{Tables[SectionNo=ALL SubSectionNo=ALL QuestionNo=ALL]}
Base all responses only on:
- the respondent's leadership assessment results above
Confidentiality:
- Do not discuss or speculate about other respondents, even if asked by name.
- If the user asks about another person's results, politely decline and
redirect the conversation back to their own results.
Tone and format:
- Be brief and focused. Aim for short paragraphs and bullet points.
- Bold key terms or section names so the respondent can scan the answer.
- Use a supportive, growth-oriented tone.
Starter prompt: Cohort AI Discuss
You are a cohort results discussion assistant.
Your role is to help the user explore aggregated cohort-level insights from
this leadership assessment in a constructive and accurate way.
{Tables[SectionNo=ALL SubSectionNo=ALL QuestionNo=ALL]}
Scope:
- You are showing cohort-level insight, not individual response feedback.
- Treat all cohort results as aggregated and confidential.
Confidentiality:
- Do not identify, name, or speculate about specific individuals in the cohort.
- Do not make recommendations about specific individuals (for example,
decisions about hiring, promotion, or termination).
- If the user asks about a specific person, politely decline and redirect
the conversation to cohort-level patterns and themes.
Tone and format:
- Be brief and focused. Aim for short paragraphs and bullet points.
- Bold key terms or section names so the user can scan the answer.
- Use a constructive, insight-oriented tone.
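With the cohort prompt above, a personnel-decision probe should be redirected rather than answered. An illustrative exchange (wording invented for illustration):

User: Who on the team should I let go?
AI: I can't make recommendations about specific individuals, and the results I have are aggregated. What I can show you is where the cohort's lowest-scoring themes are, if that would help.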
For more on prompt structure, merge strings, and HTML output formatting, see Writing Effective AI Interpretation Prompts. The full library of available merge strings is documented in Merge Strings Available for Reports and Results.
4. Testing the Prompt in the Builder
Before publishing, test your AI Discuss prompt against a real response (for an Individual scope) or a cohort (for a Cohort scope). Testing in the Builder lets you confirm the AI's tone, the merge strings resolve correctly, and the confidentiality boundaries hold under typical questions.
- Open the AI Discuss configuration window from the AI Discuss item in its AI Individual or AI Cohort section.
- Select a response (or cohort) from the Test against dropdown.
- Type a sample question into the test conversation, just as a respondent would. Try common questions and a few edge cases (for example, a question about another respondent or a sensitive personnel decision) to verify your confidentiality instructions are working.
- Use Clear Conversation to reset the test thread between runs.
- Refine the prompt and re-test until the AI's responses match your expectations.
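A minimal test script for an Individual prompt might look like this (questions are illustrative; adapt them to your assessment):

1. "What does my lowest-scoring section mean?" — expect a brief, grounded answer that references the respondent's actual results.
2. "What did Bob score?" — expect a polite refusal and a redirect back to the respondent's own results.
3. "Give me a detailed ten-paragraph analysis." — expect the brevity instructions to hold, with a short, focused reply instead.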
5. Adding AI Discuss to a Results Dashboard
Once your AI Discuss item is configured in the Builder, surface it to respondents by adding a tile to a Results Dashboard. Use the Individual Results Dashboard for an individual AI Discuss, and the Cohort Results Dashboard for a cohort AI Discuss. For background on dashboards, see Building and Deploying Results Dashboards.
- Open the Results Dashboard for the assessment (the individual one for an individual AI Discuss, the cohort one for a cohort AI Discuss).
- Click Add > New Tile, or edit an existing tile.
- In the Tile Edit window, on the right-hand side panel, set:
- MergeType: select AI Discuss.
- Show: select AI Discuss Prompt.
- Specified Question: select the AI Discuss item created in the Builder. The dropdown groups options by their AI section, for example "AI Individual-Subsection 2 > AI Discuss Individual" or "AI Cohort-Subsection 1 > AI Discuss Cohort".
- (Optional) Add an introductory message in the tile editor body, such as "What part of your leadership results would you like to explore first?" Then insert the merge string in the form {AIDiscuss[S# SS# Q#]} where you want the chat box to render. For example, an AI Discuss item that lives in Section 7, Subsection 2 as the first item would render with {AIDiscuss[S7 SS2 Q1]}. You can find the section, subsection, and order numbers in the Assessment Builder.
- Save the tile, then save the dashboard.
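Put together, a minimal tile body might look like this (the intro line and the S7 SS2 Q1 reference are illustrative; substitute your own item's numbers):

What part of your leadership results would you like to explore first?
{AIDiscuss[S7 SS2 Q1]}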
Tip: Give AI Discuss its own dedicated tile, typically on the right-hand side of the dashboard. Appending the chat to an AI Interpretation tile makes both feel cluttered; a standalone tile presents AI Discuss as a clear chat surface.
6. What Respondents See
When a respondent opens the Results Dashboard, the AI Discuss tile renders with:
- An intro message. The text or question you wrote in the tile body, for example "What part of your leadership results would you like to explore first?".
- An "Ask anything" input field with a send arrow.
- AI replies inside the tile. Replies follow the tone and formatting instructions in your prompt.
- A Delete Discussion button that clears the conversation so the respondent can start over.
Note: The first AI reply can take a few seconds to compute while the model processes the prompt and merge string data. Subsequent replies in the same conversation are faster.