

Chronosphere Observability Platform provides generative artificial intelligence (AI) tools to generate PromQL or logs queries from natural language prompts. Prompt AI with natural language that reflects what you want Observability Platform to query, and Observability Platform generates queries that return relevant data based on those prompts. You can converse with AI to provide more context, refine the generated query, and ask questions that could be answered by new queries.
Generative AI features can produce incorrect results, hallucinate data, and deliver inaccurate analysis. Use generative AI features with care, and independently verify all information produced by generative AI tools before applying it. Certain prompts, data, or other inputs might produce irrelevant content. Don't rely on generative AI features or responses for any uses that exceed their designed scope.

Generate queries from natural language prompts

You can generate queries from natural language in Metrics Explorer and Logs Explorer. Fields that support natural language queries include an Edit with AI button.
  1. Open Metrics Explorer to query metrics, or open Logs Explorer to query logs.
  2. In the query field, click Edit with AI to open the natural language prompt field. You can also click or focus on the query field and use the keyboard shortcut Control+I (Command+I on macOS) to open the prompt field.
  3. In the Describe your query prompt field that appears, write your query as a natural language prompt.
  4. Click Generate or press Enter (Return on macOS) to submit your prompt.
  5. Optional: To cancel a prompt being processed, click Stop.
Prompts must be related to monitoring or relevant observability data, and must also be specific about the services, metrics, or logs being queried. If you submit a prompt that doesn’t appear to be related to monitoring or lacks specificity, Observability Platform notifies you or requests more information.
Observability Platform populates the query field with its generated query and presents actions that you can take with the generated query.
  • To immediately run the generated query, click Accept and run or press Control+Enter (Command+Return on macOS).
  • To accept the generated query without running it, click Accept or press Tab.
  • To revert the query field to its previous state, click Reject or press Esc.
You can also manually edit the generated query and refine it with additional prompts.
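For example, a prompt such as "Show the error rate of requests to the checkout service over the last 5 minutes" might generate a PromQL query similar to the following sketch. The metric name http_requests_total and the service and code labels are illustrative assumptions; the actual query depends on the metrics available in your environment:

```promql
# Fraction of checkout requests returning 5xx over the last 5 minutes
# (metric and label names are hypothetical)
sum(rate(http_requests_total{service="checkout", code=~"5.."}[5m]))
  /
sum(rate(http_requests_total{service="checkout"}[5m]))
```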

Refine your query with additional prompts

When you enter a prompt while editing with AI, Observability Platform generates a query based on it. If you've generated a query, manually entered a query, or opened a query in an explorer from elsewhere in Observability Platform, you can also refine that query using AI. To provide additional prompts, click the natural language prompt field, now named Refine your query, and enter a new prompt. Observability Platform adjusts the query with this new context and presents the change as a diff: the original query is listed first with a red indicator, followed by the new suggestion with a green indicator. After each iterative prompt, Observability Platform either presents the same Accept and run, Accept, and Reject options, or notifies you of issues with your prompt.

Write effective natural language prompts

You might not have enough information to write effective prompts when you begin using AI to generate a query. However, you can iterate by repeatedly refining your query with a goal of providing enough context to generate a query tailored to your issue. To write effective prompts for queries:
  • Write the goal of the query you want to generate as a specific and concise statement.
  • Provide relevant context that you already have, such as metric or label names.
  • State the form of data you want, such as a count, rate, or histogram.
  • Avoid overly terse, vague, ambiguous, or irrelevant prompts.
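As a rough illustration of how the stated form of data shapes the generated query, the following PromQL sketches use a hypothetical http_requests_total counter and http_request_duration_seconds histogram; your metric names will differ:

```promql
# A count: total requests over the last 5 minutes
increase(http_requests_total[5m])

# A rate: per-second request throughput
rate(http_requests_total[5m])

# A histogram quantile: 95th percentile request latency
histogram_quantile(0.95, sum by (le) (rate(http_request_duration_seconds_bucket[5m])))
```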
For example, an overly terse prompt is unlikely to provide enough context to generate a sufficiently precise query, and might not provide enough information to generate any query:
downtime
A vague and imprecise prompt with unnecessary information is less likely to generate an actionable query:
Tell me why I’m getting so many downtime alerts this week.
While this might produce a query, you can iterate on the result to focus on more specific issues. A more concise and precise prompt helps the large language model generate a more focused query:
Which shopping cart service alerts fired in the last 7 days, and why?
Providing more information or refining your criteria can further focus the result:
Show success rates versus failures for requests to the shopping cart service in the last 7 days.
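A prompt like this might yield a query along the lines of the following sketch, which compares successful (2xx) responses against all responses for the service. The shopping-cart label value and the code label convention are assumptions; adjust them to match your metrics, and set the time range selector to the last 7 days:

```promql
# Share of shopping cart requests that succeeded (2xx)
# (metric and label names are hypothetical)
sum(rate(http_requests_total{service="shopping-cart", code=~"2.."}[5m]))
  /
sum(rate(http_requests_total{service="shopping-cart"}[5m]))
```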
Once you begin generating relevant queries, you can engage more conversationally with the AI as you continue iterating. The editor retains past context as you continue prompting to refine your query.

Troubleshoot generated queries

Confirm that the time range selector is set to a relevant span of time for your prompt. To investigate the query and identify any errors or unexpected results, click Debug. You can copy and paste any reported errors as prompts that refine the query. Remember that the resulting query represents the large language model's hypothesis, and isn't a definitive answer to your question. For example, it might not select the most relevant telemetry data for your request. Critically review the generated query and edit it manually when necessary. Observability Platform incorporates your edits into subsequent prompts to guide further refinements or changes to the generated query.

Close the natural language query field

To close the natural language query field, click X or press Control+I (Command+I on macOS).