Chronosphere Observability Platform provides generative artificial intelligence (AI) tools to generate PromQL or logs queries from natural language prompts. Prompt AI with natural language that reflects what you want Observability Platform to query, and Observability Platform generates queries that return relevant data based on those prompts. You can converse with AI to provide more context, refine the generated query, and ask questions that could be answered by new queries.
Generate queries from natural language prompts
You can generate queries from natural language in Metrics Explorer and Logs Explorer. Fields that support natural language queries include an Edit with AI button.

- Open Metrics Explorer to query metrics, or open Logs Explorer to query logs.
- In the query field, click Edit with AI to open the natural language prompt field. You can also click or focus on the query field and use the keyboard shortcut Control+I (Command+I on macOS) to open the prompt field.
- In the Describe your query prompt field that appears, write your query as a natural language prompt.
- Click Generate or press Enter (Return on macOS) to submit your prompt.
- Optional: To cancel a prompt being processed, click Stop.
Prompts must be related to monitoring or relevant observability data, and must
also be specific about the services, metrics, or logs being queried. If you submit
a prompt that doesn’t appear to be related to monitoring or lacks specificity,
Observability Platform notifies you or requests more information.
- To immediately run the generated query, click Accept and run or press Control+Enter (Command+Return on macOS).
- To accept the generated query without running it, click Accept or press Tab.
- To revert the query field to its previous state, click Reject or press Esc.
Refine your query with additional prompts
When you enter a prompt while editing with AI, Observability Platform generates a query based on it. If you've generated a query, manually entered a query, or opened a query in an explorer from elsewhere in Observability Platform, you can also refine that query using AI.

To provide additional prompts, click the natural language prompt field, now named Refine your query, and enter a new prompt. Observability Platform adjusts the query with this new context and presents it as a diff: the original query is listed first with a red indicator, followed by the new suggestion with a green indicator. After each iterative prompt, Observability Platform either presents the same Accept and run, Accept, and Reject options, or notifies you of issues with your prompt.

Write effective natural language prompts
You might not have enough information to write effective prompts when you begin using AI to generate a query. However, you can iterate by repeatedly refining your query, with the goal of providing enough context to generate a query tailored to your issue. To write effective prompts for queries:

- Write the goal of the query you want to generate as a specific and concise statement.
- Provide relevant context that you already have, such as metric or label names.
- State the form of data you want, such as a count, rate, or histogram.
- Avoid overly terse, vague, ambiguous, or irrelevant prompts.
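To illustrate how the data form you request shapes the result, these PromQL sketches show a count, a rate, and a histogram quantile. The metric names (`http_requests_total`, `http_request_duration_seconds_bucket`) are hypothetical placeholders for illustration only; the queries AI generates depend on the metrics available in your environment:

```promql
# Count: total requests over the last hour (hypothetical metric name)
sum(increase(http_requests_total[1h]))

# Rate: per-second request rate averaged over 5-minute windows
sum(rate(http_requests_total[5m]))

# Histogram: 95th percentile request latency from histogram buckets
histogram_quantile(0.95, sum(rate(http_request_duration_seconds_bucket[5m])) by (le))
```

Naming the form you want in your prompt (for example, "as a per-second rate" or "as a 95th percentile") steers the generated query toward the matching PromQL functions.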
A vague and imprecise prompt with unnecessary information is less likely to generate an actionable query:

Tell me why I'm getting so many downtime alerts this week.

While this might produce a query, you can iterate on the result to focus on more specific issues. A more concise and precise prompt helps the large language model generate a more focused query:

Which shopping cart service alerts fired in the last 7 days, and why?

Providing more information or refining your criteria can further focus the result:

Show success rates versus failures for requests to the shopping cart service in the last 7 days.

Once you begin generating relevant queries, you can engage more conversationally with the AI as you continue iterating. The editor retains past context as you continue prompting to refine your query.
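For a prompt like the last one, the generated PromQL might resemble the following sketch. The metric name `http_requests_total` and the `service` and `code` labels are assumptions for illustration; the actual query depends on the metrics and labels in your environment:

```promql
# Hypothetical sketch of a generated query: error ratio for the shopping
# cart service over 7 days. Metric and label names are assumed, not real.
sum(rate(http_requests_total{service="shopping-cart", code=~"5.."}[7d]))
/
sum(rate(http_requests_total{service="shopping-cart"}[7d]))
```

If the generated query uses different names than your environment, refine it with a follow-up prompt that supplies the correct metric or label names.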

