Analytics and Notifications for PI System Explorer (PI Server 2018)

Learn how to gain insights into analysis performance

  • Last Updated: Jan 09, 2025
  • 4-minute read

The Performance tab provides helpful insights into the performance of individual analyses and analysis groups. The metrics on this tab can help you identify potential issues and problematic analyses.

By default, analyses are grouped by template. To view a full list of analyses on the PI AF Server and their statistics, deselect the Group by: Template option.

Note: The analyses on the Performance tab are retrieved from the default PI AF Server, not just the current AF database.

From this tab, you can perform the following tasks:

  • View analysis performance

  • Sort analyses by column headers

  • Add and remove column headers

  • Filter analyses

  • Export table data to a file

Description of column headings

The following table lists the column headings you can add to the Performance tab. Headings can be added or removed depending on the amount of information you wish to view.

Column heading

Description

AF Database

The name of the AF database where an analysis is stored.

Analysis Name

The name of the analysis.

Average Analysis Count

Indicates the average number of analyses running in the group since the service started up. Generally, this will stay constant at the number of analyses that belong to the group, but can fluctuate if analyses are enabled or disabled over time.

Average Elapsed (ms)

The average amount of time in milliseconds that the analysis took to execute.

Average Error Ratio

The average ratio of analysis executions that failed due to an error (see the ratio sketch after this table).

Average Lag (ms)

The average amount of lag time in milliseconds between when the analysis should have run and when it actually ran.

Average Success Ratio

The average ratio of analysis executions that completed successfully.

Average Trigger (ms)

The average interval in milliseconds between analysis executions.

Current Evaluation Lag (ms)

The time in milliseconds between when the Analysis Service started and when it finished the last evaluation.

Current Lag (ms)

The amount of time in milliseconds between when an analysis was expected to run, due to a trigger or schedule, and when it actually executed.

Current Scheduling Lag (ms)

The amount of lag time in milliseconds that an analysis execution was delayed due to waiting in the evaluation queue.

Duplicate Ignored Count

The number of times the analysis or group received a triggering event whose timestamp exactly matched the latest trigger time. This may be harmless, simply indicating that multiple inputs received triggering events with the same timestamp, or it may indicate that one of the triggering inputs received late data.

Element Template

The name of the AF element template where an analysis is configured.

Error Count

The number of times an analysis tried to run but failed due to an error.

First Trigger Time

The time at which a trigger first caused the analysis to be evaluated.

Group Name

The name of the analysis or analysis group.

Group Trigger Ratio

A ratio of the average interval between analysis group executions.

Impact score

A rating that indicates an analysis's impact on system performance. This score is calculated for each analysis group using the following formula: Average Elapsed / Average Trigger × Average Analysis Count. A worked example follows this table.

Last Trigger Time

The time at which a trigger last caused the analysis to be evaluated.

Out of Order Ignored Count

The number of triggers discarded because they arrived at a timestamp prior to the latest trigger time. An out-of-order trigger indicates that one or more input attributes received triggering events late, or out of order with respect to one or more other triggering input attributes.

Rank

Indicates the depth of the analysis or analysis group in a dependency chain. Analyses whose input attributes do not come from any other analyses are rank 0. An analysis with one or more input attributes that are outputs of rank 0 analyses is rank 1; an analysis with one or more input attributes that are outputs of rank 1 analyses is rank 2, and so on. The rank of a given analysis corresponds to its deepest input attribute. Within a given dependency chain, no rank N analysis is evaluated until every related analysis of rank N-1 or lower has been evaluated for that same timestamp. A sketch of this calculation follows the table.

Schedule

How often an analysis is scheduled to run. There are two types of scheduling: periodic and event-triggered. See Understand analysis scheduling.

Skip Count

The number of times the execution of an analysis was skipped.

Skipped Evaluation Count

The number of times an analysis evaluation was skipped.

Skipped Evaluation Percentage

The percentage of skipped evaluations since the Analysis Service started.

Success Count

The number of times an analysis was evaluated successfully.

Template path

The path of the AF element template where an analysis is stored.

Total Evaluation Count

The total number of times an analysis was evaluated.
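
The two ratio columns read most naturally as fractions of total evaluations. The following is a minimal sketch under that assumption; the function and counter names are illustrative, not part of any PI AF SDK:

```python
def evaluation_ratios(success_count: int, error_count: int) -> tuple[float, float]:
    """Return (success_ratio, error_ratio) as fractions of all evaluations.

    Assumes the ratios are computed over success + error counts; the exact
    averaging window is determined by the Analysis Service and not shown here.
    """
    total = success_count + error_count
    if total == 0:
        return 0.0, 0.0
    return success_count / total, error_count / total

# Example: 95 successful runs and 5 failed runs.
print(evaluation_ratios(95, 5))  # (0.95, 0.05)
```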
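
To make the Impact score formula concrete, here is a minimal sketch; the function name and the example numbers are illustrative, not from the product:

```python
def impact_score(avg_elapsed_ms: float, avg_trigger_ms: float,
                 avg_analysis_count: float) -> float:
    """Impact score = Average Elapsed / Average Trigger x Average Analysis Count."""
    if avg_trigger_ms <= 0:
        raise ValueError("Average Trigger must be positive")
    return avg_elapsed_ms / avg_trigger_ms * avg_analysis_count

# A group of 20 analyses that each take 50 ms on average and trigger every
# 1000 ms spends roughly 50/1000 * 20 = 1.0 processor's worth of time evaluating.
print(impact_score(50.0, 1000.0, 20.0))  # 1.0
```

Higher scores flag groups that consume a disproportionate share of evaluation time, which makes this column a useful first sort key when hunting for problematic analyses.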
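
Rank is effectively the longest-path depth of an analysis in its dependency graph. The following is a minimal sketch of that calculation, assuming a hypothetical mapping from each analysis to the analyses whose outputs feed its inputs; all names are illustrative:

```python
from functools import lru_cache

# Hypothetical dependency map: analysis -> analyses whose outputs feed its inputs.
DEPENDENCIES: dict[str, list[str]] = {
    "FlowTotal": [],                              # rank 0: no inputs from analyses
    "DailyAverage": ["FlowTotal"],                # rank 1: consumes a rank 0 output
    "Efficiency": ["DailyAverage", "FlowTotal"],  # rank 2: deepest input is rank 1
}

@lru_cache(maxsize=None)
def rank(analysis: str) -> int:
    """Rank 0 if no inputs come from other analyses; otherwise
    1 + the rank of the deepest (highest-rank) input."""
    deps = DEPENDENCIES.get(analysis, [])
    if not deps:
        return 0
    return 1 + max(rank(dep) for dep in deps)

for name in DEPENDENCIES:
    print(name, rank(name))  # FlowTotal 0, DailyAverage 1, Efficiency 2
```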
