About the Tests Page

To access this page, click Observability > Active Assurance > Tests.

Tests are entities that you configure to measure metrics over a bounded period and produce a static measurement for that duration. You (superusers and network administrators) can use the Tests page to view a list of the Tests that the Test Agents run in your network and the details of these Tests. For more information, see Tests and Monitors Overview.

The widgets display the following information:

  • Total—The total number of Tests that you have run in your network.

  • Passed—The number of Tests that completed successfully. Paragon Automation considers a Test passed when none of the KPIs exceeds the evaluation criteria that you specified. Paragon Automation also displays the pass percentage of Tests.

  • Failed—The number of Tests that failed. Paragon Automation considers a Test failed when any one of the KPIs exceeds the evaluation criteria that you specified. Paragon Automation also displays the fail percentage of Tests.

  • Error—The number of Tests that encountered errors. Paragon Automation also displays the error percentage of Tests. Errors can occur for various reasons, ranging from a Test Agent being offline to a metric configuration failure or a timeout.

    Note:

    By default, these widgets display data for the past week. However, you can customize the data visualization for a specific time range. For example, if you click 2h, the widgets display the Tests that were run in the past 2 hours.
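
The pass, fail, and error percentages that the widgets display are simple ratios of each count to the total number of Tests. The following sketch illustrates that arithmetic with hypothetical counts; it is not output from Paragon Automation.

    # Hypothetical widget counts; not actual Paragon Automation output.
    total, passed, failed, errored = 50, 40, 7, 3

    # Each percentage is the count divided by the total number of Tests.
    for label, count in (("Passed", passed), ("Failed", failed), ("Error", errored)):
        print(f"{label}: {count} ({count / total:.0%})")
    # Prints, for example: Passed: 40 (80%)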

Tasks You Can Perform

You can perform the following tasks on the Tests page:

  • View details of a Test—You can view the list of all the Tests that you have run and the details of each Test. To view the details of a Test, select the Test and click More > Details. You can also hover over the Test-Name and click the Detailed View icon. The Test execution details pane appears on the right side of the page, displaying the Test details.

    On the Test execution details pane, you can:

    • Copy API Request URL—You can copy the API request URL and reuse it to fetch the Test details when you rerun the Test (see the first example after this list).

    • Download Results—You can download the Test result summary as a JSON file to your local system. When you click the Download Results button, Paragon Automation generates the JSON file for you to download.

      The JSON file includes the information displayed in the Test execution details pane, which you can use to analyze the results locally (see the second example after this list).

    • You can view additional information such as the Test ID, Test name, Test status (passed or failed), and so on.

    Click the Close (x) icon to close the pane.

  • View details of a selected Test. See About the Test-Name Page.

  • Create a Test. See Create a Test.

  • You can also perform the following tasks on this page:

    • Sort, resize, or re-arrange columns in a table (grid).

    • Show or hide columns in the table or reset page preferences, using the vertical ellipsis menu.

    • Filter the data displayed in the table—Click the filter icon (funnel) and select whether you want to show or hide advanced filters. You can then add or remove filter criteria, save criteria as a filter, apply or clear filters, and so on. The filtered results are displayed on the same page.

    For more information, see GUI Overview.
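
If you copy the API Request URL from the Test execution details pane, you can fetch the same Test details programmatically. The following Python sketch assumes a copied URL and bearer-token authentication; the URL, token, and header shown are placeholders, and the authentication scheme is an assumption rather than a documented Paragon Automation detail.

    import requests  # third-party HTTP client

    # Placeholder values: paste the URL copied from the Test execution
    # details pane and supply credentials valid for your installation.
    api_request_url = "https://<paragon-host>/<path-from-copied-url>"  # copied API Request URL
    headers = {"Authorization": "Bearer <your-access-token>"}          # assumed auth scheme

    response = requests.get(api_request_url, headers=headers)
    response.raise_for_status()     # stop on HTTP errors
    test_details = response.json()  # Test details as a Python dictionary
    print(test_details)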
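
The JSON file that Download Results generates can also be inspected locally with any JSON tooling. This sketch only assumes a file saved from that button; it prints the top-level structure rather than specific fields, because the exact layout of the file is not described here.

    import json

    # Path to the file saved from the Download Results button.
    with open("test-results.json", encoding="utf-8") as fh:
        results = json.load(fh)

    # Show the top-level structure; specific fields such as the Test ID,
    # name, or status depend on the file's actual layout.
    if isinstance(results, dict):
        for key in results:
            print(key)
    else:
        print(f"{len(results)} top-level entries")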

Table 1: Fields on the Tests Page

Name

The name of the Test that you specified when you created the Test.

Status

The status of the Test:
  • Scheduled—The Test is scheduled to run.

  • Waiting—The Test Agent is getting ready to run the Test.

  • Running—The Test is running.

  • Stopping—The Test is being stopped and preparing the results.

  • Error—The Test has encountered an error while being executed by the Test Agent. The errors can be anything from a Test Agent being offline to a metric configuration failure or a timeout.

  • Passed—The Test has completed successfully.

  • Failed—The Test has failed because values for some metrics exceeded the threshold that you configured.

  • Canceled—You have canceled the Test. Paragon Automation retains the data related to the canceled Test up to the point of cancellation.

Status Message

The status message generated by Paragon Automation after you run a Test.

Execution Start Time

The date and time when the Test execution started. The timestamp is displayed in the following format: Month DD, YYYY, HH:MM:SS AM/PM.

For example, Feb 5, 2024, 4:29:52 PM.

Execution End Time

The date and time when the Test ended. The timestamp is displayed in the following format: Month DD, YYYY, HH:MM:SS AM/PM.

For example, Feb 5, 2024, 4:29:52 PM.

Executed By

The username (e-mail address) of the user who ran the Test.

Test Tags

The tags you have configured for the Test in the key:value format.

By default, only two tags are displayed in the column. To view more than two tags, click + Number more.

A tag is a key-value pair in which the key signifies a category for which you configure a value. The value is an identifier for the category.

Examples of key-value pairs are device and device name (edgedevice:acx7000), and site and site name (site:bangalore).

You can configure tags for a Test when you create the Test.
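
Because tags are plain key-value pairs, they are also convenient to work with outside the GUI, for example when post-processing downloaded results. The following sketch shows one way to filter Tests by a tag; the data structure is illustrative only and does not reflect the exact format that Paragon Automation uses.

    # Illustrative data only; real Test records come from Paragon Automation.
    tests = [
        {"name": "latency-check", "tags": {"edgedevice": "acx7000", "site": "bangalore"}},
        {"name": "dns-check", "tags": {"site": "london"}},
    ]

    # Select the names of Tests whose tags include site:bangalore.
    matching = [t["name"] for t in tests if t["tags"].get("site") == "bangalore"]
    print(matching)  # ['latency-check']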