1. Go to Authorization > Policies.
  2. Select the policy you want to test, and then click the Test tab.
  3. Define the test scenario:
    1. Optional: To include a PingOne user ID as context in the decision request, search for a user in the PingOne User field.
      Important:

      If you are testing an element that depends on the PingOne.User attribute, you must select a user to avoid a MISSING_ATTRIBUTE error.

      The selected user must be defined in the current PingOne environment.

    2. In the Attributes list, select any attributes that you want to include as request parameters, and provide sample values.
      Tip:

      Click the angle brackets next to an attribute to display a rich JSON text editor.

    3. In the Overrides section, configure attribute and service values if you want to override those elements' default behavior.

      For example, if an attribute is defined with a request parameter resolver, and no value is specified in the test request, the decision service resolves that attribute from the Overrides configuration.

      Note:

      You can also click the Import JSON button to define the test scenario with a rich text editor.

    Using the Payment checks policy from the tutorials as an example, the following testing scenario uses the Amount attribute in a request to test whether the policy denies payments over 10,000 USD.


    Screen capture of the Testing Scenario tab showing a request with the Amount attribute set to a value of 10900.
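    The Import JSON option expects a JSON document describing the scenario. As a rough sketch only, a payload for the example above might resemble the following; the `parameters` key and overall shape are assumptions for illustration, not the documented schema, so check the editor's pre-populated JSON for the exact structure your environment expects.

    ```python
    import json

    # Hypothetical test-scenario payload for the Payment checks example.
    # The key names here are illustrative; the rich text editor shows the
    # actual schema for your environment.
    scenario = {
        "parameters": {
            "Amount": 10900  # over the 10,000 USD limit, so the policy should deny
        }
    }

    print(json.dumps(scenario, indent=2))
    ```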
  4. Optional: To process statements on a simulated API request or response, select the Process Statements check box.

    For more information, see Testing statement processing.

  5. Click Execute.

    The Visualization tab shows test results. As expected, the payment is denied.

    Screen capture of the Test Results Visualization tab showing a deny result.
  6. Examine the decision flow to make sure decisions are evaluated according to your expectations.

    You can click any box in the flow to show more details.

  7. Click the other tabs for additional details:
    • Request tab: Shows the JSON request sent to the decision service, allowing you to confirm that the expected information was sent.

      If you enabled statement processing, this tab also shows the API request or response.

    • Response tab: Shows the complete, high-verbosity response for the decision, including expanded errors and other helpful information.

      If you enabled statement processing, this tab also shows the transformed API request or response body and other statement processing details.

      Note:

      If the same comparison condition is attached to more than one rule in the policy subtree, the decision response only includes evaluation of the first occurrence of this condition. Despite only appearing once in the response, the decision service evaluates this condition wherever it is needed to make a decision.

      If the parent policy of the first instance of this condition is not applicable to the request, the decision response does not include evaluation of any rule containing this condition. This behavior is the same regardless of the rule's outcome (Permit, Deny, Not Applicable).
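      The behavior described in this note can be modeled with a small sketch: a condition shared by several rules is evaluated at every rule that needs it, but only its first occurrence is recorded in the response. All names below are illustrative, not the decision service's internals.

      ```python
      def run_rules(rules, amount):
          """Toy model: a comparison condition attached to several rules is
          evaluated at each rule, but recorded in the response only at its
          first occurrence."""
          evaluations = 0
          response = {}

          def shared_condition():
              nonlocal evaluations
              evaluations += 1  # evaluated wherever it is needed...
              return amount > 10_000

          for rule_id in rules:
              result = shared_condition()
              # ...but setdefault keeps only the first occurrence in the response
              response.setdefault("shared_condition", {"rule": rule_id, "result": result})

          return evaluations, response

      count, resp = run_rules(["rule-a", "rule-b", "rule-c"], 10900)
      print(count)                     # the condition is evaluated three times
      print(resp["shared_condition"])  # but reported once, at the first rule
      ```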

    • Output tab: Shows details about the decision, including the time it took to evaluate policies and rules.
    • Attributes tab: Shows details about the attributes used in the decision.
    • Services tab: Shows details about the services used in the decision.
    • Processing Result tab: Shows the transformed API request or response body with statements applied, along with any specified URI and headers. This tab is displayed only if you enabled statement processing.
  8. To repeat the test using a different scenario, click the Testing Scenario tab, change the parameters, and then click Execute.

    The following example tests an Amount value that is less than 10,000 USD.


    Screen capture of the Testing Scenario tab showing a request with the Amount attribute set to a value of 9900.

    A second Visualization tab shows the test results. This time, the payment is permitted. Test Results tabs are numbered to help you track testing scenarios in the order in which they were run. If you delete a tab, the numbering of the remaining tabs doesn't change, so you can still match each result to its scenario.

    Screen capture of the Test Results Visualization tab showing a permit result.
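    The two scenarios above exercise both branches of the rule being tested. As a minimal sketch, assuming the Payment checks policy simply denies amounts over 10,000 USD and permits everything else (the real policy is defined in the console, and this function is not part of any PingOne API), the expected outcomes are:

    ```python
    def payment_decision(amount: float) -> str:
        """Toy stand-in for the Payment checks rule from the tutorials:
        deny payments over 10,000 USD, otherwise permit."""
        return "DENY" if amount > 10_000 else "PERMIT"

    print(payment_decision(10900))  # first scenario: DENY
    print(payment_decision(9900))   # second scenario: PERMIT
    ```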