Inline mode

In PingIntelligence inline deployment mode, API Security Enforcer (ASE) sits at the edge of your network to receive the API traffic. Inline mode can also be deployed behind an existing load balancer, such as AWS Elastic Load Balancing (ELB).

In inline mode, ASE deploys at the edge of the data center and terminates SSL connections from API clients. It then forwards the requests directly to the APIs, API gateways, or app servers, such as Node.js, WebLogic, Tomcat, and PHP.

To configure ASE to work in inline mode, set mode=inline in the ase-defaults.yml file.
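A minimal fragment of the setting, assuming ase-defaults.yml uses standard YAML key-value syntax (the exact surrounding structure of the file may differ in your installation):

```yaml
# ase-defaults.yml (fragment)
# Deployment mode for API Security Enforcer
mode: inline
```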

A diagram of API Security Enforcer inline deployment mode.

The following is a high-level description of traffic flow:

  1. ASE receives a client request and logs it in the access log file. ASE then forwards the request to the backend server, receives the response, and logs the response in the access log file.
  2. The request and response entries in the access log file are sent to the API Behavioral Security (ABS) artificial intelligence (AI) engine for processing. The ABS AI engine generates an attack list, which ASE fetches. Based on that attack list, ASE either forwards or blocks future requests.
  3. The AI engine data is stored in MongoDB.
  4. The PingIntelligence for APIs WebGUI fetches the data from ABS to display in the dashboard.
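The inline decision logic in steps 1 and 2 can be sketched as follows. This is an illustrative model only: ASE's actual attack-list handling is internal, and names such as ATTACK_LIST, ACCESS_LOG, and handle_request are hypothetical stand-ins.

```python
# Hypothetical sketch of ASE inline mode: log, check the attack list, forward or block.

ATTACK_LIST = {"203.0.113.7"}  # deny list fetched from the ABS AI engine (illustrative)
ACCESS_LOG = []                # stand-in for the access log file


def forward_to_backend(request):
    """Stub for the API, API gateway, or app server (Node.js, WebLogic, Tomcat, PHP)."""
    return {"status": 200, "body": "ok"}


def handle_request(client_ip, request):
    """Log the request, then block or forward based on the attack list."""
    ACCESS_LOG.append(("request", client_ip, request))
    if client_ip in ATTACK_LIST:
        # Client is on the AI-generated attack list: block the request.
        return {"status": 403, "body": "blocked"}
    response = forward_to_backend(request)
    ACCESS_LOG.append(("response", client_ip, response))
    return response
```

The key design point is that blocking happens at ASE, before the request ever reaches the backend, using a list the AI engine produced from earlier logged traffic.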

Sideband mode

When PingIntelligence is deployed in sideband mode, a sideband policy is added to the API gateway, which makes calls to ASE to pass API request and response metadata. In sideband mode, ASE does not terminate the client requests.

To configure ASE to work in sideband mode, set mode=sideband in the ase-defaults.yml file.
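A minimal fragment of the setting, assuming ase-defaults.yml uses standard YAML key-value syntax (the exact surrounding structure of the file may differ in your installation):

```yaml
# ase-defaults.yml (fragment)
# Deployment mode for API Security Enforcer
mode: sideband
```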

A diagram of API Security Enforcer sideband deployment mode.

The following is a description of the traffic flow through the API gateway and Ping Identity ASE:

  1. The API client sends a request to the API gateway.
  2. The API gateway makes an API call to send the request metadata in JSON format to ASE.
  3. ASE checks the request against a registered set of APIs and checks the client identifier against the AI-generated deny list. If all checks pass, ASE returns a 200-OK response to the API gateway. Otherwise, a different response code is sent to the gateway. The request is also logged by ASE and sent to the AI engine for processing.
  4. When the API gateway receives a response from ASE, it forwards the request to the backend server unless blocking is enabled and the client is on the deny list.
  5. The response from the backend server is received by the API gateway.
  6. The API gateway makes a second API call to pass the response information to ASE, which sends the information to the AI engine for processing.
  7. ASE receives the response information and sends a 200-OK response to the API gateway.
  8. The API gateway sends the response received from the backend server to the client.
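The eight steps above can be sketched from the gateway's point of view. This is a hedged model, not the real ASE sideband API: the helper names (ase_check_request, ase_log_response, gateway_handle) and the response codes other than 200-OK are hypothetical stand-ins.

```python
# Hypothetical sketch of the sideband sequence between API gateway and ASE.

DENY_LIST = {"attacker-token"}  # AI-generated deny list held by ASE (illustrative)
BLOCKING_ENABLED = True


def ase_check_request(metadata):
    """Step 3: ASE checks the request metadata against the deny list."""
    if metadata["client_id"] in DENY_LIST:
        return 403  # illustrative non-200 code; the real code may differ
    return 200


def ase_log_response(metadata):
    """Step 7: ASE accepts the response metadata and acknowledges with 200-OK."""
    return 200


def gateway_handle(client_id, request):
    """Steps 1-8 as seen from the API gateway."""
    metadata = {"client_id": client_id, "request": request}  # step 2: JSON metadata
    verdict = ase_check_request(metadata)                    # step 3
    if BLOCKING_ENABLED and verdict != 200:                  # step 4: block on deny list
        return {"status": 403, "body": "blocked"}
    backend_response = {"status": 200, "body": "ok"}         # step 5: backend stub
    ase_log_response({"client_id": client_id,
                      "response": backend_response})         # steps 6-7
    return backend_response                                  # step 8
```

Note that the client's request and response always travel through the gateway; ASE only ever sees metadata, which is why it cannot terminate client connections in this mode.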
Note: To complete the ASE sideband mode deployment, see Integrate API gateways for sideband deployment.