Concurrent Requests Inside an HTTP API Broker

When we design the API for our application, we want it to be clean and easy to access. One option is to centralize it, meaning clients only have a single target: the API server. But this is not so easy when we have many internal applications that sit on many different servers.

*(diagram: client, API Broker, Users application, and Contacts application)*

In the diagram above, for example, the client is given access to list all of a user’s contact profiles via the API Broker. The API Broker itself is allowed to access the Users application to get the details of a specific user, and the Contacts application, which exposes two APIs: one to get the list of a user’s contacts and one to get a contact’s profile details.
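To make the rest of the discussion concrete, here is a minimal sketch of the two upstream clients the broker talks to. All type and method names are assumptions for illustration only; the post does not define the real API contracts.

```go
package broker

import "context"

// UserProfile is the detail returned by the Users application (assumed shape).
type UserProfile struct {
	ID   string
	Name string
}

// ContactProfile is the detail returned by the Contacts application (assumed shape).
type ContactProfile struct {
	ID    string
	Name  string
	Email string
}

// UsersClient wraps the Users application API.
type UsersClient interface {
	GetUserProfile(ctx context.Context, userID string) (UserProfile, error)
}

// ContactsClient wraps the two Contacts application APIs.
type ContactsClient interface {
	// Returns the contact reference identifiers for a user.
	GetListUserContacts(ctx context.Context, userID string) ([]string, error)
	// Returns the profile details for a single contact.
	GetContactProfile(ctx context.Context, contactID string) (ContactProfile, error)
}
```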

What happens is that when the client sends a new “Get List User Contact Profiles” request to the API Broker, the broker creates a “Get User Profile” request to the Users application, and once it has received the user profile details successfully, it continues by creating a “Get List User Contacts” request to the Contacts application. The API Broker then receives the list of contact reference identifiers for the user identifier it passed to the Contacts application, and that list is needed to read all of the contact details.

The problem we are going to face is latency: we have to iterate over the list of contact reference identifiers, and each iteration creates a new “Get Contact Profile” request, so the total response time grows with the number of contacts.
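A sequential version of the broker handler, continuing the assumed package and types from the sketch above, makes the problem visible: with N contacts the handler pays roughly N round-trips to the Contacts application, one after another.

```go
// GetListUserContactProfiles is the sequential version: one call to the Users
// application, one call for the identifier list, then one "Get Contact Profile"
// call per identifier, each waiting for the previous one to finish.
func GetListUserContactProfiles(ctx context.Context, users UsersClient, contacts ContactsClient, userID string) ([]ContactProfile, error) {
	if _, err := users.GetUserProfile(ctx, userID); err != nil {
		return nil, err
	}

	ids, err := contacts.GetListUserContacts(ctx, userID)
	if err != nil {
		return nil, err
	}

	profiles := make([]ContactProfile, 0, len(ids))
	for _, id := range ids { // sequential iteration: latency adds up per contact
		p, err := contacts.GetContactProfile(ctx, id)
		if err != nil {
			return nil, err
		}
		profiles = append(profiles, p)
	}
	return profiles, nil
}
```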

We can get rid of the sequential iteration by putting a queue-and-workers module inside the broker: several workers run concurrently and process the messages that pass through the queue. There is also a mechanism that splits a message into several parts and recomposes those parts back into the original message, enriched with the additional information.
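One way to sketch this in Go, still using the assumed types above, is a channel as the queue and a fixed pool of goroutines as the workers: the identifier list is split into one job per contact, the workers fetch the profiles concurrently, and the results are recomposed back into a single ordered list. The worker count and channel sizes here are arbitrary choices for the sketch, and the `sync` package is needed in addition to `context`.

```go
// GetListUserContactProfilesConcurrent fans the per-contact requests out to a
// worker pool and fans the results back in, preserving the original order.
func GetListUserContactProfilesConcurrent(ctx context.Context, users UsersClient, contacts ContactsClient, userID string) ([]ContactProfile, error) {
	if _, err := users.GetUserProfile(ctx, userID); err != nil {
		return nil, err
	}
	ids, err := contacts.GetListUserContacts(ctx, userID)
	if err != nil {
		return nil, err
	}

	type part struct { // one split part of the original message
		index int
		id    string
	}
	type result struct { // carries the part's position so we can recompose in order
		index   int
		profile ContactProfile
		err     error
	}

	jobs := make(chan part)
	results := make(chan result, len(ids))

	// Workers: run concurrently, processing messages that pass through the queue.
	const workers = 5
	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := range jobs {
				p, err := contacts.GetContactProfile(ctx, j.id)
				results <- result{index: j.index, profile: p, err: err}
			}
		}()
	}

	// Split: enqueue one part per contact reference identifier.
	go func() {
		for i, id := range ids {
			jobs <- part{index: i, id: id}
		}
		close(jobs)
	}()

	// Close the results channel once every worker has finished.
	go func() {
		wg.Wait()
		close(results)
	}()

	// Recompose: put the parts back into their original positions.
	profiles := make([]ContactProfile, len(ids))
	for r := range results {
		if r.err != nil {
			return nil, r.err
		}
		profiles[r.index] = r.profile
	}
	return profiles, nil
}
```

With this shape, the slowest contact request, not the sum of all of them, dominates the broker's response time, at the cost of extra concurrent load on the Contacts application.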

*(diagram: API Broker with an internal queue and concurrent workers)*