API Gateways to Direct Microservices Architecture

In an age where thousands of devices interact with microservices and API-driven servers, an API gateway can act as a single entry point to internal architecture — a popular choice for developers as it increases security, improves user experience, and helps ecosystems thrive.

In this piece, we discuss what an API gateway is, the benefits and drawbacks of such a system, the differences between API gateways and API management, and how to implement an API gateway in the modern API economy.

What is an API Gateway?

An API gateway is a single interface that handles a variety of requests to internal servers. The term has been applied to various architectural flows, but in its simplest form, an API gateway may act as a:

  • filter for incoming traffic from disparate devices — web, mobile, cloud, B2B, web services, etc.,
  • single entry point layer that can expose various APIs, microservices, and/or virtual machines on the provider’s application server,
  • common offering within API management solutions,
  • router for API rate limiting and traffic management,
  • security mechanism, with authentication, logging, and more.
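To make the rate-limiting role above concrete, here is a minimal token-bucket limiter of the kind a gateway might keep per client key. This is a sketch only; the `TokenBucket` class and its numbers are illustrative, not taken from any particular gateway product.

```javascript
// Minimal token-bucket rate limiter, one bucket per client key (illustrative).
class TokenBucket {
    constructor(capacity, refillPerSecond) {
        this.capacity = capacity;
        this.tokens = capacity;
        this.refillPerSecond = refillPerSecond;
        this.lastRefill = Date.now();
    }

    // Returns true if the request may pass, false if it should be throttled.
    allow() {
        const now = Date.now();
        const elapsed = (now - this.lastRefill) / 1000;
        this.tokens = Math.min(this.capacity,
            this.tokens + elapsed * this.refillPerSecond);
        this.lastRefill = now;
        if (this.tokens >= 1) {
            this.tokens -= 1;
            return true;
        }
        return false;
    }
}

// Two requests allowed, no refill, purely for demonstration.
const bucket = new TokenBucket(2, 0);
console.log(bucket.allow()); // true
console.log(bucket.allow()); // true
console.log(bucket.allow()); // false -- third request is throttled
```

In a real gateway the bucket would be keyed by API key or client IP, and the reject path would return a 429 response rather than a boolean.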

When a user enters an API gateway, they may be coming from a variety of disparate devices. They may require a variety of responses to their large variety of queries. The gateway unifies these requests, and presents a front end in which the requesting entity can use almost any system, browser, or client, and manage to get what they need.
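That "single entry point" idea can be sketched as a route table: the gateway maps public path prefixes to internal services, so every client, whatever its device or platform, talks to one front end. The hostnames and the `resolveUpstream` helper here are invented for illustration.

```javascript
// Hypothetical route table: one public entry point, many internal services.
const routes = [
    { prefix: '/users',  upstream: 'http://users.internal:3001' },
    { prefix: '/orders', upstream: 'http://orders.internal:3002' },
];

// Resolve an incoming path to the internal service that should handle it.
function resolveUpstream(path) {
    const match = routes.find(r => path.startsWith(r.prefix));
    return match ? match.upstream : null;
}

console.log(resolveUpstream('/orders/42')); // http://orders.internal:3002
console.log(resolveUpstream('/unknown'));   // null -> gateway returns 404
```

The requesting client only ever sees the gateway's hostname; the internal topology behind the route table can change freely without breaking consumers.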

api gateway visualization-flow-architecture-nordic-apis

Difference Between API Gateways and API Management

It’s worth taking a moment to discuss API management solutions, if only to differentiate the two. While there certainly is some crossover, comparing API gateways and API management is really a comparison of apples and oranges.

API gateways create a barrier between the external requesting clients and the internal APIs that tie disparate microservices into a unified architecture. API management, on the other hand, is more focused on a larger scope. Whereas an API gateway is simply a method of redirection and filtration of third party traffic, API Management solutions handle this as well as analytics, business data, adjunct provider services, and implementation of versioning control.

API management can encompass API gateways — after all, a suite focused on management wouldn’t be worth its salt without offering such a feature. It is important to delineate the two, however, as they are often (erroneously) referred to as synonymous.

In short — an API gateway is a component in a full API management solution.

6 Benefits of an API Gateway

Integration of an API gateway can be time-consuming, but is ultimately worth the time and resources. For most use cases, the implementation of an API gateway will grant such a substantial amount of benefit to the ecosystem that it is worth the relatively few drawbacks.

1: Separation of Application Layer and Requesting Parties

One of the greatest benefits of such an implementation is the fact that the API gateway by design segregates the microservices partitions within the API from the users of that API. This distance increases security and is done in such a way that, while the two realms are separate, the relationship remains fundamentally the same.

2: Increases Simplicity for the Consumer

By presenting external users a single, unified front end to a bevy of sub-APIs and systems, microservices can be partitioned into multiple derivations, each with specific functions, servers, security solutions, and implementations, without ever exposing that complexity to the consumer.

3: Improves Development and Reduces Server Load

Segregation of functionality and purpose not only makes development more streamlined by focusing team talent where it can be best used, it also reduces server load by separating common and uncommon functions. A common function might field thousands of small requests a day, whereas an uncommon function might field only a few — but if the few requests require more bandwidth and server resources than the many, segregating them will benefit the functionality of the entire system.
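The common/uncommon split described above can be sketched as a routing decision at the gateway. Everything here (the pool names, hostnames, and `choosePool` helper) is illustrative: the point is only that rare, bandwidth-heavy endpoints get their own servers so they never starve the high-volume, cheap traffic.

```javascript
// Illustrative split: cheap, high-volume calls share a pool, while the rare
// but expensive reporting endpoints get dedicated servers.
const pools = {
    common: ['http://app-1.internal', 'http://app-2.internal'],
    heavy:  ['http://reports-1.internal'],
};

// Paths known (e.g. from metrics) to be few in number but resource-hungry.
const heavyPaths = ['/reports', '/exports'];

function choosePool(path) {
    return heavyPaths.some(p => path.startsWith(p)) ? pools.heavy : pools.common;
}

console.log(choosePool('/reports/monthly') === pools.heavy);  // true
console.log(choosePool('/profile') === pools.common);         // true
```

Which paths count as "heavy" would in practice come from the metrics the gateway itself collects (see benefit 6 below).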

4: Buffer Zone Against Attacks

As reverse engineering of APIs usually takes place when a service is exposed, segregating these servers makes them harder to take down or manipulate. By separating them, you create a buffer zone of sorts, where if a service is compromised, only that service will be affected.

To the user, however, this goes unseen — the gateway itself serves as an abstraction layer, preventing the user from ever being aware of this partition. The only difference a user might experience is the ability to make multiple requests in a single, unified request — an experience that is only positive in nature, reducing the burden of use on the user, and shifting multi-call logic (and the relevant processing power required) from the client to the gateway.
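That shift of multi-call logic from client to gateway looks roughly like the following sketch: one client request fans out to several internal calls, and the gateway merges whatever succeeds. The stand-in service functions are invented for illustration; a real gateway would make HTTP or queue calls here.

```javascript
// Stand-ins for internal microservice calls behind the gateway.
async function fetchProfile() { return { name: 'Ada' }; }
async function fetchOrders()  { return { orders: 2 }; }
async function fetchBroken()  { throw new Error('service down'); }

// One public request fans out to several internal requests; fulfilled
// results are merged, failed services simply contribute nothing.
async function aggregate() {
    const results = await Promise.allSettled([
        fetchProfile(), fetchOrders(), fetchBroken(),
    ]);
    return Object.assign({}, ...results
        .filter(r => r.status === 'fulfilled')
        .map(r => r.value));
}

aggregate().then(data => console.log(data)); // { name: 'Ada', orders: 2 }
```

The client makes one round trip and receives one combined payload; the fan-out, retries, and partial-failure handling all stay server-side.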

5: Cater to Specific Customers to Improve UX

Unlike the unified calls above, where the benefit is evident in the call itself, the best benefit of this system is one the user will never see. By segregating microservices into various portals and filtering traffic via a gateway, the API developer can more accurately provide the best API for each client and use case.

When a user makes a request, they don’t have to wade through thousands of lines of documentation to find the specific call — they simply look at the specification for the exact API provided by the gateway, and make their request. The user will never be aware that there are more revisions of the API handling different functionality, and even if they somehow figure this out, they will not be confused or distracted by additional, unnecessary functionality.

These benefits are especially important when considering the variations of the cloud stack, where the functions and needs vary wildly between each segment of the population, and where certain services can field upwards of a thousand calls a minute.

6: Log Metrics to Anticipate Change

Finally, this approach grants a wonderful opportunity in the realm of analytics. By segregating services, tracking the data used on those services, and seeing what parts of the network undergo the most stress, developers can more accurately track and understand the incredibly important API metrics data that can often mean wide adoption or death, depending on how it’s used.

Drawbacks of an API Gateway

Don’t be misled, however — the drawbacks of the API gateway system may be significant. These arise directly from the way API gateways handle data, and how they interact with other services. Unlike issues with languages, documentation specifications, or architecture approaches, the drawbacks inherent in the API gateway approach cannot be fixed with a third party solution or a simple tweak.

First and foremost, the API gateway approach introduces an enormous level of complexity. Imagine a basic API server setup which provides 99.9% uptime with consistent bandwidth and relatively stable distribution of traffic.

In such a setup, the client request would come into your internal network, be filtered through a variety of security solutions, be passed through a load balancer, and finally arrive at a single server for processing and response. This process takes place incredibly fast, only hampered by the large number of devices passing through.

Now imagine a setup using the API gateway approach. The request would enter the internal network through a gateway. At this point, the traffic would be filtered, and passed to load balancers grouped between various servers or virtual machines. Traffic would have to be analyzed and quantified as to where it belongs. After this, the process would be passed on to the server, which would then validate that it indeed can carry out the call, which it would then do.

This complexity grows exponentially — having logical or virtual machines for each subset of an API, especially a unified API composed of ten or twenty functions, balloons not only the development and maintenance overhead of the system, but the length of time it takes to process a request.

These are the two main drawbacks of the API gateway approach:

  • Increased complexity arising from API segregation;
  • Increased response time deriving from added layers of communication.

Gateways as a Security Feature

Security is a huge deal in the API space — no discussion about a new feature in the API space would be complete without addressing the potential benefits and drawbacks in the face of security.

The good news is that implementing an API gateway is one of the best things you can possibly do for your API’s security profile. When considering security, there’s a fundamental acronym used by IT professionals the world over – CIA. CIA stands for Confidentiality (the privacy of information), Integrity (ensuring that data is not changed illicitly), and Availability (ensuring data is accessible when requested).

API gateways meet CIA in three highly effective ways:

Confidentiality

By isolating the servers on which private data is secured either logically through an API gateway server or virtually through segregated server instances, the confidentiality of data is ensured by removing the main methods of attack.

Your servers are designed to resist intrusion, and to prevent data breaches. By segregating the data from front-facing servers, you’re creating a choke point, where, even if data is breached, you can control whether the data ever leaves the server in the first place.

Integrity

In the API gateway approach, integrity is assured as a result of the way data is handled. When data enters the server ecosystem, each server routes the requests and traffic to various other servers and microservices. As these requests are vetted and fulfilled, the data is tracked at every step, and a long paper trail of sorts is maintained.

It is this very paper trail that ensures data integrity, as there are multiple points where the data can be compiled, checked, and recorded.
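A minimal sketch of that paper trail, with invented names: each hop through the gateway pipeline appends a record, so a request's path through the system can be audited end to end.

```javascript
// Illustrative audit trail: every hop a request makes is recorded.
const auditLog = [];

function recordHop(requestId, stage) {
    auditLog.push({ requestId, stage, at: new Date().toISOString() });
}

// Simulate one request moving through the gateway pipeline.
recordHop('req-1', 'gateway:received');
recordHop('req-1', 'auth:verified');
recordHop('req-1', 'service:orders:fulfilled');

console.log(auditLog.map(e => e.stage));
// [ 'gateway:received', 'auth:verified', 'service:orders:fulfilled' ]
```

In production this log would go to durable, append-only storage rather than an in-memory array, but the shape of the record is the same.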

Availability

One of the biggest issues in the web space is availability. Unfortunately, there are groups of people out there that are going to constantly attack you, constantly test your servers. Thankfully, the API gateway provides a good defense against this.

When poorly formed requests filter in, they’re not handled by the same server handling the API, and can be rejected while approved connections maintain their stream of communication.
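Rejecting poorly formed requests at the gateway might look like the following sketch. The `validateRequest` helper and its rules are invented for illustration; the point is that malformed traffic is turned away before it ever touches an API server.

```javascript
// Illustrative gate: reject malformed requests at the edge.
function validateRequest(req) {
    if (typeof req.path !== 'string' || !req.path.startsWith('/')) {
        return { ok: false, status: 400 }; // malformed path
    }
    if (!['GET', 'POST'].includes(req.method)) {
        return { ok: false, status: 405 }; // method not allowed
    }
    return { ok: true, status: 200 };
}

console.log(validateRequest({ method: 'GET', path: '/users' }));
// { ok: true, status: 200 }
console.log(validateRequest({ method: 'TRACE', path: '/users' }));
// { ok: false, status: 405 }
```

Because this check runs on the gateway tier, a flood of junk requests consumes gateway capacity only, while approved connections to the internal services continue uninterrupted.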

This can be balanced even further with load balancing solutions. Leveling your API gateway across multiple servers, and then routing these servers to the internal API microservices architectures creates a situation where an attacker can ram against your server, be rejected, and nobody is any the wiser.

Real-World Example

Auth0, a Single Sign-On and token authentication provider, published documentation demonstrating the API gateway concept in a clear, easy-to-understand example. The example, featured below, is a simple Node.js gateway that receives and forwards HTTP requests to internal endpoints, authenticating along the way using JWT (JSON Web Tokens).

/*
 * Parses the request and dispatches multiple concurrent requests to each
 * internal endpoint. Results are aggregated and returned.
 */
function serviceDispatch(req, res) {
    var parsedUrl = url.parse(req.url);

    Service.findOne({ url: parsedUrl.pathname }, function(err, service) {
        if(err) {
            logger.error(err);
            res.statusCode = 500;
            res.end();
            return;
        }

        var authorized = roleCheck(req.context.authPayload.jwt, service);
        if(!authorized) {
            res.statusCode = 401;
            res.end();
            return;
        }

        // Fanout all requests to all related endpoints.
        // Results are aggregated (more complex strategies are possible).
        var promises = [];
        service.endpoints.forEach(function(endpoint) {
            logger.debug(sprintf('Dispatching request from public endpoint ' +
                '%s to internal endpoint %s (%s)',
                req.url, endpoint.url, endpoint.type));

            switch(endpoint.type) {
                case 'http-get':
                case 'http-post':
                    promises.push(httpPromise(req, endpoint.url,
                        endpoint.type === 'http-get'));
                    break;
                case 'amqp':
                    promises.push(amqpPromise(req, endpoint.url));
                    break;
                default:
                    logger.error('Unknown endpoint type: ' + endpoint.type);
            }
        });

        // Aggregation strategy for multiple endpoints.
        Q.allSettled(promises).then(function(results) {
            var responseData = {};

            results.forEach(function(result) {
                if(result.state === 'fulfilled') {
                    responseData = _.extend(responseData, result.value);
                } else {
                    logger.error(result.reason);
                }
            });

            res.setHeader('Content-Type', 'application/json');
            res.end(JSON.stringify(responseData));
        });
    });
}

In this example, a few things are occurring. First, when a request is received, it is processed by the serviceDispatch function, which looks up the internal service matching the requested URL and checks that the caller is authorized. The service.endpoints.forEach loop then dispatches requests to the known endpoints within the internal network — in other words, these two steps comprise the first half of the API gateway data flow.

After this, the gateway uses Q.allSettled to aggregate the responses returned from the internal endpoints, at which point the combined response is channeled back to the user who requested it.
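The roleCheck step in the example above can be sketched as a simple claims check. This version is a guess at its shape, not Auth0's actual implementation, and it assumes the JWT's signature has already been verified upstream (in real code, with a library such as jsonwebtoken) so that `payload` holds trusted claims.

```javascript
// Hypothetical role check against already-verified JWT claims.
function roleCheck(payload, service) {
    return (payload.roles || []).includes(service.requiredRole);
}

const service = { name: 'orders', requiredRole: 'orders:read' };

console.log(roleCheck({ roles: ['orders:read'] }, service)); // true
console.log(roleCheck({ roles: [] }, service));              // false
console.log(roleCheck({}, service));                         // false
```

Keeping authorization at the gateway means each internal service can trust that any request reaching it has already passed this check.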

Conclusion

Fundamentally speaking, an API gateway is perhaps one of the most effective tools in the modern API development toolkit. Between the load balancing, microservices partitioning, and security benefits granted by the system, what few drawbacks do exist pale in comparison.

That is not to say, of course, that API gateways are a magic bullet. If a developer only uses a single API function, the API gateway is a pointless endeavor. But for API providers using two or more basic approaches or functions in their API, the gateway is a wonderful choice, and should be seriously looked at as a design and security feature.