Until recently, developers were certain that when we talked about an API, it was most likely about a #RESTful model. Not anymore. The ‘mess’ started when the term API began to be used for #Microservices, which have their own event-based application interfaces. Then Google and Facebook added #gRPC and #GraphQL APIs respectively. I am not going to compare them in this short post, but a few new risks are worth pointing out now.
Why is REST not enough? Because it requires:
- a strict agreement between the developer writing the ‘consumer’ code and the developer writing the ‘provider’ code. Both need to know what-is-what in the request, i.e. share knowledge or strict documentation of the REST “service”. Both need to know which event of which Microservice means what and how to obtain it (via polling, push or an event-sourcing store). This information is not specified in the Microservice Architecture, yet without it developers cannot solve any business tasks with Microservices
- multiple interactions about details of the same thing (too chatty), especially when using the GET method in a RESTful API
- a fixed spectrum of API data models, limited by the provider, that is either too tightly coupled to a particular consumer’s needs (via off-line agreements) or, the opposite, not related to them at all
- special data compositions built on the provider side only to support a particular request
- weak, indirect or implicit support for integration at the level of business functionality rather than data.
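The “too chatty” point can be sketched in a few lines of Python. The resources and payloads below are hypothetical, and plain dictionary lookups stand in for HTTP round trips; the point is only that assembling one consumer view costs one GET per resource:

```python
# Hypothetical in-memory "REST provider": each get() call stands in for
# one HTTP GET round trip to a separate resource endpoint.
ORDERS = {1: {"id": 1, "customer_id": 10, "item_ids": [100, 101]}}
CUSTOMERS = {10: {"id": 10, "name": "Acme Corp"}}
ITEMS = {100: {"id": 100, "name": "bolt"}, 101: {"id": 101, "name": "nut"}}

def get(resource, key):
    """Stand-in for `GET /{resource}/{key}` — one network round trip each."""
    return {"orders": ORDERS, "customers": CUSTOMERS, "items": ITEMS}[resource][key]

def order_view(order_id):
    """Assemble one consumer view: 1 + 1 + N GET calls (too chatty)."""
    order = get("orders", order_id)                       # call 1
    customer = get("customers", order["customer_id"])     # call 2
    items = [get("items", i) for i in order["item_ids"]]  # calls 3..N+2
    return {"customer": customer["name"], "items": [it["name"] for it in items]}

print(order_view(1))  # four round trips for one screenful of data
```

Every nested relationship adds another round of GETs, which is exactly the interaction pattern GraphQL was designed to collapse into a single request.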
In contrast, GraphQL can request multiple typed data subsets, identified by the consumer, and receive them all at once. While GraphQL reportedly outperforms REST by almost 10 times, it brings its own demands and risks for the provider:
- to identify or set relationships within the data – the graph – to be queried (similar to SQL)
- if the data interrelationships on the provider side are real and exposed to consumers, this couples the provider with its consumers and constrains the provider’s flexibility
- GraphQL itself provides no security means over the exchanged data: every consumer has the same rights to all exposed data. That is, a special security conversation between the provider and each consumer is due before even considering a GraphQL API, including within the same application
- somehow the API data models should be shared with consumers, and consumers should be notified about semantic and logical changes in those models (should a strict agreement be used here again?). The often-cited claim that GraphQL needs no versioning is true only for additions – a new type and/or data entity: new consumers can use them while old ones simply ignore them. If semantics, logical relationships or entities are changed or removed, all old consumers are affected.
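Both the single-request idea and the versioning caveat can be illustrated with a toy, hand-rolled resolver. This is not a real GraphQL engine, and the entities and fields are invented for the sketch; the shape of the interaction is the point:

```python
# Toy illustration of the GraphQL idea: the consumer names the typed subsets
# it wants, and one request returns them all. Hypothetical data, not a real
# GraphQL library.
DATA = {
    "order": {"id": 1, "status": "shipped", "total": 42.0},
    "customer": {"id": 10, "name": "Acme Corp", "tier": "gold"},
}

def query(selection):
    """selection maps an entity name to the list of fields the consumer wants."""
    return {entity: {f: DATA[entity][f] for f in fields}
            for entity, fields in selection.items()}

# One round trip fetches fields of two different entities at once.
result = query({"order": ["status"], "customer": ["name"]})
print(result)
```

Note the versioning point from the text: the field "tier" was added later, and this old consumer is unaffected because it never asks for it; renaming or removing "status" or "name", however, would break every selection that mentions them.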
Special attention should be paid to the gRPC API. It has resurrected the old hope of returning functional integration to business applications and systems; this was always available with language-specific RPC, CORBA and WS-* interfaces. In such cases, the business applications/components minded and managed the data they needed to work with. IT’s over-concentration on data has made the collaboration between business and technology half-closed, regardless of any agility.
A business works via collaborating and cooperating on business activities – functionality – first of all. SOA, and now its closest implementation via Microservices, does model these joint business efforts, though the Microservice Architecture still does not recognise this fundamental business need (which, I expect, will be fixed soon). As we know, RPC stands for Remote Procedure Call – a mechanism that allows one program to request a service from another program located anywhere on a network without having to understand the network’s details. That is, the service is not necessarily about data exchange.
The major benefits of gRPC are:
- it has its own protocol distinct from plain HTTP (though it utilises #HTTP/2, which is backward compatible with HTTP/1.1) and is therefore free from the coupling agreement typical of REST. At the same time, it is possible to map a request written in REST onto gRPC, in which case the mapping plays the role of the agreement
- it has its own interface description language – #Protobuf – that describes the structure of the data used and comes with a source-code generator. The generated code can parse a stream of bytes representing the serialised structured data
- it is well suited for streaming (while REST is not efficient for streams) and supports multiplexing several parallel streams over the same connection
- it outperforms REST
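As a sketch of how Protobuf plays the role of the shared agreement, here is a minimal, hypothetical service definition; the package, message and service names are illustrative, and the `stream` keyword marks the server-streaming call that has no natural REST equivalent:

```protobuf
syntax = "proto3";

package quotes;

// Hypothetical messages: the .proto file is the shared contract from which
// both consumer and provider stubs are generated.
message QuoteRequest {
  string symbol = 1;
}

message Quote {
  string symbol = 1;
  double price = 2;
}

service QuoteService {
  // Plain unary call, comparable to a single REST GET.
  rpc GetQuote(QuoteRequest) returns (Quote);

  // Server streaming: many Quotes multiplexed over one HTTP/2 connection.
  rpc WatchQuotes(QuoteRequest) returns (stream Quote);
}
```

Running this file through the Protobuf compiler with a gRPC plugin yields client and server stubs in each side’s language, so the .proto file itself is the only thing the two parties have to agree on.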
The downsides and new risks of gRPC are:
- it violates the basic consumer/provider (client/server) concept with its ability to push unsolicited content from the server to the client. It is unclear what the consumer can do with such extra information, and how. It is very costly to make such a model work in the general case unless a close coupling (agreement) between consumer and provider is used again. Solutions that work inside a company, in this case Google, may not be good for use across company boundaries (external APIs)
- the cost of using the faster and more reliable HTTP/2 is that browsers require HTTPS encryption on the consumer’s side. While more and more browsers provide this feature out of the box, there is still no guarantee that the to-be-secured communication works as expected.
The gRPC framework (via its Protobuf service definitions) also makes it possible to specify several pluggable features, such as authentication, bidirectional streaming and flow control, blocking or non-blocking network bindings, and cancellation and timeouts. I am not happy with this ability because it breaks the fundamental technology principle of separation of concerns. More is not always better.
In particular, an interface to an entity must be flexible but free from any business logic such as authentication or flow control – these are prerogatives of the entity and its execution context and needs. The same relates to cancellation and timeouts, because they are behavioural means of the business. If you attach such features to the interface, you will always end up with discrepancies between the life-cycles of your entity and its interface, which is silly and unprofessional.
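The separation-of-concerns argument can be sketched as follows. In this hypothetical Python example (the names are invented, not from any real framework), the callable entity knows nothing about deadlines; the timeout is enforced entirely by the caller’s execution context, so the entity and its interface keep independent life-cycles:

```python
# Sketch: deadline handling lives in the execution context, not in the
# entity's interface. Illustrative names only.
import concurrent.futures
import time

def slow_lookup(key):
    """The entity's own logic — it knows nothing about deadlines."""
    time.sleep(0.05)
    return key.upper()

def call_with_deadline(fn, arg, timeout_s):
    """Deadline enforced by the caller's context, not baked into the interface."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fn, arg)
        try:
            return future.result(timeout=timeout_s)
        except concurrent.futures.TimeoutError:
            future.cancel()
            return None

print(call_with_deadline(slow_lookup, "ok", timeout_s=1.0))      # → OK
print(call_with_deadline(slow_lookup, "late", timeout_s=0.001))  # → None
```

`slow_lookup` can be replaced, versioned or reused in a context with no deadline at all, without ever touching the timeout machinery — which is the decoupling the text argues for.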
Overall, the appearance of different types of API is a good thing. Apparently, it links back to the OASIS #SOA Service description, which assumed many different interfaces to a Service depending on the consumer base and communication means. Developers can now feel free from narrow-minded RESTful-only interactions.