Performance

Two fundamental characteristics that distinguish distributed architectures from ones designed to operate on a single machine are the performance and reliability of communication. With performance, we need to contend with issues such as network latency and limited network bandwidth. Unreliable networks can also affect performance by losing, reordering, or delaying packets or messages, requiring that they be retried or resent.
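
As a simple illustration of coping with lost or delayed messages, a consumer can combine request timeouts with a bounded retry loop. The following Python sketch is only illustrative; the endpoint URL, retry count, and back-off policy are assumptions, not part of any particular service contract.

    import time
    import requests  # third-party HTTP client library

    def get_with_retries(url, attempts=3, timeout=2.0):
        """Issue a GET request, retrying when the network times out or drops the connection."""
        last_error = None
        for attempt in range(attempts):
            try:
                response = requests.get(url, timeout=timeout)
                response.raise_for_status()
                return response
            except (requests.Timeout, requests.ConnectionError) as error:
                last_error = error
                time.sleep(2 ** attempt)  # back off before the next attempt
        raise last_error

    # Hypothetical resource; the URL is illustrative only.
    # invoice = get_with_retries("https://api.example.org/invoices/1234")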

We often think of network performance as absolute, but it is driven by a number of factors, some of which are impacted by architectural decisions. First and foremost is the issue of what data actually needs to be transferred over the network. Data has to move if it is needed for processing at a different location from where it is stored. Therefore, architectures that keep data close to where it is processed naturally perform better than those that have to move it before it can be processed.

Once it is clear that all data flows are essential, the next question is how those flows occur and what type of overhead is involved. The fewer messages and round trips needed to transfer the data, and the less overhead included in those messages, the more efficient the transfer will be (Figure 1). Protocol overhead can be incurred as an interaction is started or a connection is established, and as each message or packet is sent as part of the interaction. Overhead can also be introduced by transferring irrelevant data that will not be used by its recipient.
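
To make the round-trip concern concrete, compare a chatty interaction that fetches an order and then each of its line items separately with a single coarse-grained request. The endpoints, the item_ids field, and the expand parameter in this Python sketch are hypothetical, used only to illustrate the difference in message counts.

    import requests

    BASE = "https://api.example.org"  # illustrative host

    def fetch_order_chatty(order_id):
        """1 + N round trips: one for the order, then one per line item."""
        order = requests.get(f"{BASE}/orders/{order_id}").json()
        order["items"] = [requests.get(f"{BASE}/items/{item_id}").json()
                          for item_id in order["item_ids"]]
        return order

    def fetch_order_coarse(order_id):
        """A single round trip, assuming the service can embed line items in the response."""
        return requests.get(f"{BASE}/orders/{order_id}",
                            params={"expand": "items"}).json()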

Figure 1 - Explicit design decisions can improve or negatively impact overall performance.

User perception is also often a performance concern. For example, a large database query that takes 30 seconds to complete but returns the first page of results in 5 seconds is perceived to perform better than a similar query that returns all results simultaneously after 30 seconds.
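
A common way to deliver that first page quickly is to expose paging parameters on the query resource. The parameter names below (page, per_page) and the resource URL are assumptions made for the sake of the sketch; real services vary.

    import requests

    def first_page(url, per_page=50):
        """Request only the first page of a potentially large result set."""
        response = requests.get(url, params={"page": 1, "per_page": per_page})
        response.raise_for_status()
        return response.json()  # the consumer can render this while later pages are fetched

    # Hypothetical query resource; results can be shown as soon as this call returns.
    # results = first_page("https://api.example.org/invoices")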

REST can negatively impact performance through its Uniform Contract constraint. For example, forcing services and consumers to always share a common technical contract can:

  • prevent data exchange optimizations
  • result in increased messages or round trips for a given activity to complete
  • add redundant overhead to message content

REST can support the Performance goal by using caches to keep available data close to where it is being processed. It can further help by keeping each interaction simple and self-contained (as a request-response pair), which minimizes the overhead associated with setting up complex interactions.
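
One concrete caching mechanism REST builds on is HTTP's conditional request. The Python sketch below keeps a previously fetched representation and revalidates it with an If-None-Match header; a 304 Not Modified response means the cached copy can be reused without transferring the body again. The in-memory dictionary stands in for a real cache, and the resource URL is illustrative.

    import requests

    cache = {}  # url -> (etag, body); an in-memory stand-in for a real cache

    def get_cached(url):
        """Fetch a representation, reusing the local copy when the service says it is unchanged."""
        headers = {}
        if url in cache:
            headers["If-None-Match"] = cache[url][0]  # make the request conditional
        response = requests.get(url, headers=headers)
        if response.status_code == 304:               # Not Modified: reuse the cached body
            return cache[url][1]
        etag = response.headers.get("ETag")
        if etag:
            cache[url] = (etag, response.content)
        return response.content

    # Hypothetical resource; the second call avoids re-transferring an unchanged body.
    # report = get_cached("https://api.example.org/reports/q3")
    # report = get_cached("https://api.example.org/reports/q3")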