REST APIs Turn 25: How They Came To Be and What Could Be Next

The Root Question: How Will the Emerging "AI Era" Impact the Artifacts of the "Web Era"?

Roy Fielding's PhD thesis, published in 2000, formally introduced the term Representational State Transfer (REST). As 2024 draws to a close, REST as a concept is approaching its 25th birthday. As I will explain in detail later, these 25 years of REST characterize the "web era."

With the advent of the "ChatGPT demo" phenomenon and the new optimism fueled by AI and the automation opportunities it may provide, I wanted to revisit the topic of APIs in general and RESTful APIs in particular. In the last part of this article, I will speculate on what could be next in the field of APIs in the emerging "AI era."

I Am Interested in History Because It Provides Us Guidance on How to Navigate the Future

A warning to the reader: Fielding's thesis came before the spread of "REST APIs" across the WWW. Fielding proposed REST as just one of many possible architectural styles for building "distributed hypermedia" systems; the style was distilled from the Web's existing architecture and fed back into the design of HTTP itself. Therefore, Fielding's treatment of the topic is quite abstract and nuanced.

I neither attempt nor claim to represent Fielding's thoughts in detail in this article. My effort will primarily be to trace the history of APIs in general, including what came before REST and some significant events that occurred after it was introduced. This before-and-after comparison should give a good sense of how APIs have developed over time. I will focus more on how various API applications evolved than on the academic concepts that ground them.

What is the purpose of looking into history? Well, it is to see and understand more clearly what APIs are about and how they may evolve in the coming months and years. So, really, this is about studying a bit of history, after which I will speculate on a few directions and make some guesses about where things are headed. Consider this more of a personal exploration of the subject than anything serious and formal.

What REST Was About: Characterizing the Needs of the "Web Era"

When Fielding started work on his thesis:

  • The Web was around 10 years old.
  • It had accumulated a set of standards and a "way of doing things."
  • Fielding's attempt was to "avoid harmful architectures" through a thorough study of the requirements for scaling, caching, component partitioning, communication, and the evolution of the Web.
  • The focus was on collecting sets of "architectural constraints" to build up "architectural styles."
  • The practical result of Fielding's study was directly applied to improvements in HTTP and URL standards.
  • REST, as an architectural pattern, takes into consideration many properties: scalability of component interactions, general interfaces, independent deployment of components, latency, security, backwards compatibility, and more.

A Whirlwind Tour Through the History of APIs

1951 - The First API (But Not Explicitly Called That) - Maurice Wilkes

In 1951, Maurice Wilkes introduced one of the earliest concepts resembling an API in "The Preparation of Programs for an Electronic Digital Computer," outlining a library of reusable subroutines to simplify programming for the EDSAC computer.

1968 - The First Mention of the Term "API" - Ira W. Cotton

In 1968, Ira W. Cotton co-authored "Data Structures and Techniques for Remote Computer Graphics," which contains one of the first documented uses of the term "application program interface" (API), referring to interfaces for remote graphics processing.

1974 - First Database API: C. J. Date

In 1974, C. J. Date compared relational and network database models, focusing on the differences in their Application Programming Interfaces (APIs) to facilitate database interaction.

1991 - CORBA Standard - Object Management Group

In 1991, the Object Management Group (OMG) introduced the CORBA (Common Object Request Broker Architecture) standard to enable communication between distributed, heterogeneous applications across different systems and platforms.

1993 - CGI - Rob McCool


In 1993, Rob McCool developed the Common Gateway Interface (CGI), an early standard for web servers to interact with external applications, laying foundational groundwork for modern web APIs.
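To make the mechanism concrete, here is a minimal sketch of a CGI program (written in Python, though CGI is language-agnostic): the server passes request metadata through environment variables such as QUERY_STRING and relays whatever the program writes to standard output back to the client.

```python
#!/usr/bin/env python3
# Minimal CGI sketch: the web server invokes this program once per request,
# passing request metadata via environment variables and relaying stdout
# back to the browser.
import os

method = os.environ.get("REQUEST_METHOD", "GET")
query = os.environ.get("QUERY_STRING", "")  # e.g. "name=Ada"

# A CGI response is headers, then a blank line, then the body.
print("Content-Type: text/plain")
print()
print(f"Handled a {method} request with query string: {query!r}")
```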

2000 - Roy Fielding's Thesis Introduced the Idea of REST


In 2000, Roy Fielding's PhD thesis introduced the concept of REST (Representational State Transfer), defining a scalable and stateless architecture for web-based applications.
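As a rough illustration of the style in today's terms: a RESTful interaction treats everything as a resource addressed by a URL and manipulated through a uniform interface, with each request carrying everything the server needs. A minimal sketch in Python follows; the endpoint and fields are hypothetical.

```python
# A sketch of stateless, resource-oriented interaction in the REST style,
# using the `requests` library. The URL and fields are illustrative only.
import requests

# Each request is self-contained; the server keeps no client session state.
resp = requests.get(
    "https://api.example.com/books/42",
    headers={"Accept": "application/json"},
)
book = resp.json()  # a *representation* of the resource, not the resource itself
print(book["title"])

# The same uniform interface (HTTP verbs on resource URLs) modifies state.
requests.put("https://api.example.com/books/42", json={**book, "title": "New Title"})
```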

2002 - Bezos' API Mandate - Pushing for Microservices

In 2002, Jeff Bezos issued an internal mandate at Amazon requiring all teams to expose their data and functionality through service interfaces (APIs), which laid the groundwork for Amazon's adoption of microservices architecture and modern cloud computing.

2004 - Flickr's Photo API


In 2004, Flickr launched its Photo API, which allowed developers to programmatically access and manipulate user-uploaded photos, enabling features like photo search, upload, tagging, and metadata retrieval from the Flickr platform.

2015 - GraphQL - Meta Platforms

In 2015, Facebook (now Meta Platforms) publicly released GraphQL, a flexible query language and runtime for APIs that enables clients to request only the data they need, streamlining data retrieval and enhancing efficiency.
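A rough sketch of the idea, posting a query to a hypothetical GraphQL endpoint from Python: the client lists exactly the fields it wants, and only those travel over the wire.

```python
# A sketch of a GraphQL request: the client declares the exact shape of the
# data it needs. The endpoint and schema here are hypothetical.
import requests

query = """
{
  user(id: "42") {
    name
    email
  }
}
"""

resp = requests.post("https://api.example.com/graphql", json={"query": query})
print(resp.json())  # {"data": {"user": {"name": "...", "email": "..."}}}
```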

2016 - gRPC - Google

Google open-sourced gRPC in 2015, and it reached its stable 1.0 release in 2016. gRPC is a high-performance, open-source framework for remote procedure calls (RPC) that enables efficient communication between distributed systems, leveraging HTTP/2 and Protocol Buffers for data serialization.

Speculations About the Future

With the Advent of AI: Standards Must Accommodate AI Bots as Much as Humans

When we look at the standards from the "Web era," the focus was on making systems work for people. Scalability, caching, security, understandability, and simplicity—all these are things we humans care about. They represent a set of concerns that are important to us.

But now we have new "members" on both the producer and consumer sides: AIs. There will be teams of bots doing all sorts of work in many organizations. The ecosystem must change to maximize the productivity of both bots and humans.

APIs Must Become More Discoverable and Usable by AI Bots: HATEOAS May Make a Comeback

Fielding's thesis introduced the idea of HATEOAS — Hypermedia as the Engine of Application State. What Fielding meant by this was the importance of making APIs discoverable. In API responses, for example, you should include "links" or "references" to other resources and options available to the consumer.

The idea didn't take off in a significant way. However, with AI bots in play, this concept may become much more important, as various AIs may consume one's APIs. This could also mean that "dynamic responses" can be generated to fit the needs of a particular AI.
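To make the idea concrete, here is a sketch of what a HATEOAS-style response might look like, loosely following the HAL `_links` convention; the resource, field names, and workflow are illustrative, not a standard.

```python
# A sketch of a HATEOAS-style response: alongside the data, the server embeds
# links describing what the client (human or bot) can do next.
def order_representation(order_id: int, status: str) -> dict:
    links = {
        "self":  {"href": f"/orders/{order_id}"},
        "items": {"href": f"/orders/{order_id}/items"},
    }
    # Only advertise transitions that are currently valid, so a consumer can
    # drive the workflow purely from the response, without out-of-band docs.
    if status == "pending":
        links["cancel"] = {"href": f"/orders/{order_id}/cancel", "method": "POST"}
        links["pay"] = {"href": f"/orders/{order_id}/payment", "method": "POST"}
    return {"id": order_id, "status": status, "_links": links}

print(order_representation(42, "pending"))
```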

Producing Good APIs, Descriptions, and Documentation Will Become Easier and Cheaper

In the near term, I believe many pains associated with designing, testing, and sharing APIs can be reduced to zero with advanced AI tooling.

  • Writing documentation is a demanding task that requires significant skills, time, and energy investment.
  • Maintaining code and documentation in sync is a burden that is often disregarded.
  • Keeping the documentation friendly, helpful, and supportive of discoverability is a rare skill.

With the language understanding available from LLMs, all of these and more API-related problems can be solved. For example, at Hexmos, we are building LiveAPI, which is already producing good results in translating any code repository into a friendly and helpful documentation page—with minimal human input. And we're just getting started in this area, so I see huge improvement potential ahead.

LiveAPI: Interactive API Docs that Convert

Static API docs often lose the customer's attention before they try your APIs. With LiveAPI, developers can instantly try your APIs right from the browser, capturing their attention within the first 30 seconds.

The Rate of New APIs Introduced Will Increase Significantly

Given that LLM-aided IDEs are taking center stage these days, it is safe to assume that producing new code and features will require:

  • Less human effort
  • Reduced time

What this means is that every "implementation plan" can be expedited, potentially by an order of magnitude.

Even design and marketing become less burdensome, thus allowing for a faster time to market for software products, and APIs in particular.

This can only mean one thing: increased experimentation, more initiatives, and thus more APIs in the market to try out.

Self-Upgrading Client and Server Code May Become Possible: More Resilient Systems

Very few APIs last a year without experiencing breakage. As time goes on, the likelihood of an API remaining stable decreases significantly. Even venerated organizations such as AWS routinely have their APIs disrupted, causing problems.

At the root of this issue is the natural cycle of development: interfaces change, versions mismatch, features are removed, new ones are added, and communications are made (and ignored), among other factors.

Producers and consumers routinely go through long processes of negotiation to develop new shared understandings. With automated discovery and automated refactoring of codebases, the amount of "negotiation" needed to "fix" these mismatches may decrease significantly.
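As a thought experiment, here is a hand-written version of the kind of compatibility shim an automated agent might one day generate when it detects a mismatch. The header name, version numbers, and field rename are all hypothetical.

```python
# A speculative sketch of "self-upgrading" client behavior: the client inspects
# a version signal from the server and routes through an adapter instead of
# breaking. Everything here is illustrative, not a real API.
import requests

def fetch_user(user_id: int) -> dict:
    resp = requests.get(f"https://api.example.com/users/{user_id}")
    api_version = resp.headers.get("X-API-Version", "1")
    payload = resp.json()

    if api_version.startswith("2"):
        # Suppose v2 renamed `full_name` back to `name`; adapt rather than crash.
        payload["name"] = payload.pop("full_name")
    return payload
```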

A New Bot Economy Is Emerging: Teams of "Skilled Bots" for Maintaining "New Infra"

One may argue that producing sophisticated new software end-to-end is still too big a challenge for AIs. However, one can imagine teams of bots that are up to the task of helping a few engineers maintain new infrastructure and APIs. They can interpret customer messages, coordinate, and address maintenance issues. They can patch software, upgrade systems, and so on.

New bot capabilities for designing, implementing, testing, and sharing APIs may emerge, and I envision a bot economy developing in this area.

Software Becomes Cheaper: Most API Prices Will Decrease

With a host of automations becoming a real possibility, as outlined above, the "supply side" of software may flood, exceeding the "demand side." And we know what this means—automation and productivity in building and maintaining software lead to the ubiquity of software. This enormous increase in supply must bring down prices. I believe that the cost of software, and APIs in particular, may decrease overall. Of course, new classes of APIs may emerge, powered by more sophisticated systems that could cost more than before, but overall, at an aggregate level, prices are likely to drop.

Context-Aware APIs Are on the Horizon: APIs Can "Know" You to Adjust Responses

Until now, the "inputs" and "outputs" of APIs have been quite static by necessity. By "static," I mean that the information sent in is usually limited to snippets of strings and numbers representing some real-world entities, transactions, and such. However, the "inputs" themselves can become more sophisticated. My preferences, needs, and situation may become more important "inputs" to software in the future. "I and my situation" could influence an API more than any other input. The trend has already been moving in this direction with recommendation algorithms and the like, but those have been expensive to produce and maintain. Such capabilities may become more commoditized, and common structures may emerge for dealing with "context."
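Sketching what this might look like: a request that carries a structured "context" object alongside the usual parameters. The endpoint and schema below are invented for illustration; no common standard for this exists yet.

```python
# A speculative sketch of a "context-aware" API request: the client sends a
# structured description of who is asking and in what situation, and the
# server tailors its response accordingly. Entirely hypothetical.
import requests

resp = requests.post(
    "https://api.example.com/search",
    json={
        "query": "lunch near me",
        "context": {
            "locale": "en-IN",
            "time_of_day": "13:05",
            "dietary_preferences": ["vegetarian"],
            "device": "mobile",
        },
    },
)
print(resp.json())  # results ranked for this consumer's situation
```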

Conclusion

Fielding's REST thesis characterizes the "Web era," and a new wave of automation is emerging as part of the "AI era." The history of APIs shows a tortured and complex route toward their present state. The future will undoubtedly involve a new set of competing standards, false starts, and so on, until new stable standards and "ways of doing things" emerge.

More Reading