People have been searching for a coherent, cohesive API standard for interoperability, something to hold together sets of computer systems, ever since multiple processes could run on the same machine in the 1970s.
Computer systems go through a funny cycle --- they get bigger and bigger on a single machine until we have a giant basement housing a single mainframe, then break apart into smaller machines to distribute the load; then those machines get as big as possible and, once they become unwieldy, break apart again. Once the systems begin to fragment and need to communicate through a standard, we put together a new standard. I remember (and not with any fondness) the COM and CORBA revolutions of the mid-1990s, when systems moved to the fat desktop-fat server configuration known as client-server. Then again in the mid-2000s, when the COM and CORBA revolution collapsed and the new solution was XML-based RPC in SOAP form.
And SOAP, which stands for Simple Object Access Protocol, turned out to be neither simple nor object-oriented, provided only weak access, but was, arguably, an actual protocol. What started out as a good idea -- use a standardized WSDL to define endpoints and data transfer across HTTP POST to web services using a universal format -- turned into a nightmare of layers of SOAP add-ons, data representation woes, XML parsing slowness, and generally Java-based evil.
(Full disclosure: my last go-around with SOAP was writing a parser by hand with nokogiri and ruby 1.8.7.)
Representational state transfer, or REST, is the new standard for lightweight Web 2.0 web services and microservices. The idea is simple: use existing HTTP/1.1 verbs to map to actions, and send arbitrary data across to be acted on based on those actions. Create controllers to support each of these actions for the various service types. And best of all, send the data in JSON. But it's not required -- it's just a good idea.
We just map CRUD to HTTP/1.1 verbs --> CREATE, READ, UPDATE and DELETE are passed back and forth with POST, GET, PUT and DELETE methods. And that's it for the entire REST "protocol."
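To make the mapping concrete, here is a minimal sketch of a REST-style dispatcher in Python. The resource name (`widgets`), the dict-backed store, and the `dispatch` function are all illustrative inventions, not any real framework's API -- the point is only to show the four verbs landing on the four CRUD actions:

```python
# Sketch: route CRUD actions to HTTP/1.1 verbs over a dict-backed store.
# All names here (dispatch, store, /widgets) are hypothetical.

CRUD_TO_HTTP = {
    "create": "POST",
    "read":   "GET",
    "update": "PUT",
    "delete": "DELETE",
}

def dispatch(method, path, store, body=None):
    """Map an HTTP verb on /widgets or /widgets/<id> to a CRUD action.

    Returns an (http_status, payload) tuple.
    """
    _, resource, *rest = path.split("/")
    key = rest[0] if rest else None
    if method == "POST":                     # CREATE
        new_id = str(len(store) + 1)
        store[new_id] = body
        return 201, new_id
    if method == "GET":                      # READ
        return (200, store[key]) if key in store else (404, None)
    if method == "PUT":                      # UPDATE
        store[key] = body
        return 200, key
    if method == "DELETE":                   # DELETE
        store.pop(key, None)
        return 204, None
    return 405, None                         # verb outside the CRUD set
```

A real service would hang these handlers off a router in whatever web framework you use; the shape of the mapping is the same.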
Surprisingly, for building a large number of herded microservices that need a coherent API, this actually works -- works better than previous RPC methods due to its straightforward simplicity. In practice, it mostly works, although when it comes to nullipotent GET calls, systems oftentimes drift away from strict GET implementations because applications tend to need more nuanced data back from queries. But otherwise... yeah.
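That drift usually looks like a query growing too rich for a plain URL. A hypothetical example of the kind of filter that strains a pure GET, using only the standard library (the field names here are made up):

```python
from urllib.parse import urlencode, parse_qs

# A hypothetical search with filtering, sorting, and paging -- the sort of
# query that tempts services into accepting a POST body instead of a GET URL.
query = {"status": "active", "tags": "a,b", "sort": "-created", "page": "3"}
url = "/widgets?" + urlencode(query)

# The same parameters round-trip back out of the query string intact...
parsed = {k: v[0] for k, v in parse_qs(url.split("?", 1)[1]).items()}

# ...but once the filter becomes a nested structure, many services give up
# and expose a POST /widgets/search that takes the query as a JSON body,
# trading strict GET semantics for expressiveness.
```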
Until a new standard comes out, it's safe to assume the standard will be to design coherent REST APIs to expose large systems of tiny machines running many tiny services to an application. It is the API design choice for the cloud and for the new set of Big Data microservices.
This is, of course, until we realize that for whatever reason REST sucks and we have a whole new paradigm. Just wait for it! It'll be here in about 5 years.