Does REST Provide Deep Interoperability?

We at ZapThink were encouraged that our recent ZapFlash on Deep Interoperability generated some intriguing responses. Deep Interoperability is one of the Supertrends in the ZapThink 2020 vision for enterprise IT (now available as a poster for free download or purchase). In essence, the Deep Interoperability Supertrend is the move toward software products that truly interoperate, even over time as standards and products mature and requirements evolve. ZapThink’s prediction is that customers will increasingly demand Deep Interoperability from vendors, and eventually vendors will have to figure out how to deliver it.

One of the key points in that recent ZapFlash was that the Web Services standards don’t even guarantee interoperability, let alone Deep Interoperability. Several vendors picked up on this point in their responses. They came at it from different angles, but the common thread was: hey, we support REST, so we have Deep Interoperability out of the box! So buy our gear, forget the Web Services standards, and your interoperability issues will be a thing of the past!

Not so fast. Such a perspective misses the entire point of Deep Interoperability. For two products to be deeply interoperable, they should be able to interoperate even if their primary interface protocols are incompatible. Remember the modem-negotiation-on-steroids illustration: a 56K modem could still communicate with an older 2400-baud modem because it knew how to negotiate with older modems and could fall back to the slower protocol. Similarly, a REST-based software product would have to be able to interoperate with another product that didn’t support REST by negotiating some other set of protocols that both products did support.
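To make that fallback handshake concrete, here is a minimal Python sketch of the negotiation idea. The protocol names and the preference-ordered lists are illustrative assumptions, not part of any real product or standard.

    # Each product advertises a hypothetical list of interface protocols,
    # ordered from most to least preferred.
    from typing import Optional, Sequence

    def negotiate_protocol(ours: Sequence[str], theirs: Sequence[str]) -> Optional[str]:
        """Pick the most preferred protocol both sides support, falling back
        to older options the way a 56K modem falls back to 2400 baud."""
        for candidate in ours:          # walk our list, best first
            if candidate in theirs:
                return candidate
        return None                     # no common protocol: no interoperability

    # Example: a REST-first product talking to a legacy SOAP-only product.
    print(negotiate_protocol(
        ["rest+json", "soap/1.2", "soap/1.1"],   # our preferences, newest first
        ["soap/1.1"],                            # the other product's only option
    ))  # -> "soap/1.1"

The point of the sketch is the fallback, not the specific protocols: a deeply interoperable product carries enough adaptability to meet an incompatible peer partway.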

But this least-common-denominator negotiation model is still not the whole Deep Interoperability story. Even if all interfaces were REST interfaces, we still wouldn’t have Deep Interoperability. If REST alone guaranteed Deep Interoperability, then there could be no such thing as a bad link.

Bad links on Web pages are ubiquitous, of course. Put a perfectly good link in a Web page that points to a valid resource. Wait a few years. Click the link again. Chances are, the original resource has been deleted, moved, or renamed. 404 Not Found.

OK, all you RESTafarians out there, how do we solve this problem? What can we do when we create a link to prevent it from ever going bad? How do we keep existing links from going bad? And what do we do about all the bad links that are already out there? The answers to these questions are all part of the Deep Interoperability Supertrend.

One important point is that the modem negotiation example is only part of the story, since in that case you already have the two modems, and the initiating modem can find the other one. But Deep Interoperability also requires discoverability and location independence. You can’t interoperate with a piece of software you can’t find.
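Here is a minimal sketch of what that discoverability might look like, assuming a hypothetical registry keyed by abstract capability rather than by network address. The registry contents, capability names, and lookup function are purely illustrative.

    # Hypothetical registry: abstract capability -> wherever it currently lives.
    REGISTRY = {
        "customer-lookup": "https://host-a.example.com/api/customers",
        "order-entry": "https://host-b.example.com/orders",
    }

    def find_endpoint(capability: str) -> str:
        """Resolve a capability to its current location; fail loudly if nothing
        in the environment provides it."""
        if capability not in REGISTRY:
            raise LookupError("no provider found for capability: " + capability)
        return REGISTRY[capability]

    # Callers bind to the capability, not to the address, so providers can move.
    print(find_endpoint("customer-lookup"))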

But we still don’t have the whole story yet, because we must still deal with the problem of change. What if we were able to interoperate at one point in time, but then one of our endpoints changed? How do we ensure continued interoperability? The traditional answer is to put something in the middle: either a broker in a middleware-centric model, or a registry or other discovery agency that resolves abstract endpoint references in a lightweight model (either REST or non-middleware SOA). The problem with such intermediary-based approaches, however, is that they relieve vendors of the need to build products with Deep Interoperability built in. Instead, they simply offer one more excuse to sell middleware.
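For illustration, here is a sketch of the intermediary pattern that paragraph describes: an abstract endpoint reference that is re-resolved when a previously working address stops responding. The resolver stands in for a registry or broker, and the names, addresses, and retry logic are assumptions made up for the example.

    from typing import Callable

    class AbstractEndpoint:
        """A logical service name bound to a physical address late, and re-bound
        whenever the cached address stops working."""

        def __init__(self, name: str, resolver: Callable[[str], str]):
            self.name = name
            self.resolver = resolver
            self.cached_url = resolver(name)       # initial late binding

        def call(self, send: Callable[[str], str]) -> str:
            try:
                return send(self.cached_url)
            except ConnectionError:
                # The provider moved or changed: ask the intermediary again
                # rather than letting the integration break.
                self.cached_url = self.resolver(self.name)
                return send(self.cached_url)

    # Toy demo: the "billing" endpoint moves after we first bind to it.
    addresses = {"billing": "https://old-host.example.com/billing"}
    def lookup(name: str) -> str:
        return addresses[name]
    def send(url: str) -> str:
        if "old-host" in url:
            raise ConnectionError("endpoint moved")
        return "200 OK from " + url

    endpoint = AbstractEndpoint("billing", lookup)
    addresses["billing"] = "https://new-host.example.com/billing"
    print(endpoint.call(send))   # re-resolves and reaches the new address

Note that all the adaptability in this sketch lives in the intermediary and the calling code, which is exactly the criticism above: the products themselves haven’t become any more interoperable.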

The ZapThink Take

At its core, Deep Interoperability is a peer-to-peer model, in that we’re requiring two products to be deeply interoperable with each other. But peer-to-peer Deep Interoperability is just the price of admission. If we have two products that are deeply interoperating and we add a third product to the mix, it should be able to negotiate with the other two, not just to establish the three pairwise relationships, but to work out the most efficient way for all three products to work together. Add a fourth product, then a fifth, and so on, and the same process should take place.
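As a rough sketch of what that group negotiation might look like, the following Python extends the pairwise handshake above to N participants. “Most efficient” is simplified here to “the highest-ranked protocol every product supports”; a real negotiation would weigh far more than protocol choice, and the product lists are again illustrative assumptions.

    from typing import Optional, Sequence

    def negotiate_group(preferences: Sequence[Sequence[str]]) -> Optional[str]:
        """Pick the best option shared by every participant, or None if the
        group has no common ground."""
        first, *rest = preferences
        for candidate in first:                         # ranked best-to-worst
            if all(candidate in peer for peer in rest):
                return candidate
        return None

    # Two products already interoperating, then a third joins the mix.
    print(negotiate_group([
        ["rest+json", "soap/1.2"],
        ["rest+json", "soap/1.1"],
        ["soap/1.2", "soap/1.1", "rest+json"],
    ]))  # -> "rest+json"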

The end result will be IT environments of arbitrary size and complexity supporting Deep Interoperability across the entire architecture. Add a product, remove a product, or change a product, and the entire ecosystem adjusts accordingly. And if you’re wondering whether this ecosystem-level adjustment is an emergent property of our system of systems, you’ve hit the nail on the head. That’s why Deep Interoperability and Complex Systems Engineering are adjacent on our ZapThink 2020 poster.