On Leaving OAuth

I have to admit, I’ve been surprised by the tone and magnitude of the reaction to my announcement. I’ve been following the reactions on Twitter and every follow-up blog post I could find. I’d like to share some follow-up thoughts about this decision and the reactions to it. First, I didn’t say ‘dead’, I said ‘bad’.

Everything is fine

Few people came to the defense of the protocol, the working group, and the IETF. Surprisingly few. In fact, no one has raised the issue on the mailing list. This is not exactly a Hy-Brazil moment, but some of the reactions do make me wonder.

One response worth reading was from John Bradley in a post titled “The OAuth 2 Sky is NOT Falling”. John is a past collaborator (XRD) and someone I respect and his post offers a quality reaction from someone on the other side of the issue. As a leading contributor and co-author of the OpenID Connect family of specifications, it is easy to understand why John and that community are happy with OAuth 2.0 and the process it followed.

OpenID Connect is heavily based on OAuth 2.0 – I personally find it overly complex. Most of the OpenID Connect members joined the OAuth list in order to protest decisions that would have made it too restrictive for OpenID Connect to use OAuth 2.0, and their participation explains a few of the underspecified areas of the specification.

Another community that has been very satisfied with OAuth 2.0 is UMA. Some of the UMA project leads are people I like and respect, like Eve Maler. In the past I have invited members of the UMA community to share their project with the OAuth community on the mailing list, at IETF meetings, and on this blog. It has been a long time since I read up on UMA, but I was always skeptical about its relevance to the consumer web world I care about. UMA is also based on OAuth 2.0 and relies on many of its extensibility areas to operate. If you want to get an idea of the complexity (and richness) of this world, this is a good place to start.

One last example of OAuth 2.0-based work that can help explain and justify the “decisions made” on OAuth 2.0 is OMAP (Online Multimedia Authorization Protocol). OMAP is an enterprise specification dealing with authorizing access to multimedia content, authored by leading players in the space. I honestly don’t understand the value of basing this work on OAuth 2.0, or how such an extension could be claimed to operate in the same ecosystem. It is the perfect example of the real use cases guiding the working group and the real-world application some members of the group were focusing on.

So, yeah. For some people “the sky is not falling” and OAuth 2.0 is a success. It is easy to see where they are coming from and why they hold this view. If OAuth 2.0 were a contest, they are clearly the winners, and they won fair and square by playing by the IETF rules. I just don’t want to lend it my name, for whatever that is worth.

Eran is an asshole

I fully respect the opinions of those who have directly interacted with me in person or online and formed positive or negative opinions about me. My style is aggressive, direct, honest, and often combative. I don’t (and cannot) pretend otherwise. All it takes is reading a small sample of my blog posts or mailing list communications to get a clear and accurate sense of who I am and what I’m like. I am exactly the same way in person.

But this also means that, three years into this effort, painting my move with negative personal attacks is childish and hypocritical, especially coming from people who never interacted with me directly (or those who pretended to be my friends).

In a comment on my announcement, Dick Hardt, in a personal attack, points to the fact that I insisted on full editorial control and can therefore only blame myself. The facts are somewhat different. I was invited to join an editorial team with two others. I declined, saying I don’t edit with other people but was happy to let them do the work without me. I was made solo editor. It was all very pleasant. The IETF process makes it trivial for the working group chairs and area director to remove, replace, or supplement an editor at will. They really don’t need any reason.

A few months ago, when I started considering removing my name from the specification, I reached out to Mr. Hardt as a listed co-author on the document. His advice? “I don’t think it will serve you well. Better to stay the course and be disappointed with result. You can remain editor and blame IETF post editor role process.”

Facebook and Google are doing just fine, thank you

A quality, respectful, and worthy response from Tim Bray makes the argument that the existing (and very successful) deployment of OAuth 2.0 on the web clearly tells a different story. I don’t disagree. As I said before, in the right hands, after applying the right security and making some decisions, OAuth 2.0 can be great. Facebook made those decisions (one of which was to ignore the last 20-something drafts) and so did Google. However, even Facebook with their vast resources had issues with 2.0 and ended up addressing them with proprietary solutions (e.g. a callback signature, login parameters).
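
To make the callback-signature point concrete, here is a minimal sketch of the kind of proprietary signing Facebook layered on top: verifying a signed callback parameter, assuming the common format of a base64url-encoded HMAC-SHA256 signature and a base64url-encoded JSON payload joined by a dot. This illustrates the general technique, not Facebook’s exact implementation, and the function name is mine.

    import base64
    import hashlib
    import hmac
    import json

    def parse_signed_callback(signed_request, app_secret):
        # Assumed format: "<base64url signature>.<base64url JSON payload>"
        encoded_sig, encoded_payload = signed_request.split(".", 1)

        def b64url_decode(s):
            # base64url strips "=" padding; restore it before decoding
            return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

        signature = b64url_decode(encoded_sig)
        payload = json.loads(b64url_decode(encoded_payload))

        # Recompute the HMAC-SHA256 of the raw encoded payload using the
        # application secret, and compare in constant time.
        expected = hmac.new(app_secret.encode("utf-8"),
                            encoded_payload.encode("utf-8"),
                            hashlib.sha256).digest()
        if not hmac.compare_digest(signature, expected):
            raise ValueError("callback signature mismatch")
        return payload

The point is not these twenty lines of code; it is that a provider which wanted signed callbacks had to invent its own format, because the specification declined to make that choice for everyone.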

I would have been thrilled if the final result had been documentation of the Google or Facebook implementations and nothing else. That would have been a useful protocol.

If you are not sure what I am so upset about, it is the total unwillingness to make choices. Given the new architecture and capabilities, OAuth 2.0 should have been much more restrictive and narrow.

The IETF is good, the IETF is bad

A few reactions focused on the IETF part of the story. No one seemed to offer an actual defense of the organization. Joe Gregorio offered one only to immediately take it away. At best, people pointed to past successes at the IETF as proof the organization is still relevant and effective. The problem is that their examples are all from two decades ago. The second line of defense simply states that standards are hard and that’s just the nature of the beast. OAuth 1.0 proves it doesn’t have to be.

Glee

I can live with all the reactions above, even those dismissing my arguments or calling me an asshole. I expected those. What I didn’t expect was an overwhelming expression of joy and schadenfreude. I didn’t realize just how much this specification was hated among web developers. I knew people were not thrilled about it, but I never imagined this level of discontent.

What is actually upsetting is that all these people (and we are talking hundreds of individual posts on Twitter) never bothered to engage. Three years is a long time to hate something this much without sending a single email to the mailing list expressing it. If only a fraction of these people had shown up, we would not be here today. People were actually paying attention but doing nothing about it.

Now, that’s just sad and pathetic.

(As always, comments are welcome)

23 thoughts on “On Leaving OAuth”

  1. Eran, regardless of everything, I think the standards industry needs more people like you, and also like Dick, and other troublemakers. People that stir up the staid, challenge conventions, and introduce new ideas. Sorry it didn’t seem to work out.

  2. I read your entire post and all the comments even though I know nothing about OAuth. I was very impressed and don’t think I am the only one. Admitting failure is hard enough but to do it in a thorough and yet balanced way while at the same time providing insight is extraordinary. One reason it’s rarely done is the power of schadenfreude – why expose yourself? Your ‘antagonists’ are probably triggered not by disagreement but because they sense that you are better at what you do.

    On the general topic: to produce technical solutions of value you need to 1) make decisions, 2) make good decisions and 3) make decisions that work together to realize a coherent vision. Committees often fail at point 1, and then at 2 and 3 as well.

    I don’t suppose you will have a hard time finding new work or are in need of a pat on the back, but anyway: if I were hiring, your post would beat most CVs.

  3. You almost seem like you’re putting some of the blame on the devs, not on the OAuth 2.0 team there, at the end.

      • It sounds like the resulting spec served the needs of those who showed up, and not those who did not. Whether that means the process worked exactly as designed or is horribly horribly broken is a rather subjective question.

        Incidentally, the HTML5 standards process has been beset by very similar issues, and there are quite loud calls for authors to get off their butts and get involved so that the browser vendors don’t run off with implementations that no one can actually use in practice. It’s not just the IETF, nor is it just the W3C.

        Of course, the real problem is that there’s an opportunity cost to showing up. I already work an 8-hour day building web stuff for clients; I work on a large open source project; I’m trying to get involved in a number of web standards efforts. But I also want to enjoy a quiet dinner at home sometimes without having to think about things. We humans only have 40 hours of productive work in a week, and no one but “enterprise” can really afford to put their really smart people on long-game investments like steering a public standard toward what would benefit them in 5 or 6 years. I have no solution for that, sadly. If I did, I’d already be implementing it.

  4. The problem I see with OAuth 2 is not that it exists, but that there doesn’t seem to be anything better. OAuth to me is a login ticket system that allows (optional) delegated login, gives the user control over his data and the ability to revoke it from certain applications, and does so in a standard way. There are other standards (e.g., SAML) which address a part of it, but OAuth is the only standard I know that specifies a full application stack. Are there other alternatives? (Not something like OpenID, since our applications already handle authentication/user management. OAuth allows us to put a ticketing system on top of an existing login system.)

  5. The Glee aspect is indeed very sad – but unfortunately I think this happens to a lot of critical technologies: while it’s there, people take it for granted; when it disappears, they treat it as some sort of “victory for chaos against the establishment” – but what they don’t realise is that there really is *no establishment* – it’s all just people with motivation getting shit done. Without that, we wouldn’t have the Web.

    Whatever has gone wrong or right in this process, OAuth has played a huge role in moving everything forward – I don’t believe it’s dead, but there are clearly some things to sort out. Hopefully the consensus can get built to do that.

  6. Your Glee section is utterly depressing… :( Although that seems to be the way the majority of people work – follow success, attack failure (rather coldly Darwinian). I can guarantee most of those involved have never attempted to build anything original, or invested a vast amount of energy in making a change to anything they feel strongly about. I should think trying to construct the “second album” of a standard is an extremely difficult thing to do, and as such, I totally respect your efforts… it does seem like an argument for the constant creation of entirely new 1.0 standards each time, rather than progressing to 2.0 – the less corporate interest the better ;)

    Good luck with your future endeavours.

  7. Glee: maybe it’s simply because many web devs have already had the bitter experience you just had with enterprise people.

    OTOH, if there are shortcomings of OAuth1 which should get fixed, and OAuth2 was in some way taken away from the community by the enterprise guys, why not start a new standardization effort, not aligned with the IETF, driven strictly by web devs, and get it right? Many internet-related standards nowadays, sanctioned by the IETF, are overly complex, but non-enterprisey protocols still catch on, once there’s a viable community behind them, and the enterprise guys eventually accept them. One of the examples that come to my mind right now is JSON-RPC.

  8. Like most groups of people, standardization efforts work best when narrowly focused. In my mind, the best standards emerge from cases where the committee codifies existing practice, rather than making stuff up.

    The more leeway the standards committee has for adding ‘new things’, the more likely the committee is to evolve the standard into a massive framework of standards that, because they can’t agree on some things, doesn’t actually answer the hard questions, but leaves it open for everyone to interpret as they see fit.

    I’m presently implementing some things based on the IEEE 11073 standards, and they clearly suffered the same design-by-committee approach. There’s way too many of them, they leave a bunch of hard questions up to the implementer, and they are overly complex for any of the tasks that the average user would like to accomplish.

  9. Any idea why David Recordon yanked his name from both specifications too? He seems less talkative than you are.

  10. Personally, I felt that OAuth 1.0 failed to do real standardization: yeah, it was there, and everyone had their own incompatible versions.

    I expected OAuth 2.0 to be more complex, perhaps even too complex for my simple mind, and I expected client libraries in every possible language from brainf.ck to PIC assembly to have a standard client-side implementation, which would connect to Google or Facebook with the same ease as to a freelancer’s RoR or django site, just by changing a single domain name and a single API key, perhaps in a config file.

    I also expected server libraries and/or proxies which could easily be put in front of any webservice, written in any popular server-side framework, from Rails to Spring, which would handle all the juggling for us, simulating a simple authenticated user, perhaps with just a flag telling us that it’s in fact an API client and whether it’s offline usage, just to make sure.

    Anything which has an if provider==GOOGLE or if provider==FACEBOOK inside it is a failure, just as well as anything which would take a seasoned developer more than 10 minutes to deploy securely using gem install or npm install or similar methods.

    Security is complex, environments are unpredictable. This is how life goes. The only question is whether we get full interoperability, or the same situation as where we started before OAuth 1: everyone having an authentication system based on the same principles, but still different, only this time with an OAuth logo, falsely claiming it is standardized by now. (A rough sketch of the kind of interchangeable client I mean follows below.)
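
    To make it concrete, here is roughly all a client should ever need: a sketch of an authorization-code client parameterized by nothing but a base URL and a key pair. The provider block, the /oauth/authorize and /oauth/token paths, and the field names are hypothetical placeholders; today, exactly these details differ from provider to provider, which is the complaint.

        import json
        import secrets
        import urllib.parse
        import urllib.request

        # Hypothetical provider-agnostic configuration: in the ideal case,
        # switching providers means changing only these three values.
        PROVIDER = {
            "base_url": "https://provider.example.com",
            "client_id": "my-app-id",
            "client_secret": "my-app-secret",
        }

        def authorization_url(redirect_uri):
            # Where to send the user (assumed /oauth/authorize path).
            state = secrets.token_urlsafe(16)  # anti-CSRF token, checked on callback
            query = urllib.parse.urlencode({
                "response_type": "code",
                "client_id": PROVIDER["client_id"],
                "redirect_uri": redirect_uri,
                "state": state,
            })
            return PROVIDER["base_url"] + "/oauth/authorize?" + query, state

        def exchange_code(code, redirect_uri):
            # Trade the callback code for tokens (assumed /oauth/token path).
            body = urllib.parse.urlencode({
                "grant_type": "authorization_code",
                "code": code,
                "redirect_uri": redirect_uri,
                "client_id": PROVIDER["client_id"],
                "client_secret": PROVIDER["client_secret"],
            }).encode("utf-8")
            request = urllib.request.Request(PROVIDER["base_url"] + "/oauth/token", data=body)
            with urllib.request.urlopen(request) as response:
                return json.load(response)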

  11. As a kind of “emotional support”:

    I left the XMPP Standards Foundation when I finally realized we wouldn’t be able to create a universal, unified experience on par with the proprietary chat software of that time. In fact, sometimes I believe I was the only one who wanted to do this; others simply wanted a stamp saying they’re standards-compliant or standards-based.

    The worst was PubSub, which is so complex that its specification was three times larger than the protocol specification for XMPP itself, and so general that it was a glorified, XML-driven version of something between TCP-over-XMPP, an ESB, and a message queue system, in a way that made it impossible to create implementations on par in speed with marathon runners carrying stone tablets between cities, with the information carved in as Morse code.

    I think even a carrier pigeon would be less general, more performant, and more standard than xmpp-pubsub, even considering the fact that pigeons are only able to return to their home address and have to be carried by other means everywhere else.

    I realized that ANYTHING I wanted to do with it, I had to implement basically from scratch, and if I exchanged the underlying XMPP library for a pure TCP stack or a message queue system, I’d be exactly there, except this time the speed wouldn’t be reminiscent of ENIAC, nor would the data connection need gigabit links or sophisticated content-optimized compression frontends.

    I gave a lightning talk once, at FOSDEM 2009, in the XMPP room. I said that if we have a network full of clients and servers with support for different optional extensions (and even inside the extension standards there are optional parts or choices officially left to the implementor), then from a network perspective we have protocol support of zero, as only the common denominator counts: only the base protocol is supported.

    (This is actually not entirely true: network support can only be meaningful from a single node’s perspective, and even for that it’s weighted by communication frequency. That is, if 98% of your XMPP contacts, including yourself, use GTalk and Adium, you can do nearly everything Adium offers; but that means no file sending, audio/video chat/conferencing, shared online games, or screen sharing.)

    Nowadays, all chat networks are XMPP-compatible, but that basically means that you see your buddy list on Facebook, with offline/online state (no away!), and you can send them text messages. Party like it’s 1999… Oh, did I mention that a lot of XMPP clients support only one single account, expecting you to be present on one network and federated to the others?

    I hope that there’ll be something out of OAuth, which provides at least this level of interoperability, that a general OAuth client can connect to any provider without hassle, and perform the basic tasks. But I have my fears…

  12. What obstacles were there to you finding these 300 web devs before you had given up?

    Building a community is hard. Getting it to engage and pull for you is harder.
    Seems like that kind of support could have generated an outcome you were more pleased with?
    Is there a lesson besides the people who care didn’t do enough?

  13. Long after the crowds called for it, I’ve recently been converted to OAuth. And it’s probably precisely for the reason that you, Eran, left the movement. It has gone beyond a movement and a research task, into being a standard. And we only use standards.

    Standards suck, technically; only by v3 do we in our space touch them. They are supposed to suck, having been through the mill of tradeoffs. But the point is that they have passed the bar, being suited to the 80% use cases and the 80% of folks (like me) who don’t know one end of a technology from the other. But that’s the point… if it’s going to become mainstream.

    I’ve resigned from lots of working groups. In general, it meant the group and its mission had gone beyond what I considered to be the mission, and typically had become (at the IETF, in particular) hijacked by some entity with large US government backers. Having worked in US govt circles, I know the process (it could not care less who it loses along the way, so long as it gets what it wants in terms of “owning” the agenda, even if it has to throw out the baby to own the perfumed bathwater it so craves…).

    At the end of the day, you ask yourself: did I make a difference, at the scale you were working at? If the technology concept you had didn’t make it (but some variant did), don’t worry. The NSA spent a lot of effort in ISO standardizing TP4 to lead the world of secure TCP sessions and connections, but ended up with a society that used SSL!

    You live with it. Next decade, you try again, once folks catch up and the Facebooks and Googles are no longer pre-eminent playing their vendor games. Of course, there will be (and must be) another set of high-ego types, just as painful and opinionated (and manipulative). But again, that’s STANDARDS!

    • I refuse to accept the broken status quo and just live with it. It is this attitude that prolonged this awful state of affairs. The world would be much better if more people stood up and called crap what it is. I think your statement that you only use standards shows the other side of the problem, where a meaningless label is made significant.

      • Eran,

        It’s great that you’ve called stuff out — but have you got any suggestions/recommendations about what the way forward is/should be? I assume that since you are refusing to accept the broken status quo you are doing something to help fix it (more than just pointing out the problem).

  14. Personally, I think you (Eran) did a good job against what, at times, seemed like a snowballing of creep-and-compromise: while no single component was *that* bad for OAuth 2, they added up to something that felt complex, sprawling, and unfocused. As an implementor, I didn’t stand up when I could have, watching this happen on the mailing list etc., but we all pick our fights.

  15. One question, and maybe I should be searching your blog archives for the answer, but I could not find it on first try:
    “Why was the decision made to go with IETF for an Application-level Authentication protocol for the web?”

    Wouldn’t you agree that the W3C pretty much owns Web Standards, and that others such as the IETF or OASIS have been far less effective at getting their standards finished and to market, not to mention gaining widespread adoption? I’m not saying the W3C is perfect (they certainly have their fair share of blunders, e.g. SOAP, WS-*, XHTML2, WAI-ARIA, etc.) but in general most application-layer Web Standards tend to do better there, and I’d imagine an “Authentication & Authorization Working Group” would have garnered quite a bit of support.

    You may not have gotten around the aforementioned “design by committee” problems, but at least it would have given the spec a home where, once abandoned by its founders, it would have been carried by the community or rolled into something else (just as XHTML2 was rolled into HTML5).

    Personally – it’ll never happen – but I’d love to see a smart combination of WebID for Identity (except the only requirement should be having a primary email address as in WebFinger… not the knowledge to create an X.509 Digital Certificate), plus some of the few remaining good bits of OpenID for SSO Authentication, and last but not least a modified OAuth 1.0a for Authorization that offers the choice of:
    A) Time-delimited tokens signed by a key generated from your WebFinger/WebID certificate (that was auto-generated for you on first registration to the Provider service)
    B) Reliance on SSL and Transport-layer security ala OAuth 2.0

    Also, to top it all off, for rating security and protecting WebIDs there would be the combination of something like Web Of Trust signing: http://xmlns.com/wot/0.1/ with the Web Of Trust consumer rating system: http://www.mywot.com/
    It’s a grand vision, and maybe a flawed or naive one, but a single product could and should tie in the best of these products to finally solve the web’s Authentication/Authorization headaches.

Comments are closed.