More OAuth Nonsense

ComputerWorld is the latest to run a scary story about OAuth 2.0 and how insecure it is. Unfortunately, instead of doing their homework and paying attention to my post, they borrowed a bunch of my quotes (almost half the article), added some original nonsense, sprinkled in a few errors, and gave it a sensational headline: “OAuth 2.0 security used by Facebook, others called weak”.

OAuth 2.0 without signatures works just fine for companies like Facebook, because their developers hard-code their API endpoints. There is no danger whatsoever of an access token leaking or getting phished (any more than if they used signatures), because Facebook serves its OAuth 2.0 API over HTTPS. This is why I titled one of the subsections: “Why None of this Matters Today”. My post was about discovery and long-term security improvements of the web – not that there is anything broken about today’s implementations.

Let me make this crystal clear: OAuth 2.0 without signatures over HTTPS is perfectly secure for proprietary APIs where the client only sends the access token to a hardcoded endpoint and uses HTTPS properly.
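Here is the entire pattern in a few lines, to show just how little there is to get wrong. This is a minimal sketch in Python using the requests library – the token is a placeholder, and the endpoint shape is only illustrative of how an API like Facebook’s accepts tokens:

```python
# Placeholder endpoint and token; the point is that both the URL and the
# HTTPS requirement are fixed in the client code, not discovered at runtime.
import requests

API_ENDPOINT = "https://graph.facebook.com/me"  # hard-coded, HTTPS-only
ACCESS_TOKEN = "example-access-token"           # obtained during authorization

# requests verifies the server certificate by default, so the token is
# only ever disclosed to the server named in the hard-coded URL.
response = requests.get(API_ENDPOINT, params={"access_token": ACCESS_TOKEN})
response.raise_for_status()
print(response.json())
```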

My favorite part is the quote from Google’s Eric Sachs (who, BTW, is not a co-author of any OAuth specification), explaining why having no signatures is a good thing:

These guys are not experts in security. They can have a hard time understanding security models.

This is exactly the problem. OAuth 2.0 moves some of the enforcement requirements (again, when used with dynamic service discovery) to the client, and the client developer is rarely a security expert. This is why the specification has to assume that the client developer is going to make mistakes and do everything possible to render them harmless.
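To make that concrete, here is a sketch of the kind of enforcement that lands on the client once endpoints are discovered at runtime instead of hardcoded. The helper and its allow-list policy are mine for illustration – no OAuth draft mandates this exact check:

```python
# Illustrative only: the allow-list and helper are an assumption, not spec text.
from urllib.parse import urlparse

TRUSTED_HOSTS = {"api.example.com"}  # hosts this client has decided to trust

def safe_to_send_token(url: str) -> bool:
    """Refuse to send a bearer token unless the URL uses HTTPS and the
    host is one the client explicitly trusts."""
    parts = urlparse(url)
    return parts.scheme == "https" and parts.hostname in TRUSTED_HOSTS

# A dynamically discovered endpoint must pass this gate before the token goes out.
assert safe_to_send_token("https://api.example.com/resource")
assert not safe_to_send_token("http://api.example.com/resource")    # no TLS
assert not safe_to_send_token("https://evil.example.net/resource")  # wrong host
```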

Raise your hand if you ever wrote a client application using HTTPS and got tired of exceptions about bad certificates, so you just ignored them silently. Unless your HTTPS code is written in a way that completely blocks the connection if there is any error, it might not do you much good. HTTPS is great when done right, and this is just one example of the difficulty in trusting clients to do the right thing.
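In code, the anti-pattern and its fix are one parameter apart. A sketch using Python’s requests library, with a placeholder endpoint:

```python
import requests

url = "https://api.example.com/resource"  # placeholder endpoint

# The anti-pattern: verify=False silently accepts any certificate, so a
# man-in-the-middle can present a forged one and capture the token.
requests.get(url, verify=False)

# The fix: verification is on by default; a bad certificate raises
# requests.exceptions.SSLError and the connection never completes.
requests.get(url)
```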

To be clear, I don’t want mandatory signatures in OAuth 2.0. I think there are valuable use cases for the current bearer token approach. And I agree that developer ease is an important design factor. Signatures were not mandatory in OAuth 1.0a either, but they are an equal part of the protocol, as they should be.
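For readers wondering what a signature actually buys you: the client proves possession of a shared secret without ever putting the secret on the wire, so a captured request cannot be replayed with different parameters. The sketch below is a toy HMAC scheme of my own, not the real OAuth 1.0a signature base string algorithm:

```python
import hashlib
import hmac

CLIENT_SECRET = b"example-shared-secret"  # placeholder; never sent on the wire

def sign_request(method: str, url: str, timestamp: str) -> str:
    """Return an HMAC-SHA1 signature over a few request fields.
    The server recomputes this with its own copy of the secret and
    rejects the request if the values don't match."""
    base = "&".join([method.upper(), url, timestamp])
    return hmac.new(CLIENT_SECRET, base.encode(), hashlib.sha1).hexdigest()

print(sign_request("GET", "https://api.example.com/resource", "1284508800"))
```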

Given the recent support for putting signatures back in (as an optional component), I have pushed back the publication of draft 11 until we can add a signature proposal to the core specification. I expect it sometime in late October.

One thought on “More OAuth Nonsense”

  1. I don’t understand the point of OAuth. Website-to-website authentication and data reuse is rare, and no, I’m not joking. Are you going to authenticate and allow a third-party website to access your Yahoo email data, etc.? Not likely. If we are just looking for a simple ‘ID’ mechanism for single sign-on, OK, fine, but OAuth is far more complicated than it needs to be if that is all it is trying to accomplish (we have OpenID instead).

    The MAIN use of web-service authentication is by native third-party applications (I’m stating that as fact without evidence … maybe I’m wrong). If a user is installing a third-party native application, then no authentication mechanism in the world is going to protect the user against a bad native program. If it’s a native program, the user is already inherently trusting it. OAuth is pure obfuscation in this scenario.

    Let’s review:
    – Some website provided interesting data.
    – People wrote programs to scrape this data to do interesting things with it.
    – Some websites decided to make this easier by providing a web-service API (usually REST) avoiding many of the obvious problems and hassles with scraping.
    – Useful ‘native’ programs were turned out that other people could use (popping in their own username/password).

    Then … for a reason I don’t understand, people decided this was not the primary use case at all. No, suddenly the use case that needed to be supported was website-to-website data sharing, and the resulting complexity broke all the simple ‘scripts’ out there.

    Fine … back to HTML scraping, sigh …
