GtkD blog, forum fundraiser, ref accept rvalue DIP argument, and more. Adam rants about libraries.
See more at the announce forum.
The week was pretty crazy again: lots of bad weather here, plus some day job work. I wrote some OAuth2 client code in D (which is pretty trivial) after seeing a hard fail from a third-party gem we were using at work.
Which brings me to...
I'm not a fan of the dependency culture common among developers nowadays. We try to make everything into libraries, and try to use libraries for everything, which in turn use even more libraries, which use even more libraries, and so on.
Creating a library adds quite a bit of cost to the author's development. You are committing to some kind of stability, documenting an interface, and being responsive to the needs of multiple users - which tends to bloat the code and complicate the interface over time. These costs tend to get skimped on rather than paid, meaning the majority of libraries out there are terrible.
Using libraries also adds quite a bit of cost, even if they are one of the rare good libraries. You need to evaluate libraries, manage the recursive dependencies (which is my pet peeve), and figure out that interface. These costs also tend to be skimped on - you just pick the first one a search brings up, let the package manager pick whatever version it wants and grab all the dependencies without evaluation, and write whatever code can use the interface, with minimal regard to how it fits into the bigger problem you are trying to solve. Your codebase may even take on an architecture centered around your use of the library! (Of course, you can then rebrand it as a "framework" lol)
But the worst case is when these two cost-cutters combine. You use a library, and it looks OK at first... then it turns out it is awful.
That's my story this time: we had to access a Google API from Ruby and grabbed a gem that seemed suitable. We created relevant credentials, populated a configuration file and wrote some glue code. It looked like a win for libraries, yay.
A year later, that function suddenly stopped working. Looking into it, the authentication token was being rejected. OK, maybe it just expired, easy fix. But no, the process described in the documentation no longer worked. Well, it is a Google service, and they break stuff all the time, so maybe I should try the new version of the gem.
Of course, this "minor" release broke half the API - this is why I don't trust SemVer, it is a nice idea but depends on everyone actually doing it right - but no big deal, I'll make the glue code use the new functions. But, it still didn't work. Maybe I need to do something like enable new access on the account.
Well, I logged into it and added the access... but nothing changed. Yada, yada, yada, out of desperation, I noticed the client id in the credentials file didn't match that account, so I googled it, kinda hoping it might tie to one of the other company accounts.
Instead, the search returned hits on GitHub: the client id was hardcoded in the source code of one of the library's dependencies. uuuuuuhhhhhh what?!
So I took a closer look at the library's source code. Oh, it passed the app info - client id and secret - to the inner library, but then the inner library completely ignored those parameters in favor of its own hardcoded values. (And a fail on my part: when I authorized it the first time, I must not have paid any attention to the app name. I wasn't really scrutinizing it, since it is a strictly internal-use application and I assumed it was my own account because I passed my own app id to the function.)
I don't think this was nefarious on the part of the dependency author; it looked to me like debugging or convenience code they forgot to remove before committing, and there's no evidence they actually harvested the generated access tokens or anything. But wow, how many other people got bit by this too? Did the first library's author even notice their dependency was totally wrong?
They have since put out another major version release that fixes this, but still, wow. And Google's documentation says it is hard to implement OAuth correctly, use a library instead. L-O-freaking-L.
Well, after fighting with that for several hours, I just wrote my own code to do it from "scratch". About twenty minutes later, I had a working implementation - and I was confident it was ACTUALLY working, since I understood what it was doing on all layers.
I say "scratch" with scare quotes because I actually did use millions of lines of library code - the operating system. I didn't actually write a TCP/IP stack, a TLS implementation, or graphics device driver. I didn't actually write a filesystem or virtual memory manager. Those things are legitimately reusable, and actually pretty well debugged through use. I did happen to write the http implementation, but meh.
After all that foundation, a few redirects and HTTP posts and file writes - about 50 lines of code - is barely from scratch! So I'm not against code reuse per se, I just want smart reuse.
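For a concrete idea of what those fifty-or-so lines look like, here is a minimal sketch of the refresh-token half of the flow - the single form-encoded POST that trades a refresh token for a fresh access token. (My original was D; this illustration is in Python just to keep it short. The endpoint is Google's documented token URL; the credential arguments are placeholders you'd load from your own config file.)

```python
import json
import urllib.parse
import urllib.request

# Google's documented OAuth2 token endpoint.
TOKEN_URL = "https://oauth2.googleapis.com/token"

def build_refresh_request(client_id, client_secret, refresh_token):
    """Build the single form-encoded POST that exchanges a
    refresh token for a new access token."""
    body = urllib.parse.urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
    }).encode("ascii")
    return urllib.request.Request(
        TOKEN_URL,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

def refresh_access_token(client_id, client_secret, refresh_token):
    # A plain HTTPS POST; the access token comes back in a JSON body.
    req = build_refresh_request(client_id, client_secret, refresh_token)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

The point is that there is no magic here: once you have the client id, secret, and a refresh token, the whole exchange is one HTTP request you can read end to end - which is exactly why I trusted my twenty-minute version more than the gem.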
This story is a bit of an extreme case, but I see it as confirmation of my preconceived biases ( ;) ) - package managers have a net negative effect on software quality, because they encourage this corner-cutting on both sides of the library equation.
I propose an experiment: write your own implementation, but publish it to a public package manager under a pen name and then import it as a dependency, instead of committing it directly to your own code as a private function. Would that make it pass a code review it would otherwise have failed?
If yes, are you sure you are actually coming out ahead with library/package use in other cases too?