---
title: False Advertising
template: blogentry.tmpl
date: Thu Apr 07 11:26:42 2011 -0700
filters:
  - strip
  - textile
---

If there's one trend I wish would go away in the Open Source community, and especially the Ruby community, it's the practice of over-marketing. There seems to be an endless stream of projects that start to gain popularity, usually tangentially to the popularity or prestige of some website or another, but are then hyped and marketed and start to make grandiose claims about how much better or more novel they are, and those claims are almost always gross exaggeration.

I love to see companies release their internal technology; I think it benefits them in all the usual Open Source ways, and provides the community with not only potentially-useful software, but also more examples of how programmers are thinking about problems. This kind of corpus of reference code is instrumental in the betterment of the legions of us who are self-taught.

But just because Yahoo or Facebook or Github or Engine Yard is using something doesn't mean it's necessarily better or brighter or a new way of solving a problem. People seem to assume that because the code is behind a very high-traffic site, or used by thousands of people, it's inherently great. As I imagine anyone who's worked behind the scenes at a company whose website became popular very quickly would agree, that's a flawed assumption. Code that's written under the gun to shore up a sagging infrastructure is usually expedient code, haphazardly tested, and solves only a very narrow subset of its problem domain. It can, and often does, get better, but code that's a cornerstone of a site that has to be up all the time can only grow and improve at a cautious pace.

This is especially true of software that's hyped as being "faster" than something it's intended to replace. What the marketing hype usually fails to mention is that all that speed comes at the cost of cutting some fairly critical corners. Of course a "data structure server" can claim to be "blazing fast" if it sacrifices reliability to do so. If you don't have to worry about things like transactional isolation and two-phase commit, you can make any software much faster than software that provides those safeguards. The apologists are quick to point out that you don't always need reliability, and that's true, but the second you add persistence to a datastore, I'd argue you should expect to be able to predict what state is persisted at any given moment. Taken to the ridiculous extreme, a datastore that doesn't actually "store" anything would be blindingly fast, but useless; it's the careful balance between reliability and performance that informs our decisions about which technology to use.

The same is true for the raft of software claiming to be "less bloated", or "pure Ruby", or (my personal peeve) "more DRY". This kind of software always seems to accrete around anything that gains critical mass, usually offering a commensurately minimal fraction of the original's functionality.

At the very least, these kinds of caveats should be taken into consideration when talking about and comparing software, and they should temper the fanatical hype that usually surrounds whatever New Shiny is in favor with the Code As Fashion people. Good software takes time to perfect, so I don't expect it to come out with all the problems solved immediately, but the marketing should take that into account.

Also: please stop thinking of memcached as a database. It's a _distributed_ *cache*.
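
To make that distinction concrete, here's a minimal sketch of what cache semantics look like from Ruby (assuming the Dalli client; the key name and the eviction scenario are purely illustrative):

bc. require 'dalli'
cache = Dalli::Client.new('localhost:11211')
# A cache write succeeds...
cache.set('user:42', 'some expensive-to-compute value')
# ...but the entry can be evicted under memory pressure or lost when a
# node restarts, so a later read always has to tolerate a miss.
cache.get('user:42')  # => the value, or nil; either is normal for a cache

A database that behaved this way would be broken; a cache that behaves this way is just doing its job.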