Couple changes to the ol’ site here. First, I’ve moved it off GitHub Pages and onto the Dreamhost shared-hosting account I’ve had for ages. If you tried to visit this morning (unlikely) and had trouble, that’s probably the DNS and SSL setup getting switched over.

Also I dug up a bunch of old tech-related posts from blogs I had years back and added them on here. I haven’t read them recently and I’m sure they’re pretty cringe, but on the other hand it probably looks good to show I’ve been at this for a minute and have learned and evolved.

I’m decently settled into a new job and it’s going well, so with any luck I’ll be writing more here in the near future.

Recently I was given some job interview questions to prepare for. I decided to try writing my thoughts on the subjects. I had fun with it so I decided to clean them up a bit and make them blog posts. One thing I noticed while writing was that I didn’t have a name, let alone a mental picture, of the hypothetical senior engineer I’d be interviewing with; instead I seemed to be writing for the recruiter I’d been in contact with as my intended audience. I was explaining concepts less to impress someone and more to actually teach. So I hope these posts are informative to someone.

Q: When might you want to use a NoSQL database instead of SQL?

Generally speaking, a database is any store of data plus the means to find and use what’s in it, which may include a query language such as SQL. The file system used by your computer’s operating system is also a database, and you might even think of file names and paths as part of a query language for it. A file system keeps metadata tracking where the bits known by a given filename in a given folder are physically kept on the disk (or whatever storage medium you might be abstractly referring to as a “disk”), so that software can find, use, add, and modify files in the ways a user might request.
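
If the file-system-as-database idea sounds abstract, here’s a minimal Java sketch of “querying” it. Everything here is standard java.nio.file API; /etc/hosts is just an arbitrary example file:

```java
import java.io.IOException;
import java.nio.file.*;
import java.nio.file.attribute.BasicFileAttributes;

// Treating the file system as a database: the path acts as a "key" and
// the OS's metadata answers "queries" about the record stored there.
public class FsAsDb {
    public static void main(String[] args) throws IOException {
        Path path = Paths.get("/etc/hosts"); // any file on your machine

        // "Query" the metadata the file system keeps about this entry.
        BasicFileAttributes attrs =
            Files.readAttributes(path, BasicFileAttributes.class);
        System.out.println("size:     " + attrs.size());
        System.out.println("created:  " + attrs.creationTime());
        System.out.println("modified: " + attrs.lastModifiedTime());

        // A glob is about as close as the file system gets to a WHERE clause.
        try (DirectoryStream<Path> matches =
                 Files.newDirectoryStream(path.getParent(), "host*")) {
            matches.forEach(System.out::println);
        }
    }
}
```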

These same factors also define other kinds of databases: a way of organizing data in some storage medium; facilities for accessing, calculating from, adding to, and modifying it; and metadata (such as indexes) to support these facilities. Various database architectures have different ways of implementing these things that can support different logical models of data access to varying degrees of speed and space efficiency.

The term “NoSQL” is interesting, because I suspect it may highlight the wrong variable and thereby mislead. The factors that define NoSQL databases, as opposed to what users of this term mean by SQL databases, seem to me to have more to do with the underlying architecture and features of a database engine than with the SQL query language itself. But SQL is closely tied to what are called relational databases, a model which has dominated the field (it’s what most people picture when they think of the term “database”) for a number of years now.

Relational databases present a logical model of data arranged in tables, the rows being data records, containing fields (arranged in columns) that hold the component data of the records. There are keys to uniquely identify particular records, foreign keys to express relationships between records, be they in the same table or different ones, and indexes to help the system locate records by their keys. It’s all fairly intuitive to folks who know their way around spreadsheets, and also quite logically flexible. Relational databases almost universally offer an SQL interface, but I have also seen SQL-like languages offered for other kinds of data stores, such as ksqlDB for Kafka, and proprietary specialized dialects like this one thing Salesforce has that I can’t remember the name of right now. Relational databases have become the default choice to back most applications that need something they can’t get from, say, sticking stuff in files, to the degree that the term “database” has become nearly synonymous with them, and the occupation of “database administrator” likewise synonymous with someone skilled at working with SQL, various extensions of SQL, and relational database solutions in general or specific.
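
For the spreadsheet-minded, here’s a tiny sketch of those pieces in action (tables, a primary key, a foreign key, an index, a join) in Java over an in-memory SQLite database. It assumes the org.xerial sqlite-jdbc driver is on the classpath, and the authors/posts schema is made up purely for illustration:

```java
import java.sql.*;

// A minimal sketch of the relational model: tables of rows, a primary
// key, a foreign key relating records across tables, and an index.
// Assumes the SQLite JDBC driver (org.xerial:sqlite-jdbc) is available.
public class RelationalSketch {
    public static void main(String[] args) throws SQLException {
        try (Connection db = DriverManager.getConnection("jdbc:sqlite::memory:");
             Statement s = db.createStatement()) {
            s.execute("CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT)");
            s.execute("CREATE TABLE posts (" +
                      "  id INTEGER PRIMARY KEY," +
                      "  author_id INTEGER REFERENCES authors(id)," + // foreign key
                      "  title TEXT)");
            s.execute("CREATE INDEX posts_by_author ON posts(author_id)");

            s.execute("INSERT INTO authors VALUES (1, 'hoff2')");
            s.execute("INSERT INTO posts VALUES (1, 1, 'SQL vs NoSQL')");

            // The foreign-key relationship lets us ask questions across tables.
            try (ResultSet rs = s.executeQuery(
                    "SELECT a.name, p.title FROM posts p " +
                    "JOIN authors a ON a.id = p.author_id")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + " wrote: " + rs.getString(2));
                }
            }
        }
    }
}
```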

Relational databases have been so dominant for so long for good reason. They are good general-purpose databases, and a lot of work over a long time has gone into making them good. They give you facilities to do just about anything with the data they contain that you might conceivably dream up, almost as quickly as you can dream it up. You can put lots of data in an SQL database and get a lot of functionality out of it easily. Right now I’d still say that if you’re building a new product, there’s almost no reason not to at least start off with one of these databases, even if you’re keeping your options open in the longer term. Free and open-source solutions like PostgreSQL are competitive with commercial offerings like Microsoft SQL Server or Oracle, and at small scale and/or early stages of a product you can even go with something easy and low-maintenance like MySQL or SQLite. You can run them on your own laptop and do all the nasty things to them you could ever want to try as you develop a product.

But yeah, this is the hegemony that birthed the NoSQL meme. Some tech companies were getting big doing innovative stuff and found themselves reaching performance and scale needs that relational databases weren’t up to. Being pretty good at a lot of things sometimes leaves you coming up short on very specific things, especially things that were rarer when the system was initially designed. Everything’s a trade-off. Demand increased for distributed and parallel computing to handle high load, and relational databases often weren’t the best at scaling out. Replication exists but it’s pretty meh, and sharding is helpful for some things but can be difficult to get right. Other kinds of databases needed to be found or built to meet the new needs.

The NoSQL movement may be little more than a recognition of these conditions. Its effect seems to have been to bring attention to other kinds of database systems. A caching system can get by with fewer features for structuring data, so long as it can read and write entries quickly. A reporting system has more need for pulling a lot of data at once and for flexibility in structuring it, but places less importance on the data being up-to-the-millisecond consistent across your enterprise. Through message- and event-driven system design techniques, one can even represent the same information in multiple different databases to support different applications. NoSQL was mostly about recognizing that we had more choices than we might have realized.
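
To make the caching point concrete, here’s a toy sketch of the sort of bare-bones key-value model a cache can live with: no tables, no joins, no query language, just keyed lookup plus an eviction policy. It’s a hand-rolled LRU built on Java’s LinkedHashMap, not any real product:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A toy key-value cache: fast lookup by key plus least-recently-used
// eviction, and nothing else. This is roughly the minimal data model
// a caching layer needs.
public class TinyLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public TinyLruCache(int capacity) {
        super(16, 0.75f, true); // access-order: gets refresh recency
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the LRU entry past capacity
    }

    public static void main(String[] args) {
        TinyLruCache<String, String> cache = new TinyLruCache<>(2);
        cache.put("user:1", "Charles");
        cache.put("user:2", "Wesley");
        cache.get("user:1");           // touch user:1 so it's recent
        cache.put("user:3", "Howard"); // evicts user:2, the LRU entry
        System.out.println(cache.keySet()); // [user:1, user:3]
    }
}
```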

So ultimately, the answer to the question of when you might want to use a SQL database versus a NoSQL one comes down to the old consultants’ answer: “it depends.” But what I hope I’ve been able to get across is a good sense of what kinds of things it might depend on. There now exists a variety of database solutions besides the SQL/relational kind, each focused on being good at different things. The question isn’t so much SQL versus NoSQL as it is a choice among a variety of databases, of which SQL is but one subcategory.

Got a tech question you’d like me to write on? Send it to hoff2 at HEY dot com.

Recently I was given some job interview questions to prepare for. I decided to try writing my thoughts on the subjects. I had fun with it so I decided to clean them up a bit and make them blog posts.

Q: “Can you explain what inversion of control or dependency injection is and what benefit it provides?”

Dependency Injection/Inversion of Control is a concept that came up in the theory around object-oriented programming some years back. The general idea is, you have some procedure in your program that needs to get at some data values, system resources, and other procedures to be able to do its job. When and how you specify all these things is a matter of how you design your code. In an OO programming language, this procedure is probably a method on some class, so some things may have been passed to the object’s constructor; others might be given as arguments to the method call. Or they may have been calculated, provided through setter methods, or constructed from other things in the meantime. The constructor is itself a method with parameters, whose job is to give you a new instance of the class, and those parameters can technically be anything or nothing, just like any other method’s. Between the constructor, other setup of state done beforehand within the class scope, and the method’s own parameters, the method eventually has everything it needs in order to work.
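
Here’s a small, contrived Java example of those different routes, using java.time.Clock as the dependency; the Stamper class is made up for illustration:

```java
import java.time.Clock;
import java.time.Instant;

// Three ways the same dependency can reach a method in an OO language:
// via the constructor, via a setter, or as a plain method argument.
class Stamper {
    private Clock clock; // the dependency

    Stamper(Clock clock) {           // 1. constructor injection
        this.clock = clock;
    }

    void setClock(Clock clock) {     // 2. setter injection
        this.clock = clock;
    }

    Instant stampWithOwn() {         // uses whatever was injected earlier
        return Instant.now(clock);
    }

    Instant stampWith(Clock other) { // 3. plain old method argument
        return Instant.now(other);
    }
}

class StamperDemo {
    public static void main(String[] args) {
        Stamper s = new Stamper(Clock.systemUTC());
        System.out.println(s.stampWithOwn());
        System.out.println(s.stampWith(Clock.systemDefaultZone()));
    }
}
```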

You find that in purely Functional Programming circles you rarely hear dependency injection talked about, let alone stressed about, the way it has been in OO programming. At a basic level one could say this is because everything is made out of pure functions, so the only way to get things into a function is through its parameters, and you don’t have all these choices to weigh. This might sound like a nightmare of functions with enormous parameter lists. But that’s what closures are for; closures are how FP “does” objects, as it turns out, and out of them you also get currying and monads.
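
Here’s a sketch of that move, in Java since it has just enough closures to demonstrate: supply the dependency first and get back a function that has captured it, like a tiny one-method object. The greeterFor name is made up for illustration:

```java
import java.util.function.Function;

// Instead of a class holding a dependency, a function takes the
// dependency first and returns a closure over it (a curried form).
public class ClosureDemo {
    static Function<String, String> greeterFor(String greeting) {
        // The returned lambda "closes over" greeting: the dependency
        // is baked in, like a one-method object built by a constructor.
        return name -> greeting + ", " + name + "!";
    }

    public static void main(String[] args) {
        Function<String, String> casual = greeterFor("hey");
        Function<String, String> formal = greeterFor("Good evening");
        System.out.println(casual.apply("Wesley"));  // hey, Wesley!
        System.out.println(formal.apply("Charles")); // Good evening, Charles!
    }
}
```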

In the early days of object orientation, when its dominance in industry was new, many design principles, patterns, and practices we’ve since come to take for granted had yet to grow. Much of the field was still thinking in Procedural or Structured Programming terms and was still figuring out how to use the tool of OO well. This is normal. I’ve since seen it happen with Actors and with Microservices, and arguably even Functional Programming, with three-quarters of a century of history behind it, is still lacking in this area.

An object-oriented programming language’s native facility for obtaining an object is to call a class constructor. Often there is a special language keyword to do this, such as new. If you need some kind of object, this is supposedly how you get it. If that constructor needs some other objects, though, what does it do? It’s not entirely unreasonable to think that that constructor would go off and call other constructors itself, and so people did that sort of thing a lot.

However, this started to lead to some nasty smells. One would find oneself writing constructors that need a whole lot of parameters, even if only so they can pass most of those parameters on to other constructors; writing out the very long calls to those many-parametered constructors, “threading through” parameters from one class to another to another; and probably getting parameters in the wrong order several times, along with the other prosaic human errors that happen when you’re dealing with complicated code. To make matters worse, you ended up with classes and methods that are bound to specific context-dependent things in your application and don’t easily generalize to other circumstances; OO’s promise of reusable code fails to come through. This got especially hairy with the growing popularity of unit testing and test-driven development: you would try to set up a test suite for some class in your code and find that you had to build out a whole hairball of stuff just to set up an instance to test on, and then someone would tell you that if you were doing that then it wasn’t true “unit” testing, and you’d throw up your hands and go back to your old ways.
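
A sketch of that smell, with made-up class names; notice that SignupService takes SMTP settings it doesn’t even use itself:

```java
// Every constructor builds its own dependencies, so configuration has
// to be threaded through from the top of the object graph.
class SmtpClient {
    SmtpClient(String host, int port) { /* connect... */ }
}

class Mailer {
    private final SmtpClient smtp;
    Mailer(String smtpHost, int smtpPort) {
        this.smtp = new SmtpClient(smtpHost, smtpPort); // builds its own
    }
}

class SignupService {
    private final Mailer mailer;
    // SignupService doesn't use these values itself; it only carries
    // them along so Mailer can carry them along to SmtpClient. Testing
    // this class now drags in a real SMTP connection.
    SignupService(String smtpHost, int smtpPort) {
        this.mailer = new Mailer(smtpHost, smtpPort);
    }
}
```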

Eventually some genius remembered that we can pass objects along as parameters too, so we could just build those downstream dependencies ahead of time and pass them in. “Injecting” your dependencies’ dependencies into them, instead of just letting your dependencies build all their own dependencies: this “inversion” of the “control” over dependencies passed for enough of a mindshift to make it a buzzword.
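
And the same made-up classes with the control inverted; each class now names only what it actually uses, and all the construction gets pulled up to one place:

```java
// Dependencies are built ahead of time and passed in, so a test can
// pass in a fake Mailer without touching SMTP at all.
class SmtpClient {
    SmtpClient(String host, int port) { /* connect... */ }
}

class Mailer {
    private final SmtpClient smtp;
    Mailer(SmtpClient smtp) { this.smtp = smtp; } // injected
}

class SignupService {
    private final Mailer mailer;
    SignupService(Mailer mailer) { this.mailer = mailer; } // injected
}

class Wiring {
    public static void main(String[] args) {
        // All the construction happens in one place, at the top.
        SmtpClient smtp = new SmtpClient("mail.example.com", 25);
        SignupService signup = new SignupService(new Mailer(smtp));
    }
}
```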

Indeed, it was considered such a mindshift that people decided they couldn’t handle the responsibility themselves and needed frameworks called Dependency Injection “containers” to do the heavy lifting. It was new’s fault for letting us call it wherever we want; instead we should let a container use reflection and/or big XML configuration files to find all the right things to plug into all the constructors for us, so that wherever we wrote methods, whatever they needed could just be there. After all, “dependency resolution” was something of a solved problem in the realm of software installation. We had package systems for Linux distros, as well as those that manage the library code our projects depend on. They build a data representation of what depends on what, and what other whats those whats depend on, on down to the whats that don’t depend on any other whats, and then go find the right versions of the right things and put them where they’re needed.
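
Here’s a deliberately tiny sketch of that resolution idea applied to constructors: walk a constructor’s parameter types with reflection and recursively build each one. This is nothing like a real container (no interfaces, no scopes, no configuration, and a dependency cycle will blow the stack), just the core trick:

```java
import java.lang.reflect.Constructor;
import java.util.HashMap;
import java.util.Map;

// A toy DI container: resolve(type) recursively constructs the whole
// dependency graph. Works only for concrete classes whose public
// constructor parameters are themselves such classes (no primitives,
// no interfaces, no cycles).
public class TinyContainer {
    private final Map<Class<?>, Object> singletons = new HashMap<>();

    @SuppressWarnings("unchecked")
    public <T> T resolve(Class<T> type) throws ReflectiveOperationException {
        if (singletons.containsKey(type)) return (T) singletons.get(type);

        Constructor<?> ctor = type.getDeclaredConstructors()[0];
        Class<?>[] paramTypes = ctor.getParameterTypes();
        Object[] args = new Object[paramTypes.length];
        for (int i = 0; i < paramTypes.length; i++) {
            args[i] = resolve(paramTypes[i]); // recurse down the graph
        }
        T instance = (T) ctor.newInstance(args);
        singletons.put(type, instance); // cache as a singleton
        return instance;
    }
}
```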

DI/IoC Containers caught on widely in the worlds of enterprise Java and C#. This way of handling dependencies within a project had apparent time-saving benefits due to what it lets you avoid thinking about, against the tradeoff that it gives you an incentive to not think about those things. When you write a class you simply stick on the annotation that plugs it into the DI container, and now you don’t have to think much about how to structure dependencies.

In practice, how this often plays out is that if you’re coding in some new functionality and want some dependency the class doesn’t already have in scope, you just add it to the constructor and keep going. Heck, often the IDE will do it for you with a keyboard shortcut. You don’t care how long that constructor signature gets because you never have to write a call to it anyway. The constructor signatures get bigger and bigger, the import lists at the top of the files get longer and longer, and your unit tests have to set up more and more mocks, but because all this grew a little at a time you’ve stopped noticing it, like the people on Hoarders who are immune to the smell of their own homes. Without some design discipline, at the end of the day, this state of affairs is not only hardly better than just cramming a lot of stuff into a global namespace; it’s worse, because it fills your code with boilerplate, and your classes lose all cohesion or real meaning beyond being arbitrary bags of dependencies. What’s a separation of concerns?

In Scala’s early days of aiming to be a “better Java”, its designers tried to improve on this with a scoping feature called implicit, which ended up replicating similar problems in the form of a certain crime against humanity called the Cake Pattern. Implicits came to be widely hated, though maybe not always fairly; they ended up being repurposed to build out an FP concept called typeclasses, which sound awesome until you realize they’re kind of just interfaces.
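
For what it’s worth, here’s the “kind of just interfaces” point rendered in Java: the typeclass is an interface, the “instance” is an ordinary object you pass along explicitly, and Scala’s implicits mostly just let the compiler pass it for you. Show and showAll are made-up names:

```java
import java.util.List;

// A typeclass rendered as a plain Java interface.
interface Show<T> {
    String show(T value);
}

class ShowDemo {
    // An "instance" of Show for Integer, as an ordinary object.
    static final Show<Integer> SHOW_INT = n -> "Int(" + n + ")";

    // A function constrained by the typeclass: it works for any T
    // for which somebody hands us a Show<T>.
    static <T> String showAll(List<T> xs, Show<T> show) {
        StringBuilder sb = new StringBuilder();
        for (T x : xs) sb.append(show.show(x)).append(" ");
        return sb.toString().trim();
    }

    public static void main(String[] args) {
        System.out.println(showAll(List.of(1, 2, 3), SHOW_INT));
        // Int(1) Int(2) Int(3)
    }
}
```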

DI/IoC containers continue to be popular, especially in the context of opinionated-architecture application frameworks in popular enterprise platforms/languages, and I’m able to work with them. When it’s up to me, however, I prefer to use basic constructor- or setter-based DI methods in a manner influenced by things like hexagonal/ports-and-adapters or DCI architectures. Even then, messes get made.
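
As a sketch of what I mean, with made-up names: the domain class depends only on a small port interface, plain constructor injection supplies an adapter, and a test can supply a fake without any container involved:

```java
// Ports-and-adapters flavored constructor DI, no framework required.
interface NotificationPort {                         // the port
    void notify(String userId, String message);
}

class EmailAdapter implements NotificationPort {     // a "real" adapter
    @Override public void notify(String userId, String message) {
        System.out.println("email to " + userId + ": " + message);
    }
}

class InMemoryAdapter implements NotificationPort {  // a test adapter
    final java.util.List<String> sent = new java.util.ArrayList<>();
    @Override public void notify(String userId, String message) {
        sent.add(userId + "|" + message);
    }
}

class WelcomeService {
    private final NotificationPort notifications;
    WelcomeService(NotificationPort notifications) { // constructor DI
        this.notifications = notifications;
    }
    void welcome(String userId) {
        notifications.notify(userId, "Welcome aboard!");
    }
}

class Demo {
    public static void main(String[] args) {
        new WelcomeService(new EmailAdapter()).welcome("u42"); // prod-ish
        InMemoryAdapter fake = new InMemoryAdapter();
        new WelcomeService(fake).welcome("u42");               // in a test
        System.out.println(fake.sent);                         // [u42|Welcome aboard!]
    }
}
```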

Got a programming topic you’d like me to rant on? Send it to hoff2 at HEY dot com.

The following was written June 1, 2020, and originally posted to mental health-related internal chat rooms at the company where I worked at the time.

Yesterday was the last day of Mental Health Awareness Month.

Yesterday would also have been Wesley Willis’s 57th birthday.

If you’re from the Alternative Nation generation, you may remember Wesley Willis as something of a curiosity, maybe a bit of a joke, and a distastefully exploitative one at that; the sort of thing that would never pass in today’s social climate and should probably be left where we left it. A mentally ill street musician from Chicago (to most, with the word “musician” enclosed in irony-quotes) whom somebody seemingly allowed to start releasing albums and playing clubs for amusement’s sake, sparking a weird 1990s fad with a rather long tail on it.

The truth has a lot more to it, though, and today I want to talk about what that story obscures and misses, what Wesley Willis fans and friends-of-friends like myself knew. I want to shine a light on the man Wesley Willis was. The man who faced adversity after adversity and never lost the beautiful soul that inspired friends, brought joy to great numbers of music nerds, and ultimately made a way for himself to live authentically in a world all but designed against him. Whose sincerity and enthusiasm made him an effortlessly savvy self-promoter and brought him opportunities and agency denied to so many others like him. Who worked hard at doing what he both loved and needed, and radiated love as he did it. It surprised no one that his time on Earth was short, but he made the absolute most of it.

Two documentary films have been made about Mr. Willis, and of the two, one still has a bit of that snicker-at-the-freak smell on it. The one I’d like to invite people to check out if they can find it, Wesley Willis’s Joyrides, focuses on his life story and personal accounts from his closest friends and also gives space to his no less substantial visual arts career. While many have found Wesley Willis’s compositional methods primitive and maddeningly repetitive, there is much depth to be found for those willing to grant time and attention to his oeuvre. In this moment I would particularly like to highlight those songs in which Wesley Willis dealt frankly with his experiences with mental illness: “Chronic Schizophrenia” and “Outburst” from “Greatest Hits Vol. 1” are easy to find but there are many more.

On June 1, 2020, I can say that not only is the world a bit cooler a place for having had Wesley Willis in it, but that we are downright lucky Wesley Willis never met the fate of George Floyd or of so many other Black men, neurodivergent Black men among them, throughout our country’s history.

RIP Wesley Willis, you are still loved like a milkshake.

It’s been weird times lately for me, and for many others. If it hasn’t been weird times for you, that must be some nice bubble you’re living in.

I write this on a Sunday. Friday was my last day as an employee of The Des Moines Forge Of Pillar Part Of Industry X.0 By Accenture. Fortunately I’m a techbro and had a job offer the same day at a slightly higher salary. I don’t think I’m going to take it.

I happened to rewatch Atari: Game Over yesterday. In it, Howard Scott Warshaw, developer of not only the ill-fated E.T. game but also such unquestionable classics as Yars’ Revenge, reflected on how his brief experience at Atari kind of ruined him for subsequent jobs. Made him expect better, want more, be less willing to compromise for the sake of going along with the typical corporate environment. Not just because of the wild party culture of Atari in the early 1980s, but just as much because of the heights of creative achievement that he and his co-workers reached together, enabled by an openness in which they could bring their whole selves to work.

When I came on at Pillar, it was in the process of being acquired by Accenture. Accenture is the biggest corporation you’ve never heard of; it employs literally half a million people globally across all sorts of consulting and professional-services sectors. I knew of this; it was brought up in my interviews. I seem to have a knack for coming into companies at these moments; my hiring at Banno, my previous job, came at much the same juncture in its acquisition by Jack Henry & Associates. And just like with Banno, the message that went out in Pillar was that what the acquirer saw in the acquiree, the thing to which they attributed the smaller company’s success and that made it such an attractive acquisition target, was its great culture; that the big company’s intention was to learn from that culture and spread the lessons out to the larger organization; and that with that we were going to rock the world and it was going to be great.

I’ve had a fraught relationship with employment all my legally-employable life. The workplace was always just an even more repressive corner of an already repressive world, where I never quite belonged and suffered because of it; yet if I didn’t find a way to slot myself into it, I would never be able to support myself or a family. Pillar brought me out of the shell I’d built around myself since I was a child, and showed me my viewpoint was valid and valued in a way I didn’t realize was possible. It took time for me to adjust and really begin to open up, and I still feel like that process was interrupted in the middle, but I am so overwhelmingly grateful to the people I worked alongside at the Forge who helped that happen. It was a culture that delighted in being eccentric, and that spirit made us powerful, and because of it we did awesome work. Pillar showed me that I could be seen as more than just a human resource or a code monkey, that I could be valued as a fully three-dimensional human being with all my flaws and foibles and ambitions and silly jokes and wild ideas and nerdy interests.

A week ago Friday though, Accenture told me something very different. Even after all the apparently sincere and ground-breaking diversity and inclusion rhetoric I’d heard, and what I took to be so many refreshingly bullshit-free statements from CEO Julie Sweet, ultimately the message Accenture gave me was that I was expendable: that literally some opaque process running in some cloud, triggered by a fluctuation in a stock price or some such thing, could go and query an employee database according to some algorithm, based off data like how many of my work hours were directly billable to an already-won client, and by those results could decide my professional fate, and that that was just totally normal and the way things are done. Because that is evidently the world we are in now.

And this may be what sucks the most now: after Pillar, I don’t think I can bring myself to settle for anything less… but what more, really, is out there?

Last fall, I had the great honor, made possible by Accenture even, of being paid a normal work day’s salary to be a guest lecturer for Hour of Code, which is to say that I, along with several of my Forge co-workers, spent a day helping to guide public-school classrooms full of middle-school kids through some fun coding exercises, where they could learn a little about what I do for a living. One of the things that struck me most that day was how much more diverse those classrooms were than the rooms I typically spent a workday in. To be slightly flippant about it, there were actually Black kids there. And Hispanic kids. There was a Muslim girl (I’m assuming a little bit here; she was brown-skinned and had a hijab on) who was joyously competing with her Black friend next to her over who could complete the exercises quicker. I remember thinking that if I’m in a position to influence hiring decisions in ten years or so, I sure hope these kids apply. And there were also kids that were totally not into it, who thought all this techie stuff was bullshit, like the Black girl who aspired to be a journalist. I saw her point of view as no less valid, because kids like her are used to seeing white dudes with glasses like me doing this stuff, not people like them, and also because journalism is totally important and we need it. And yet, we’re talking about language here, which is all code really is, and even if you don’t make software your career, understanding how it works is only going to be more crucial to success in this society going forward. I didn’t quite have the guts to say what I really wanted to say, because I wasn’t the real teacher and it wasn’t my class. But that was the beginning of some notions in my head that I have higher purposes to serve than just building another accounts-payable system.

I don’t know what my future holds, but I don’t think I’m likely to just take another plain old tech job. The kind of thing I really want to do involves sharing the joy of creativity that comes with my profession, and helping young people who might otherwise give it up before they get the chance to find their way into it; because I nearly did too, and that’s a whole other story. Pillar was the first place I worked that tried to truly treat that kind of ambition as part of the job, and whatever your company is, if you can’t get down with that, I don’t think I can fuck with you anymore.

Fortunately, I have maybe a couple months worth of financial cushion in which to try to sort this all out. That’s not a privilege I ever had before.

Like I said at the top of this rant, these are weird times. Hunter S. Thompson is famously credited with saying, “When the going gets weird, the weird turn pro.” I’ve been weird my whole life and this is my time to shine.