A couple of weeks ago, I got bored and decided to come up with a list of
things that have bothered me when trying to run software to get things
done. These might be reliability concerns, or development issues,
or really anything else that bothered me at the time. This actually
took the form of a table, with the languages as the rows and the
annoyances as the columns. I would recommend other people try it with
their own annoyances
and see how things stack up. It was interesting to look at the rows to
see which choices were particularly bad because they hit so many of
them, and then to look at the columns to see how often they showed up
regardless of the language or environment.
But, the problem is that when you key off the language name, it's going
to bring the clown brigade down on you, and I mean even more than you'd
typically get for a post grousing about something dumb that people use.
So I decided to skip the table, ignore the rows, and focus on the
columns - that is, the actual things that bug me. Instead, we'll be
looking at it in written form, and you can just guess at whatever
languages I'm talking about.
Hint: if you can't find something stunningly wrong with your "chosen
one" language, you probably haven't been using it long enough... or you
created it. Either way, try harder.
So then, on to the list.
I don't like virtual machines. I find them particularly goofy when the
so-called "write once, run anywhere" programs only ever run in one place
(Linux, and usually a particular distribution at that) on one
architecture (x86_64) in reality. I find it even worse when the VMs
have to be bundled with the code because they are so tightly coupled,
and you find yourself running multiple distinct versions to cope.
I don't like meaningful whitespace. If a tab means one thing and a
space means another and they both print the same way (i.e., NOTHING, a
big blank space on the screen), I'm not going to be happy. Also, if the
amount of whitespace somehow controls branching or blocking of code, you
better believe I'm not going to be happy when it trips someone up.
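If you want a concrete (if language-revealing) sketch of the branching problem, here's one in Python, where indentation alone decides what a line belongs to. The function names and numbers are made up; note that Python 3 will at least throw a TabError on an ambiguous tab/space mix, but spaces alone are plenty to cause trouble:

```python
# Two functions that differ ONLY in the indentation of one line.
def tally_v1(items):
    total = 0
    for n in items:
        total += n
        total += 10     # inside the loop: bonus added once per item
    return total

def tally_v2(items):
    total = 0
    for n in items:
        total += n
    total += 10         # outside the loop: bonus added exactly once
    return total

print(tally_v1([1, 2, 3]))  # 36
print(tally_v2([1, 2, 3]))  # 16
```

Same characters, same order, one line shifted left, two different programs. That's the trap.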
I don't like interpreters. They tend to bring along their own
dependency hell, particularly when they themselves keep getting revised
and aren't fully compatible with the existing code. It's shades of the
VM situation, only with the added benefit of being even slower!
I don't like syntax errors and other time bombs that hide until runtime.
They should be ferreted out during the development phase with a compiler
pass. But, if there's not something which constructs a whole parse tree
ahead of time, it's quite likely it'll bite you much later. Note: this
is NOT a 1:1 with interpreted languages. Some of them will quite
happily notice your derpy syntax error when you start the program and
will refuse to run, even if it's in a branch somewhere. Others will
just barrel along until they finally arrive there some hours, days, or
even weeks later. Then... BOOM.
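Python happens to illustrate both halves of this. It does parse the whole file up front, so a genuine syntax error fails at startup. But a misspelled name in a cold branch parses just fine and sits there armed. A made-up sketch (both function names are invented for illustration):

```python
def handle_request(is_admin):
    if is_admin:
        # Typo: audit_lgo instead of audit_log (both made up here). The
        # parser has no complaint; this only explodes when an admin
        # request finally arrives.
        audit_lgo("admin access")
    return "ok"

print(handle_request(False))     # "ok", for hours or days on end
try:
    handle_request(True)         # ...then an admin logs in
except NameError as e:
    print("BOOM:", e)            # name 'audit_lgo' is not defined
```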
I don't like a ton of magic characters amounting to line noise that you
absolutely must understand to completely follow the code. If you
are constantly tacking on ! or ? or things like this in ordinary code,
or are making little ASCII-art drawings, this suggests you went down the
wrong road with your syntax somewhere.
I don't like dependency hell in terms of libraries. Some ecosystems are
built around the assumption that you WILL import every single thing that
you can think of from somewhere on the Internet if you can. They will
add dozens of dependencies directly, and will pick up dozens or hundreds
more transitively as those projects continue the pattern on down the
line.
I don't like ecosystems that assume you will just talk to the Internet
any time you decide to do something involving development, building,
deploying, or even starting additional instances. You need to be able
to do your work effectively while completely airgapped. Otherwise, you
end up subject to any number of opportunities for downtime when
something you don't control acts in a manner that's not helpful to you.
Also, I'm just talking about the unintentional acts here. The
*intentional* attacks are a thing as well, and they're even worse!
I don't like garbage-collected memory management. I've seen too many
places plow stupid amounts of time and energy into trying to avoid the
dreaded "stop-the-world GC" problem while still being based around it.
I mean, okay, it keeps them busy, so there's job security if your boss
doesn't realize what's going on, but some of us would rather call
something done and walk away from it.
I don't like any environment where "monkey patching" code is done in any
way, shape, or form. If your solution to running something is to reach
into some other code and do something truly nasty to the way it works,
you are already out in the weeds and I want nothing to do with it. If
your system literally cannot run without having to do this to other
parts of the system, you have a serious problem and need to reconsider
your choices. I mean, it's probably Conway's Law in action, but COME
ON.
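For the record, here's the kind of thing I mean, sketched in Python with a made-up patch (no real library ships this exact code, thankfully):

```python
import json

# "Library" code that decides everyone's json.dumps output is too roomy
# and reaches into the stdlib to change it for the ENTIRE process.
_original_dumps = json.dumps

def _terse_dumps(obj, **kwargs):
    kwargs.setdefault("separators", (",", ":"))   # silently drop the spaces
    return _original_dumps(obj, **kwargs)

json.dumps = _terse_dumps

# Completely unrelated code in the same process now gets output that no
# longer matches the documentation:
print(json.dumps({"a": 1, "b": 2}))   # {"a":1,"b":2}
```

Nothing in the calling code asked for this, and nothing in the calling code can see why it happened. That's the problem.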
I don't like systems that can't actually run code in parallel.
Individual processor speeds aren't climbing the way they were 20 years
ago, so instead, we get more of them. If I have two
threads which need to do work and two open processors in the machine, I
expect the kernel to pick them up and run them! If there's something
about the situation which precludes them from actually running in
parallel, then obviously it's going to end up being a waste of
resources. It'll also run slowly, and the "solution" will be to just
run MORE of the standalone things so it can't "switch gears"
mid-request. Great. And you wonder why stupid services turn into
1200-machine monsters that then require additional monstrosities
(kubernetes, anyone?) to manage.
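For a concrete sketch of "two threads, one core's worth of progress," here's what the situation looks like under CPython's global interpreter lock. The workload and counts are arbitrary:

```python
import threading
from multiprocessing import Pool

def count_down(n):
    # Pure CPU work. Under CPython's GIL, two threads running this take
    # roughly as long as one thread doing both jobs back to back.
    while n > 0:
        n -= 1
    return 0

def run_in_threads(n, workers=2):
    # Two threads, two idle cores, and still effectively serial
    # execution: only one thread holds the interpreter lock at a time.
    threads = [threading.Thread(target=count_down, args=(n,))
               for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

def run_in_processes(n, workers=2):
    # The usual "fix": pay for whole extra processes to get real
    # parallelism. Multiply by a few hundred services and here comes
    # your 1200-machine monster.
    with Pool(workers) as pool:
        return pool.map(count_down, [n] * workers)

if __name__ == "__main__":
    run_in_threads(1_000_000)
    print(run_in_processes(1_000_000))   # [0, 0]
```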
I don't like situations where the type system is useless, or loose
enough that it might as well not exist. This can
lead to all kinds of crazy stuff hiding in the source until runtime, at
which point it can blow up because a Foo doesn't have a bar() method,
whereas a Bar does, but you can store both in x, and the code is
"x.bar()", so ... BOOM.
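Spelled out in Python, using the same made-up Foo/Bar names from above:

```python
class Foo:
    pass                    # no bar() here

class Bar:
    def bar(self):
        return "bar!"

def poke(x):
    # Nothing checks what x is before this line runs. Store a Bar in x
    # and it works; store a Foo and it detonates, but only at runtime.
    return x.bar()

print(poke(Bar()))          # bar!
try:
    poke(Foo())
except AttributeError as e:
    print("BOOM:", e)       # 'Foo' object has no attribute 'bar'
```

The compiler (such as it is) had every opportunity to catch this and took none of them.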
I don't like designs which use exceptions as a matter of course. Every
exception amounts to an "ON ERROR GOTO x", and it happens far away from
where the action is. When something blows up, you are suddenly ejected
from that spot and end up somewhere else far far away and have to deal
with it. Unsurprisingly, people in these situations frequently deal
with it poorly. It's "COME FROM", only for real, and it sucks.
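A small made-up Python sketch of the ejection seat in action:

```python
def parse(record):
    return int(record)          # the failure happens HERE...

def load(records):
    return [parse(r) for r in records]

def main():
    try:
        return load(["1", "2", "oops", "4"])
    except ValueError:
        # ...and control lands HERE, three frames away. The work already
        # done on "1" and "2" is simply gone. It's ON ERROR GOTO with
        # nicer spelling.
        return []

print(main())   # [] -- two records were parsed, then thrown away
```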
I don't like code which plays fast and loose with pointers like it's the
80s all over again. Odds are, whatever you're doing doesn't actually
need you playing around at that level of things. If you're working on
some tiny little machine where the memory is measured in bytes and the
clock speed is measured in kHz, then fine, sure, bum instructions out
and be clever. Everyone else, knock it off. Hint: x[y] is really,
really unsafe in a WHOLE BUNCH of languages. Those brackets are just
waiting for you to stick your head in so they can snap closed and crush
it.
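You can even reproduce the unchecked x[y] from Python by dropping down to ctypes, which hands you the same raw pointer semantics as the C underneath. A sketch with made-up values:

```python
import ctypes

buf = (ctypes.c_ubyte * 4)(10, 20, 30, 40)

# A raw pointer view of the same four bytes. Indexing through a ctypes
# POINTER is NOT bounds-checked, exactly like the C it wraps.
p = ctypes.cast(buf, ctypes.POINTER(ctypes.c_ubyte))

print(p[0], p[3])   # 10 40: fine
print(p[8])         # index 8 of a 4-byte buffer: no IndexError, just
                    # whatever byte happens to live past the end
```

No error, no warning, just whatever memory was sitting there. Now imagine that read is a write.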
There. Those are my dislikes. If they don't match yours, surprise! We
are not the same. Figure out what works for you.