- Fixing broken windows, computer security. Code habits.
You know what? I have been kind of disappointed in the way management works at work. It is always like we’re trying to work on a new project to make things so much bigger and faster, but we don’t really have a culture of going back to revisit past work and trying to improve it. I think this is especially important for past work that is still in active production use.
Yeah, seriously, on my own time I have very much developed the idea of practical organizational habits. Again, I reiterate, because this is important! How do you avoid wasting time organizing something that is short-lived and will be deleted soon? Easy: you just allocate a fixed time schedule and organize the collection a little bit at a time. Over long periods of time, long-lived information will end up very well organized, while short-lived information will only be partly organized before it gets deleted.
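Here is a minimal simulation sketch of what I mean, purely as an illustration; the item lifespans, the daily effort budget, and the “organization units” are made-up numbers I picked for the example, not anything measured from real life:

import random

random.seed(0)

# Each item has a remaining lifespan (days) and a count of how much organizing it has received.
items = [{"lifespan": random.choice([3, 10, 200]), "organized": 0} for _ in range(30)]

DAILY_BUDGET = 5   # fixed amount of organizing effort spent per day, a little bit at a time
DAYS = 100
deleted = []

for day in range(DAYS):
    # Spend the fixed daily budget across the collection, round-robin style.
    if items:
        for i in range(min(DAILY_BUDGET, len(items))):
            items[(day + i) % len(items)]["organized"] += 1
    # Age every item; drop the ones whose lifespan has run out.
    for item in items:
        item["lifespan"] -= 1
    deleted.extend(it for it in items if it["lifespan"] <= 0)
    items = [it for it in items if it["lifespan"] > 0]

print("still around (long-lived):", sorted(it["organized"] for it in items))
print("already deleted (short-lived):", sorted(it["organized"] for it in deleted))

If you run it, the items that stick around should end up with many more organization passes than the ones that got deleted early, which is exactly the point: the effort naturally concentrates on what lasts.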
But seriously, how do I feel about the practice as it is being done at work? I feel like this fits in with the “fixing broken windows” criminology theory. As the theory goes, if you take the time to fix the broken windows in an abandoned building, it is less likely to become a criminal organization’s crime center, but if you leave the windows broken, more people will come and vandalize the structure until it finally collapses. I feel like a similar thing is going on with our imperfect infrastructure. The fact that it is imperfect tends to cause more things to break whenever we try to service or maintain it, and this problem is likely just going to keep getting worse until the system finally collapses.
But do you know what is really interesting about this analysis? Think about how it applies to mass-market software. All the vendors are competing to add the maximum number of features to increase productivity, right? But what about the security of the code? Well, a similar thing is going on there. One by one, little security vulnerabilities pile up in the code. The development team would very clearly notice most of these vulnerabilities if only they were allocated some time to examine the code for them, but do they have the time? Nope, management keeps them so busy that they can’t even think about that. But anyway, since this is a fairly public product, criminal passers-by go and look at this seemingly dilapidated building, and they say, “Oh, there is this vulnerability here, and that vulnerability there. Why don’t I take advantage of them?” And once the criminals really get started with their illegal businesses, a criminal economy starts to thrive around the product, and everyone else is affected by the chaos.
And guess what? A simple management decision to run a “fixing broken windows” project would have been an easy way to steer clear of this issue.
As I have stated just below, there is this thing that I dub “the tourism effect.” And actually, when you think about it, this exact same problem applies to software and innovation. The extreme innovators want to keep breaking new ground and churning out half-baked solutions. Yet computers must be programmed with great precision if they are expected to deliver highly precise results, and highly precise results are required for the secure operation of computer systems. So for companies that define themselves as constant innovators, one has to expect software full of security holes and vulnerabilities.
So anyway, that’s the real background on the lack of a “fixing broken windows” cultural practice at high-tech companies. Yep, the root of the problem comes from the same human psychology that drives innovation, just as has been pointed out in the book The Design of Everyday Things. Again, I reiterate, because this is important!