A customer uses software developed by a 45-year-old developer. It has been in production for over 10 years, the customer owns the code, and the developer is paid to write and maintain it. In my opinion, this approach is the right one, as it gives my client the freedom to use their own code. The program is written in C and works well, having never caused any particular issues, although it strikes me as quite heavy for what it actually needs to do.
Yesterday afternoon, I was at this client's for an emergency (which I will describe later), and since I was there, we tried to run it on FreeBSD, but it didn't work. The client asked me to take a look at the code, hoping I could figure out what was wrong. Despite not doing much development myself, I realized the issue was that the program was trying to access files at hard-coded Debian paths; once those were fixed, everything worked fine.
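To illustrate the kind of fix involved (the file names and the environment variable below are hypothetical, not the client's actual code), here is a minimal sketch of replacing a hard-coded Debian path with one that can be overridden at run time:

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical sketch: instead of opening a fixed Debian path such as
 * "/etc/myapp/myapp.conf", allow an environment variable to override
 * the location so the same binary also runs on FreeBSD. */
static const char *config_path(void)
{
    const char *p = getenv("MYAPP_CONFIG");          /* assumed name */
    return (p != NULL) ? p : "/usr/local/etc/myapp.conf";
}

int main(void)
{
    FILE *cfg = fopen(config_path(), "r");
    if (cfg == NULL) {
        perror("config");
        return 1;
    }
    /* ... parse the configuration here ... */
    fclose(cfg);
    return 0;
}
```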
However, I noticed a detail: at startup, it parses a configuration file (old-school, not JSON) looking for values set to "ON" and "OFF". The parsing is done character by character, but in the end that doesn't matter much, since it only happens when the program launches. I noticed that a series of ints is defined, with ON stored as 1 and OFF as 0.
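For context, a minimal sketch of the pattern I'm describing, with hypothetical option names rather than the real ones:

```c
#include <stdbool.h>
#include <string.h>

/* The style used in the program: plain ints holding 1 for "ON" and 0 for "OFF". */
int logging_enabled = 0;     /* hypothetical option */

/* The same idea with C99's <stdbool.h>, which states the intent explicitly. */
bool verbose = false;        /* hypothetical option */

/* Map a config value ("ON"/"OFF") to a boolean. */
bool parse_on_off(const char *value)
{
    return strcmp(value, "ON") == 0;
}
```

Either form costs essentially nothing at run time; the difference is mainly about expressing intent.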
Out of curiosity (I imagine the developer has since labeled me a troublemaker), I asked why he hadn't used bools, and whether a different numeric value was ever expected. He replied, irritated, that it didn't make sense to use bools just to save "a few bytes". After taking a closer look at the code, I realized it wasn't remotely optimized, with "monstrous" computational complexity for what are, in fact, simple operations. Of course, I said nothing, but the developer clearly noticed my expression. Unfortunately (or fortunately), I'm very expressive. He started saying that hardware is advancing, the world is advancing, and that the "obsession" with optimization is pointless. I didn't respond. The client wasn't around, and I had no intention of sharing my thoughts on the code with him.
Back in the office, the client called and asked why the developer had complained that "I criticized the code," and what I thought about it. I simply replied that, in my opinion, and from what little I had seen, the code seemed fine; I had only asked questions about some implementation choices. Everything was fine, everything was resolved - the developer isn’t a bad person, just a little sensitive.
My takeaway here: the mini-PC I now have at home consumes half the power of the previous one, has far better performance, and opens up new possibilities. What's the point of creating better hardware if we "eat up" the advantage by filling the software with inefficiencies?
-
@stefano Even back in the 1990s, personal computers were sufficiently slow that optimization could make a very noticeable difference in software performance.
For a *very* long time, games truly pushed the envelope on what hardware could do.
These days? "Meh, that's just the way things are." No one gives all that much thought as to whether things *need* to be slow.
In at least one case, I've even seen software slowness being touted as a *feature*.
-
@mkj @stefano @mhd customers actually feel more confident in an algorithm if it takes a while to produce a result, or if it feels like the site is checking more things.
It's the same as making a consumer vacuum loud so it seems "powerful".
Sometimes, though, this gets misused in loading screens that are far too long.
-
@risottobias @mkj @mhd True. I've seen people view "slow" things in a positive way.
It reminds me of the charm of listening to the floppy disks making noise while loading (or watching the cassettes spin). It felt like, in some way, the computer was "thinking." The immediacy of today almost seems like an approximation.
-
@stefano Did booleans exist in K&R C? I distantly remember writing a lot of code where int was the default type for Boolean logic - because the processor was going to load 16 bits from memory anyway, so narrowing would cost time…
I also suspect the complexity may reflect the slowly changing requirements, each change adding a layer. Your dev might be prioritising not breaking stuff over performance.
-
@[email protected] From K&R C section 2.2, page 34: there are four types: char, int, float, and double. short, long, and unsigned are qualifiers to the int type, not types themselves. int is the natural size and has no requirement that it be smaller than or equal to long (something I think ANSI C got wrong). So no bool.
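A small sketch of the two idioms being contrasted (the variable names are just for illustration):

```c
#include <stdbool.h>    /* bool, true, false: added in C99;
                           bool is a built-in keyword as of C23 */

/* Pre-C99 / K&R style: int doubles as the boolean type. */
int done = 0;           /* 0 means false, non-zero means true */

/* Modern equivalent with an explicit boolean type. */
bool finished = false;
```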
My copy of K&R is starting to fall apart because of the acid paper - I wonder if it could be reprinted.
-
@bluGill @stefano @mkj @mhd @steely_glint hmmm, that notified me but didn't @ me. maybe Fedia's running weird software.