The Artless World of Software Engineering

It didn't take me very long to show interest in writing commercial software. I was dreaming of it by the age of 12, perhaps younger. In high school, I developed an interest in animation, and thought I might be good at it. Somewhere in the middle was computer animation.

In college, I discovered CS 185, taught by the now-dismissed R. Hyde. It was an important lesson, and a well-earned A. Our group tried to write a video game in ten weeks, in assembly. Concept art, planning, and module identification was as far as we got. What we learned is that unless there is enough money, there is never enough time to write it correctly.

An artist can spend years on a painting. A programmer cannot spend years on software unless it is a work of art. Who considers software a work of art? There is vivid, artistic software out there; even the experimental and visual efforts are noteworthy (just gaze at WinAmp's visualizations as an example). None of it is widely regarded as art.

Paint is really interesting. You can slop some on your hand and press it against canvas, or flick it off your brush. You can even fill a balloon and drop it from heights to burst it onto the canvas, or do wild things like spin the canvas while dripping paint onto it. A child can create art. I envy the visual arts. They end up on the family refrigerator. Software, however, does not. Without descending into the anticipated drone of self-pity, you can imagine that software engineering makes it difficult to earn approval, acceptance, and praise. Such things are important to a child; hence the interest in computer animation.

There was a period when computer software was regarded as an art form, back when the equipment imposed hard limitations. Programmers devised clever mechanisms for accomplishing previously unreachable goals, such as coaxing vivid color screens out of a 32-color raster display long ago. For a modern software engineer, this borders on impropriety, but the genius, the art of coding, still thrives in a dark corner. True viruses remain, to me, the most stunning example of art. Code capable of polymorphism, self-replication, stealth, and even mutation; a rare effort. The beauty is they are given away, even if you don't want them. Sadly, that's evil genius.

Intrusion detection systems naturally must exceed the sophistication of viruses, but knowledge of them is several orders of magnitude less common. A first-grader may know the difference between a worm and a virus, but how many government employees know what is helping keep them safe, 8 to 5?

Every sufficiently complex program has errors. Errors are often accepted in the world of art. Look at the works of M. C. Escher: he made a career partly out of his impossible-perspective drawings, a career out of errors. Errors in software are not perceived as art. It may be an art form to actually use error-prone software with any stability, but the software's errors are not art. The errors get in the way of human expression. Errors in software are as useful to most people as a sweater with no opening for a neck. Immediately, your mind thinks of ways to make use of such a sweater, doesn't it?

Humans need to relate to what they experience in order to regard it as art, be it music evoking an emotion, comedy connecting with our own experiences, or a photograph that captures a word-thought. Until software reaches us on that level, it is abandoned as potential art. At this point, you might be considering something like a video game to be art. Can you do that fairly to all involved? It is multimedia, but the art is in the media, not the software. I carve a careful exception for some AI programming in video games and for custom software such as Massive by Weta Digital, where the art is distinct from the media.

Art in software engineering lies on the fringes. The endless sea of software is a vast and jumbled space. When the sea rises to consume those breaking shores, the waters on the fringe are drenched in mass, and the art, the limits of software move elsewhere.

Head for shore.


The Tower of Babel

The Holy Bible was written by humans inspired by God, and it has meaning today and tomorrow. It never changes, and what was once freaky and weird is starting to eerily make some sense. The Tower of Babel (Genesis 11) is historical. Some consider the Ziggurat at Eridu to be this Tower of Babel. There is no doubt that the Tower of Babel was in Mesopotamia; the problem, of course, is that many of these massive structures were ravaged to build others. I've read estimates as high as 50 ziggurats. Regardless, the structure stood at one point in the land of Shinar.

The effect of the Tower of Babel is that language diverges. God introduced this phenomenon. That's why (for example) an Urban-American likely won't make sense of the spoken words of a Creole-American, and the rest of us have trouble with both. French, Italian, Spanish, and English are all Romance languages (as in, the Romans gave us the root: Latin); going back further in time, there were fewer languages. I, with my California roots, often intentionally mix some trite Spanish expressions into what I say. Compound that with an odd cross-section of mid-western words (pop, not soda, for example) and a very mild case of Minnesotan accent, and I've ended up in a peculiar corner of English. Give me enough time, and may I raise up a nation through my children, and you might hear an exchange on the order of:

You want to come with for dinner, dontcha? Steve caught a grande walleye, Amigo.

Human language divergence point made, if a bit forced.

Enter computer languages! Computer languages do not require social acceptance to exist. In fact, all they require to become prevalent is financing. IBM has kept REXX alive decades longer than prudent, and the mere existence of VBScript is a testimony to Microsoft's die-hard attitude about languages of its own making. Ten years ago, markup languages did not exist in vast numbers; I think there were only SGML and HTML. Now, XML has spawned a rebirth of dialecting. I used to think there were 100 programming languages (in a loose sense of the word "programming"). Now, knowing more than 30, and counting all the proprietary languages (such as the command scripting of a Cisco router), I'd guess there are beyond 1,000 computer languages. Twice that if you count versions and revisions.

Every once in a while, someone comes around and decides it is time to try to converge. Does Esperanto ring a bell? There's also a son of Esperanto, Ido. I believe someone said that Ada was to be a computer-language convergence. My study of it suggests that they tried: a little of everything.

As humans, we need to be aware of this phenomenon. If it ever goes away in my lifetime, that, along with several dozen other reasons, would suggest that the end of the world is nigh!

Since this is my introduction to web chronicling, I'd like to point out that I claim Vexar as my own 'net name. I was Vexar in late 1991 as an email address at the first college I attended, pre-dating its use by countless others as a personal identity. Does it predate the trademarked orange fencing? Probably not. A dear friend of mine pulled the name off the side of a Revell plastic model, back in the years when computers were running at or below 1 MHz; this suggests commonality, as both are plastic. Perhaps it is a fitting name for a warship of space, when taken in the original Esperanto.

Subsequent to this claim, I have attracted a challenge to its use. Victor Mercieca, who is likely of Russian origin, contends that his original coinage, Vexzar (a hybridized Russian word), predates my own use. However, Victor has yet to provide historical evidence of its use prior to 1991, and his name is only a homonym of mine, with an Americanized spelling adjustment. The debate continues...