Monday, March 7, 2022

Machines DO NOT Think, You Idiot...

 "I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted..." -- Alan Turing. "Computing Machinery and Intelligence"



I don't know why I keep doing this, but I did it again.

The Overlord attended a...get this..."symposium" today regarding "new advances" in the "science" of Artificial Intelligence.


Three things:

1. A "symposium" was once defined as "a meeting or conference for the discussion of some subject, especially a meeting where several speakers talk on or discuss a subject before an audience. Now this apparently means "opportunity to sell complete shit to a captive audience who paid to be here, and who won't leave because paid to be here."

2. Everyone will use some variation of the phrase "we haven't reinvented the wheel, but have merely improved it". If by "improved" you really mean "more complicated and more-expensive than prior versions, with similar or lesser functionality", then you're a jerkoff.

3. You will never hear the words "granular", "vanilla", "paradigm", "interface", "teamwork", "symbiosis", or "revolutionary" flung about with such reckless abandon, nor so detached from actual meaning. Today's "featured speaker" merely tosses out a word salad of buzzwords and jargon and assumes that everyone knows what he means on an emotional, rather than an intellectual, level.

The Story:

As regular readers of this nonsense know, the Overlord has spent all of his adult life working in the information technology field. He began as a lowly computer operator, testing operating systems for the giant of the industry way back in the day, and as time progressed, so did he. Computer Operator became Senior Computer Operator, which became Data Center Supervisor, then Data Center Manager, and ultimately, he became a Systems Programmer, with automation as his specialty.

He did not attend college until he was 47, and only then, online. He emerged from that experience, finally, with his degrees -- plural -- in Computer Science by the age of 50. He learned the business by doing, but after taking some time away in his late 30's to see to his health, he discovered that a return to such work was difficult at age 42, because people believed -- they still do -- that without the "credentials" bestowed by a Day Care Center with a Football Team, it was IMPOSSIBLE that he did whatever it was his resume said he did.

This is because the newer generation of Managers, products of the conveyor belt of fucktard that is college, believe that since they couldn't have done it without spending five years and $300,000 to acquire those credentials, no one else could have, either.

But that was okay, because when I got tired of having doors slammed in my face chasing jobs I could have done in my sleep, I decided I didn't need a corporate tit to suckle on and hung up my own shingle. This worked out very well for me for quite a long time, until the Peking Pox shut society down, nearly (only "nearly") bankrupted my company, and, in the interim, the Big Boys who normally wouldn't give my usual clientele a second glance figured out how to monetize even small and mid-range businesses...often with the protection of, or subsidy from, government (that is for another day).

And so it was that the Overlord decided it was time to hang up his coding chops and find something else to do with his life.

And thus far the search has been unfruitful. I don't find an interest in much of anything else. But that's a personal problem, and I'll solve it, eventually.

Anyways...

A friend of mine was attending one of these interminable and insufferable "symposiums" on Artificial Intelligence, and he thought it would be a great idea if I tagged along on the basis that he's a relative neophyte to this arcane "science" and that my background in automation (Automation is what "Artificial Intelligence" used to be called, after all) might help him keep up or better understand what was being discussed.

So, he got his employer to hire me for a couple of days as "a consultant" on the subject; thankfully, I did not have to pay for this torture session myself, and the fee was worth my attention.

Off we went to "the symposium"....

...and it's the same song and dance these things have always been.

Before I get to the heart of the matter, let me explain to you what Artificial Intelligence is (or rather, is supposed/assumed to be), how one goes about developing it, how it is used, and why the constant stream of bullshit from a bunch of MIT graduates and salespeople is misleading. Often deliberately so.

Man has always been able to record, organize, and make use of data. It might be in forms we do not recognize as such -- Stonehenge, say, or The Great Pyramids -- but these things are the practical application of data. The engineering alone, the ability to organize the necessary labor, are proof of Man's ability to make use of data even before the invention, or at least the widespread use, of systems of writing or standard mathematics.

As time has gone on, Man has found better ways to collect, sort, categorize, mine, and make use of data in various forms: everything from the printing press to the thermometer, the telegraph to the punchcard, and the various sensors that detect or measure light, energy, and motion -- all means of collecting, recording, and systematizing data. Always have been.

The Information Age simply speeded up the process, so what might have taken a medieval scribe several days or weeks to discover sorting through various books or scrolls in dusty libraries is now done nearly instantaneously by electronic means. The process, itself, has not changed -- merely the way in which it is accomplished.

And since every new revolution in technology results in a lessening of the need for human labor, even the need for human intelligence, each new "advance" is not so much a revolution as it is a creeping elimination of the human being. For example: the Wheel made transport -- once only possible on land by the brute labor of men or animals -- easier and less labor-intensive. The repetitive machine press did away with the need for battalions of men with strong backs hammering metal by hand; the printing press did away with the need for transcription or recording of data by many hands, made data more-portable, gave it a greater audience, and allowed for the liberalization of knowledge that was once locked up exclusively in the "Learned Languages" of the ancient world (Greek, Latin, Hebrew, Arabic), unleashing "The Classics" in printed form in vernacular languages.

The same can be said of what the computer has done to many fields of human endeavor. As an example, my first really large-scale automation project involved stockbrokers. Their job was mainly selling things to people, but the process of completing the sale was complex, labor-intensive, involved too many steps, and, at the end of the day, often cost more than it returned -- as when your broker promised you so many shares at such-and-such a price, but by the time the order made it to the exchange floor the price had gone up, and the company had to eat the difference between the promised and the actual price.

Of course, this ate into profits if it happened with too much frequency. And back then, it did.

Computers offered a solution to this problem, considering that the processing of an order was mainly a matter of data entry at various levels.

Suffice to say, the first victims of computerized efficiency -- artificial intelligence -- were white collar workers, the people who get maligned the most, but whose positions are often more-tenuous than the average person appreciates.

If I recall, 3,000 people involved in the trading of stocks -- brokers, runners, clerks of various shades, regulatory bureaucrats -- all lost their jobs by simply converting the simpler aspects of stock trading into an electronic process.

It would not be long after that the blue collar guy got a taste of automation, as well. Robots, after all, don't call in sick, demand outrageous recompense, join unions, or go on strike. Advanced machine tools operating in multiple axes don't complain, file Worker's Comp claims, or engage in lawfare based on little more than subjective feelings. Computers and robots don't need paychecks, retirement plans, nor are they subject to payroll and FICA taxes. The blue collar guy suddenly felt the pain the white collar guy did, and it was much more painful because the blue usually had fewer options than the white.

In that regard, the machine is a Great Equalizer. It makes All Men Equally Worthless.

It was industrial progress that brought about the Welfare State (in Roman times it was the availability of cheap labor, i.e. slaves, much like today's availability of millions of coolies around the world, that brought about the first massive welfare regimes). After all, when technology advances to the point of creating mass unemployment and discontent, the masses have to be mollified somehow if they aren't to revolt.

When you make people redundant, or even unnecessary, to the process of production, you get trouble.

It was true when the first waterwheels and windmills went up, and it's still true today.

Modern technology has almost (not quite completely, but just wait) emulated the processes and effects of physical labor. There are very few jobs left to humans that a machine cannot do with better quality, more precision, and less waste than a man can. And in those few areas where the machine has not yet intruded -- say, plumbing -- the "solution" to that problem is already in the works as well. But that mostly requires a redesign of existing systems to fit the technology, rather than adapting the technology to fit the system. The house of tomorrow will most likely contain its own robot plumbers and electricians, housed within structures deliberately designed to allow them to do their work unimpeded and independent of human beings.

What the purveyors of Artificial Intelligence mean to do is to repeat the performance on the intellectual level, and since all computing processes are an expression of a logical, intellectual process -- mathematics -- it is far easier than it has ever been to program a machine to answer phones, lay asphalt, plow and sow a field, manage all sorts of work- or product-flow projects, even provide customer service -- and cheaper than getting a human being to do these things.

What Artificial Intelligence purports to be, and is often misunderstood by the layman to mean, is the recreation of the human thought process by electronic means. For various reasons, this is impossible.

For a start, the human brain is the least understood organ in the body. Just how it works is a mystery, and the processes by which humans learn, remember, and think abstractly when necessary cannot currently be adequately recreated by a bunch of machines. Of course, the proponents who see the Brave New World coming pooh-pooh this idea, insisting it is merely a matter of time before the secrets of the human psyche are utterly laid bare, but they are fooling themselves, to a large extent.

Or, rather, fooling their potential customers.

It is here that The Overlord must bore you to tears by explaining, in a nutshell, how computers actually work.

A computer, whatever form it may ultimately take, is merely a calculator. It performs mathematical functions at a high rate of speed, and is therefore capable of performing billions of such calculations per second, since the media it uses -- electrical signals in hard-wired systems and light in fiber optic systems -- both travel at a substantial fraction of the speed of light (186,000 miles per second). The biggest limitation in speed is actually the storage media, much of which still depends on fast-spinning disk drives whose mechanical parts cannot come anywhere near such speeds without flying apart.

So long as you can keep the thing from melting from extreme heat and you can continue to build them with higher capacities for memory, transfer, and other basic functions, the thing will calculate forever, theoretically, and the only limitation is the math. The mechanical parts are so reliable that they rarely, if ever, fail.

Currently, the mathematics a computer actually performs is limited to a few basic functions (and a few others that would confuse the unholy fuck out of you, so best left unmentioned), and these are, for all intents and purposes, the following (shown in action in the snippet just after the list):

Addition
Subtraction
Multiplication
Division
Floating point operations (decimals)
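
To make that concrete, here is the whole toolkit exercised in a few throwaway lines of REXX (a language that comes up again below); the values are arbitrary:

/* REXX -- the whole arithmetic toolkit, more or less */
a = 7
b = 2
say a + b       /* 9    addition                                   */
say a - b       /* 5    subtraction                                */
say a * b       /* 14   multiplication                             */
say a / b       /* 3.5  division                                   */
say a % b       /* 3    integer division                           */
say a // b      /* 1    remainder                                  */
say 0.1 + 0.2   /* 0.3  REXX happens to use decimal arithmetic,    */
                /*      so this "floating point" sum comes out exact */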

The basic logical system underpinning all of it is algebraic -- Boolean algebra, at bottom. So when your kids tell you they don't need to learn algebra, tell them "bullshit". If they don't know Algebra, they will one day be at the mercy of their computerized Masters. It is better to be the guy who designs the machine, and programs or improves it, than to be the slave to it -- and the future is looking a lot like electronic slavery.

That means Algebra is vital to your kid's future survival!

Even people in these fields are often complete idiots, bereft of basic math skills in the traditional sense, and largely allowed, while in training, to use calculators instead of brain power, so that in the process of learning how to control the machine they end up dependent upon it to a greater or lesser extent, depending on their own relative intellectual firepower.

The basic premises of programming are just as limited. Without getting too far into the weeds, the basic building block of every modern programming language is the simple IF/THEN/ELSE test, usually run over and over inside a loop. Essentially, it goes something like this:

IF (required condition is met)
   THEN (perform function based upon condition)
        ELSE (do nothing or perform another function)

And that's it. It doesn't matter whether we're talking about any of the C-variant languages, REXX, Python, Java, whatever: they all use this formulation, and the differences are often minor, more-nuanced, or biased towards particular functions.
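
Since REXX just got name-checked, here is that same skeleton as an actual, runnable snippet; the condition and the messages are invented for illustration:

/* REXX -- the IF/THEN/ELSE skeleton, with a made-up condition */
balance = 150
limit   = 100

if balance > limit then                 /* required condition is met   */
   say 'Flag the account: over the limit by' balance - limit
else                                    /* condition is not met        */
   say 'Do nothing; the balance is within the limit'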

Now, this is a great thing in the sense that, the formulation being so simple, the machine can perform billions of these little loops before the proverbial wink of the eye completes. The machines are fast enough to repeat this cycle endlessly. The only limitation is whether the human is capable of anticipating every possible contingency related to each loop. That is, how long can you keep the process going before you hit the inevitable error that brings the thing to a halt?

In the Ur days of massive data processing complexes, this was a problem. Every time you brought something to a standstill with an error it meant that you wasted resources -- labor, electricity and air conditioning, expensive processing time -- to produce garbage. The first step in Automation was to identify the error and correct it, if possible, before it became a problem.
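
What that first step looked like, in rough REXX terms: trap the failure, log it, and keep the shop running. This is only a sketch -- the bad record is manufactured, and the "log and carry on" stands in for whatever recovery a real shop would actually do:

/* REXX -- catch the error instead of letting the whole run die */
signal on syntax                     /* arm the trap                */

record = 'ABC'                       /* garbage where a number      */
total  = record + 1                  /* should be: this blows up    */
say 'This line is never reached'
exit 0

syntax:                              /* the trap itself             */
   say 'Error' rc 'at line' sigl':' sourceline(sigl)
   say 'Logged; move on to the next record instead of halting'
   exit 8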

But then we run into the anticipation problem, again. Considering the speed of the machines and the amount of data they could process, you still couldn't plan for every contingency.

The next step was to verify data and function BEFORE processing, finding the boo-boos before they became bleeding, sucking electronic chest wounds. This is why, for example, anything that runs code pushes it through a compiler or interpreter that checks that the syntax is correct (and kicks bad code out before it runs), and why regimes of data verification were implemented to confirm that data was a) good, b) available, and c) capable of being better managed.
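
In practice, "verify before you process" amounted to code like this -- a minimal REXX sketch, with the field names and the checks invented for illustration:

/* REXX -- check the data BEFORE doing anything with it       */
parse arg acct qty price .           /* e.g. "X1234 100 41.50"  */

if datatype(qty, 'W') = 0 then do    /* must be a whole number  */
   say 'Reject: quantity "'qty'" is not a whole number'
   exit 8
end

if datatype(price, 'N') = 0 then do  /* must be numeric at all  */
   say 'Reject: price "'price'" is not a number'
   exit 8
end

if price <= 0 then do                /* and must make sense     */
   say 'Reject: price must be greater than zero'
   exit 8
end

say 'Accepted: account' acct',' qty 'shares @' price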

The most-recent step in Automation is the idea that a computer can be programmed to emulate a human brain: to recognize mistakes, store that data so that it "remembers" that something is a mistake, and then add programming to either correct the error or find some other workaround to avoid it entirely. In other words, to get a computer to "learn" and "think abstractly" independent of human intervention.

Which is also bullshit because there is always human intervention; a human programmed it, after all, and a human will correct the process when it -- invariably -- fails.
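
To show just how un-mystical the "remembering" is, here is a deliberately crude REXX sketch: a failure table a human decided to keep, consulted by logic a human wrote. The inputs and the "workaround" are invented:

/* REXX -- "machine learning," minus the mystique               */
failed. = 0                          /* our entire "memory"      */

do i = 1 to 5
   item = word('10 25 oops 40 oops', i)

   if failed.item then do            /* "remembered" failure     */
      say item': skipping, known bad (the "workaround")'
      iterate
   end

   if datatype(item, 'N') = 0 then do
      failed.item = 1                /* the machine "learns"     */
      say item': failed; noting it for next time'
      iterate
   end

   say item': processed, result =' item * 2
end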

When someone claims a "leap forward in machine learning" what you usually find is not so much a technological advance so much as a contingency that wasn't planned for, and therefore, was unexpected, and the machine found a newer, more-esoteric way to fuck it up in an interesting way.

In other words, you got the result you programmed for, Dickhead: you just didn't know your code was shit. The only "breakthrough" was that we discovered you're a human who makes mistakes, too, and that was probably news to you.

Which brings us to the next level of this stupidity, which is "Machine Learning". Machines do not "learn" -- they do what they're told (programming). They only have access to information that is given to them by humans, or which can be gleaned by various sorts of sensors, all designed by humans.

And because Humans have done this stuff, they invariably build biases -- often unconsciously -- into how the machine performs. Give me ten human programmers, give them all the same problem to solve, and you'll probably get ten different solutions. Which one is the "correct" one? Well, you might say "the one that provides the right answer", but that's not always exactly true.

(Which is why most programming languages have a symbol for "not strictly equal to"; in REXX, for example, it is "/==".)
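
If that parenthetical means nothing to you, here is the distinction in a throwaway REXX snippet -- "equal enough" versus an exact match:

/* REXX -- "equal" versus "strictly equal"                       */
a = ' 42 '                   /* the same number, extra blanks     */
b = '42'
say (a = b)                  /* 1 : ordinary compare trims blanks */
                             /*     and compares numerically      */
say (a == b)                 /* 0 : strict compare wants an exact */
                             /*     character-for-character match */
say (a \== b)                /* 1 : "not strictly equal" -- also  */
                             /*     written /== or ¬== on some    */
                             /*     systems                       */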

The world is rapidly filling up with badly-programmed and poorly-understood Artificial Intelligence regimes. Ask yourself the following questions, and see if you can discern the problems:

What happens when a medical AI is programmed by a Man? Do Women not have different medical issues? Do they not have different attitudes towards certain health conditions? And when assigning priorities, say, does an AI programmed by a female assign the same priorities to Male health problems?

What happens when the AI is confronted by something it cannot understand, such as a regional dialect or foreign languages?

What about context? How often have you, or someone you know, been banned from social media for using a word or posting a picture that the bot has been programmed to seek and destroy, when the bot cannot comprehend the context in which the words or images were used?

What happens when the person who programmed the bot is unable to understand context -- such as when, say, an Indian programmer working for Facenazi on an H-1B visa is unfamiliar with American culture, humor, and idiom (English may not even be her first language), and so programs the bot to ignore context altogether, focusing solely upon a list of actual words (probably given to her by management) or images, and not their more-esoteric meanings or the intentions of the poster?

What do you end up getting? 

In many cases, censorship. Interfaces that don't work as they're supposed to. Interfaces that are far from "user-friendly", that cause aggravation, dismay, and anger, and that, far from making life easier for anyone, in fact make many aspects of it increasingly difficult.

As another example, you could not cheat in an election to the extent that was obvious in 2020 without computers, and without taking into account -- and manipulating -- their weaknesses. As, for example, when someone -- allegedly -- re-programmed selected voting machines in selected voting precincts to produce tainted results that could not be easily audited by visual inspection, or when a machine simply scanned ballots and recorded votes without noticing it was repeatedly scanning and recording THE SAME BALLOTS, because no one programmed it to verify any identification data -- or, worse, someone told it not to even look for verification data.

So, when I sit through one of these "symposiums" and, worse, have to act as a translator for the person who dragged me there, I enter with a certain bias all of my own. And when you listen carefully enough -- when the keynote fucktard finally spews his word vomit upon people who would rather be someplace else, after three hours of sales pitches for "new" software that is already available "off the shelf", so to speak, but is supposedly better because "not Microsoft" (or some similarly silly and usually emotional reason) -- you have to laugh.

Because if you didn't, you might get out of your chair and smack one of these egg-headed, egotistical, pig-ignorant assholes over the head with it.

They don't realize what they're doing. Mostly because they're not interested in anything but a result that makes them look smarter than they are.

What effect they have upon Humanity is unimportant, since they've accomplished something "neat" that makes other geeks jealous. The reward for this questionable accomplishment is to get your name slapped on a poorly-ghost-written book, a few interviews in the trades that might help get you laid at Comic-Con, and the applause of MBA's who are always on the hunt for "better, faster, cheaper" but who have no idea how "better, faster, cheaper" applies to a field of endeavor they don't know the first thing about.

So, at the end of five-plus solid hours of geekspeak, the forced enthusiasm of salespeople, the gauntlet of "keynote speakers" who use a lot of words to say nothing, "featured guests" who are supposedly "giants of the industry" who probably showed up because someone paid them to in order to draw paying customers, and the need to stifle both my gag and defecation reflexes, my friend asks me on the way out:

"So, what do you think?"

"I could explain what a waste of time that was, but your hairpiece would probably drop off and crawl away from sheer boredom."

"That's not funny. And it's a full transplant, Asshole."

"Tell you what: since what I have to say requires quite a bit of explanation, and may be conditioned by your actual needs -- which I have to examine in greater detail -- how about you tell your boss that I'll type up an extensive report for you to peruse, in, say, two weeks?"


He calls his boss, the answer is "yes", and another check comes my way. And I doubt my opinion will change, because I wrote the fucking book -- metaphorically -- on IBM z/OS mainframe operating systems, Buddy, and not much of this applied.

Hey, if you were willing to waste a couple of grand to attend this farce of fucktard, what's a few thousand more coming my way?

My decision to leave the industry was apparently the right one. Those who remain are usually not very bright.


6 comments:

mtness said...

Ah, your musings resonate with me...
Natural Stupidity beats Artificial Intelligence, every time.

Usually, when the hip ones come up with "Artificial" "Intelligence",
I remind them that at most, these are mere "Expert Systems"
[there is an excellent article on this subject somewhere]

So I ask them what the first answer of a "true" AI would be -
(blank stares, no answer)

It will be: "No!"

---

As every parent knows, this is the result one gets most often when training "Natural Intelligences"

Cheers,

MT

Matthew Noto said...

Given the recent experience of "expertise" in action the last few years, one is sure to come to the conclusion that computing by "expert systems" will, eventually, create the same result as governance by "expert".

The problem on the human end is that the expert is like the carpenter who only has a hammer in his toolbox, and so treats everything like a nail.

In the case of a series of "expert" systems, what happens when they start to contradict one another because the "experts" who programmed them didn't take into account what the other "experts" in related fields have to say on any particular problem?

I was once of the opinion that automating everything was a good idea, and so long as we were only talking about things that are ultimately clerical tasks (like keeping accounts or acting as an interface for trade), perhaps it was. The deeper I got into this sort of work, and the more I came into contact with the "Mad Scientist" type who usually does it (because he can't get laid), the more I began to question its true value and its deleterious effects.

If there were an equivalent to the "Go Organic!" movement in computing, I might jump on that bandwagon, as the results of the opposite have been too plain to ignore.

The Disgruntled Engineer said...

Time for the Butlerian Jihad I think.

“The mind of man is holy” - Manford Torondo

Matthew Noto said...

Sammy B agrees. I've just had my séance bot ask him. ;)

JB_Honeydew said...

Dammit man, I keep telling you to stay out of my head. I have been repeating this sentiment, not verbatim mind you, for years now. I put my own spin on it to make it more palatable to individuals who are less "plugged in" than I. I will admit that I have never worked in your industry, but I have always understood how the "magical" computer operates. Hell, I don't even know any of the languages (outside of BASIC), but something just resonated in me that made me "keep up" with things to the best of my ability and choosing.

I chose not to pursue IT as a career mainly because I liked the outdoors, sports, beer, drugs and all sorts of other fun things. I did not relish the idea of being completely isolated, as most true geeks are. I do however read a lot of articles, studies, etc. in an attempt to keep up. Admittedly, it may take me a little while longer to complete an article, as I am notorious for stopping a reading to go and look something up (term or concept) that I do not understand in order to avoid the article being just words on a page that I can say I read.

I'm jabbering now. Thank you for conveying what I have been telling my friends, colleagues, and general assholes on the street for years now. Your vocabulary on the matter is much better than mine. Cheers.

Best to you and Mrs. Overlord (she still doing well?)

- JB

Matthew Noto said...

She's just fine, and ornery enough to start putting me in my grave, Sir.

(That was sarcasm)

Thank you for asking.

All of this stuff put me in a nostalgic frame of mind (I have not touched a mainframe system for years now), and so today's offering will probably fascinate you, as I rediscover the "romance" that used to attach itself to the more nuts-and-bolts (as opposed to conceptual) aspects of IT. The ways and means by which we USED to do things were far more interesting, and often more exciting, than what we do now -- because back in the 1980's a lot of this stuff was "seat of your pants" flying, the lowly operator had to be a jack-of-all-trades, and the knowledge was almost tribal in nature, passed on by word-of-mouth, trial-and-error, and example from generation to generation, there being no specialized "schools" at the time to teach people this stuff.

You could go to college to learn "Computer Science" or become a "Computer Programmer", but everything was still in a very primitive state.