"One is always at home in one's past..." - Vladimir Nabokov, "Speak, Memory"
As I was saying a few days past, I find myself engaged in a project which is near and dear, so to speak.
I'm beginning to feel like Michael Corleone: just when I thought I was out, they pull me back in.
A friend who works for a mainframe time-sharing service asked me to do him a solid and look into the possibilities and benefits that Artificial Intelligence might bring to his employer. My own background in this field is that for nearly 25 years I have found gainful (and often interesting) employment in the field of Automation Programming.
Automation Programming, in a nutshell, is simply a fancy term for "find out how to do this without people", i.e. to attempt to apply technological means to any particular task, so as to either a) free people from the more-mundane aspects of their work, allowing them to concentrate on other, more-remunerative/productive tasks, or b) find a way to make the processes autonomous so that the people who used to do them become redundant, and, therefore, expendable.
There's some more esoteric options within that range of choices, but I'd only bore you to insanity by examining them in detail.
Now, as to what a "time share" is in computer terms: A company builds itself a data center, consisting of all the very expensive infrastructure this entails, and then leases the computing capacity to clients who have need of such services, but who can neither afford, nor wish to expend the funds to acquire them for themselves. The client is essentially charged for the amount of resources that are used in order to process whatever work they need done by the system (for example, how much CPU time did they utilize, how much storage, how many removable media -- tape, for example -- what specific support services, what type and how many communications arrangements, and so forth).
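To make the billing model concrete, here's a toy sketch in Python. Every rate, resource name, and usage figure below is invented for illustration, not anything from an actual time-sharing tariff; the point is just that the client is metered per unit of each resource consumed.

```python
# Toy sketch of a time-sharing billing model: the client pays per unit
# of each metered resource. All rates and usage figures are made up.

RATES = {
    "cpu_seconds": 0.05,      # charge per second of CPU time used
    "storage_gb_days": 0.02,  # charge per gigabyte-day of storage
    "tape_mounts": 1.50,      # charge per removable-media (tape) mount
    "comm_lines": 25.00,      # flat charge per communications line
}

def invoice(usage):
    """Sum the charges for each metered resource the client consumed."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

# One hypothetical client's usage for the month:
monthly_usage = {
    "cpu_seconds": 12_000,
    "storage_gb_days": 3_000,
    "tape_mounts": 40,
    "comm_lines": 2,
}

total = invoice(monthly_usage)
print(f"${total:.2f}")  # 600 + 60 + 60 + 50 = $770.00
```

Real mainframe billing is far more elaborate (peak-hour multipliers, service-unit formulas, support tiers), but the principle is the same: the provider's profit is whatever margin survives after the data center's own costs.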
Multiple clients will be utilizing this capacity, simultaneously, and in theory, the company providing the service not only makes enough money leasing this capacity, but also profits on it. However, margins can be thin, as the evolution of technology has made powerful computing systems more-affordable than ever before, and the need for such services has lessened. Its main selling point -- it's cheaper than buying your own -- is about the only one that still sticks, and so anyone engaged in this sort of business has to find their "edge" elsewhere, usually in offering superior services or support.
Which is difficult to do, nowadays, because the people who used to do this sort of thing are getting fewer and further between, and the younger generation is getting dumber with shorter attention spans and far-less technical knowledge (they don't teach any of that in school, you see) with the passage of years and advances in automation.
The idea here is to make the service cheaper and more-efficient, so that customers can be offered lower prices in return for longer contracts. And so one of the biggest expenses left -- because the machines are so efficient, and the software is pretty damned good -- that can be trimmed is personnel.
Traditional automation, automation in general, only gets you so far. There's only so much code that can be applied in a practical manner. There are still many -- usually minor -- functions that the computer cannot perform. This is where the brains trust that is assumed to be "in charge" (trust me, the systems run YOU, always) starts to look to the equivalent of magic for solutions. Thus, they start investigating the possibilities of applying "Artificial Intelligence", assuming this to be a panacea.
Because "the industry" says it is.
In practice, it's not all that it is cracked up to be. Primarily because for a regime that is supposed to be independent of humans, it still depends upon them an awful lot. Humans are imperfect, and therefore, the results are not always as expected.
So, to date, I have gone to a symposium to explore and learn about some of the "latest and greatest", since I haven't really paid attention to this stuff for about a year (a lot happens in that time frame) and I'm in the process of studying this particular client's modus operandi -- what they do, how they do it, etc., etc. -- so that I can make reasonably well-educated suggestions, yea or nay, vis-a-vis HAL 9000 being the "right" option for them.
Except that I'm discovering that what this client wants to do is a) nearly impossible, b) beyond my capacity, c) misdirected, as AI is not really applicable here, IMO, and d) probably best left undone, or, alternately, done by people who have the discipline and smarts to just do it.
They'll need to find someone else to do it. No way I could do this on my own, and certainly not in the timeframe they're requiring.
Essentially, the issue is a very simple one, but no one is devoting the time necessary to adequately study it because the assumed, pre-determined result reflects poorly on those who aren't studying or doing anything about it. The problem is what are known as "legacy systems", which are older systems and applications which manage to survive mostly because no one wants to do the work or bear the expense of replacing/updating them, or because the few clients who still use them can kick up sufficient fuss to prevent a replacement/upgrade.
The usual reason is "because we've done it this way forever, and it works". Which is perfectly valid on its face, but consider: a million-dollars-plus-per-year piece of ancient shit that only gets used a few times a month, or even a year, sticking around because someone's job security is at stake, or because there is resistance to necessary change, is a total waste of resources.
These "legacy systems" then manage to stick around for years past the point where they're truly useful and become more-expensive to maintain because the software or hardware they require is older than dirt, no longer supported by the original manufacturer, and has to be serviced by third-party vendors who charge outrageous rates (because "old"), or by overpaid consultants who can still write COBOL, or something.
And worse, they seem to multiply as the years go on, so that the problem of replacing them or upgrading them becomes far-more complex than it needs to be, and oddly enough, this typically turns out to be a political problem on top of a strictly managerial one.
WHO gets priority? WHO makes that decision? WHO allocates the resources? WHO gets the responsibility (that they'd rather avoid)?
(You'd be shocked, despite all the Y2K fearmongering, just how much COBOL is still out there, for example. I know (it's hardly an industry secret, and recently COVID made it obvious, again) that the entire state of New Jersey's Unemployment system is still running on COBOL apps, and I wouldn't doubt many other -- usually government -- systems are still COBOL-dependent).
In this case, rather than lose customers at the margins by just declaring their legacy stuff DOA and replacing it against the client's wishes (because that would make sense and take balls), someone got the wild hair up their ass that the solution may lie in AI.
This someone, I think, reads too much Sci-Fi.
I only learned of this delusion a few days ago, because none of this was making sense to me, and so I asked the right questions -- and got moronic answers. It's not a problem in search of a solution: it's a bunch of people who got lazy expressing a pious wish, a real Hail Mary, if ever there was one.
(This is probably how I got asked to look at this, in the first place).
Bottom line: no one wants to do the work and no one wants the responsibility of telling a stubborn client or cranky consultant to fuck off, or of risking failure in a part of the industry that is already getting fucked without lube thanks to growing electric bills -- mainframes and their associated hardware use huge amounts of electrical power, after all. I'm beginning to suspect that this, more than anything, is the origin of this fantastically dumb idea, and rather than tell the truth, people are throwing all sorts of wild-ass ideas at the wall hoping one will stick.
On Monday morning, I'll be sending a report that states I'm tapping out.
In the meantime, I got to take a tour of the facility and it was as if I was transported back in time to when I first saw a mainframe in 1985.
The romance is still alive.
If all those IBM machines could cook, I'd marry them.
And so it appears that all this time I've spent trying to find "a new direction" in life was wasted. I'm a Big Iron addict and the jones is still overwhelming.