[vcf-midatlantic] efficiency, put another way

Herb Johnson hjohnson at retrotechnology.info
Wed Jun 8 11:29:53 EDT 2016

In the context of the previous discussion - which was about gleaning any 
values from, or interests today in, vintage computing - my strategy is 
to say that personal computing and minicomputing of the 1970's and into 
the 1980's was a *resource poor* environment. "Efficiency" in using 
scarce resources is a necessity, to get something done. As gains were 
made in memory, processor complexity, speed and storage capacity, these 
resources were fairly quickly maxed out - and then improved again, and 
again. Some of this stuff was crude, some of it was well crafted in the 
McGuire sense.

But at some point - certainly over a decade ago, probably more - 
computing for most purposes, for most people, provided more than enough 
resources. Processor speed topped off (any faster and they'd cook). 
Memory was cheap. Storage was cheap. Even the Internet became cheap and 
common. There are powerful languages and environments for various domains.

What became expensive? *People.* Programmer time, engineering time, 
production time. Craft is not cheap. Even time itself has become an 
expense. I call the 21st century a *resource rich* period of "computing" 
- it's odd to talk about computing now; it's like telling fish about 
water, it's just part of living. My point: it's hard to conserve when 
there's an (apparent) abundance.

Resource-poor and people-cheap, to resource-rich but people-expensive.

Bringing it back to vintage computing.....

So one value of preserving vintage computing is to remind us, by example 
and fact, about the issues of "efficiency" from a time when we had no 
choice but to be efficient. The tools and methods from then can, in 
principle, be applied now.

Why bother? For the reasons that Jonathan noted: there are times and 
situations where convenient inefficiency bites us in the ass. Or, as 
Dave McGuire noted, as a matter of quality. Current resources can be 
overtaxed when misused. Old-school methods and quick fixes don't work 
well when scaled too large. Smart tools used by the less-informed 
produce unscalable results. And so on.

Resource-poor to resource-rich; and now to maxed-out or inefficient. 
That's the argument.

So we in vintage computing may be able to offer some perspective on how 
to solve problems of excessive, or at least inefficient, use of resources.

Or not....smarter software may be smarter than the programmers, and 
avoid the worst of these problems. Compared to the 1970's, we now have 
computers to program computers, and they do it better than the worst 
engineers and programmers. That is annoying to those of us who take 
pride in being smarter and more efficient. Also: old stuff is clunky, 
and boring to many. Those are the counter-arguments.

Herb Johnson

Herbert R. Johnson,  New Jersey USA
http://www.retrotechnology.com OR .net
