In the context of the previous discussion - which was about gleaning any values from, or interests today in, vintage computing - my strategy is to say that personal computing and minicomputing of the 1970's and into the 1980's was a *resource poor* environment. "Efficiency" in using scarce resources was a necessity to get anything done. As gains were made in memory, processor complexity, speed, and storage capacity, those resources were fairly quickly maxed out - and then improved again, and again. Some of this stuff was crude; some of it was well crafted in the McGuire sense.

But at some point - certainly over a decade ago, probably more - computing for most purposes, for most people, provided more than enough resources. Processor speed topped off (any faster and they'd cook). Memory was cheap. Storage was cheap. Even the Internet became cheap and common. There are powerful languages and environments for various domains. What became expensive? *People.* Programmer time, engineering time, production time. Craft is not cheap. Even time itself has become an expense. I call the 21st century a *resource rich* period of "computing" - though it's odd to talk about computing now; it's like telling fish about water, it's just part of living. My point: it's hard to conserve when there's an (apparent) abundance. We went from resource-poor and people-cheap, to resource-rich but people-expensive.

Bringing it back to vintage computing: one value of preserving vintage computing is to remind us, by example and fact, about the issues of "efficiency", from a time when we had no choice but to be efficient. The tools and methods from then can be applied in principle now. Why bother? For the reasons that Jonathan noted: there are times and situations where convenient inefficiency bites us in the ass. Or, as Dave McGuire noted, as a matter of quality. Current resources can be overtaxed when misused. Old-school methods and quick fixes don't work well when scaled too large. Smart tools used by the less-informed produce unscalable results. And so on. Resource-poor to resource-rich, and now to maxed-out or inefficient. That's the argument. So we in vintage computing may be able to offer some perspective on how to solve problems of excessive, or at least inefficient, use of resources.

Or not: smarter software may be smarter than the programmers, and avoid the worst of these problems. Compared to the 1970's, we now have computers to program computers, and they do it better than the worst engineers and programmers. This is annoying to those of us who take pride in being smarter and more efficient. Also: old stuff is clunky and boring to many. Those are the counter-arguments.

Herb Johnson

--
Herbert R. Johnson, New Jersey USA
http://www.retrotechnology.com OR .net
On 06/08/2016 12:18 PM, Dan Roganti via vcf-midatlantic wrote:
I think the term "resource-poor" for describing the 60s, 70s, or maybe the 80s is inaccurate. It is relative, depending on your viewpoint - if you keep looking from the 21st century viewpoint. But if you were there in the thick of it back then, it was actually the norm. So you were expected to design with those constraints. Constraints are a fact of life in engineering.
Dave McGuire:
I agree 100%. Many of us were there (myself for the 70s and 80s at least), and we never thought of our computers as resource-poor. It was more like "Wow, this system has 64K of RAM! SOOO much more than the last one, look at how much more I can do!"
I disagree 57.25%. ;) This is a marginal argument, not either-or. In your own statement, you are looking retrospectively at a time of less resource. As am I.

But I was there too in the 1970's. Of course we did not think of ourselves as entirely "resource poor". But we knew that in months or a year, the high-integration parts we were using - memory, RAM, etc. - would be cheaper and have "more". We were not blind to the future. Proof? The common catchphrase of that time, about buying more memory or a card or chips for your microcomputer, was "you have to decide when you are going to waste your money" - because the NEXT product would have those chips or features. But we had to get things done, and waiting would not get things done; so we bought and did good work with what we had, and upgraded later if we could. And of course, in retrospect, with more resources in hand we look back at past microcomputing as limited in part by the resources of the time. Looking at the past is retrospective by definition.

Looking back and comparing is not, for me, an exercise in "what sucked back then". I work hard to show what and how things worked - because I and others are making them work AGAIN, now. But it can be hard for others to get this today. This is not easy; Dave reported one failure - not his fault of course, it's an instance. And that suggests why we as vintage computerists are challenged, as we explain our past, today.

I'm trying to show some support for Dave McGuire's position, from the vintage perspective - scarcer resources required more discipline, less slop, more effort, etc. If Dave wants to argue for those qualities from first principles, more power to him. These are allied but different arguments; this is Dave's thread, not mine.

Herb Johnson

--
Herbert R. Johnson, New Jersey USA
http://www.retrotechnology.com OR .net
preservation of 1970's computing
email: hjohnson AAT retrotechnology DOTT com
alternate: herbjohnson ATT retrotechnology DOTT info
On Wed, Jun 8, 2016 at 5:09 PM, Herb Johnson via vcf-midatlantic <vcf-midatlantic@lists.vintagecomputerfederation.org> wrote:
I disagree 57.25%. ;) This is a marginal argument, not either-or. In your own statement, you are looking retrospectively at a time of less resource. As am I.
But I was there too in the 1970's. Of course we did not think of ourselves as entirely "resource poor". But we knew that in months or a year, the high-integration parts we were using - memory, RAM, etc. - would be cheaper and have "more". We were not blind to the future.
I think this goes without saying in any generation of technology. The biggest problem with design was hoping the bleeding-edge parts were delivered on time to meet the project schedule. People are acquainted with future technology in components, but in the real world you had to rely on beta and even more cutting-edge alpha components. Companies vied to get on the alpha list with many of the vendors - a limited, exclusive club - just so they could build their prototypes sooner and get to market faster. So by the time the hardware and software was tested and debugged, the production components would be shipping on time - hopefully with as few errata sheets as possible. :)

I've had several issues with vendor part schedules slipping: with TI, 30+ years ago, delivering their alpha components for their first bidirectional latching bus transceivers on time, all the way to the 2000's with Motorola and IBM delivering their latest alpha PowerPC processors. That was how future technology was handled - it was a delicate matter.

Dan