[vcf-midatlantic] examples of non-crystal based clocks in digital computing?
Herb Johnson
hjohnson at retrotechnology.info
Mon Mar 14 13:30:34 EDT 2016
> Ben Greenfield posted:
>
> I have one simple specific question, without a simple answer
> and general call for examples.
>
> I’m researching timing clocks used in general purpose digital
> computing. I’m starting to think that Ace/Pilot Ace by Alan Turing
> is the start of clock based general purpose computers. I’m starting
> to conclude the earlier computers were all basically clocked
> by the media holding the instructions.
My apologies, but I'm a degreed electrical engineer who first studied
computers as digital logic devices and circuits in the mid-1970s. My
response is to refute the premises.
I don't think the question is simple or specific. It's kind-of specific,
but "timing clocks" and "general purpose computers" are not well-defined
terms in very old digital computing (since Turing is mentioned). I don't
think general examples covering several decades of computing will
provide "simple" examples and answers. I have a tough time figuring out
pre-1970s minicomputers (much less pre-1960s mainframes) from my Intel
8080 point of view.
Digital computers from the decades before, say, integrated-circuit
construction came in all kinds of designs, and they differ in
fundamental ways. In particular, some operate under a common timing or
clock signal and some do not, BUT the latter may still have features
that are repetitive. A common clock may or may not be needed.
Terms like "synchronous", "asynchronous" and "sequential" are part
of the world of computing theory. Things happening one after another are
sequential, but may not be synchronous.
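To make the distinction concrete, here's a toy sketch in Python - my
illustration with made-up class names, not any real machine's logic: a
register that changes only on a shared clock tick, versus a latch that
changes the instant its input does. Both are sequential in the sense of
doing one thing after another; only the first is synchronous.

class SyncRegister:
    """Updates only when step() is called -- i.e., on the common clock."""
    def __init__(self):
        self.q = 0          # stored value
        self.d = 0          # pending input
    def step(self):         # one clock tick
        self.q = self.d

class AsyncLatch:
    """Updates the moment its input changes -- no clock involved."""
    def __init__(self):
        self.q = 0
    def set(self, value):
        self.q = value      # takes effect immediately

reg = SyncRegister()
latch = AsyncLatch()
reg.d = 1                   # the same input change arrives at both
latch.set(1)
print(reg.q, latch.q)       # 0 1 -- the register waits for the clock
reg.step()                  # the clock edge arrives
print(reg.q, latch.q)       # 1 1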
Some drum-based computers, to my knowledge, included their computing
registers *on the drum*. One transistor model of the 1960s that I know
from ownership had several copies of its registers on the same tracks -
why? For faster access. Was there also a clock signal ultimately tied to
the drum rotation rate? I'd have to check the manuals, but the drum did
much of the work anyway.
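Here's a back-of-the-envelope Python sketch of the "faster access"
point, with made-up numbers - a hypothetical 3600 RPM drum, not any
particular machine - showing how extra copies of a register around a
track cut the average wait for it to pass under the read head.

DRUM_RPM = 3600                     # hypothetical rotation rate
rev_time_ms = 60_000 / DRUM_RPM     # one revolution in milliseconds

for copies in (1, 2, 4, 8):
    # with N evenly spaced copies, the worst-case wait is 1/N of a
    # revolution, and the average wait is half of that
    avg_wait_ms = rev_time_ms / copies / 2
    print(f"{copies} copies: average wait ~{avg_wait_ms:.2f} ms")

# prints roughly 8.33, 4.17, 2.08 and 1.04 ms - each doubling of the
# copies halves the average rotational delay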
As for "general purpose" digital computers - there were all kinds of
computers several decades ago, and many were special-purpose. Maybe Ben
means "programmable" computers, I dunno.
I have little idea about what Ben is looking for. But that's OK; there's
nothing wrong with creating some assumptions and premises about ancient
computing (digital or otherwise) and seeing what one can learn based on
those premises. You have to start with what you know, or think you know.
I stumble over my own assumptions all the time.
But in my opinion, retrospective looks at very old computing run into a
problem when one takes modern designs and architectures as a "standard"
and then looks back to see where that "standard" fails or breaks down,
or when it started.
For example: for some reason the thread is talking about
crystal-controlled clocks versus not. The question wasn't about *exact*
clocking. A tuning fork in a UNIVAC at 1024 Hertz "sounds" fine to me.
A tuning fork (Wikipedia) varies by under 100 parts per million per
degree C. But a microprocessor can use a variable clock signal from an
R/C circuit and work fine. I run an 1802 microcomputer which sometimes
uses an R/C circuit clocking down to audio rates - programs run just
fine, not precise in time, but slow (and at very low power consumption,
microwatts and milliwatts).
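Some rough arithmetic in Python to put numbers on that - the
parts-per-million figure is from above; the R/C tolerances and the
nominal rates are typical values I'm assuming, not measurements:

fork_hz = 1024                 # the tuning-fork rate mentioned above
ppm_per_degC = 100             # "under 100 parts per million per degree C"
delta_t = 10                   # suppose a 10 degree C swing

drift_hz = fork_hz * ppm_per_degC * 1e-6 * delta_t
print(f"tuning fork drift over {delta_t} C: ~{drift_hz:.2f} Hz "
      f"({ppm_per_degC * delta_t / 1e4:.2f}% of nominal)")   # ~1 Hz, ~0.1%

# An R/C oscillator's frequency goes roughly as 1/(R*C), so ordinary
# 5% resistor and 10% capacitor tolerances shift it by 10-20%.
nominal_hz = 4000              # hypothetical audio-rate clock for the 1802
r_tol, c_tol = 0.05, 0.10      # assumed component tolerances
spread_hz = nominal_hz / ((1 - r_tol) * (1 - c_tol)) - nominal_hz
print(f"R/C clock spread: up to ~{spread_hz:.0f} Hz above nominal "
      f"-- the program still runs, just not at a precise speed")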
I prefer to examine ancient computers on their own merits, to look at
their features on their own terms and on terms immediately contemporary
to their times. As a digital engineer I can compare and contrast
circuits and learn about benefits and limitations, but only in the
context of intended use can one pass judgements. And of course,
engineers (and techies) often disagree about such judgements.
So: it's an interesting question, when put in a general way: what kinds
of ancient (or recent) computers did not depend on a common and
well-controlled clock signal to synchronize their operations? And how
did some operate without it? But for all I know, Ben is asking when
computers "first" used a synchronous clock signal for all their
computing circuits (whatever that means). I'm not a big fan of "firsts";
in any ranking something is always "first", so what?
Herb Johnson
vintage computing scold
--
Herbert R. Johnson, New Jersey USA
http://www.retrotechnology.com OR .net