[vcf-midatlantic] replying and elaborating to Herb Johnson's postings re: 4004 Based Microcomputer
Jeffrey Jonas
jeffrey.scott.jonas at gmail.com
Fri Jan 12 18:47:10 UTC 2024
This touches upon several of my pet peeves,
so please pardon the way my reply drones on and on and on
until I start foaming at the mouth and falling over backwards ... Oh!
https://youtu.be/DO7VkEFZ7B8 [Monty Python skit]
Herb touches upon how our expectations of "computers" have evolved.
To folks outside the engineering profession,
"embedded processors", "industrial controllers" and other special-purpose
machines
don't look and feel like what's now accepted as a "computer".
Yet the Arduino has made such embedded processing more accessible than ever
to the hobbyist, experimenter, artist, etc.
Perhaps that applies here: what were once called "computers"
are now considered embedded processors,
IoT (Internet of Things) or Internet appliances.
I was about to say how cellphones are more of an appliance than a computer,
given the way most folks just download apps.
They're not self-hosting: a separate host system is required to compile code for them.
But then again, even I am using my home PC as an appliance.
I rarely program or even customize it.
I'm running the web browser, text editor, moving files, etc.
Having salvaged lotsa vintage electronics, I've seen the insides of lotsa
equipment.
I've subscribed to Circuit Cellar since the beginning.
They were an early advocate of single chip microcontrollers.
These chips come to mind when thinking of embedded processors
(in chronological order)
1) https://en.wikipedia.org/wiki/Intel_4004
1971: the Intel 4004.
I encountered one deep inside a Calcomp DS12 hard drive controller.
See page 55 http://www.bitsavers.org/magazines/Datamation/19710801.pdf
[AND FOR THE RECORD:
the 4004 was the first _commercially_available_ microprocessor.
The military was first with the MP944 chip set for the
US Navy's F-14 Tomcat fighter's CADC: completed June 1970.
https://firstmicroprocessor.com/
https://www.computerhistory.org/siliconengine/microprocessor-integrates-cpu-function-onto-a-single-chip/
https://en.wikipedia.org/wiki/F-14_CADC
https://www.wired.com/story/secret-history-of-the-first-microprocessor-f-14/
]
2) https://en.wikipedia.org/wiki/Intel_MCS-48
This family of single chip microcontrollers was released in 1976.
The IBM PC used them as the keyboard controller
(allowing a thin coiled cord and serial link instead of ribbon cable and
parallel interface).
3) https://en.wikipedia.org/wiki/PIC_microcontrollers
1976: General Instrument's PIC (Peripheral Interface Controller).
These evolved from ROM-only parts to flash memory, so they're easily
reprogrammed and re-purposed.
PIC, like PDP, is a name chosen to avoid saying "computer" or "microprocessor".
4) https://en.wikipedia.org/wiki/Zilog_Z8
1979: the Zilog Z8.
Very popular with hobbyists for its piggyback ROM socket.
Circuit Cellar / Micromint made many Z8 systems, such as FORTH in ROM:
http://cini.classiccmp.org/pdf/MicroMint/Micromint_Z8_Forth.pdf
5) https://en.wikipedia.org/wiki/AVR_microcontrollers
1996: Atmel AVR series of single chip microcontrollers.
Similar to the PIC, popularized by the Arduino.
The Arduino ecosystem is no longer just for experimenters and hobbyists.
It's now used for industrial automation and machine controllers,
replacing embedded processors such as the Z80 and STD bus.
See: https://www.automationdirect.com/open-source/home
What scares me is NOT the tool but how you use it.
Just as everything looks like a nail when you only have a hammer,
and screwdrivers get abused as pry bars, wedges, paint-can openers, etc.,
the Arduino is just a tool.
My fear is the lack of disciplined programming, proper safety-engineering
practices,
and peer review of mission-critical devices that risk injury or death.
I remember the horror of the Therac-25. Do you?
https://en.wikipedia.org/wiki/Therac-25
Even Google knows to throw me relevant ads:
https://www.onlogic.com/
"Ideal for IoT, Edge and AI Inferencing applications"
Just like the movie Westworld, what could possibly go wrong ... go wrong
... go wrong
https://en.wikipedia.org/wiki/Westworld_(film)
https://en.wikipedia.org/wiki/Westworld_(TV_series)
That's what happens when robots/androids violate Isaac Asimov's "Three Laws
of Robotics"
1) A robot may not injure a human being or, through inaction, allow a human
being to come to harm.
2) A robot must obey orders given it by human beings except where such
orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does
not conflict with the First or Second Law.
and that's the way it is.
-- jeff jonas