Replying and elaborating on Herb Johnson's postings re: 4004 Based Microcomputer
This touches upon several of my pet peeves, so please pardon the way my reply drones on and on and on until I start foaming at the mouth and falling over backwards ... Oh! https://youtu.be/DO7VkEFZ7B8 [Monty Python skit]

Herb touches upon how our expectations of "computers" have evolved. To folks outside the engineering profession, "embedded processors", "industrial controllers" and other special-purpose machines don't have the look-and-feel of what's now accepted as a "computer". Yet the Arduino has made such embedded processing more accessible than ever to the hobbyist, experimenter, artist, etc.

Perhaps that applies here: what were once called "computers" are now considered embedded processors, IoT (Internet of Things) devices or Internet appliances.

I was about to say that cellphones are more of an appliance than a computer, given the way most folks just download apps. They're not self-hosting: a host system is required to compile the code. But then again, even I use my home PC as an appliance. I rarely program or even customize it; I'm running the web browser, text editor, moving files, etc.

Having salvaged lotsa vintage electronics, I've seen the insides of lotsa equipment. I've subscribed to Circuit Cellar from the beginning; they were an early advocate of single-chip microcontrollers. These chips come to mind when thinking of embedded processors (in chronological order):

1) https://en.wikipedia.org/wiki/Intel_4004 1971: the Intel 4004. I encountered one deep inside a Calcomp DS12 hard drive controller. See page 55: http://www.bitsavers.org/magazines/Datamation/19710801.pdf

[AND FOR THE RECORD: the 4004 was the first _commercially_available_ microprocessor. The military was first with the MP944 chip set for the US Navy's F-14 Tomcat fighter's CADC: completed June 1970.
https://firstmicroprocessor.com/
https://www.computerhistory.org/siliconengine/microprocessor-integrates-cpu-...
https://en.wikipedia.org/wiki/F-14_CADC
https://www.wired.com/story/secret-history-of-the-first-microprocessor-f-14/ ]

2) https://en.wikipedia.org/wiki/Intel_MCS-48 1976: the Intel MCS-48 family of single-chip microcontrollers. The IBM PC used one as the keyboard controller (allowing a thin coiled cord and serial link instead of a ribbon cable and parallel interface).

3) https://en.wikipedia.org/wiki/PIC_microcontrollers 1976: General Instrument's PIC (Peripheral Interface Controller) evolved from ROM-only parts to flash memory, so they're easily reprogrammed and re-purposed. PIC, like PDP, is a name chosen to avoid saying "computer" or "microprocessor".

4) https://en.wikipedia.org/wiki/Zilog_Z8 1979: the Zilog Z8. Very popular with hobbyists for the piggyback ROM socket. Circuit Cellar / Micromint made many Z8 systems, such as FORTH in ROM: http://cini.classiccmp.org/pdf/MicroMint/Micromint_Z8_Forth.pdf

5) https://en.wikipedia.org/wiki/AVR_microcontrollers 1996: the Atmel AVR series of single-chip microcontrollers. Similar to the PIC; popularized by the Arduino.

The Arduino ecosystem is no longer just for experimenters and hobbyists. It's now used for industrial automation and machine controllers, replacing older embedded controllers such as Z80 and STD bus systems. See: https://www.automationdirect.com/open-source/home

What scares me is NOT the tool but how it's used. Everything looks like a nail if you have only a hammer, and screwdrivers have been abused as pry bars, wedges, paint-can openers, etc. The Arduino is just a tool.

My fear is the lack of disciplined programming, proper safety engineering practices and peer review for mission-critical devices that risk injury or death. I remember the horror of the Therac-25. Do you? https://en.wikipedia.org/wiki/Therac-25

Even Google knows to throw me relevant ads: https://www.onlogic.com/ "Ideal for IoT, Edge and AI Inferencing applications". Just like the movie Westworld, what could possibly go wrong ... go wrong ... go wrong https://en.wikipedia.org/wiki/Westworld_(film) https://en.wikipedia.org/wiki/Westworld_(TV_series)

That's what happens when robots/androids violate Isaac Asimov's "Three Laws of Robotics":

1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2) A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

and that's the way it is.

-- jeff jonas
We've identified a number of places where Arduino solutions would be great for some remote data collection and non-critical watchdog operation to let us know of a failure (a minimal sketch of that pattern follows at the end of this message). As you said, a lot of folks don't have the experience of 30+ years in industrial automation and safety requirements, and might not realize they've created a severe safety issue (such as a SIL-3 gate switch being managed by an Arduino).

About 15 years ago my coworker was trying to speed up a Cincinnati brand robot that stacked 70# commercial microwaves onto a stretch wrapper. He thought he was making a 0.3 change to the speed; he had actually changed the shape of the acceleration curve, with a final result of about 2.1 times the speed he started at. Rather than stacking the boxes, four of them were picked up and slung around, and the clamps failed to maintain sufficient friction and force to hold them. Fortunately nobody was hurt, but the four units were flung well over 100' away and slammed into a wall. We started reevaluating the safety systems on much of our older equipment and implemented much more stringent reviews relating to change management.
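For anyone curious, here's roughly the shape of what I mean by "non-critical": the Arduino only observes and reports, and never sits in the safety chain. This is a hypothetical sketch, not any product's wiring; the pin numbers, the 5-second timeout, and the assumption that the machine emits a simple "I'm alive" pulse are all invented for illustration.

// Hypothetical non-critical watchdog sketch (Arduino C++).
// It monitors a machine's heartbeat pulse and reports a fault,
// but actuates nothing safety-related.

const int HEARTBEAT_PIN = 2;           // machine pulses this pin when healthy (assumed)
const int ALARM_PIN = 13;              // indicator/annunciator only, NOT an interlock
const unsigned long TIMEOUT_MS = 5000; // declare a fault after 5 s of silence

volatile unsigned long lastBeat = 0;   // written by the ISR, read in loop()

void onHeartbeat() {
  lastBeat = millis();                 // timestamp the most recent pulse
}

void setup() {
  pinMode(HEARTBEAT_PIN, INPUT_PULLUP);
  pinMode(ALARM_PIN, OUTPUT);
  Serial.begin(9600);
  attachInterrupt(digitalPinToInterrupt(HEARTBEAT_PIN), onHeartbeat, FALLING);
  lastBeat = millis();                 // don't alarm before the first pulse arrives
}

void loop() {
  noInterrupts();                      // read the multi-byte timestamp atomically
  unsigned long beat = lastBeat;
  interrupts();

  bool fault = (millis() - beat) > TIMEOUT_MS;  // unsigned math survives millis() rollover
  digitalWrite(ALARM_PIN, fault ? HIGH : LOW);
  if (fault) {
    Serial.println("WATCHDOG: no heartbeat -- notify maintenance");
  }
  delay(250);
}

The point of the design is that the Arduino can fail without making the machine any less safe: the alarm output drives a lamp or a notification, never a gate, brake or contactor.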
I get it, and I appreciate the discussion. I have a full-blown computer that does nothing other than play games.

Our print site has Raspberry Pis acting as PLCs, because the original PLCs are no longer made and someone wrote software to do the job (see the sketch at the end of this message).

The original Micral was made for industry; in fact, Warner & Swasey licensed it and built them in the US. So I think this will be an interesting discussion.
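For anyone who hasn't seen it done, the software that stands in for a PLC is conceptually just a fixed scan cycle: read all inputs, solve the logic, write all outputs, repeat. Here's a minimal sketch of that pattern; the I/O functions are stubs (a real Pi build would wire them to a GPIO library such as libgpiod), and the start/stop/jam logic is invented for illustration, not what our print site actually runs.

// Hypothetical PLC-style scan loop in C++, the pattern a Raspberry Pi
// (or any small computer) runs when standing in for a discrete PLC.
#include <chrono>
#include <thread>

// Stub I/O for illustration; replace with real GPIO calls.
bool readInput(int /*channel*/) { return false; }
void writeOutput(int /*channel*/, bool /*on*/) {}

int main() {
    using namespace std::chrono;
    const auto SCAN_PERIOD = milliseconds(10);   // fixed scan time, like a real PLC

    while (true) {
        auto scanStart = steady_clock::now();

        // 1) Input scan: latch every input once per cycle
        bool startButton = readInput(0);
        bool stopButton  = readInput(1);
        bool jamSensor   = readInput(2);

        // 2) Logic solve: the "ladder" evaluated against the latched image
        static bool motorRun = false;
        if (stopButton || jamSensor) motorRun = false;   // stop always dominates
        else if (startButton)        motorRun = true;    // seal-in start

        // 3) Output scan: write every output once per cycle
        writeOutput(0, motorRun);
        writeOutput(1, jamSensor);   // jam indicator lamp

        // 4) Sleep out the remainder of the fixed scan period
        std::this_thread::sleep_until(scanStart + SCAN_PERIOD);
    }
}

The fixed scan period is the part that matters: all the logic sees one consistent snapshot of the inputs per cycle, and the timing stays predictable, which is exactly the behavior the old PLC gave you.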
You forgot the Zeroth Law: a robot may not harm humanity, or, by inaction, allow humanity to come to harm. And with the advent of AI we may need to add Dilov's Fourth Law: a robot must establish its identity as a robot in all cases.
At what point is an AI no longer a robot, or at what point does a robot no longer fit the assumed description of a device that performs programmed tasks? At what point would you consider them an engineered intelligence?
participants (4)
- Brian Marstella
- Christian Liendo
- Dean Notarnicola
- Jeffrey Jonas