Re: [vcf-midatlantic] Tyranny of Numbers
Nice summary and history, thank you, and too easy to forget that many people today aren't aware of it.

I view it from a slightly different perspective, as a history of engineering progress - solving problems, advancing to the next plateau, chasing bottlenecks and moving on. There's a cyclical nature to the process, but the line of development, in terms of overall ability to solve problems, generally moves in a forward direction. At any given moment, the next challenge may be hardware, software, or the analytical approach. The last piece - the science part of computer science - is something that is often underexplored in popular presentations. While a multi-million-transistor CPU is mind-boggling, the mathematical complexity of the great ideas that drive us forward - from Babbage through Shannon and Turing to database structure, image analysis, compiler development, Big Data, etc. - is even more daunting.

Jack

***************************************

There are several story lines for the microcontroller/processor exhibit:

1) The Tyranny of Numbers: as computers got more complex, there were incredible problems with how to handle so many components. Thousands of tubes for ENIAC and such would not scale up. SAGE required constant maintenance, hence the dual system to keep it running. See: https://en.wikipedia.org/wiki/Tyranny_of_numbers

That's happening again in a new way: pin count on the package. The 4004 was only 16 pins. 40 pins was common from the 70s to the 80s with the 8008, 8080, 8086, 8088, Z80, 6502, 6800 and such - easy for hobbyists to breadboard or hand-wire. The 68000 was 64 pins. CPUs expanded to 100 pins. Now it's 1,000 or more with pin-grid arrays.

2) Moore's Law: we're accustomed to speed and capacity doubling and redoubling and prices plummeting, thus the $5 Raspberry Pi Zero and dual/quad/octo-core processors in just about everything (desktop, laptop, even cellphones).

3) A wider playing field: despite the "Intel Inside" campaign, the ARM core is in many more devices. Before the Intel/AMD duopoly, there was a wider playing field, as demonstrated by all the hobbyist and pre-PC microsystems: Motorola's 6800, MOS's 6502, Zilog's Z80, RCA's 1802. There was quite the shake-out when the PC architecture "won".

In a way, the maker/hacker community is aware of this thanks to the Arduino (most are Atmel AVR based) and the Raspberry Pi (cheap embedded Linux on an ARM core). That's another way to show history repeating itself with the diversity of competing products - and how more innovation comes from people's basements and garages than from large companies. The IoT (Internet of Things) is just a new face for microcontroller hackers.

-- jeff jonas
On 01/06/2017 12:30 PM, Jack Rubin via vcf-midatlantic wrote:
Nice summary and history, thank you, and too easy to forget that many people today aren't aware of it.
I view it from a slightly different perspective, as a history of engineering progress - solving problems, advancing to the next plateau, chasing bottlenecks and moving on. There's a cyclical nature to the process, but the line of development, in terms of overall ability to solve problems, generally moves in a forward direction. At any given moment, the next challenge may be hardware, software, or the analytical approach. The last piece - the science part of computer science - is something that is often underexplored in popular presentations. While a multi-million-transistor CPU is mind-boggling, the mathematical complexity of the great ideas that drive us forward - from Babbage through Shannon and Turing to database structure, image analysis, compiler development, Big Data, etc. - is even more daunting.
Also, let's not forget how the CPU (and OSes) followed in the footsteps of the mainframe. As patents aged, new innovation suddenly crept into microprocessors and microcontrollers.

--
Linux Home Automation - Neil Cherry - ncherry@linuxha.com
http://www.linuxha.com/ - main site
http://linuxha.blogspot.com/ - my HA blog
Author of: Linux Smart Homes For Dummies
participants (2)
- Jack Rubin
- Neil Cherry