Does anyone know the reason why the C64 only ran at ~1 MHz? Was it to keep RAM costs or overall system cost as low as possible, or was there a different engineering reason? (The Atari and BBC machines managed higher clocks, but were very expensive compared to the C= devices, although the 1977 Atari 2600 managed ~1.19 MHz out of its 6507, a 6502 with a lower pin count and less addressing.)
On Thu, Mar 23, 2017 at 8:47 AM, John Heritage via vcf-midatlantic < vcf-midatlantic@lists.vintagecomputerfederation.org> wrote:
The various clock circuits inside the different home computers were the least of their problems. All of the different combinations used across the various home computers were just methods they found to minimize the Bill of Materials - because every penny was crucial to the bottom line when making thousands of them for the consumer market.

There were still real and significant design challenges in the early VLSI designs of the 1970s to make these faster. But they slowly marched on and made improvements every year, to where we had the 8086 running at 5 MHz and the 68000 running at 8 MHz already by 1979. The list below is just some of the design problems, and they are all interrelated. Making trade-offs between each of these was a constant battle. The list is mostly ordered by significance, but some of this could be a matter of opinion.

1. BJT, NMOS, PMOS, or CMOS logic families contribute to speed, e.g. depletion-load NMOS is faster than BJT
2. Miller Effect [capacitance] present even on the silicon die
3. VLSI design geometries were limited to about 5 um
4. Clock distribution across the silicon die [e.g. too much clock skew]
5. Higher clock speed required more power dissipation
6. Negative bias voltage was a burden to compensate for the Miller Effect [e.g. 8080]
7. Lead inductance present even on the silicon die
8. Slew rate, e.g. ECL, being faster, ran on smaller voltage swings, so the slew rate was less, giving faster clock speed
9. TTL signals are above the noise floor, e.g. ECL, being faster, was below the noise floor, i.e. power is a negative voltage
10. TTL operates in saturation for logic signals [e.g. ECL, being faster, operates in the linear region]
11. Square-wave logic signals have more odd-numbered harmonics than a linear analog device, reducing the possible bandwidth
12. Pin capacitance limited not only the driver characteristics but also the frequency bandwidth
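[To make item 2 concrete, here is a back-of-envelope sketch of how the Miller effect caps switching speed. All component values are illustrative assumptions, not measured 6502 figures:]

```python
# Back-of-envelope: Miller capacitance multiplies the effective load a
# driver stage must charge, lengthening the RC delay per stage and thereby
# capping the usable clock speed. All values below are assumed for
# illustration only.

def miller_capacitance(c_gd: float, gain: float) -> float:
    """Effective input capacitance of gate-drain capacitance c_gd
    seen through an inverting stage with voltage gain `gain`."""
    return c_gd * (1.0 + gain)

def rc_delay(r_ohms: float, c_farads: float) -> float:
    """One RC time constant, in seconds."""
    return r_ohms * c_farads

c_gd = 0.5e-12   # 0.5 pF gate-drain overlap capacitance (assumed)
gain = 10.0      # stage voltage gain (assumed)
r_load = 20e3    # depletion-load pull-up resistance (assumed)

c_eff = miller_capacitance(c_gd, gain)   # 5.5 pF effective load
tau = rc_delay(r_load, c_eff)            # 110 ns per stage
print(f"effective C = {c_eff*1e12:.1f} pF, tau = {tau*1e9:.0f} ns")
```

[With a few such stages in the critical path, delays on this order leave little headroom above ~1 MHz without a faster process or lower gain per stage.]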
So extrapolating to the C64, it could mean:

1. May have been limited by the specific process fabbing the C64's 6502 or support chips, so they couldn't communicate any faster.
2-4. Not really relevant for the 6502, but may have limited the size of the support chips in the C64, limiting the max frequency they could run at without using dividers.
5. Power distribution - good point; the 6502 was pretty low power, but more MHz required beefier power circuits, which required more cost. The C64 was an exercise in low cost (although so was the Atari 2600...).
6. I'd actually like to know more about this. Did the 8080 require custom/external solutions to clock higher later on? Maybe these were fixed by the Z80.
7-12. These sound like items that could complicate the routing and electronics layouts necessary to make the entire computer work, requiring additional engineering/cost to raise the clock speed of the C64.

Am I on the right path for possible (sorry for being less technical) reasons why the C64 stayed at 1 MHz while its intended successors (C+, C128) and some contemporaries (A800, BBC Micro) ran at higher clock speeds? Sounds like cost may have been the overall driver.

With the 6502 executing every other clock on many instructions (rating 0.43 MIPS @ 1 MHz), RAM needs to be at least double that speed to give the custom chips their share of every other access, and 4x that if Commodore ran the CPU at 2 MHz.

On Thu, Mar 23, 2017 at 11:01 AM, Dan Roganti <ragooman@gmail.com> wrote:
In my opinion, the 64 ran at 1 MHz because Jack did not have interest in the investments needed to make the chip faster. He wanted it cheap and plentiful.

In some respects, this aligned with the original MOS goal: CPUs for the masses, cheap. I am sure the HW guys would have loved to do more R&D, but MOS' tech was "good enough" to support the company, and that's what Jack focused on.

I believe there were -2 and -3 MOS NMOS units, and had been for some time, but Jack didn't need fast, he needed fast enough, and that was it. Running at 2 MHz would have required static RAM or faster DRAM, so 1 MHz (essentially 2 MHz due to the CPU/GPU bus sharing) was fast enough.

Jim
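[The bus-sharing point above can be put in numbers - a rough sketch of the memory timing budget, assuming a simple half-cycle interleave between CPU and video chip:]

```python
# The C64 interleaves CPU and VIC-II accesses: each CPU clock period is
# split in half, so the DRAM must complete a full access within half a
# CPU cycle. Doubling the CPU clock halves that budget again, which is
# why 2 MHz would have demanded static RAM or much faster DRAM.

def dram_cycle_budget_ns(cpu_mhz: float) -> float:
    """Time available per memory access when CPU and video chip
    alternate on the bus within one CPU clock period."""
    cpu_period_ns = 1000.0 / cpu_mhz
    return cpu_period_ns / 2.0   # half the period per bus master

print(dram_cycle_budget_ns(1.0))  # 500.0 ns budget at 1 MHz
print(dram_cycle_budget_ns(2.0))  # 250.0 ns budget at 2 MHz
```

[A 250 ns full-cycle budget was beyond the cheap commodity DRAM of the day, which fits Jim's "fast enough" argument.]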
On Thu, Mar 23, 2017 at 1:14 PM, RETRO Innovations via vcf-midatlantic < vcf-midatlantic@lists.vintagecomputerfederation.org> wrote:
I think this is why Bill Mensch left MOS and started WDC in '78, and made further improvements in speed and instructions to the 6502 and then later the 65816
On 3/23/2017 12:36 PM, Dan Roganti wrote:
I think this is why Bill Mensch left MOS and started WDC in '78,
I interviewed Bill long ago for an article, and I was not as aware of the history then as I am now. I wish I had been, as I would have loved to ask him how he managed to coerce Jack into paying for him to break off and start up WDC (Jack did fund at least part of WDC). There must have been an agreement not to sue, or an assignment of some specific part of the 6502 in return for the money, is the only thing I can figure. I suppose someone should try to get Bill to come to a VCF (VCF-W) and give the complete picture. Given how tight Jack was with money, and the amount of money that I seem to recall Bill got, there had to be something massive to leverage.
and made further improvements in speed and instructions to the 6502 and then later the 65816

Somewhere there is the story about the 65C816 and the Apple IIgs. Ah, here:
https://mirrors.apple2.org.za/apple2.caltech.edu/miscinfo/65xxx.chronology

The 65c816's design was not without its problems. Bill Mensch "improved" the bus interface on the 65c816 (over that used by the 6502). Unfortunately, the Apple's disk drive controller relied on some of the old kludges in the 6502 chip. With those problems removed, the 65c802 and 65c816 chips worked fine on an Apple computer, but the disk drives didn't work. Of course Apple immediately began lamenting the stupidity of the designers at WDC, and WDC's designers immediately began complaining about the stupidity of Apple's design. In the long run, money won out. If WDC wanted Apple to use the '816, WDC would have to redesign the chip. They did.

I always wondered just what the "improvement" was that had to be discarded. (read before write?)

Jim

--
RETRO Innovations, Contemporary Gear for Classic Systems
www.go4retro.com
store.go4retro.com
On Thu, Mar 23, 2017 at 12:53 PM, John Heritage <john.heritage@gmail.com> wrote:
I don't know if you can extrapolate this so easily to the platform level. Some of them are as you noted, but that list referred mainly to the VLSI design process. That is your bottleneck, so to speak, among the various ICs - not just the processor, but also the peripheral ICs, and of course memory. After that, the platform is designed around that mixture of components, based on the specs. You could always try some parlor tricks to increase the system clock, but not without potentially creating some damage without enough cooling.

As I mentioned, the CPU clock, like the rest of the design, is always done with shortcuts to minimize parts on these consumer computers. And these system clocks are typically interlocked to some existing xtal, via a peripheral IC, such as the color burst xtal or something else. So you typically find some of them running at less than the desired frequency.

The VLSI process for the Intel 8080 never really changed; they instead went on to next-generation processors such as the 8085, which only needed +5V. The Z80 did this straight off the bat, of course, as did the 6800, 6502, 6809, etc.
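[The color burst point is exactly how the C64 derived its clock. A quick sketch of the standard NTSC arithmetic, which explains why the NTSC C64 ran at an "odd" 1.0227 MHz rather than a round 1 MHz:]

```python
# The C64's CPU clock was derived from the video crystal rather than a
# dedicated oscillator. On NTSC machines the master crystal runs at 4x
# the color burst (subcarrier) frequency; dividing the master clock by
# 14 yields the CPU clock.

COLOR_BURST_HZ = 315e6 / 88        # NTSC color subcarrier, ~3.579545 MHz
master_hz = 4 * COLOR_BURST_HZ     # ~14.31818 MHz master crystal
cpu_hz = master_hz / 14            # NTSC C64 CPU clock

print(f"{cpu_hz/1e6:.6f} MHz")     # 1.022727 MHz
```

[Driving everything off one cheap TV-standard crystal saved a part on the BOM, at the cost of a CPU frequency dictated by video timing rather than by what the silicon could actually do.]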
participants (3):

- Dan Roganti
- John Heritage
- RETRO Innovations