assembler is tedious (was WooHoo!) (Jeffrey Jonas)
There's some irony in discussing the tedium of assembly-language programming by using (excuse me) the tedium of email as a learning and describing tool. It's my bias, but the hardest way to learn something is by having several people explain it, post streams of details, argue amongst each other in the process, get sidetracked, and so on. I understand, that's the 21st-century way; no insult, it's just not my 20th-century way (or my textbook way; I was formally educated in computing).

Look. The vintage basis for assembly-language programming included the limited resources available to microprocessor owners in the 1970s: hand-assembly and front panels, limited memory, limited I/O, limited storage, limited access to prior source code. The other limitation was the owner him/herself: programming was likely a mystery, at least programming that particular processor, and later the mystery of whatever "system" architecture was in use.

Results matter. Those owners learned, became computing experts, and became employed in the personal-computer boom to come.

So: that old-school way mattered then, and matters now as a matter of historic preservation. Why preserve it? To remember the lessons of the 20th-century past, a past of limited resources, versus the 21st-century experience of ABUNDANT, STANDARDIZED computing.

And so, calling assembly-language programming "tedious" misses these points and imposes 21st-century assumptions on 20th-century tech.

This is not some old man's problem with the future. My vintage Web site attracts people who ask me, "I wanna start with some vintage computing; where do I learn how to program and repair these things?" I'm stumped: modern resources have mostly given up on "assembly language" and "TTL logic", or even "read this schematic"!

If the above is too "tedious", if you are messing with vintage computers "for fun" and don't care about historic perspective, that's fine. But you have a choice. Try to sledge-hammer C code (or some language-of-the-year) from a cross-compiler into an Apple II, and trip over all the I/O it needs to "know"? Or work within the Apple II's BASIC, Sweet-16, and assembly-language calls (documented long ago), in assembler (readily accessible), and get something DONE? And learn something "old"?

If you are worried about looking trivial: buy a Raspberry Pi and *guarantee* your results are trivial, just in a larger social group.

Harumph, Herb Johnson

-- Herbert R. Johnson, New Jersey in the USA
http://www.retrotechnology.com OR .net
preserve, recover, restore 1970's computing
email: hjohnson AT retrotechnology DOT com
or try later herbjohnson AT retrotechnology DOT info
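[Editor's note: Herb's point about the Apple II's long-documented assembly-language calls can be made concrete. The Monitor ROM's character-output routine, COUT, lives at a published address ($FDED) and can be called directly. A minimal sketch, not tested on real hardware; the ORG at $0300 is just a conventional free spot in low RAM:]

```asm
* Print "HI" through the Apple II Monitor's COUT routine.
* COUT ($FDED) prints the character in the accumulator,
* and expects the high bit set on standard ASCII.
        ORG  $0300      ; conventional free area in low RAM
        LDA  #$C8       ; 'H' ($48) with high bit set
        JSR  $FDED      ; Monitor COUT: print the character in A
        LDA  #$C9       ; 'I' ($49) with high bit set
        JSR  $FDED
        RTS             ; return to caller
```

From Applesoft BASIC, CALL 768 (decimal for $0300) would run it: getting something DONE with routines documented decades ago.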
Very good points, Herb!

On Thu, Aug 3, 2017 at 11:31 AM, Herb Johnson via vcf-midatlantic <vcf-midatlantic@lists.vintagecomputerfederation.org> wrote:
I'm the one who said "tedious". But keep in mind where I'm coming from: BASIC and LOGO. Got plenty of good things "done" in both languages, and now it's time for me to learn what more can be done in assembly. Yes I griped somewhat :) but the fact that it's going to be a challenge is exactly why I want to learn it. I'll be able to have better conversations with museum visitors and better understand some of the conversations here among our own members.
On 8/3/2017 4:27 PM, Evan Koblentz via vcf-midatlantic wrote:
And it is important to note that Evan's use of "tedious" came at the very beginning of his understanding of assembly, prior to discovering branches. Take away branches/loops in any programming language, no matter how high-level, and I guarantee people will complain about tedium.

Herb had some excellent points, but I'll add that the purpose of studying assembly is not necessarily to write programs but rather to better understand how computer architecture works. Any CE or good CS degree will require it for this reason, whether in 1967 or 2017. Yes, in 1967 "real" programmers wrote assembly (and had little choice for "real" programming), but I'd argue that even in 2017 "real" programmers know assembly; they just don't necessarily use it (unless they're writing a compiler or are in the habit of decompiling malware, but I digress...).

I get strange looks from people when I tell them I teach assembly to high school students, but 2017 is a glorious time to teach assembly. Back in my day you pretty much had to learn x86 (segment:offset, at that), which is painful, since most people owned x86 boxes (or Macs, on which Apple did just about everything they could to keep you from programming in anything, never mind Motorola 680x0 assembly). Well, I was lucky and rather enjoyed RISC assembly (SPARC... oh how I loved the SPARC), but my college paid for a lab full of SPARCstations that most people didn't have access to (at least not in school environments). So, really, x86 was the only option, and many people from my generation recoil in horror at the suggestion of assembly education because of that. At the same time, many people from the generation before me (whether because of the wonder days of the 6502, the IBM 360, or DEC) will say "Of course you should teach assembly... why wouldn't you?"

In 2017, I can run a virtual Apple II a dozen different ways and teach 6502. I can run virtual MIPS and teach RISC assembly. Heck, I can even run a virtual old-school DOS box and teach segment:offset if I want to be cruel. Linux and Mac OS X both allow me to write modern x86, and I can write ARM on a Raspberry Pi (although, sadly, I find ARM rather less friendly to beginners). I'm told by former students that some colleges now even teach Java byte-code assembly (ewwwww....).

And, as I'm sure most programmers on this list will back me up on, once you've really learned one assembly language, learning the rest of them is just a matter of details.

-Adam
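[Editor's note: to make the point about branches concrete, here is a minimal 6502 sketch. The label name and the $2000 buffer address are chosen purely for illustration. Without the BNE branch you would write the store sixteen times by hand, which is exactly the tedium being described; with it, the loop is four instructions.]

```asm
* Zero a 16-byte buffer at $2000 using a counted loop.
* Without the branch, this would be 16 copy-pasted stores.
        LDA  #$00       ; value to store
        LDX  #$00       ; index register starts at 0
CLEAR   STA  $2000,X    ; store into $2000 + X
        INX             ; advance to the next byte
        CPX  #$10       ; written 16 ($10) bytes yet?
        BNE  CLEAR      ; no: branch back and repeat
        RTS
```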
Adam Michlin had a lot to say. It's pretty much reasonable, plausible stuff. But I have a few nagging comments.

I don't care for his use of a distinction between "real" programmers (his quotes) versus whatever-the-hell we do, we who post here, as less than real. I see his point; I have a BSEE myself. But I don't care for that ordering, or the choice of word. This is a discussion list about vintage computers; modern work is second. That's the order.

And I don't care to hear what amounts to "assembly languages: learn one, learn them all". And emulators, while free and convenient, aren't quite the same as actual hardware sitting on your desk, or in a rack.

He's welcome to his preferences and priorities, of course. But then, if he clearly favors one processor or architecture over another, why claim any assembly language is like all the others? It is... but it isn't.

I get his points; there was no insult intended. But it makes me cringe. Like a Civil War re-enactor cringes when visitors ask about Web sites, cell-phone reception, keeping cool in the summer, etc.

About "reality". I spent my time last week as follows. Disassembling COSMAC 1802 code to restore source for an RCA monitor program. Looking at and discussing oddball Z80 and 8080 architectures for programming SPI or I2C devices. Explaining the repair of a 1979 S-100 bus backplane, whose manual I scanned just last night.

Please don't tell me that what I'm doing isn't "real", or that knowledge of and experience with architectural features are arbitrary details, readily learned, readily emulated. And don't do so when some of these differences matter to YOU. No insult to you; just consider the larger contexts, as I've suggested.

keepin' it real
retrotechnology.com

-- Herbert R. Johnson, New Jersey in the USA
http://www.retrotechnology.com OR .net
My use of quotes was intended to show that I didn't agree with the word "real", although I understand it was an attitude of the time.

I favor one assembly over another for educating students new to assembly language merely because some are more beginner-friendly.

Like it or not, emulators are going to become more and more a part of our hobby. How else can I get a classroom of students working on an Apple II? And then get 10 other teachers doing the same in their classrooms?

Best wishes,
-Adam

On 8/7/2017 1:15 PM, Herb Johnson via vcf-midatlantic wrote:
My error on "real": Adam meant the old cliche "real programmers use assembler". I apologize; I thought Adam was asserting that vintage computing wasn't real when compared to day-job computing.

As for favoring one assembly architecture over another: it's an old argument. Look at the histories of each architecture: they are about circumstances, or sometimes design choices for a purpose. Some are certainly simpler than others. ARM is usually about three architectures per processor! It's a fuzzier argument today, when computing resources are abundant; not so much when you haveta count cycles. Thus one value of preserving vintage computing: lessons learned from scarcity. And another, from Adam: simplicity, for education.

As for my 'dissing emulators: I have my priorities too. I'm preserving hardware, because emulators make it easy to throw 'em out! And for "circumstantial" reasons: my nuts-and-bolts BSEE, for instance. But here are good words for emulators. Some people who use 'em will want the real (no quotes) copper and silicon. And emulators help debug and disassemble software, even vintage software. Here's an instance: http://www.retrotechnology.com/memship/cosmac_dev_sys.html#sim

Emulators are of commercial interest too, in running legacy software, for a purpose. That's why some computers become "vintage"; their software lives on but the computers pass on. This brings us back to "hobby" versus "commercial". So: some good points, some bad points, on emulators.

Herb

On 8/7/2017 2:14 PM, Adam Michlin wrote:
Just for reference, assembly language is still used quite a bit today in the real world. Too many kids coming out of college don't bother with (or aren't offered) assembly and are completely clueless about how a computer actually works. While that's fine for many environments and jobs, the reality is that there are a lot of jobs where having assembly would get someone in the door quicker, with a better starting salary and a more stable position.

From the perspective of someone currently conducting interviews for a senior-level software engineer, having ANY assembly language would give a candidate a huge edge over someone without it. We wouldn't care if it was an 1802 back in the 70s or a Pentium yesterday; if you understand the concepts of how a processor operates, you can quickly learn a different architecture.

Knowing assembly is still a good skill to have.

Bob
This is just a reminiscence. Feel free to hit 'delete'.

My need for assembly language changed over my 40-year career. When I started programming professionally, it was assembly language all the time, because decent compilers for 'C' or Pascal just didn't exist in the 70's and early 80's. Later on, in the late 80's, while doing embedded-systems work, we finally had a good C compiler, and one only had to drop into assembly language to code up interrupt handlers or particularly speed-sensitive device drivers. In the early 90's, when I was writing a CAD program for PCs (x86), I would code up the video drivers in 'C', and then hand-optimize the assembler produced by the compiler to speed it up. In the final era of my career, I was writing Perl on UNIX boxes and had no need for assembly language at all.

Bill Dudley
retired EE/programmer

This email is free of malware because I run Linux.

On Mon, Aug 7, 2017 at 7:46 PM, Bob Applegate via vcf-midatlantic <vcf-midatlantic@lists.vintagecomputerfederation.org> wrote:
On 08/08/2017 09:18 AM, William Dudley via vcf-midatlantic wrote:
This is just a reminiscence. Feel free to hit 'delete'.
My need for assembly language changed over my 40 year career.
2017 - 1978 = 39 years (good grief, I've been at this for a while).
When I started programming professionally, it was assembly language all the
Yup, I recall that, though I think C had become standard on PCs by my start.
Later on, late 80's, while doing embedded systems work, we finally had a
Yup. Love C; it's a better assembler (if you know how your C compiler does its magic, anyway).
In the final era of my career, I was writing Perl on UNIX boxes, and had no
I hope I'm not yet in the final era of my career, but I now have a mix of languages: Asm/C/C++ for embedded; Perl/Python/shell for admin and text processing (a lot more bash/python these days); still working with Java (no love for that verbose language); lots of Python/JavaScript for Smart Home work; and PHP and JavaScript for the Web (HTML/CSS/JS). The thing to note is that for different jobs I use different languages. I'm currently curious about Go and Rust (multi-processing).

Now, back to assembly language: I still find that when I debug (down to the bits) I need asm knowledge. Just because your C/Python/JS program doesn't work doesn't mean that the bug isn't in the underlying compiler/interpreter. The assumption that it just works can lead to insanity (trying it until it works).

-- Linux Home Automation
Neil Cherry ncherry@linuxha.com
http://www.linuxha.com/ Main site
http://linuxha.blogspot.com/ My HA Blog
Author of: Linux Smart Homes For Dummies
participants (7)
- Adam Michlin
- Bob Applegate
- Chris Fala
- Evan Koblentz
- Herb Johnson
- Neil Cherry
- William Dudley