[vcf-midatlantic] OT: people don't understand computers anymore
Systems Glitch
systems.glitch at gmail.com
Tue Jun 7 16:00:07 EDT 2016
(reply stuck in the middle of the thread, no particular reason)
Being one who currently slings code for a living, I have the following observations:
- There are more job openings for "computer programming" than there are dedicated, serious computer scientists
- People are attracted to these jobs because they pay well
- A deep understanding of CS is not required for many of these jobs (e.g. DB management, web design)
- Using a computer scientist as a code monkey is probably not fulfilling or economical
Thus, you have a lot of "computer people" driving an industry that uses computers as a means to make money. They don't need to know how the server they deploy to or the workstation they use actually *works*, because it's irrelevant to their job. For many people, their interest in computers stops with what's immediately relevant to their work. That's fine; not every lathe operator needs to understand the chemistry and physics behind the alloy he's machining into whatever the final part will be.
The ones who *do* care about what's happening at a low level are naturally going to want to learn more, and often do. Those people tend to move from code monkey into management, software architecture, or something similar. Or they quit and work for themselves in a more involved field. That's good too, because now the really motivated and interested people are the ones responsible for making decisions, and for guiding the people who don't really have an interest in what they're doing outside of making a living.
I do think most CS tracks at university would benefit from starting at the low level and working up. This would probably require splitting the current batch of CS students into two different programs: actual CS, and Computer Programming.
Now, w.r.t. high-level languages and waste of computer resources: because of bullet point #1 above, for general purpose computing, the *programmer's* time *is* more valuable than the computer's time nowadays. That is an inescapable fact for many jobs. Underlying architecture has no bearing, and *should* have no bearing, on, for example, how well someone's web application accomplishes its task. Likewise, for quick and/or one-off tasks, efficiency often isn't important. If the time to improve the code is longer than the run time, and the results are equally correct, it's not cost effective. That *does* matter.
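To put rough numbers on that (purely illustrative; the hours and run counts below are assumptions, not figures from any real project), a quick Python back-of-envelope looks like:

    # Is optimizing a one-off script worth the programmer's time?
    # Every number here is made up for illustration.
    optimization_hours = 4.0      # programmer time to speed the script up
    runtime_before = 2.0          # hours per run, naive version
    runtime_after = 0.5           # hours per run, optimized version
    runs = 2                      # expected number of runs, ever

    time_saved = (runtime_before - runtime_after) * runs  # 3.0 hours
    worth_it = time_saved > optimization_hours            # False for 2 runs
    print(f"saves {time_saved}h for {optimization_hours}h of work -> {worth_it}")

Run it twice and the optimization never pays for itself; run it a few hundred times and the decision flips, which is exactly the point where the requirement reveals itself, as below.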
If efficiency *is* important, that requirement will eventually reveal itself (again, general purpose computing, not talking about HPC or embedded work). When that happens, someone will make a (hopefully informed) decision on whether to throw hardware at the problem, or take the time to do it right. Sometimes more hardware is the answer. Often it isn't, and fixing those cases made up a lot of the contract work we received at my previous salaried job.
Thanks,
Jonathan