> ... the childish slags words you folks use. ...
That's a bit harsh. Those on the overclocking forums are just selecting terms to describe what they're doing, and I certainly would never
say their hobby is childish. These days there are few other 'techy' hobbies that a young person can have, given the massive black-box
nature and immense complexity of modern devices. It's not like the 80s when one could at least get to the guts of 8-bit/16-bit systems, write
code for robotics control or other device interfacing (the BBC Micro was excellent for this), learn assembly language, etc. Numerous
people kickstarted their careers on the back of that boom, including me.
I doubt very much that those who use these terms would in any way intend such use to be demeaning for what you're referring to. Since
there wasn't any existing jargon to describe what they're doing, one can hardly be surprised that words & terms are either created or
combined from others to suit. It is however certainly true that a lot of those who meddle with PC overclocking haven't the slightest idea
what (for example) voltage actually means (perhaps not helped by modern sw/BIOS tools making it easier than ever to do all this stuff,
often in an automated manner), but from all I've read there are definitely those who do delve more deeply to understand what it's all about
so they can achieve better results - this can only be a good thing.
Your comments remind me of someone I used to know in the late 90s, head technician at a university I worked at for a while. He really
shouldn't have been there, grossly overqualified man for the job; he ended up working there for family reasons (caring for his mother).
His original career was defense systems, eg. he helped design the guidance system for the Minuteman missile. He once said how in his
opinion the apparent intelligence of students seemed to be going right down the crapper, and I had to agree from what I often saw. He once
commented that 5X more people now were being awarded top-level degrees, but that one could hardly conclude modern graduates were
magically 5X smarter than when he was at uni.
> To me it's basically a denigration of the profession I've spend my life on. Sorry if I'm a downer, but I suppose you kids would never
> understand how crappy it makes me feel to have my kind of work trivialized in the end.
I'm almost 42. Hardly a 'kid'.
And there was me yesterday feeling like a grumpy old git...
I agree that it would certainly be a shame if the activities of those who mess about with PC overclocking never resulted in their taking up
electronics or electrical engineering in a more serious way. I'm sure some do, but what proportion? I've no idea. As long as it's not zero
though then who can say it's not worthwhile? It's hard enough as it is to get children and young people interested in science/etc., so I
say be grateful there's still something which is able to garner such interest. Schools these days sure as hell make no attempt to do so.
When sites like tomshardware publish reviews of new CPUs, there are two types of readers. Most jump straight to the performance results;
this is the crowd who are IMO less likely to get more seriously interested in science, etc. Others though read the entire article, trying to
understand the explanation of how the new chip design works, what it means. These are the people we need to encourage. Don't belittle
them by saying what they do is trivialising your knowledge - they would never intend their hobby to have that effect. Better to help them
learn more about the foundations of what it is they're dealing with.
Heck, you should get your knowledge written down, build a website or something, pass on what you know.
> Oh well, everything is commoditized in the end even intelligence.
It certainly feels like there's a general dumbing-down (anyone in the UK watch Horizon yesterday, about the LHC? Ye gods it was so
slooow!), and numerous parties are to blame for that (media, the state, parents, teachers, etc.) but in a way I suppose it's inevitable that
as tech gets ever more powerful, it becomes easier to manufacture products which simplify how people can interact with technology
(touch sensitivity, voice recognition, facial recognition, movement tracking, etc.), leading to the irony of incredibly sophisticated devices
like the iPad being controlled by the most basic of hand gestures - in an odd sort of way it's like we're going back into the cave and are
scribbling on the walls once again.
The push from OS products like Windows and OSX is for GUIs that hide the guts of a system's functions (eg. Metro in Win8), a change
that seems unstoppable due to the rise of mobiles, smart phones & tablets. Ordinary citizens are able to do amazing things with modern
technology, but they haven't the slightest idea how it works. I agree that's a bad thing, but how to change it for the better without making
functionality more restrictive? I've no idea. The media continues to label people in IT and science/engineering in general as geeks, nerds,
boffins, etc., so no wonder prospective students avoid such subjects. Universities churn out endless lines of graduates in psychology,
media studies, hotel/catering, tourism, etc. - pretty much everything the modern economy doesn't need. Worse, retraining centers for the
unemployed only teach this sort of nonsense as well.
Btw, my step-father worked his entire life as a research engineer at Kodak (as well as the obvious film & photography, he specialised in
speaker design, audio, magnets, etc. - he used to build cinema speakers); I've heard numerous stories of what he was involved with, so I
have a healthy respect for the kind of engineering you're talking about. Sadly, engineering is not a respected discipline in most western
nations now, unless you happen to be lucky and live in Germany (or Mexico, so I'm told).
> ... but programming is a whole different thing. ...
Doing it well requires the ability to abstract, something which is not really taught anymore. Schools just want pupils to pass exams, so
encouraging free thinking is not part of the agenda. Pupils who try to progress beyond the schedule are discouraged or actively
penalised for doing so. At 16, a teacher wrote "You should not know this equation!" in my physics textbook because I'd learned something
on my own; thankfully a different teacher had the opposite view - he loaned me his course notes & books on Special/General Relativity.
More recently, a woman I talked to on a train told me her daughter had been marked down because she had tried to learn beyond the
prescribed material.
There are two types of computer courses I've come across: Computer Science (what I did) and Computing. The former normally
includes a lot of theoretical background, an emphasis on the mathematical/logical basis of computer-related topics, strong support for
research-oriented follow-up courses, more advanced optional topics like AI, etc. The latter is nowhere near as detailed, often a year
shorter, doesn't cover advanced subjects, focuses entirely on practical aspects of computers (eg. "multimedia", a term I found
sickening when working as a sysadmin in academia), avoids general concepts and instead teaches highly specific things such as
one particular application or platform instead of a general language or theory, and is aimed at getting people into non-research jobs
as quickly as possible. Too many edu places (especially colleges, and universities that used to be colleges or polytechnics) teach
the latter kind of course, creating students who are of little use in real industry; they can't abstract and often have no idea about the
real foundations of what they think they know. The former type of course is harder & more demanding, so many avoid them.
Example: I was told of a "Computing" graduate who, on his first day at work, was asked to comment at a meeting about how the company
could better organise/process its data. The poor guy said how about using a bubble sort; everyone else in the meeting literally laughed
at him, but that's the level of material which students are being given along with a bit of paper saying they now have a degree. I feel
sorry for such students, most have no idea they're being intellectually conned.
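For anyone who hasn't met it, bubble sort is the classic teaching algorithm: correct, but it makes O(n^2) comparisons, which is exactly why suggesting it for a company's data processing got laughs - any serious course covers O(n log n) sorts (or just using a library/database). A minimal sketch, purely for illustration:

```python
def bubble_sort(items):
    """Return a sorted copy using repeated adjacent swaps (O(n^2))."""
    data = list(items)
    n = len(data)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):  # the last i items are already in place
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
                swapped = True
        if not swapped:  # no swaps in a full pass means it's sorted
            break
    return data

print(bubble_sort([5, 1, 4, 2, 8]))  # -> [1, 2, 4, 5, 8]
```

On a few dozen items it's fine; on a million records the quadratic pass count is crippling, which is the point the rest of that meeting understood and the poor graduate's course never taught him.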
> Heck, one time I recruited a guy who was in college for computer science that was 19, and he whined all day about doing his
> programs in C++ instead of Visual Basic. ...
QED. Exactly the kind of thing I encountered. I met students who thought multimedia meant only Macromedia Director, because that's
all they'd been taught. They knew nothing about the basics of imaging, video, audio, etc. They had a "degree" in Computing with
Multimedia, yet they'd never heard of PAL or NTSC, couldn't explain fields vs. frames, had no idea what RGB meant, interlacing, etc.
Ditto for equivalent audio concepts. It was depressing talking to them.
> ... gone off topic ...
Yeah, but I'm in a chatty mood.
Besides, it's related in a way. What do I use my SGIs for? Partly to maintain a website which, as much as I can, helps people to learn
about SGIs and make best use of them, while slowly merging in data about modern systems in a manner which I hope will eventually
allow people to get the best of both the old and the new.
And hey, skywriter, it's not all gloom! Last year I had to get rid of a dozen Indigo2s and many low-spec Octanes which were taking
up space, not worth enough to sell, etc. I advertised on the local Freecycle forum, strongest interest came from a group of young people
who spend their spare time doing all sorts of cool things with computer tech, a blend of hardware, Linux and electronics. One of them
was building a general machine which could act as a wide variety of different hardware systems (it had multiple motherboards for different types of
systems in the same box, with custom hw/sw and front end to drive it all). The guy who came to collect the 20+ systems was only about
19 I think. I found it reassuring that people doing this sort of thing still existed. They were delighted to have a bunch of SGIs with which
to expand their experimental work.