A word from the wise
GavinWraith (26) 1563 posts |
This new mode of education also began to flourish in some schools with the spread of BBC micros. Many highly creative teachers inspired new adventurous and disciplined forms of learning in their pupils — though many teachers had no idea what to do with computers because they had no suitable training. ALAS THE DREAM COLLAPSED.

Politicians, parents, school teachers, and industrialists all started claiming that computers should be used to teach schoolkids how to use the tools that were being used in industry. This was a world-wide folly. So instead of learning how to THINK, children all round the world now use the potentially most powerful educational medium that has ever existed merely for the mundane task of learning how to USE the packages that run on Windows on a PC, such as word processors, browsers, email tools, databases and spreadsheets.

As a result many intelligent school leavers who have never encountered programming or artificial intelligence now don’t see how computing could possibly be a university degree subject: they think it’s like cooking — you learn to use a computer as you learn to use an oven.

I was intrigued to hear a senior Microsoft person on the radio a couple of weeks ago lamenting the fact that there are so few people coming out of schools wanting to study computing, because they think it is cool to use computer systems but don’t realise it is cool to create new ones. He claimed this was seriously damaging the economy. He did not mention why this is happening.

If some of the people now graduating can be made to understand this message, then perhaps when they are teachers or politicians or parents they will not make the same drastic mistake as was made by the previous generation. Alas, this may now be irreversible, world-wide: a great tragedy of our time. Even if politicians recognize the mistake, it will take decades to produce enough teachers who have the competence to teach people to create working systems instead of merely using them. |
Clive Semmens (2335) 3276 posts |
Hear, hear to all that – albeit a bit late now. It was a bit late in 2006 – I was fighting these battles in the 1990s. And mostly losing 8~( …water under the bridge… |
GavinWraith (26) 1563 posts |
Yes. He was head of the AI group at Sussex University for some years, before moving to Birmingham. The only people at Sussex with any computing experience were the AI group and the engineers. The maths department, by contrast, had Joe Taylor, myself and a few other people with experience only of micro-computers. The rest had no interest. When Sussex was given a load of cash by a benefactor to fund a chair in computer-science, it was faced with the fact that it had no degree in that subject. I found myself as maths-representative on a committee to establish a curriculum for one, and eventually as chairman of a new computer-science department. Its hardware was mostly paid for by the profits from Aaron Sloman’s Poplog software.

The question was, how should the new department fit into the Sussex school-of-study system? A new school of Cognitive Studies was formed, containing computer-science, linguistics and psychology. Everybody thought that Aaron Sloman would be its dean; after all, he had put in a lot of work initiating joint seminars to get its constituent departments cooperating smoothly. Alas, somebody else was appointed, and he took up a post in Birmingham. Despite the lack of interest in computing among my colleagues in the mathematics department, I would have preferred to see computing under the umbrella of mathematics, with a bigger emphasis on software engineering. This would have dovetailed with Aaron’s educational ideas. Water under the bridge, indeed. |
Steffen Huber (91) 1953 posts |
What you are describing sounds like a “teacher problem”. None of the bad things that happen come inherently from being forced to use Windows, or specific software running on Windows, as a teaching tool; they come from how those tools are used and explained in actual teaching. I also feel that the vast majority of pupils don’t have the capability to abstract from the “specific tool used” to the underlying “general problem to be solved with any similar tool”, and that requiring all pupils to understand this abstraction is not necessary at all. In math, you also start with simple specific instances of general theory. Same for physics – you don’t start with quantum mechanics and relativity theory. I don’t see a reason to handle IT topics differently. Back in the 80s when I was at school, when IT was an optional “do it if you like it” topic, I had a great teacher who gave us self-developed software modelling a CPU with microcode programming, to introduce us to what happens inside a CPU, i.e. the level below assembly. It was a great experience. But it was also useless knowledge for 99% of the pupils, no matter what they went on to do later in life. |
Clive Semmens (2335) 3276 posts |
It surely is a “teacher problem” – or more particularly, a teacher training problem. But it’s also a curriculum problem, and a school management problem. See http://clive.semmens.org.uk/Education/CheerioClive.html – perhaps especially the postscript. “Computer Studies” could be a great specialist subject, of the kind you describe in your last paragraph, Steffen – and was pretty much what my off-syllabus work was. The actual computer studies syllabus was “how to use specific standard packages” – most of which were completely obsolete long before the pupils entered the workplace. Even general computer use skills would be far better absorbed in the routine use of computers in other subjects, rather than being a subject in itself. |
GavinWraith (26) 1563 posts |
While the academic politics were going on Joe and I were also involved in evening education classes, showing folk how to make their own websites. Our students were sometimes as young as twelve (and as old as eighty). We gave prizes for the best sites. This enterprise was huge fun. HTML was a lot simpler in those early days. |
Clive Semmens (2335) 3276 posts |
Way back in 1980-81 I was teaching evening classes at the local tech college – my pupils there ranged from early teens to about 80. Exidy Sorcerers running CP/M at first, then we got a second room full of BBCs! Those classes were fun – some kids using the User port and electronics kits to make working models, and one elderly could-have-been-retired-but-wasn’t shopkeeper writing his own accounts program. |
Rick Murray (539) 13840 posts |
This. So very much this. My third form teacher was a CS teacher. He was a total nerd, liked to wield “Harry the Hammer” at unruly students, and a lot of the basics of programming I learnt from him. My fourth and fifth form teacher (his replacement after he moved on) was barely capable as an IT teacher. IT was a pile of shit. It’s “this is what a database is and how it is used to keep track of inventory in a shop”. Which was pretty awful considering that the term before I had written a database – one that could index data on disc, so it could cater for more than would fit into the meagre resources of a Beeb. Yes, dude, I know what a ****ing database is, I wrote one. And so it continued in much the same vein. He kept freaking out the FileStore because, the stupid arse, he tended to end the lesson by walking to the fusebox above the classroom door and flicking the switch. Thankfully the FileStore wasn’t on that ring circuit, but it didn’t take long before the dreaded “Too many users” message popped up. It’s no random coincidence that I wrote the Econet stuff on my site: all that competence was gained dealing with my so-called teacher’s complete lack of it.
It was interesting at work during their brief excursion into using Linux. It became very clear who understood what a word processor or spreadsheet actually was: they adapted right away to the replacements (LibreOffice, I think it was). The rest? Trained on Word and Excel, so completely lost when the options and menus were slightly different.
I would contend that unless you’re good enough to get into university, school is actually designed to teach children not to think, but to “shut up, know your place, and follow orders”. Of course, this lack of critical thinking ability leads to results such as Brexit, because “Project Fear” is a lot simpler to digest than a lot of guff about the economy tanking and such. You’ll recall that the mantras (“Brexit means Brexit”) got repeated endlessly. This is how you get badly educated people to remember things. The remainers tried to explain, but they had no easily digested mantras, so their message was largely ignored by the majority. It’s much the same story across the pond with MAGA. The fact that dozens of court cases failed to find even one instance where the election results were wrongly called really ought to demonstrate that the Republicans are full of crap, but something like 75 million people still support the orange nutjob, as the judges that he helped install take a wrecking ball to people’s rights.
Couple that with the lack of built-in languages that can be used in the same way as BASIC (slowly changing with the likes of Python), but twenty-odd years without has made computers more of an appliance that “does stuff”.
I believe that interested people should learn to program on an old BBC; in that way they can get to understand how the machine actually works.
Let me ask you a question. Do you understand the process by which Windows boots? Do you understand how RISC OS boots? I would say “mostly” for the BBC MOS (my 6502 is about as rusty as the Titanic) and “pretty much” for RISC OS, if we gloss over some things like how FileCore actually does what it does. I know how the system is brought up; one rainy winter day I read my way through the boot code in RISC OS 2, and RISC OS 5 isn’t that different. ;-) That’s part of the problem right there. Those machines that are not a black box locked down with DRM (like any games console, though UEFI is bringing DRM to a PC near you) are so vastly complicated that I doubt there is anybody who completely understands the entire system. There’s just so much. Plus, from time to time I get people expressing surprise that I know assembler. Why, people ask, when any compiler can easily outperform any assembler programmer? Well, because I grew up when it was necessary, but even I hold the view that one shouldn’t use assembler unless it’s necessary. I’m sure I had plenty of now-deleted discussions with the resident troll over this. But, you know, if few people use assembler, who writes the next bootloader?
Not worth it. Two decades ago the smartphone didn’t exist. Now pretty much everybody carries a device capable of recording high definition video, playing similar video streamed from the other side of the planet, and looking up pretty much anything … so rather than worry about losing the ability to write software, you might want to consider the long-term effects of easy internet access meaning nobody has to bother to remember anything any more.
This too. I’m not just a firm believer in inflicting an eight bit machine on my non-existent daughter, I’d also have her sit down and work out what to do before writing a single line of code. By contrast, Mamie was planned. Granted, most of the planning was in my head, and I was making up a lot of the algorithms as I went, but the code is a lot cleaner and better written because I knew what the end result was before I started, and prototyped ideas in BASIC, so I had an understanding of what I was doing before I wrote anything for the program proper. It also made it much, much easier to plug in things like doors and spiders (suggestions from my testers). That’s not to say I got everything right, but it’s pretty good given that I had no idea what I was doing. ;-) Anyway, yes. One shouldn’t just sit down and write code and hope it turns out right… |
Clive Semmens (2335) 3276 posts |
This, doubled and redoubled in No Trumps. Even if you are good enough to get into university, it’s much the same there – other than at good universities, or good departments at others. http://clive.semmens.org.uk/Education/TeachingToThink.html#IM (One particular conversation I had with I.M. came to mind when I read Cox’s quote. We talked about how school, so far from teaching children how to think for themselves, tends to teach children to regurgitate other people’s thoughts instead. This is definitely a weakness in the education system – and one which is getting worse due to interference in the education system by arrogant, ignorant politicians. (And the likes of Christopher Woodhead.)) |
GavinWraith (26) 1563 posts |
You cannot get inside a student’s brain and think for them. The most that a teacher can do is provide opportunities. I think this is what the ancient Greek adage about virtue being unteachable is about (except that virtue is not the right translation). I fear that the teaching-as-fact-injection error is too widespread among politicians and educational administrators.
That is right. But I remember Joe Taylor teaching large audiences by apparently coding off the cuff. Of course, he had put in a huge amount of forethought, so he was presenting an illusion. I noticed that there was a vast difference in teaching methods between mathematicians and computer-scientists. Mathematicians liked to use blackboards and chalk. Computer-scientists used the overhead projector, showing pages of bullet-pointed summaries, often whipped away before anybody had time to take notes. New technology has made both styles obsolete. The linearity of time is an obstacle to the preparation of teaching-material. Arguments are like trees – each proposition may require many pre-requisite propositions – and so, like code, have to be flattened. I always found this difficult. There are so many ways to flatten a tree into a linear order. There is material here for a science-fiction writer: a species that can think and communicate along many channels simultaneously. |
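A minimal BBC BASIC sketch of that flattening, for the curious – the five-proposition tree here is invented purely for illustration. A depth-first walk prints each proposition only after its prerequisites, giving one of the many valid linear orders:

    REM Hypothetical argument: proposition 0 rests on 1 and 2,
    REM and proposition 1 rests on 3 and 4 (-1 means no child).
    DIM left%(4), right%(4)
    left%()  = 1, 3, -1, -1, -1
    right%() = 2, 4, -1, -1, -1
    PROCflatten(0)
    PRINT
    END

    REM Depth-first: emit prerequisites before the proposition itself.
    DEF PROCflatten(n%)
    IF left%(n%) >= 0 THEN PROCflatten(left%(n%))
    IF right%(n%) >= 0 THEN PROCflatten(right%(n%))
    PRINT ;n%;" ";
    ENDPROC

This run prints 3 4 1 2 0, but 4 3 2 1 0 and several other orders would serve equally well – which is exactly the difficulty.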
Clive Semmens (2335) 3276 posts |
This, absolutely – more or less what my link says somewhat less concisely!
There is a place for coding off the cuff – in the early stages of learning to code. Partly simply a lesson in how not to do it, but actually quite a bit more than that. One can learn a lot by making mistakes – more than just “this isn’t how you do it.” |
Paul Sprangers (346) 524 posts |
When it comes to coding, a teacher could also emphasise the beauty of a routine or an algorithm. I think that it might help pupils or students if this aspect got some attention too. On the other hand, when I tried to explain my enthusiasm about that integer solution to my nearest and dearest, I only saw question marks in their eyes. But perhaps I’m not a good teacher. |
Steffen Huber (91) 1953 posts |
But only to continue to emphasise what a real software engineer should aim for, and that the real beauty lies in simplicity, changeability and understandability – maintainability. Coding (for practical things, not for personal dabbling and enjoyment, of course) is not art, it is craftsmanship. And reusing things that have already been done “the right way” by other good coders. And being able to look at code and judge whether it was written by good or bad coders, of course… |
Rick Murray (539) 13840 posts |
In the early stages when you don’t know better, and when you have enough experience that you don’t need to sit and think through what needs done, at least not for small programs. My ChooseBD (recent blog) was written that way as I knew what needed done, just had to turn the idea into code. |
Clive Semmens (2335) 3276 posts |
That’s an interesting one. What’s good and what’s bad code is evidently sometimes in the eye of the beholder: for example, I generally reckon my code is pretty bad, but others (with a better coding pedigree than mine) have disagreed.
Actually, although I almost never commit my ideas to keyboard (or paper) before starting to code, I certainly wouldn’t start to code until I’d got a very clear idea of exactly what I was trying to do, and how to go about doing it, sorted out in my head. |
Rick Murray (539) 13840 posts |
This, this, and this again. Like the recent snippet of Paint that was posted. Clever, yes. Concise, yes. Horrid, very.
Where does the difference lie? Creating art is craftsmanship.
Beauty is in the eye of the beholder. I prefer that which is clear and obvious and uses plenty of brackets so one doesn’t have to remember obscure rules of precedence.
Surely you only need to subtract INT(num) from num to see if you’re left with more or less than 0.5, at which point it seems logical to just add 0.5 to the number and INT that?
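A minimal BBC BASIC sketch of that shortcut, with arbitrary test values: INT() always rounds down, so adding 0.5 first turns the floor into round-to-nearest, with exact halves rounding up.

    PRINT FNround(2.4)  :REM prints 2
    PRINT FNround(2.5)  :REM prints 3
    PRINT FNround(-2.6) :REM prints -3
    END

    REM INT() floors, so this rounds to the nearest integer.
    DEF FNround(num) = INT(num + 0.5)

No comparison against 0.5 is needed at all; the addition does the work.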
Ah, but did they have a reason to want to know at the moment you were explaining?
Best way for working through a problem.
I think anybody who reduces a topic down to a PowerPoint presentation doesn’t deserve to be called a scientist. That style of presentation is only useful for allowing manglement to bamboozle other manglement. It’s an expensive waste of time, but at least it keeps them away from the people who actually get work done (unless they’re roped in as the unwilling audience).
We have them. They used to be called “women”, but apparently that word means something else these days. 🤦 |
Clive Semmens (2335) 3276 posts |
Oh, I don’t know. You could call them Mewing Mews, for example. |
GavinWraith (26) 1563 posts |
Nice one. But even for women, ears and mouths are serial devices. I do not know who first formulated the dictum that consciousness is a virtual serial machine built on the highly parallel device which is our body. Maybe there are reasons (logical? physical? evolutionary?) why that has to be so, but I do not know them, which is why I think that the idea of a highly parallel consciousness is worth some speculative thought. The art versus craft dichotomy is only a recent nonsense. The beauty of algorithms or mathematical ideas is for me a most important aspect. Though I think it is a notion that is hard to pin down. Economy and surprise have roles to play in it. |
Clive Semmens (2335) 3276 posts |
True, but they are separate serial devices, and some people (not only women, but perhaps more often women) seem to be able to operate them simultaneously and independently. The eyes and fingers (perhaps on a keyboard) are also separate serial devices. In principle the fingers are also multiple separate serial devices, and for some purposes that can actually work. |
Steve Pampling (1551) 8170 posts |
Time sliced, rather than parallel processing. Always seemed to burn calories at a high rate and produce a headache quite often. |
Paul Sprangers (346) 524 posts |
Absolutely, but at that time I was surprised that it could be written in such a concise way. It struck me as beauty. “I only saw question marks in their eyes” – the nail on the head. That was exactly the problem, as it still is, embarrassingly often even. |
Clive Semmens (2335) 3276 posts |
I’m not sure about that – at least, while I think you’re right that the conscious processes are time sliced, subconscious processes can make the actual operation of the “peripherals” (ears, mouth, eyes and fingers) truly parallel. |