I have a group of pretty good students. They are very nice and likable and bright…all good things.
Most of them went from “I don’t have the faintest idea what I’m doing” to “here’s a pretty full-featured first version” in three weeks.
It wasn’t totally unexpected given their progress last week, but it was still a fun surprise.
Non-linear progress is one of the frustrations and pleasures of teaching.
Making a new course is a ton of work. This is true no matter what, but it is harder in this day and age where you have to produce a ton of detailed material. When I was a philosophy instructor, there were classes where I had to pick texts, think about what I was going to discuss, and design a midterm, a final, and two paper topics. In principle, I didn’t need to work out my lectures in detail in advance. (I mean, I experimented with such, including writing them out verbatim.) I can easily give very coherent, extemporaneous lectures (I do it all the time…indeed, a visitor once asked me if what they’d just seen was a prepared lecture). I’d guess most academics can do this, though we’d prefer not to.
I like a loose structure when I’m hoping for lots of interaction. Well, when I <i>expect</i> lots of interaction (I’m always hoping).
These days I have to produce slides. Lots of slides. Assignments need a metric ton of vetting plus rubrics, submission procedures, etc.
All things being equal, if we’re going to redo a course, we’d like it to last for a while. Major reworks get us 8 hrs of duties credit per contact hour. That’s a <i>lot</i>.
The other dirty secret is that while I think the class gets better…it’s not always clear.
BUT if I can improve a course even for a year, the temptation is to go for it. After all, for <i>those students</i> it’s the only version of the course they’ll get.
I’m thinking about how to get more efficient at major changes.
It doesn’t have to be AI that does it. Forrest Brazeal writes:
No, the real trend to watch here is not that the cloud providers are making it easier for non-technical people to code (although they are), but that they are straight-up reducing the number of people required to deliver technical solutions.
I’ve been saying for awhile now that we’re getting close to a crisis point in the IT world. The mid-tier IT worker is in imminent danger of being automated out of existence, and just like with the vanished factory jobs of the last 30 years, nobody wants to admit it’s happening until it’s too late.
The IT automation apocalypse will move slowly (by tech standards), so it is flying under the radar. Unlike with the collapse of American manufacturing, we won’t get breathless feature articles and political posturing. A shuttered factory and 700 unemployed workers are concrete, easily visible; a decaying Rust Belt town makes an arresting photo spread. But how do you build a narrative around midlevel IT engineers let go in twos and threes from jobs that even they probably can’t quite describe?
Moreover, the first people to feel the pain will not be the highly-paid, conference-trotting Very Important Programmers in job-rich tech hubs. They will be anonymous Windows administrators and point-and-click DBAs and “senior application developers” who munge JSON in C#. Normal people making comfortable money, fifty to eighty thousand dollars a year in ordinary places like Omaha and Memphis and Santa Fe.
This will be the new outsourcing. Consider stuff like email, mailing lists, and simple websites. These are a hell of a lot easier to run these days than they were even 10 years ago. The marginal effort for, say, Google to run even a large organisation’s email is very small; a lot of the time it isn’t even an extra job.
These are good and skilled white collar jobs that will likely evaporate. And far sooner than robot lawyers take over.
Maybe babysitting predictive models will be the new back end dev task. Maybe. Students are certainly flocking to machine learning courses.
CS departments need to think hard about this trend and what we can do for our students.
I don’t teach methodology per se in my software engineering class, at least not the explicit sequence of waterfall, iterative, and agile. I do discuss a bit about sequencing activities, the notion of a wicked problem, etc. We, in fact, iterate a piece of software.
But as with the idea of a software crisis, I’m worried about our standard narratives. This blog post points to discussions of how waterfall was ever a thing and how iterative methods were proposed very early on.
I suspect that some of our hoary old narratives are just things we say. I’m not sure software is all that different from buildings or that other sorts of engineering have all that easy a time.
I’m strongly considering changing my reading material for my software engineering class. I like Code Complete in a lot of ways, but it really does feel a bit old and a lot of bits are not super well organised or presented. And it’s big without being super nicely modularised. It’s not really a textbook. I’m planning a pretty significant reworking of the course (to consolidate some stuff) so this is the time to change.
I was looking at a text that has some good reviews and decent presence on Open Syllabus and a metric ton of supporting material. I’ve never used supporting material but one can see the attraction!
I’m skimming a copy, starting with the intro. The intro of a text tends to be a really weak bit, especially if it’s didactic instead of tutorialesque, so I forgave the cutesy intro dialogue. There were some helpful fake graphs about characteristic error rates which seemed fun. Then I hit the following description in a list of “kinds of software”:
Artificial intelligence software—makes use of nonnumerical algorithms to solve complex problems that are not amenable to computation or straightforward analysis. Applications within this area include robotics, expert systems, pattern recognition (image and voice), artificial neural networks, theorem proving, and game playing.
Say what? “Nonnumerical algorithms”?!?!?! Right before talking about artificial neural networks??!?! Maaaybe there’s a specialised enough variant of “numerical algorithm” (written primarily in Fortran?!?) where this is technically not wildly false, but it sure as hell is misleading here (given the standard distinction between symbolic and non-symbolic AI). Seriously bonkers.
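To make my incredulity concrete: a neural network is numerical computation all the way down. Here’s a toy sketch of a single sigmoid neuron (the weights and inputs are made up for illustration):

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum plus a sigmoid squash.
    Nothing here but floating-point arithmetic."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation, output in (0, 1)

out = neuron([0.5, -1.2], [0.8, 0.3], 0.1)
```

Training one is even more numerical: gradient descent over real-valued parameters. Calling this “nonnumerical” is hard to square with any reading.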
But…ok. Does one extremely boneheaded bit of a sort of throwaway warrant tossing the whole thing? Maybe? I want to be fair. I can always guard against this in lecture…I guess. Then I hit:
- Software has become deeply embedded in virtually every aspect of our lives, and as a consequence, the number of people who have an interest in the features and functions provided by a specific application has grown dramatically. When a new application or embedded system is to be built, many voices must be heard. And it sometimes seems that each of them has a slightly different idea of what software features and functions should be delivered. It follows that a concerted effort should be made to understand the problem before a software solution is developed.
Oy. I mean, the dude has chapters on iterative and agile processes, so there’s some course correction. But. Come. ON!
Ok, now I dump it from the list.
I have a buncha tabs about SQL and query optimizations. So a bit of clean up:
Lots more tabs to be cleaned.
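For a flavour of what those query-optimisation tabs are about: the planner’s choice between a full scan and an index lookup is the bread-and-butter case. A minimal sketch using Python’s built-in sqlite3 (the table and index names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")

# Without an index, a filter on `customer` forces a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = ?", ("alice",)
).fetchall()[0][-1]

conn.execute("CREATE INDEX idx_customer ON orders (customer)")

# With the index in place, the planner switches to an index search.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = ?", ("alice",)
).fetchall()[0][-1]

print(plan_before)
print(plan_after)
```

The exact wording of the plan strings varies by SQLite version, but the scan-to-search shift is the point.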
I went a whole week without blogging. BOO! I was commenting a bit more on LGM but really illness and class knocked me out. And then it snowballed.
I’ll try to post two a day for this week. But it might be slapdash.
My anxiety was brutal last week as I had to prepare a new class with material I inherited that was challenging to interpret, much less prep.
Plus figure out how the changes affected the coursework. Yeek.
Per usual, it worked out. Per usual, it cost me an all nighter.
The lecture itself felt really really good. I’m working on whole class participation and I think it was working ok. I did a retrospective of the prior week first as outline and then again as slide snippets. I was able to emphasize how they should be pulling things to get in their heads. Then we applied it to the new material.
I came out tired and happy. And with a brutal sore throat. I also managed to spill not-so-hot tea over me and the desk. Yay.
Four years ago, I made a push to change how we teach software engineering at the MSc level. I had ambitious plans about how to change the whole sequence, but I was going to start by taking over the first class in the sequence.
The first year was super tough as the person who had been teaching it took medical retirement (sadly). My early ideas just weren’t workable given me and the cohort.
I completely revamped it, especially the coursework, and have edged closer and closer to something with some real innovation. It needs another overhaul, but this year went pretty smoothly (still some coursework marking to go).
I won’t say it’s best of breed because I don’t have a lot of comparisons. But it seems good and rather interesting. One class out of four isn’t enough to be transformative, but it’s a start!
Plus I taught the whole day with a unicorn horn on my head. Good times!
I thought it was going to be an easier week. Buuut…MSc project grading (done!), three exams to write (whoa, those are hugely not done), class prep (Not Done sob), and a programming contest (somewhat advanced)….it’s a bit much.
Sometimes, I can harness this pressure, but usually closer to the wire. Which I want to avoid!
Oh well. Next week is the last week of period 1. Now, I have a class in period 2 to panic over. Yay.