
Computer science courses that don't exist, but should (2015)

penguin_booze

CSCI 0001: Functional programming and type theory (taught in English [0])

For decades, the academia mafia, through impenetrable jargon and intimidating equations, has successfully prevented the masses from adopting this beautiful paradigm of computation. That changes now. Join us to learn why monads really are monoids in the category of endofunctors (oh my! sorry about that).

[0] Insert your favourite natural language

corysama

> CSCI 3300: Classical Software Studies

Alan Kay, my favorite curmudgeon, spent decades trying to remind us that we keep reinventing concepts that were worked out in the late 70s, and he's been disappointed to watch us run in circles ever since. He remains disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.

synack

When I was at RIT (2006ish?) there was an elective History of Computing course that started with the abacus and worked up to mainframes and networking. I think the professor retired years ago, but the course notes are still online.

https://www.cs.rit.edu/~swm/history/index.html

_carbyau_

I recall seeing a project on github with a comment:

Q: "Sooo... what does this do that Ansible doesn't?"

A: "I've never heard of Ansible until now."

Lots of people think they are the first to come across some concept or need. Like every generation that listens to songs with references to drugs and sex and assumes theirs was the first.

rightbyte

Come on, music is very much cross-generational.

I think software engineering has social problems at a level that other fields just don't. Dogmatism, superstition, toxicity... you name it.

PaulDavisThe1st

The history of art or philosophy spans millennia.

The effective history of computing spans a lifetime or three.

There's no sense comparing the two. In the year 2500 it might make sense to be disappointed that people don't compare current computational practices with things done in 2100 or even 1970, but right now, to call what we have "history" does a disservice to the broad meaning of that term.

Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: cpu speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution, input device characteristics, sensory modalities accessible via digital-to-analog conversion, ...). To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.

xmprt

That gives even more reason to study the history of CS. Even artists study contemporary art from the last few decades.

Given the pace of CS (like you mentioned), 50 years might as well be centuries, so early computing devices and solutions are worth studying to understand how the technology has evolved, what lessons we can learn, and what we can discard.

teiferer

> The effective history of computing spans a lifetime or three.

That argument actually strengthens the original point: Even though it's been that short, youngsters often still don't have a clue.

degamad

> To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.

True they might not all be "dramatic lessons" for us, but to ignore them and assume that they hold no lessons for us is also a tragic waste of resources and hard-won knowledge.

citizenpaul

It's because CS is not cared about as a true science for the most part. Nearly all of the field is focused on consolidating power and money dynamics. No one cares to make a comprehensive history since it might give your competitors an edge.

jancsika

If Alan Kay doesn't respond directly to this comment, what is Hacker News even for? :)

You're not wrong about history, but that only strengthens Kay's case. E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025. (Note the modeless GUI interaction.) Well, guess what? Such a thing definitely doesn't exist. And that can only point to programmers having a general lack of knowledge about their incredibly short history.

(In reality my hope is some slice of devs have achieved this and I've summoned links to their projects by claiming the opposite on the internet.)

Edit: just so I get the right incantation for summoning links-- I'm talking about the whole enchilada of a visual language that runs and rebuilds the user's flowchart program as the user modelessly edits it.

1: https://www.youtube.com/watch?v=QQhVQ1UG6aM

corysama

https://news.ycombinator.com/user?id=alankay has not been active on Hacker News for several years now.

At 85 he has earned the peace of staying away from anything and everything on the internet.

wredcoll

> E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025

Why? What problem did it solve that we're suffering from in 2025?

degamad

> art [has] very limited or zero dependence on a material substrate

This seems to fundamentally underestimate the nature of most artforms.

andrewflnr

Your first sentences already suggest one comparison between the histories of computing and philosophy: the history of computing ought to be much easier. Most of it is still in living memory. Yet somehow, the philosophy people manage it while we computing people rarely bother.

markus_zhang

I always think there is great value in having a whole range of "history of X" courses.

I once thought about a series of PHYS classes that would focus on historical ideas and experiments. Students would replicate the experiments, and they'd have to read the original book chapters and papers.

chongli

> Art and philosophy have very limited or zero dependence on a material substrate

Not true for either. For centuries it was very expensive to paint with blue due to the cost of blue pigments (which were essentially crushed gemstones).

Philosophy has advanced considerably since the time of Plato and much of what it studies today is dependent on science and technology. Good luck studying philosophy of quantum mechanics back in the Greek city state era!

Exoristos

> The history of art or philosophy spans millennia.

And yet for the most part, philosophy and the humanities' seminal development took place within about a generation, viz., Plato and Aristotle.

> Computation has overwhelming dependence on the performance of its physical substrate [...].

Computation theory does not.

pathseeker

Programmers are introduced to CS history; it's just mostly irrelevant or poorly contextualized, so there are no useful lessons.

motorest

> He’s still disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.

Software development/programming is a field where the importance of planning and design lies somewhere between ignored and outright despised. The role of software architect is both ridiculed and vilified, whereas the role of the brave solo developer is elevated to the status of hero.

What you get from that cultural mix is a community that values ad-hoc solutions made up on the spot by inexperienced programmers who managed to get something up and running, and at the same time is hostile towards those who take the time to learn from history and evaluate tradeoffs.

See for example the cliche of clueless developers attacking even the most basic aspects of software architecture such as the existence of design patterns.

With that sort of community, how does anyone expect to build respect for prior work?

ponco

I reflect on university, and one of the most interesting projects I did was an "essay on the history of <operating system of your choice>" as part of an OS course. I chose OS X (Snow Leopard), and digging into the history gave me fantastic insights into software development, Unix, and software commercialisation. I echo your Mr Kay's sentiments entirely.

musebox35

Sadly, this naturally happens in any field that ends up expanding due to its success: suddenly the new practitioners outnumber the competent educators. I think it is a fundamental human-resources problem with no easy fix. Maybe LLMs will help with this, but they seem to reinforce convergence to the mean in many cases, as those being educated are not in a position to ask the deeper questions.

fsckboy

>Alan Kay, my favorite curmudgeon, spent decades trying to remind us

Alan Kay giving the same (unique, his own, not a bad) speech at every conference for 50 years is not Alan Kay being a curmudgeon

>we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since.

it's Alan Kay running in circles

cranium

I'd add classes about:

  - telling clients that the proof-of-concept is inconclusive, so it's either bag it or try something different
  - spending innovation tokens on something other than a new frontend framework and/or backend language
  - understanding that project management methods are tools (not rites), and if your daily standup is 45 minutes then there's a problem

ByteDrifter

Many computer science programs today have basically turned into coding trade schools. Students can use frameworks, but they don’t understand why languages are designed the way they are, or how systems evolved over time. It’s important to remember that computing is also a field of ideas and thought, not just implementation.

kelseyfrog

CSCI 4810: The Refusal Lab

Simulate increasingly unethical product requests and deadlines. The only way to pass is to refuse and justify your refusal with professional standards.

LPisGood

CSCI 4812: The Career Lab

Watch as peers increasingly overpromise and accept unethical product requests and deadlines and leave you high, mighty, and picking up the scraps as they move on to the next thing.

andrewflnr

Conventional university degrees already contain practical examples of the principles of both these courses in the form of cheating and the social dynamics around it.

jayd16

Actually a lot of degrees have a relevant ethics class.

kelseyfrog

I don't disagree that such classes exist, but I have yet to see an ethics lab specifically with the intention of practicing refusal.

tdeck

But universities want their graduates to be employable, is the thing.

d_silin

"CSCI 4020: Writing Fast Code in Slow Languages" does exist, at least in the book form. Teach algorithmic complexity theory in slowest possible language like VB or Ruby. Then demonstrate how O(N) in Ruby trumps O(N^2) in C++.

aDyslecticCrow

We had this as a lab in a learning systems course: converting Python loops into numpy vector manipulation (map/reduce), then into TensorFlow operations, and measuring the speed.

It gave a good idea of how Python is even remotely useful for AI.
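
Roughly what that lab looked like, as a sketch (the specific computation is my choice, not the course's): the same mean-squared error written once as a Python loop and once vectorized in numpy, timed side by side.

    import time
    import numpy as np

    def mse_loop(pred, target):
        # Pure-Python loop: the interpreter executes one iteration per element.
        total = 0.0
        for p, t in zip(pred, target):
            total += (p - t) ** 2
        return total / len(pred)

    def mse_numpy(pred, target):
        # Vectorized: the same loop runs in compiled C inside numpy.
        return float(np.mean((pred - target) ** 2))

    n = 1_000_000
    pred, target = np.random.rand(n), np.random.rand(n)

    t0 = time.perf_counter(); mse_loop(pred, target); t1 = time.perf_counter()
    mse_numpy(pred, target); t2 = time.perf_counter()
    print(f"loop {t1 - t0:.3f}s, numpy {t2 - t1:.4f}s")  # typically ~100x apart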

nine_k

One of my childhood books compared bubble sort implemented in FORTRAN running on a Cray-1 against quicksort implemented in BASIC running on a TRS-80.

The BASIC implementation started to outrun the supercomputer at some surprisingly pedestrian array sizes. I was properly impressed.
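
A back-of-envelope version of that crossover, with invented constants since I don't have the book's figures (both throughput numbers below are rough guesses, not measurements):

    import math

    CRAY_OPS_PER_SEC = 1e8    # guess: scalar FORTRAN on a Cray-1
    TRS80_OPS_PER_SEC = 1e3   # guess: interpreted BASIC on a TRS-80

    def bubble_on_cray(n):
        # Bubble sort costs ~n^2 operations.
        return n * n / CRAY_OPS_PER_SEC

    def quicksort_on_trs80(n):
        # Quicksort costs ~n * log2(n) operations.
        return n * math.log2(n) / TRS80_OPS_PER_SEC

    for n in (10_000, 100_000, 1_000_000, 10_000_000):
        print(f"n={n}: Cray bubble {bubble_on_cray(n):,.0f}s, "
              f"TRS-80 quicksort {quicksort_on_trs80(n):,.0f}s")

With these guessed constants the TRS-80 pulls ahead around n ≈ 2 million; the book's constants evidently put the crossover much lower, but the shape of the argument is the same: the n^2 / (n log n) ratio eventually swallows any fixed hardware advantage.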

LPisGood

Python has come a long way. It's never gonna win for something like high-frequency trading, but it will be super competitive in areas you wouldn't expect.

liqilin1567

Optimizing at the algorithmic and architectural level rather than relying on language speed

omosubi

I would add debugging as a course. Learning how to dive deep into the root cause of defects, and the various tools for doing so, would have been enormously helpful for me. Perhaps this already exists.

9dev

Yes please. Even senior engineers apply with debugging abilities limited to sprinkling print-and-exit statements over the code.

Do you have a moment to talk about our saviour, Lord interactive debugging?
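
A minimal sketch of what the interactive version buys you (the function and data here are made up for illustration): Python's built-in breakpoint() drops you into pdb at the suspicious moment, where you can inspect state, step, and continue, instead of editing in a print, re-running, and repeating.

    def settle(balances, payments):
        for account, amount in payments:
            if account not in balances:
                # Pauses execution here and opens pdb:
                #   `p account` prints the offending key,
                #   `p balances` prints the current state,
                #   `n` steps to the next line, `c` continues
                #   (and reproduces the KeyError you'd otherwise be printf-hunting).
                breakpoint()
            balances[account] -= amount
        return balances

    settle({"alice": 100}, [("alice", 30), ("bob", 10)])  # hits the breakpoint on "bob"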

tomhow

Previously:

Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=16127508 - Jan 2018 (4 comments)

Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=13424320 - Jan 2017 (1 comment)

Computer science courses that don't exist, but should - https://news.ycombinator.com/item?id=10201611 - Sept 2015 (247 comments)

ChicagoDave

I could add so much to this page.

- COBOL on punch cards

- RPG II/III

- PDP/Vax Shared Memory Modules

- Hierarchical Data File Storage

- Recursive Expression Evaluator

- Batch Processing large datasets

- Compressed dates and numbers

So many of these teach you structural complexity that you’d never learn in today’s world.

epalm

Where’s the course on managing your reaction when the client starts moving the goal posts on a project that you didn’t specify well enough (or at all), because you’re a young eager developer without any scars yet?

ekidd

Back in the 90s, this was actually a sneaky part of Dartmouth's CS23 Software Engineering course. At least half your grade came from a 5-person group software project which took half a semester. The groups were chosen for you, of course.

The professors had a habit of sending out an email one week before the due date (right before finals week) which contained several updates to the spec.

It was a surprisingly effective course.

(Dartmouth also followed this up with a theory course that often required writing about 10 pages of proofs per week. I guess they wanted a balance of practice and theory, which isn't the worst way to teach CS.)

collingreen

In uni we had a semester-long group project with the stronger coders as project leaders. Group leaders controlled pass/fail for the group members and vice versa. After the leaders put together project plans for their teams and everyone was supposed to start working, WHOOPSIE, reorg: all the leaders got put on random teams and new people got "promoted" into leader (of groups they didn't create the plan for). If the project didn't work at the end, the entire group failed.

I've never been taught anything more clearly than the lessons from that class.

markus_zhang

Shit, I was thinking about exactly the same thing: a professor deliberately changing requirements in the last week to mess with the students and give them a bit of a taste of real work.

Glad that someone actually did this.

hotstickyballs

That's the job of a (good) manager.

dfex

some scars need to be earned

h4ck_th3_pl4n3t

It's called test driven development.

qwertytyyuu

Functional programming exists in any reputable computer science course. The standard is Haskell. For a true "unlearning" it might need to be a third- or fourth-year subject.

EigenLord

I wish more comp sci curricula would sprinkle in general courses on logic and especially 20th-century analytic philosophy. Analytic philosophy is insanely relevant to many computer science topics, especially AI.