University classes are great. They force you out of your comfort zones. When I was self-taught, I would never have pushed through learning the socket API in C, doing so many projects in bash, or studying the academic side of distributed systems, data structures, and common algorithms. Stuff like that.
I interview a lot of self-taught people, or boot camp graduates, and their issue is often that they pigeonhole themselves into a comfort zone, or they fall apart when you ask them about academic topics that are relevant for the job.
On the other hand, people who never taught themselves anything code related often suck at coding, or they've forgotten a lot of what they learned in college. Hell, for some of them, even while still in college they've forgotten a lot of what they were taught the years prior.
It's best to have done some coding by yourself before university, so that you have faced the problems that arise naturally, and when the courses present you with clever solutions to them, you retain them. You don't just dismiss them as fancy theoretical stuff you need to know for the exam, then promptly forget. You've footgunned yourself with memory management enough times that it speaks to you when someone explains RAII.
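To make that concrete, here's a minimal sketch (a made-up file-reading example, not anything from the original post): the manual version has to remember the cleanup on every exit path, while the RAII version ties the cleanup to scope, so the footgun mostly disappears.

    #include <cstdio>
    #include <memory>

    // Manual cleanup: every early return is a chance to leak the FILE*.
    void print_first_line_manual(const char* path) {
        std::FILE* f = std::fopen(path, "r");
        if (!f) return;
        char buf[256];
        if (!std::fgets(buf, sizeof buf, f)) {
            std::fclose(f);          // easy to forget on one of the exit paths
            return;
        }
        std::printf("%s", buf);
        std::fclose(f);              // ...and again here
    }

    // RAII: the deleter closes the file whenever `f` goes out of scope,
    // no matter which return path (or exception) gets us there.
    struct FileCloser {
        void operator()(std::FILE* f) const { if (f) std::fclose(f); }
    };

    void print_first_line_raii(const char* path) {
        std::unique_ptr<std::FILE, FileCloser> f(std::fopen(path, "r"));
        if (!f) return;
        char buf[256];
        if (std::fgets(buf, sizeof buf, f.get()))
            std::printf("%s", buf);
    }

The point isn't this specific idiom; it's that the pain of writing the first version a few dozen times is exactly what makes the second one click when a course finally shows it to you.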
I wouldn't group boot camp graduates and self-taught people together. I'm confident there are skilled people coming out of bootcamps, but the people I know personally saw it as a cheaper shortcut into the field because they couldn't teach themselves and would have otherwise gone to a university or chosen a different field.
Coding bootcamps weren't really around when I started, but I avoided online courses and traditional learning methods. I would have also avoided bootcamps for the same reasons. I wanted to create and solve problems that were exciting, rather than work through a textbook and take tests.
I'm self-taught and learned C in my early teens because I really wanted to do something that I couldn't find any code or preexisting solutions for, and I knew C was really the best way (for me) to solve it. I didn't want to learn it but I wanted the cool thing more, so I struggled through forum browsing, reading documentation, and trial/error and successfully got what I wanted while gaining more skills that led to where I am today.
The desire and drive to learn something matters more than the method, in my opinion.
By the time I got to university I already knew C and the socket APIs and had been paid for delivering software that uses them. I had a friend who made a lot of money selling games he wrote in high school for the C64. Both of us were well ahead of new grads in terms of raw programming skills.
What I was missing were things like Calculus, Linear Algebra, and Discrete Math (I knew parts, having read TAOCP, but had gaps). I knew data structures and algorithms but picked up some I didn't know. I had some other scattered theoretical exposure, but the CS program plugged holes and taught me things I hadn't encountered. I also learned more about how to learn.
I think the CS program made me more well rounded. I don't think it made me a better programmer. There was zero challenge for me in all the programming related courses. Where I had to work was on the math and theory side.
I don't think anybody denies that, but getting into and paying for a university is very much a financial and social class issue.
> When I was self taught I would never have pushed through learning the socket API in C, doing so many projects in bash
You speak only for yourself though. I'm largely self-taught and have done these things.
> I interview a lot of self taught people, or boot camp graduates
These are often two very different types of job candidates.
> they fall apart when you ask them about academic topics that are relevant for the job
Yes, I do tend to fall apart in audition-style job interviews. But I can solve the same problems when just left on my own, with nobody standing over my shoulder.
Your criticism of the specific details rings true, but I also liked the overall thrust of the GP, which is that two common failure modes for working software engineers are either being overly academic and not efficient at practical application of that knowledge, or being too superficial and direct about solving the immediate problem in front of them, without recognizing or even being aware of the theoretical knowledge and concepts that could greatly improve their local solutions.
I think it’s fair to say those failure modes tend to disproportionately affect university graduates and self-taught developers, respectively. As long as we don’t use it as some kind of litmus test, I don’t think it hurts to call that out.
I acknowledge that theory-oriented vs. problem-oriented would be a fair characterization. But I think the language of "comfort zones" and "pushing through" was unfortunate and unfair. It suggests that somehow self-taught developers are lazy, when in fact they often have to work harder than anyone else, because nothing is handed to them by a professor or university. (Not to mention that it can be a lot harder to get a job when you don't have any academic credentials.) I would say that teaching yourself a difficult, esoteric skill, with no outside help, is inherently breaking out of your comfort zone.
College being a huge expense is an Anglo-centric issue.
It's important for those concerned, but most people aren't, so I don't like to include it, because then the entire "value of college" debate shifts to the economics of it.
> You speak only for yourself though. I'm largely self-taught and have done these things.

> I'm a bit confused here. I was referring to the first paragraph in your original post, whereas you seem to be referring to the second paragraph?
My point is, I think, that I would wager you are not the norm among the exclusively self-taught crowd.
There are going to be a lot of people on Hacker News who will debate me on this, but I'm going to go out on a limb here and say: there's already a selection bias if you're hanging out here.
Programmers who have an issue with the academic parts of CS (self-taught or otherwise) probably wouldn't hang out on Hacker News to read content like "Writing a competitive BZip2 encoder in Ada from scratch in a few days (2024)".
It's hard being self-taught and overcoming the comfort zone; it's hard to go out of your way to figure out what you should learn when you don't have the luxury of being forced to follow a curriculum drawn up by experts in the field you're studying.
My thesis is that I disagree that "Self-Taught Engineers Often Outperform".
Formally trained engineers mostly outperform. A few self-taught people stand out, but they are the visible part of the iceberg; if you advise someone to go self-taught, most likely they'll end up underperforming compared to someone who went to university. And that's normal, because being self-taught is harder.
> I would wager you are not the norm among the exclusively self-taught crowd
What is the norm?
> the self-taught people I interview
That's another small and unrepresentative group, possibly much smaller than self-taught developers who visit HN. In total, how many self-taught people have you interviewed? Either way, there's selection bias.
> a few self-taught people that are going to stand out, but they are the visible part of the iceberg, and if you advise someone to go self-taught, most likely they'll end up underperforming compared to someone who's gone to university.
That's kind of the point, though. Who would advise someone to go self-taught? That would be strange advice. There's definitely survivorship bias in self-taught engineers who have managed to make it in the tech industry, which is exactly why you should pay attention to them: they've successfully overcome the odds and obstacles. The % of self-taught who get to that point is likely much smaller than the % of university-taught. As you say, "being self-taught is harder."
You are taking the average of two groups, but there is no iceberg: the self-taught people who make it are the only ones in that group...the other people do other things. It is like including the people who drop out of college in your group. As you say, self-taught is harder, so the people who get through it are going to end up knowing more that is useful.
People who teach at university aren't the experts in the field. The situation of university is inherently artificial, created to fulfill a wide range of objectives which are largely unrelated to utility for students (and certainly not utility for employers). For most subjects, the people who teach at university are going to be very far from the experts...if they were experts, they wouldn't be teaching.
> If you just want to ignore the United States, then fine, but in general, good luck trying to ignore the United States.
Sorry to disappoint you, but as a Canadian hiring mostly Canadians, no, I don't care how expensive college is in the US on a day-to-day basis. It really is just a US problem. Canadian universities are still expensive, but not remotely in the same ballpark as the US. You can often pay the tuition with a decent summer job.
Good luck supporting yourself as a full time student in Toronto or Vancouver if you don’t have family locally, or if your parents don’t have the money to help you out.
> I don't think anybody denies that, but getting into and paying for a university is very much a financial and social class issue.
I mean... Not really? I got a BS in Computer Science from a cheap, small university (plus a bunch of it at my local junior college, for even cheaper!), and the quality of the education was better than I've seen out of "excellent" schools. It was really cheap, too! Easily paid off after a few years at software engineer salaries.
Hell, with entry-level salaries at places like Google or Meta, you could probably pay the whole thing off in a year.
I think people focus far too heavily on "Ivy League" schools and the costs associated with them, and forget that things like junior colleges and small universities still exist, and are still relatively affordable.
With a "commodity" degree like CompSci, cost isn't really a problem.
Besides, no one gives a shit where you went to school after your first job in the field. That first job might be marginally harder to get, and you might have to settle for slightly lower pay, but you're going to be far from struggling with the debt unless you really overpaid for that degree.
I actually do deny that university classes are great. Many are actively harmful. When I went to university, the intro-level CS class was taught in C. It took me decades to unlearn.
The idea that having knowledge of C is harmful is possibly one of the most anti-intellectual things I've heard today. I guess you don't value education, so you wouldn't know, but your brain doesn't have some limit. Extra knowledge is at worst unused, but often helps in various subtle ways. Never does real knowledge "hurt."
The culture of CS departments at some universities before the tech boom was also deliberately antithetical to almost all of the things that people now mention as being great about university.
* Courses designed to fail out many students
* Courses designed to be extremely theoretical and impractical because teachers found them fun to teach
* Making the subject inaccessible to as many people as possible
When I went to uni in the UK, I didn't study CS (I'm now a senior dev at a large US tech company) because of the above: the subject had the highest fail rate, the most unpleasant faculty, and the highest rate of unemployment after graduation of any subject (this was a top-5 CS uni that only took people with top grades).
It is great if people got something from their experience, but this isn't how it goes for most people. And, from working with many people who have CS degrees, you do still see issues: poor communication skills, poor business understanding, significant trouble prioritizing work because of the previous two issues, etc. In other words, some can code (and even there, grads come out...not great), but a CS course is usually not a comprehensive preparation for work anyway.
I am not sure what represents a comfort zone more than the way most universities teach any subject.
This is gonna sound like an old-man-yells-at-clouds post, but...
I also remember university being great. I built a compiler, a toy OS and interfaced with GPS.
But a few years ago I was invited to teach at another university, and I was very disappointed. The curriculum was basically a modern bootcamp stretched over a few years with a lot of unrelated classes sprinkled in. (EDIT: I just checked, and it's lots of business, management, humanities, chemistry, environment, entrepreneurship... and one e-sports class?)
Almost no fundamentals except for an algorithms class; almost straight to React and a few backend frameworks that were popular with the startups in the area.
Were you teaching for a software engineering degree or a comp sci degree? The former is more focused on practical skills that are more “useful”, hence all those cross-discipline courses.
I’d argue that the core comp sci skill is more useful than learning how to use React or whatever. But I guess I’m another old man yelling at cloud.
It really depends on the university, and then also on the lecturer.
When I was self-taught, I learned the socket API in C because I needed it for what I was trying to do. This was long before college, and when I got to the latter, it turned out that the person teaching us networking only knew the absolute basics of the API, and I could already run circles around them, so it was the opposite of "out of my comfort zone" (which is "in the boring zone").
Totally agree that the sweet spot is a mix: struggle on your own first, hit walls, make a mess... then when the structured material shows up, it actually means something.
Boot-camp grads are not self-taught; they went to a boot camp. Boot-camp people are career opportunists. Nothing wrong with career opportunists, just saying that they are different from a self-taught dev.