This post is largely about discrimination and uses Brian Acton as a timely centerpiece. The truth, however, is more complex (IMHO):
- Culture matters a lot in hiring. By this I mean a Stanford grad is much more likely to hire another Stanford grad in a field of qualified candidates. This isn't simply a form of nepotism as such. Two grads from the same college will share a larger common cultural base;
- The culture the founders bring shapes the organization. Many startups are started by Stanford and MIT grads. It shouldn't surprise you that this biases the makeup of their workforces and what they look for;
- Speaking as someone who has interviewed candidates in programming, there are many frauds. I don't even necessarily mean deliberate frauds, but there are clearly people employed as programmers/engineers who have no business being such. It's astounding how you can stump someone with 5-10 years of experience by asking them to code a simple loop (seriously);
- If people are worried about foreign labour putting downward pressure on wages, the indentured servitude that is the US work visa and immigration system should be the target of your anger. It allows bodyshops to hire people from, say, China and India and pay them a pittance because they know those people can't leave for 8+ years if they ever want a green card.
Make a green card automatic after, say, 6 years on an H1B, even if you change jobs, and a lot of those problems would go away.
As for other fields, I can't speak to those, other than anecdotally a lot of fields seem to have the earning potential of being a waiter in Manhattan.
"Cultural fit" is just another name for racism, sexism, and ageism. Yeah, all of those things can be culture. Why should it be any less wrong to discriminate based on college attendance than on race?
What else is cultural fit? "Sorry, you don't fit our culture of being heterosexual here."
We're supposed to be adults. We're supposed to know how to get along and play nicely with others, regardless of how different we are from each other. This "cultural fit" line is bullshit apologia for real discrimination.
Yeah, with respect to cletus, what he stated aren't reasons, they're excuses. Let's do a little bit of search and replace and see how these statements read:
- Gender matters a lot in hiring. By this I mean a woman is much more likely to hire another woman in a field of qualified candidates. This isn't simply a form of nepotism as such. Two people from the same gender will share a larger common cultural base;
- The race the founders bring shapes the organization. Many startups are started by whites. It shouldn't surprise you that this biases the makeup of their workforces and what they look for;
We wouldn't accept either of the above as a valid excuse for poor hiring practices, even though both statements are true. In modern society we acknowledge that it's all too easy to hire people like oneself, deliberately or otherwise - this is why we go out of our way to minimize the effects of these biases in our hiring.
The tech industry, on the other hand, seems to celebrate it to the point where "cultural fit" is the core of hiring practice. A lot of the ridiculous Silicon Valley stereotypes come from the immense monoculture it runs on.
I think that's invalid. If you find someone who's a great programmer but obviously won't get along with your teammates, then you definitely should not hire him. Team dynamics are as important as individual skill levels.
That might hold water if any company I've ever worked for hired much more than just self-diagnosed-with-Aspergers white-guy assholes. I've never seen a group of programmers get along. "So-and-so should be fired/shouldn't be hired because we just don't get along" is not an acceptable excuse. In any other industry your boss will tell you that you'd better learn to get along.
These are kindergarten-level issues. You're supposed to know how to tie your own shoes and play nicely with others by the time you're 6 years old. Why do we continue to let the tech industry endorse childish behavior in the work environment?
EDIT: The one job I hated the most was the one where I was hired because I was a "good cultural fit". Turns out, they thought I was of the "culture" that enjoyed working free overtime. Nerf guns and free soda weren't enough to keep me in the office after 6pm, so I started to get squeezed out, culturally.
> Why do we continue to let the tech industry endorse childish behavior in the work environment?
Sometimes assholes are super-productive geniuses (note that there are more assholes who think they're a genius) so even if they don't get along with the rest of your team (or any humans, really) it's a tough tradeoff whether it's worth having them around.
You have to balance whether they're doing more harm than good to the overall endeavor. In our case, we have one very senior guy who's highly antisocial but absolutely a crack dev. We've given him an extremely flexible work-from-home arrangement. It works out well because we hardly see him, and he's happy communicating over email and just churning out code in his batcave or whatever.
Hey, so, excellent example. There is no way that guy fits into anyone's culture, yet the company still made it work. The premise still stands: "not a cultural fit" is always code for discrimination.
I don't think it's that simple. At the trivial and obvious level, you need to speak a common language. But even if you're both speaking English, you can communicate much more effectively if you have a shared cultural background that you can both refer to.
Okay, language. Check. But at my last two positions, my closest work partners were from Malaysia and Lebanon. Slight accents, but completely great to work with.
When I'm having trouble with something, the last person I want to help me figure out my problem is an exact clone of myself. I want people used to different tools and different ways of thinking.
That is an extremely macro level of culture, and it does not explain why there are so few women, Hispanics, and African-Americans in technology. They aren't that different a culture. Certainly far less of a cultural difference than all of the H1Bs they're hiring.
You're not getting the point. All of you who keep parroting your master's "cultural fit" line are not seeing the lie for what it is. It's not about culture.
Cultural fit is not that important for short-term hires (i.e. a 6-month contract) because you basically need someone who can deliver short-term wins. It's only when you look beyond, say, 2 or so years that cultural fit (or the lack of it) becomes more prominent.
You're not going to know that until you've been working with the person for a while. You cannot possibly tell that from "culture". If you think you can, it means you're being prejudiced in some way.
> Why should it be any less wrong to discriminate based on college attendance than on race
This question can be asked of anything. Why should it be any less wrong to discriminate based on intelligence? Why should it be any less wrong to discriminate on skill set?
It isn't so much "wrong" to discriminate on race as it is stupid, because race has nothing to do with most jobs (unless you're interviewing to be the grand knight of the Ku Klux Klan or something).
The same cannot be said of which college you attended. Going to Stanford or MIT may not be the best indicator of your potential, it may not even be particularly good, but it requires a certain amount of hard work, determination and intelligence to get into and graduate from good schools like these.
> Speaking as someone who has interviewed in the field of programming there are many frauds. I don't even necessarily mean deliberate frauds but there are clearly people who are employed as programmers/engineers who have no business being such.
I agree mostly, but for a different reason. I see a lot of people who have experience in years, but not in work. They were basically hired out of school, learned their job in 6 mos. and repeated the same job every 6 mos. for 5 years. Their knowledge is still 6 mos. out of school even though they have been working for 5 years.
This problem is the fault of both the employer and the employee: the employee for not taking control of their own destiny, and the employer for being so risk averse that they never move forward.
Isn't it a bit silly to talk so much about culture being supreme and then advocate importing more H1Bs, who are almost certainly less of a match, culturally?
Yes and no. The ultimate goal for many H1Bs is the green card - and after your application is in your mobility is severely restricted.
The nature of the country-based priority dates means that Chinese and Indian nationals wait years for the green card application to be approved, during which time they (mostly) cannot move.
So if you're strictly on a H1B, yes, it's trivial to switch jobs. If you have a green card application in-process and aren't from China, India, or the Philippines, it's also pretty easy to move (you might have to wait a few months for the right moment). For people from the above three countries it's just pain and misery.
At Hired, we found that candidates who are on H1B visas get 1/3rd fewer job offers. So yes, technically transfers are easy, but you definitely reduce the number of potential employers.
Body shops typically lock employees into restrictive contracts where punitive relocation "costs" are levied if the employee leaves before the contract is up.
About the 3rd point: was there a common factor in those programmers'/engineers' CVs that prompted a coding interview (which obviously they failed)?
Perhaps this is why tech companies have so little ambition: a constant stream of new blood ensures some things change (yak shaving), but never anything substantial that requires extensive experience. Instead, everyone just props up the technological status quo for their CRUD website.
Some of the most fascinating and, frankly, lucid technology discussions I have had were with highly experienced (dare I say, older) engineers who actually know a thing or two. I find it completely scandalous that, in addition to these guys getting punished in the hiring process, the industry does not really seem to learn from them. Software engineering seems like a field that does not tend to advance in any real sense, except at a snail's pace. Most of the programmers I know (who are mostly young) are merely trying to reach a level of understanding that allows them to participate in the status quo, let alone challenge it. By the time they have sufficient experience to drive meaningful change, I suppose they will be old by industry standards, pushed out, and the cycle repeats.
Claims of age discrimination are like claims that there is untapped cheap labor in hiring women in first world countries. If the claims were true, anyone could hire up all of the cheap labor and stomp their discriminating competitors. That doesn't happen. Either it's really a grand conspiracy like some claim, or the situation is not so simple. I think it's more likely that older people expect to get paid more for their experience, are less likely to learn new things on their own than younger people, and are less likely to start their own companies than younger people, because doing so is risky.
You see people obsessing over youth. They are not. They are valuing risk takers, people who know modern tech, people who don't expect to be paid more just because they have extra skills and knowledge that are less useful to the current situation. The quality of a low age itself is irrelevant. It will be the same 15 years from now: today's younger people will feel more entitled to higher pay, and will be less willing to compromise or settle for jobs their younger peers would be happy to get.

Ultimately, if someone brings value to a job, it's the stupidity of the employer not to understand their value. If a company can make 10x more from hiring someone, why wouldn't they? Companies which don't hire the best value creators don't do as well, which allows the companies which do hire those people to float to the top. It is the same thing with women. Any sane businessperson would hire people no matter their gender if those people can make them more money, and anyone who leaves value on the table by being discriminatory leaves room for their competitors to destroy them. If you think older people can ultimately make you more money, then hire them and rule the tech world.
> Claims of age discrimination is like claims that there is untapped cheap labor in hiring women in first world countries. If the claims were true anyone could hire up all of the cheap labor and stomp their discriminating competitors.
Who is claiming that first world women are an untapped source of cheap labor? In what field?
> [older people] are less likely to start up their own companies like younger people are because doing so is risky.
Even if true, this is irrelevant. Age discrimination in this context applies to employee hiring.
> They are valuing risk takers
Is "risk taking" a criterion your company applies when looking for talent? None of the startups I've worked with have ever sorted engineering candidates by "risk taking".
> It is the same thing with women. Any sane businessperson would hire people no matter their gender if those people can make them more money, and anyone who is leaving value on the table by being discriminatory despite the lost potential leaves room for their competitors to destroy them.
The crux of your argument is that the market will punish bad behavior. I think this sounds convincing. However, it is a flawed argument. For one thing, the market can't reward new approaches if no-one tries them.
50 years ago you could probably have made the argument that outright discrimination against women and minorities was rewarded by the market. After all, all companies observed similar hiring practices - including the market leaders.
Taking a snapshot of current practices, and claiming they are some sort of epitome "because free market!" is simply a logical fallacy.
>Who is claiming that first world women are an untapped source of cheap labor? In what field?
The people who claim that women are discriminated against purely based on gender. If women really are being discriminated against based on gender, and companies are hiring less qualified men over more qualified women, then someone should be able to hire up all of the women being snubbed by all of the sexist companies and have a competitive advantage over them, right? Same situation with older people, right? Unless, of course, this isn't about age or sex. I'm female and not getting any younger, by the way. But I still don't buy the bull others try to sell.
>Age discrimination in this context applies to employee hiring.
Are older people better employees or worse? Are they hiring based on age or the other things and age is only an easy thing to blame?
>Is "risk taking" a criteria your company applies when looking for talent?
No startup is a sure thing. Younger people are less risk averse. It's not that a company wants people who will take risks; it's that older people want more of a sure thing. They walk in and expect a salary which matches their years of experience, while most places hiring have no use for that experience. Their experience would add no value to the business, so hiring a younger person who does not demand such a high salary is a better option.
>For one thing, the market can't reward new approaches if no-one tries them.
There is no law keeping the people clamoring about ageism from doing that. No one can force others to, nor should they be able to. Clearly, if they believe that hiring older people is such a good move, they should be able to make a lot of money.
I'm unaware of laws which make it harder or impossible for older people to get hired?
> Unless of course if this isn't about age or sex. I'm female and not getting any younger by the way. But I still don't buy the bull others try to sell.
You're rebutting a straw man.
> No startup is a sure thing. Younger people are less risk averse.
Again, a moot point. If an employee is applying to work for a startup, they're acknowledging the risk. The risk, by the way, is negligible for employees.
> It's not that a company wants people who will take risks it's that older people want more of a sure thing. They walk in and expect a salary which matches their years of experience, while most places hiring have no use for their experience.
This is a contrived example. Furthermore, it doesn't make sense. By a "sure thing" do you mean salary vs. equity? I can assure you most young engineers also value salary.
> Their experience would add no value to their business, and so hiring a younger person who does not demand such a high salary is a better option.
What are you basing this on? It reads like a caricature.
> Clearly if they believe that hiring older people is such a good move they should be able to make a lot of money.
I already addressed this. If you're intent on repeating your original claim, we'll just have to agree to disagree.
>Ageism: people are not being hired because of their age only and not for any other reason.
>The risk, by the way, is negligible for employees.
What. Employees don't care that they might not have a job? Employees don't care that they might not get paid?
>This doesn't make sense.
Companies don't want to pay for what they don't need. People feel entitled to what they feel they are worth and not necessarily what value they give to a company based on what the company needs. When they have a lot of experience but nothing useful to contribute they blame things which do not factor in at all in hiring.
>What are you basing this on? This reads like a caricature.
I think you are mistaking my snark for me actually asserting things.
>I already addressed this. If you're intent on repeating your original claim, we'll just have to agree to disagree.
Whose responsibility is it to fix the apparent ageism if the people complaining about it don't want to do anything to solve their own problems?
>You're relying on make-believe scenario to argue a point.
I live in a reality where technology is constantly changing, where people need to learn new things or get left behind in their usefulness. If they have years of experience in software or systems which no one uses anymore that doesn't guarantee them anything.
>It's an industry wide issue which is actually being addressed.
People solving their own problems instead of complaining about them on message boards? Mission Accomplished! Wait, so companies really were not hiring the best people who give them the most value and now suddenly are thanks to people speaking up? When did all of this happen?
> Then mission accomplished! Wait, so companies really were not hiring the best people who give them the most value and now suddenly are thanks to people speaking up? When did all of this happen?
Get back to me in 15 years when ageism is no longer something people complain about.
Won't happen. It will always be like this. People who refuse to adapt blame others instead of doing something about their own problems. If anything people will invent new isms to blame their problems on.
>Myth : The highest paid are in the 30-40 age bracket.
I'm not saying people are not paid money; I'm saying older people will expect higher salaries based on their experience compared to younger people, regardless of the direct value they can contribute to a company. Value = money.
>...but they already know so much more that learning is less important. They know it already!
Yes, they know so much... of things no one uses anymore, of outdated practices, and not what is making money right now. Technology changes fast and what is useful to know now won't be in 15 years.
> Technology changes fast and what is useful to know now won't be in 15 years.
This claim is bandied about way too carelessly. Application level tools do change quite fast. The fundamentals of technology do not.
Knowledge of networking, TCP/IP, x86 assembly, stack discipline, cache implementations, etc. is comparatively timeless. For instance, one of CMU's core CS/ECE classes (213) has been taught in roughly the same manner for the last 15 years. The same I believe holds true for 6.001 at MIT.
You can argue that COBOL is no longer relevant. What you can't claim is that only the framework-du-jour is important to modern business. It's simply not the case.
I would argue that the number of people needed for older technology is limited, not that they are no longer needed. There may be demand for older technology, but the majority of new jobs won't be for it.
>is comparatively timeless
Why are companies apparently hiring younger people over older people who are vastly more experienced and who, given that timeless knowledge, would give so much more value to their companies? Are they just plain stupid?
> I would argue that the number of people needed for older technology is finitely limited not that they are no longer needed. There may be demand for older technology, but the majority of new jobs won't be for older technology
Obviously no-one is being hired to write x86 assembly. It's taught because it's relevant, important and timeless.
> Why are companies apparently hiring younger people over older people who are vastly more experienced and given that timeless knowledge would give so much more value to their companies? Are they just plain stupid?
Like most of your arguments, this is a logical fallacy. You're appealing to authority and the status quo, without actually addressing any of the issues head on.
>Obviously no-one is being hired to write x86 assembly. It's taught because it's relevant, important and timeless.
Not everything is timeless. Not everything everyone specializes in now will be timeless. Some knowledge will always be useful, but if there is no demand for the timeless positions, or demand shrinks rather than grows, then even someone with timeless knowledge may not find it useful in getting hired, because what they know wouldn't give value to the companies looking to hire.
>You're appealing to authority and the status quo, without actually addressing any of the issues head on.
I refuse to accept without evidence that companies are only hiring younger people because of their youth and not because they give the company some kind of competitive advantage.
>Like most of your arguments, this is a logical fallacy.
That was a question, which was refused an answer, not an argument. You are asserting that people have timeless knowledge making them more valuable employees than younger people without that timeless knowledge. I'm asking why, then, companies make hiring decisions which go against their interests.
> You are asserting that [older] people have timeless knowledge making them more valuable employees than younger people without that timeless knowledge.
Just like my questions weren't arguments? I thought your saying that knowledge was timeless related to older people (the topic) having that knowledge. A person may understand principles, but if they are not willing to learn what a company wants, they won't be hireable - do you disagree with that too?
I'm trying to understand exactly what you are saying on topic (you argue that ageism is real but don't explain why people would possibly want to hire younger people over older people).
If you really think I'm wrong say something which will change my mind.
To summarize the generalizations:
- Companies hire younger people for competitive reasons: they cost less, they are more willing to learn, they have less baggage and less liability
- Older people demand more - they know they can't take risks with jobs and want high paying positions
- Older people are less willing to learn the things younger people are for whatever reason
- Older people may have more fundamental experience, more general knowledge, but if they have specialized heavily in something which is obsolete they can't compete unless they learn something new
- We know that older people run the companies, and start up most new companies - so why are older people choosing younger people instead of their peers?
Are these all wrong? Your other posts have not been convincing. I'm sure if you care about this cause, it would be worth converting someone who cares to your viewpoint, because right now all I see are lazy whiners who don't want to compete and want an easy scapegoat to blame.
You conflated statements. You were (and still are) putting words in my mouth.
This statement [of mine] :
> Application level tools do change quite fast. The fundamentals of technology do not.
Was in response to your blanket assertion that technology changes so fast current knowledge will be useless in 15 years.
I believe this claim to be false, and provided evidence to that effect. You then claimed I had claimed this knowledge was unique among older engineers. I said no such thing. I don't even know how you would get that impression. I cited current University courses, after all.
As for our previous discussion on ageism, I think it's clear we won't ever see eye to eye and will have to agree to disagree.
>Was in response to your blanket assertion that technology changes so fast current knowledge will be useless in 15 years.
What I meant was that some of what is useful now won't be useful in years to come, not that absolutely all technology ever will be obsolete. Some people choose to specialize in company-specific technology which wouldn't be at all useful outside of it.
Although I roughly agree with the sentiment ('if you think everybody else is leaving money at the table, you should be picking it up rather than complaining about it') - there is evidence to the contrary.
A large part of the post-WWII economic growth of the Soviet Union has been attributed to the integration of women into the labour force - and the final stagnation to the fact that at that point there were no more women to integrate, everybody was already working.
My guess is that as "our" generation ages, age discrimination will subside greatly. There are a few unique factors that I think are contributing to the current age discrimination practices. First, the consumer web is relatively young, and our generation was the first to adopt it on a mass scale. Second, the most prominent and visible tech success stories of the past decade have revolved around social media, in which the most innovative products were heavily driven by younger demographics. This is still true; the more "innovative" or most widely adopted social media products (i.e. Instagram, Snapchat, etc.) are still heavily driven by younger demographic users. In these cases, it was important to have employees who get the product, which seemed to be adopted earliest by "our" young generation. I think as we move past the social-media-dominant consumer web, we'll see less of this age discrimination.
But do they understand the culture of the generation below them well enough to successfully capture market share? I'd argue that most older people are clueless about the rapidly changing millennial culture.
If:
A) technology is accelerating
B) the bar for status quo keeps lowering through enabling technologies
C) only people who grew up in the new 'system' will be fluent in it
Then:
A) most of the new ideas will come from a newer and newer cohort
B) ageism will become even more prevalent
FB's (and Google's and Yahoo's) definition of talent is whoever passes their "quiz show" interview
However, yes, you don't need to know the worst-case complexity of shellsort for most tasks, especially at the beginning of a project where load demands are smaller.
Here's the odd thing - if I meet a candidate who knows about shellsort, I'd already be impressed. Many people I interview have trouble with much more basic concepts than that.
And of course, it is worth keeping in mind when you hire an employee that hopefully, at some point, the load demands are higher. And it's kind of awkward if your product hits rapid growth but your engineers are busy learning about algorithmic efficiency.
Nobody sane asks about "worst-case efficiency of shell sort" unless you yourself suggest shell sort as an algorithm. But it is expected that you get the general idea of O notation, because it actually does matter for day-to-day work.
And, as a practical interview tip: it's perfectly OK to say "I don't know about this specific case" - just follow it up with "for most sorts, O(n^2) is an upper bound, so if I had to guess, I'd go with O(n^2)", or something like that. It tells the interviewer that you know general principles.
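For the curious (and purely as an illustration, not an endorsement of quiz-show interviews): here's a minimal shellsort sketch in Python, using Shell's original gap sequence of repeated halving, the variant whose worst case is the O(n^2) mentioned above. The function name `shellsort` is my own.

```python
def shellsort(items):
    """Sort a sequence using shellsort with Shell's original gap sequence.

    Each pass is a gap-spaced insertion sort; halving the gap each round
    gives an O(n^2) worst case for this particular gap sequence.
    """
    a = list(items)          # work on a copy, leave the input untouched
    gap = len(a) // 2
    while gap > 0:
        for i in range(gap, len(a)):
            tmp = a[i]
            j = i
            # insertion sort over the gap-spaced subsequence
            while j >= gap and a[j - gap] > tmp:
                a[j] = a[j - gap]
                j -= gap
            a[j] = tmp
        gap //= 2
    return a
```

The point in the interview context isn't the code itself but being able to say why the gap sequence determines the complexity - that's the "general principles" signal.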
That's the big problem in most interviews - candidates treat it like a test, where there's just the right answer, and nothing else. It's not a test. It's an attempt to somehow, within an hour usually, find out what you know and how you think. Throw the interviewer a bone or two, and help them do that. Silence and short "I don't know" answers really just waste time for everybody.
An important thing to realize is that no matter how much they try, a company the size and profile of Google has at best only a loose control over their hiring process.
There are too many internal recruiters of varying ability and agendas, and too many interviewers of varying ability and agendas.
A company like Google can have stated hiring principles and techniques, but in practice it's going to be nearly random.
Very well written article and I can add some context being in the unhired category of engineers.
I graduated from Georgia Tech sometime in the last couple of years with a 3.64 GPA. I know that GPA doesn't mean much but really at Georgia Tech they make you work for it like a dog. I have some projects on GitHub and well I thought companies would come knocking at my door. Boy was I mistaken. So I started applying late into my graduation semester.
I got interviews with Amazon / Google and another tech giant. I got turned down without an interview by Facebook / Palantir and some other older startups. I did not apply to a lot of companies, and I feel that was the real difference between me and friends who got offers from all of the companies I've mentioned above. Past summers I had worked at management consulting companies, and ultimately, after not getting any offers from tech companies, I moved back home to Asia. And although I did feel a little crushed at the time, I'm working at a big engineering company back home, and though I don't make as much money, I'm learning to like what I do.
Perhaps after a couple of years of nursing my self-esteem, I'll try again. Till then, yeah, it sucks to not fit in. It sucks to not be a culture fit. It sucks to be told that you are not adequate after slogging it out at a supposedly "world class" engineering school. The best reason I got for being rejected was that other candidates felt coding was their calling and I didn't seem all that committed. wtf! I'm of Asian descent and was on an F1 visa (non-resident/citizen) with a horrible accent. Perhaps there were other factors that resulted in me not being hired, because I did make it through the phone screens for some companies, and also because I have other Asian friends who did get offers. But yeah, at least it feels good to know that there are PhDs out there who can't get jobs either, so it wasn't all me.
I can tell you that, sadly, accent can carry more weight than we all wish it would. It's a double issue: even if you are interviewing with the most liberal and ethical person, who loves you and thinks you're a great fit, but who had trouble understanding you due to a heavy accent, it's not in your favor. And even if your accent is only light, there are sadly still people out there who will, even subconsciously, prefer the candidate who had the "better" English.
It's sad, but I think it's more of a factor than your actual skills. I go to GT and can vouch that anyone who graduated with your GPA is probably very smart and talented. Don't give up!
Knowing how to interview is a skill by itself; there are many common mistakes people make that are unrelated to their skill set. My wife had a heavy Russian accent and spent a year working solely on her English and interview skills (she went to a career coach). After dozens of rejections, she got 2 offers in one month, and she is almost 40 :-)
I agree about accent, but I want to encourage the parent post by saying that since his written English is excellent, he may be able to avoid that discrimination by sticking to text communication until he's established a relationship with someone at the interviewing company. This is what I try to do - I'm hard of hearing, and someone who doesn't understand the implications of that can easily jump to the conclusion that I'm an idiot or dishonest if I hesitate in answering a question. It just takes longer to process speech when someone changes topics quickly.
Are you a citizen? I've very rarely heard of a Fortune 500 company sponsoring a new grad's visa.
I will mention that I've interviewed for Facebook in the past, and have found them to be very pretentious. The interview really made me not want to work for them.
Also, as a GaTech student myself: did you primarily do on-campus interviews, e.g. through Career Buzz? I've found Career Buzz to really suck at getting students jobs. You're talking 50+ students interviewing for 1 job, and the interviewers are looking for that closet nerd most of the time.
I kind of have a philosophy that if it's a good job, the company doesn't need to approach students. The students will approach them.
Haha to be perfectly honest the only reason I wanted to work at Facebook was because of the perks. They pay well plus you aren't spending much at all on food / gym / laundry / etc. so you can save quite a bit if you live frugally. Plus the network in the event I wanted to venture out on my own.
Anyways, yes, most of my non-citizen friends applied to 20-50 jobs using spreadsheets and all that. I was a little late to get my act together, as I mentioned in my initial post, and as a result I clicked on all the apply buttons for CS or related jobs on Career Buzz. For startups, I emailed co-founders / applied to their open positions.
Anyway, one of my friends was Korean and is now serving in the army, as is the Israeli kid that lived down the hall, so at least I'm a little better off?
Considering that 3.55 is the cut-off for highest graduation honors at GT, you have nothing to be ashamed of there! If you decide to take another stab at things, there is no reason to limit yourself to Amazon/Google/etc. The southern east coast has a perfectly healthy tech ecosystem, from Atlanta up to D.C., and the GT brand carries well there.
Yes man the irony. Georgia Tech tells you that you've graduated in the top 10% of your class, institutional highest honors and all that.
And then your parents call you and tell you that they spent six figures on your education and you are essentially worth nothing. Here are some quotes from my parents:
"If you really wanted to work as a shitty programmer for a small pay we could have saved all that money instead of sending you to this school."
[My dad to relatives in front of me] "Sending not_pg to Georgia Tech was one of the biggest mistakes of my life. It was just a drain of my hard earned money."
The reason for shame was my culture and I'm still grappling with it a bit because it is so ingrained in my psyche from all the years of growing up with it. It seems like everyone in my family is constantly judging me for being such a failure. In the grand scheme of things I've stopped caring lest I let this get to my head. But I do realize that I essentially screwed up my time by not being more diligent, not managing it better.
Anyway, I'm out of the US, so Atlanta to DC isn't in the picture, at least in the short term. The only good thing to come out of this is that if I do apply for a PhD in the coming few years, I'll have some professors that'll write good recommendations for me and a decent GPA.
I think you need to read a bit of michaelochurch, and learn to stop desiring to be a cog in the machine. A "job" is a contract wherein you show up and are told what to do, in order to make the owners rich in exchange for a subsistence salary and all of your waking hours.
There is more to the world than that. Take time off, explore and learn about the world. Start a company. Start ten companies.
Believe me, "jobs" are overrated. Between me and my friends, we've worked at every top tech firm in the world (Apple, Tesla, Google etc). They're all the same. They want all of your waking hours in exchange for a subsistence wage (Bay area is expensive).
Go and live your life. Build your own skills so you can consult and control your hours. Travel and have fun. YOLO.
Corporate jobs do have their perks. Especially in the Bay Area, they give you access to the following:
1) A green card if you're not a US citizen (this one's really important for foreigners).
2) Starting capital (savings), if you want to later start a startup. If you're single and live in SV, you can save substantially. Maybe you can start a company with no money, but I imagine it's harder.
3) Connections/networking. You get to work with a lot of smart people. Again, if you want to start a startup, this helps (it's easier to have 1-2 cofounders, than do it alone). For example, Valve was founded by a couple of ex-Microsoft engineers.
Wow. You can't judge your success by what your parents think, especially not Asian parents who, though they often have the best of intentions, frankly have no idea how things work in the U.S.
Anecdote: my parents, who are from Bangladesh but have lived here since about 1989, called me one day (five years ago now), deeply concerned that my brother wanted to turn down his acceptance to Cal Tech to go to this place called "Yale." They had no idea that there were avenues to success beyond getting a PhD in engineering and going to work for Lockheed-Martin. They're extremely smart, loving, and genuinely concerned parents who supported us (financially and emotionally) in every way, but had these ingrained cultural perceptions that I had to talk them out of.
Also, I'll say this: there's very little you can do with a 3.9 at Georgia Tech that you can't do with a 3.65. Your GPA is well beyond the cut-off where American companies are going to care, except perhaps the snobbiest investment banks or management consulting firms. What held back your job search was almost certainly your narrow scope, not anything about your stats on paper.
My brother used to work at that Newtown facility. He left before this crash happened, but LM firing all those people has killed the town. We hear about the Rust belt and the factories closing, this is the high tech example. My brother saw it coming and left. Everyone there said he was crazy at the time. Now he has income and skills, they do not.
Life changes. You have to raise your kids with enough sense to be able to recognize when it does and then act accordingly. Even if it seems crazy to you, you gotta trust that you raised them right and then trust them.
If any parents are out there, please don't do this. Your children are independent adults and they have their own values and goals. They are not an investment opportunity or a vessel for your own unfulfilled aspirations. If you make your kids feel that you're disappointed in them you're just going to make them miserable and resentful and you'll have a lousy relationship as adults. Please value your children's happiness and well-being over prestige and money.
It's a fine line that's hard to walk. In stark contrast to my asian parents, my wife had hippie Oregonian parents who said "do whatever makes you happy!" She regrets a little that they never gave her the information and support she needed to pursue Ivy-league colleges.
It's not your job as a parent to prioritize happiness over prestige and money, just as its not your job to prioritize the other way. What you bring to the table as a parent is experience and knowledge of the world, and you need to make sure your kids have the information they need to make the decisions they need to make. Maybe your kid grows up and wants to be an investment banker making a few million a year. You don't have to push them in that direction, but it's valuable to them if you tell them what they need to do to prepare for that if they ultimately want to pursue that.
And keep in mind: nearly everyone will value money more at 35 than at 15, but the decisions we make that lead to that are often made long before we can appreciate what the impact will be.
Your wife's parents might not have known. There was an article not too long ago describing how kids in the Midwest are at a disadvantage because they never learn to perceive elite schools as substantially better than most state schools.
My opinion is that the real value of elite prep schools is that they teach (indoctrinate?) you elite values through the culture, your peers, actions of your peers etc. They tell you what to do, instill in you a belief that you are "worthy" and capable and give you the tools on how to do it.
I think the key word in what I wrote is "adult". Whatever fine line may exist with kids, at some point people start to value their independence and you do more harm than good by trying to pressure them.
I'm glad that you are starting to let that go. It can be paralyzing, and then all you will do is resent yourself more because of all the time spent dwelling, making your situation worse. Let it go, you are great. Do great now.
> According to news reports, both Acton and Koum “took a year off to tour South America” not long before applying to work at Facebook, giving both a recent period of prolonged unemployment.
That agreement doesn't seem to cover new talent. If the companies are screaming about lacking talent, the fact that they could only acquire more engineers by hiring from each other would be positive evidence for that claim.
>> $19 billion in cash and stock (slightly less than the inflation adjusted cost of the Manhattan Project)
Holy Crap !!!
That cannot be right - surely not....
Edit: According to Wikipedia : "The Manhattan Project began modestly in 1939, but grew to employ more than 130,000 people and cost nearly US$2 billion (about $26 billion in 2014 dollars)."
So, if you allow 7bn to be slightly less, then yes. It is right.
And there is worse to come - also according to Wikipedia and the BBC, the US spends around 50bn pa (the UK an embarrassing 4.6bn a year) on science funding - but really, half of US science spending on a mobile IM app.
There may be a disparity based upon how inflation is calculated. Plus or minus 40% after an interval of 75 years is not unreasonable for the purposes of an "about this big" comparison.
A Paasche index will understate price inflation, while a Laspeyres index will overstate it, due to demand elasticity. Governments may change the formula used to report changes in the money supply, using different measures like M1, M2, M3, and MZM, along with the average velocity of money. This obscures the amount of value intentional inflation extracts from the productive economy, at the cost of mathematical accuracy.
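The direction of bias mentioned above can be shown with a toy sketch (all prices and quantities here are made up purely for illustration): when consumers substitute toward goods whose relative price fell, the Laspeyres index (base-period quantities) reads higher than the Paasche index (current-period quantities).

```python
# Toy illustration of Laspeyres vs. Paasche price indexes and
# substitution bias. Two goods: A doubles in price, B stays flat,
# and consumers shift spending from A to B in response.

def laspeyres(p0, p1, q0):
    # Uses base-period quantities: ignores substitution, so it
    # tends to overstate the cost-of-living increase.
    return sum(p * q for p, q in zip(p1, q0)) / sum(p * q for p, q in zip(p0, q0))

def paasche(p0, p1, q1):
    # Uses current-period quantities: bakes substitution in, so it
    # tends to understate the cost-of-living increase.
    return sum(p * q for p, q in zip(p1, q1)) / sum(p * q for p, q in zip(p0, q1))

p0, p1 = [1.0, 1.0], [2.0, 1.0]   # good A doubles, good B unchanged
q0, q1 = [10, 10], [5, 15]        # consumers substitute away from A

print(laspeyres(p0, p1, q0))  # 1.5  -> 50% measured inflation
print(paasche(p0, p1, q1))    # 1.25 -> 25% measured inflation
```

With the same underlying price change, the two formulas report 50% vs. 25% inflation, which is why a "plus or minus 40% over 75 years" disparity between inflation adjustments is unsurprising.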
You could buy exactly one Manhattan project with 1939-1946 dollars for $2billion, of which 90% was production, and 10% R&D. You could buy exactly one Apollo Program with $25.4billion in 1961-1972 dollars. One LHC costs $9billion 2008 dollars. Tevatron was a bargain next to that at only $120million 1980 dollars.
Put into that sort of scale, with $19billion in 2014 cash, you could do any one of the following:
Build an entire city designed for 50000 inhabitants supplied entirely by renewable energy sources.
Build an entire archipelago of artificial islands.
Bury an Interstate highway under Boston.
Build 3 international-class airports.
Build two subway lines under an existing major city.
Where do you get your marvellous comparison figures from?
Yes, it puts it into a more depressing perspective. Although I would tip my hat to the founders if they decided to build "WhatsAppTown", a city of the future.
In fact I want them to - because when we throw around this kind of cash (the amounts normally only state actors have) we get outliers - sometimes a boring miser who just wants more, sometimes a visionary who tries to build their Jetsons-style future.
I just hope Facebook stock is going to those who watched the Jetsons on Saturday mornings.
I searched the web for "megaprojects" and grabbed figures from any site that at least sounded legit, so caveat emptor. I pulled anything that felt like it would cost between 15 and 25 billion US$ if it was completed (or completed again) this year.
I really don't see the STEM shortage that people are talking about. Last year I graduated from a local college and we had no shortage of STEM grads, from Biology to Comp Science to Information Systems. We recently bought an engineering school full of students, so next year our STEM footprint will be even larger.
Let's be honest here: sometimes it's cheaper to use foreign labor, and that is why these companies are doing it. I don't necessarily agree with it, but a lot of companies out there are doing it, including my employer.
Yes, let's. I want Mark Zuckerberg to admit out loud that the reason he can't find qualified engineers is that he's not willing to pay. No more of this "they don't exist" lie.
This, coupled with the Google/Apple/Intel/Adobe/etc. collusion to prevent salary negotiation wars, it's very clear: Silicon Valley companies aren't special anymore. They are now just like every other corporation in America. They're focused on nothing but the bottom dollar. They have their HR departments spewing morale boosting propaganda and they do their damnedest to keep those employees locked in as wage slaves. Free lunches! Games in the break room! Climbing walls! All that shit is designed to keep you in the office, to entice you into working overtime, for free.
When I'm looking at job listings, I look for the tell-tale lie on their websites: if they say something like "our employees are our greatest asset," I know they doth protest too much - they really treat employees like fungible commodities. Unfortunately, I've let my judgement be blinded too many times and have had to learn the hard way that it's true in every case.
Employees are now commodities. Other industries have realized this, and now the tech sector is realizing it as well - especially now that people are more computer literate than they were 20 years ago, combined with globalization and the internet. The pool is much bigger than yesterday.
Employees are ALWAYS commodities. These issues are not new. This is exactly the same labor vs. capital struggle that has happened, well, forever.
Companies exist to run business processes that create money. Employees are an input to that process. Employers only care about employees to the extent that the employees help that process run efficiently.
Expect these things to become ever more apparent as more and more complex work becomes commoditized. Remember when you could make real money as a Wordpress developer? Now a person who wants a website (if they still want a website and not an Etsy / Amazon / Facebook / whatever presence) should really just sign up with Squarespace. The market needs products & services... employees are a by-product.
Yes, this is the MBA training: engineers are labor, a cost to be reduced. That leads to a shortage, and to confusion among the MBAs, until they fix exactly what you just described: treating engineers as commodities.
Ah, no. Being computer literate is not the same as having skill in programming, QA, DB, UX or whatever you need. Being tall does not mean you can dominate in the NBA. I was the same size and weight as Michael Jordan but that doesn't translate into 6 NBA championships. If you need skill you have to pay for it, not just fill in the blanks with random people who work cheaper.
Right, but if more people were playing basketball in schools today than were 20-30 years ago, then the talent pool would be much larger today. If you look at people born in the 50s and 60s, most of them didn't have a computer to tinker with in their teens or even their 20s. But my generation saw the dawn of the Internet. When I was young, every family had a desktop in the house that everyone shared; now family members each have their own computers - laptops, smaller computers like tablets, and even smaller devices like phones. 12-year-olds are building iPhone apps and elementary schools are having programming days. Being a beginner is easier today when you have an internet community and the device of your choice to play with. More people enjoy building websites using Blogger or Wordpress or what have you, and some of them don't even write a single line of HTML, much less CSS.
> people are more computer literate than they were 20 years ago
Less afraid of the keyboard, almost certainly. More computer literate, I'm not as convinced. I know plenty of people that can tell me about their Android smartphone yet not anything about the underlying OS.
> I want Mark Zuckerberg to admit out loud that the reason he can't find qualified engineers is because he's not willing to pay. No more of this "they don't exist" lie
Except Facebook was instrumental in breaking the hiring cartel, because they were totally willing to pay. AFAIK, that's still true.
What you are seeing is typical of all big companies. At that size, 99% of employees are fungible. If you want to be valuable and have an impact, you need to move out of the big SV companies and into the smaller shops. This may mean leaving SV, but there are lots of companies out there that truly do value their employees and show it with salary, responsibility and respect.
Well 2 out of 3. I know plenty of small companies that respect you for your responsibility but are deathly afraid you'll leave because of salary. My company loses good people to our, much larger, clients who can pay top dollar. We've, humorously, set up a job board on our website to cut out the middle man, us.
While there certainly are companies whose business model is to get a cheap workforce from India or China on H1Bs (just take a look at the H1B stats), there is something I don't get. Where and how does this pay inequality for local vs. imported workers manifest at more desirable companies like Google or Facebook?
As a student outside the United States, I had a chance to do internships in the US several times (on a J1 visa), and as I am finishing my degree soon, it was natural for me to try and get a job there. My observation is that the salary and benefits I was offered at Google, Microsoft and Facebook weren't much different than if I were American - if anything, they were even slightly higher than what they pay American new grads (based on talking with American peers and reading Quora posts). My non-American friends who also got jobs in the US didn't complain about getting paid less than their American peers either. Thus, should I expect fewer bonuses/pay raises? A smaller chance of promotion? Less desirable projects?
An important point is that H1Bs basically increase the supply of talented engineers. So even though they're being paid salaries equivalent to their American peers at MSFT/GOOG, without them, the Americans would be making a lot more. With a shortage of engineers, who knows, may be the starting salary would've been $200k instead of $100k. I personally think this is a good thing and keeps America competitive. But some would argue otherwise.
The problem is that H1B's increase the supply of indentured engineers.
The solution is to convert H1Bs to Green Cards much faster. That way all the engineers are competing on a level playing field.
I actually like having more foreign engineers as long as they have green cards. Engineers tend to create companies and a need for more engineers. We all win.
What is the basis of your assertion? There are at least 250,000 H1B workers in the USA. Do you mean to suggest this is true across the board? In my experience, at most 'good' companies (e.g. Texas Instruments, which is discussed in the article), such exploitation doesn't occur, based on the anecdotal evidence that I've heard.
These companies typically exploit their workers with lower pay and threats to send them home if they don't play ball. Or are employing people to offshore jobs.
The "good" companies that use H-1Bs are a fairly small percentage of the number of visas issued.
One cannot complain about the 'bad' companies being 'unfair'. It is a consequence of globalization. If, say, Indian_Software_Company A can do the IT overhaul of US_Retailer B at a price point cheaper than US_Software_Company C, then in a non-protectionist world, nothing should prevent Indian_Software_Company A from getting the contract. I am not a software engineer, so I do not know if Indian_Software_Company A is cheaper and better than the other, but such a practice is consistent with modern trade.
Now, in the context of TI, which is a typical 'good company', complaining about H1Bs doesn't make much sense. In all likelihood, an H1B TI engineer is paid as much as or more than a US citizen/permanent resident TI engineer. In fact, demonstrating fair wages is a fundamental requirement for H1B applications.
Unfortunately, there is a lot of hypocrisy in the criticism of 'foreign workers' in public conversations in America. The same voices in favor of capitalism also want protectionism in labor. It is likely that this hypocrisy (or confusion) underlies the political disillusionment of the TI engineer's wife [1].
In a non-protectionist world there would be free movement of labor and so there would be nothing stopping say a US or European developer moving overseas to follow the work if that it what they desired. Sadly this is not the case so off-shoring is equally as protectionist which is where a lot of the criticism comes from.
Globalization as currently implemented simply moves work to where it is cheaper as protectionist immigration policies in most cases prevent skilled labor from following the jobs. That sadly is the reality of modern trade.
> In fact demonstrating fair wages is a fundamental requirement for H1B applications.
The fact is that there are enough loopholes in this requirement to drive a truck through and enforcement is weak. And the companies that exploit this know that.
> In a non-protectionist world there would be free movement of labor and so there would be nothing stopping say a US or European developer moving overseas to follow the work if that it what they desired. Sadly this is not the case so off-shoring is equally as protectionist which is where a lot of the criticism comes from.
Would like to disagree. An American passport allows you to work from pretty much most other OECD countries in the world. So, if Singapore has a better job market for techies than the US, nothing prevents one from relocating. But in my experience, American citizens are, on average, more reluctant to relocate even between US states than immigrants are. Indians or Chinese leave their families and relocate 10,000 miles to live and work in the US. But most Americans I've met aren't as eager to relocate to, say, Singapore (or even to jump from one coast to the other, or to the mid-west), even if the prospects for jobs are better.
Last I checked, non-Singapore citizens and permanent residents need an Employment Pass (EP) to work as a skilled worker in Singapore, and such a pass must be applied for by the employer. So under that scenario you need a visa and sponsorship by an employer. There is a newer scheme, the PEP, which does not require sponsorship by the employer but has additional requirements in terms of base salary and restrictions on employment - and again, it still counts as requiring a visa.
So having "just an American passport" does not allow you work in a country like Singapore without applying for a visa. While the Singapore system might be more flexible than other such as the US, it certainly is not "free movement of labor".
> Would like to disagree. An American passport allows you to work from, pretty much most other OECD countries in the world. So, if Singapore has a better job market for techies than the US, nothing prevents one from relocating.
Even if a US passport allowed you to work from "pretty much most other OECD countries" (which I'm pretty sure is itself false by any reasonable standard for "pretty much most"), Singapore is not a member state of the OECD.
I find that hiring practices can often discriminate against those who grew up poor/not middle class.
I remember someone writing about how they hired saying things like "I look for evidence of interest in technology from a young age." Well, I had tons of interest in technology from a young age, but my resume can't reflect that. I didn't have a computer until I was about 13 or 14 because we couldn't afford one earlier, even though my parents really wanted one. I think our school library might have had one computer for the whole school, which was not available. I certainly couldn't go to expensive tech camps or the like, we couldn't afford it. I did get all the science books from the public library and read them as much as I could. I didn't start programming until college. I get very very positive reviews from all the jobs I ever worked though, but my resume might be passed up because I didn't have the opportunities others had.
Surprise! Double standards, discrimination, illegal practices and plain incompetence in the hiring business. Those self-absorbed tech rock star companies are actually not smarter than the rest.
Brian Acton's rejection by Facebook is used as an argument for the idea that tech companies are just pretending not to be able to find qualified candidates when there are plenty. I am not sure which way it is, but the argument used is flawed. Brian Acton never stated that he went to Facebook to be a programmer. Not even a senior one. At that point he already had managerial experience, having led large dev teams at Yahoo. It is far more likely that his Facebook interview was centered around a managerial role and the fit, for one reason or another, was not there.
Not taking any sides, I want to say that Brian Acton's example is just a nice/easy/convenient media story. Not at all indicative of anything related to availability or lack of STEM people.
Thank you for pointing this out. I've seen no indication that he interviewed as a programmer at Facebook or Twitter. Considering he was managing a large team at Yahoo, it's unlikely that he would be. He might even have been interviewing for a director or VP-level position.
> However, in this era of e-mail, Skype, and many people working successfully in the technology business from a large distance (e.g. India or China), why exactly couldn’t the desperate “starving” technology companies offer him a job he could do from Texas?
I feel like there's a blindness by some STEM professionals to the fact that not all jobs can be done nearly anywhere, least of all those available to semiconductor engineers. I blame the acronym for bundling us all together unnecessarily.
I recently had to have one aspect of this issue pointed out to me, since it was never a major concern of mine and I'd been completely blind to it: seniority.
I do not currently work for a start-up, or even what you would typically consider a technology company. However, the group I work with, the program in the businessy sense, is not very old. The original kernel of the group began less than ten years ago, and has grown in fits and spurts. Many, if not most, of the "founders" are still here, as well.
When the program started, many of the founders were young and relatively inexperienced. Today, they're still young but are more experienced---in this specific project. And at least some of the people who have been hired since are older or more experienced---at least in terms of having worked in a wider variety of environments---or both.
As a result, in some situations, someone with more experience and skills is being supervised, managed, or led by someone with significantly less experience. The situation isn't improved by weird incentives that place emphasis on some jobs more than others, similar to the cliche about sales versus engineering.
So, how's this for a weird situation: people who would be senior are doing junior work, because "job market"; people who would likely be junior are in charge, because of good timing (For those of you who plan to be future Zuckerbergs, what are you going to do when you really do need to hire a rocket scientist?); people who are ready to move up can't, because nobody is leaving or dying off; and people who might be better off leaving won't, because they would be taking a pay and prestige cut.
[Note: I am absolutely not talking about myself. I knew what I was doing when I got here. I have no real urge to "move up" in the world. I'm a technical guy. I'm mechanism, not policy.]
Zuckerberg quoted as saying: "Our policy is literally to hire as many talented engineers as we can find. The whole limit in the system is just that there aren’t enough people who are trained and have these skills today."
Two huge unanswered questions. What do you mean by "trained and have these skills?" And at what price?
At $50,000 a year, I too would like to hire all the engineers. I will place them into jobs that pay $100k a year, and I will keep $50k. How many times would I like to do this? Answer: all of the times.
Now, suppose I have to pay them $125k a year, and I can hire them out at $100k a year. How many times would I like to do this? Answer: none of the times.
So there's a massive shortage of 50k/yr engineers, but a huge glut of $125k a year engineers. It sounds like there's a severe shortage of programmers willing to work for far less than they're worth. How is this different from any other field?
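The body-shop arithmetic above can be sketched in a few lines (the bill rate and salaries are the hypothetical figures from the comment, not real market data):

```python
# Margin per placed engineer when the client is billed $100k/yr,
# at the two hypothetical salary levels from the comment above.

BILL_RATE = 100_000  # what the client pays per engineer per year

def margin(salary, bill=BILL_RATE):
    # What the middleman keeps (negative means a loss on the hire).
    return bill - salary

print(margin(50_000))   # 50000  -> hire "all of the times"
print(margin(125_000))  # -25000 -> hire "none of the times"
```

In other words, the "shortage" vanishes or reappears depending entirely on which salary you plug in, which is the point of the comment.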
For perspective, the budget of the NIH, which funds a huge percentage of all the biomedical research in the US is about 31B. This is research that subsidizes and lays a foundation for many of the medical advances we have today. For additional perspective, this 31B in public money is about equivalent to all the private R&D spending done on medical research.
So the thought that someone would spend 19B on a messaging platform, which is an already solved problem, is sad. 19B probably isn't "cure cancer" kind of money, but you'd be able to solve some pretty real problems with it.
When they say they can't find the right person, they mean that Frank's cell phone ran out of juice.
So, my brother used to work in a small clock shop. These were really precise clocks (nanosecond) and they did distributed timing. You have an oil field, maybe, and want to know the size of it? Send out a blast pulse and record the echos. But the recorders need to really know where and when they hear the echo over a few hundred square miles. Its not easy stuff to deal with.
But the company had Frank. Frank had been there since the beginning. He had written all the code for the clocks, he had built and wired the entire place, he knew the whims of the servers and the phone system, the slight burning smells that meant an imminent failure. Frank was the company, and the rest of them were there to make sure Frank was working and happy. Yeah, he owned a bit of the equity, not a major share, but a bit. No one else knew all the passwords or the files like Frank. Frank barely had time for his kids, let alone to train anyone else. To keep things going, Frank was the ONLY person.
So, this worried the owners a lot. If Frank got cancer or in an accident, they were screwed. Hell, everyone was screwed. But to train anyone up was near impossible. Because Frank didn't have the time between putting out the little fires in the code and sleeping and making money for the owners. Frank didn't have them in a bind as much as Frank himself was in a bind. He wanted time to see soccer games, but he wanted his kids to go to college more, and that meant the money came first.
So, what was everyone supposed to do? They needed a new Frank. But anyone they brought on was just confused by the spaghetti of wires and the nonsense code that Frank had made 20 years ago to solve some problem half drunk at 2am on a Saturday. There was no record because there was no time to make one. You had to just clone Frank's brain, because nothing else would do in time.
My brother left that company because trying to untangle the Gordian Knot that Frank and the company were committed to was a noose.
But there are a few thousand Franks out there. There are a ton of companies that are in this bind. They have 2 or 3 guys who can actually do the job, and trying to replace that team is impossible. When the Zuck merps out that they can't find anyone, he is not lying, but misspeaking. He means that Frank's phone is out of juice.
Whenever an article like this pops up, the usual chorus is: there is no skill shortage, just a shortage at the price the employer is willing to pay.
Sadly, as someone on the hiring side of the table, I've caught myself falling into a few of the traps listed in the article.
A newly minted PhD is one type of person I've found myself having trouble hiring. Same with a newly minted Master's.
The reason is that often their area of specialization doesn't really help in the role I'm hiring for, so I wouldn't give them any bump in pay for their extra schooling. On the other side of the table, especially for a PhD, they often expect a bump in pay for that additional schooling, so it's tough for me to hire someone like that.
EDIT to clarify: I don't reject these people out of hand; in fact it helps them get an interview over others. But when it comes to offer time, i.e. salary, I don't give any real value to a master's or PhD. Put another way, if the salary I'm willing to offer is $125,000, it's the same regardless of education.
To be clear, in a fund of 12 people we have five PhDs and one more with a master's, so we do hire well-educated people :)
Postgrad education might get you interviewed first, but will have almost no bearing on whether I extend a job offer.
The same can be said for what school you went to. An MIT education? Great, I'll put you at the top of the interview pile, but when it comes time to extend a job offer, where you went to school doesn't factor into what we offer at all.
Too far along in their career can also be tough. Often this is due just to salary demands not fitting a junior role.
A new grad has no job they are quitting so they don't need a "bump" in pay to leave. Someone who is well into their career might be a good fit but is making a good salary where they are so they require a sizable pay increase to leave. Given this, it often makes sense to go with the younger candidate whose salary fits the role.
EDIT
To flesh out the senior developer comment a bit more. There are really 3 areas I've had trouble with in the past.
1) The 10 year developer who has 1 year of experience 10 times over. This is just a junior developer who wants the salary of a senior developer.
2) It's harder to find senior developers, regardless of what you pay. As Joel once said, the good developers always have jobs, hence his internship program.
3) The hiring bar is, umm..., higher for senior developers, as they are expected to do more: mentor, design architecture, etc. This means I'm more picky about who I hire.
Having said that, there are times where experience rules, and in this case you get what you pay for.
NOTE
This post has generated more upvotes and downvotes than almost any other post I've written. So half the people agree with me and half disagree?
> On the other side of the table, especially for a Phd, they often expect a bump in pay for their additional schooling so its tough for me to hire someone like that.
> Too far along in their career can also be tough. Often this is due just to salary demands not fitting a junior role.
Does this mean that you don't even give these people an offer, or do they reject your offer citing low salary (or other reasons, but you suspect it's actually low salary)?
I'm asking because many examples in the article (WhatsApp founders, females, Hispanics) seem like they weren't given an offer in the first place, not that they rejected an offer because their current position was rationally a better option.
> A newly minted Phd, is one type of person that I've found myself having trouble hiring. Same with an newly minted Masters.
The PhD issue makes some sense, a person who just spent two to four years becoming an expert in a narrow area of specialization may want to use that knowledge, and may not be all that much more valuable if you don't need it (especially since his or her other skills may not have "kept up").
However, for masters degrees I wonder if you're considering the fact that the degree (in many fields) can be seen as a "signal" [1]. Many masters programs are highly applied and, assuming a reasonable degree of rigor, more challenging than undergraduate programs. So, looked at in this way, wouldn't it make sense to use a masters degree to help narrow your pool down to people who can handle challenging work? In this case, you would pay more not for the additional knowledge, but for the free screening the degree provided.
Disclaimer: I'm getting a masters degree in CS (and already have one in econ, hehe), although from what I've seen, people don't have much trouble finding jobs, so...
A master's degree, assuming no experience, is more like a warning sign of someone who's stayed in academics too long and has no practical skill. It's like hiring a really expensive junior developer if they've never built real-world software under business constraints, software that needed to be bug-free and scalable. It's not as if the code written in grad school gave a shit about memory use, file or network I/O performance, good UI design, or actually doing the thing it's supposed to. Sorry, ranting due to bad experience in the past.
The thing is, more education will enable a good developer to become even better. It doesn't seem to help a struggling developer in any way, shape, or form. Unfortunately, sometimes I get the impression that people stay in school to address a problem that can't be addressed with more classroom time, and end up doubling down on a field that maybe isn't right for them.
At the very least, try to intern early on and/or meaningfully help a popular open source project before graduate school.
> The thing is, more education will enable a good developer to be even better. It doesn't seem to help a struggling developer in any shape or form.
In my (limited) experience, the people who struggled in undergrad run screaming away from any possibility of doing grad school (my experience here is consistent across econ and CS). The people who were good at the field consider grad school because the psychic costs of learning more are low.
If you find a person with an MS who can't pass a simple programming test, then (and this is very general, I'm sure there are exceptions) take a look at the school he or she attended and stop interviewing people from that school. Problem solved.
> take a look at the school he or she attended and stop interviewing people from that school, problem solved.
This is really bad advice, because learning to program is largely an individual effort. It has little if anything to do with the school.
> If you find a person with an MS who can't pass a simple programming test
You actually need more than a simple programming test that solves a single problem. You need to see enough code to have some architecture, designed by the candidate. Interviews are too short for this, so experience, whether at another company or in open source, or simply having skilled people who can vouch for you, goes a long way.
I also agree with putting a black mark against the school; however, it helps to look at (for a master's or PhD) the quality of work within that department and that graduating year. It's a way to get beyond the marketing fluff of non-top-tier schools.
Replying to leobelle here since the thread got too deep.
I agree that learning to program is individual, but you don't learn to program in grad school. If a school gave you an MS and you can't do Fizzbuzz, that school isn't serving its signalling purpose (that I mentioned in my first comment). So while that school may be fine, if lax, if the employer is using education as a signal, a school that doesn't fail students doesn't provide a strong signal.
Relating to your comment about needing to see more code, education is actually perfect for that. Most companies don't let you take your source code with you, schools do. So provided you took some courses that required significant engineering projects, a student has a big pile of (hopefully) reasonably-designed code to show off to an interviewer. A person who worked for a less-than-progressive company (since some maintain open source libraries and such) does not. They both have their open source contributions.
As someone who occasionally hires, a masters doesn't really show much in terms of talent, but it does sometimes show a level of specialization. And, there are times when I need a specialist (e.g. image processing, feature analysis, statistics, etc.). But, generally, I don't need a lot of specialists.
Also, there seem to be two types of people who go for a master's. There are those who really love their particular specialty and are willing to sacrifice significant opportunity costs to master it. I respect that group. And then there are those who think a master's will get them more pay, but who also calculated that they couldn't increase their pay the hard way (e.g. by working for it). I'm wary of that group.
Also, when hiring a specialist, I don't want one who has focused on topics that any good developer with a B.S. can research independently: operating systems, languages, data structures, algorithms. I want one with a particular domain specialty, someone who can read recent research papers and implement them.
This isn't always the case in CS. It is often more advantageous to your career to not take time off. Granted, someone can get a masters while working, but it's often for the purpose of a career change or promotion into a specific category of position.
Who said anything about taking time off? If the goal is to become a talented software developer, what does a hiring manager care if that was done for-pay or in school, provided the candidate got the experience and did something challenging?
I understand that there are "business" concerns (deadlines, tools, testing, and other things that aren't always taught in school) but a candidate could just as easily be deficient in those areas having worked for an employer with crap practices.
At some point companies need to stop whining and solve their own problems.
> their area of specialization doesn't really help in the role I'm hiring
Perhaps you should instead consider that their specialization is a specific example of them applying themselves to a difficult new problem and solving it. It seems unlikely to be able to constantly hire people already experienced with what you're doing. Wouldn't you rather hire someone who's demonstrated an ability to learn something new and succeed at it?
Agreed. It's disappointing that there is a general skepticism of newly minted PhDs, since the degree generally involves extensive problem solving, independent thinking, self-organizing, commitment to a specified goal, receptiveness to feedback, a critical eye toward both one's own work and that of others, and the ability to communicate ideas effectively.
Those are powerful tools. If the candidate wants a job that exclusively involves doing things related to their narrow field of study, then there is probably going to be a problem, but unless you haven't communicated the job description effectively, they're not applying for a job as an academic - they know exactly where to go if they want to pursue that career path.
Disclaimer: newly-minted PhD here, with no particular need to apply my dissertation topic directly to my software engineering work.
Newly minted PhDs should consider casting a wider net. At the company where I used to work in D.C., probably 1 in 5 of the staff were PhDs. Areas where the tech industry is more "meat space" than "net space" seem to have more PhDs per capita among the engineers.
The southeast has three major tech hubs: the Dulles corridor in NoVA, the Triangle in NC, and Atlanta. They also have quite a few good schools with big graduate programs that feed into the market (VT/UMD, Duke/UNC/NCSU, GT/Emory). The software work in these places is often some integral component of a meat-space product, and that meat-space product is often in a complex problem domain in which the specialization of a PhD is valuable. For example, if you're developing software for a UAV, it's valuable to have someone with domain experience in control theory. The specialized domains in D.C. are often defense related, in the Triangle it's often bio-related although quite diverse, and in Atlanta it's often logistics-related.
Certainly, Silicon Valley has tons of these sorts of jobs too. Google is full of PhDs. But the ratio of specialized domain products to consumer products seems to be higher in the southeast. And as a result, my perception is that the market's valuation of PhDs is higher. It's quite common to see companies in that market where not having a PhD is considered a severe liability.
The problem is that for many areas of software, it is not possible to be effective without at least a couple of years of fairly deep experience. That is the learning curve regardless of how smart someone is, and it is difficult to justify hiring someone who will not, for two years, be qualified to fill the job they were nominally hired for. It is not a matter of training per se; it is the amount of time required to develop competency.
For many areas of software development, such as database engine internals, the pool of people that are qualified or trainable within some reasonable number of months is quite small. Therefore, for those areas of software development the shortage is real for all practical purposes. A PhD in computer science does not automagically confer the ability to become a software subject matter expert within a few months.
The problem is that for many areas of software it is not possible to be effective without at least a couple years of fairly deep experience.
This point is key given that industry turnover is such that the first company to hire you has to expect that they are simply training you for the next.
On the other hand, for such a specialized area a relevant PhD seems almost a necessity. Either that, or an equivalently long time in industry specializing in that area.
I doubt that this is the norm though. I get the impression that most of these hiring managers who don't value experience are hiring for pretty uninteresting work and simply want inexpensive labour. The op didn't state what area of work he's in, but if a junior is the ideal candidate, it's very likely not very interesting or difficult work.
edit: after a bit of digging, it turns out that the OP works on trading systems in Canada. The ironic thing is that 125k is quite a bit above a junior position in most of Canada. So it's not clear what is going on here. When hiring PhDs, he's competing with the US companies and might be a bit shy of what they pay. On the other hand, if he's paying a junior 125k, he's paying well above the local average.
In regards to a newly minted PhD, it involves training someone who has spent 10+ years (or even their whole lives) in academia. Other items to look out for:
- The only projects they've produced have been for academia, or they have a history of starting AND not finishing multiple side/non-academia projects
- Little to no involvement in group work or publications (which makes me think: what have you been doing for those 4-5 years?)
- Another thing I have noticed: an applicant's PhD thesis was very well written, yet when I read their master's thesis and other written documentation they produced, the grammar was below average, even poor! That makes me suspect there was 'help' involved in polishing the PhD. So look for inconsistencies.
- I found out that they registered a "Dr <Last Name>" domain name even before they started. Who the heck does that?
Plus various other issues with newly minted PhDs who have just entered industry and aren't being handed things on a silver platter ($$$ and respect due to the title). Give them a year or two in industry and see how they shape up.
Disclaimer: I hired a newly minted PhD; these observations are based on one person, so I don't claim they apply to all. I would still consider newly minted PhDs, but only after the same rigorous checks I would make for any other candidate.
Their MSc thesis was probably the first that they wrote. After a while of paper writing, thesis writing, and even more paper/thesis reading and the like, I'd imagine that they got better.
What I'd worry about is if I saw NO improvement after 4-5 years. ;-)
> - Another thing I have noticed: an applicant's PhD thesis was very well written, yet when I read their master's thesis and other written documentation they produced, the grammar was below average, even poor! That makes me suspect there was 'help' involved in polishing the PhD. So look for inconsistencies.
Generally there are 3+ years between writing your Master's Thesis and your PhD, during which you spend a lot of time writing and editing and would hopefully get significantly better at doing so. I don't think there is any inconsistency in seeing an improvement in ability over that long span. It would be like looking at someone's code from the beginning of their career, comparing it to something they just wrote, and citing the inferior quality of their earlier work as a clear sign that someone else helped them with their most recent stuff.
> - I found out that they registered a domain name Dr <Last Name> back even before they started. Who the heck does that?
Why is this weird? Presumably if you're entering a doctoral program, you plan on finishing it (whether or not that happens). What's wrong with getting a domain if you think you'll need it in the future?
I have read through their master's thesis and compared it to others in their department. The quality of the writing is low, which surprises me considering the person graduated from a top-tier university.
As for the PhD, I don't see evidence of other papers written during that time, so they may indeed have focused solely on their thesis. I just need that desire to communicate properly to extend beyond the thesis.
"1) The 10 year developer who has 1 year of experience 10 times over. This is just a junior developer who wants the salary of a senior developer."
Like everything else, that depends on the candidate. A person who has had 10 1-year jobs may have tons of applicable experience and be a better developer than the Loyal Schlub who spent two or three 3-4 year stints doing CRUD (or even proprietary trader front-end) app X. Given the rest of your post, I suspect the issue here isn't that you've had trouble finding good candidates with "10/1" resumes; it's that you simply are selecting them out due to bias.
"2) Its harder to find senior developers, regardless of what you pay. As Joel once said the good developers always have jobs, hence his internship program."
This actually supports the sentiment I expressed in response to your first point. It seems you're looking for an extremely specific and narrow type of resume--one that fits a particular format consisting of long-term, stable employment. You have a bias against other kinds of employees.
"3) the hire bar is, umm..., higher for senior developers as they are expected to do more, mentor, design architecture, etc. This means I'm more picky as to who I hire."
I have yet to see any interview capable of determining whether a person at the level you're talking about is actually capable of delivering on what you require. This last point is something that can only be adequately gauged in the context of your company's business by actually having the person work there, or work on a project there for some time. Companies are different. A person's ability to "be senior" in the way you describe at one company may not translate well, or at all, to another company.
I'm not trying to pick on you or be aggressive. I'm just commenting on a larger narrative--hiring practices in technology just aren't very good. It's not due to any deficiency in intelligence or honesty, either (most of the time, at least, I think). It's just that it's a really hard problem for which there isn't any one good answer.
regarding #1, I don't believe he's talking about someone who has chained together a bunch of jobs spending a year at each. Having "The same year of experience X times over" generally means you didn't grow as a developer from your experience because you were doing the same thing in the same way for a long time.
For example, if your job is cranking out CRUD apps or "online brochure" websites for various clients and you do it for 5 years, there's a good chance that while you're very good at those things now, it hasn't really gained you the same experience as if you'd been doing something a bit more demanding.
I also want to add an extra comment regarding #1. Some developers are superb, and every year they get a new offer from another company and jump ship. Nothing wrong with that, but maybe a company wants someone who will stay for the long term. If a developer jumps ship every year, the company might as well look at someone slightly less senior but good at what they do. I know some companies have contracts that lock down new hires, but I think most of us prefer flexibility while still giving the hiring company some confidence: "I'm not going to disappear within a year. Hire me and I will try to help you make things better."
In my somewhat limited experience, companies often treat 'senior devs' as team leads, building mentoring responsibilities into the role rather than treating them as 'highly skilled devs who work efficiently and do not require much oversight'.
For a mentoring structure, it is always more efficient to have more mentees than mentors, hence more junior roles than senior roles.
The 'senior' = 'superproductive' structure requires fewer employees overall, but also brings an element of risk since the work is more concentrated.
No, the thing is that there are not enough "interesting" and "complex" things to do - industry doesn't usually need to scale an app to the size of WhatsApp or Twitter, and even when it does, a handful of senior people managing a team of junior devs usually suffice. So, experience usually isn't really a benefit (after 3-5 years of experience), and can be a detriment for the reason you mention (boredom with doing the same CRUD app over and over again).
I'd push back rather hard on that statement. US companies invest loads of cash in R&D, more than any other country. The problem is that if you expect to be doing R&D for a YC startup (or even a typical SV consumer web or B2B startup), you're really barking up the wrong tree—those sorts of companies don't do a lot of R&D. If you're really interested in that kind of work, you'd be far better off in bioscience or hardware. Or at least larger, established software companies.
You need to be careful on how you define R&D. Pure R&D, the kind of culture that spawned say UNIX, Oak (Java) or Plan 9 etc is on the decline as in recent years the bean counters tend to look at pure R&D as a waste of money as there is no medium term return on investment. Look at the decline of IBM Research in the US as an example. Google and in some cases Microsoft are exceptions to this.
However the US tax code allows companies to write off new development of new systems as a tax break, so you typically would be doing this kind of R&D at a startup until the system moves into production and is then accounted for differently for tax purposes.
So US companies invest loads of cash in R&D, just not the traditional kind of pure R&D.
Yeah, this is what I was referring to. Pike lamented this in his famous talk, "Systems Software Research Is Irrelevant." In its place, we tend to have more focused R&D, which is still kind of nifty, but can imply a certain amount of acquiescence to the status quo.
I have no idea. Google X, Facebook AI Lab? I'm struggling with this myself, the solutions (for me) now seem to be: learn something new every year at work (by changing the technologies your employer uses or changing employers), and research in the afternoon.
I did exactly this: did my research after work, and when my employer didn't budge on adopting new technologies, or even improving existing ones, I changed jobs.
The other move is to do a start-up and work on your own R&D (how you do it is your call).
On the contrary. I feel like if you can prove competency at depth in AI/robotics, you will be absolutely _set_ for the next ~half century of computing. The sort of problems we're solving in the areas of machine learning/planning have applicability literally everywhere, even outside of STEM fields. (looking at you, social sciences, a big data / machine learning twist is certainly in your future)
The hard part is the "proving competency" part. I consider myself a somewhat competent programmer, but trying to catch myself up on the state of the art in my focus, N-body simulation, is a hard enough problem. And even when you feel like you understand it, how do you let other people know? (A little bit of chicken-and-egg here, since normally a PhD provides this sort of signalling.)
The answer I chose is "build fun projects in my free time." and with the advent of more accessible programmable robotics, you can certainly explore both of your interests. (I should take my own advice here too...)
Do we know what role Acton applied for? I saw the twitter posts about the rejection but never saw what job he was applying for. It's hard for me to generalize about the industry from this one specific case, without even knowing what job he was allegedly rejected from (his experience may have been undesirable for an entry level position where the hiring manager felt he was overqualified, or conversely he probably lacked the experience at the time to get the role of CTO).
That's probably a safe assumption. I'd imagine he was looking for another very senior-level job. In that case, it seems silly to use it as an example for why there might be a shortage of talent for entry-level to mid-level engineers. It doesn't seem like his experience was detrimental, or that he didn't fit some specific pattern (based on age, etc).
As a PhD who tried to get a job outside my field of expertise, I'm glad you mentioned your reluctance to hire them, as it was certainly a vibe I got. For any people hiring potential PhDs out there, I'd like to clarify (and generalize) our pros and cons. We probably don't really expect to be paid much more, unless the job is one that we are uniquely suited to. We more than make up for our lack of experience by being extremely self-motivated and self-starting. Odds are good that we have above-average speaking and communication experience. Our biggest drawback is that we may take a while to become "team players". This includes a capacity for being overly critical and a tendency to overthink things.
There are jobs out there which justifiably require a PhD qualification, e.g. industry research labs, defense research labs, national research labs. I think new PhDs should try hard to seek these jobs rather than typical developer-type jobs that would not necessarily benefit from the skills one gained while working towards their dissertation. This is a difficult mandate to follow, but I think a useful one from a career-satisfaction perspective.
Of course it's not. Why is it reasonable to pay lawyers $600/hr but to pay a skilled computer engineer the same is interpreted as a joke? That's not a rhetorical question -- I'd really like to know the answer.
Actually, I suspect I know the answer in many cases, but I'm curious what other people think.
My understanding of how legal billing works is that you aren't actually buying one person's time for $600/hr, but you're buying a team of junior paralegals, legal secretaries, and young associates and one big-name attorney to supervise them. The big-name attorneys make a lot, but nowhere close to $600/hour; even if they're partners, a lot of money is going to the salaries of the paralegals, legal secretaries, and $160K/year associates under them.
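The leverage model described above can be made concrete with some simple arithmetic. The numbers below are entirely made up for illustration (the roles, hours, and internal costs are assumptions, not data from the comment); the point is just that a single headline rate funds a whole team, assuming for simplicity that every hour is billed at one blended rate.

```python
# Illustrative (made-up) numbers: how a $600/hr bill can be spread
# across a leveraged legal team rather than one person's pocket.

team = {
    # role: (hours billed on a 10-hour matter, internal cost per hour)
    "partner":   (2, 250),
    "associate": (4, 80),   # ~$160K/yr salary is roughly $80/hr
    "paralegal": (4, 40),
}

BILLED_RATE = 600  # single blended rate, for simplicity

hours_billed = sum(hours for hours, _ in team.values())
revenue = BILLED_RATE * hours_billed
cost = sum(hours * rate for hours, rate in team.values())

print(f"revenue: ${revenue}, labor cost: ${cost}, gross margin: ${revenue - cost}")
```

The leftover margin covers office overhead, support staff, partner profit, and so on, which is why the headline rate looks so much larger than any individual's pay.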
Is your lawyer an in-house general counsel? Probably no. Do you require this person's services for ~40/hrs per week, every week for years on end? Probably no.
I would certainly pay an amazing engineer $600/hr if I could call on them at any time and get billed in increments of 6 minutes, but that would be an insane way to build software (I know lawyers who think it's an insane way to do law, but it is the system).
You can't outsource lawyers to a different country.
Maybe not trial lawyers or other positions where you need a warm body in the courtroom, but entry level lawyers are feeling the outsourcing squeeze as well.
After the rise of "social" coding à la GitHub, it's now a lot easier to vet a candidate for a development job. A PhD is only useful if the core product has considerable overlap with your PhD thesis. A graduate degree doesn't help; in fact, if the time is invested in "specializing", one may lose out on gaining two years' worth of core dev experience. But I think it really depends on the type of job.
As others have noted, having an advanced degree should be a signal that the candidate is capable of deeply investing themselves into a subject area. The typecasting is ridiculous: just because you have no graduate degree but spent the first 3 years of your career working on, say, an online store, does that disqualify you from working in any other area of software engineering?
GitHub as a CV has been debated already. Your employer may place restrictions on your ability to contribute to or publish open source. You may already be working 40+ hours a week and have other interests besides writing more code. If I'm not especially familiar with a project, I have no way to evaluate whether your contributions are valuable; I'm using the project owner's judgement as a proxy for my own. Maybe the project owner is a chump and the project is a mess, so does contributing 100 commits to that project count as a positive or a negative?
Open source is not the meritocracy many think it is.
Let's say you one-upped AngularJS with a new library. How would you compete with that sort of marketing muscle? Devs are no different from regular people: they don't want to change, they'll be averse to re-learning things, and they'll downgrade you for not being backed by Google.
I think the GitHub comment should be extended beyond that, into non-academia projects generally. It should be emphasized that those hiring PhDs (especially those who have been burnt; see this comment, which exemplifies my experience: http://chronicle.com/blogs/phd/2013/09/19/the-ph-d-industry-...) will be looking for evidence of work beyond academia. I am thinking within CS, so I can't speak for the rest of STEM.
However, the fault probably lies with the wider academic culture and the lack of guidance that academics/institutions provide to those wishing to leave for industry. Some PhDs genuinely do enter for the purpose of having a research career, only to realise that academia is not for them, and vice versa.
I have no clue how anyone could justify downvoting your comment. It's hard to believe that anyone downvoting you could be a hiring manager in a technical field. I have had extremely similar experiences looking for both experienced and junior candidates, and with the price of senior candidates, a bad hire is a costly mistake. Thank you for sharing your story.
> 1) The 10 year developer who has 1 year of experience 10 times over. This is just a junior developer who wants the salary of a senior developer.
What would you say then about a developer who switches jobs every year so (s)he can learn something new? I think there is a stigma around those too.
> 1) The 10 year developer who has 1 year of experience 10 times over. This is just a junior developer who wants the salary of a senior developer.
How simple is it to gain 10 years of real experience in the industry? Are there enough jobs and opportunities for an ambitious developer to keep improving and gaining novel insights and experiences? Or does she, after a point, have to relegate most of the learning to free-time projects because the work no longer takes her out of her comfort zone?
Basically, companies have to either pay/accommodate people or train them, but instead they'd rather whine. I know it all too well, as someone who has tried to avoid living in California for lifestyle reasons, which limits a lot of options.
Speaking from personal experience, it is hard to find engineers who can truly take a project and run with it.
For any given job posting, there are typically hundreds of applicants. Yet often, not a single one of the applicants could build a non-trivial application without intensive mentoring. Many of them couldn't do it even with mentoring.
I don't know what this says about our economy. Maybe nothing--maybe it just says something about the nature of engineering. I can offer only my anecdotes. I'm not in a position to draw from those anecdotes a conclusion about what our economy needs.
The author is forgetting one thing: under qualified devs (at the time of a FB interview) can also be lucky and learn the job while building and scaling a product which has gotten traction.
Given the numerous applicants they deny, what are the chances that a few of them get lucky? I'm estimating, with my limited math knowledge, that this will be closer to 1 than one may think beforehand.
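The parent's intuition can be made concrete with a back-of-the-envelope model. Assuming (hypothetically) that each rejected candidate has some small independent chance p of later building a breakout product, the probability that at least one of n rejects succeeds is 1 - (1 - p)^n, which climbs toward 1 quickly as n grows:

```python
# Sketch of the "closer to 1 than you'd think" estimate.
# The per-person chance p and the rejection count n are illustrative
# assumptions, not real Facebook numbers.
def p_at_least_one(n: int, p: float) -> float:
    """Probability that at least one of n independent trials succeeds."""
    return 1 - (1 - p) ** n

# e.g. 10,000 rejected candidates, each with a 1-in-5,000 chance:
print(p_at_least_one(10_000, 1 / 5_000))  # ≈ 0.86
```

Even with a tiny individual probability, the sheer volume of rejections makes "a few of them get lucky" the expected outcome.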
Sounds like Facebook et al should be investing some of this money into university CS/CE/CIS programs along with internal training programs if they are truly concerned about the lack of qualified personnel. A billion dollars goes a long way.
Can we talk about the other elephant in the room: that Facebook can afford the inflation-adjusted equivalent of the Manhattan project budget with a product that it does not charge its users to use?
Google can afford even more than that. Times have changed, and these tech juggernauts are essentially printing their own money with how invested the general public is into their products.
Idea I had a while back: Monster.com for anonymous interviews. Purge race, age and gender information from the interview process. Getting a broader range of candidates to the final interview would already be a huge step.
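The core of the idea is just a redaction step before reviewers see anything. A minimal sketch, with hypothetical field names (any real system would also need to scrub proxies like graduation year and photos):

```python
# Hypothetical candidate-record redaction for anonymous screening.
# Field names are illustrative assumptions, not any real platform's schema.
IDENTIFYING_FIELDS = {"name", "age", "gender", "photo_url", "graduation_year"}

def anonymize(candidate: dict) -> dict:
    """Return a copy of the record with identifying fields removed."""
    return {k: v for k, v in candidate.items() if k not in IDENTIFYING_FIELDS}

record = {"name": "A. Dev", "age": 52, "skills": ["C", "Go"]}
print(anonymize(record))  # {'skills': ['C', 'Go']}
```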
I'm not even sure in-person or phone interviews are necessary or beneficial. I'm aware of a top philosophy dept. that does professor hiring completely by application materials. Why? They found application materials were a better predictor of success than in-person judgements.
I don't know that rockstar companies really say that much about the industry as a whole. I also don't put much stock in that Zuckerberg quote about "hiring as many talented engineers as we can find", unless they define talented as a very high bar, in which case the statement is meaningless. Who wouldn't want to hire as many crazy good programmers as they can?
Acqui-hires are proof of dysfunction. Companies like Yahoo have talent, but can't recognize it at the bottom because the middle-management filter is so defective. So they buy market-verified talent at a panic price. That's shitty for the individual (well, great for 0.05% of them and shitty for 99.95%) because there's so much noise.
The problem is that the VC-funded world is full of soft-skulls who never grew up (some are old, but not mature; those tend to be investors). They make workplaces that are culturally defective and mean-spirited but have the superficial trappings of college because they're halfway houses for frat boys. "Culture fit" is about wanting to work with the same people you'd go drinking with or (for most non-tech roles) have sex with. I ask: Why? You should hire the people who'll do awesome work and lead, and some of those are 65-year-old women. If you don't, your halfway-house/I-could-live-here "utopia" will underperform and forever be beholden to the VCs.
Reality: if you make one real friend per job, call yourself lucky. My experience is that it's Poisson(0.75). That "culture fit" camaraderie is bullshit. Don't take seriously or trust anyone who wouldn't have your back (and give you a reference, even if you needed him to lie) if you got fired on bad terms.
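Taking the parent's "Poisson(0.75)" at face value: if real friends made per job follow a Poisson distribution with mean 0.75, the chance of making none at a given job is e^(-0.75), i.e. you come away friendless from nearly half your jobs:

```python
import math

# Working out the parent's Poisson(0.75) claim.
# For Poisson(lam), P(X = 0) = exp(-lam).
lam = 0.75
p_zero_friends = math.exp(-lam)
p_some_friend = 1 - p_zero_friends
print(round(p_zero_friends, 3), round(p_some_friend, 3))  # 0.472 0.528
```

So under that model, "make one real friend per job and call yourself lucky" is roughly a coin flip.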
Another lesson from the old (30): your life should be diversified outside of your job, because you can lose the latter at any time for any reason. Losing a job is bad enough as it is, but if you have nothing outside of it, it's catastrophic. The "culture fit" workplace is bullshit. Just build something awesome worth giving a shit about.
I'll agree with that, but it's dysfunction both in the hiring process and also in the R&D process. It's the middle management filter in both cases. Corporate culture is a big part of this. I worked for a major airline in the '90s and during that time, we had at least 5 initiatives to build online travel sites. Wasteful? Sure, but the winner of that process is a well-known travel site that's thriving today. That particular company had a highly entrepreneurial and competitive structure internally, which had definite downsides, but on the flip side, caused the company to produce some good innovations from within. I went to work for a competitor a few years later and things were completely different there and acqui-hire was the name of the game for that organization.
> I'll agree with that, but it's dysfunction both in the hiring process and also in the R&D process.
M&A has replaced R&D, but it's an extremely wasteful and inefficient replacement. The silly social media apps are actually pretty rare; more common are firms that look like real companies but have junk technology that starts falling apart the minute it's sold.
It's not quite that - it's that the organizational structure prevents middle-managers from acting on their knowledge even if they do recognize talent.
Imagine that you are responsible for a department of people, all of whom need to work together to accomplish a big job. Assume that talent is normally distributed - maybe you have 1 superstar and 1 idiot, 10 high performers and 10 laggards, and 100 average developers. Also assume that the boundaries between categories are fuzzy - it's not that there's 1 superstar who absolutely stands out, it's that there are 2-3 people who are really good, and one is obviously better to you but that isn't apparent to everyone else, and another 20 or so people are just a bit behind them.
What do you do?
If you favor your best programmer, giving him the pick of assignments and a lot of latitude to choose projects, you will demotivate all the people who are just below him. Ironically, a lot of michaelochurch's past comments are examples of this - his perspective on the "Real Googler line", and even the grandparent comment, is what things look like for someone who is talented and high-performing and yet right below the line. These folks become bitter, demotivated, subversive, and ultimately toxic to the organization.
Moreover, you destroy the effectiveness of your superstars as well. To accomplish anything in a large organization, you need cooperation from your peers. When your peers resent your special treatment or want to sabotage your rise, you won't get this. So the only way to succeed as a superstar in a large organization is to act very subtly and bring a lot of people with you, which usually means keeping your more radical ideas to yourself and only saying things that can be heard. (Which, really, is how it should be - think of how unpleasant it is to work in an environment with a "golden kid" who's always running around with the latest technologies, not answerable to anyone else, and yet has management's favor.)
This is also why getting rid of laggards is far more important for organizational health than hiring superstars. If you populate an organization full of folks who understand compilers, or understand immutability, or can work with multiple languages, or who understand design patterns, then those techniques become part of the general culture that the whole eng organization can use. If you lack that critical mass, any superstars that understand why those concepts are important will not have the freedom to use those techniques because of the political issues above.
There's very good evidence that the middle-managers know exactly who their star performers are. When they quit to found their own company, who do they call? You often have groups of programmers that have worked together through multiple companies, and as soon as you hire one the whole group comes on board within a year. But within a company, the constraints of organizational politics prevent them from favoritism, which is probably how it should be.
You make a really strong point. I think the problem you described above is why, in truth, closed allocation tech companies are destined to fail. Closed/open allocation is a continuum in reality, but the more closed a technology company is, the quicker it rots.
The "Real Googler Line" concept (and, as you know, I was only at Google for half a year and have no insight into what the place is like now) is the recognition that, above a certain level of recognized merit, people should be trusted to allocate their time and energies. As you said, people just below the trust thermocline where it goes from open to closed allocation (on whom the company relies to perform the most thankless, ugliest tasks) get resentful. You don't have a solution until you lower that trust thermocline down to the people who aren't capable of doing useful work, whom you don't want to retain anyway.
The only conclusion I can reach is that a closed-allocation tech company doesn't work. I realize that open allocation isn't right for all industries, but in software, it seems to have minimal downside, incredible upside, and no reasonable alternative.
So, I agree with you. One shouldn't fix a tech company by elevating a couple top performers. (They'll get lonely.) One should go to full open allocation, and then those political issues go away.
I don't think open allocation works either once you get past Dunbar's Number. Instead of making politics go away, it makes it worse. All the political considerations that are out in the open when you're managing a formal org structure become hidden in the backroom deals of who wants to work with whom and how they want to go about doing it.
Here's another observation for you: there is a system where there is no org chart and nobody telling anybody else what to work on. That system is the startup ecosystem. And the end result of that system is what you term "VC-istan": some players become more powerful than others, and they freely choose who they want to associate with based on reputation, or how well their personalities mesh, or pattern-matching on previous successes, or who they went to college with.
The only "solution" I could come up with is that there is no solution. If you give people agency and freedom of association, you will eventually run into situations where their freedom to act impinges on someone else's freedom to act, and it is impossible to let both of them work on what they wish without forcing one or the other to make tradeoffs. Or, alternatively, they could both choose to work alone, in which case it is impossible to build anything larger than what one person can do by himself. Once you see that tradeoff, it's possible to make choices that stake out various parts of that space, eg. another thread currently on HN [1] describes how your options are management, niche development, or consulting as you get older.
Can you give examples of political considerations that are out in the open in a formal org compared to open allocation? I think you could make a case for the opposite.
Example 1: You have a group of people who care deeply about performance, reliability, maintainability, and long-term health of the code base. You have another group of people who care deeply about user experience, development velocity, and pushing features to market quickly. In an open allocation system these two mindsets will self-organize into different teams, and will avoid working on each others' projects. The former group will find that the latter are cowboys who throw together messes that they then have to clean up. The latter will find that the former are slow killjoys who result in mediocre products delivered late. The latter group will get to market first, which makes the former believe they're marginalized and gets most of them to quit the company. When the product inevitably starts creaking at the seams, there will be nobody around to fix it and introduce robust development processes.
In a formal org, these would likely be two job functions (call them "SRE" and "SWE", or "Infrastructure" and "Features", say) who are forced by upper management to work on the same codebase. Disagreements between them become explicit negotiations where all the technical issues and trade-offs are laid on the table, and then people can come to an agreement after looking at all the considerations.
Example 2: You have a brilliant developer who is terrible at working with other people and doesn't particularly want to deal with them. In an open-allocation world, he will end up shunned; nobody will want to work with him. In a managed org, it's often possible for his manager to pick off a certain really difficult technical challenge, have him go off and find a solution, and then interface with the rest of the group via some technical API.
Example 3: You have a technical lead who can't let go of the technical aspects, and insists on taking all the most interesting work for himself. In an open-allocation world, nobody will want to work with him, which means that he won't be a tech lead for very long, and eventually he'll probably get frustrated and leave. A manager could gently explain to him that taking all the most interesting work is incompatible with being a team lead, and have him choose one or the other.
Example 4: You have a project where consistency is the whole benefit of the project. For example, you're doing a company-wide visual redesign to make the company's product offerings more consistent for users, or you're introducing a language styleguide, or you're defining standard APIs for communication between components. These efforts typically never succeed unless they have top-down support from a strong leader. Everybody sees the need to agree on a standard, but when it comes to deciding what that standard is, everybody has their own agenda and nobody has authority to decide in favor of one alternative or the other.
Standards committees are good public examples of this: officially, it's "open allocation" where every company and member of the public who wants a voice has one, but unofficially they're dominated by the actions of people who can afford to implement and bring to market a concrete solution.
What's your definition of public? For #2 and #3 things seem less public, since the conversation is between the manager and the individual, rather than the entire team. Maybe I'm just misinterpreting though.
In the context of this post, "out in the open" = spoken or otherwise communicated between individuals. In organizations without formal structure, you still get all the messy emotions and politics, but people don't verbalize them - they just adjust their behavior to avoid things or people that they don't like. The resulting culture can get very passive aggressive. It also often has a very lopsided power dynamic, where people who are naturally comfortable being outspoken or voicing grievances end up setting the agenda, while other people who are quiet but may have better ideas or observations end up being left out.
> You have a group of people who care deeply about performance, reliability, maintainability, and long-term health of the code base.
Having a large, single, closed-source codebase is a mistake. You should have a service-oriented architecture for the necessarily private stuff. Anything infrastructural should eventually be planned for release into the open-source ecosystem.
That's because maintenance is utterly necessary, but maintenance requires good people. Talented people will maintain OSS, because it benefits their reputations and usually involves improvements to infrastructure they need; closed-source maintenance, by contrast, always ends up as a ball of work thrown onto the people with the least clout, who are either not very good or pissed off.
That's how you solve the SRE/SWE conflict you talked about.
Either that or you take the hedge fund strategy: pay people so goddamn well (hedonic adaptation says that's in relative/growth terms, so 25% over market in the first year, and 25% growth each year) that they'll be happy to do whatever you ask of them. But for an average $120k developer that has you over $1M/year after 10 years, and most tech companies aren't willing to pay that.
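The parent's arithmetic checks out: starting 25% over a $120k market rate and compounding at 25% per year means salary in year n is 120,000 × 1.25^n, which crosses $1M by year 10:

```python
# Verifying the hedge-fund-pay claim: $120k base, 25% over market in
# year 1, then 25% growth each year thereafter (so 1.25**n in year n).
base = 120_000
for year in (1, 5, 10):
    print(year, round(base * 1.25 ** year))
# year 1  -> 150,000
# year 10 -> 1,117,587 (over $1M/year, as claimed)
```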
> You have a brilliant developer who is terrible at working with other people and doesn't particularly want to deal with them. In an open-allocation world, he will end up shunned; nobody will want to work with him. In a managed org, it's often possible for his manager to pick off a certain really difficult technical challenge, have him go off and find a solution, and then interface with the rest of the group via some technical API.
It's actually the norm for the best team size, in software, to be one. There are exceptions, but not many.
In an open-allocation world, either he finds a project where he can contribute on his own, or he asks for suggestions (from someone like this manager). The difference is that he gets to pick (although if he doesn't, he might have to be let go). He's not reliant on one company-assigned manager (who might abuse the power) to be his interface.
> In an open-allocation world, nobody will want to work with him, which means that he won't be a tech lead for very long, and eventually he'll probably get frustrated and leave.
I see no problem with that outcome.
If he's not showing leadership, and no one follows him, that's what should happen. If it bugs him, he can change his attitude or quit.
How many managers, in a closed allocation system, are actually going to know that a tech lead is taking the best work? This requires people on the team going to the manager (behind the tech lead's back) and most people would be too afraid to do that, knowing what will happen if the manager names them to the tech lead.
> You have a project where consistency is the whole benefit of the project. For example, you're doing a company-wide visual redesign to make the company's product offerings more consistent for users, or you're introducing a language styleguide, or you're defining standard APIs for communication between components. These efforts typically never succeed unless they have top-down support from a strong leader.
Ok, this one is a really good point. You're right that this is something that requires top-down delivery of vision. It's hard to imagine Steve Jobs running an open-allocation Apple. I think the benefits of open allocation, in terms of what people are allowed to work on and experiment with, are very strong. The fact is, though, that this can't be taken so far as to allow people to impose complexity on the design or the ecosystem as a whole. You probably need centralized authority and vision to keep, for example, a simple product design. I don't think that that should be an excuse to prevent people from organically forming and changing teams, though. It requires conservatism about what actually gets into the product, but not as much about how people work and what they work on.
The Dunbar's Number theory doesn't seem to hold as well as people think it does. Yes, organic groups become trust-sparse instead of trust-dense as they get bigger, and there may be reasons for the threshold to be around 150 in "organic groups" (i.e. those that come together without a direct purpose, such as an emerging tribe), but a corporation is not an organic group. Inorganic groups follow different rules entirely. They may fall into trust sparsity very quickly, or not at all despite scale. (Note: trust density doesn't mean everyone trusts everyone; it means that the prevailing attitude is that people trust each other to be basically valuable to the group. It's about the default setting of the "bozo bit".)
Open allocation takes work. The paradox is that it does actually require management (of a non-traditional form) but the management is exerted to protect the open allocation system (and, to some extent, to protect people from each other). You'll still have political issues in an open allocation environment. Management has to address them in a fair way without becoming extortionist or meddling as one gets when the conflict of interest between people management and project management is unaddressed.
Open allocation isn't a panacea. Open allocation has a million problems. So why do I advocate for it? Because closed allocation has all those million problems-- and a billion more that come from a corrupt, self-serving edifice ripe for abuse and that (except in an immediate, existential crisis where every decision must be made and executed fast) has no purpose.
You should think of closed allocation as like programming in assembly code instead of a higher-level language. If you actually need that extra 1% in execution speed over certain special cases, go ahead. Usually, you don't. This typically premature optimization of closed allocation will usually create incidental complexity that becomes permanent and corrodes your company.
You might argue that VC-istan is open allocation, but with extreme (and unacceptable) social inequality, bizarre income effects that can damage future jobs, and reputation-based extortions that make it, nonetheless, dystopian. It is a market culture, but a deeply manipulated one reminiscent of the blatnoy elite of the Soviets.
Where is the problem? The risk was outsourced, and so were the investments. When it looks like a worthwhile project, the knowledge gets in-sourced. I don't see the recruitment failure in this particular turn of events.
Is this satire? We're talking about $19 billion here.
In either case, your statement is almost completely incorrect.
The WhatsApp founders were turned down before WhatsApp even existed. Therefore, it was not some bold tactical decision on the part of FB. Their technical acumen was up to snuff, so they were rejected based on some other criteria (not necessarily age).
The funny part is that if Facebook had hired them when they had the chance, they probably wouldn't have built anything resembling WhatsApp or whatever Facebook was trying to achieve by acquiring WhatsApp.
More to the point, these guys probably wouldn't have done anything interesting at all had they been hired by Facebook. They certainly wouldn't have had the freedom and the resources to develop something like WhatsApp.
I interviewed with Google not too long ago and it was clear to me that Google was the place to go work if I wanted to get a paycheck without having any impact on anything at all.