Coding Horror: The Magpie Developer (2008) (codinghorror.com)
72 points by jalan on Jan 16, 2014 | 73 comments


If I had to pick a single pain point in front-end development right now, it would be this. The new features in HTML, CSS, and native browser APIs, plus the variations and limitations of each browser (plus the special and not particularly settled world of mobile/tablets), are hard enough to keep up with.

But we're now in the stage where there's a dozen frameworks out there, probably classifiable into at least three distinct paradigms, and then we have the languages that target the browser. And I suspect we're a long way from shaking this out into a semi-stable point.

The thing that I like least about this treadmill is that time invested in the ephemeral arcana of a stack/platform is time that isn't invested in skills that will transfer elsewhere and help you become a better general problem solver.


The Cambrian explosion of frontend frameworks you mean is mostly about patching up the web browser to do something it wasn't born to do, though. The endless cycle of solving things that are solved elsewhere won't stop as long as the browser keeps being pushed as an application platform.


That's a rather negative take on it. I'd argue instead that the explosion in frameworks is a result of a broad interest in developing and experimenting with new approaches to providing applications. Web applications are typically network-aware, real-time, cross-platform and easily updated; the basic technologies (HTML, CSS and Javascript) are easy to start using, and in some ways extraordinarily flexible.

I don't think that these are actually solved problems, and the web as an application platform can play a part in changing that.

Further, the whole concept of "patching up" a platform to "do something it wasn't born to do" is misleading. Every computing device you use has its roots in older software that was never intended to perform the tasks it currently does - we stand on the shoulders of giants.

The web as an application platform is still in its infancy, and we're currently playing around with a lot of different approaches - some will live, and some will die. It's how we make progress, after all.


I agree that's a rather negative take on it, but I can't help it. Working in web development for almost 15 years has washed away my sugarcoat ;)

Frontend web development is still absolutely painful and ugly. It's based on a broken paradigm (hypertext vs. interactive interfaces); it relies on hacks for fundamentals (XMLHttpRequest is probably the best example); it suffers from multiple standards pushed by organizations with special interests (remember having to encode video in four different formats?); it's labour- and time-intensive; and tooling is still catching up with the '80s.
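
To make the XMLHttpRequest point concrete, here's roughly the ceremony a plain GET takes through the raw API (the endpoint name is made up, not from any real site):

    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/comments', true);  // async GET against a hypothetical endpoint
    xhr.onreadystatechange = function () {
      // readyState 4 means DONE; the HTTP status still has to be checked separately
      if (xhr.readyState === 4 && xhr.status === 200) {
        var data = JSON.parse(xhr.responseText);
        console.log(data);
      }
    };
    xhr.send(null);

All that boilerplate to fetch one resource - and the name says XML while everyone uses it for JSON.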

None of those, in isolation, are big problems though. The big problem is that, because you can always patch everything up with some javascript, there's no drive to have a platform that rests on top of more sound fundamentals - condemning ourselves to an endless cycle of frameworks.


> the browser keeps being pushed as an application platform

Grumpy nitpick. The browser-as-platform hasn't been so much pushed on the rest of us, as if by some narrow conspiracy. It's been pulled along by mass economic force. Its ubiquity, extensive reach across OSes and devices, and freedom both in terms of beer and speech make it the best bang for buck a high percentage of the time, all things considered.

I agree though that the hypertext-browser and application-platform paradigms haven't always been easy to reconcile.


When I say pushed, I don't mean anyone in particular (although big players like Google do it), but rather, pushed by economic (no licenses, no app stores taking chunks) and political forces (availability of professionals, open standards). So, yeah.


Never mind things that are solved on other application platforms; we now have frameworks that solve things that are already solved in the browser. Moving boldly beyond reimplementing page loads with custom Javascript, Youtube now reimplements the page-load progress bar with custom Javascript.


You sound dismissive of Youtube's page loading. I believe they are loading only those parts of the page that change, which is much faster than reloading the whole thing. It makes the user experience faster and uses less bandwidth.
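
In rough strokes the technique looks something like this - a generic sketch, not Youtube's actual code; the #content id and the ?fragment=1 parameter are invented for illustration:

    document.addEventListener('click', function (e) {
      var link = e.target.closest && e.target.closest('a');
      if (!link || link.host !== location.host) return;  // let external links behave normally
      e.preventDefault();
      var xhr = new XMLHttpRequest();
      // hypothetical parameter asking the server for just the changed fragment
      xhr.open('GET', link.href + '?fragment=1', true);
      xhr.onload = function () {
        // swap only the part of the page that changes, then fix up the URL
        document.getElementById('content').innerHTML = xhr.responseText;
        history.pushState(null, '', link.href);
      };
      xhr.send();
    });

Plus a popstate handler so the back button keeps working. It's more code, but it saves refetching the header, sidebar and player chrome on every navigation.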

Do you do much front end development?


I think they were being dismissive of the need to reimplement the page load indicator rather than page loading itself. I could be wrong though.


I believe his point is not that YouTube's implementation isn't useful, but rather, that it's insane, at a conceptual level.


What's so insane about it? I would say that the logic for which parts of the page to reload should definitely be in the frontend code, rather than the browser itself.


I do enough front-end to understand why Youtube does what it does. I don't do enough to think that every site implementing its own way to reload parts of pages is a sane idea.

The beautiful thing is that the progress bar in question comes from one of the companies that pioneered lack of a progress bar in the browser. But hey, Javascript.


Google pioneered a minimal browser that left most things up to the sites inside it. Seems to have worked out for them.

If Youtube finds that a progress bar is good for their user experience, more power to them. They can make sure it works as well as possible with their site, and change and remove it if necessary. It seems that you know more about Youtube's design than Youtube's designers.


Next up: have browsers ignore hyperlinks, so Youtube designers can come up with a beautiful way to represent and interpret the intent to navigate to a different resource and delight users - and change and remove it if necessary.


Out of curiosity - I'm not a web person - what about Chrome is minimal? Which things does it leave up to sites that other browsers don't?


Is there really an application platform that does everything the joint browser-as-application-platform and browser-as-hypertext-viewer can do? Please enlighten me if I'm wrong, but I don't think there is, and I think that's why so many people have been working on improving the experience of the browser-as-application-platform.


So what we need is a Permian extinction event?


So the next funding crunch. Instead of making Snapchat but for dog yoga and building the next great JS framework of the month in the process, developers will have to go back to making boring things that keep working for more than 18 months.


Wouldn't that just be an end to the explosion, rather than an extinction event?


The extinction will come when Github starts removing inactive free projects.


As someone who does very, very little on the side of frontends, I think your time is still invested in a transferable skill that I miss out on:

(frontend) framework design, paradigms and trends. If the stable point you mention is eventually reached, I am sure everything along the way will have contributed to it, and those who were involved in one way or another will be the ones best placed to make iterative improvements on that stable point.


It must be so wonderful living in your world where there are only a dozen frameworks to choose from. Does money grow on trees there too? Are the streets made of gold? Do people fly their cars to the beach every day and swim in cherry-flavored cola? Please tell us more.


There's a nice world where people aren't snippy about comments that aren't particularly offensive.


Where?


I didn't find the poster's comment offensive. It was a joke; relax and find a sense of humor. Heck, I even upvoted the poster to give them some karma, since they have a valid point. Sheesh.


My comment was a reflection on your comment—"It must be so wonderful living in your world" implies the parent is shortsighted and was ignoring the difficulties of real-life development.


Rather, this was you assuming the worst based on the text, which is a common mistake in text-based communication.

I was trying to be humorous, since the parent poster had said dozens of frameworks, which I'm sure (s)he is aware is an extremely generous number.

There is a good link here on this common issue as well. Based on the number of downvotes my post received, you apparently aren't the only one who took it the wrong way.

http://www.wired.com/science/discoveries/news/2006/02/70179

Personally, I believe that the way a person interprets something like this speaks more to that person's inherent trust or mistrust of society than anything else. I try to take the mindset that most people are, on the whole, decent, nice people if you get to know them.


There really are more than a dozen front-end frameworks to choose between. Check out http://todomvc.com/ for examples.


Exactly, and that number just keeps growing, since everyone thinks they can build a better wheel. I agree with the poster; I thought what they said was an understatement, so I threw out some sarcasm :) Oh noes!

Quick, somebody downvote this guy with his 20 karma! Seriously, downvote me some more, people! I want to see what happens after zero!


Well-established insults heaped upon strangers are rarely mitigated by a followup "just kidding".


Let's say a generous baker's dozen. :)


A related notion is the Blub Paradox: http://c2.com/cgi/wiki?BlubParadox

"As long as our hypothetical Blub programmer is looking down the power continuum, he knows he's looking down. Languages less powerful than Blub are obviously less powerful, because they're missing some feature he's used to. But when our hypothetical Blub programmer looks in the other direction, up the power continuum, he doesn't realize he's looking up. What he sees are merely weird languages. He probably considers them about equivalent in power to Blub, but with all this other hairy stuff thrown in as well. Blub is good enough for him, because he thinks in Blub."

I, personally, am very slow to adopt new languages or frameworks for serious projects. Still haven't found anything I can't do with Java and/or Ruby on Rails. I do try to keep up with the news so I'm not totally caught off guard, and I make a point of building toy projects in other languages like Python, Haskell, Scala, Clojure, etc. It's important not to be a Magpie, but also not to get caught up as a Blubber and end up looking for a COBOL to Objective-C cross compiler so you can make an iPhone app.


The old MMORPG adage fits here well: Anyone worse than you is a noob, and anyone better than you is a no-life neckbeard.


Also, the neighborhood politics dichotomy: Anyone who has been in the neighborhood longer than you is a NIMBY bent on obstructing progress, and anyone who arrived more recently is a heartless gentrifier.


There is one for religion too: anyone less observant than you is a heathen, anyone more observant than you is a fanatic.


We're gonna leave out the sex one? "Anyone kinkier than you is a pervert, and anyone not as kinky as you is a prude."

I suppose this works for any topic that admits relative descriptions...


I always thought this was from George Carlin on driving?

'Have you ever noticed that anybody driving slower than you is an idiot, and anyone going faster than you is a maniac?'


Ruby is closer to Blub than Java. I'm not sure why you would even mention them both in the same breath. You know that for Paul Graham (and anyone else who is informed) Blub essentially means LISP, yeah?


Uh, what? If "Blub" meant any specific language, or any specific kind of language, why anonymise it? Are we suddenly afraid of embarrassing Java or Ruby programmers for getting stuck on terrible languages?

Isn't it more likely that "Blub" simply means whatever language it is you happen to be using?


He doesn't anonymize it at all. Read 'Revenge of the Nerds'. He specifically mentions Java as the language the pointy-haired boss prefers.

Also, the post I responded to (and my subsequent post) was a bit confused about what it referred to as Blub. Blub is in the middle and LISP is on the top of the power continuum.


You clearly don't understand the Blub paradox. Blub is whatever language you are using.


No, I understand it very well. Blub is whatever language you are using that isn't on the top of the power continuum. As of this point in human history, LISP sits at that end of the spectrum (all of this is directly from PG-- if you have issue with it, please read more of his essays). I've already mentioned 2 essays but he has written books on it as well.

In short, PG makes it clear in his writings that LISP is the most powerful language we have as of yet and he even puts forward the idea that it might theoretically be the most powerful language possible (due primarily to its syntax being a human-readable [and morphable] representation of an AST).


Where are you getting the idea that Blub refers to Lisp? If you read the actual essay, it seems to refer to Java or something similar. The whole metaphor is about why developers don't all use the most powerful languages, and he is clearly saying that Lisp is one of those.


Oh, I did read the essay but it's been a while and I don't remember all of the details. I took it that the post I responded to wasn't confused when it said that Blub was at the upper end of the spectrum but now that you mention it, you are probably right.


One thing we seem to constantly need reminding of is the fact that the world gets a new set of human minds, every year, looking at the scene. Children are among us. While some of us may have had decades to dissect the polity of the world, new minds are today discovering the basics. Thus, there really isn't any 'news' as such - just 'data relevant to those available to view it'. The consumerist ideal of 'the new new' is a fallacy; in fact things need to be at least 6 months old before they become 'the new thing', in most realms of human cultural interaction.

This fact of 'where new comes from' (sex, basically) is true of developers, as it is true of any other human responsibility that can be taken. Developers new to the scene, who do not know what was there when they arrived (for various reasons), end up building new things. Those new things do in fact represent progress for the human species, in that they can be as broken or as brilliant as anything else, but won't - likely - be exactly the same as anything else out there. Difference drives us forward.

But calling people out specifically and associating them with animals is, alas, not a new thing. It has been going on forever, it seems. Is it not tiresome to a developer to be instantiating fallacies like 'magpie disorder' on other human beings so easily?

Not that I agree with the position that the 'always-new widget must be used', specifically; more that 'new ways to discriminate' isn't something this hacker, personally, wants to read about...


I agree with the main conclusion here, but it's a stretch to reduce technology choices to a simple "new or not" dichotomy.

Let's say you're writing a new web service in Java, because it has features aplenty and is also the language your team is most familiar with. You're confident the JVM is a platform you want to build on.

Now you need to:

1. Choose a set of libraries or a framework. Do you go for Spring or Java EE, or for something newer like Play or Dropwizard?

2. Choose a build tool. Maven? Ant? Gradle? Maybe we'll write some Scala, so SBT?

3. Choose tools for deployment, config management, etc.

4. A database.

5. And so on.

All of these tools have different trade-offs. There are so many trade-offs that I don't think blog post comparisons (or whatever) cut it. And so you have the "magpies" who try and figure out some of these trade-offs for themselves by experimentation. (That is what, in my opinion, hack days and 20% time are for, not your new production system.)

But don't listen to me, we wrote our new web service in Go ;)

More seriously, it was a major decision and I couldn't possibly write a few hundred words on my blog to justify it. I may write a few thousand, though.


"All of these tools have different trade-offs. There are so many trade-offs that simple blog post comparisons don't cut it. And so you have the "magpies" who try and figure out some of these trade-offs for themselves by experimentation. (That is what, in my opinion, hack days and 20% time are for, not your new production system.)"

Absolutely. Someone has to be the designated pseudo-magpie in order to architect the stack. Doing so effectively, though, requires a dev who can look past the buzzwords and elevator pitches to really get to the core of it. Essentially they have to be magpie and anti-magpie at the same time. Does this new technology really offer me any benefit, or is it the same end result wrapped in new clothing?


Right. Because like it or not, justified or not, when new tools are developed, old ones are abandoned. Sometimes if you stay with what you believe is the tried and true, you end up having trouble supporting new features.

This is true mostly for libraries, though. If you switch your main programming language more often than once a decade then you're either a true magpie or you just don't know how to pick them.


It's become funny to me how people who aren't professional web developers assume that the "pace of technology" means you must obviously be learning and using the newest cutting-edge tech at all times.

Aside from such a thing being irresponsible for those of us with clients who need fundamentally sound and stable groundwork, it's an incredible waste of resources. Unless it's actually your job to evaluate new languages and frameworks (which, by the way, would be awesome!), it rarely makes sense to be riding the latest craze just because it is the latest craze.

It's good to have options. It's not necessarily good to try and use all of them.


I'm glad database technology, with its deep (and provable) mathematical foundation like SQL, is free from these distractions. I mean, what if there was this huge push for Object-Relational Databases in the 1990s and 2000s, including XML-native databases? What if there was this massive claim that normalization is just an old man's fetish, and a bunch of geniuses figured out you could store things in memory and flat files like in the punch-card days? And then they created this huge buzz around hash tables and key/value stores, where some magic serialized JSON buried ankle-deep in linked key/value stores would deliver scalability and all such things that were claimed but seldom proved. And then the mathematics of normalization hit them in the face, so they had to invent a new buzzword for normalization while still pretending SQL is an old man's fetish. More acronyms, I say!
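
For the avoidance of doubt, here's the 'schema' in question, sketched in Javascript (the key format and fields are invented):

    var store = {};                                  // stand-in for your key/value store of choice
    store['user:42'] = JSON.stringify({
      name: 'Ada',
      orders: [{ id: 7, total: 19.99 }]              // denormalized, duplicated under every user
    });
    // "queries" are now application code:
    var user = JSON.parse(store['user:42']);
    // and every invariant the database used to enforce is now your job

Joins, constraints, the lot - reimplemented one bug at a time.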


You do realize that there are successful companies storing massive amounts of data in NoSQL applications, yes? I guess you don't have to call key value stores and such "databases" if it bothers you.

If you're making the case that traditional relational databases are still relevant today, I don't think you'd find many who would disagree. Databases of the sort described by Codd are as valuable today as they ever were. NoSQL land has lots of competing technologies to draw the magpies. MongoDB was hot before. Now it's not. So it goes.

But if you're trying to make the case that SQL is the only way to store data, you lack exposure to the variety of data out there. There are situations in which using a traditional relational database simply doesn't make sense. Would you really want to run, say, an instant messaging application with millions of users on Oracle?


Both of you are correct.

Around 2000 or so, we started to see companies that

1. had Big Data;

2. spread it across data centers spanning different continents;

and 3. needed to display and update it in real-time.

These companies (Facebook being the modern example) basically needed to throw out normalization (i.e. to choose AP over CP) in order to get an acceptable UX for people interacting in different parts of the world. And these companies were prestigious.

But these two facts combined meant that everyone was quick to adopt these "pragmatic solutions to Big Data problems" in order to try to signal some of the prestige involved with having "Big Data problems."

But, since their Data actually wasn't Big enough for the real pragmatic solutions to be more helpful than harmful, the prestige-seekers sought to simplify the "pragmatic solutions" -- keeping all the pain involved with non-relational access, while shucking anything that could potentially operate at scale. Thus were "consumer" non-relational databases (e.g. Mongo) born.


To add another example: graph databases. I've found that one thing relational databases/SQL really don't handle very well is graph data structures. Sure, they can store them, but retrieving and displaying the data in any meaningful way is quite difficult.

Edit: not impossible, mind. Just difficult.
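
To illustrate what I mean (a sketch; db.query stands in for whatever client library you use, and the edges table is invented): walking a graph stored as (parent_id, child_id) rows costs one round trip per node visited.

    function descendants(db, id, seen) {
      seen = seen || {};
      var rows = db.query('SELECT child_id FROM edges WHERE parent_id = ?', [id]);
      rows.forEach(function (row) {
        if (!seen[row.child_id]) {
          seen[row.child_id] = true;
          descendants(db, row.child_id, seen);  // one more query per node: the classic N+1
        }
      });
      return seen;
    }

Recursive CTEs (WITH RECURSIVE) fold this into a single statement where the database supports them, but it's still awkward next to a store that treats edges as first-class.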


Dear god yes, agreed. For someone still learning the ropes, the absolute torrent of new buzzwords/languages/dbs is really fatiguing.


Yes, and to think I only got started in front- and back-end web development 3 years ago, when I'd say things were even crazier. While learning I was being bombarded by the explosion of HTML5, CSS3, LESS, SASS, Stylus; templating engines: Mustache, Handlebars, Jade, Dust; server-side frameworks: express, flatiron, tornado; client-side: prototype, knockout, ember, meteor, etc.

It was common for me to see HN announcements for new JS libs or frameworks that would obviate my previous three weeks of work. This really prompted me to enter a magpie phase since it's hard to put your nose to the grindstone if you can hope (often with success) that someone else will solve your problem for you.


It truly is, and even for experienced developers it can be overwhelming. It is simply impossible for any one person to be knowledgeable about every new technology and/or programming language.

I used to constantly fall into the trap of shallow but wide breadth of knowledge. Something new would come out and I would drop everything and dive right in. It feels rewarding at the time but in the end has very little benefit (unless of course you're a tech reporter).

My advice is to aim for a narrow and deep knowledge base. Pick a few technologies that you feel passionate about and really concentrate on mastering those. It would help to pick those that have a large number of related job openings if it's your livelihood. Mastering Java may seem old school, but the number of job postings I still see for Java developers is amazing.


It is incredibly fatiguing. Oh, you were working at a company for two years on technology X? Sorry, we need to hire someone who knows Y, which you couldn't learn because you were working on X. No, I don't care that you could learn it in 3 days; our HR procedure says we can't give you 3 days.

Then I go download some C source code from 15 years ago and it still compiles just fine and I smile a little bit.


Hacker News declares new things are dead. Along with Web Pages, Blogs, Net Neutrality and PSD to HTML.

The only things not dead are Edward Snowden and languages that compile to javascript. … And declaring things dead.


The Magpie Developer is easy to shoot down, but as this thread [1] makes clear, we usually switch roles over our careers. If you're not a Magpie once in a while, you're not trying new things enough.

https://news.ycombinator.com/item?id=6938645


After spending a couple of days pitching in on a Wordpress redesign... I'm reminded how lucky I was to have a little magpie in me. PHP was my first web scripting language, and I was even able to build web apps with it. Later on, my employer switched to Rails, and I had never even heard of Ruby. But going back to PHP years later, I'm surprised at how much the variety of patterns I've been exposed to makes it easy to return, and even to understand things in PHP that I had never understood before.

I agree that newness is too fetishized, but trying out a fad can be a great way to unexpectedly learn and grow.


I agree-- I don't adopt every tool I play with, but how are you ever going to learn better patterns for working if you never look at other ways people do things?


Hell, a lot of us are just now absorbing ideas that came about in the 60s and 70s.

There's good new stuff of course, but a lot of it is redoing an existing idea in a slightly different context, with new and exciting bugs waiting for you to discover them when you'd most prefer not to.


"Users don't care whether you use J2EE, Cobol, or a pair of magic rocks. They want their credit card authorization to process correctly and their inventory reports to print. You help them discover what they really need and jointly imagine a system."

Damn right!


This is definitely a reach, but the reason I like learning math more than CS is that it's been around long enough to inspire confidence that it will continue to be around.

Likewise, this is why I would study something like HoTT--reasonable certainty that the things I'm learning there will form the basis of the final programming language.

I don't mind change, but I dislike putting weight in fashion.


I'm reminded of Peter Norvig's "Teach Yourself Programming in Ten Years."[0] How can we develop intuition about new tools if their half-life is measured in days while intuition takes perhaps years to develop?

--- [0]: http://norvig.com/21-days.html


But progress is real. The people who moved to Ruby weren't just following fashion (at least, not all of them) - the language genuinely improved on what had gone before. And now that they're moving on again, it's not (just) to try and get away from all those losers who're writing Ruby nowadays - it's because lessons have been learned, problems have been solved, and there are some improvements that you can't make without writing a new language.

Sure, it's possible to move too fast. But the dangers of stagnation are worse, IMO.


"...the vast majority of programmers have yet to experience a dynamic language of any kind..." - We've tried them now, can you please take them back? At least JS.


I love shiny new things and will keep on collecting them.

Thanks to this article, I may have found some more. It pointed to the 2007 article; well, here's the latest edition of Scott Hanselman's Ultimate Dev Tools:

http://www.hanselman.com/blog/ScottHanselmans2014UltimateDev...


I'm quite amazed that I almost never get any answer to the question: "what is the problem you're trying to solve?". We should make placebo software. Zero features, 100% marketing, but it gives us the occasion to look at the problems we really have. Maybe have a consultant for the vaporware acting as a therapist. Oh wait, I have a strange feeling of déjà vu.


Does anyone else find it odd that the majority of links in his articles are to his own articles?


Nope. He's written a lot within a clearly defined scope "the human side of software development", so his articles are often related. Beyond that, self-linking is a great form of self-promotion and getting readers engaged in a blog.


Maybe. That was the style of the day, and Atwood's blog can easily be read sequentially (such that the self-references aren't necessarily self-promotion so much as they are helpful reminders of previous concepts).

At least, I spent a summer in 2008 reading through his blog.


I believe these sorts of people are also referred to as warez d00dz.



