DaanDL's comments | Hacker News

I never really understood what exactly is so readable about Python. I've been developing in Python for eight years now, and before that I was a C# developer, and I don't find Python that much more readable.

Sure there's less ceremony, and yes, you can have your project going with just a single file, but other than that...?


I think the meme comes from the fact that in the '00s and early '10s most people looked at Python code coming from C++ and Java.

In Java bad OOP conventions were commonplace, like everything using getters/setters, deeply nested class hierarchies and insane patterns like AbstractSingletonProxyFactoryBean. It became impossible to figure out what was going on.

C++ just accumulated every possible feature, each interacting badly with the others, in an amount that could never fit in a single person's context window. That basically led to a situation where every programmer or company had its own dialect of the language; dialects other than your own were mostly incomprehensible.

Python has its own share of bad features, and for a long time a really bad ecosystem around the language - Python 2 vs Python 3; eggs vs wheels; easy_install vs pip; 123489 ways of installing Python, each of them bad. But once it started to get better, in the mid-to-late '10s, around Python 3.5 or 3.6, it exploded in popularity.


Python data processing/ML in the 2010s became a huge asset for the language.

Ironically it also created a ton of really badly written Python in the process.

Commercially, almost all Python is fairly badly written, with types either not documented or not passing with any consistency even when documented. It is the default state of Python. I blame Python for it because it could have made type definition and conformance a default, but it didn't.

The AI boom has really carried Python up with it, but it was quite popular as early as the mid '00s. I remember grumbling in college around that time that the CS curriculum was shifting from Java to Python, because I didn't like Python and thought it was a worse first language.

Incidentally, even though I still hold those opinions, I can admit that history has solidly shown them to be unfounded.


C++ and Java and … Perl.

C# is also a great language, but notice how it has been moving closer to Python-style syntax. E.g. now you can initialize a list like [a, b, c]. They wouldn't add that syntax if they didn't think it was an improvement.

Less ceremony and boilerplate means more readable code.


Reaaaally?

I think a lot of the readability of Python is in the fact that you don't need to be recently familiar with it to pick up what it's doing most of the time.

Over my career I've dipped in and out of rust, typescript, perl, swift, etc codebases. I'm no expert in any of these, but every single time I have to look something up to understand what this set of arcane symbols or syntax means.

When I dip into Python I just ... read it.

(None of this is to say I prefer Python, just that I really do get the readable thing)


I dunno, as someone who doesn't program in Python, I find dunders to be very confusing. Like, how is this readable?

_foo

foo_

__foo

_Foo__bar

__foo__

foo__bar

All of that is valid Python, and some of those forms mean different things depending on where they are used.


The second, fourth, and sixth forms aren't used AFAIK.

Otherwise, a leading underscore indicates a private attribute or method but isn't enforced. A double leading underscore is also private but is "enforced" by giving it an unpredictable (mangled) name. A double underscore on both sides means the function hooks into Python's own API, like when you want to give a class some behaviour with + or == or [].

It's not trivial, and not particularly intuitive, but it's not necessarily terribly confusing.
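A minimal sketch of those conventions in one class (the names here are illustrative, not from any real codebase):

```python
class Vector:
    def __init__(self, x, y):
        self._cache = None      # single leading underscore: "private" by convention only
        self.__secret = 42      # double leading underscore: name-mangled to _Vector__secret
        self.x, self.y = x, y

    def __add__(self, other):   # dunder on both sides: hooks into Python's + operator
        return Vector(self.x + other.x, self.y + other.y)

v = Vector(1, 2) + Vector(3, 4)
print(v.x, v.y)             # 4 6
print(v._cache)             # None -- accessible despite the underscore
print(v._Vector__secret)    # 42 -- the mangled name still works from outside
```

So "enforcement" of `__secret` is only the renaming; nothing actually stops access once you know the mangled form.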


The second form has no built-in meaning, but is frequently used in the wild. Often in local variables to avoid shadowing builtin types (`id_ = get_id()`) and in various libraries. Off the top of my head, ORMs also use it to mangle reserved names.

edit: I googled a bit and PEP8 explicitly says "Thus class_ is better than clss". and "single_trailing_underscore_: used by convention to avoid conflicts with Python keyword, e.g..."

The fourth form is the mangling used for __x names internally (a __x field in class Foo is actually stored as _Foo__x).

I don't know where GP saw sixth form, but considering all other forms are from real-world usage, someone probably uses it too.
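Both of those points are easy to check interactively; a small sketch (names illustrative):

```python
# Trailing underscore: PEP 8 convention to dodge keywords and builtins
class_ = "reserved word avoided"
id_ = id(object())          # doesn't shadow the builtin id()

# The fourth form is what the interpreter produces by name mangling:
class Foo:
    def __init__(self):
        self.__x = 1        # stored as _Foo__x

f = Foo()
print('_Foo__x' in vars(f))   # True
print(f._Foo__x)              # 1
```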


What do you mean? Those are valid identifiers but programmers aren't required to use them.

"whitespace, not brackets" from a sibling comment touches on it, but a lot of people, beginners especially (but not uniquely), are put off by symbols when reading code. Python is less symbol-heavy than most languages, by using whitespace and syntax and words (eg. `and` not `&&`, explicit `lambda x:` rather than `x =>`) in their place. It doesn't go so far as COBOL as to be cumbersome, but far enough to make a difference to a lot of people.
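A small contrast of the word-based forms, with the symbol-heavy equivalents from other languages noted as comments:

```python
# Python spells operators as words where many languages use symbols:
evens = [n for n in range(10) if n % 2 == 0 and n > 0]   # `and`, not &&
double = lambda x: x * 2                                  # explicit lambda, not x => x * 2
if not evens:                                             # `not`, not !evens
    print("none found")
print(evens, double(21))   # [2, 4, 6, 8] 42
```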

If you're doing non-CS academic research and you get only one course/module to teach the new grad students "programming", python it is. That you can get a project going with 3K LoC in a single file is a bonus in academia :)

The scipy/numpy (and pandas dataframe) model is really neat though; Python has all the cool machine-learning features, and since they're mostly wrappers around C++ and Fortran, it runs fast too if you do things properly.
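The "fast if you do things properly" part means pushing the loop into the compiled layer instead of iterating in the interpreter; a minimal sketch, assuming NumPy is installed:

```python
import numpy as np

N = 1_000_000
xs = np.arange(N, dtype=np.float64)

# Vectorized: the loop over a million elements runs in compiled code
fast = (xs * 2.0 + 1.0).sum()

# Equivalent pure-Python loop: same result, dramatically slower
slow = sum(x * 2.0 + 1.0 for x in range(N))

print(fast == slow)   # True (all intermediate values are exact in float64)
```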


" and before that I was a C# developer"

So... you were already trained in reading abstract code.

A beginner, on the other hand, sees lots of intimidating {} everywhere in C-family languages. Python does not need them, and less is usually better in design.


I agree. Very "pythonic" structures, if overly shortened, are hard to decipher, especially if you don't use or read Python on a regular basis.

Oftentimes when I am reading a medium or advanced Python codebase, I need to look into the function definitions and operator documentation to understand what is supposed to be returned. Whereas with C-like languages I feel it is easier to build that context, because there is more context written out and less tricky syntactic sugar.


> if overly shortened are hard to decipher especially if you don't use or read python on a regular basis.

Sure, but this is the case for any language.


Python USED to be easy to read, before a lot of the newer features like type hints crept in. 20 years ago, Python looked like executable pseudocode.

I agree. My Kotlin is readable. The functional code, with typing all the way down, tells you what every step is doing. My same code in Python is a hot mess of nested list comprehensions and lacking lambdas.

Well yeah.

Dropping the ceremony means all that’s left is the ideas and the intent of the code. Which is exactly what you want for optimal readability.


The "other than that" is whitespace, not brackets. Whether that's a big deal is up to you, but the knock-on effect is that the code is indented the way the control flow interprets it, so there are no bugs from misplaced braces. (Plenty of other bugs for other reasons, unfortunately.)
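In other words, the visual layout and the control flow are the same thing; a tiny illustration:

```python
def classify(n):
    # The indentation *is* the block structure: the final return sits
    # outside the loop because it is dedented, not because of a brace.
    for d in range(2, n):
        if n % d == 0:
            return "composite"
    return "prime"

print(classify(7), classify(9))   # prime composite
```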

I find brackets help me understand structure from a distance much better than whitespace.

Misplaced brackets seem like a thing from the past to me when we didn't have IDEs. I don't remember ever having a bug due to that.


> I find brackets help me understand structure from a distance much better than whitespace.

I can't imagine how. Whitespace physically lays out the block structure on the screen; braces expect you to count and balance matching symbols, and possibly scan for them within other line noise.


This is a 00s POV. If you spend any time on syntax formatting in 2026, you're wasting it. It's a solved problem.

Any reasonable language with braces has a standard formatter that will just put each brace level at a different indentation level.


Yes, and so will any reasonable text editor automatically indent to the most likely position, and remove an entire indentation level with backspace, and substitute spaces for tabs per community standards, and keep everything lined up neatly.

But GGP was making a claim about the braces themselves solving the problem, and they clearly do not. The indentation automatically inserted by your tooling solves the problem. And it's at least as easy to communicate the intended block structure with colons and backspaces as with open braces and close braces, plus it doesn't waste lines (or invite bikeshedding) for the closing braces.


Working in C#, I feel I basically still read code structure by the visual block structure / indentation. I don't think I've ever counted braces in my professional life. The IDE makes sure it is formatted correctly, and ambiguity is basically impossible.

Exactly. So if the indentation is the actually salient thing, why not use it directly?

I mean: you don't count the braces because your tooling counts them and makes the indentation match what Python would use anyway. If you had just created that indentation in the first place (which with a proper editor is at least as easy as typing the braces; you essentially type : instead of {, and backspace instead of } ) then you'd be in the same place, except without the extra punctuation noise (well, with half of it, because GvR thought the colons were a useful signal even if redundant).


Nevertheless, it happens that while moving code around, one wonders at what indentation level the code should go. Undo, undo, or git show the original code, look at it, retry more carefully.

Brackets would allow the editor to autoindent the pasted code.

No choice is perfect.


Whitespace and braces work together to make the code more readable; both by the computer and the human. And they make it less likely to have errors, because the braces convey intent (much like parens in math when they're not "needed")

So you would find bracketed code without any use of indentation easier to read than python?

It's no longer 1990, when Python was born. Editors have been automatically indenting bracketed code for a long while. Probably Notepad doesn't, or maybe plain vanilla vim.

The comment I replied to stated brackets helped them more than indentation.

Whitespace forcing proper indentation practices has always been one of my favorite aspects of python. I TA'd a data structures in C++ class and the lack of proper indentation making code unreadable was my biggest pet peeve. I always made the student fix their indentation before I would help them debug it.

I know that is mainly a beginner coding issue, but never having to deal with that issue was always one of the biggest advantages of python.

That said, I believe a lot of the stuff that was added in 3 and beyond (to make it more typesafe, accounting for unicode, etc) has made it a lot less readable over time. You can argue that it has made Python a better and safer language, but the pseudocode aspect has gotten worse. I kinda miss that.


Python and C are the only languages in which I have experienced that class of bugs. In C it was due to if statements without brackets, and in Python because meaningful indentation gets accidentally messed up when refactoring.

And today, with autoformatters, I think only Python is still vulnerable.


If you are messing up indentation accidentally during refactoring there is either something wrong with your tooling (including your text editor) or you are letting things get too far out of hand before starting the refactoring.

It's 2026. I'm using Jupyter notebooks in Databricks. Guess what my tooling (including my "text editor", the Jupyter notebook), does not do?

Yes, I can Ctrl-[ to shift a block of code left or right, but this is not always problem-free, nor is it automatic, nor does it have any sense of where the indents should go.

Yes, there is a "format Python properly" button, which often errors out saying "there is an indentation error in your python so I cannot automatically indent it".

Would I like to use better tooling? I present my .vim file as evidence. Am I using what they tell me is state of the art? yes. And in 2026, state of the art does not solve python indenting, because python indenting is inherently a broken paradigm


> Would I like to use better tooling? I present my .vim file as evidence. Am I using what they tell me is state of the art? yes. And in 2026, state of the art does not solve python indenting, because python indenting is inherently a broken paradigm

I don't know what to tell you. I use Vim and find it trivial to get the indentation right using my distro's stock config.


Does your tooling not allow you to select multiple lines of code and press Tab or Shift-Tab to indent/dedent the entire block?

It usually only takes me 1-5 seconds to fix the indentation when I copy/paste code that existed at a different indentation level. This is not something I'd complain about, personally.


Ah, the old "you're doing it wrong" argument. Moving code from one place to another (copy/paste from online, or just from one file to another) is a fairly common source of bugs for a lot of people when it comes to Python. At some point, it becomes clear it is an issue with the language, not the people.

I enjoy Python, but the significant whitespace is _not_ one of the reasons.


> Moving code from one place to another (copy/paste from online or just from one file to another) is a fairly common source of bugs for a lot of people when it comes to Python.

I genuinely don't understand how they manage this. Worst case, you paste at column 1, re-select and tab such that the baseline is appropriate for where you're pasting it, which is obvious. But more importantly, you shouldn't be copying and pasting unless you're proficient enough to fix such mistakes easily.

I also don't understand how it can be argued seriously that braces avoid the problem. If you'd paste at the wrong indentation level, why would you not equally well type the wrong number of braces?


There are plenty of python bugs from mis-indented code. Particularly given multiple parts of a flow that "else" can apply to.... for/else, while/else, if/else, try/else and so on. It happens quite often in python codebases I've seen.

Also, good automatic formatters (gofmt, rustfmt, etc) also indent along control flow lines, so without the braces you just changed a syntax error into a "hmm, this is acting really strangely" bug-hunt by using python.
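The for/else point is concrete: the else clause runs only when the loop finishes without hitting break, and an indentation slip silently turns one meaning into another. A small illustration:

```python
def find(items, target):
    for i, item in enumerate(items):
        if item == target:
            break
    else:
        return -1   # loop completed without break: target not found
    return i

# Dedenting that `else` to align with `if` would instead run it on
# the first non-matching item -- valid syntax, very different bug.
print(find([10, 20, 30], 20))   # 1
print(find([10, 20, 30], 99))   # -1
```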


People confuse having fewer keywords/concepts to learn for readability, which is not really the same thing.

Someone who is equally expert at Java and Python will probably consider Java to be more readable.


The concept of "readable" is not really relevant for experts, because, well, they are experts. Being an expert automatically means you can read almost any line of code and know everything it does.

Everyone else appreciates and is more efficient working with code that is intuitive to grasp.


I have many years of experience with Java, and rarely use Python... and I'd say Python is, in general, easier to read. There's generally a lot less "having to go look at _other_ code to know what _this_ code is doing".

Other than that? Exactly that!

But that's costly. Speaking from my own experience: going from a webapp fully hosted on an EC2 instance to a Railway and Vercel setup reduced my costs 10x.

A t4g.nano is $3/month; a similarly specced Fargate task on ECS (just any Docker container) is $10/month.

This sentence beautifully encapsulates my point. I know that this is just ordinary jargon, but wow that's a lot all at once. And it does seem like something I need to know before I start.

Sure, but on the flip side: when I signed up for Vercel I had literally no idea what was going on. It just said "do you want to start a blog? here are 1000 templates".

Maybe so, but it's still not the complexity nightmare that some would have us believe it is.

Okay, but not all results on there are valid, ForgeCode for instance has been cheating in the past:

https://debugml.github.io/cheating-agents/#sneaking-the-answ...


Hi Rod,

I know nothing about astrophotography, but while reading your article I wondered: what happens when a truck passes your house while you're taking these shots? Don't the vibrations mess up the results?


Today, we proudly announce, the Meta Rayban 365


Why couldn't they have just called it MacBook, instead of MacBook Neo?


"- yeah, I have a MacBook" "- what, an Air?" "- no, a MacBook" "- ..?" "- the one in colors, not the one-port 12-inch one from 2015, but you know, it just released!"

This already happened in 2015; they probably don't want it to happen again.


To stop comparison to the old 12” 1 port MacBook?

If you were to align the MacBook line with iPhone line logically this would be an ‘e’ class device, the Air would just become the MacBook, pro remains pro, and there would be a nice gap for a new ultra light MacBook Air, a modern Apple silicon version of the 12” MacBook - expensive, small and fast, analogous to iPhone Air.

Also new names are fun. This name is a fun name. Nice to see some playfulness from Apple.


It might be helpful to have a modifier on all the models. It's a bit awkward (not that the naming geniuses at Apple have ever cared about how awkward it is to talk about their products, witness "Apple Watch Edition" and Max Macs) to talk about iPads, because one of them lacks a modifier. "Which iPad?" "The iPad iPad", etc.


Because it's the new Macbook.


So in a couple of years we will have the MacBook Neo Neo?

I think they just got cheaper marketing after Jobs died. No focus or brand protection.


Especially with things like: will my pencil work with this iPad.


Was about to comment the same. It's a common mistake/gotcha.


Possibly dumb question, but does that still hold inside p5js?


p5 is just a wrapper that adds the setup() and draw() functions, so yes


Then just use a small cup to scoop out some screws.


He needs 6 screws at a time, and the goal is to save time compared to counting manually. I'd guess that 7 would probably be fine occasionally -- maybe even 8 from time to time if the process is fast enough. I'd further guess that 9 screws is a non-starter (screws are inexpensive, but 9 represents 50% waste, which is quite a lot).

The lower limit is hard-set at 6 because the kits that he's producing and selling require exactly 6 of these screws for end-user assembly.

A small cup that would reliably scoop out at least 6 screws and no more than 7 or 8 screws sounds like a simple and elegant concept.

What does this cup look like? Is it faster to use this cup than counting by hand is? (Is it faster than the reproducible screw counter that he's already built?)


I hope you will too escape from your echo chamber.


I love facts, reasoning, and logic, and I'm not known for being biased or opinionated. The Ars comments section, on the other hand, has become a place where unpopular points of view are downvoted to hell.

AI is mocked even though the vast majority of Ars commenters have been using chatbots extensively for years. You know what that's called? Hypocrisy.

