> Lua is a lot like Ruby or Python but about an order of magnitude faster
I'm surprised to hear either of those statements, both that Lua is a lot like Python and that it's an order of magnitude faster. Now, the last time I seriously looked at Lua was for my master's thesis work about four years ago, so I'm willing to believe that my impression of Lua is out of date. However, my experience using it for a medium-sized project was that it is nothing like Python. My procedure for programming is
1) think of a semantic construct,
2) think of how to express this construct in this language,
3) type it in.
My experience with Python is that step (2) is almost automatic - for whatever reason, every construct I'm used to using has a clear, terse expression in Python. And I've never spent an extended period writing Python, so I don't think it's because Python has really shaped the way I think. (The closest analogue to Python programming is using the STL in C++, in my experience.)
With Lua I had exactly the opposite experience. Every time it came to step (2), it felt like what I wanted to say was almost expressible in idiomatic Lua, but not quite. It just felt like I was running a race with a 50-pound backpack the whole time. Every time I tried to write a Lua script, I'd eventually throw up my hands and write my program in Python instead.
Now, it's certainly possible that I just never really understood Lua's paradigm that well, and that if I'd kept going I would eventually have reached enlightenment. I don't think so, though - tables just aren't that hard, conceptually. However, the point is that, out of the box, Python gives me by far the least cognitive friction and Lua by far the most of any programming language I've ever used. I therefore have to strongly disagree that Python and Lua have much in common other than a small overlap in the niche they each fill.
And as for the order of magnitude speed difference, I find that hard to believe. Again, my information may be out of date, but when I looked at it Lua converted all numbers to double precision floating point numbers internally, which dragged its integer benchmarks down quite a bit. (In fairness, I think that was the same time frame where Python could only use primitive integers for the numbers 0-99, and had to box everything else.)
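For what it's worth, the integer issue described above was real for pre-5.3 Lua, which stored every number as an IEEE-754 double (Lua 5.3 later added a separate integer subtype). A minimal Python sketch of the precision cliff that representation runs into - Python is used here only because doubles behave identically in any language:

```python
# Pre-5.3 Lua represented every number as an IEEE-754 double.
# Doubles have a 53-bit significand, so integers beyond 2^53 can no
# longer all be represented exactly -- which is what dragged down
# integer-heavy benchmarks.
big = 2.0 ** 53

# 2^53 is the boundary: adding 1.0 rounds straight back down to 2^53,
# so the adjacent integer is simply lost.
print(big + 1.0 == big)  # → True: 2^53 + 1 is not representable
print(big + 2.0 == big)  # → False: 2^53 + 2 is representable
```

The same experiment in a pre-5.3 Lua (`lua -e "print(2^53 + 1 == 2^53)"`) shows the identical behavior, since it is a property of the double format, not of either language.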
Lua is much faster than Ruby/Python/Perl/&c. It still doesn't get close even to the higher-level compiled languages like Common Lisp and Haskell, but e.g. the programming language shootout [1] makes it pretty clear that Lua is still a league above the other dynamic languages. As a quick check, I timed just the Hello World program in Python, Ruby, and Lua, and while startup time isn't going to be the bottleneck in modern web development, it's still a telling result:
$ time python2.7 -c "print 'Hello, world'"
Hello, world
real 0m0.107s
user 0m0.080s
sys 0m0.023s
$ time ruby -e "puts 'Hello, world'"
Hello, world
real 0m0.338s
user 0m0.030s
sys 0m0.013s
$ time lua -e "print('Hello, world')"
Hello, world
real 0m0.008s
user 0m0.000s
sys 0m0.003s
That said, my knee-jerk reaction is to agree with you about Lua being an awkward medium for expressing programs. I personally have never gotten to the point where Lua felt as natural as Python, but I have also used Python a great deal more, so I don't know whether my attitude is attributable to Lua's design, to Python's wealth of libraries, or merely to the experience differential between the two.
My timings were different on my six-year-old Pentium D dual core running 64-bit Ubuntu:
$ time python2.7 -c "print 'Hello, world'" && time ruby -e "puts 'Hello, world'" && time lua -e "print('Hello, world')"
Hello, world
real 0m0.056s
user 0m0.040s
sys 0m0.000s
Hello, world
real 0m0.008s
user 0m0.000s
sys 0m0.000s
Hello, world
real 0m0.061s
user 0m0.000s
sys 0m0.010s
I just looked at the Alioth benchmarks and found that Lua (not JIT) is around three times faster than Python 3. I remember the differences being larger with Python 2.x, but I'm not going to take the time to look for those benchmarks. On one benchmark, Lua was much slower.
I barely know Python, though I've found it fairly easy to pick up and use when I've needed to. I know Ruby pretty well though, and I find Lua about as easy to use as Ruby.