Wednesday, March 1, 2006

self vs. @ (Python and Ruby)

Until a few days ago, I considered myself a "faithful" Pythonist. I liked Objective-C. I liked a bunch of other languages (notably Prolog and Haskell). I quite liked C++ too. But in fact my true love was Python.

I know it is strange to talk about "love". That is of course improper. Unfortunately I have no time to find a better word to express the thing. Let's say that I liked to code in Python independently of what I was coding. Much the same way that I love using MacOS independently of the task, while I may be forced to use Windows because of the task, but would not use it if it were up to me (phew... what a convoluted sentence).

Today I read this (pointless) discussion about "self" in Python. There are people who:
  • Do not like to write self.foo
  • Do not like to write
    def foo(self, other):
        # code

I perfectly understand this position. In fact I do not really like having to reference self explicitly every time (even if I fully understand why Python does this). But it makes damn sense. Explicit is better than implicit.

I'm calling methods and accessing variables of an object, so I should use the conventional object.method / object.variable way. There should be only one way to do it.
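Just to make the point concrete, here is a tiny made-up Counter class: inside a method the instance is simply an ordinary object that happens to be named self, so attribute access looks the same inside and outside.

    class Counter:
        def __init__(self):
            self.count = 0       # instance attribute, reached through an object reference

        def increment(self):
            self.count += 1      # inside the method: same object.attribute syntax, the object is just named self

    c = Counter()
    c.increment()
    print(c.count)               # outside the method: same syntax again -> 1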

For example, Java's "optional" this sucks. At least in my opinion, it serves no purpose for methods and variables. Some use it to make clear that they're referencing instance variables, some use an m_var notation. If you are a C++ programmer you could be using var_ or (if you haven't read the standard quite recently) _var.

Of course, having a clear and readable way to distinguish instance variables and methods is good. That much is clear. It makes the code easier to read.

self is boring. I often forget it and get error messages (around the ninth consecutive hour of programming this is not the worst thing I do, however). In the same sense I also forget the Ruby @. And a spectacular error is better than a hidden bug. So... go on.
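To show what I mean by a spectacular error, here is a hypothetical Greeter class (not from the discussion): forget the self. prefix and you get a loud NameError the first time the method runs, not a silently wrong value.

    class Greeter:
        def __init__(self, name):
            self.name = name

        def greet(self):
            return "hello " + name   # oops, forgot self. (and no global 'name' exists)

    Greeter("world").greet()         # blows up with NameError: name 'name' is not defined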

I don't quite like having to specify self among the formal parameters. You schiznick, don't you know it's a method? Actually, Python does not. If you take a normal function and bind it to a class, then call it through an instance, the first formal parameter receives that instance. So it's better if the function was meant that way from the beginning and had a leading self parameter.
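Here is a small sketch of that binding behaviour (the shout function and the Word class are just mine, for illustration):

    def shout(obj):              # a plain function; nothing marks it as a method
        return obj.word.upper()

    class Word:
        def __init__(self, word):
            self.word = word

    Word.yell = shout            # bind the function to the class and it becomes a method

    w = Word("hi")
    print(w.yell())              # prints HI -- w arrives as obj, exactly as self would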

Of course this is boring, but it's necessary and has no real disadvantages (apart from some additional typing... something that, with a decent editor, is not a concern; Aquamacs/Emacs or TextMate strongly advised).

And so it all comes to a head. Python has this boring self everywhere, and it is there and should be there. Ruby doesn't. The @ makes the code quite readable (especially with a decent editor). Of course it prevents name clashes and such. Not having to pass self to functions also makes it easier to refactor from functions to methods (ok, we know, in Ruby every function is a method, but that's not the point). That is, in the case where the method uses no instance variables.
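A tiny Python sketch of that last point (Report and banner are made-up names): even when the body touches no instance variables at all, moving a function into a class still costs you the extra self parameter.

    # As a free function...
    def banner(text):
        return "*** %s ***" % text

    class Report:
        # ...and as a method: the body is unchanged, but self must appear anyway.
        def banner(self, text):
            return "*** %s ***" % text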

But...
But Ruby treats variables and methods differently. Instance variables need a "special" syntax; methods don't. Looking at a bare name, it's not clear whether it refers to a local function or a method (of course Ruby does the Right Thing):

irb(main):001:0> def foo; "foo"; end
=> nil
irb(main):002:0> class Bar; def foo; "dont foo"; end
irb(main):003:1> def bar; foo; end
irb(main):004:1> end
=> nil
irb(main):005:0> b = Bar.new
=> #<Bar:0x...>
irb(main):006:0> puts b.bar
dont foo
=> nil

As I said, it does the Right Thing... I'm just talking about readability. Of course you can use self in Ruby too... but well. This is something I would have preferred to see solved in another way, even if right at the moment I find it quite acceptable to give up "one way" for a bit more pragmatism.
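For comparison, here is a rough Python counterpart of the irb session above (same toy names): the explicit self.foo() is exactly the disambiguation I'm talking about.

    def foo():
        return "foo"

    class Bar:
        def foo(self):
            return "dont foo"

        def bar(self):
            return self.foo()    # explicitly the method; a bare foo() would be the module-level function

    print(Bar().bar())           # dont foo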
