Saturday, September 11, 2010

What is this Java you are talking about?

The first time I heard about Java must have been around 1995 or so. There were these applets, and I was looking for something to program my Mac with. Back then I was not a Unix user, nor did I have any exposure to C. I remember looking at some examples and basically not understanding what it was all about (and I'm not even sure I had the tools to compile the thing). I hated it. That single episode may have had very far-reaching consequences on the developer I am today.

I seriously studied Java later. I think it was Java 1.3 or so (and I hated it again). Then I hated Java 1.4. I felt more comfortable with Java 1.5, and I almost overcame my childhood memories around the time of Java 1.6. Besides, I did write some serious code with both Java 1.4 and Java 1.5, apart from hating it.

Those who know me may be surprised to learn that I advise people to learn Java. And it is not something recent (in fact, I foresee my future will be much more Java-centric); it is something I have been doing for the last 5 years. I love Python. I love functional programming and dynamic typing. I love lots of stuff Java does not have and probably never will. I hate the static typing Java uses, I hate its threading model, and I hate checked exceptions above everything else. So why learn Java?

Because if you don't read Java, you miss a lot of interesting literature. Clean Code is written with Java in mind (and some of the issues it presents are specific to Java, while others are more universal). Patterns of Enterprise Application Architecture is "written" in Java. And so many other books. Moreover, many developers you meet are proficient in Java, and in order to introduce them to a concept or feature from any other language, you had better express it in terms of Java.

The idea I have slowly matured is that Java is gradually replacing C as a kind of lingua franca in computer science. It does not matter that Java has defects (and big ones, too). Many new technologies are created on the Java platform, and while we can discuss whether that is a good thing or not, things are going this way. Many things that once would have been done in C are now done in Java (not that I'm suggesting Java will replace C in the areas where C is strong).

A couple of days ago I realized another important truth. I had kind of figured out why the industry uses Java, but I still had to understand why academics do (apart from the joke that they love a language you can't use without classes). And the answer is the same: science is not about how good the language you use is. It is about creating, communicating, and sharing knowledge. I could write my next system in Clojure (that was exactly what I was planning to do). But then fewer people could understand and appreciate my work, and even fewer could take it and improve it.

So... let's hope they improve Java soon. :)

4 comments:

Valerio said...

Very nice post, Enrico, and a very interesting view of "Java" as a means to share and communicate.

I agree with you that Java has now become the de-facto standard in industry as well as in the academic world.
Moreover, I totally agree that Java is a language nobody can afford to miss learning (probably to avoid it the next day... just kiddin' :)

Anyway, Java lacks some features I (and you, I guess) love so much in dynamic languages such as Python.
You talk about Clojure, so I suggest this article about "Closures" by M. Fowler:
http://martinfowler.com/bliki/Closure.html


Btw, a lot of people think that Java implements the real object-oriented paradigm, and thus that knowing Java means understanding "real" OO programming, with abstract classes, interfaces, overloading and so forth.
I do not agree, and for those who trust this commonplace, I suggest this interesting article (a book, indeed)

Unknown said...

In fact, there is too much code that needs to be "shared" in academic settings and too little time. It is not acceptable to rewrite each tool in the language the group is working with. And even with all-Java stuff, too much code gets rewritten anyway, because many researchers do not release their code in the first place, or release code with no documentation and of such poor quality that reuse is not possible.

Unknown said...

About Fowler's article... well, it happens that I usually give a different definition of closure (he mentions this when talking about the different opinions on what closures mean).

I think that is because I got closures from the functional world (Lisp/Scheme), which is also the point of view Python adopts. Fowler, on the other hand, comes from Smalltalk, hence the block point of view.
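
To make the two definitions concrete, here is a minimal Python sketch (Python being the functional point of view I mentioned; the names make_counter and increment are just made up for illustration):

    def make_counter():
        count = 0
        def increment():
            # 'count' is captured from the enclosing scope:
            # the function plus its environment is a closure
            # in the functional (Lisp/Scheme) sense.
            nonlocal count
            count += 1
            return count
        return increment

    counter = make_counter()
    print(counter())  # 1
    print(counter())  # 2

    # The "block" point of view: an anonymous piece of code
    # handed to another function, as Smalltalk blocks are.
    words = sorted(["java", "clojure", "python"], key=lambda w: len(w))
    print(words)  # ['java', 'python', 'clojure']

The first half is the functional sense: increment carries its environment (count) around with it. The lambda in the second half is closer to the Smalltalk sense: an anonymous block of code passed to another routine.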

Besides, after many complaints from old Objective-C hackers (coming from Smalltalk), Apple added blocks/closures to their C implementation, which means they have been added to Objective-C as well.

Unknown said...

Of course I also agree with your point of view on Java's object model. Besides, I tend to disagree with whoever talks about the "real" or the "true" version of something.

E.g., give a definition of object-oriented programming. Call that the "true" one (for example, because it was the first). Then Java comes along with its own model... who cares if it's not the "true" one? It either sucks or it doesn't (or, more likely, something in between). That is what matters: whether it's good or not.

Of course, when they are more or less the last guys on the OO train, it is weird to hear them talk about "real OO" (especially when Kay explicitly says that when speaking about OO he had in mind something very different from C++ or Java).

But whatever, when the barbarians came to Rome...