Question

Why is most of the agent research and work done in Java? Is there a reason that the developers seem to have completely steered clear of .net framework or is it that it just doesn't get talked about among researchers as the .net is more commercial than Java?


Solution

One hypothesis on why Java is used: researchers are mostly employed by universities. Researchers typically don't code; grad students living on ramen noodles code for them. Most universities are just factories for run-of-the-mill Java programmers. Ergo, most research gets done in Java.

Another hypothesis on why .NET isn't used: disregarding Mono for a moment, .NET is tied to Microsoft's operating systems. Chances are that the work environments provided by research centers aren't running Microsoft OSs or utilities.

It most certainly doesn't have to do with commercial viability - Java is just as "commercial" as .NET, for what it's worth.

OTHER TIPS

  1. JVM is cross platform.
  2. Java has very good concurrent programming support.
  3. Extensive 3rd party libraries for nearly everything conceivable.
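As a rough illustration of point 2, the sketch below runs a handful of hypothetical "agents" as concurrent tasks on a thread pool using only the standard `java.util.concurrent` library. The class, the `agentTask` method, and the squaring behavior are all invented for illustration; real agent frameworks are far more elaborate.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AgentDemo {
    // Hypothetical agent behavior: each agent just squares its id.
    static int agentTask(int id) {
        return id * id;
    }

    public static void main(String[] args) throws Exception {
        // A fixed pool lets several agents run concurrently.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Future<Integer>> results = new ArrayList<>();
        for (int i = 0; i < 4; i++) {
            final int id = i;
            results.add(pool.submit(() -> agentTask(id)));
        }
        int sum = 0;
        for (Future<Integer> f : results) {
            sum += f.get(); // blocks until that agent finishes
        }
        pool.shutdown();
        System.out.println(sum); // 0 + 1 + 4 + 9 = 14
    }
}
```

This kind of built-in, portable thread-pool support is part of what makes the JVM attractive for concurrent, agent-style workloads without any third-party dependencies.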

.NET (MS) requires investment in licenses for both the operating system and the development platform. Java (Sun) and its target operating systems are essentially free (you pay for support), enterprise-proven, and the language is widespread.

Agent-based programming matters in scaling scenarios, and scaling out on .NET (MS) means a significant licensing investment.

The obvious answer is that Java is free (as in beer) for anything you are willing to invest time on. Other than the hardware, you can run Linux (or Open Solaris, or etc.), a free JVM, tons of free APIs - it is part of the culture, free encourages free.

The Microsoft ecosystem is more of a pay-as-you-go environment. Many tools that are free in the Java world have only paid options in the Microsoft world.

In the research world, where you have plenty of underpaid graduate students, manpower is much cheaper, so the ostensible manpower savings of a licensed commercial tool aren't as much of a benefit. Add to that that a research project may be required to run on a wide distribution of machines. When that happens in a commercial environment, the commercial entity is making money (for example, an e-commerce company that needs to add machines to its cluster is getting more traffic, more sales, and more revenue, so it can afford to expand its infrastructure). In research, the increased licensing costs of commercial software aren't necessarily justified by the underlying economics.

So all told, Java gets the preference. Once that happens, that becomes the tool everyone is talking about, and the effect snowballs to the point where .NET is crowded out.

Of course you will find exceptions to all of the above, but these points outline the trend.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow