When Java first came out in the 1990s, features such as garbage collection and "write once, run anywhere" were killer features for its adoption compared to the then-dominant C++. As time has gone by, other languages and platforms have caught up: among higher-level languages, garbage collection and the same code running on multiple platforms are the norm (though cross-compilation may be required). Java's initial killer features are no longer valid arguments for using Java.
Libraries & Frameworks Not Required

Instead, these days many people point towards Java's "ecosystem" of libraries and frameworks as a reason to stick to Java, or at least to prefer other JVM-based languages, such as Scala or Clojure, over non-JVM languages.
Personally I think this argument is pretty weak: the reason Java has a large ecosystem of libraries and frameworks in the first place is that doing even the simplest things in Java is a laborious task; using a third-party library is the go-to solution if you want to save hundreds of hours of work.
My point is that the "libraries and frameworks" argument is self-referential: they are required in the JVM ecosystem because Java is inherently an unproductive language that requires a lot of effort and boilerplate.
The "libraries and frameworks" argument simply doesn't hold true if you use a more productive language such as Scala, Haskell or a Lisp such as Clojure - the language choice itself will allow a competent developer to easily and quickly do things that are hard and laborious in the world of Java. I'd argue that there is no longer a valid reason to instinctively stick to the JVM platform because of the "Java ecosystem". In fact, though Scala and Clojure are JVM languages, I've found in practice that relying too heavily on Java-based libraries and frameworks can hold you back considerably, as they do not support the idioms and productivity boosters that are possible in Scala or Clojure.
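To make the boilerplate point concrete, here is a small, hypothetical example (not taken from any particular codebase): summing the squares of the even numbers in a list. In classic (pre-lambda) Java this takes a full class, an explicit loop and mutable state:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical example: sum the squares of the even numbers in a list.
public class SquareSum {
    static int sumOfEvenSquares(List<Integer> numbers) {
        int sum = 0;                // mutable accumulator
        for (int n : numbers) {     // explicit iteration
            if (n % 2 == 0) {
                sum += n * n;
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        // 2*2 + 4*4 + 6*6 = 56
        System.out.println(sumOfEvenSquares(Arrays.asList(1, 2, 3, 4, 5, 6)));
    }
}
```

In Scala the same computation is a single expression with no class ceremony and no mutable state: `numbers.filter(_ % 2 == 0).map(n => n * n).sum`. Multiply that difference across a codebase and the productivity gap is what drives Java developers to reach for libraries in the first place.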
What About Familiarity & Maintenance?

Another argument that may be used for the JVM as a platform is the familiarity of developers and operations people with running and maintaining the platform. This may hold some truth in the classical environment where the developers who write a piece of software are different from those who maintain and run it after it has gone live. However, I have repeatedly argued that this is an organisational anti-pattern:
Cost of software ownership tends to be high precisely because the people who initially develop an application are different from those who are then responsible for running and maintaining it. Regardless of professionalism, the mere fact that someone will not feel the pain of the daily running of a system means that they are likely to make trade-offs that adversely affect running and maintenance costs. If someone knows they will maintain a system, they are more likely to put in the effort to ensure that the 3am Saturday-morning support call doesn't come; it's simple human nature.
The second part of the familiarity argument is developer familiarity, and I'll concede this is one that may hold some weight in the short term, but in the long term it is irrelevant: from helping train several Java developers in Scala, I've found that the "softly, softly" approach of writing "better Java" actually results in slower learning - the best results are reaped when people are forced to make a clean break with old bad habits and learn new good ones. Yes, there is a period of some weeks or even a month where productivity will be lower, but over just a few months, productivity and skill will be considerably improved, so the pay-off term is relatively short.
Making a clean break with imperative programming in general and Java in particular has great rewards for an organisation. There is no reason in the second decade of the 21st century to cling to the old safety blanket of programming in a way that simulates manipulating computer memory registers when we have much better tools with much higher levels of abstraction available to us.