While Java hasn't elicited elaborate rants the way PHP has, there's still a good bit of hate out there. The disadvantages are well established: Java, while fast to execute, slows down development with an incredibly verbose syntax. It's also much easier to create tangled webs of classes, so much so that it spawned the term "lasagna code" (spaghetti code with too many layers). Still, it's the most popular language out there (or at the very least, in the top three). Cynically speaking, that's because Java makes it easy to look like you're working.
Actually, yes. There just might be a place for Java in learning programming. There's lots of debate about the choice of a first language (though Python seems to be the consensus). Don't get me wrong, Java is a terrible first language (I don't care about your "standards"). There's so much overhead (Hello World can barely fit into a tweet) that teaching Java to newcomers feels like requiring a full APA paper for every homework question.
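For the record, here's the canonical Java Hello World. Count the concepts a newcomer has to wave away before the one line they actually care about: classes, `static`, method signatures, arrays, and `System.out`.

```java
// The canonical Java "Hello, World": a class, a static method,
// a String[] parameter, and a fully-qualified print call,
// all in service of a single line of output.
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello, World!");
    }
}
```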
So Java isn't appropriate for beginning programmers, nor is it a good choice for those who want to Get Things Done professionally. However, there's a huge gap between the two, and that's where I think Java might actually be useful. Remember my rant about the misuse of jQuery (and by extension, all other abstraction libraries)? Here's me, a fairly liberal programmer, advocating the use of verbosity! Next thing you know, we'll all be riding our airborne pig-mounts to work.
Seriously though, the one place verbosity shines is learning. It's much better to learn the system underneath what you're doing; you'll write better code when you move up to higher levels of abstraction. Seen that way, Java seems designed to teach object-oriented principles. Everything is a noun! Suddenly, absurdly over-classed systems become teaching examples, exaggerated for effect.
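A quick sketch of what "everything is a noun" looks like in practice. The names here (`Greeting`, `GreetingFactory`, and so on) are invented for illustration, deliberately over-engineered the way a classroom example would be:

```java
// Invented example: even "say hello" gets a noun for the message,
// a noun for its concrete form, and a noun for the thing that makes it.
interface Greeting {
    String text();
}

class FormalGreeting implements Greeting {
    public String text() { return "Good day."; }
}

class GreetingFactory {
    Greeting create() { return new FormalGreeting(); }
}

public class NounKingdom {
    public static void main(String[] args) {
        // Three classes and an interface later, we can finally greet someone.
        System.out.println(new GreetingFactory().create().text());
    }
}
```

Absurd in production, but as a teaching device the exaggeration makes the roles (abstraction, implementation, creator) impossible to miss.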
This exaggeration also helps teach design patterns. Here's an admission: I never got around to really learning design patterns until I had to in software engineering. Up until then, I was using event listeners without realizing that they were a great example of the Observer pattern. Java's ridiculous explicitness forces students to learn both the how and why of each pattern.
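To make the event-listener connection concrete, here's a minimal Observer sketch modeled on Java's listener idiom. The `Button` and `ClickListener` names are invented for illustration; the structure is the point: a subject keeps a list of observers and notifies each one when something happens.

```java
import java.util.ArrayList;
import java.util.List;

// The observer: anything that wants to hear about clicks implements this.
interface ClickListener {
    void onClick();
}

// The subject: it owns the list of observers and does the notifying.
class Button {
    private final List<ClickListener> listeners = new ArrayList<>();

    void addClickListener(ClickListener l) {
        listeners.add(l);
    }

    // Simulate a click: every registered observer gets called.
    void click() {
        for (ClickListener l : listeners) {
            l.onClick();
        }
    }
}

public class ObserverDemo {
    public static void main(String[] args) {
        Button button = new Button();
        button.addClickListener(() -> System.out.println("clicked"));
        button.click();
    }
}
```

Every piece of the pattern (subject, observer, registration, notification) is spelled out in its own named construct, which is exactly the explicitness that makes it teachable.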
Regardless of how useful heavy OO is to the professional dev, it's good to know. One could almost argue that learning Java and extreme OO at this stage will teach our hypothetical student exactly why such extremes are a bad idea.