JDK 15 (java.net)
264 points by aib on Sept 15, 2020 | hide | past | favorite | 212 comments


I'm so happy that we finally have multiline text blocks in Java:

  String query = """
               SELECT "EMP_ID", "LAST_NAME" FROM "EMPLOYEE_TB"
               WHERE "CITY" = 'INDIANAPOLIS'
               ORDER BY "EMP_ID", "LAST_NAME";
               """;

It's one step closer to Scala/Kotlin/etc.


Shame they don't actually include templating inside of the string literal. You still have to append with a lot of tokenized replaces. Since they've committed to no formatting in the literal section, it will be interesting to see if they eventually support a syntax like Python's f-strings: f"""{foo}"""
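For concreteness, the "tokenized replace" workaround looks something like this, a minimal sketch (the `interpolate` helper and its `${...}` syntax are hypothetical, not a JDK API):

```java
import java.util.Map;

public class NamedFormat {
    // Hypothetical helper: naive named-placeholder substitution.
    // This is what "a lot of tokenized replaces" amounts to in practice.
    static String interpolate(String template, Map<String, String> values) {
        String result = template;
        for (var e : values.entrySet()) {
            result = result.replace("${" + e.getKey() + "}", e.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        String query = """
                SELECT * FROM "EMPLOYEE_TB" WHERE "CITY" = '${city}';
                """;
        System.out.println(interpolate(query, Map.of("city", "INDIANAPOLIS")));
    }
}
```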


Maybe you're looking for String::formatted?

Directly from http://openjdk.java.net/jeps/378:

  Another alternative involves the introduction of a new instance method, String::formatted, which could be used as follows:

  String source = """
                  public void print(%s object) {
                      System.out.println(Objects.toString(object));
                  }
                  """.formatted(type);


I'd still prefer honest-to-goodness string interpolation. Formatting functions with positional placeholders need a linter to be kept (vaguely) maintainable, and I suspect that formatting functions with named placeholders would be difficult to make acceptably ergonomic or performant in Java.


> I suspect that formatting functions with named placeholders would be difficult to make acceptably ergonomic or performant in Java

yamlp performs string interpolation from YAML files:

https://bitbucket.org/djarvis/yamlp/

I reuse my library in my text editor:

https://github.com/DaveJarvis/scrivenvar/

The text editor can interpolate strings in a variety of contexts (such as Markdown, R Markdown, XML, and R XML documents):

https://www.youtube.com/watch?v=u_dFd6UhdV8

As far as I can tell, there is no performance degradation from the recursive interpolation; the editor reads, interpolates key/value pairs, and substitutes the resulting values into R Markdown documents riddled with up to a thousand simple formulas in about 600 milliseconds (on my desktop computer).

Also, the editor provides a way to maintain the interpolated variables in a tree-like view:

https://github.com/DaveJarvis/scrivenvar/blob/master/README....

The substituted values appear in the preview pane in their final form. This is all real-time and pure Java.


Which one is easier to mess up and harder to debug afterwards?

A. "${label1} ${label2} ${label3} .... ${label 400}"

B. "%s %s %s ... %s".formatted(label1, label2, label23, ... label400) // Ooops, typo!


For the particular typo you gave, it seems to me to be equally likely with either method.


I'll admit it wasn't the best example.

Here, have another one:

I want to have a string with this content:

"Fluffy Nero Polly<random text> ... <random text> Trexxie"

Which one is easier to figure out and debug?

A. "${cat} ${dog} ${bird} <random text> ... <random text> ${dinosaur}"

B. "%s %s %s <random text> ... <random text> %s".formatted(dog, giraffe, dinosaur, ..., cat)


Maybe A, but not (IMO) dramatically so. But shouldn't A actually be "${cat} ${dog} ${giraffe} <random text> ... <random text> ${dinosaur}"? That makes it closer in difficulty to B.

To me, the difference shows up more with this scenario:

A: "${v1} ${v2} ${v3} ${v4}"

B1: "%s %s %s %s".formatted(v1, v2, v3)

B2: "%s %s %s %s".formatted(v1, v2, v3, v4, v5)

B3: "%s %s %s".formatted(v1, v2, v3, v4)

A does what you tell it, whether it's wrong or right. B has a chance of warning you that the argument list and the format specifier don't match. On the other hand, B gives you two things that have to be kept in sync with each other, and A can't get out of sync, since there's only one thing.

So: Less of a chance to make the error, or more of a chance to catch it. Which is better? I lean toward A, but I will admit that it seems subjective.
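For what it's worth, javac itself checks neither case. A sketch of what B actually gives you at runtime (`tryFormat` is a hypothetical wrapper, just for illustration): too few arguments throw, while extra arguments pass silently.

```java
import java.util.MissingFormatArgumentException;

class FormatMismatch {
    // Hypothetical wrapper to expose the runtime behavior of formatted().
    static String tryFormat(String fmt, Object... args) {
        try {
            return fmt.formatted(args);
        } catch (MissingFormatArgumentException e) {
            return "ERROR: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        // B2: extra arguments are silently ignored
        System.out.println(tryFormat("%s %s %s %s", "v1", "v2", "v3", "v4", "v5"));
        // B1: too few arguments throw MissingFormatArgumentException
        System.out.println(tryFormat("%s %s %s %s", "v1", "v2", "v3"));
    }
}
```

So the "chance of warning" in B comes from a runtime exception (or an IDE/linter), not the compiler, and only for the too-few-arguments half of the mistakes.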


> Less of a chance to make the error, or more of a chance to catch it.

B only catches mistakes in the case that the number of parameters doesn't match the number of placeholders, which isn't even possible with A. If the string you built isn't correct, that's on you in either case, and should be immediately obvious by looking at the output. So in that sense, interpolation is strictly better than `formatted` at dealing with potential mistakes.


Go doesn't have interpolated strings, just printf / sprintf and co, but its vet tool (part of the standard toolchain) will warn if your arguments do not line up or have the wrong type.

I mean it's not ideal, but sprintf and co are only intended for relatively short text (like logging), if you have any more, use a templating language instead. Plenty of those in Java as well (JSP, Velocity, etc).


This was already there, and is much less user-friendly than string interpolation, because it's easy to mess up order


http://manifold.systems/ has a Java template engine that can be helpful


Will definitely check it out!


The Manifold project[1] provides string interpolation for Java. You enable it as a Java compiler plugin, it's pretty cool.

[1] https://github.com/manifold-systems/manifold/tree/master/man...


While cool, it's pretty telling that similarly high-level languages like C#, Kotlin, Scala, Python, Swift and JavaScript have had this widely used quality-of-life feature for years while Java doesn't.


And by telling I think you mean that over time Java has very thoughtfully evolved in a way that respects backwards compatibility, and that once a feature is in the language it'll very likely stay there for the next 25 years. Yes, innovation is very important (and happening faster than ever now with the 6-month release cadence), but the majority of the work isn't in deciding what goes in; it's deciding what stays out, and, if something does go in, how it satisfies both audiences of "move slow, don't break" and "move faster, we want features".

A good video on this process: https://www.youtube.com/watch?v=2y5Pv4yN0b0


Thanks. Looks very interesting


I'm glad Java is being conservative and not doing that. JS/Python f-strings have not proven themselves long-term. It also bloats the language with yet another mini-DSL. You can always add a proper templating engine on top.


Java relies on String + Object concatenation for composing strings, unsafely converting any object into a string. An implicit conversion, akin to what JavaScript does.

String interpolation would be safer, as it would make the intent clearer, and you wouldn't risk bumping into corner cases. It would be even safer if the protocol could be overridden so that you wouldn't have to use Object#toString, but for Java that's too much already.

Java has been a very conservative language. And I understand why.

But it's funny how, for many features that were added later, people were rationalizing their absence with such lines too. E.g. we don't need anonymous functions / lambdas, as anonymous classes are enough. Well, turned out that Microsoft was right all along when they released those in J#.

Also adding a template engine is overkill for doing string concatenation.


> But it's funny how, for many features that were added later, people were rationalizing their absence with such lines too

There's nothing wrong with that. If string interpolation is really useful, then Java will add it some years down the line.

It's cheap to add features but literally impossible to remove them. There's no way to undo a mistake in language design. That's an asymmetry. I'd rather err on the side of caution than kitchen-sinking it.


Oh, I don't know about that. I wouldn't call it impossible.

I like how Java binaries still work on the latest Java, however, if distribution happens via binaries, why should the language keep source compatibility anyway?

Or, you know, the latest compiler could allow you to select the source code version you want. And automated code migration tools can work too.

---

Erring on the side of not getting features is why Java has lost a lot of mindshare.

Java is still super popular, but that's basically in spite of the language itself, because the language is awful.


> If string interpolation is really useful, then Java will add it some years down the line.

Now is the year. String interpolation is a really harmless feature.


It deeply pains me that String is only a concrete type and that Object.toString() does not implement an interface.


That was J++; J# was the transition into .NET.


Indeed, you're correct.


> JS/Python f-strings have not proven themselves long-term.

Are you saying that they haven't been around for long or that they failed? If the latter, then I respectfully disagree. Any recent JS and Python code I have seen uses them extensively. They are also a joy to use, and imho improve readability immensely. Ergonomics matter. This is one aspect of Python 3 I would miss the most if I had to go back to 2.


Conservative? We've had "f-strings" since 1979, at least.


The ones in Python work significantly differently though; they have inline code execution (rather than using a varargs list after the f-string itself).


So does shell script and several modern programming langs. The sky hasn't yet fallen.


The Bourne shell had parameter expansion in strings in 1979, just FYI.


I've just executed:

echo "$(ls)"

in my zsh, so I'm not sure what to say, to be honest.


Closer to:

echo "Username is ${USER}"

As I recall (and Wikipedia seems to confirm), parameter substitutions were in the original bourne shell in 1979, so... Yeah, I'm not sure what's going on there.


Does this result in a string with a hanging indent, or is the common prefix removed?

(Sorry, being somewhat lazy, I could pull up the spec).


The common prefix is removed:

> Incidental white space surrounding the content, introduced to match the indentation of Java source code, is removed.


I hate this kind of magic. It's very convenient at first but can cost countless hours of debugging in some cases.

I like how Scala does this via the indent marker and strip.


I was going to answer gp "probably not" - but I would've been wrong. It looks to me like Java now has some of the most sophisticated in-line multi-line strings/here-documents that I'm aware of in any language. More details in:

https://openjdk.java.net/jeps/378

I'm not entirely sure "most sophisticated" is a good thing - but the JEP looks pretty thorough at least.

Ed: how does Scala do it? Java uses the indents for the closing triple-quote to determine stripping/indent + cutting trailing spaces. Ie, if my reading is correct:

  a = """
         OK.
         """
Is "OK.\n" (even if there are any spaces after the dot).


At the string literal level Scala doesn't do anything special, but the standard library adds an extension method on String called `stripMargin`. This has some nice examples: https://www.oreilly.com/library/view/scala-cookbook/97814493...


The indentation is removed at compile time, per the JLS: https://docs.oracle.com/javase/specs/jls/se15/html/jls-3.htm...

> Incidental white space is removed, as if by execution of String.stripIndent on the characters resulting from step 1.

The new stripIndent() method is provided as a convenience to developers.
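A small sketch of both behaviors side by side (assuming JDK 15+):

```java
public class IndentDemo {
    public static void main(String[] args) {
        // In a text block, incidental indentation is stripped at compile
        // time, relative to the least-indented line (the line holding the
        // closing delimiter counts too).
        String block = """
                line one
                line two
                """;
        System.out.print(block); // no leading spaces survive

        // stripIndent() applies the same algorithm to an ordinary string.
        String manual = "    a\n    b".stripIndent();
        System.out.print(manual); // "a\nb"
    }
}
```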


That's for Java, the parent is talking about Scala.


It helps that Java is exclusively a compiled programming language, so these are just syntactic sugar concerns for the parser/lexer and the compiled bytecode ends up looking identical to having just used one long string with inline \n.

So given that, I don't think it really ends up being that sophisticated. I think this could be a decent easy programming question for a technical interview though!


I had to use this recently for an online course. It was super nice:

  def pushConstant(num: Int): String =
    s"""
       |@$num
       |D=A
       |$accessTopOfStack
       |M=D
       |$incrementStackPointer
       |""".stripMargin
accessTopOfStack and incrementStackPointer are other functions that return text. It made composing Assembly a breeze.


Yup, and it’s super nice for json or yaml strings.


IntelliJ shows you exactly where the block starts, so if you have a line with less indentation (causing the entire block to start at that column), it is easy to spot.


Thanks for sharing that. I can definitely see myself using the multi-line blocks since they are much better than the kind of things we need to do today.

At the same time I'm slightly scared of that magic. It helps that it's well-defined in the JEP.


Thanks, quote much appreciated!


If you’re writing code like this I can highly recommend http://www.jooq.org/


and closer to humans


I'm waiting for Try and Either in Java. Multiline text blocks are good but not enough.


If you want those, you can use Vavr[0].

Ultimately the issue isn't the presence or absence of those classes (because they aren't hard to write minimal versions of as a one-off if you need them and can't allow the dependency), it's the pervasiveness of use (or lack thereof) by libraries and especially the standard library.

If Try and Either were added to the Java stdlib tomorrow, the rest of the stdlib would still be stuck throwing exceptions for error propagation, because they can't change the return types of existing methods. The only benefit would be that third-party libraries could feel comfortable enough to use them in their type signatures. But, in practice, most libraries still target Java 8, so I wouldn't hold my breath for that, either.

[0] https://vavr.io
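A minimal version is indeed short. Here's a sketch of an Either (hypothetical, not Vavr's API; it leans on sealed interfaces, records, and pattern-matching switch, so it needs a much newer JDK than 15):

```java
import java.util.function.Function;

sealed interface Either<L, R> {
    record Left<L, R>(L value) implements Either<L, R> {}
    record Right<L, R>(R value) implements Either<L, R> {}

    static <L, R> Either<L, R> left(L value)  { return new Left<>(value); }
    static <L, R> Either<L, R> right(R value) { return new Right<>(value); }

    // Collapse both cases into one result; the switch is exhaustive
    // because the compiler knows these are the only two subtypes.
    default <T> T fold(Function<L, T> onLeft, Function<R, T> onRight) {
        return switch (this) {
            case Left<L, R> l  -> onLeft.apply(l.value());
            case Right<L, R> r -> onRight.apply(r.value());
        };
    }
}
```

Usage: `Either.<String, Integer>right(42).fold(err -> -1, v -> v + 1)` yields 43, with no exceptions in the signature - which is exactly the part the stdlib can't retrofit.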


I think we can do better than Java, Scala and Kotlin. All of these ended up as piles of abstractions and complexity. Scala went too hardcore for somebody who just wants to deliver. Kotlin gives you anxiety every time they push for supporting a new platform. Java, though it is still nice, has accumulated too much cruft over the years.

I wish for a Go-like language on the JVM: a modern language that offers you a minimum number of features to build upon. Moreover, lower, well-thought-out abstractions would give a boost in performance with the new garbage collectors that have been added to the JVM.

I can imagine Clojure is something I would like to try again sometime, but the fact that it is a Lisp and lacks even simple types makes it less attractive to the majority of devs. Also, Clojure wouldn't be my choice for high performance if I want to do mostly imperative stuff.


Scala is perfect for just delivering; my business partner and I have been using it to rapidly prototype and build for a decade now. You just avoid all the stuff that tries to make Scala into Haskell and use it as the hybrid OO/functional language it was designed as.


You're lucky that you can do that kind of work. I've been working for a state company in Bulgaria, and we had to maintain some Scala pipelines built with Apache Spark some years ago. It wasn't a pleasant experience: just a sea of vars/vals and cases of reinventing the wheel only to use some functional constructs, when it could have been done simply in Java by adding a dependency that had already solved that problem ten years earlier. It wasn't even complicated logic, just calling different APIs and playing with dataframes, but the fact that they hid so much behavior from me only to reduce verbosity gave me some stressful days.


Any particular reason Scala was chosen? Spark has Python and Java APIs as well.


Why not use that java dependency in scala?


Culture it seems.

Functional style is cool and some people will go to great lengths to use it.

Like shaving a few characters of a perfectly fine piece of code by using currying and instead make one single long line of code describing it.


Guest language syndrome. All libraries must be idiomatic instead of reusing the platform language based libraries.


Exactly that.


Use Groovy.


+1 to Groovy.

Can be very simple and elegant, like a better Python, on the JVM.

But also very rigorous, as a Java superset, with powerful features like mixing dynamic and static compilation in the same class.

Or even deep, with run-time metaprogramming and compile-time AST manipulation capabilities (giving tools like Gradle, Spock...).

And you have of course multiline interpolated Strings. And so much.


I've been using Groovy extensively to script and automate inside of Jenkins, and hoo boy it's actually kinda nice.

My absolute favorite thing is being able to do something like:

    someIntegerCollection = someStringCollection*.toInteger()
I also find the use of closures for things like collection filtering to be quite nice:

    canadianUsers = someUserCollection.findAll { user ->
      user.countryCode == "CA"
    }
The equivalent in Python is a little more cumbersome and overly verbose:

    canadianUsers = [user for user in someUserCollection if user.countrycode == "CA"]
I'll still be writing things in Python whenever I can, but when I do write in Groovy there's a lot of nice stuff I get to have.


And if you need it, static typing is one annotation away... This is such an underestimated feature. While prototyping it might be easier to use dynamic typing, but once a project becomes complex and has more people working on it, you might want to switch to static typing. Groovy simplifies this immensely and minimizes the amount of code that needs to be rewritten.


I guess that is the stack behind the great Aviato? Groovy from back to front.


I'd be good with TypeScript running on the JVM, even.


I've got a few very specific use cases where I'm going to be so glad to have sealed classes[0]. Oddly not mentioned in the release notes, but they're like a final class, except you can say "These classes are allowed to be subclasses of this class".

Why is that useful? Because now you can do a switch statement that matches on type:

    int getCenter(Shape shape) {
        return switch (shape) {
            case Circle c    -> ... c.center() ...
            case Rectangle r -> ... r.length() ...
            case Square s    -> ... s.side() ...
        };
    }
As long as Shape is sealed, the compiler can be confident that these are all the possible subclasses of Shape.

[0] http://openjdk.java.net/jeps/360
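A runnable version of the sketch above, with hypothetical record shapes filled in (note that sealed types were only a preview in JDK 15, and the pattern-matching switch shown here only became standard much later, in JDK 21):

```java
sealed interface Shape permits Circle, Rectangle, Square {}
record Circle(double radius) implements Shape {}
record Rectangle(double length, double width) implements Shape {}
record Square(double side) implements Shape {}

class Areas {
    static double area(Shape shape) {
        // No default branch: because Shape is sealed, the compiler can
        // verify that these three cases are exhaustive.
        return switch (shape) {
            case Circle c    -> Math.PI * c.radius() * c.radius();
            case Rectangle r -> r.length() * r.width();
            case Square s    -> s.side() * s.side();
        };
    }
}
```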


Sealed classes also mean you don't have to play visibility tricks with your API to prevent an abstract class or interface from being inherited and can safely publish everything as part of your API without worry.


Wasn't that the point of jigsaw?


Note that this is not in JDK 15, but an upcoming direction for the language:

> The ability to reason clearly and conclusively about permitted subclasses will be realized in a future release that supports pattern matching.


What, really?

But http://openjdk.java.net/jeps/360 says:

> Release 15

Have they lied to me? Those monsters.


That page is titled: "JEP 360: Sealed Classes (Preview)"


... good enough for me. I'll add whatever flags I need and begin depending on it in production code.


I'm really intrigued at what pattern matching in Java will look like, and happy that our Java-land brethren will be able to use them. It's one of the features that I can't live without even though at first I thought of them as weird and "multiple ways of doing the same thing".


If you are interested, the draft for the pattern matching can be found here: https://openjdk.java.net/jeps/8213076

If the final version ends up working as explained in the draft, it is going to be a fantastic addition to the language.


in case you missed it, you might find this interesting: https://mail.openjdk.java.net/pipermail/amber-spec-experts/2...


Wow, good to see they have borrowed yet another feature from Scala.


In a sense, Java adoption of a language feature is the ultimate validation for said feature.


This is something I really like about the recent direction of JVM languages. It takes a bit for useful things to make their way into Java itself, but by the time they do they're well-shaken and we can see how they got used in practice.


Algebraic data types have been available in various programming languages for more than 40 years.


Sure, sure. But the point here is that there are JVM languages that have shown that these things can be done well on the JVM.


And Scala even uses the same keyword "sealed". Glad they took that on the Java side, and didn't change it just for change's sake. Maybe there are other languages that had a sealed keyword but again, with Java, they are clearly accepting as convention some of the things that Scala established, which is a good thing.


C# had sealed first.


Maybe your example is just too simple, but why would you do that instead of overriding the getCenter method in each subclass?


You're right that in the simplified example there are probably nicer alternatives, but having sealed classes that you know others can't extend has its benefits for designing some nice APIs, like state machines and type-safe builders.

The benefit of enums (as these are referred to in some languages) is that after the safe downcast you have access to the fields and methods of the specific type just by invoking them.

Pattern matching also makes this construct even nicer to use (and I would argue that having one without the other is worse than having neither), but that doesn't seem to be included in the language yet, very probably for good reasons (I just don't follow Java development closely enough to know them).


Pattern matching is available as a preview feature: https://openjdk.java.net/jeps/375
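The instanceof form (a preview in JDK 15, standard since JDK 16) already gives a taste of it - the test, the cast, and the binding collapse into one step:

```java
class InstanceofDemo {
    static int lengthOf(Object obj) {
        // Old style: if (obj instanceof String) { String s = (String) obj; ... }
        if (obj instanceof String s) {
            return s.length(); // s is in scope here, already typed as String
        }
        return -1;
    }
}
```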


If you're asking for an example you'd be more likely to use, JSON is a pretty good one. The GSON interface [0] has isArray()/getAsJsonArray(), isObject()/getAsJsonObject(), etc. for determining the type of an element, but this isn't type-safe. Pattern matching is a more convenient way to work with raw JSON:

  public int countNodes(JsonElement el) {
    return switch (el) {
      case JsonObject obj ->
           obj.entrySet().stream()
              .mapToInt(e -> countNodes(e.getValue())).sum();
      case JsonArray arr -> {
           int sum = 0;
           for (JsonElement child : arr) sum += countNodes(child);
           yield sum;
      }
      default -> 1;
    };
  }
[0] https://www.javadoc.io/doc/com.google.code.gson/gson/latest/...


This is very typical in functional programming languages. You add functionality through compiler assisted ”pattern matching” instead of method polymorphism. Its benefits are more apparent if the feature is not so obviously inherent to the classes. Say you are defining the way to transmit the shapes to some random legacy system. Then you can stow away the legacy related code somewhere and still have the compiler check that you have handled all shapes.


Sealed classes don't have to have anything in common. Sealed classes are an example of a discriminated union. A contrived example is you might have a function that takes or returns circles or ducks but _nothing else_. Not triangles, not pigs. It has to be a circle or it has to quack.

Where are these used? Well, they are used in a sense all over the Java language already in the form of methods with checked exceptions, which say in their contract that they return either a successful result or any of the declared errors but nothing else.


I would assume the Shape family lives in a third-party library and getCenter lives in first-party code.


Wow, I can't wait for this. May actually enjoy writing Java if this and Records get merged in. Absolutely massive for decomposing functionality.

Too bad it doesn't appear to be coming in 15 :(


Just saying that Kotlin has had this since ages.


Nice to see ZGC marked as production-ready. With Shenandoah, ZGC, and G1 the options for garbage collection in Java are so much better than they were just a couple of years ago. Even better, the need to do a bunch of brittle tuning and tweaking has mostly gone away.

Java still makes the GC work harder than other languages. In some of my testing, idiomatic "microservice" Java code makes ~10x as much garbage per request as equivalent idiomatic Go code. Some of that is the language, and some is just that libraries and frameworks are super allocation-heavy. If the ecosystem can tighten that up, then these great GCs will have even more impact.


I believe the main reason for this is that Golang is heavily "value-oriented". Combined with escape analysis it causes most of the short-lived objects to actually be allocated directly on the stack.

While the JVM also has escape analysis, because everything is a pointer the chances of an object escaping is considerably higher.

I'm hoping that when Project Valhalla eventually lands, it'll help greatly with allocations and reduce memory pressure even further.


Fully agreed about Java libraries and frameworks being more allocation-heavy than they need to be. The problem is that the libraries and frameworks are built using the abstraction capabilities the language has, and in Java, those abstraction capabilities are very allocation-heavy.

I'd love to see a JVM or Graal implementation of ASAP memory management. The general idea is extremely aggressive analysis of your code: reducing heap allocations where possible, statically managing objects whose lifetimes can provably be inferred, reference counting everything left over that provably can't result in cycles, and then garbage collecting everything else.


> Fully agreed about java libraries and frameworks being more allocation-heavy than they need to be.

Much to the chagrin of Martin Thompson, even the standard library is affected. Avoiding allocations is something most Java programmers don't even think about, because they are not aware it might be an issue and because there is no alternative - in Go and Rust you can stack-allocate instead, in Java you can't (not really).


GraalVM has Partial Escape Analysis, which improves on HotSpot's Escape Analysis by finding even more allocations that are suitable for Scalar Replacement. It's supposed to be especially well suited for highly functional code e.g. anything that makes heavy use of closures.


"ASAP memory management"

Do you mean this?

ASAP: As Static As Possible memory management

https://www.cl.cam.ac.uk/techreports/UCAM-CL-TR-908.pdf


Yes. To be clear though, the idea is already hard enough to implement, but it would be incomprehensibly hard in the JVM ecosystem. The JVM directly implements allocation decisions, but the analysis necessary for ASAP is done mostly in higher-level representations of code, and there is no direct API in Java bytecode to tell the JVM to statically allocate or reference count a specific object.


Hardly incomprehensibly hard. It's been done already, it just never got out of the lab into the product.

http://codrutstancu.com/p81-stancu.pdf

SVM has support for snapshots/isolates, where you can ~instantly instantiate a new JVM with a fresh heap. If you set your GC params such that it never really collects, you get something close to that but without the complexity.


Guess so. That paper is pretty well known.


> reference counting everything left over that provably can't result in cycles, and then garbage collecting everything else

This is interesting. I skimmed some introductory parts of https://www.cl.cam.ac.uk/techreports/UCAM-CL-TR-908.pdf as well as the definition of the SCAN function and searched the PDF for "reference count" to see how it does what you propose. At the moment I don't have the impression that ASAP dynamically reference counts anything. Let alone selectively reference count certain objects but use a tracing GC for others. Could you tell me where exactly to look for details? Wouldn't references from RC objects to GC objects, and vice versa, be an absolute pain to deal with?


Sorry, I seem to have partly confused this with another approach, but can't find the paper at the moment.

That being said, it does read like it uses approaches similar to automated reference counting: code is inserted by the compiler to manage reference counts and eventually dispose of objects when no longer referenced. It also appears that there are approaches toward using regions and linear types where possible. I think the takeaway, though, is that ASAP makes as many decisions as it possibly can statically to enforce memory safety and reclamation, while building a finely grained, program-specific approach to garbage collection for anything else. Literally every distinct program would result in a distinct garbage collector tailored to that program, inlined into the code.


Depends, if you look at projects like Elasticsearch, you see a shift towards using a lot of off heap memory and less reliance on heap over time. Mostly the JVM and languages on the JVM are evolving towards things like having immutable data structures, async processing and IO, using data classes/structs, etc.

A lot of my recent projects have actually been fairly modest in terms of memory requirements compared to a few years ago where I'd have thought nothing of using 4GB heap on a simple spring app.


What is idiomatic in Java? Netty does little-to-no allocations. One can implement very thin layer on top of Netty to provide API similar to Go standard HTTP server.


You have to work extra hard and give up language niceties to reduce allocations, more so than certain other languages that are more value oriented.

You want a List<T> of objects? It's now a list of pointers to individual allocations. You want a Path class instead of just passing strings around, for type safety? One extra allocation per object.


I thought the rule of thumb was that only live objects matter for GC performance (mark-and-sweep only counts what's alive and just purges everything else). Is that not true in modern era?


The more allocations you have, the more frequently your young-gen GC has to run (or the larger your heap size needs to be to keep GC frequency constant). Allocations are cheap, but they aren't free, and you definitely pay a cost in GC.


I haven't been this excited about Java since JDK 1.2.

Especially Project Loom. Virtual threads will be a game changer. A return to original Java's rallying cry of programmer productivity.

There's always been steady progress, sure. But it seems like Oracle is much less risk averse. Or maybe the governance changed. Or maybe Sun, IBM, and others had been holding back a lot of progress. I'd love to know the backstories, to better inform future efforts.

Regardless, Java today is amazing and rapidly getting better.


Nice. We talk about this quite a bit. Start with https://inside.java and https://inside.java/podcast.

There are lots of reasons for the continual stream of innovation but the biggest change has been the 6-month release cadence.


> Virtual threads will be a game changer.

I never really understand this. Tasks and threadpools work great. Are you excited that you can reduce memory usage? I have never been restricted before.


What if your tasks use blocking IO, and the number of tasks you have exceeds a reasonable number of threads?

Your answer might be to use async, event-driven IO instead, but the central thesis behind virtual threads is that this is a terrible programming model, and there's no reason the programming language can't help you write your code top to bottom, but still get the desired performance.



Do you know if Loom has any implications for reactive frameworks, like reactive spring? Or do they already contain a... "non native" implementation of this concept?

https://spring.io/reactive


With Loom, the simple, blocking, non-reactive frameworks (like Spring MVC) should give you similar throughput to the reactive ones -- that's the main impact. I don't know how reactive frameworks intend to make use of Loom.


Loom will be a performance game changer even for reactive frameworks. I'm not even talking about the fact that it uses io_uring and other state-of-the-art OS technologies. I'm talking about the fact that it allows a non-blocking socket, thus enabling for the first time asynchronous JDBC access, and thus PostgreSQL pipelining/batching. This will allow Spring to come in among the top 4 fastest HTTP servers on earth according to the benchmark game.


Thanks I'm honored you're replying. :) Looks like performance improvement will be nice.


I really liked Ron Pressler's [1] presentation on his work. The details from the Q&A really helped me.

https://youtu.be/23HjZBOIshY?t=152

This is good too.

https://archive.fosdem.org/2020/schedule/event/loom/

I'll butcher it, but my TLDR is: Program sequentially, get threading for free. Any blocking calls are now async under the hood.

[1] aka @pron or https://news.ycombinator.com/user?id=pron


Investment in Java slowed down in Sun's dying days. Oracle has increased the investment considerably.


What's funny is there were versions of the JVM for Sparc way, way back in time (Java 1.1) that had so-called green threads which were multiple threads mapped to a single (or multiple) native threads.


I think you may be confusing two different concepts here, which is understandable given the fact that this happened some 23 years ago.

The first versions of Java only had green threads. That was a cooperative threading model where only a single OS thread was used and the VM did a context switch while blocking for IO (in essence, every method declared as throwing InterruptedException could potentially context switch).

The mixed model you were talking about is most likely the threading model of Solaris where threads were split into a userspace part and a kernel part. It was possible to configure pthreads so as to use a multiple userspace threads on top of a single kernel thread (in a similar way as green threads worked in Java). It was also possible to run it in a one-to-one configuration or a version where you had a large number of userspace threads mapped to a smaller number of kernel threads. The last model was default in Solaris for a long time.

Eventually it was discovered that kernel threads were so fast in Solaris that there was practically never any benefit to the thread-pool model, and by Solaris 2.6 (I think, it's been a while so I could be wrong about the version) they changed it so that the default thread model was one-to-one.


Is Project Loom included in JDK 15? I really would like to try it out.


Each of the projects (Loom, Panama, Valhalla, Amber, etc.) are in different stages and are intended to deliver features over time. Project Loom itself is not a single deliverable. As mentioned, you can get an early-access build of most of the work that has gone into it so far [1]. The build is based off an incomplete JDK 16 as of this comment. JDK 15 does have one JEP that is part of the path towards Loom [2].

[1] https://jdk.java.net/loom/

[2] https://openjdk.java.net/jeps/373


There are experimental releases.

https://jdk.java.net/loom/


virtual threads are goroutines in Java. In golang they made a huge difference - for example you can have low-overhead generators http://www.golangpatterns.info/concurrency/generators (goroutines are one of the selling points of the Go language/runtime).
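For illustration, a sketch of that Go-style generator pattern in Java, using a bounded queue as the "channel" (with a platform thread per generator this is comparatively heavy, which is exactly what virtual threads would fix; all names are made up):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class Generator {
    // A producer thread feeds a bounded queue; the consumer pulls
    // values lazily, just like ranging over a channel in Go.
    static BlockingQueue<Integer> squares(int limit) {
        BlockingQueue<Integer> ch = new ArrayBlockingQueue<>(1);
        Thread producer = new Thread(() -> {
            try {
                for (int i = 1; i <= limit; i++) ch.put(i * i);
            } catch (InterruptedException ignored) {}
        });
        producer.setDaemon(true);
        producer.start();
        return ch;
    }

    public static void main(String[] args) throws Exception {
        BlockingQueue<Integer> ch = squares(4);
        for (int i = 0; i < 4; i++) System.out.println(ch.take());
    }
}
```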


Will the virtual threads be blocked on I/O calls though? It's usually the most common reason threads wait idle. If the implementation is just syntax sugar laid on top of an existing thread pool that can be exhausted by blocking I/O calls then I'd call it a nice evolution but not very groundbreaking.


Will the virtual threads be blocked on I/O calls though?

No. All automagically handled for you. Just write what looks a lot like regular code.

Now, can you change the default behavior? Yes, but now you're on your own.


Isn't much of the excitement gone due to Oracle's license for its JDK?


https://adoptopenjdk.net/ fixes most of the issues.


We hope not, and if it is, it's a misunderstanding. Java is freer and more open than it ever has been. There are many downstream builds under open source licenses, including one from Oracle at http://jdk.java.net


> Java is freer and more open than it ever has been

but pron of Oracle says, a few comments over:

> No one offers free LTS ... those "free LTS" distributions you should know that you're using a JDK that's not fully maintained ... Maintaining a JDK is a lot of work ... For real support for an old version you must pay.

So it doesn't sound like it's really free. If the free version doesn't get security updates you can't really use it in practice.


Why? OpenJDK is free, and if you don't need the Oracle support you don't need to use their JDK; they are almost the same except for the support.


Why? There are at least 10 other vendors that provide a free version?


No because anyone that actually works with Java knows the difference between various Java implementations.


That might not be completely fair. In our sister organization, we had a 20 year Java veteran switch his teams to Elixir because he was concerned about Oracle's licensing. Even though it was a misunderstanding, I wouldn't assume "anyone that actually works with Java knows the difference".

Furthermore, it's a little bit of gatekeeping for those who may be entirely new to understand the difference between something like OpenJ9 vs AdoptOpenJdk vs Amazon Corretto.... you get the point.


Which makes sense when you think about it for a moment. Being an expert Java developer doesn't make you good at--or necessarily even interested in becoming good at--legalese.


I get the point that there are still people that don't get that Java is like C, with OpenJDK playing the role of ISO C.

It has been quite clear since the beginning; plenty of Code One, Devoxx, NDC, JavaZone, JAX, InfoQ talks.

Conferences that I'd expect any veteran to spend some of their learning budget on, or at the very least read some of the related articles in Java magazines.


I have a different impression. I have the feeling that the Java language itself is stuck in its original idioms. If there were no other languages on the JVM, such as Kotlin, Graal.* or Clojure, I guess the Java ecosystem would suffer a long starvation/death.

My impression is that languages such as JavaScript and Go gain more and more traction for enterprise software stacks. They are much more multi-paradigm at heart than the Java language could be. I wonder why Java doesn't learn from C++, which reinvents itself every few years. The lambda notation in C++ is really a thing Java could easily have.


> The lambda notation in C++ is really a thing Java could easily have.

Java has had lambda expressions, with what I would describe as cleaner syntax than C++'s, since JDK 8. Is there something I'm missing that C++'s have over Java's?

This has also led to a lot of Java becoming much more functional; see especially the Stream API[0] (also added in JDK 8). I would say it's much, much easier to write functional-style code in Java than it is in Go.

[0]: https://docs.oracle.com/javase/8/docs/api/java/util/stream/S...


Stream API is a bad example of design IMO. Checked exceptions are not integrated (even if I don't like them, they're part of the language and are not going anywhere soon, it seems). Simple operations require lots of ceremony (list.map(x -> x + 1) vs list.stream().map(x -> x + 1).collect(Collectors.toList())). Also, Java lambdas can't change outer variables, so one has to resort to old tricks with one-item arrays.

Java lambdas are just syntax sugar for anonymous classes from the language's point of view. They're implemented differently under the hood, but they don't bring anything new to the developer; they just save a few lines of code, which is nice, but could be much better.
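A small runnable illustration of both complaints: the collect() ceremony for a simple map, and the one-element array workaround for mutating outer state from a lambda:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class StreamCeremony {
    public static void main(String[] args) {
        List<Integer> list = Arrays.asList(1, 2, 3);

        // The ceremony: stream() ... collect() just to map a list.
        List<Integer> plusOne = list.stream()
                .map(x -> x + 1)
                .collect(Collectors.toList());
        System.out.println(plusOne); // [2, 3, 4]

        // Lambdas may only capture (effectively) final locals, hence
        // the classic one-element array trick for a mutable accumulator.
        int[] sum = {0};
        list.forEach(x -> sum[0] += x);
        System.out.println(sum[0]); // 6
    }
}
```

(JDK 16 later added Stream::toList, which trims some of the ceremony, but that's after this release.)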


> Also Java lambdas can't change outer variables, so one have to resort to old tricks with one-item arrays.

On the rare occasion that I need to do this, I usually find that I'm thinking about the problem the wrong way and there's a much clearer way to write it than messing with mutable variables outside my closures.
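For example, a running total captured in a mutable outer variable can usually be re-expressed as a reduction, with no workaround needed (a toy sketch):

```java
import java.util.Arrays;
import java.util.List;

public class NoMutableCapture {
    public static void main(String[] args) {
        List<Integer> list = Arrays.asList(1, 2, 3);
        // Instead of mutating an accumulator from inside a lambda,
        // state the accumulation directly as a stream reduction.
        int sum = list.stream().mapToInt(Integer::intValue).sum();
        System.out.println(sum); // 6
    }
}
```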


It tends to be, but it depends on the situation.


Even though there is a book on Functional Programming in Go, I'd say, the entire idea really bends the language into an odd shape. Certainly one could implement map/reduce/filter (I think Rob Pike actually did create examples on his GitHub at one point), but the idiomatic paradigm of Go isn't FP. It's definitely not hard to beat in that arena! Now Kotlin or Scala on the other hand....


Java added lambdas in Java 8. Are you thinking of something else?


Indeed, Java's notation for lambdas is marginally better than C++'s, where you have to use braces and an explicit return even for tiny lambdas.


Things are innovating faster than ever, but the trick is always to address the challenges of both today and tomorrow, while staying true to compatibility and stability.

Check out Brian's talk on Stewardship: https://www.youtube.com/watch?v=2y5Pv4yN0b0


The problem with metaprogramming, multiparadigm, and so forth, is other people.

I kid!

My bro is a C++ partisan. We've been arguing about this for 25+ years. Every time I'm about to use my crushing grip of reason to submit him, he wiggles free.

He's made a pretty good career from fighting misc compilers over the years. So who am I to judge?


/me nervously glances at all the production infrastructure I manage that's still on JDK 7 or 8.

I've just recently been able to upgrade some things to JDK 11.


JDK 11 is the current LTS version, so that's probably fine. With 2 releases per year, it's hard to follow the bleeding edge. Especially if you have to ship stuff to clients.


Yeah, this seems to be par for the field. Java 11 is what you should be targeting now. We're in the middle of the 8 to 11 migration as well.

I got the impression (but please correct me if I'm wrong) that only this year most major libraries got all the kinks worked out for Java 11. In particular, when using them together.

Guice is one that only recently got the nasty (but relatively harmless) warning you get when using it with Java 11 under control.

Java 17 is the next long term support release, so most of us working on longer lived code bases will probably skip all the releases in between. It doesn't seem worth the effort, because if the Java 9 and 10 experience is anything to go by, you will be hunting down issues with Maven plugins, transitive dependencies, and libraries holding back for now all day.


> [...] if the Java 9 and 10 experience is anything to go by, you will be hunting down issues with Maven plugins, transitive dependencies, and libraries holding back for now all day.

You had issues with 10? The only issues I got were with Java 9, because introduction of modules broke some stuff. Later it was just making sure that ASM or aspectj/lombok is in the most recent version that has new JDK version added, which might have been a problem in 10 (I know I was trying to upgrade on day 1).

But nowadays, all the most popular libraries are on par with the next Java version that is in the works at least a month before the release.

Since JDK 12, I just do an upgrade every 6 months and had no issues so far.

And if you don't ship software to clients then there is almost no downsides to upgrading and a lot of upsides.

If you do ship software to clients you should be using jlink; this way you ship smaller binaries and clients don't need to worry about the JDK, because you will provide it to them.

And you also need to lookup what LTS means in case of Java - it is not the same as in e.g. Ubuntu.

In Ubuntu you get free of charge fixes in LTS, in Java you need to buy an LTS version from someone (they will make fixes there) OR use the latest JDK version (which gets all the fixes free of charge).
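A hedged sketch of such a jlink workflow (app.jar and the output path are hypothetical; jdeps --print-module-deps needs JDK 11+, jlink itself JDK 9+):

```shell
# Compute which platform modules a (hypothetical) app.jar actually uses.
MODS=$(jdeps --print-module-deps --ignore-missing-deps app.jar)

# Build a trimmed runtime image containing only those modules.
jlink --add-modules "$MODS" \
      --strip-debug --no-header-files --no-man-pages \
      --output build/runtime

# Ship build/runtime alongside the app; clients need no separate JDK.
build/runtime/bin/java -jar app.jar
```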


> Guice is one that only recently got the nasty (but relatively harmless) warning you get when using it with Java 11 under control.

Personally I've dropped guice and just stick to Weld anymore since it's the reference implementation of the standard. I haven't had any problems running it on JDK 14.


It's not hard because the changes are small and gradual, and you have to update your JDK every quarter anyway -- whether you use the current version or an old one -- just to get the security patches. It's just a matter of how that work is spread. It's likely that updating once every 3 years is more work overall. Plus, you've done that already back when there were still major releases, and the semi-annual feature releases didn't get a new integer number (8u20, 8u40 etc. were big feature releases).

Plus, keep in mind that LTS is a paid service. What appears to be "free LTS" isn't really LTS, but builds of the OpenJDK Updates projects. Those builds serve as the basis for various paid LTS offerings, but in themselves they don't actually maintain the full JDK. OpenJDK Updates just backports stuff from the mainline version. If you're using a component that's been removed from the mainline -- and you probably do or else you'd upgrade -- then that component is not maintained in OpenJDK Updates (so-called "free LTS") because there's nothing to backport from.

The only version that's 100% maintained for free is the current JDK (15 as of today).


> It's not hard because the changes are small and gradual

Size isn't the only issue. Version number fetishism is very much a thing in large companies. Megacorps and banks are very slow to update. I can't tell my customers to switch to Java 15 tomorrow (or even next year). Especially after the complexity of upgrades to 9 and 11, both of which broke things in many subtle ways. This may not be an issue for many applications, but it was certainly an issue for us and our customers.


9 was the last major release ever, and it was a big one at that (bigger than 8). And since most people skipped both 9 and 10, there was even more work. If you use the current version, all that will never ever be necessary again. If you don't, this will happen over and over, and worse: because OpenJDK's development ignores vendor's LTS services, things can disappear between one of the versions with LTS and another without warning. Because your customers don't educate themselves they're condemning themselves to an eternity of pain (paid support, higher upgrade effort, more hardware) when salvation is already here (free support [1], lower overall upgrade effort than ever in Java's history, plus always enjoying the hardware savings).

I admit that doing away with major releases, giving the feature releases new version names, and the last ever major release was too much all at the same time and is confusing. But the payoff in learning what actually happened is big, as is the loss in not. Your customers might be losing both time-to-market and serious money for a lack of a day's worth of education. They can start here: https://blogs.oracle.com/java-platform-group/update-and-faq-...

[1]: The only version for which you get free support from anyone is the current version. No one offers free LTS for old versions. https://news.ycombinator.com/item?id=24487298


That's not true unless you mean "free LTS" JDK from Oracle. There are several other projects providing free LTS, TCK-certified builds including: RedHat OpenJDK & Amazon Corretto to name two off the top of my head.


No one offers free LTS (or, rather, what you call "free LTS" is very different from a real LTS service, which is always paid, and you might not be getting what you think you are). The "LTS release" you get for free are builds of OpenJDK Updates. Try to open an issue on, say, Nashorn against a "free" LTS 11 and see if it gets addressed. It won't be -- unless you actually pay for a real LTS -- because Nashorn has been removed from the mainline and there's nothing to backport. Same for CMS or any other package or component that's been removed.

If you're using one of those "free LTS" distributions you should know that you're using a JDK that's not fully maintained. The only JDK version that's fully maintained completely free is the current version. To be fully safe and run maintained software you can either use the current version for free or use an old version and pay someone for LTS.

Maintaining a JDK is a lot of work. There are a couple of hundred full-time developers working on OpenJDK tip and maintaining the current release; there are ~10 full-time engineers (probably less) from all companies combined maintaining both OpenJDK 8u and 11u. For real support for an old version you must pay.

(I work on OpenJDK at Oracle)


This situation is still a bit hard to understand for me as a simple Java dev. Until now I thought that there were 3 kinds of "LTS":

1) OpenJDK Updates binary builds: they take the source code of the project, build it, and provide binaries. Examples I know are AdoptOpenJDK and the free Azul OpenJDK distribution.

2) Open source LTS: they take the OpenJDK Updates project, then add bugfixes they did for their PAYING customers. They publish the source code of the result. Here I see RedHat OpenJDK (such as the OpenJDK 8/11 builds distributed with RHEL 7/8). Those are then made available for free in binary form as well, such as part of CentOS. If you want a bug fixed in those versions you have to pay, but you can benefit from bug fixes made for others. "LTS" here means as long as the RHEL version lives.

3) Paid LTS with custom support. Those do the same as 2, but don't release the source or binaries to the public, only to paid customers. Maybe there are even custom builds for specific customers. That would be Oracle mainly, and Azul and IBM as well.

What's unclear to me is if fixes from 2 flow into 1. Also, I don't know which kind Corretto is (probably 1).

Is that a correct assessment of the situation?


Almost, although I have to say you know much more than most. Because OpenJDK itself is open-source, all versions are open, except for Oracle's for its paying customers, because Oracle, as the main developer of OpenJDK, owns the source code.

There are other differences, too. Adopt, made by a particularly amateurish team at IBM that is barely involved with the OpenJDK project and quite unfamiliar with its workings, isn't a member of the vulnerabilities team and so gets access to security fixes later than all other vendors (and that's not the only thing that makes their build more problematic than all others). Among those that do participate in OpenJDK to varying degrees, including Amazon, there are differences in how much of the changes to their branded forks they upstream to OpenJDK Updates (RH upstreams more; Azul less).

Anyway, the important thing to know is that there is only one version that is fully supported for free -- the current one, and so the safe choices are either some paid LTS or the current version.


Maybe you should call it not "LTS" but "Oracle's LTS"? OpenJDK is open, so any company should be able to provide its own "LTS", but it differs from Oracle's LTS, and you argue that Oracle's LTS is superior to the others. (I agree Oracle has great resources to maintain LTS and still manage CVEs.)

> Anyway, the important thing to know is that there is only one version that is fully supported for free -- the current one, and so the safe choices are either some paid LTS or the current version.

It looks like an overstatement to me, but a reasonable perspective for an Oracle employee.


What I'm saying is that you can buy LTS from Oracle, Red Hat, Bellsoft, or Azul -- that's how all of them make their money off of OpenJDK -- but not a single one of them offers it for free (and neither does Amazon, whose JDK staff is smaller than or similar to Bellsoft's).


While it's definitely important to be aware of what exactly you're getting for your non-money from different JVM vendors, I think that point of view underestimates how many people use LTS versions purely to tick a "we are using supported software versions" box in their corporate bureaucracy's paperwork, and never intend to actually ask for any support at all.


The question is what level of maintenance is required to call something "supported." But yes, there is some maintenance done for free on old versions of the JDK in OpenJDK Updates in the form of backports.

The thing is that the money saved in hardware/hosting by running the current version would more than pay for any update costs. By using the current version, companies will be using better maintained software and paying less money for it. Those "free LTS" builds, which aren't really LTS, is a mind game that keeps companies on old versions that cost more money to run in the hopes that people will ultimately pay for real maintenance, perhaps once they fall behind too far.


Lucky! We still have a mix of 1.6, 1.7, and 1.8; our recent big upgrade came from moving a few apps from 1.6 to 1.8... we’ll get to this jdk in about 15 years.


I'm clearly missing something, but why is this so hard? I thought Java made a point of being fully backward compatible, both at the source level and the binary level (.class/.jar)? Is that true? If so, why is upgrading to a later jdk more work than, well, installing the newer version?


Between JDK 8 and 9 there was the introduction of the module system, which broke many systems, because they did some things they weren't allowed to do in the first place (and which weren't included in the backwards compatibility guarantees): access internal JDK classes. There were some compromises to make things still work (e.g. they continued to allow access to some parts of sun.misc.Unsafe until there's a replacement with the same performance, because most of the things in there are pretty fast), but some things just didn't work with the module system and had to be changed.

By now, I'd say if you depend on a library that still doesn't work on JDK9+ you should replace it, but depending on your requirements that may not be so easy and so some people are still stuck on JDK8.


(not gp) It is about stability, and that magical ephemeral "Something". In software, like a JDK or OS, "Something" changes all the time with bug fixes and maybe even just different builds. The jump from 8 to 11 has a lot of Somethings changed, and the transition is not always painless.

I have tests that use JSON serialization and deserialization (and then some), and they pass perfectly fine on Java 8 runtime, but fail in exactly same place on Java 11, without recompilation. I have spent some non-zero time trying to get to the bottom, but so far without much luck.


gson uses reflection to serialize types, including standard library types, which are not guaranteed to have the same fields from one release to another. Presumably the people who write the jdk realized this potential source of bugs and tried to prevent it from happening, meaning fewer ambiguities about what changes are breaking at the expense of breaking gson (and probably others). see: https://github.com/google/gson/issues/1216


Java is better about this than most languages but they still break things sometimes.

For example, in addition to the module system, Java 9 changed the implementation of some of its XML rendering libraries and caused meaningful behavior changes.

https://github.com/CodeFX-org/java-9-wtf has a list of some of the stuff that broke.

That said, 8 -> 9 was the only really painful transition in my experience. Since then the upgrades have been smoother.


It's true for the most part, ideally and on average, even between major JDK versions. But there are zillions of production setups out there which are not ideal or average and in turn are made up of not-entirely-ideal or not-entirely-average other software which itself has to go through and verify the upgrade is sane, etc, recursively. Add things like support relationships to all of this and pretty soon you're looking at real work.


Starting with 9 we saw a lot of functions deprecated for a decade finally removed, and other breaking compatibility changes. Moving up the Java version ladder is finally going to stop dependency rot, but there is a lot of old stuff out there, with old dependencies. This might be a Python 2/3 moment for Java.


A lot (more than you'd expect) of Java projects and libraries depend, directly or indirectly, on the ASM library (https://en.wikipedia.org/wiki/ObjectWeb_ASM), often including a built-in renamed copy. Newer versions of Java add new functionality to the bytecode generated by the Java compiler (in a backwards compatible way, that is, newer versions of Java can fully read bytecode generated by older versions of the Java compiler), and the ASM library has to be updated to understand that new functionality. The Java bytecode verifier has also become more strict, rejecting bytecode generated by older versions of the ASM library and other bytecode generators (things like the "Illegal type in constant pool" error).

And in the particular case of Java 9, they decided to deprecate several things, and quickly removed them in Java 11 (that is: with less than a year warning, and no warning if you follow only the long-term versions). That includes a lot of J2EE classes which previously came with the JRE, and now became external dependencies (with the most recent version of these external dependencies sometimes having slightly different behavior).
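That per-release version bump is visible in the class-file header itself; a small sketch that reads it for java.lang.Object (the printed major number depends on which JDK you run it on):

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ClassVersion {
    public static void main(String[] args) throws IOException {
        // Every .class file begins with magic 0xCAFEBABE, followed by a
        // minor/major version pair; bytecode tools like ASM refuse
        // majors newer than the ones they were built to understand.
        InputStream raw = ClassVersion.class
                .getResourceAsStream("/java/lang/Object.class");
        try (DataInputStream in = new DataInputStream(raw)) {
            int magic = in.readInt();
            int minor = in.readUnsignedShort(); // read before major
            int major = in.readUnsignedShort();
            System.out.println(Integer.toHexString(magic)); // cafebabe
            System.out.println(major); // 52 on JDK 8, 59 on JDK 15
        }
    }
}
```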


Having Apache Spark as a dependency kept us locked into JDK 8 for ages. They've finally got things updated now, but I don't know what their hold up was.


Same here, though it doesn't seem like they've upgraded fully. The Dockerfile[0] for their Kubernetes base images are pinned at 1.8, for example. I bit the bullet and built Spark against newer JDK's myself in the last couple years, but look forward to ditching that.

The Hadoop ecosystem has definitely been dragging its feet on JDK bumps.

[0] https://github.com/apache/spark/blob/master/resource-manager...


>I thought Java made a point of being fully backward compatible

In theory yes.

In practice the plugin and scanning systems the software I use has had a multitude of failures between JDK8 and 11.


There are at least some versions that aren’t binary compatible. Because of that we have one system stuck on JDK 7 :/


Lots of garbage collection changes and jvm flag changes that are not compatible too.


How Java counts CPUs in a Docker like world has changed multiple times since JDK8.

That might not sound too important but it has an effect on the GC chosen and obviously on how you might have tuned your thread pools.


The things you migrated to JDK 11, can be easily migrated to JDK 15 (or at least 14, if you have some libs depending on Nashorn), the biggest issue with migration was JDK 9 (because of the modules), so if you are past that then it is a breeze.


Not to worry: next LTS release will be Java 17, scheduled for September 2021 - next year.


And you know that LTS in the case of Java is meaningless if you don't have a support bundle bought for real money?

The only free of charge LTS is the latest java version, all versions n-1, n-2 are out of free support, you can only buy one.


> Removal of Nashorn JavaScript Engine

anybody know why a javascript engine was ever included in core java in the first place? seems... niche. and yet core java never included a simple web server (officially).


It's been replaced by GraalVM JavaScript which has Node.js built in. Performance is supposedly on par or better than the V8 version of Node.js. There's also support for ECMAScript modules directly from Java (without Node.js).

GraalVM offers interop between JavaScript, Python, Ruby, R, C/C++ and of course Java and other JVM languages.

I think GraalVM has the potential to make the JVM something of a universal platform. I guess that was the goal with Nashorn as well, but now it seems much closer to fruition.


It’s great for web development when I want the same code to run client side and server side.


I mostly don't work on web stuff, so maybe this is a dumb question, but can you give an example of what benefits that gives you? What code do you want to run in both places?

In my limited experience writing web code, the client side concerns itself with rendering and user interaction, the server side concerns itself with efficiently supplying dynamic data to the client, and other than request payload definitions the two have no logic in common.


The most straightforward example I can offer is my custom mark-up parser for formatting text. I wrote this parser in Javascript where it runs live in the browser, but when the final text is submitted it's parsed on the server side for security and replicability reasons.

Of course I could rewrite it in another language, but with Nashorn I don't need to. I can execute many hundreds of lines of Javascript with around four lines of utterly boilerplate Java. And it works perfectly. I don't need to worry about edge cases or dealing with subtle variations in regular expressions (don't at me) and other string parsing nuances.

I'm just a sole developer; I don't have minions. If I had to rewrite it today I'd probably look at a transpiling approach, but even then, defensive text manipulation tends to be edge-casey at the best of times.

(Also note that my priority here is browser performance, where it's running continuously on hundreds of clients simultaneously—often budget smartphones with limited CPU power. On the server it only has to run once per submit and the CPU cost on the server is beyond trivial.)
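The "four lines of utterly boilerplate Java" are roughly the javax.script dance; a guarded sketch (on JDK 15+ the lookup returns null unless a standalone Nashorn jar is on the classpath, so this handles both cases):

```java
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class EvalJs {
    public static void main(String[] args) throws ScriptException {
        // Look up the engine by name and eval a script, that's it.
        // Nashorn ships with JDK 8-14; absent elsewhere.
        ScriptEngine engine =
                new ScriptEngineManager().getEngineByName("nashorn");
        if (engine == null) {
            System.out.println("nashorn not available");
            return;
        }
        Object result = engine.eval("'hello'.toUpperCase()");
        System.out.println(result);
    }
}
```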


Makes sense. Thanks for the example!


Counting its replacement, this is the 3rd JS engine. The idea, I think, is that it would be embedded for scripting-like uses, the way many C++ games used Lua or Flash.


com.sun.net.httpserver looks standard enough for practical purposes: https://stackoverflow.com/questions/58764710/is-package-com-...


I once used it for an embedded expression evaluation language in a Java application.


I used it quite a bit for simple scripting before I learned python.


For some context--I had to look it up for myself

  Release         GA Date           Premier Support Until
  11 (LTS)        September 2018    September 2023
  12 (non‑LTS)    March 2019        September 2019
  13 (non‑LTS)    September 2019    March 2020
  14 (non‑LTS)    March 2020        September 2020
  15 (non‑LTS)    September 2020    March 2021
Ref https://www.oracle.com/java/technologies/java-se-support-roa...


It's pretty hard to believe that I've been writing Java now for over two decades (not continuously mind you, but I'm not currently in a gap either) and with no signs of slowing down.

I've been writing Java for longer than I've been driving, and way longer than drinking actual java. Geez. There's a certain fluency I have with this language that I don't have with any other language, not even Python/PHP/JavaScript/C#, all languages that I have 1k+ hours experience in.


Recently been going back to it a lot, I feel the same. The code flows right out my hands, I can see all the refactor paths and still learning new things with it (recently AOP, doing new optimized next-level introspection stuff, having fun with mappedbytebuffers... I know I'm late to the party but hey...)


> Remove the Solaris and SPARC Ports

https://openjdk.java.net/jeps/381

Wow this is the end of an era.


Also check out Tribuo - Machine Learning library in Java announced today during the keynote : https://tribuo.org


When is the next LTS release? I'm stuck on 11 until then.


in 2 years with Java 17


They release a new major version every 6 months. Java 17 is coming next year in September.


Thanks for the correction


You being stuck is a poor technical decision


Why?


Anyone have any idea about the current state of JavaFX?


It's being developed in the open now, as it is out of the JDK - https://openjfx.io


Does this include JRE? I've noticed that the OpenJDK Docker Hub images and variants have a hard time releasing the latest JRE.


Oracle doesn't offer a standalone JRE anymore. Several OpenJDK distributions do (Adopt, Azul, RedHat, Debian, ...) and I assume the version 15 builds will be available soon.


I worked with Java in the past, so I'm glad NodeJS and Golang arrived.. productivity is the key.


You should give Kotlin a try then


I was going to, but since Android also accepts Java I gave up on Kotlin, as I'm comfortable with Java already. But I do like golang a lot. Recently I've also developed in C++, and I'm frankly learning a lot of golang and liking it more each day. It's fast to process and there are no hiccups like in NodeJS; it's like the best of C++/Java combined with the best of NodeJS, and no hiccups, at least for now. I'm very impressed with golang's performance, it's outstanding. What also impressed me is that VSCode installs all the Golang extensions, so it seems like I have all the typo verification on the fly without compiling, reminding me a lot of the Eclipse days with Java. Good old compiling.. haha. We never get away from compiling. I think Golang is under-marketed; it deserves a lot of attention.


It feels like there's a new JDK every five minutes these days


Every 6 months, actually.

In practical terms, unless you want to test out Java features (which can be removed with the next release), you only need to track the LTS-releases every 3 years and actively upgrade every 6 years to stay on a supported version.


The average cadence, over the lifetime of Java, is actually something like a new version every 2.5 years. They have really ramped up.


Up until 10, releases were feature-driven, meaning, based around the delivery of something when it was done like Generics in 5, Lambdas in 8, modularity system in 9, etc. With 10 and on, Java is now a time-based release cadence, every 6 months in September and March, like clockwork. What's ready goes in, what's not waits for the next train.


I'm not sure this is ideal for users, no? Each time something is deprecated (e.g. nashorn in JDK15), I have to go check the long term support schedule, etc. And I have to imagine it's much harder to maintain, creating the potential for patchwork fixes across various JDKs.



