> Java Web Start. This is the canonical way for desktop applications.
Not anymore. Java Web Start was removed because the very concept of a JRE, a system-wide runtime downloaded directly from Oracle and managed by the user or their IT department, no longer exists (despite some OpenJDK distributions offering something they call a JRE -- perhaps to maintain a sense of familiarity -- no one actually provides a JRE anymore, although I guess OpenWebStart is attempting to resurrect it). Like any other application, Java applications now only have two parties: the user and the application's vendor. It is the responsibility of the vendor to deliver the application along with any dependency, including a Java runtime. The vendor may choose to supply their own automatic update mechanism, for the application and the runtime, but users need not interact directly with the Java runtime anymore.
Web Start, like Applets, was entirely predicated on the notion of the JRE. With the latter gone, the former is meaningless.
Why was the JRE discontinued?
1. The software ecosystem now discourages system-wide third-party runtimes. On the desktop, the app store model rules; on the server, containers are popular -- both are much more friendly to an embedded runtime.
2. An embedded runtime with jlink gives the user a better experience by giving the software vendor full responsibility over their software and not requiring the user to deal with runtime components they don't, and need not, understand (although, admittedly, popular Java build tools currently lag in their support for jlink). In other words, the new way is better, but it does require getting used to.
> While it can in theory work with any application, it shines with modularized applications.
Not just in theory. jlink is the recommended practice for deploying any Java application, desktop or server, modular or not. While a modular application can help automate more steps of the packaging process (such as not having to run jdeps to find dependencies, and automatically generating launcher scripts), jlink is not especially tied to modular applications. It is how all Java applications should be distributed (jpackage internally uses jlink).
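For a non-modular application the flow is roughly: ask jdeps which platform modules the code actually touches, then hand that list to jlink. A minimal sketch (the module list and output directory are only examples), using the ToolProvider API through which the JDK exposes its tools; the plain command-line jlink invocation with the same flags is equivalent:

    import java.util.spi.ToolProvider;

    public class LinkRuntime {
        public static void main(String[] args) {
            // Module list discovered beforehand, e.g. with
            //   jdeps --print-module-deps app.jar
            String modules = "java.base,java.desktop,java.logging";

            ToolProvider jlink = ToolProvider.findFirst("jlink").orElseThrow();
            int exit = jlink.run(System.out, System.err,
                    "--add-modules", modules,
                    "--strip-debug",
                    "--no-header-files",
                    "--no-man-pages",
                    "--output", "app-runtime"); // trimmed runtime to ship next to the app
            System.exit(exit);
        }
    }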
I love WebStart. I have been using it for a long time. I use OpenWebStart and wish WebStart was supported in OpenJDK. I deploy a 738 byte .jnlp file to the desktops of my users and OpenWebStart ensures that they always have the latest version of our application. Yeah, I can use jlink/jpackage to give them a single executable blob. But with WebStart/OpenWebStart, it automatically checks to see if there is a newer version. I deploy to desktops that are not always connected to the Internet. It works great. They use the cached version of the JAR when disconnected. When connected, they are unobtrusively upgraded when a new version is available. jlink/jpackage is a step backwards for me.
It is not a step backward for anyone. The jlink model does not preclude auto-update of either the runtime or the application. It does preclude a JRE, i.e. a centralised "system" Java runtime that the user obtains from a third party. You can write a launcher that updates the runtime and your app -- one that does what Web Start does, but only for your applications. You do not and should not control which runtime is used for other applications from other vendors, and the user shouldn't need to go to a third party that does, either.
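To give a sense of scale, a bare-bones version of such a launcher is not much code. A sketch only -- the update URL is hypothetical, and a real launcher would use ETags or If-Modified-Since rather than unconditionally downloading:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardCopyOption;

    public class Launcher {
        // Hypothetical endpoint that always serves the latest application JAR.
        static final URI UPDATE_URL = URI.create("https://example.com/myapp/app.jar");
        static final Path LOCAL_JAR = Path.of("lib", "app.jar");

        public static void main(String[] args) throws Exception {
            try {
                refresh();                                   // best effort
            } catch (Exception e) {
                System.err.println("Offline or update failed; using cached copy.");
            }
            // Launch with the runtime shipped next to the launcher, not a system JRE.
            new ProcessBuilder(Path.of("runtime", "bin", "java").toString(),
                    "-jar", LOCAL_JAR.toString())
                    .inheritIO()
                    .start()
                    .waitFor();
        }

        static void refresh() throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(UPDATE_URL).build();
            HttpResponse<Path> response = client.send(request,
                    HttpResponse.BodyHandlers.ofFile(Files.createTempFile("app", ".jar")));
            if (response.statusCode() == 200) {
                Files.createDirectories(LOCAL_JAR.getParent());
                Files.move(response.body(), LOCAL_JAR, StandardCopyOption.REPLACE_EXISTING);
            }
        }
    }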
Yes, I can write a launcher that queries a URL and downloads the latest copy of my JAR if necessary. Fortunately for me, it already exists. It is called OpenWebStart. It is a shame that it is no longer in the JDK.
Yes, it would be nice to narrow down the runtime, but that is not a requirement for me and others in my situation. I deploy multiple Java applications to desktops. I have been doing so since the late 90s. Requiring my users to use a current JRE/JDK is not a problem at all.
Ideally, I would have all of these apps share a common JVM so that they are not running multiple garbage collectors, but I realize this is a capability that most have simply discarded.
I was not talking about something like Web Start and the JRE. The JRE was a centrally-managed software that determined the runtime for all Java applications. I don't think it's a good idea for one party to determine the runtime for applications written by others.
It is better for your applications to manage their own runtime in a way that does not interfere with others. That is not what Web Start or OpenWebStart do; what they do is discouraged, even if you are used to it, because it makes global decisions over the user's environment that apply beyond the application itself.
> Ideally, I would have all of these apps share a common JVM so that they are not running multiple garbage collectors, but I realize this is a capability that most have simply discarded.
I'm not sure that one JVM doing more work would necessarily be better than multiple ones, but since you now control the runtime, you can do that if you like. Use a launcher that runs all of your applications' JARs in one VM; since you control the applications, too, you could make sure that they are well isolated.
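For what it's worth, a sketch of what such a shared-VM launcher could look like (the JAR paths and main classes are made up): each application gets its own class loader, so they share one VM and one GC without seeing each other's classes.

    import java.lang.reflect.Method;
    import java.net.URL;
    import java.net.URLClassLoader;
    import java.nio.file.Path;

    public class SharedVmLauncher {
        public static void main(String[] args) throws Exception {
            launch(Path.of("apps/invoicing.jar"), "com.example.invoicing.Main");
            launch(Path.of("apps/inventory.jar"), "com.example.inventory.Main");
        }

        static void launch(Path jar, String mainClass) throws Exception {
            // Parent is the platform loader: the apps share the JDK classes
            // but cannot see each other's.
            URLClassLoader loader = new URLClassLoader(
                    new URL[] { jar.toUri().toURL() },
                    ClassLoader.getPlatformClassLoader());
            Method main = loader.loadClass(mainClass).getMethod("main", String[].class);
            Thread t = new Thread(() -> {
                try {
                    main.invoke(null, (Object) new String[0]);
                } catch (ReflectiveOperationException e) {
                    e.printStackTrace();
                }
            }, mainClass);
            t.setContextClassLoader(loader);
            t.start();
        }
    }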
> Not anymore. Java Web Start was removed because the very concept of a JRE, a system-wide runtime downloaded directly from Oracle and managed by the user or their IT department, no longer exists (despite some OpenJDK distributions offering something they call a JRE -- perhaps to maintain a sense of familiarity -- no one actually provides a JRE anymore
Then what does the "java" command do exactly, if not run Java applications in some kind of runtime environment?
> It is the responsibility of the vendor to deliver the application along with any dependency, including a Java runtime.
Honestly, this appears to destroy a fundamental promise that Java made when it was first released: Write once, run anywhere. And not just in the present, but for future systems as well. Are we now expected to have vendors provide 30 different downloads tailored for every operating system and architecture combination possible? And what if that application needs to be run 10 or 20 years from now, when the packages that were made can no longer run directly?
It seems like a fairly dismal state of affairs, to me.
> 1. The software ecosystem now discourages system-wide third-party runtimes. On the desktop, the app store model rules
Does "desktop" only include Mac OS? Neither on Linux nor Windows does the "app store model" rule. They exist, but are more like an obscure "you can also do it this way" method rather than the norm.
> on the server, containers are popular -- both are much more friendly to an embedded runtime.
Containers are indeed popular, and maybe an embedded-JRE makes sense. I personally stick with installing whatever JRE package comes with Debian, myself; run them the traditional way.
It certainly clears up a lot of uncertainty about the security of whatever Java runtime is being used, when a distribution (e.g., Debian, Fedora, RHEL, etc.) with a reputable security team takes care of that issue for me. The Java runtime gets updated, my servers restart, instant security buff. Trusting this to software vendors is just insane.
> although, admittedly, popular Java build tools currently lag in their support for jlink
This seems to show that developers haven't bought into Oracle's dream of vendored JREs. The "old" way of launching Java applications, both desktop and server, works perfectly fine to the present day, and few seem to be willing to alter that.
> Then what does the "java" command do exactly, if not run Java applications in some kind of runtime environment?
If you're using a normal desktop OS there's no system-wide "java" command (anymore (thankfully)).
It's just a new iteration of the eternal software cycle:
10: "apps are shipped together with all their libraries / dependencies! space is wasted / all apps must be upgraded when there is a security issue instead of one single DLL! let's share the runtime/DLLs/whatever, which will solve all our issues!"
20: "oh no! all apps are now dependent on a single runtime/DLL which {does not update as fast as upstream | must be installed manually by the user, who is clueless and will use the competition's software which JustWorks™ instead | causes DLL hell | must be patched to work with my software in a way that would break other software using the same runtime}, let's switch to static linking / bundling everything!"
> Honestly, this appears to destroy a fundamental promise that Java made when it was first released: Write once, run anywhere.
Correct. And I’m fine with that, it was a terrible idea right from the start for most use cases IMHO. If I never need to maintain a system-wide Java again to run user applications I will be very happy.
The purpose of system runtimes (for Java, Python, whatever) is for running system components IMHO. User installed application dependencies beyond actual OS components are a user/vendor issue and I think the right way to handle that is on a per application basis.
I have this old KVM system I occasionally need to access. It is pretty old, so its primary access method is a Java browser plugin, and those have been removed from every modern browser.
Luckily, it still has that JNLP link, so I was able to get it to work. I am really glad that JNLP exists, and that they didn't use an OS-specific bundle -- I am sure the latter would have been Windows-only, and I don't run Windows.
> Correct. And I’m fine with that, it was a terrible idea right from the start for most use cases IMHO.
I completely disagree. Platform-dependent Java is worse than useless, in the same way Electron made Linux support worse.
Running a complicated Electron app under Wine can be very difficult, and yes, you will have to run Electron under Wine one way or another, because not every company provides Linux builds of their Electron apps.
What proportion of popular Electron apps do you think would have had first class Linux ports if Electron hadn't existed? I'm guessing not many.
I do sympathise, I fully realise Electron apps are sub-par in many ways compared to native apps, but to be brutally honest that's mainly a problem for platforms that would have had native apps otherwise. For Linux that's generally not the case and it's either Electron or nothing.
Now ok, maybe nothing would be preferable in some ways because it might encourage native Linux competitors, but that's pretty unlikely to work out in practice. I think it's more likely that the availability of Electron versions of popular apps makes Linux a lot more viable as a user desktop, and that could promote better Linux support generally. For example once Outlook is an Electron app there might finally be an officially supported MS Outlook client for Linux. That's huge.
> Does "desktop" only include Mac OS? Neither on Linux nor Windows does the "app store model" rule. They exist, but are more like an obscure "you can also do it this way" method rather than the norm.
Even on macOS, how much is it the norm? I think my only App Store app on my work MBP is Slack.
For stuff that is available in the App Store, I prefer to get it there, even if it's more expensive than buying it directly from the vendor. It means that when I set up a new system I can go to my purchases tab and quickly re-install the 20-30 apps I have paid for that I only use every 1-2 months, and they'll be ready the moment I need them, without having to dig through my email for license keys and download links.
> if not run Java applications in some kind of runtime environment?
Well, it is a runtime, but it isn't some global environment. Just a process. It doesn't even have a concept of installation. If you have multiple runtimes, then runtime1/bin/java ... and runtime2/bin/java ... would run the program using each image's own configuration and libraries.
> Honestly, this appears to destroy a fundamental promise that Java made when it was first released: Write once, run anywhere.
I fail to see how. You write once (and you even build once; you just package multiple times) and run anywhere.
> Are we now expected to have vendors provide 30 different downloads tailored for every operating system and architecture combination possible?
The answer to your question is that it's up to the developer. If you like, you can write an updater and a launcher and package it for every architecture once, and then have it download your applications as JARs. But, if you choose not to, just remember that jlink "cross links" and the only platform-specific part of the process is the packaging.
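To make the "cross links" bit concrete, a sketch (the paths are hypothetical) of producing, say, a Linux/x64 runtime on a different build machine, simply by pointing jlink at the jmods of a JDK downloaded for the target platform:

    import java.util.spi.ToolProvider;

    public class CrossLink {
        public static void main(String[] args) {
            ToolProvider jlink = ToolProvider.findFirst("jlink").orElseThrow();
            // The jmods come from a JDK build for the *target* platform; the jlink
            // doing the work runs on the build machine's own JDK.
            int exit = jlink.run(System.out, System.err,
                    "--module-path", "downloads/jdk-linux-x64/jmods",
                    "--add-modules", "java.base,java.desktop",
                    "--output", "runtime-linux-x64");
            System.exit(exit);
        }
    }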
> And what if that application needs to be run 10 or 20 years from now, when the packages that were made can no longer run directly?
Then repackage the JARs. It's as before, only more in line with how people want software delivered these days. When preferences change, the deployment process might change again.
> and maybe an embedded-JRE makes sense
It's not a JRE, just a runtime.
> I personally stick with installing whatever JRE package comes with Debian, myself; run them the traditional way.
I think you mean the JDK, and sure, developers can use the runtime in the JDK to run JARs. The new deployment process is for non-developers, who shouldn't have the JDK.
> Trusting this to software vendors is just insane.
Oh no, quite the opposite, in fact. Security vulnerabilities can and do exist in each and every component: the application, the libraries it uses, their transitive dependencies and the runtime. It's better for security for the end user to have one source in charge.
> The "old" way of launching Java applications, both desktop and server, works perfectly fine to the present day, and few seem to be willing to alter that.
Well, it's gone, though, and the new way is so much better that those that have used it don't look back. It's great! It's more secure and a better experience all around. Sure, there are those who are afraid of change; change takes time, and it's perfectly fine. They'll switch to the new, better model, each at their own pace.
> I fail to see how. You write once (and you even build once; you just package multiple times) and run anywhere.
"Packaging" seems to be much more part of writing than it is part of running. In particular it seems like something that has to be done by the original maintainer, rather than something that any user can do for themselves even if the original maintainer and source have disappeared, which was one of the great strengths of Java - I can download a jar that was produced by someone who disappeared 20 years ago, and have a reasonable level of confidence that it will work unmodified on my machine today.
> Oh no, quite the opposite, in fact. Security vulnerabilities can and do exist in each and every component: the application, the libraries it uses, their transitive dependencies and the runtime. It's better for security for the end user to have one source in charge.
For cases where the application vendor is a big competent organisation, maybe. For cases where the application vendor is a hobbyist who could never have maintained the dependencies or runtime on their own, not really - a lot of the time they're simply not up to providing security updates for the whole transitive closure of their dependencies on a timely schedule.
> I can download a jar that was produced by someone who disappeared 20 years ago, and have a reasonable level of confidence that it will work unmodified on my machine today.
You can still do that without a JRE; in fact, it is easier because there is no central notion of a single "Java version."
> a lot of the time they're simply not up to providing security updates for the whole transitive closure of their dependencies on a timely schedule.
Right, but the JRE didn't do that, either. A well-updated runtime with an ill-maintained application and dependencies is no better than an ill-maintained application, dependencies and runtime, especially if the runtime has a lower attack surface area.
> > I can download a jar that was produced by someone who disappeared 20 years ago, and have a reasonable level of confidence that it will work unmodified on my machine today.
> You can still do that without a JRE; in fact, it is easier because there is no central notion of a single "Java version."
But if the jlink-based approach you're advocating had been the norm 20 years ago, that jar file would never have been published; I'd be able to download packages of that application that worked for the systems of 20 years ago, but it would be a lot harder to run those on my current machine.
> Right, but the JRE didn't do that, either. A well-updated runtime with an ill-maintained application and dependencies is no better than an ill-maintained application, dependencies and runtime, especially if the runtime has a lower attack surface area.
Depends on how much attack surface there is to that application and dependencies vs runtime, remembering that the Java standard library actually covers a lot of the basics. Often for a small application the Java runtime (including standard library) is actually the only part of the whole stack that's doing high-risk things (opening network sockets, parsing low-level data formats, implementing stateful protocol negotiations...).
> I'd be able to download packages of that application that worked for the systems of 20 years ago, but it would be a lot harder to run those on my current machine.
The JAR is still there, though, in the package. Although I'm not sure why you'd think that a long-abandoned piece of software is more likely to run on a much later runtime than on its own embedded one.
BTW, it is still perfectly acceptable to deliver an application as a JAR to developers who are expected to have a JDK and to know how to use it, and, of course, JARs are how all libraries are packaged.
> Depends on how much attack surface there is to that application and dependencies vs runtime
Ah, but that attack surface is smaller thanks to jlink. An application only distributes the runtime modules it actually needs.
> Often for a small application the Java runtime (including standard library) is actually the only part of the whole stack that's doing high-risk things
For a small application that is local, i.e. not a server -- perhaps. If it is meant to be used by developers, they can run it with their JDK, and if it is meant for end-users, then surely it shouldn't dictate how all other applications are run. The author of such an application would use some library that adds auto-update for the application (or just its runtime).
But I think that the belief that most attack vectors are due to the runtime rather than the application or its libraries, at least when it comes to servers, is less reality and more wishful thinking. The monolithic runtime in the JRE was both large and heavily studied, so there was a good stream of vulnerability reports, but that doesn't mean it was any less secure than your own application; it could have just felt that way because the application's security wasn't as well tested.
> The JAR is still there, though, in the package. Although I'm not sure why you'd think that a long-abandoned piece of software is more likely to run on a much later runtime than on its own embedded one.
Easy, because I don't want to run the application under Wine.
But it might not run unmodified under a new Java runtime, either. Codebases hacking into the JDK, using internal non-API classes, and making themselves tied to a specific version are common (even codebases that don't do that can become tied to a specific version or a range of versions, e.g. by generating or parsing bytecode). I hope they become less common when encapsulation is finally turned on in JDK 16, but even then applications can choose to selectively remove encapsulation -- for themselves and/or their dependencies -- use internals, and become tied to a specific version. Anyway, the JAR is still there, so you could try.
Fair enough, assuming it's easy to extract that eliminates my big concern. As a user of a less popular OS I worry about the jar-based flow becoming a second-class approach, but I do appreciate that having a first-class way of distributing "native" executables is important.
> Although I'm not sure why you'd think that a long-abandoned piece of software is more likely to run on a much later runtime than on its own embedded one.
The point is that the jar<->jvm interface is a lot more stable than the jvm<->host system one. I suspect old JVM binaries would have linking issues (e.g. libc SONAME) or might not even be the right architecture to run on my current machines.
> Ah, but that attack surface is smaller thanks to jlink. An application only distributes the runtime modules it actually needs.
Sure, but things like low-level parsing are used by the application. It's not quite the same thing, but I maintain a couple of projects where approximately 100% of the security notifications I get are about vulnerabilities in Jackson, not because it's the majority of the library surface but because of what it's doing. Obviously Jackson isn't part of the JVM, but for an equivalent project that was using XML or CORBA I can easily imagine most of its attackable surface being JVM code that the application really does use. There's also things like JMX which the application may well want to include even if it's not directly invoking it, but which I remember as being a significant source of vulnerabilities.
> The author of such an application would use some library that adds auto-update for the application (or just its runtime).
Wasn't that the point of Java Web Start links? I usually favour leaving things up to libraries, but at the same time things need to be secure by default.
> But I think that the belief that most attack vectors are due to the runtime rather than the application or its libraries, at least when it comes to servers, is less reality and more wishful thinking. The monolithic runtime in the JRE was both large and heavily studied, so there was a good stream of vulnerability reports, but that doesn't mean it was any less secure than your own application; it could have just felt that way because the application's security wasn't as well tested.
I actually agree with that - most of those old applications probably only have security by obscurity. But still, there's a significant practical difference between "theoretically vulnerable" and "has a one-click prepackaged exploit". So encouraging every application to ship a static copy of the same chunk of well-known low-level code has significant downsides - any vulnerability will become the equivalent of that zlib double free where essentially every MacOS program had to ship a new version.
> The point is that the jar<->jvm interface is a lot more stable than the jvm<->host system one.
This is complicated. The Java SE spec is very stable. But, unfortunately, many libraries and applications don't just program to the spec, and, rather, hack into the runtime (which is much more than just the JVM) and in that case their compatibility becomes much less stable than the VM<->OS one. This is changing soon with encapsulation (finally!) being switched on in 16, although applications will still have the option to selectively disable it, essentially saying, yes, we're giving up on portability but we know what we're doing.
> XML or CORBA I can easily imagine most of its attackable surface being JVM code that the application really does use.
They're not in the JDK either, anymore. I don't think that the statement that most vulnerabilities will likely be in the runtime is true.
> Wasn't that the point of Java Web Start links? I usually favour leaving things up to libraries, but at the same time things need to be secure by default.
Yes, but it wasn't as good as we wanted. The problem is that Web Start still centralises the runtime and forces the user to interact with a third party. It's better for the application to choose how to manage its updates. This could, indeed, be done in a library, and I don't see why it would be any less secure than the JRE.
> So encouraging every application to ship a static copy of the same chunk of well-known low-level code has significant downsides - any vulnerability will become the equivalent of that zlib double free where essentially every MacOS program had to ship a new version.
But it's not the same chunk. jlink customises the runtime to only contain the modules the application actually needs. This means that the JDK modules are like any other dependency, and there's no good reason to have a different update mechanism for them, which would also be controlled by a third party, especially since you'd need the application and dependency update anyway.
From the outside it looks like an understandable shift away from desktop applications (the devs and money aren't there). I made a desktop app using Clojure + CLJFX and got a ~25MB JAR. It then balloons to a ~65MB jlinked package. I ended up just releasing the JAR and telling people to install a Java 11 runtime (though I may revisit this at some point).
I think what it boils down to is that b/c of Java's reflection you simply can't get rid of unused code/dependencies and things constantly bloat very fast. Small apps that should be 2MB are >20MB and are "un-emailable". On the server side this is a nonissue but to me it makes the ecosystem unsuitable for user-facing applications. There is a reason you don't see people making lil JVM desktop applications anymore.
I love writing Clojure, and writing the app with CLJFX was fantastic and fun, but if I wanna make a quick app to crunch some data and make some plots I'm now very hesitant to reach for the JVM. I'm hoping that with GraalVM the situation will change (b/c I don't think it supports reflection/RTTI and it can prune dead code/dependencies) but last I checked it didn't work with JavaFX.
PS: I manually pruned the dependencies in `deps.edn` with `:exclusions`, which helped a lot. Otherwise the app is in the ~200MB range. But it's a manual process and tedious. The toolchain can't tell which code is actually unused.
https://github.com/geokon-gh/corascope/blob/master/deps.edn
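To make the reflection problem concrete, here's a contrived sketch (the class name is made up) of the kind of dynamic lookup that static shrinkers -- ProGuard, jlink plugins, GraalVM's closed-world analysis -- can't follow:

    public class PluginLoader {
        public static void main(String[] args) throws Exception {
            // The implementation class is chosen from configuration at runtime,
            // so a static shrinker cannot tell which classes (and which of their
            // transitive dependencies) are actually reachable.
            String impl = System.getProperty("exporter.class", "com.example.CsvExporter");
            Runnable exporter = (Runnable) Class.forName(impl)
                    .getDeclaredConstructor()
                    .newInstance();
            exporter.run();
        }
    }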
As a former servlet API guy who switched over to Android, I'm getting good results using proguard for pruning unused dependency parts from an ancient desktop app I'm maintaining on the side. Surely a hit on build times, and on sanity as well whenever you use a library that relies on reflection internally, but it completely changed the way I deal with the convenience/download size tradeoff when deciding on dependencies.
Yeah, it's not a bad idea. I did try proguard at one point.. but it's such a baroque tool. It's likely something I did wrong, but I never managed to get proguard to remove anything properly. Either nothing was removed or it'd remove too much and kill the program. It's been a while, but something about it gave me the impression it didn't play well with Clojure.
Baroque it is. Clojure might be a death sentence for the proguard approach, considering how deeply the terms Clojure and invokedynamic seem to be connected. But if you happen to depend on some deep library stacks only for some minor scraps of their functionality it might still be worth it: define a blanket exception for your own code and, if not picked up automatically, for the few entry points where you call from Clojure into the Java world. If you don't mind keeping twice as much as strictly necessary, kicking out some low-hanging fruit shouldn't be that hard. (But in combination with build times and future friction a decision against proguard might still be very, very rational.)
Right, yeah. That was exactly the issue. Tons of garbage gets dragged in when you use a little corner of a huge library. And I'm mostly leaning on Java libraries. Yet it seems it somehow couldn't figure out which parts I was using and which I wasn't. I'll try to give it another shot eventually. Thanks for the tips :)
People generally can't email apps anyway though. Or do you mean you just feel a general sense that desktop apps should be a few megabytes in all cases? That is achievable when you depend heavily on the operating system to supply functionality, but is a lot harder to achieve when writing to portable toolkits. Bandwidth and disk space make this less of an issue these days though.
That is the official Oracle line. However, the reality is that lots of organizations download JREs from AdoptOpenJDK [0] and go about their business as usual, without packaging their apps to fit Oracle's now-preferred model of delivery.
Yeah, that is what I download on my Windows PC for running Java programs too.
However, what pron writes isn't to be dismissed, as it sounds like Oracle/OpenJDK developers do not care anymore about backwards compatibility and are looking to force people into not relying on system-wide libraries so they can start breaking things in newer versions without user-facing repercussions (I guess developer time to upgrade code doesn't matter; probably after years of webdev everyone is already accustomed to the idea of wasting time on fixing their dependencies instead of working on the task at hand :-P).
IMO this isn't a good thing, as basically the biggest benefit Java always had was its stability and the ability to run Java programs going back to Java 1 days (Sun even sued Microsoft over breaking JVM compatibility - and won).
> as it sounds like Oracle/OpenJDK developers do not care anymore about backwards compatibility and are looking to force people into not relying on system-wide libraries so they can start breaking things in newer versions without user-facing repercussions
Backward compatibility is of extreme importance, as always, but applications and libraries have had ways to bypass the documented API and hack into the runtime's internals. Which meant that the only way to achieve real backward compatibility was to change absolutely nothing. This is pretty much what happened for some time, not because anyone wanted it but because the investment in the platform decreased in Sun's dying days, and it took some years for Oracle to ratchet it back up. Now that the platform is under heavy development, even with the same commitment to backward compatibility, it is far from perfect because libraries have gotten used to hacking into the JDK. We are, however, taking steps to stop that, by encapsulating internals.
The new model is just better, and wasn't chosen because of this, but it is certainly another benefit.
No one downloads JREs from anywhere for versions later than 8 because they no longer exist. They might download runtimes that various vendors, like IBM (under the brand AdoptOpenJDK), call JREs but that aren't. Also, IBM's Adopt team is not involved with OpenJDK and is a bit confused about the project in general, but it is true that even more trustworthy JDK vendors distribute runtimes that they confusingly call JREs to provide a sense of continuity. Rest assured, though, those aren't JREs, and they do not support the old deployment models.
Also, that "line" is from the actual developers who develop OpenJDK.
There might not be an official Oracle JRE(TM) anymore but JREs in the sense of independently distributed runtimes clearly still exist (and the acronym seems like a reasonable one).
I can completely understand why Oracle might want to shift the Java world towards the jlink/embedded runtimes model.
I can also completely understand why folks who've been shipping .JARs out to other folks who want to run them using runtimes obtained and distributed separately, which they call JREs, and have been doing so for the past 25 years, would find the claim that "JREs no longer exist" as a concept confusing at best and disingenuous at worst.
No, not really. There are runtimes that vendors call JREs, but they don't provide the same functionality as the JRE, only that of a Java runtime. Most of the code that's made the JRE the JRE has never been open-sourced and made part of OpenJDK.
> this claim that the concept no longer exists, confusing at best and disingenuous at worst.
It's not a claim. The JRE is gone. You can scour the OpenJDK source code for the JRE bits. It's just not there. You will find, however, the JRE's worthy successor, jlink. You will also find jpackage. Of course, I didn't mean that the concept is gone from people's minds (and it seems that OpenWebStart is attempting to manifest it in their software), but it isn't in OpenJDK.
True, you can insist on asking your customers to install a Java runtime from some third party, and you can suggest that they share it among various applications, but that's not quite how the JRE used to work, it is not the recommended practice for OpenJDK, and it is confusing and disingenuous to claim that this obstinacy can conjure back the erstwhile JRE. It was great while it lasted, maybe it will come back in some other form in the future, but right now we have something different and better. In time, all Java developers will internalise this fact -- and come to enjoy it, I both hope and trust -- regardless of how long they've done things differently.
If it's a runtime environment for Java, it's a JRE in the sense that the term is usually used and understood. Language is a tool for communication; if you want "JRE" to mean something other than the common-sense interpretation of that acronym, the burden for making a clear distinction between your technical definition and the common-sense everyday use of that term is on you.
It's a runtime, not a runtime environment. The common meaning of a JRE is some system-wide Java environment, that the user obtains from a third-party (Oracle) and installs on their machine, with a system-wide "current" Java version. That no longer exists.
It is true that if everyone understood the difference between a JRE and any other Java runtime, yet chose to call them all "JRE" and expect the meaning to be understood from context, insistence over proper terminology would have merely been pedantic, but that's not the situation. Discontinuing the JRE in favour of custom runtimes was a big change, and one that people still don't fully understand.
And yet users obtain runtime environments for programs distributed as java bytecode and install them on their machine, system wide, all the time. Claiming the opposite isn't pedantic, it's something far beyond pedantic. For this usage pattern, the only thing that has disappeared is the distinction between full JDK download and the slightly trimmed down "runtime only" that had been causing all kinds of avoidable trouble immortalized by the Google search term "tools.jar".
I am not "claiming" anything. I am informing developers that as of versions beyond 8, we've discontinued the JRE and it no longer exists, and what you've described does not replicate its functionality.
> For this usage pattern, the only thing that has disappeared is the distinction between full JDK download and the slightly trimmed down "runtime only"
That's incorrect. What's disappeared is any notion of a system-wide Java environment managed by a third party, and that's what the JRE was.
The JDK is now also not a system-wide environment in any way. While the usage you've described of asking end-users to "install" a JDK (i.e. placing it on your hard drive; the JDK also no longer has any installation beyond that) and pointing some system-wide configuration at it is discouraged for non-developers, and while people might still do it despite the JDK providing a much better alternative, it still does not work like the JRE, and not only because the JDK is larger (BTW, speaking of the JDK size, as of JDK 9, the JDK contains the entire core libraries not just once but twice, once as jmod files used by jlink to create runtime images, and once again in its own runtime image).
It seems you're struggling with the distinction between what Oracle calls the JRE, and what large or at least significant-to-HN swaths of industry seem to be calling it.
You seem very fixated on the desktop aspects of this that have been removed, such as JWS and the Control Panel. Neither of those are relevant to Linux server distributions.
Until common Linux distributions no longer have an option to globally install a "default" environment that provides a Java Runtime via /usr/bin/java globally, it is nonsensical to talk about the JRE "not existing" regardless of what the OpenJDK team has decided for the future.
I guess you're asking whether, now that the JRE no longer exists [1] and some people choose to distribute other kinds of Java runtimes and call them "JRE" (different runtimes named JRE from different vendors are not the same), that is an acceptable use of the term. I don't really care as long as people understand, one, the difference between the JRE and any other Java runtime, whatever they choose to call it -- I will use the OpenJDK terminology, though -- and, two, that OpenJDK discourages the use of runtimes managed by third parties on non-developer systems and such users shouldn't be encouraged to install one; a runtime shared by multiple applications and managed by their vendor is perfectly fine (as is deploying an application intended for developers as a JAR and relying on them to use their JDK's runtime to run it).
BTW, the JRE was desktop software. For the server, there was a "Server JRE" (https://www.oracle.com/uk/java/technologies/javase-server-jr...) which is, indeed, much more similar than the (desktop) JRE to the various server runtimes various vendors now call "JRE".
[1]: Technically, OpenJDK has never had a JRE because the JRE has never been open-sourced, but now that Oracle has open-sourced the entire JDK and discontinued the JRE entirely, there is no more a popular JDK that isn't an OpenJDK build (although there are not-so-popular JDKs that aren't OpenJDK, like OpenJ9), and so the popular Java implementation no longer has a JRE.
The JRE was mostly about the plugin (Applets) and Web Start. Those two are gone. Then its main features were the auto-update and the control panel. These are also gone.
It's not an environment, just a runtime. The difference is that the JRE was some system-wide global environment, whose version was selected by the user and controlled with a control-panel. Now the runtime is just an executable picked by the application.
The dominant old deployment model has been and still is building a fat jar (or a folder of jars with a start script containing a comically long classpath) and declaring which minimum version of a java installation it requires. This might not have received much attention at Oracle and likely not even in later Sun days, but that's because it worked so well that it didn't need any.
Linux distros still ship the JRE and JDK separately (CentOS & Ubuntu, to name a few). There are a few reasons for this: the first one is that libraries will get shared [1] between multiple Java processes. The second one, probably the most important, is that the distro will keep the JRE/JDK up to date with security patches. It will usually take a single package to be updated instead of (manually) updating every Java app's JVM out there.
If you run multiple java apps on a single server, per-app JVM can be a problem.
[1] Here I'm talking about the sharing of core JVM libraries (e.g., libjvm.so), done by the Linux kernel.
No one ships a JRE (for versions after 8) because it no longer exists. Some distributors ship runtimes that they call a JRE, and the main (and pretty much only) reason for that is that some people are used to the old way and this gives them the illusion it still exists. It's a harmful practice, and such runtimes should not be used. Developers can use the runtime in the JDK, and end-users should use runtimes embedded in applications. The security story for embedded runtimes is both easier and safer, because each and every application needs to be updated anyway for vulnerabilities in the application itself or its dependencies, and embedded runtimes have a lower attack surface-area, as they include only modules the application actually needs.
Your argument is very similar to the one that people pushing static linking make. As such, “just update the application” has the exact same response: nobody is going to do this.
I am not making an argument, I am explaining the situation, which is true regardless of what people think about it. I'm also not saying "just update the application." It's up to you how to maintain your software, but if you don't, the situation was just as bad with the JRE.
Maybe I'm missing something, or for the OpenJDK team, JRE means something else, but how come AdoptOpenJDK still provides JRE builds [1] for all versions from version 8? You are saying that their JRE build is the JDK but 4x smaller?
> The security story for embedded runtimes is both easier and safer, because each and every application needs to be updated anyway for vulnerabilities in the application itself or its dependencies, and embedded runtimes have a lower attack surface-area, as they include only modules the application actually needs.
So, you are saying that if someone discovers a security bug in Java/JVM 11.x (assume the latest official version), it is a better practice to update every application on servers instead of just pulling down a newer JVM build and letting the package manager do the magic? Sounds promising /s :S
> You are saying that their JRE build is the JDK but 4x smaller?
I am saying they're distributing a runtime image and calling it a JRE, even though it isn't one (i.e. it isn't centrally managed). Also, the IBM team that builds the Adopt distribution is not involved with OpenJDK and is a bit confused about the project, but, to be fair, even competent vendors, like Azul, provide runtime packages that they've chosen to call (confusingly, IMO) JREs.
> So, you are saying that if someone discovers a security bug in Java/JVM 11.x (assume the latest official version), it is a better practice to update every application on servers instead of just pulling down a newer JVM build and letting the package manager do the magic?
No. First, you could let the package manager do the update either way. An application vendor still has the option of sharing a runtime among apps; the point is that it is up to them. Second, you need to update all of your applications regularly regardless. They have security vulnerabilities, too. Instead of having different update mechanisms for different application components, each application will have one update mechanism of its own choosing.
Luckily, Oracle cannot just delete concepts. The JRE does not exist on Oracle's management roadmaps, but the world can still use it.
Zulu has a JRE (even a JRE with JavaFX), which is so handy and has the added advantage of not being from Oracle.
There is a reason tooling lags in its support for jlink: it's a mandated "solution" that nobody asked for. Just like JPMS, another great solution from Oracle in search of a problem, which is still bearing fruit 4 years later.
> although, admittedly, popular Java build tools currently lag in their support for jlink
Just an anecdote. We recently shipped, using jpackage, a CLI companion app to one of our WebApps that mostly dedups a few dozen XLS files before uploading the content to the WebApp.
We are using this Maven plugin [1] to generate an exe, an rpm and a dmg.
We spent far more time figuring out how to notarize an application on macOS than using jpackage.
Other than size savings for the download and storage of potentially redundant things-formerly-known-as-the-JRE, were there performance benefits to having a shared JRE?
Theoretically, a slow but Java-optimized system could cache (components of) the JRE in memory and make those pages available to novel JRE-based executables, much like a binary shared library [0]. Was this ever done in practice? I imagine it would have sped up Java Web Start startup times considerably - and mobile platforms as well, perhaps?
On the desktop a clear no; on the server, back when you still had "apps per server" instead of "servers per app", who knows what might have been going on between the classloader hierarchies of your favorite EE platform and its JVM.
Hard to say, because many performance improvements were done after the JRE had been removed. There are reasons why a custom runtime can have better performance, though. jlink can place both a modular application and the runtime into a single image file that can be loaded quickly, and whose metadata can be cached efficiently.
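For the modular case, a sketch of what "application and runtime in a single image" looks like in practice (the module and main-class names are made up); jlink can also generate a launcher inside the image:

    import java.io.File;
    import java.nio.file.Path;
    import java.util.spi.ToolProvider;

    public class AppImage {
        public static void main(String[] args) {
            // Platform modules come from the local JDK's jmods directory;
            // mods/ holds the application's own modular JARs.
            String jmods = Path.of(System.getProperty("java.home"), "jmods").toString();
            ToolProvider jlink = ToolProvider.findFirst("jlink").orElseThrow();
            int exit = jlink.run(System.out, System.err,
                    "--module-path", "mods" + File.pathSeparator + jmods,
                    "--add-modules", "com.example.app",
                    // generates bin/myapp next to bin/java inside the image
                    "--launcher", "myapp=com.example.app/com.example.app.Main",
                    "--output", "image");
            System.exit(exit);
        }
    }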
I was just going to point out that look, right there on Oracle's website, you can download the JRE. Even the "latest" version, 8. Then I opened AdoptOpenJDK in another tab and realized there is version 15 already?!
Seems like a marketing fail - it's not the JRE that's gone, but all the recent versions of Java without a JRE that are.
From the non-developers' perspective (end users and admins), "Java" is still very much a system component. The perception is basically: the only reason it is not preinstalled is due to some legal kerfuffle in the Windows XP era, and you don't have to care about versions because it updates itself (annoyingly). It's sad that the idea of a JRE on every device has been given up.
I don't think it is sad at all, because the JRE was replaced by something better -- or, at least, more suitable to the current software landscape -- if perhaps still misunderstood: a software vendor decides which runtime to use, how to manage it and whether to share it among multiple applications rather than a third party controlling a component the end user doesn't understand.
As to the marketing front, I would admit a failing on our part. Marketing has been centrally controlled by Oracle rather than by individual product divisions, which has made communication cumbersome, although that is now changing. This is perhaps a good time to emphasise that I'm not speaking on behalf of anyone in any official capacity, just on behalf of myself, as one of the hundreds of people working on OpenJDK at Oracle.
In any event, that is the situation and people might need to change their perception. They don't have to like the new situation, but they do need to understand it, although so far it seems that once they understand it, they also like it.
> although, admittedly, popular Java build tools currently lag in their support for jlink
That was definitely a problem. I had some bad experiences with maven-jlink-plugin last year, and seeing that it was stuck in 3.0.0-alpha1 since 2017 did not exactly inspire confidence.
However, I just checked and the development of the plugin is now moving again: they released 3.0.0 in November and 3.1.0 in December, and the documentation has also been updated. So I guess it's time to try again...
It has been officially discontinued for Oracle Java 11 and the OpenJDK. Some other vendors do offer a JRE and Amazon has their own variant of the OpenJDK called Corretto.
Conceptually I always understood the difference between the JRE and the JDK to be that "the JRE lets you run Java" and the "the JDK lets you develop Java".
To me this is the same as having libc vs. libc-dev or xlib vs xlib-dev or whatever installed.
I don't really understand the branding distinctions but I would imagine there's still room for a smaller download that lets you run .JARs without including a javac and profiler and whatever else you might want in order to develop and debug them?
[EDIT: Nevermind, I see that text all applied to Java 8 but not to Java 11 or Java 15. And implicitly as others have explained in this thread the reason being that there is no official use case for someone just wanting a JRE to run a .JAR they have; they should not just have a .JAR they should have obtained all of it together from the application vendor...]
Which Java package do I need?
Software Developers: JDK (Java SE Development Kit): For Java Developers. Includes a complete JRE plus tools for developing, debugging, and monitoring Java applications.
Administrators running applications on a server: Server JRE (Server Java Runtime Environment): For deploying Java applications on servers. Includes tools for JVM monitoring and tools commonly required for server applications, but does not include browser integration (the Java plug-in), auto-update, nor an installer. Learn more
End user running Java on a desktop: JRE: (Java Runtime Environment): Covers most end-users needs. Contains everything required to run Java applications on your system.
That minimal JVM was about 300 kb in size, so nothing compared with the rest of your app's .class files. So Java executables can in principle be smaller than comparable Go executables.
That sounds far from trivial. Now in addition to your application you also have to maintain your minimal JVM to keep up with the latest developments in the mainstream ones.
I mean it’s not that much more complicated than webpack. You don’t actually have to create your own minimal JVM. But “take the JVM source, remove a bunch of stuff, and then compile” is pretty pedestrian as far as carrying patches for your dependencies goes. There are shops that are running their own modified kernels in prod.
At the time, I had a Minecraft jar on my hands, and I was tired of it not showing up in Spotlight's results because it was "just" a Jar. So I cobbled together this lazy script to get around that. I guess I got carried away writing the documentation when I published it, but it was never meant to be used "in production".
People often stumble upon the project and pose questions (same goes for some of my other projects, like the barely-standing https://www.open-elevation.com/ ), but I just don't have the energy and time to fix things, answer issues and all that jazz.
I feel _really_ bad for not helping people out, clearing their doubts, fixing their issues, and improving my "creations" that were thrown out into the world. It's just that from my perspective, these things I built are extremely simple, mostly weekend-projects, which were for my own use, and I just happened to put them out there to help out whoever needs them. I accept donations on https://www.open-elevation.com/ to try to at least break-even on the cost of the hardware (I don't, not even close, but that's okay), but really I don't have time and energy to help out...
All this to say that I'm sure that if I had put in just a tiny bit of effort throughout the years, even if just by looking at issues and pull requests created by others, jar2app could be on this list (and _especially_ because of the work that others could have put into it, if I had just provided them with feedback once in a while).
So, if anyone's out there, I'm sorry! I really just don't have the time, energy and in some cases the money...
The easiest way to distribute JVM applications on the desktop is the via the browser. TeaVM and Flavour make it easy to build rich browser apps in no time. You won't even miss Swing or JavaFX!
Scala.js is an alternative if you write Scala: a compile target for the browser (or Node, of course) that can also use most TypeScript libraries via the ScalablyTyped plugin.
What's the story with JavaFX nowadays? I'm not a Java dev, but from a user perspective it's a bit of a PITA since OpenJDK doesn't (?) package JavaFX with it (and neither does the latest Oracle JRE/JDK?). It seems to be open source, but the main program I'm thinking of, an old-school raytracer for Minecraft called Chunky [0], requires it be packaged with the JDK, hence the requirement for Java 8.
Is this just laziness on the part of those devs to not package it up or is there something else going on?
Yes, this indeed looks like laziness on the developers' part to me. When the app was developed, JavaFX came pre-packaged with the JDK/JRE, so they still rely on it. Today, it should perhaps just be a regular dependency and be bundled as part of your app.
If I were to build a Java desktop application today, I would even bundle the entire JRE, since you cannot assume that it is installed on most desktops anymore.
AFAIK you need to include the JavaFX DLLs with your application and configure an appropriate java.library.path. It's not laziness; JavaFX is not part of Java anymore, so you need to deploy it like any other native component.
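Another way to treat it like any other dependency, if you control the runtime: link the JavaFX modules straight into a custom runtime image. A sketch, assuming the platform-specific JavaFX jmods (Gluon publishes them) have been downloaded into a javafx-jmods/ directory:

    import java.io.File;
    import java.nio.file.Path;
    import java.util.spi.ToolProvider;

    public class FxRuntime {
        public static void main(String[] args) {
            String jdkMods = Path.of(System.getProperty("java.home"), "jmods").toString();
            ToolProvider jlink = ToolProvider.findFirst("jlink").orElseThrow();
            // The resulting image carries the JavaFX native libraries with it,
            // so no separate DLLs or java.library.path configuration is needed.
            int exit = jlink.run(System.out, System.err,
                    "--module-path", "javafx-jmods" + File.pathSeparator + jdkMods,
                    "--add-modules", "javafx.controls,javafx.fxml",
                    "--output", "fx-runtime");
            System.exit(exit);
        }
    }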
JavaFX is no longer part of the core OpenJDK distribution, but it's not dead and gone, it's just maintained separately.
There's no need to stay on Java 8 to use JavaFX, although I'm not qualified to speak on whether it would be a lot of effort to upgrade a JavaFX application to a newer Java.
One issue is that I think Oracle forced package renames and, afaik, there’s no compatibility layer available for packages that expected the old Class Names. To my mind, this sort of defeats the purpose of the JVM’s universally unique symbol names.
I think I’m confusing it with the Java EE changes. However, renaming is only easy to accommodate if you have source access. The JVM’s model is portable binaries, so forcing renames breaks some of that model.
My understanding is that JavaFX is discontinued and so that 'it happens to still work with Graal' is just a bonus. Things change over time, so unless there is a focus on things actually working and integrating, this won't last.
Oracle has in fact discontinued their support for JavaFX, and of course gave up on it long ago. It's apparently being supported by a tiny company 'Gluon' [1] so I'm wary to think that there will be any material progress comparable to a well supported platform.
There is a new problem with jpackage after JDK 14: when your app depends on native libs, you'll get an UnsatisfiedLinkError. There is a ticket for that, but it seems that one core dev couldn't repro it and the ticket is closed now... https://bugs.openjdk.java.net/browse/JDK-8259661
> If the issue is incomplete, add a comment noting what is needed and close the bug as 'Resolved' - 'Incomplete'. This is our way of saying "need more information".
This is the text from the actual bug report:
> Looks like issue with provided demo app. [...] Please provide more complete example with command line used by jpackage, verbose output of jpackage and output of error message when resulting app is run from terminal.
Another important and recent Java distribution option is GraalVM native compilation. We are using it for an encryption app and a Spring Boot app. Our final native version of the GraalVM application is not only smaller in size but super fast. The only drawback with GraalVM is that the time it takes to natively compile is very high and heavy on the CPU (I am sure this will be solved in the near future).
Declaring a dependency on https://packages.debian.org/buster/default-jre makes apt install the LTS release. A lot of comments here are actually advocating static linking, which is kind of a shock—I don’t see a good reason for it at all.
Except it is up to Debian to keep that JRE running, as it is no longer part of default Java builds.
All the JREs after Java 9 have to be maintained by the community, if they care to still use them.
JREs don't work in the days of App Stores; even Google has finally grasped that ART cannot be part of the OS and has taken the steps to ship it out of band on Android 12.
I've often wondered why Sun never made .JAR files an executable format like .sh, .bat, .tcl, .py, .doc, etc. It seems like a good way to reduce friction in binary distribution and get a lot more users of the language.
The installers used to associate .jar with the java executable (at least on Windows) so a double click would start it. I don't know if they do this still.
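For a double-click (or `java -jar`) launch to work, the archive also needs a Main-Class entry in its manifest. A sketch of producing such a JAR with the JDK's own jar tool (the class and directory names are made up):

    import java.util.spi.ToolProvider;

    public class MakeExecutableJar {
        public static void main(String[] args) {
            ToolProvider jar = ToolProvider.findFirst("jar").orElseThrow();
            // --main-class writes the Main-Class attribute into META-INF/MANIFEST.MF,
            // which is what `java -jar app.jar` (and a double-click association) uses.
            int exit = jar.run(System.out, System.err,
                    "--create",
                    "--file", "app.jar",
                    "--main-class", "com.example.Main",
                    "-C", "classes", ".");
            System.exit(exit);
        }
    }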
So much this. I understand that it's easiest to reuse your react skills to build your desktop application, but why does it have to consume so much memory? Is this something that's wrong with electron or rather the UI frameworks that run on it?
IntelliJ uses a ton of custom widgets and styles, and nowadays even a custom JVM build to look the way it does. And on Linux, fonts in IntelliJ are hit-and-miss in my experience.
Time has passed and more importantly Electron based apps lowered the expectations of desktop users.
I use dbeaver (https://dbeaver.io/) as my go-to database tool and it's absolutely fine. Not as snappy as native apps for sure, but between Spotify, Slack and VSCode I can't really complain. I used to... but not anymore.
Reading the article and seeing applets, my mind was like "hey, 90's tech, let's talk about ActiveX too". I mean, if we're talking about dead & malware-ridden tech for the web, ActiveX is king, followed by Java applets, and only in 3rd place is Flash.
Don't get me wrong, current model of app stores and their "verified" apps is just as bad when it comes to malware, but hey, it is what it is.
Wouldn't Flash being ActiveX on IE make it the worst as it becomes the sum of both its own and ActiveX's badness? :-P
(FWIW I think that, complexity aside, ActiveX in general is a nice idea - but it wasn't designed for the web and Microsoft just shoved the closest equivalent they had to applets into IE because they were afraid of Java's compile-once-run-anywhere threatening their desktop dominance, and ActiveX's reliance on x86 and the Windows API would put a stop to that; but they didn't put much care into it beyond that point)