Java was not underhyped in 1997 (2021)
59 comments
· July 17, 2025 · jasode
le-mark
> Java and JVM's WORA "Write Once Run Anywhere" will kill Microsoft's Evil Empire because it will render Windows irrelevant.
There was that sentiment, but it doesn't fully capture what the hype was about. Java out of the gate had Sun's network vision built in via JNDI, RMI, and object serialization. The hype was about moving applications onto the network and off of Windows or any particular vendor's OS.
And this did come to pass, just not how Sun was selling it with Java. For example, Office, Microsoft's crown jewel and anchor into a lot of organizations, is now almost entirely web based.
hedora
It didn’t really come to pass.
Office web is comically slow, even when I have $5K of machines and 100ish GB of DRAM lying around my house.
In the Java vision, it'd transparently offload to that network of machines. Also, the user could decide not to offload to untrusted hardware (e.g., I don't want to trust Microsoft's cloud compute).
ndiddy
I think the "move applications onto the network" idea basically killed Java on the desktop and relegated it to a backend only language. It wasn't a bad idea, but it was too early. Because of the focus on network distribution, the two main ways to distribute desktop Java software were Java WebStart (lengthy start-up times on early 2000s internet speeds, integrated poorly with your OS) and applets (only viable for corporate environments where the IT department had control over the whole network and all the clients on it due to security problems, no integration with your OS). If you had to distribute software that ran locally, you had to roll your own solution or buy some third-party tool since Sun didn't have anything that made this easy.
zozbot234
> Java out of the gate had Suns network vision built in via jndi, rmi and object serialization.
It's kind of obvious since having a standard, platform-neutral virtual machine doesn't just enable WORA; it enables sending binary code and program state over the network, which is quite handy for all sorts of distributed computing flows. We'll probably do the same things using Wasm components once that tech stack becomes established.
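The serialization piece of that vision is easy to sketch. Here's a minimal, hypothetical example (class and field names invented for illustration) of round-tripping an object's state through a byte stream — the same mechanism RMI uses to ship state across the network:

```java
import java.io.*;

public class SerializationDemo {
    // Any class implementing Serializable can be turned into bytes.
    static class Point implements Serializable {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    public static void main(String[] args) throws Exception {
        // Serialize a Point's state into a byte array...
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Point(3, 4));
        }
        // ...and reconstruct it, as RMI would on the far end of a socket.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            Point p = (Point) in.readObject();
            System.out.println(p.x + "," + p.y); // prints 3,4
        }
    }
}
```

In 1995 that built-in, platform-neutral wire format for program state was a genuine novelty; in C/C++ you wrote it by hand for every struct.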
zozbot234
> Java will make lower level languages with manual memory allocation like C/C++ obsolete because CPUs are getting faster.
Except that this actually happened with regard to a whole lot of application code. Sure, Java was slow and clunky but at least it was free of the memory unsafety that plagued C/C++. What was the mainstream "safe" alternative? There was no Rust back then; even Cyclone (the first memory-safe C-style language) was only released in the mid-2000s.
jasode
>Sure, Java was slow and clunky but at least it was free of the memory unsafety that plagued C/C++. What was the mainstream "safe" alternative? There was no Rust back then,
Before Sun's Java in 1995, companies built enterprise CRUD apps with memory-"safe" languages using MS Visual Basic, PowerBuilder, and xBase languages like dBASE and FoxPro. This allowed them to develop line-of-business apps without manually managing memory in C/C++.
gompertz
Interesting! I never heard of Cyclone before. Looks like another Bell Labs contribution.
vanschelven
Missing from the article — which is funny, considering it's written from the perspective of a university student — is how deliberate Sun’s academic strategy was. In 1998 they launched the "Authorized Academic Java Campus" program, licensing Java tech to universities and setting up official training centers. Even before that, they were offering Java tools free to schools for teaching and research.
Combined with a massive branding push — Sun doubled its ad spend from 1995 to 1997 — Java ended up everywhere in CS education. By the late ’90s, first-year courses using Java weren’t a coincidence; they were the result of a planned, top-down push.
nailer
I'm a professional programmer now, but I was not in the early 2000s, and I remember looking at my girlfriend's computer science book. I saw the unnecessary boilerplate and leaky abstractions of 90s-style OOP and wasn't sure if I was wrong or 'serious' programming had become insane. It was nearly a decade before I worked out that Java had nothing to do with Alan Kay's original concept of OOP and the rest of the industry started to abandon it.
ecshafer
Java was not an attempt to get mainstream programmers to LISP or Smalltalk; it was to get them halfway there from C++. It was just to make application software a little nicer, with less manual memory management.
I also don't think we should blame Java the language for the OOP insanity that also infected C++, Delphi, etc. It was an industry-wide insanity that thought it could replace pesky programmers with a single god architect.
whobre
It was ridiculous. They seriously wanted to rewrite everything in Java, including office and web browsers. It was 10 times worse than the recent “rewrite in Rust” mania and way more unrealistic.
hodgesrm
> It was ridiculous. They seriously wanted to rewrite everything in Java, including office and web browsers.
There's another perspective. Many people were looking for something like Java well before it was released: VM-based, portable, modern object-orientation features, etc.
Case in point: databases. In the early 1990s I worked on a project at Sybase that attempted to rewrite SQL Server from the ground up to bring in object [relational] support. The team focused on VM-based languages as a foundation, which were an area of active academic research at the time. Built-in object support, portability, and ability to support code generation for queries were among the attractions. The project started with Smalltalk (slow!), then moved to an acquired VM technology (it was bad!), and finally a VM we designed and built ourselves. These gyrations were a key reason why the project failed, though not the only one.
When Java came out in 1995--I got access to the alpha release in September--it met virtually every requirement we were trying to fulfill. At that point most attempts to build new databases on other VM tech became instantly obsolete. (Other vendors were also looking at VMs as well.)
Not coincidentally Nat Wyatt and Howard Torf, a couple of key engineers from our project, founded a start-up called Cloudscape to pursue the Java route. They created the database we know today as Derby.
Somewhat more coincidentally, Java became dominant in American DBMS development after 2000; Hadoop, Druid, Pinot, and HBase are just a few examples. I say "somewhat more coincidentally" because at that point most of us saw Java as simply more productive than C/C++ alternatives for building reliable, high-performance, distributed systems. That view has obviously evolved over time, but between JIT, dev tooling, and libraries it was definitely true in the early 2000s. It helps to remember how difficult C++ was to use at that time to understand this perspective.
In summary, a lot of the hype was the usual new technology craziness, but Java also met the needs of a population of people that went far beyond databases. There was a basis for our excitement. Just my $0.02.
Edit: typo
II2II
Two points:
The hype accomplished something that would be otherwise impossible: it established Java as a language in what was likely record time. Consider another popular language: Python. It was created about 5 years earlier, yet it rose to prominence about a decade later. Or consider Rust. It is, in many respects, as significant as Java. While major developers were shipping significant applications written in Java within 5 years, Rust is only creeping into important software a decade after its introduction.
The second point is it's easy to underestimate the dominance of Microsoft in those days. You think Microsoft is dominant today? Well, that's nothing compared to the late 1990s. Microsoft's market share was closer to 95% of the PC market. The workstation market was starting to crumble due to competition from Microsoft and Intel. About the only thing that was safe was mainframes, and that was highly dependent upon one's definition of safe. Nearly everyone who was competing against Microsoft wanted to see a chunk taken out of them, which meant pretty much everyone, since Microsoft had its fingers in so many markets. And, as it turns out, nearly everything did have to be rewritten. Sometimes it was to deliver SaaS over the Internet and sometimes it was to target mobile devices.
zozbot234
If I had to pick a language that's "as significant as Java", I'd pick Golang way before Rust - and Golang has found significant success. The first genuinely usable version of Rust was only out in late 2018, so it's way too early to argue about its slow and "creeping" adoption curve.
> The second point is it's easy to underestimate the dominance of Microsoft in those days. You think Microsoft is dominant today? Well, that's nothing compared to the late 1990s. Microsoft's market share was closer to 95% of the PC market.
By the late 1990s Linux had become a viable platform for a whole lot of things, and people were beginning to take notice. Most obviously, that probably put a big dent into the adoption of Windows NT as a server OS on x86 machines, which had been progressing quite well until the mid 1990s. That also probably helped Java because it meant you could seamlessly run your server workloads on "toy" x86 machines or on more "serious" platforms, without changing anything else.
klntsky
RIIR is justified in most cases because, for most apps, memory safety used to be the only reason to use a GC'd language.
RIIJ was justified too, because people believed the web would end up being Java applets all the way down.
cenamus
And RIIJ also gives you memory safety
throw0101c
> And RIIJ also gives you memory safety
I think Java helped in the mainstreaming of memory-safe and GC languages, especially in more corporate spaces where C/C++ was still mostly a thing.
Certainly sysadmins were using a lot of Perl during that time, but for "real" enterprise software, I don't think dynamic-ish languages were as accepted. The use of Perl and rise of Python widened the Overton window.
mdaniel
And plausibly sandboxing, too, since the JVM used to carry a policy language with it that allowed granting access by package (or by public key, IIRC), a vestige from its days of running in the browser. But they recently killed that whole feature due to disuse
epcoa
Well all the parts that you didn’t write in XML and XSLT, etc.
Disposal8433
I remember some greybeards hyping the CPUs that could run Java bytecode (they existed for a short time). I was a junior C++ fanboy at the time and I already knew that they were wrong.
brabel
Were you right because you knew something they didn't, or you were just as irrational (maybe more given you were "a junior"?) but got lucky in being stuck with an opinion that eventually turned out right?
jmyeet
There are design decisions you can reasonably question in Rust but the big one that justifies its existence is memory safety. It's simply too important. Not everything needs it but key infrastructure, most notably Web browsers, do.
I predict we will be having buffer overrun CVEs in C/C++ code for as long as we have C/C++ code.
The realities of writing safe, multithreaded C/C++ on processors with out-of-order execution, context switching, and branch prediction are simply too complex to get right 100% of the time. Rust makes writing certain code difficult because it is difficult to do. C/C++ fools you into believing something is safe because you've never encountered the circumstances where it isn't.
We have tools like valgrind to try and identify such issues. They're certainly useful. But you'll be constantly chasing rabbits.
I've seen thread and memory bugs in production code written by smart, highly-paid engineers at big tech companies that have lain dormant for the better part of a decade.
That's why Rust exists.
giantrobot
I was excited about ApplixWare Anyware Office[0] around 1999-2000. I'm pretty sure I got a copy bundled in a boxed copy of SuSE or RedHat. It was the first time I'd really seen a real productivity application written as an applet. It was an interesting idea that was eventually delivered by JavaScript.
more_corn
Agreed. I lived through it and it was seriously overhyped.
jerf
You can understand Java hype in 1997 by understanding it as selling the Java of about 2007, but Java of 1997 couldn't deliver. Both because it was a young language, and had all the problems of a young language like poor library support for just about everything, and because the hardware in 1997 wasn't ready to deliver on the hype. Even in 1997 we weren't really looking for web pages to take 60 seconds to "start up", and that could easily happen for a "Java applet" on a home computer. (Or worse, if trying to load the applet pushed the system into swap. In this era 32MB-64MB would be normal amounts of RAM and the OS, other apps, and the browser have already eaten into that quite a bit before Java is trying to start up.) And then it was fairly likely to crash, either the applet itself, or the whole browser process.
And it was just about shoved down our throats. They paid to get it into schools. They paid for ads on TV that just vaguely said something about Java being good, because they didn't really have anything concrete they could point to yet. They paid to have really bad enterprise software written in it and then jammed into schools just to make sure we had a bad experience, like Rational Rose [1]... my memory may be failing me but I think it was implemented in Java at the time, because it was a Swing app (another Java thing shoved down our throats but not ready for prime time even by the standards of 1997). I was using it as an undergrad student in 1999 or so and I could hardly click on a thing without crashing it. Not the best look for Java, though I'm sure it was not Java qua Java's fault.
Still, it fits the pattern I'm trying to show here of it being grotesquely hyped beyond its actual capabilities.
They shoved enough money at it that they did eventually fix it up, and the hardware caught up in the 2000s, so it became a reasonable choice. Java isn't my favorite language and I still try to avoid it, but in 2025 that's just taste and personal preference, not because I think it's completely useless. But I feel bad for anyone in the 1990s ordered by corporate mandate to write their servers in Java because the ads looked cool or because Sun was paying them off. It must have been a nightmare.
In fact, you can understand the entire Dot Com era hype as selling the internet of about 2007 in 1997, or in some cases even 2017. It all happened, but it didn't all happen in the "year or two" that the stock valuations implied.
derriz
> Both because it was a young language, and had all the problems of a young language like poor library support for just about everything, and because the hardware in 1997 wasn't ready to deliver on the hype.
Outside of Perl's CPAN, library support in 1997 sucked for all languages. Being able to write a hash table or linked list in C was a valuable commercial skill, as nearly every single code base would include custom versions of these sorts of very basic data structures rather than pull them from a commonly used library.
“Using a 3rd party library” meant copying a bunch of source code downloaded from who-knows-where into your source control repo and hacking it to work with whatever funky compiler and/or linker your project used.
jerf
I mean even by the standard of the time, though. The Java hype meant that if a UI wasn't written in Java, it sucked, so everything had to use the Java UI. But even as young as Windows still was at the time, the UI toolkits were much more developed than the Java ones. The Java ones looked like they were written to a bullet-point list of the minimal features they needed to shove them out the door, in a new language nobody knew, which is probably because they were. As with Rational Rose, even as a student I could ram straight into a brick wall of missing features every direction I turned. I can only imagine what a professional of that era had to deal with. Compare that with a modern student, where they may still not know how to do a given thing but their main problem is that they don't know how to find or evaluate the hundreds of options that exist.
I know that it wasn't like it is today where a casual weekend hobby project can easily pull in a few hundred libraries with just a couple of shell commands, but you still needed some things to get going. It was theoretically possible to sit down with a blank text editor and write assembly code that functioned as a GUI app, the last few dying gasps of that philosophy were around, but it's not what most people did.
rr808
Java and the JVM are actually very good. The real problem to me is the Java-enterprise way of thinking, which usually involves the Spring IoC container with too much magic that makes it really difficult to understand. Get off Spring; it's a great platform.
brabel
I've been a "mainly" Java programmer for 15+ years (I've used several other languages, but Java has remained the main one over all this time). I only did something like 1 year using Spring in one of my early jobs. So, when I saw some people online talking like "you ain't a Java programmer if you don't do Spring," I used to think they were complete idiots; I am living proof you can write lots and lots of Java and basically never encounter Spring.
However, since then I've seen several surveys of JVM programmers, and apparently Spring is used in something like 80% of Java projects, so it's not surprising a majority of people, even Java developers, think that Spring is somehow mandatory if you're a Java programmer. But of course, it's just a framework, one of many; it's just the most popular one. Can you "do JS" professionally without knowing React today? I'd think so. I guess React is about as dominant in the JS world as Spring is in the Java world.
stephenlf
> Hype is about excitement; it’s about the tantalising possibility that if you jump on board at just the right time, you’ll become part of something unprecedented and maybe end up rich and famous along the way. In the late 90s and early 2000s, a lot of people did exactly that – and, yes, many of them used Java along the way, and a fair few of those got rich and famous by getting in right at the beginning, and getting out before anybody realised their idea was never going to work
Did you just look three years into the future and write about the GenAI hype?
binarymax
I will die on the hill that Java is inferior because it doesn’t have native support for unsigned numerics.
layer8
I used to think that way too, but there's a good argument to be made that overflowing your integer types to negative values instead of to small (or large) positive values avoids a lot of silent bugs you'd otherwise have with unsigned types. A language solving that would need to work with static proofs of non-overflow, and have any desired overflow be explicit.
Java in the meantime has gained all the unsigned operations as methods in the Integer and Long classes, so the relatively rare cases when you need them are straightforward to handle.
The only real annoyance is that byte is signed. At least there’s a bit of unsigned support in the Byte class now.
Lastly, minor point, Java actually has an unsigned 16-bit integer type, called char.
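A quick sketch of the unsigned helper methods referred to above, which have been standard on `Integer`/`Long` since Java 8 (the class name here is made up for the example):

```java
public class UnsignedDemo {
    public static void main(String[] args) {
        int x = -1; // bit pattern 0xFFFFFFFF

        // Reinterpreted as unsigned, that bit pattern is 4294967295.
        System.out.println(Integer.toUnsignedString(x)); // 4294967295
        System.out.println(Integer.toUnsignedLong(x));   // 4294967295

        // Unsigned comparison: 0xFFFFFFFF > 1 when viewed as unsigned.
        System.out.println(Integer.compareUnsigned(x, 1) > 0); // true

        // Unsigned division ignores the sign bit.
        System.out.println(Integer.divideUnsigned(x, 2)); // 2147483647

        // And indeed, char behaves as an unsigned 16-bit integer.
        char c = 0xFFFF;
        System.out.println((int) c); // 65535
    }
}
```

So the unsigned interpretations are available when needed; what Java withholds is distinct unsigned *types* that could be mixed with signed ones.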
pron
Other than unsigned types for FFI or wire formats (which Java supports just fine) or for bitfields (which Java doesn't have), what do you want unsigned numerics for?
The risk of unsigned types (even without the C or C++ issues around mixing with signed types) is that too many people make the mistake of using them to express the invariant of "a number that must be positive", which modular arithmetic types are a really bad fit for.
One possible use is for a memory-efficient storage of small positive values, say in a byte. But then you have to make a choice between forcing the value into a signed type for arithmetic (which Java easily lets you do with Byte.toUnsignedInt) and allowing signed and unsigned types to be mixed.
rf15
I'm doing Java for my main work, and boy, this still doesn't sit right with me after decades in the space. Just give me my properly unsigned bytes, please.
pron
If x is a "properly unsigned" byte that has the value 1, what is the value of `x - 2 > 0` and why?
The choice of having unsigned types or not is always one of the lesser evil, and in a language where emitting signals directly to hardware ports is not a primary use case, the argument that not having these types is the lesser evil carries a lot of merit.
hashmash
The problem with having unsigned integer types is that it introduces new type conversion issues. If you call a method that returns an unsigned int, how do you safely pass it to a method that accepts an int? Or vice versa? A smaller set of primitive types is preferred, since it has fewer conversion issues.
Unsigned integer types are only really necessary when dealing with low-level bit manipulation, but most programs don't do this. The lack of unsigned integers makes low-level stuff a bit more difficult, but it makes the language as a whole much easier. It's a good tradeoff.
TuxSH
> If you call a method that returns an unsigned int, how do you safely pass it to a method that accepts an int?
Mandate two's complement be used.
> Unsigned integer types are only really necessary when dealing with low-level bit manipulation
They also give one more bit of precision, useful when dealing with 32-bit integers (or below)
binarymax
Literally every other language with unsigned types handles this just fine?
hashmash
I guess it depends on what "just fine" means. What happens when a conversion is applied? Is there silent data corruption (C), or is there an exception (Ada, D) or perhaps a panic (Rust, Zig)? Is the behavior dependent on compiler flags?
Keeping unsigned integer types out of the language makes things much simpler, and keeping things simple was an original design goal of Java.
pron
By no means do C or C++ handle unsigned types just fine. In fact, they're widely recognised as a source of problems even by those who think they're useful (when used carefully).
liampulles
I started out as a Java dev. I came in around Java 8, and I loved using the streams API (still do); trying to stream through everything is just enormous fun. And I loved adding all sorts of Spring magic to make programs do all sorts of fun things. And I loved trying to use all sorts of package-private and protected stuff to define an intricate architecture, and making complex generic utilities to solve any two variations of an implementation.
And then, of course, I woke up and smelled the roses, and realized the mess I was making.
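For readers who came to Java later, the streams style being described looks roughly like this (the data is made up for the example):

```java
import java.util.List;
import java.util.stream.Collectors;

public class StreamsDemo {
    public static void main(String[] args) {
        List<String> names = List.of("ada", "grace", "alan", "barbara");

        // Filter, transform, and collect in one declarative pipeline --
        // the Java 8 style that invites "streaming through everything".
        List<String> aNames = names.stream()
                .filter(n -> n.startsWith("a"))
                .map(String::toUpperCase)
                .collect(Collectors.toList());

        System.out.println(aNames); // [ADA, ALAN]
    }
}
```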
lordleft
The fact that Java is still a go-to language for many companies, including technically sophisticated FAANGs like Google and Amazon, speaks to its robustness and utility. It’s a great language with staying power.
murukesh_s
Java is still the only go-to language for almost all of the Fortune 100 (or 500) companies, other than perhaps .NET. No other languages, including Go, Rust (difficult to get devs from consulting companies), Python, or TypeScript (considered inferior by enterprise backend dev bros), are being used for building core backend APIs. Almost all the devs for these large enterprises are outsourced from large consulting companies like Infosys, Accenture, TCS, Wipro, etc., and all of them are still doing Java. I know it from working in large banks and later trying to sell a non-Java platform to these companies and failing just because it was not written in Java.
Also, most of the large enterprises need distributed transactions, as they use multiple databases and message queues in a monolith architecture, and no other language has the lib/tooling that can beat Java.
zozbot234
> Java is still the only go-to language for almost all of the fortune 100 (or 500) companies other than perhaps .Net
One factor in that choice is that Java can run seamlessly and with official support on mainframe and midrange compute platforms that are still quite popular among Fortune 100 and 500 companies. (Many of them are building "private clouds" as a replacement but it's a slow transition.) While you might be able to get other languages to run, sticking to Java is a broadly sensible choice.
joshdavham
> No other languages including Go, Rust (Difficult to get devs from consulting companies) […] are being used for building core backend APIs.
Could you elaborate a bit further? People at consulting companies don’t use Go or Rust? Also, do these top Fortune companies recruit from consulting companies often?
murukesh_s
Yup, I was also in for a surprise. I waited a decade for Java to be dethroned but no.
https://digitalcareers.infosys.com/infosys/global-careers?lo...
Just search for Rust or Golang and you can see why. Infosys employs 350,000 employees and almost all of them work for Fortune 500 companies. There is not a single Rust or Go opening from what I can see. Go and Rust did not even make it into the dropdown.
> top Fortune companies recruit from consulting companies often
If you have worked in large banks, pharma, automobile (IT), or FMCGs, you know. There will be a couple of middle managers onsite (i.e. in the US) and the rest of the devs, often hundreds of them, are located offshore (Asia/South America).
bitwize
By "consulting companies" he means indentured-servitude shops that rent out programmers by the hundreds to large companies and even governments. You know, like Deloitte or Accenture.
ynzoqn
> it was used throughout my degree course, right up to the final year module on programming language design where our coursework assignment was to build a Scheme interpreter – in Java.
It sounds good.
nurettin
Today some of the most common development tools, like PyCharm, Android Studio, and DBeaver, are Java programs. Your Java programs will run on the most obscure platforms (AIX, AS/400) as promised on the package. So despite all the hype and sales tactics, they must have done something right.
jmyeet
I suspect many today don't fully appreciate the context of Java in the 1990s and how different the outlook was. A lot of what we take for granted now wasn't even imagined, and even if it was, it wasn't certain. Java was hyped on three fronts:
1. Desktop applications;
2. Server applications; and
3. Browser applications.
We had more platforms then. On the desktop front, Mac was in decline but still existed and was strong in certain niches. On the server front, there were many UNIX variants (e.g. Solaris, HP-UX, Digital UNIX). Cross-platform really was a big deal and much more relevant.
We still had desktop apps then. Being able to write a Swing app and run it "everywhere" was a big deal. Otherwise you were writing things in the likes of Visual Basic (and thus Windows only), or I don't even know what Mac used at the time.
On the server, this was the very early days of Web servers. Netscape still existed and sold Web server software. The most common form of HTTP serving was CGI, which had a significant start-up cost per request. There were other solutions like ISAPI/NSAPI. Java brought in servlets, which were persistent between requests. That was massive at the time.
It creates problems too but it's all tradeoffs.
And the last is Web applications. This was the newest area and had the most uncertain future. Java applets were pushed as a big deal and were ultimately displaced by Macromedia (then Adobe) Flash, which itself is now (thankfully) dead and we have Javascript applications. That future was far from certain in the 1990s.
I remember seeing demos of Java applets with animations and Microsoft doing the same thing with a movie player of all things.
Single-page applications simply didn't exist yet. If you wanted that, and honestly nobody did, it was a Java applet or maybe a Flash app. The Web was still much more "document" oriented. Serve a page, click something, serve a page for that, and so on.
I still wrote this form of Website in the 2000s and it could be incredibly fast. PHP5+MySQL and a load time sub-30ms. You could barely tell there was a server round trip at all.
So Java still exists, but almost entirely in the server space. It's probably lost ground to other platforms like PHP, Python, Node.js, etc. But it absolutely changed the direction of tech. Java has left a lasting legacy.
I would go as far as saying that Java democratized the Internet. Prior to Java, every solution was commercial and proprietary.
jasode
The word "hype" is being used in 2 different ways.
Definition #1 is about Java features: the original "Java is criminally underhyped" essay by Jackson Roberts is talking about "not over-hyped" in terms of Java's technical capabilities, such as types, the package manager, etc. E.g. Java has types, which JavaScript/Python do not, and typing is a positive thing to help prevent errors; therefore, Java is "underhyped". The particular language capability not being used as much as the author thinks it should be is the basis for defining what "hype" is.
Definition #2 is about Java's marketplace effect: the "overhype" of Java in the 1990s was extrapolating and predicting Java's effect on the whole computing landscape. This type of "hype" is overestimating the benefits of Java and making bold proclamations. Examples:
- Java and JVM's WORA "Write Once Run Anywhere" will kill Microsoft's Evil Empire because it will render Windows irrelevant. (This didn't happen; MS Windows still has 70+% desktop market share today. 30 years later, Microsoft is one of the top 3 tech companies with a $3+ trillion market cap, while Sun Microsystems was acquired at a discount by Oracle.)
- Java will make lower level languages with manual memory allocation like C/C++ obsolete because CPUs are getting faster. Let the extra "unused" cpu cycles do automatic garbage collection in Java instead of C programmers manually managing memory with malloc()/free(). (C/C++ is still used today for games, and tight loops of machine learning libs underneath Python.)
- Java plugins will enable "rich" web experiences. (It turns out that JavaScript, and not Java plugins, won the web. Java also didn't win on desktop apps; JavaScript+Electron is more prevalent.)
That's the type of overhype that Java failed to deliver.
Same situation with today's AI. Some aspects of AI will absolutely be useful but some are making extravagant extrapolations (i.e. "AI LLM hype") that will not come true.