20 years: Happy Birthday RHEL

20 years is an aeon in the IT world. Hardly anyone can remember the names of the vendors and systems that defined IT at the end of the 1990s. Sun, Compaq, Tandem and DEC were the dominant system vendors of the day, and “cloud” and “container” still belonged to the worlds of weather forecasting and logistics. VMware had dabbled in virtualization, but only on the desktop. “Linux existed back then, but it was an exotic fringe phenomenon. It was sold on floppy disks, mostly to hobbyists,” recalls Jan Wildeboer, evangelist and Linux veteran at Red Hat. But then more and more companies began to take an interest in Linux.

Jan Wildeboer has worked at Red Hat for many years as an EMEA open source evangelist, covering all aspects of free software.

But they didn’t just want the software; above all, they wanted support. “They needed a roadmap that stretched over many years, so we had to change our entire business model,” says Wildeboer. Suddenly it was no longer product sales that mattered, but long-term support. That was the birth of RHEL and its subscription model. The first version was released in 2002, still under the name “Red Hat Advanced Server”, and it started straight away with version number 2.1, because Red Hat believed that “1.0 and .0 are always considered dubious”. Ease of use was still very limited at the time: users had to install software package by package, so an application that required a handful of RPMs had to be set up manually, often in the correct order, and without any dependency management. Only a few releases later did Yum take over this job, so that from then on all packages could be updated with a few commands.
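The difference is easiest to see side by side. A rough sketch of the two workflows (the package names and versions here are purely illustrative, not taken from an actual RHEL release):

```shell
# Early RHEL: install every RPM by hand, in dependency order.
# Each rpm -ivh call fails if a required library is still missing.
rpm -ivh zlib-1.1.4-8.i386.rpm
rpm -ivh openssl-0.9.6b-3.i386.rpm
rpm -ivh httpd-2.0.40-8.i386.rpm     # only works once the above are in place

# Later, with Yum: dependencies are resolved automatically...
yum install httpd
# ...and the whole system can be updated with a single command.
yum update
```

The `rpm` tool itself never resolved dependencies; it only refused to install when they were unmet, which is exactly the gap Yum closed.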

One of the biggest problems in the early days was building enough trust among developers and IT managers. That meant testimonials, but also comprehensive information such as a list of certified hardware on which Linux ran stably. Even more important, however, was keeping the promise of a long support period: RHEL 2.1 was supported until May 2009, around seven years. That was revolutionary back then!

On the application side, Linux was initially used mainly for new applications, above all the then-emerging web servers, while high-performance workloads remained the domain of expensive UNIX systems on proprietary hardware.

Only with more powerful x86 servers did Linux conquer the data center. In the process, a new problem emerged: the systems were barely utilized. This led to server virtualization in the mid-2000s, where Red Hat first experimented with the Xen hypervisor before settling on the Kernel-based Virtual Machine (KVM). Virtualization opened the door to the next big evolutionary step: cloud computing. The public cloud as we know it today only came about thanks to the potential of Linux: no expensive licenses, constant innovation and the ability to run on standardized hardware. RHEL had a major advantage here: it ran on bare metal, virtualized or in the cloud, making it a lingua franca for running applications across all footprints, or rather, across the entire hybrid cloud.

But VMs brought new problems of their own, because starting up a VM takes a comparatively long time. The solution: containers! With RHEL 7 in 2014 came the corresponding breakthrough in the form of Linux containers. Popularized by Docker, Linux containers weren’t entirely new, but they suddenly became much more user-friendly. Applications could now be developed on a laptop and then fed into CI/CD pipelines.
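That laptop-to-pipeline workflow boils down to three commands. A minimal sketch (the image name, port and registry URL are hypothetical, and a Dockerfile is assumed to exist in the current directory):

```shell
# Build the container image locally on the laptop
docker build -t myapp:1.0 .

# Run it for a quick test; it starts in seconds, no VM boot required
docker run --rm -p 8080:8080 myapp:1.0

# Push the identical image to a registry, where the CI/CD
# pipeline picks it up and deploys it unchanged
docker tag myapp:1.0 registry.example.com/myapp:1.0
docker push registry.example.com/myapp:1.0
```

The point is that the exact same image runs on the laptop and in production, which is what made containers such a natural fit for CI/CD.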

A lot has changed in the IT world over the past 20 years. Open source has completely taken over the server landscape; proprietary hardware and software survive only in niches. Operating systems are no longer just the small piece of software that talks to the hardware, but the comprehensive foundation of the entire IT infrastructure. This also applies to RHEL. Red Hat Insights has been added, which helps make system administrators’ work easier, and Ansible playbooks can now automate many tasks that are mundane and prone to human error. Despite the greater complexity, system management has become simpler, faster and more standardized.
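A typical mundane, error-prone task is routine patching. A minimal playbook sketch of what such automation can look like (the `webservers` host group and the choice of tasks are illustrative assumptions, not from the article):

```yaml
# patch.yml - hypothetical example: patch a host group and restart httpd
- name: Routine patching
  hosts: webservers        # assumed inventory group
  become: true
  tasks:
    - name: Apply all available package updates
      ansible.builtin.yum:
        name: "*"
        state: latest

    - name: Restart the web server after patching
      ansible.builtin.service:
        name: httpd
        state: restarted
```

Run with `ansible-playbook patch.yml`, the same steps are applied identically to every host in the group instead of being typed by hand on each machine.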

And IT development will remain dynamic. Quantum computers are already knocking on the data center’s door, and suitable applications are already being simulated. So it remains exciting!

More from iX Magazine
