Linux
=====================

Linux Tutorials, Notes, and Reflections.

What is Linux?
---------------

Linux is a product of open-source initiatives, old-school Unix inspiration, and a kernel originally authored by Linus Torvalds. Linux itself is not an Operating System. Rather, Linux is a kernel that people build their "distributions" around. A distribution is a curated collection of tools that supports the Linux kernel. Companies and organizations publish their Linux distributions and may offer service contracts or charge licensing fees for expanded use and/or production features.

Linux-based distributions are great! Working at a Linux workstation has taught me to explore any and all use cases with Linux. With Linux distributions, you can get practically anything working a lot faster than sitting on a Windows machine. For people who can afford cloud resources, it may be better to run Linux on a public cloud such as Amazon's EC2 or Google's GCP, but I enjoy sitting at a machine while it does my work.

What about Unix and the BSDs?
------------------------------

Unix was developed by Dennis Ritchie and associates at Bell Laboratories in the early 70s. Unix gave us the concept of a monolithic kernel, the idea that everything is a file, and the essential tools for editing files efficiently. Many of the ideas for Unix came from the team's work on a system called Multics. Bell Labs' work on Multics inspired a desire for a more robust, interactive, and sensible computing experience. Operating Systems were highly proprietary, and there wasn't a product that did what they needed. Bell had a government-blessed monopoly on the telephone industry, so developing an adequate Operating System was something they had resources for.

In the 70s, organizations had a centrally located computer system with many small, resource-constrained teletype terminals talking to it over networking devices. Early Unix was developed under this model. The PDP-11/20 that ran early Unix would fill a small closet with its attached tape decks. Systems were super expensive. Everyone logged into the central computer to do their work.

Unix came about because the engineers at Bell Labs realized that they could implement their own, more sophisticated system on the PDP-11/20, a 16-bit computer from Digital Equipment Corporation. Bell developed tools that made creating software less cumbersome: tools like `sed`, `awk`, and `grep`, which are still power tools in the programmer's toolbox.

As the 80s marched on, computers got smaller, cheaper, and mass-produced by IBM. With Unix and a spendy computer, a person could do work independently on their own desktop system. Dedicated access was now possible. Interactions between Unix workstations on the network created a rich ecosystem for doing work. Having multiple computer systems meant that people could work on software in parallel.

Work at Bell Laboratories also gave us the C programming language: the same language in which we build most Operating Systems today. Modern programming languages are often built on top of binaries that are compiled from C. For example, the most widely installed Python implementation is CPython, the reference implementation of Python written in C.

However, the story of Unix doesn't end at Bell Labs. Bell Labs began marketing their Operating System to organizations, and Unix started to dominate the Operating System market: it was a killer collection of software that added tremendous value to the expensive and limited compute resources of the time.
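Those Bell Labs power tools, `sed`, `awk`, and `grep`, still earn their keep on any modern Linux system. As a quick, minimal illustration of the kind of one-liners they enable (`access.log` here is just a made-up file name, not something from these notes):

.. code-block:: bash

   # Show every line that mentions "error", ignoring case.
   grep -i "error" access.log

   # Print only the first whitespace-separated field of each matching line.
   grep -i "error" access.log | awk '{ print $1 }'

   # Substitute "warning" with "WARNING" and print the edited stream;
   # the file on disk is left untouched.
   sed 's/warning/WARNING/g' access.log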
As computer systems became more prevalent at universities, the University of California at Berkeley cut a deal with Bell Labs to purchase the source code for Unix, and the Berkeley Software Distribution (BSD) was born. Bell Labs and Berkeley worked out a licensing agreement such that BSD could be redistributed to other research universities. The paths of Unix and various BSD forks cross several times over the decades. When someone came up with a better solution to a problem, much of their solution could be shared with others. People shared source code with each other by dialing into remote computers over telephone infrastructure, logging in, and committing code to a version control system: essentially the same paradigm we develop code with today, just with SSH in place of a modem.

On Distribution Preference
---------------------------

For a workstation/desktop system, I like to have lots of options for window managers, good driver support, and ample troubleshooting documentation. There are a ton of good options nowadays, but most are based on Debian or Ubuntu (itself a derivative of Debian). Red Hat-family systems have good desktop options as well, and I use both Fedora and Rocky Linux as desktop systems.

My preference for Debian lies in my faith in the Debian community. Debian has demonstrated to me over the years a solid commitment to stability, security, and ease of use. Ubuntu builds on Debian, adding a focus on its Snap packaging and sandboxing platform (snapd). Ubuntu also has a major focus on accessibility features that I appreciate.

For a headless server, I am usually focused on getting something specific installed. I consider compatibility with my use case, and I go with an option that I like that fits the software requirements. For example, to leverage someone's container orchestration framework, I might need a Red Hat-based Linux, so I might run Rocky. If I want to try "devstack", I'll want to run Ubuntu, since that's the platform it primarily supports. I like using Debian whenever possible. If Debian isn't a viable option, I'll try Rocky. If Rocky isn't going to work either, I'll try Ubuntu to see if it's any more compatible than Debian. Ubuntu has different enterprise customers than Red Hat, so their priorities differ.

Working with many different Linux distributions has taught me that Linux's strength lies in how many different groups are working on their own version of the solution. One system simply cannot be the end-all, be-all for every use case. Trying to do everything leads to not doing anything exceptionally well, so we need to be okay with specialization, levels of complexity, and developing systems that facilitate work between disparate systems.

Before you install an operating system, take the time to consider how to get the most out of your hardware. While it's tempting to latch onto one ecosystem for dear life, exploring all the options lets you pick the right tool for the job. Right tool, right job.

Dual-boot Lifestyle
--------------------

I work, study, and game with Linux, but I share computers with a household of people who prefer a Windows environment for gaming and casual use. I need Windows on anything I own with a GPU, but I have learned to enjoy the freedom Linux distributions offer for the desktop use case, so dual-boot it is!

My first dual-boot
____________________

My first experience with Linux was when I tried a live-boot Fedora CD-ROM. I tried out the desktop interface, and it took me a few days of exploring it to decide it was something I would want in a dual-boot configuration.
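If you're sizing up a machine for a dual-boot today, it's worth looking at the existing disk layout from the live session before touching anything. A minimal sketch with standard tools; `/dev/sda` is only a placeholder for whatever disk is actually in the machine:

.. code-block:: bash

   # List block devices, their sizes, filesystems, and mount points.
   lsblk -o NAME,SIZE,FSTYPE,MOUNTPOINT

   # Show the partition table of the disk you are thinking about shrinking.
   sudo parted /dev/sda print

   # Check how much free space each mounted filesystem has.
   df -h

That tells you which partition belongs to Windows and how much room you realistically have to carve off.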
By installing the Operating System alongside Windows, I could play around in Linux when I felt like doing smart stuff, and Windows would still be there for running World of Warcraft: the best of both worlds. I booted back into Windows XP and shrank my C:\\ partition. Then I booted off of my CD and set about installing Linux on the 20 GB I had carved off of Windows. Having a permanent installation meant that I could tackle problems with the configuration, and my changes would survive a reboot. Eventually, I even got the 1440x900 resolution to work! It was neat, but at the time I was more into playing World of Warcraft on Windows than exploring Fedora.

How I Use Dual-Boot Today
__________________________

Nine times out of ten when I sit down at my laptop, I will want to run Linux, but it's nice to keep Windows happy for everyone else to enjoy use of the hardware. I have a dual-boot on my Dell gaming laptop from 2017. Windows 10 still has about six months of updates left, so it's all about getting some Windows gaming use out of the machine before we have to invest in a replacement. Dual-boot gives me the option to play games on Windows while still having a full Linux environment available when I need it.

While I love me some Linux, gaming is a first-class experience on Windows. Linux has come a long way in supporting big game titles, but some things require additional tools like Wine. Steam's compatibility features are fantastic, but they may not work exactly as you'd hope on every title. I am very optimistic about Linux as a gaming platform. While native Linux support would be great, the progress Valve has made on Proton, its Wine-derived compatibility layer, inspires hope that a 100% Linux gaming system could be liveable. Even when you get things working, though, you have done a lot of fiddling with your neat Linux system just to run something that "just works" on Windows. If I have a GPU and a Windows-compatible machine, I'd rather play games on Windows any day. The whole TPM compatibility fiasco with Windows 11 means that a lot of machines are being left out in the cold too early in life.

.. toctree::
   :maxdepth: 2
   :caption: Contents:

   Chroot.md
   how_to/Ed.md