Before I get to the point, I want to make one thing clear: my intention is not to start a war or drama. What I am expressing comes not from spite or animosity towards instructors, lecturers, senior engineers, etc., but from the differences I observed between the UNIX lectures on YouTube and what I found through personal experience and what I learned from the original creators through books, archives, etc. The point of this blog is to encourage deep dives in engineering, especially in India (my homeland), where the playful involvement with machines is somehow considered unfavourable (or at least that's how it feels to me).
From what I'm able to discern, most people teaching about UNIX in lectures (especially online ones) do not know UNIX. Sounds ironic, doesn't it? This is the hardest part for me to articulate, as I want to be unapologetic without being condescending, but I'll do my best. The videos my issue comes from are of two kinds:
For the latter category, I'm writing this blog because I appreciate the effort but disagree with the direction taken. The problem usually stems from comparisons between Windows and Linux: there are thousands of videos getting either the chronology or the context wrong, but none of them are worse than the two things I'll be discussing here.
I'm calling this one a minor gripe, as most people get the macro architecture right but the details wrong. It may sound like nitpicking, because all it takes is repositioning a few elements to correct the architecture. To my understanding, this is what the Unix architecture looks like:
As opposed to the more popular one, here:
If the second diagram were true, I should have a dysfunctional Linux build, since I don't have comp, vi and many more, yet it works. Why is that? The reason is that commands aren't part of the shell; they're applications residing in user-land. If this sounds weird to you, don't worry, because like most Unix lecturers, you probably don't live in UNIX either. I'm not being snarky, it's simply true: very few people consider using Linux or any Unix-like OS as their daily driver, because they aren't the de facto choice of the desktop market.
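You can verify this yourself in any shell; here's a minimal sketch (the exact paths will vary by distribution):

```shell
#!/bin/sh
# Commands like ls are ordinary executables sitting in user-land,
# not features of the shell; 'command -v' prints where the binary lives.
command -v ls    # prints a path such as /bin/ls or /usr/bin/ls
# Builtins like cd, by contrast, have no binary of their own; the shell
# implements them itself, which 'type' reports.
type cd          # reports something like: cd is a shell builtin
```

If a command resolves to a path, it's a user-land program; if it doesn't, it's part of the shell itself.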
On comparing both diagrams, you'll notice nothing drastic has changed, but this still matters, because most new Linux users confuse the command prompt, the terminal and shells like bash, zsh, etc. Most people think these three are one and the same, but they're actually different components working in harmony. Now it's time to talk about the major gripe I have with the online YouTube videos.
Well, I said that they get the context wrong, but I didn't explicitly mention that as my major gripe. The major gripe lies not just in the wrong context but also in not discussing the underlying philosophy and design practices that made Unix stand the test of time. For the uninitiated, UNIX development started in 1969 and it was released in 1970. Even if we take 1970 as the year of Unix's birth, that's over 50 years! No other OS to this day is as widely discussed as Unix and its derivatives like Linux, BSD and especially the GNU project, which led to free software, from which the modern open-source movement started. I genuinely feel bad for those missing out on the cultural side of engineering, which is as fascinating as, if not more than, the computers themselves.
After using Linux throughout my entire CS undergrad and watching such videos, I have come to a conclusion.
Most people teaching about UNIX don't know about it because they neither lived in it nor internalised the UNIX philosophy.
It's a no-brainer that most people use Windows on the desktop, which has a radically different architecture from Unix. That's why most see little utility in Unix, almost to the point where I've seen absurd claims that you can't use it to get things done in the real world. I find that absurd because I can do all my daily tasks on Linux, which is a Unix clone. While this may sound like gatekeeping, it is worth noting that to design things well, we must look at good designs as guiding examples. In the case of OS and software design, Unix marks the spot, as it solves the most primal problems in software development and expands from there.
This one is covered the least, despite people mentioning the year of origin and crediting the original creators (I can't say the same about Linux, as a good chunk of the work was done by Stallman for his GNU project, which includes the GNU coreutils, GCC, Emacs, etc.). In all of these videos, the thing I see being missed is the "why" behind the development of Unix that keeps it relevant even today. The most plausible videos say the goal of Unix was to be a multitasking (or timesharing), multiuser OS, but that's only half the truth. Multics was already built before Unix and doing well for its time, so there had to be something else. And that's where the real reason for the creation of Unix comes in.
Unix was built not just to be a timesharing, multiuser operating system but to be a highly productive one: by keeping the operating system simple, modular and minimal, its creators improved long-term maintenance and made it possible to build more programs and services without duplicating the initial effort.
This is something I also observed during my internship at Amazon, though I couldn't discern it at first, and I'm not proud of that. The products you use from Amazon aren't monolithic entities but rather collections of several smaller services working in tandem. I guess this is also why candidates with Unix knowledge are preferred: the reason goes beyond shell scripting.
For those who aren't into Unix: the Unix philosophy is the software design philosophy that directs how software in the Unix environment is built. Unlike most software design paradigms, it was articulated after implementation, which makes it an informal and pragmatic philosophy rather than corporate higher-ups dictating what the solution is supposed to be. There are many takes on it, which you can read in the book "The Art of Unix Programming" by Eric S. Raymond, one of the founders of the open-source movement, but I'll use the summary from Doug McIlroy, the inventor of Unix pipes.
This is the Unix philosophy: Write programs that do one thing and do it well. Write programs to work together. Write programs to handle text streams, because that is a universal interface.
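All three rules can be seen at work in a single throwaway pipeline; the input sentence here is just an illustration:

```shell
#!/bin/sh
# Four single-purpose programs cooperating over plain text:
# tr splits words onto lines, sort groups identical words together,
# uniq -c counts the duplicates, and a final sort ranks by frequency.
printf 'to be or not to be\n' | tr ' ' '\n' | sort | uniq -c | sort -rn
```

Each stage neither knows nor cares what produced its input or what will consume its output; the text stream is the only contract between them.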
Using the same philosophy, I built the personal binge-watching service mentioned in my blog on CLI software, which, frankly, I use daily to listen to my favourite synthwave and vapourwave mixes. But before I tell you how it works, I want to show you the script to get you up to speed.
#!/bin/sh
loc="$HOME/youtube-playlists"
# ls lists the playlists, dmenu picks one, cat reads its URLs, mpv plays them.
# ($(< file) also works, but it's a bash/ksh-ism; cat is portable under /bin/sh.)
mpv --ytdl-format=22 $(cat "$loc/$(ls "$loc" | dmenu -l 10)")
The script uses three programs: ls, dmenu and mpv. All three work independently but take part in my script on invocation, and this is how it works.
The shell in Unix facilitates the following things: invoking programs, piping one program's output into another's input, substituting a command's output into the command line, and redirecting streams to and from files.

And that's how they work in harmony. First, ls is called to list the playlist files, and its output is piped into dmenu. Once a file is chosen, the shell reads its contents and splices them into mpv's arguments via command substitution.
Notice how the shell is the one dealing with the input and output of the different programs, effectively stringing them together and making them work as one entity. This is the gist of shell scripting and of the independent nature of programs following the Unix philosophy.
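If you want to try the same wiring without dmenu or mpv installed, the pattern reproduces with standard tools alone (the file name and contents below are made up for illustration):

```shell
#!/bin/sh
# sort orders the lines, head picks the first, and the shell feeds the
# result onward via command substitution - the same composition pattern
# as the playlist script, built from everyday tools.
printf 'cherry\napple\nbanana\n' > /tmp/fruits.txt
first=$(sort /tmp/fruits.txt | head -n 1)
echo "first alphabetically: $first"    # prints: first alphabetically: apple
```

Swap any stage for another program that reads and writes text, and the rest of the pipeline is none the wiser.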