Date of review: 28-April-2000
Type Of Review: Articles/Editorials
Services, services, services.
In particular, network services.
In January 1996, I did not yet know that Unix had grown up with the Internet, nor did I understand the standards that made the Internet possible. I routinely used telnet, ftp, and netscape, and I knew that web servers were available on Unix machines as on Windows, but I did not know how they worked. In short, I was a dumb user.
Just so I could peek under the hood and perhaps even mess with the engine a little, I desired to set up my own web server, on my own machine, up close and personal where I could mess with its config file and "master the damn Internet", whatever species of elephant it was.
And of course it took far more than 21 days for that. I told myself that if I was lucky, I would actually learn something about the Internet by the time I graduated.
So, in May 1996, after the second semester examinations had ended (which marked the end of my freshman year), and just before the special semester was to begin (which marked the Middle Of Nowhere), I did not move back home. Instead, I only went home during the weekends. (How unfilial of me! For the journey home took only an hour and a half (how big is Singapore anyway?). If I had to do this over again, I would probably have made the pilgrimage home to touch the will and give my parents a hug or two more often -- and maybe finish a few more books along the way.)
Actually, I had an official excuse not to move home. I was to work full-time on my USRP (Undergraduate Science Research Programme) project, about the efficacy (or not) of using recurrent cascade-correlation neural networks to recognize the frequency spectra of spoken chinese vowels. No big deal, but at the time a tall order for me.
And when the Middle Of Nowhere began, I was to do USRP in conjunction with a short introduction to the field of artificial intelligence. (If it isn't obvious by now, I liked to torture myself, overloading myself in an attempt to discover how much I could take. And when I discovered that I couldn't take it anymore, I'd go right on and overload myself some more, slowly working towards an existential crisis that hopefully would teach me something more about myself.)
But I didn't actually work on my academic commitments much (or enough) during that time. In hindsight, I was already burning out, but I did not want to admit it, so I just went on with life. ("Heck, this is only my freshman year! What do I mean, burning out?")
And so, during the brief interim between Freshman Year and Nowhere, instead of taking a break and going on holiday like almost everybody else, I attempted to figure out how to set up a web server on my machine.
And it was a machine which happened to be running a network-aware operating system with all manner of Internet services installed on it by default. Nevermind that I had no idea what they did and how they did it; I would figure that out little by little.
Only I didn't have an actual network.
By that time, I had been using the Internet long enough to know that localhost meant the local machine, and I should have been able to reason that you don't need a real network for a machine to connect to itself. But I instead discovered it by accident.
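The loopback idea is easy to demonstrate in a few lines of Python (a sketch with modern tooling, certainly not something I had in 1996): a listener and a client on the same machine can talk to each other with no real network at all.

```python
import socket

# A listener and a client on one machine: no real network required,
# only the loopback interface (127.0.0.1, i.e. localhost).
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]

client = socket.create_connection(("localhost", port))
conn, _ = listener.accept()
client.sendall(b"hello, self")
received = conn.recv(64).decode()
print(received)                   # hello, self

client.close(); conn.close(); listener.close()
```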
It went something like this.
Because my 486 with 16MB RAM wasn't exactly having a wonderful time with big programs like Emacs and Netscape running alongside the X-server, I would open multiple lynx sessions each in a little xterm, all running on one of the machines in campus, but with the user interfaces exported to my 486 via the modem link.
Now, how did that work?
Initially, I had used a program called minicom to do the dialup and another program called term to manage the traffic over a pty (which I had seen Satra do earlier), but in later years switched to SLIP when this became popular and finally to PPP, which is more efficient and used all over the world today.
Essentially, SLIP and PPP get the modem link to speak some variant of the Internet protocol (term was a precursor to SLiRP), and the X-windows system does the rest, allowing a program to run on one machine while its graphical user interface appears on a different machine somewhere else.
Since this feature was provided by X-windows itself, it followed that each and every Unix program that used a GUI provided it too. No two ways about it. If a Unix program wanted a GUI, it used the X11 libraries. If it used the X11 libraries, it was network aware in this manner. I was impressed by the system architects at MIT, where the Unix graphical user interface was first designed and built.
(I would later discover that if you actually tried to program in X11, "impressed" wasn't exactly the right word. But that has already been addressed by the people behind Qt/KOM and Gtk/GNOME.)
X-windows allowed me to run web browsers on a nice big campus machine by the dozen while interacting with all those programs on my own 16MB 486 in the comfort of my room, almost as if they were actually running on my 486. Each application on my screen could in fact be running on some arbitrary host somewhere out in the Internet, sucking RAM or computational resources like crazy on the remote machine but requiring only a token sum at my end to support just the interface.
Lynx was in fact lightweight enough for me to run several copies locally, but that would require sending all that verbose HTML code across my narrow 14.4 modem link. Running them remotely and just sending the parsed ASCII across was significantly faster. It was in fact the fastest way I could surf the web over a lousy modem link, and using this method I could surf several sites in parallel with minimal overhead incurred on my own machine.
Here's what I'd do:
(in an xterm on my machine)
megadream $ xhost +sununx.comp.nus.edu.sg
megadream $ telnet sununx.comp.nus.edu.sg
sununx $ finger `whoami` # find out where sununx thinks I am
On since ... from xypmx-yy.comp.nus.edu.sg
sununx $ export DISPLAY=xypmx-yy:0 # send the X11 GUI over
sununx $ xterm -e lynx http://www.some.domain &
# repeat last command until golden brown
After a brief delay, the modem lights would flash the tell-tale signs of an incoming xterm, which I'd use to browse the site.
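What DISPLAY actually encodes is just a network address. Here is a simplified sketch, my own illustration rather than real X client code (real clients also handle Unix-domain sockets and screen numbers), of how a client turns DISPLAY into a TCP endpoint:

```python
def parse_display(display):
    """Map an X11 DISPLAY string "host:display[.screen]" to a TCP
    endpoint: X servers listen on TCP port 6000 + display number."""
    host, _, rest = display.partition(":")
    number = int(rest.split(".")[0])
    return (host or "localhost", 6000 + number)

# the value exported on sununx above:
print(parse_display("xypmx-yy:0"))     # ('xypmx-yy', 6000)
print(parse_display("localhost:1.0"))  # ('localhost', 6001)
```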
Now, because sununx is also www.comp.nus.edu.sg, I would sometimes point lynx at http://localhost in order to read the department's home page.
Being human, it wasn't long before I accidentally did that in a local xterm, and discovered to my surprise that megadream was already operating as a web server.
I promptly forgot whatever it was I was doing and set about looking for apache's configuration files.
cd /etc # I'd already figured most config files were there
ls *apache* # nope, nothing there...
ls *http* # could this be it?
... conf ...
ls httpd/conf # Is this it?
... httpd.conf ...
less httpd/conf/httpd.conf # Yep, sure looks like apache's config file
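For the record, an Apache 1.x httpd.conf of that era contained directives roughly like these (an illustrative excerpt reconstructed from the Red Hat defaults of the time, not the exact file):

```apache
ServerType standalone
Port 80
ServerAdmin root@localhost
DocumentRoot /home/httpd/html
ErrorLog logs/error_log
```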
I'd thoroughly read all the comments and options in the file, inferring as much as I could about each option's function from its English name (yes, this is why I absolutely, positively cannot stand sendmail, and also why I always pestered programming students about their variable and function names).
I hadn't realized that there was a copy of the Apache manual in /home/httpd/html/manual/, since I wasn't yet in the habit of using rpm to explore the system. So every time I needed to refer to the manual, I'd surf the net using multiple xterms running lynx from a big machine on campus.
Somewhere in between I started searching the web for a good HTML reference, browsed some tags, got impatient, copied somebody's web page and started hacking on it with the help of the reference. The approach that I eventually settled on was the minimalist write-from-scratch-using-text-editor approach that I still use to this day, because I do not intend (and have never intended) to do web design, and because it is fast, short, sweet, and works anywhere in any situation on any OS.
But one thing led to another, and I next found myself reading RFC1945: Hypertext Transfer Protocol 1.0 (you can browse the RFCs at http://www.ietf.org/rfc/ or consult the local mirror at ftp.nus.edu.sg/pub/docs/rfc/).
I don't remember how I discovered the RFCs, but I *think* the first RFC I read was about the File Transfer Protocol (RFC959?), and maybe it was a man page, or HOWTO, or some documentation in /usr/doc/ that referenced it. Whatever it was, I went on the net, searched for it, found it, and read it. And I was now doing the same for HTTP.
Yeah, it was amazingly boring, but heck, I wanted to get to the bottom of things, so I bothered to read it long enough to learn how HTTP actually worked.
I was surprised to find that the network engineers actually used plain ASCII commands for application level protocols. I was accustomed to trying to squeeze as much information out of as few bits as possible, but obviously these IETF folks who were designing the protocols were more interested in readability -- to the extent that even their network transactions were human readable, at least at the application protocol level.
Which made things infinitely more debuggable.
I immediately began putting this feature to use:
megadream $ telnet localhost 80
Connected to megadream.org
GET / HTTP/1.0 # I typed this followed by Enter twice...
... # ... and the HTML for the page came in
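The same bare-handed exchange is easy to reproduce programmatically. Here is a sketch in modern Python, using the standard library's HTTP server as a stand-in for the Apache on megadream:

```python
import socket
import threading
from http.server import HTTPServer, SimpleHTTPRequestHandler

# a stand-in for Apache: serve the current directory on a free port
server = HTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# speak HTTP/1.0 by hand, exactly as over telnet: a request line,
# then a blank line to terminate the (header-less) request
sock = socket.create_connection(("127.0.0.1", port))
sock.sendall(b"GET / HTTP/1.0\r\n\r\n")
response = b""
while chunk := sock.recv(4096):   # HTTP/1.0 closes after the response
    response += chunk
sock.close()
server.shutdown()

status_line = response.split(b"\r\n", 1)[0].decode()
print(status_line)                # e.g. HTTP/1.0 200 OK
```

The entire transaction, request and response alike, is plain readable ASCII; that is the whole point.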
After which all the magic behind the web suddenly disappeared forever, just as it had for FTP a few weeks earlier.
I was slowly, molecule by molecule, turning into an Internet engineer, and these were my preliminary ventures around the cot on all fours.
Without a network.
Be sure to check out the previous episodes of the How I Learned Linux series.