
10. What is Linux

10.1 What is Linux

Linux is not U**X. Linux is a POSIX-based kernel that makes any decent chip, starting with the Intel 386, look like an operating system.

When combined with many other components, such as libc, init, grep, gcc, tcp/ip, sendmail, etc, Linux looks like a unix operating system, with loads of development tools.

When combined with the standard free X11 toolkit, and several substantial and successful free applications, like tcl/tk, /usr/andrew, ghostscript and TeX, it looks like a workstation.

Linux has been ported to CPUs and architectures other than Intel, and you should check the state of progress for your favourite machine. Raven assumes you have a 486 "AT", because it's easier that way. Remember that almost everything is available as source.

If you don't have Linux, but do have FreeBSD or SVR4, most of Raven is still relevant to you.

Linux is written by, and Copyright of, Linus Torvalds, and is available under the GPL, but with caveats about its intended usage that make it almost Library-GPL (but not quite). Many people have added their driver code, which is usually bundled with Linux (for convenience of fetching, building and maintaining it at a specific version).

Linux 2.0 is a recent release (June 96). It is a progression of 1.3.90, and substantially better than 1.2.13 (eg mmap'ed files).

10.2 But Linux is only the kernel

This is true. Linux is just the /usr/src/linux kernel, which provides really interesting features like fd = open(filename, mode), virtual memory, device drivers, filesystems and process management.
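
As a minimal sketch of that kernel interface, here is a C fragment that asks the kernel for a file descriptor and reads through it (the choice of /etc/passwd is arbitrary):

  #include <fcntl.h>      /* open() and the O_* flags */
  #include <unistd.h>     /* read(), close() */
  #include <stdio.h>

  int main(void)
  {
      char buf[128];
      int fd, n;

      fd = open("/etc/passwd", O_RDONLY);   /* the kernel hands back a small integer */
      if (fd < 0) {
          perror("open");
          return 1;
      }
      n = read(fd, buf, sizeof(buf) - 1);   /* read up to 127 bytes through that descriptor */
      if (n > 0) {
          buf[n] = '\0';
          fputs(buf, stdout);
      }
      close(fd);
      return 0;
  }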

There are many other packages that have been bundled under the Linux banner, which are required to make Linux look like a unix system, or to make Linux possible at all. These include utilities like "grep", system libraries like "libc", and essential programs like "/etc/init".

Traditionally, unix systems are released as, and named after, their kernel release. Hence "Linux".

10.3 Is it really unix?

Not really, because UNIX is a trade name and a product. Linux is a clone of UNIX, following many of the various standards.

But it looks like unix to me.

You only have to look at the rate of its progress to realise that it is neck-and-neck with real unix, and moving very quickly. It might even become a de-facto unix that others have to match, probably by sheer weight of numbers. Of course, the existing unix standards are stable, so don't expect Linux to change them by much.

NOTE: Most people use UNIX to mean the official SVR4 product and the trademark, and Unix to mean the generic family of Unix-like systems. Some people use unix to mean the generic family, as most trademarks are names and have an initial upper case letter.

10.4 So it looks like unix?

Yes, and from several angles.

Programmers can write code that runs on Linux, and that also runs on other unix variants and clones with few changes. Look inside a few Makefiles, and you will see the different targets that the code supports.

Moreover, programmers take their work home and make it compile on Linux (or bring Linux to work, and make a home away from home).

System administrators will often see the same utilities that run on SVR4 machines, doing the same tasks; eg when the system boots, fsck scans the disks, just like on any other unix.

There are a few major libraries that aren't supported on Linux, but then they aren't supported on many other unixes either, and programmers usually code for the alternatives.

10.5 unix - more or less

SVR4 has a kernel-level STREAMS architecture, which can be very efficient and flexible. Linux doesn't have that, but STREAMS in itself is a sophisticated protocol, with many unused cases that have to be programmed for, to be compliant with the standards.

SVR4 (and SVR3) has the "TLI" transport layer interface, which is supposed to abstract away from TCP (whilst running TCP/IP underneath). Linux doesn't have this; it just uses TCP/IP directly. However, many programmers would question the absolute value of TLI, given that it already has an installed base of required "features" and workarounds.
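
For comparison, "using TCP/IP directly" means the BSD socket calls that Linux provides. A minimal client sketch follows; the loopback address and port 80 are arbitrary choices for illustration:

  #include <stdio.h>
  #include <string.h>
  #include <unistd.h>
  #include <sys/socket.h>
  #include <netinet/in.h>
  #include <arpa/inet.h>

  int main(void)
  {
      struct sockaddr_in addr;
      int s;

      s = socket(AF_INET, SOCK_STREAM, 0);            /* a TCP socket */
      if (s < 0) {
          perror("socket");
          return 1;
      }
      memset(&addr, 0, sizeof(addr));
      addr.sin_family = AF_INET;
      addr.sin_port = htons(80);                      /* the WEB port */
      addr.sin_addr.s_addr = inet_addr("127.0.0.1");  /* this machine */
      if (connect(s, (struct sockaddr *) &addr, sizeof(addr)) < 0) {
          perror("connect");
          return 1;
      }
      /* ... talk the application protocol over s ... */
      close(s);
      return 0;
  }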

SVR4 has a few "TP" transaction processing libraries that Linux doesn't have. These are sophisticated database back-ends, like a mini operating system knitted onto the core operating system.

Linux has a few features that SVR4 doesn't have. In particular, loadable modules are kernel drivers that can be compiled and loaded without recompiling the kernel. Proprietary device drivers can be issued on a diskette, and installed without re-linking the kernel, taking about a second to load.
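
As an illustration, a 2.0-era module is just an object file that exports init_module() and cleanup_module(). This minimal sketch (the "hello" name and messages are invented) is compiled with gcc against the kernel headers, loaded with insmod, and removed with rmmod:

  #include <linux/module.h>
  #include <linux/kernel.h>

  int init_module(void)            /* called by insmod when the module is loaded */
  {
      printk("hello: loaded\n");
      return 0;                    /* returning non-zero would refuse the load */
  }

  void cleanup_module(void)        /* called by rmmod when the module is removed */
  {
      printk("hello: unloaded\n");
  }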

Linux networking has some features, such as tunneling, that other OS's could only dream of (if only the budget were available, and the customer demand, ...).

Linux filesystems are more openly documented than SVR4 VFS (mainly by example), and Linux has user_fs, where the filesystem can be implemented by a user-level program rather than a kernel module.

10.6 Development tools ?

The original killer app is GCC, the GNU C Compiler. It is also a C++ compiler.

The argument for commercially based development falls down miserably when you look at the plain K&R C compilers that just wouldn't go away. Of course people still check that their code compiles and works with other reference compilers, but GCC is everywhere.

GCC, combined with the libraries and the standard kernel API, makes it easier to port utilities to Linux, often simply by compiling them. And when the current round of testing with 64 bit addresses is done (32 bit has been common for a long time), those tools and their code will be even more portable (with even more #ifdefs in them!).
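
A contrived sketch of the sort of #ifdef that creeps in; __linux__ and __svr4__ are among the macros GCC pre-defines on the respective systems:

  #include <stdio.h>

  int main(void)
  {
  #ifdef __linux__
      puts("built for Linux");        /* GCC on Linux defines __linux__ */
  #elif defined(__svr4__)
      puts("built for SVR4");         /* GCC on SVR4/Solaris defines __svr4__ */
  #else
      puts("built for some other unix");
  #endif
      return 0;
  }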

Other languages tend to be written in C (as compilers), and arrive as soon as GCC can make space for them. TCL/TK, for example, is delivered as source, and compiles on Linux, and SunOS, and ... Suddenly you have a scripted GUI (running tcl scripts) that you can incrementally edit and run, without exiting the application under test.

Other languages include perl, ada, eiffel, fortran, cobol and lisp, plus all the libraries that have been developed for them, either free or Commercial. There are also XBASE compilers (from DOS dbase-III), and SQL interpreters.

As well as compilers, you (may) need libraries. Motif is commercially available; Tcl is downloadable over the net, as are many others. Basically, if your in-house system needs a library that isn't available on Linux, preferably with source, then you should be asking what will happen in 5 years, when you need to expand your system throughput but don't have many options open to you.

10.7 Is it as good as DOS ?

Not quite yet. The problem at present is the limited number of Office Applications, and finding ones that your people are familiar with, that run well on X11, at a comparable price.

For some, it is already better, because they have learned to use it, and can do everything they want to do. They can produce TeX documents containing gnuplot diagrams, produce WEB pages, etc etc.

It's really a question of what your application is. If you have an in-house specialised system that is really a generalised package for your trade, you need your software house to come up with a version for UNIX.

Linux can also be quite bewildering. A reasonable unix installation requires as many directories as DOS needs files!

10.8 Source of course!

Almost everything on Linux has the source code available, either to the general public (eg Qt) or to software houses with the money. "So what?", you say, "I never read it".

You are already living within the limitations of your existing software package, so any bugs are likely to be "acceptable", provided you can accept staying within the same hardware and OS base. But what if you _really_ want to get something fixed, or to identify the fault?

It won't be cheap, but if you have the source, you can pay a software house to fix your problem(s). If you are running a dentist's business on basic-82 (fiction), you can either rewrite the (source) application to port it to basic-97, or you can fix the problem in the basic-82 interpreter. If you are a chain of dentists (eg 20 independent sites), you can form a user group.

Even getting the specification of your existing data may be enough to migrate most of it to a new application, started from scratch.

10.9 So it's as good as NT?

More or less, yes.

NT has its own toolset, which differs from that of Linux. If you use a compiled D-BASE-III application, chances are you can find a matching engine on both systems to port your application to.

Maybe not if you are using lots of NT windowing tools. Then you are well and truly trapped down that avenue, and should look at converging, eg via GUI libraries, SQL common subsets, or your own C++ code.

If your service is completely independent of the OS, such as being a WEB server, Linux is already a better option, with Apache, Sendmail, FTP, SMB, NFS, ... You can transfer the WEB server functions from an NT WEB server onto a Linux WEB server, and not notice the difference, except that both services run faster (two CPUs, etc).

However, if you start from no installed base, and have to write code either for unix or for NT, you might be better off writing it for X11, and putting an X11 (workstation) interface on the NT.

If you wrote it in TCL/TK, you could make a few diffs to have it run on Linux-X11 or WIN-3.11 from the same source.

10.10 What about Networking?

Linux uses TCP/IP over ethernet (or slip, or ppp), so it is like most of the non-DOS world. It can do novell (recently added), it can talk to macs, and it can act as client or server for WIN-3.11 shares (disk, printer, process) over TCP/IP, but not over NetBEUI.

Linux TCP/IP is a full-featured system, and its growth in numbers will get it tested in all sorts of circumstances.

Because the source is available, serious hackers have devised various tricks to make TCP/IP work in new ways, such as IP-Masquerading, tunneling, etc.

Linux makes an excellent WEB server: it has a 9% presence in the world population of web servers, which is more than NT and less than Sun. Development is on-going, which means that you can run Linux-2.0 until Linux-3.0 is ready and tested (tested by you).

10.11 Linux networking interfaces

Linux supports many different ethernet cards, and with the pending huge numbers of home dial-ins, it will get plenty of testing against the standard PPP servers.

ISDN devices are more expensive, and tested more selectively, but many cards have yet to be designed, and Linux stands as good a chance as NT or SVR4 of getting supported.

Fibre links, and expensive lines, tend to come with their own expensive routers, requiring less interfacing to connect to the Linux box; or maybe the router itself is Linux-Inside!

10.12 Linux PCI Card Interfaces

Linux already supports many, many PC cards. New SCSI cards and other cards are appearing every month. The companies that sell them typically put the DOS drivers first, but the gap is closing, and is all the more obvious when it is there.

You still have to check for compatibility, and reported reliability, but if that SCSI card works for you, then it works! If you don't change anything, it will still work tomorrow, and if you have to change the SCSI controller completely, there are alternatives.

10.13 Multi-Media Internet WorkStation

It's really a question of finding apps that you can live with, and getting used to using them.

You can send and receive email. You can write robots to do that for you.

You can browse email, news and web pages, switching between them by clicking on URLs and buttons.

Finding Internet tools that really suit you can take time, but Raven (Issue-3) has a few suggestions, and you can start looking immediately.

10.14 Multi-Media Text Processing WorkStation

This means running something like TeX or roff, with the surrounding macro libraries and fonts. These convert plain text files, with embedded formatting commands, into typeset page layouts, with kerning, characters, tables and simple drawings.

The output from either (TeX, groff) tends to be .dvi or PostScript (though groff is still very good at formatting to VDU text formats, as used by the man pages).

The input to either is (1) edited manually, (2) edited using a WYSIWYG editor, or (3) the pipeline output from a previous generating phase.

In addition, the output may be HTML text files, so these utilities are starting to act as filters, or macro processors (which is what they are).

10.15 Multi-Media Graphics WorkStation

Browsing a .dvi file onscreen is a form of graphics, as is converting a .ps file to an X11 pbm bitmap.

Displaying a static graphics file can be done by converting it to a nearby format, and using the native display library to do it.

Graphics workbenches can process scanned photos or generated diagrams (from programs like gnuplot, or tcl/tk). You can then include them as attachments to messages, within printed TeX documents, or within .html pages.

3D graphics programs can take "standard texture descriptions" and wrap them around "standard triangulated surfaces" to render objects under lighting, or use ray-tracing to render POV-Ray scenes.

There are standards, at various stages, and Linux has Free Libraries that can do many things. Linux differs from DOS: in DOS you tend to get one application that does everything, whereas with Linux you get lots of adaptors, converters and utilities. By using them together, you get a tooled-up workbench, though at times it looks like a jig-saw puzzle.

10.16 And this is Free ?

"Free" is of course a subjective term, perceived as "no restrictions within the horizon".

There are several "Free" licenses. GNU popularised the GPL and the Library-GPL (which you get the chance to read several times). Many authors have adopted the GPL, as an established umbrella, to license their code for "Free" use.

A lot of X11 software is contributed by large companies. It isn't GPL, but it is Free.

Some software is Released, but "not for commercial use". This is particularly troublesome, as a typical computer (wordprocessor) used for a home office is arguably being "commercially used". Some Licences clarify that they mean "commercially sold", but you have to read them to be sure.

The GPL itself slows down the free re-use of software, by making it difficult to re-use useful portions of code in other packages. It could even be argued that GPL code cannot be used in Library-GPL packages, except where "bundled" as independent objects. However, the benefits of the GPL are already obvious, with so much software using it as a license.

Personally, I wonder: if Free is so hot, why are any restrictions needed at all? Public domain means that it belongs to everybody and anybody, almost without restriction, but in this cynical world, many people think that Free -plus- marketing isn't enough, and restrictions are needed to protect <something>.

10.17 Check your freedoms

With current legislation, you have to make a reasonable attempt to be sure that you have Licenses for all your components.

If you are getting your software from CDROMs, the distributor has usually checked that every package is freely distributable, but they may only be watching their own back, as they are the ones doing the distribution. You may still have to comply with every package's USER License, as well as the distribution rules.

This particularly applies to CDROMs that come with a few single-user Licenses. Official RedHat comes with a Commercial X server. Others come with proprietary databases, or desktops, or combinations of packages, bundled into an affordable CD.

Some packages have the opposite sense: they are freely usable, but not distributable! They appear on FTP servers, and you download them yourself.

