-- mark --
I know, this place looks somewhat dead.
That's just because I had neither the time nor anything really interesting to post here. My techie life went smoothly, and my non-techie life is already described on another blog (sorry, I don't want to disclose which one).
Well, I installed Debian Sarge at work. I feel very happy to have been freed from the last Windows PC I had to use (it's still dual-booting, but I boot Windows only once every two weeks). The installation of Sarge was very smooth (much better than the Woody installs I did), and to get Samba working I just had to install Samba, do some obvious configuration ("what's your domain name?") and type "smb:/" in Konqueror. Installing the network printer was a matter of four clicks. Whoa. Who said Linux was hard?
As for the WM/DE, I want to try the Mezzo/Orchestra thing. It seems damn cool. I also have some new opinion pieces in mind. I hope to put them out soon.
Ok, there's nothing in this post. No content, nothing. But I wanted to mark the fact that I'm still alive, for whoever cares.
A somewhat wider audience for my ego
Well, thank you for all your comments on the previous post. They made me think. I posted the output of that brainwork on OSNews.
By the way, I still have to emerge KDE 3.4. I hope it's the big improvement everyone buzzed about. I think KDE is really the definitive Linux desktop. If only it weren't so heavy.
I'd like to see some new ideas in window managers/desktop environments. I have used KDE, GNOME, WindowMaker, Fluxbox, XFCE, IceWM and Enlightenment (I really hate that last one). What other cool ones are out there?
We have made Linux ready. Now let's make the users ready too. Part I.
I was lazily reading the latest OSNews when I stumbled into a link to this post by mr. akaimbatman. It is just one of the bazillion articles/blog posts I've read about what the Linux desktop should become. But now I'm tired, and so I rant.
After a long Windows-only experience, I have been happily using Linux on the desktop 24/7 for two years (with the exception of the Windows XP box I still have to use at work), and for two years I have been reading this kind of article. These articles just make me sick. They make me sick because they invariably miss the point. They take a lot of myths about the lack of adoption of Linux on the desktop and try to solve them with titanic scenarios of complete desktop environment/filesystem refactoring. While there may be many nice individual ideas in their proposed solutions, they completely lack any sense of timing and pragmatism.
The reason is simple. All your super ideas about rebuilding the filesystem hierarchy of Linux or merging GNOME and KDE into a single, ultra-cool, ultra-easy desktop environment may be the best things you can think of (most of the time they're not, but let's give them a chance). The fact is simply that, given the number of developers and the pace of development of open source software, by the time all this happens -if it ever happens- we will already be light years behind other operating systems. Period.
To make Linux a practical desktop solution for the masses we do not need to turn the guts of Linux and X upside down. That is a sin of naivete. We just need a little rational change of mind from the community, and a big effort from the software industry. Let me explain. It will take a few posts to do it, so prepare to read. Ok? Well, let's start with
I. Linux IS easy and ready for (most of the) desktop. People just need to learn its basics.
We who use Linux on the desktop don't find it hard. To me Linux is pretty easy when it comes to everyday tasks -even easier than WinXP. So where's the problem? When John 'PC User' Doe hears about Linux, he usually hears a lot of enthusiastic geeks assuring him that Windows is shit and that Linux is the solution to all of John Doe's problems. Then John Doe will reply "but I heard Linux is good on servers but not on the desktop" and the geeks will point him to the latest-and-greatest Mandriva or SuSE. Until now everything looks almost OK. What happens next?
What happens is that John Doe, assured that his shiny new Linux will be easy to use, tries to install and use it on his desktop or -much worse- on his laptop. Then John Doe fills a hundred forums with silly questions like "how do I install the drivers for my webcam from the webcam CD?" or "how do I play my iTunes DRM-protected files?" or "why doesn't my USB ADSL modem work?" Sooner rather than later, most John Does throw in the towel and return to Windows, disgusted with Linux.
Now, the commonplace deduction is: John Doe's bad experience has shown that Linux is not ready for the desktop, because "this is hard, that is inconsistent, that other thing is awkward". Is this right? No. That doesn't mean Linux is perfect -it absolutely is not. But this is not the reason John Doe finds Linux difficult or even unusable.
John Doe found Linux difficult to use because he did not learn its basics. He did not learn about it. He never read the instruction manual. When a friend asks me to introduce him to Linux, he almost always says "hey, help me install it, then I will learn it by myself". This just leads them into trouble. Most PC users have known only DOS and Windows, and just cannot imagine something so different. They expect a drive C: or a self-extracting, graphical installer because they have never known anything else, not because those are intrinsically easier.
What I tell my John Does who want to install Linux is: read documentation first. Please understand down to the last word: f i r s t. This does not at all mean forcing them to read kilotons of man pages. It means telling them the truth: that Linux is different from Windows, and that if they do not know the basic differences between the systems before the switch, they will be lost, no less than I would be if parachuted into the center of an unknown city. Tell them to look on Google and Wikipedia, to get familiar with the filesystem hierarchy, the diversity of distributions, the meaning of magic words like "shell", "kernel", "window manager", "mount point", "package manager" and so on. They don't need to become gurus. They just need simple, clear concepts like "X is the program that controls the graphics on Linux. It is a separate program, not a part of the OS as in Windows. X itself controls just the basics; your desktop's behaviour and appearance depend on another program, the window manager or desktop environment. There are quite a few, so you can try some of them and then pick the one you like, and your desktop can fit your needs and tastes much better." Tell them to look at screenshots of KDE and GNOME, so they will already be familiar with their new graphical environment. Tell them the command line exists and that they don't have to fear it. Show them simple examples of how it works, and why it is more flexible and faster than GUI alternatives for simple tasks like "how much space do I have on my partitions?". Only once they have some familiarity with Linux concepts, give them a live CD. Tell them they can play with the live CD as much as they want, so the concepts they have read about sink in and they can begin to "feel" Linux under their hands.
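The "free space" question is a good example of the kind of two-minute demo I mean; one command at the prompt answers it:

```shell
# "How much space do I have on my partitions?" -- one command,
# human-readable sizes, no clicking through drive properties dialogs:
df -h

# The same question for just the partition your home directory lives on:
df -h "$HOME"
```

Show a newbie this next to the GUI equivalent (opening each drive's properties dialog one by one) and the point about flexibility makes itself.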
Most people will gladly listen to and understand such explanations, and will actively follow them if politely but firmly advised that it's good for them to do so. If they say "oh, I will learn it later, just install it", explain to them that it's nothing hard, but that they will be LOST if they don't learn, and that it's like pretending to be able to drive a train just because you can drive a car. If they still insist, well, advise them not to switch, or to prepare for pain.
I know what I'm talking about, because I was lucky enough to be prepared for Linux with this approach. When I first considered switching, I just wanted to download all the Debian Woody CDs and install them. A friend of mine (the other dude writing on this blog, BTW!) warned me, pointed me to Knoppix and Mandrake, and told me that things would be easier if I learned something first. So I downloaded a bunch of tutorials about Linux, bought a couple of Linux magazines and spent a couple of weeks googling and reading stuff. Only then did I feel ready to boot Knoppix, and after some days spent playing around with it I finally installed Mandrake 9.1. I remember I was almost disappointed by how easy the transition was, but I immediately realized it was so easy because I already knew the basics. And when I see newbies ranting on forums about how bad Linux is, I realize every time that they just didn't take the time, or didn't have the will, to learn BEFORE installing.
Now to my corollary, point I.a:
I.a: Installing Linux applications is damn easy.
I laughed sadly when I read that he still thinks installing applications on Linux is hard. This was perhaps true years ago; it is pure FUD now. Let me say it clearly: there's nothing easier in XXI-century computing than installing Linux applications with a package manager. Let me repeat it in other words: there's nothing you can do on a computer today that is easier than an apt-get or an emerge. There are a billion potentially difficult or awkward things on Linux, but installing applications is not one of them.
In fact, strange as it may seem, I started using Linux and continue to use it mainly for this precise reason. With Linux I have literally thousands of applications that I can install instantly, free of charge, and that I can trust because they are Open Source. Do this thought experiment. Imagine a world where Debian or Gentoo are the main operating systems of the planet, and where a new player -Windows- is struggling to conquer the desktop. Now a newbie Windows user tries to install his favourite music player. He is accustomed to XMMS, which he installed simply by (1) firing up a terminal, (2) getting root and (3) typing "apt-get install xmms", and now he wants to install Winamp. Imagine him knowing his Debian-based distro well but knowing almost nothing about Windows. I'm sure he will first look for a "Package manager" or "Software install" program in his Start menu and/or Control Panel. He won't find anything useful. He will try the command line, but nothing seems to come of it. So he will look on the Internet for what kind of package management Windows has, and he will find there is almost none. Amazed and disappointed, he will eventually look for Winamp on the Internet and download it. He will find himself with an executable file, something he has never associated with software packages. He will eventually double-click it, and be met by an overwhelmingly annoying graphical installer that asks him silly questions like "where do you want to install me?", and he will stare in confusion upon realizing that no /usr/bin exists on Windows. If he's clever he will eventually understand the logic of the process; he will also find it awkward and unnecessarily complex. I also expect him to be quite upset when he understands there's no "emerge -Du world" to upgrade all his software at once: he will have to do it program by program, painfully.
So here's what's ridiculous about this assertion by akaimbatman (the author of the post linked above): "Package management is one of those concepts that seems great on the outset, but fails in practice. The issue is that each package has a complex chain of dependencies unique to itself. In order to be certain that a package is compatible with all installations, all combinations of installed packages must be tested! As it is unlikely that anyone would go through so much trouble, the incompatibilities between packages accumulate, and before long the packaging system is rejecting new installs. And that's assuming that a graphical installer exists!
If a graphical installer does not exist, then life becomes even more difficult for the end user. Instead of launching a GUI and selecting the applications he wants, the user must open a terminal and begin typing cryptic commands for which he has no training. Many proponents of packaging systems downplay these issues by stating that packaging errors don't exist on system XYZ (despite proof to the contrary), and that if the user is running Linux he should be "smart enough" to know how to use the command line. Such statements are just silly. Users want the computer to make their lives easier. Any barrier thrown in their way will only drive them to a different platform. Unfortunately, package managers still drive most Linux desktop distributions."
You're utterly wrong. The packaging system is one of the biggest strengths of Linux, because it makes installing applications easy. Take your Windows fellow and let him watch you go from zero to a fully Internet-aware desktop by typing something like "apt-get install firefox gaim xchat thunderbird gftp amule" instead of downloading and installing a thousand separate applications from their respective web pages. Oh, your friend does not like typing? Well, let him point-and-click in Guitoo or Mandrake's graphical urpmi or something similar. But don't be ridiculous telling me that typing "emerge gaim" is difficult, please. It's the fact that the user has no training for typing "emerge foo" that is wrong, not the package manager. Users need to learn first, THEN use the system. Would you click on an executable installer if no one had ever told you that's the way to install software?
Mr. akaimbatman for some reason also tells me that package managers don't work, that they fail continuously. This is pure FUD. Packaging errors are incredibly rare, despite theoretical arguments to the contrary. In two years of Linux I have found just one real package management error, in an obscure bioinformatics Gentoo ebuild. Mailing the package maintainer solved the issue in 48 hours. That's what open source is for.
We should tell newbies who want to switch how powerful the package management systems of Linux are. Surely it would be nice to have them somehow unified (but why don't we write a wrapper around the three main package managers? It would be even easier, and transparent, and it wouldn't involve improbable revolutions). But they work. They are damn easy and powerful. Tell our friends Linux is better because of package management.
I.a.1.: Dealing with non-packaged applications
There is still the problem of applications not available as packages in your current distribution. Finding software is still much easier than on Windows, because you can look for it on Freshmeat, SourceForge or Savannah, that is, on centralized repositories. The problem is installing tarballs. There are solutions to this problem -solutions that this time fall to the Linux community rather than the end user, maybe- but they don't require any massive rewriting of core components.
First solution: write a helper for compiling packages. This needs to be not much more than a simple text-based and/or GUI-based tool that gently unpacks the tarball, runs "./configure", "make" and "make install" (or, better, "checkinstall"), and gently reports any error encountered in a comprehensible way. (Hey, I just found the Python project I was looking for to cut my teeth on!) This still requires feedback from the end user if something goes wrong, or if customization is needed, but when everything is right it would be nothing harder than an apt-get or a double-click installer.
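To make the idea concrete, here is a minimal sketch of such a helper as a shell function (the function name and the messages are my own invention; a real tool would grow a GUI, nicer diagnostics, and checkinstall support):

```shell
#!/bin/sh
# Sketch of a "compile helper": unpack a tarball, run the three canonical
# build steps, and stop with a plain-language message at the first failure.
build_from_tarball() {
    tarball=$1
    # The top-level directory inside the archive, e.g. "foo-1.0/" -> "foo-1.0":
    srcdir=$(tar -tzf "$tarball" 2>/dev/null | head -n 1 | cut -d/ -f1)
    tar -xzf "$tarball" || { echo "Could not unpack $tarball"; return 1; }
    cd "$srcdir"        || { echo "Could not enter $srcdir"; return 1; }
    ./configure  || { echo "configure failed -- see the messages above"; return 1; }
    make         || { echo "compilation failed -- see the messages above"; return 1; }
    make install || { echo "installation failed -- are you root?"; return 1; }
    echo "$srcdir installed successfully"
}

# Usage: build_from_tarball foo-1.0.tar.gz
```

When everything goes well this is exactly one command for the user, which is the point: no harder than an apt-get.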
Second solution: distribute static binaries. If dependency hell is your problem, this is perhaps the best solution. I actually love the shared library concept, but I can see it has drawbacks. Big, common libraries like GTK or Qt can actually be something the user doesn't want to install properly, because he/she won't need them except for one single app, and he/she wants to be sure that single app just works (we don't want to get into things like "it works with GTK 2.2.3 but not 2.4.1"). Obscure little dependencies can be hell to find, and it's sad that many good apps fall into oblivion just because they depend on a bunch of libnotinstalledonanysystem.so files. Building static binaries would solve this: moreover, static binaries compiled against their older libs will run happily even if newer, non-backwards-compatible ones are installed, avoiding contortions like installing the KDE 2 libs on a KDE 3 system, for instance. I think it would be foolish to distribute static binaries only, but they should be offered as an option by all free software developers (and commercial ones too).
The second part of akaimbatman's post criticizes the filesystem hierarchy of Linux. Here I declare:
I.b: The filesystem structure of Linux is no problem for the end user.
I can't see why the end user should see the directory structure of Linux as a problem (or at least as a major problem). Most of the time they will use only their /home directory, and that's actually one of the things that makes Linux easier and friendlier than Windows. They have no trouble knowing where their program is, because they know it is almost always /usr/bin/program, if they ever need to. Again, it is simply a matter of "hey, newbie Linux user, read about the Linux directory structure! You will find it is really rational and simple: all the general configuration files are in /etc, all the everyday program binaries are in /usr/bin and so on". It is also much friendlier to know your CD-ROM drive is always /mnt/cdrom than some odd D:, E: or Q:. Again, it is nothing hard if you take the time to understand it before actually using it, instead of diving in expecting that every OS on the planet must be a Windows clone. All the advice to change the directory structure, and all the GoboLinuxes of this world, are plainly useless, desperate attempts to turn the Linux filesystem into a fake Windows filesystem. There's nothing wrong with either filesystem: they are just different, they need to be understood, and they stay out of the user's way most of the time. We can always try to improve the current scheme a bit, but in general there is nothing bad or awkward about it. Init script structures (SysV or BSD) are surely something that could be improved in terms of clarity and ease, for example. But for the most part all the criticism I've read boils down to "this is not Windows", and as such is of no importance.
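That "read about the directory structure" advice can even be given live; a quick shell session like this is usually enough to make the layout click for a newcomer:

```shell
# A two-minute tour of the Linux directory layout from a terminal:

command -v ls         # where an everyday binary lives: /bin/ls or /usr/bin/ls
ls /etc | head -n 3   # system-wide configuration files, all in one place
echo "$HOME"          # the user's own files: typically /home/<username>
```

Three commands, three rules: binaries in /usr/bin (or /bin), configuration in /etc, your stuff in /home. That is most of what John Doe ever needs to know.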
Ok, enough for now. Of course there are things about Linux that are not good, but they're simply not always the ones people believe them to be (and definitely NOT what akaimbatman thinks they are). We're just smashing some involuntary FUD here. New chapters will follow.
Athlon64 - Reggio Emilia Core
Ok, second post! I want a cryptographic engine in the AMD64 CPU! The Via C7 has one and AMD should have it too... So I could secure my old data with a 128-bit key! AH AH AH AH
ok, first post, this is just the usual test
welcome to this new tech blog about OSS, Linux and similar things. there are a bazillion out there, but this one is even more crap.