- log file name changed from "log.txt" to "dwlog.txt" (so admins who forgot where they put Deadwood can more easily find dwlog.txt)
- Date and time added to Windows dwlog.txt logfile
- log file flushed (updated) whenever there is a second of inactivity; a busy server won't flush the log, but will flush it as soon as it goes idle (see the sketch after this list)
- INSTALL file changed to use Win32 line breaks and renamed "INSTALL.txt"; file updated to have more comprehensive startup information for CentOS 5 and a note about dwlog.txt.
- Fatal dwood2rc error now correctly noted as a dwood2rc error
- Makefile renamed Makefile.centos5 in src/; Makefile.mingw renamed Makefile.mingw310 (I'm making it clear I only support CentOS 5 and MinGW 3.1.0)
- Cleanup of Makefile for CentOS 5 duende helper
- Some compile-time warnings when compiling in Windows removed
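
The flush-on-idle behavior in the third item above maps naturally onto a select() loop with a one-second timeout. Here is a minimal sketch of the idea, assuming a POSIX select()-based UDP server; this is my own illustration, not Deadwood's actual source, and the port number and log format are hypothetical.

    /* Sketch of flush-on-idle logging (not Deadwood's actual code):
       select() waits up to one second for a DNS query; when it times
       out, the server is idle and the log buffer is flushed. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/select.h>
    #include <sys/socket.h>
    #include <netinet/in.h>

    int main(void) {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        if (sock < 0) {
            return 1;
        }
        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
        addr.sin_port = htons(5353); /* hypothetical port */
        if (bind(sock, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            return 1;
        }
        FILE *log = fopen("dwlog.txt", "a");
        if (log == NULL) {
            return 1;
        }
        for (;;) {
            fd_set rd;
            struct timeval tv;
            FD_ZERO(&rd);
            FD_SET(sock, &rd);
            tv.tv_sec = 1; /* one-second idle timeout */
            tv.tv_usec = 0;
            int n = select(sock + 1, &rd, NULL, NULL, &tv);
            if (n == 0) {
                /* A full second with no queries: flush buffered log
                   lines so they reach dwlog.txt while the server idles. */
                fflush(log);
            } else if (n > 0) {
                char buf[512];
                ssize_t len = recv(sock, buf, sizeof(buf), 0);
                if (len > 0) {
                    /* Busy path: append to the stdio buffer but do not
                       flush; the flush happens on the next idle second. */
                    fprintf(log, "got %d-byte query\n", (int)len);
                }
            }
        }
    }

The point of the design is that the busy path stays cheap: fprintf() only appends to stdio's in-memory buffer, and the disk write is deferred to a moment when the server has nothing better to do.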
I feel empty and betrayed by Open Source.
For years I believed the big open source lie. I believed that a group of volunteers working together on the internet could make professional-quality end-user software. I was wrong. Flat-out wrong.
What BSD, and later on Linux, have shown is that a group of volunteers working in their free time can make a free computer operating system which is very powerful, but very difficult to learn.
This is what Linux was when I first used it in 1995. It was no Solaris, but it was a good deal more stable than Windows 3.11 and Windows 95. It would take me hours to get X configured and working, but once it was working, it was rock-solid stable. I wasn't getting any BSODs (the "blue screen of death" messages Windows shows when the system crashes).
I remember I had Windows 95 [1] and Linux in early 1996, and wanted to set up PPP (the most common way of using a modem to access the internet) so I could use Netscape. It took me about half an hour to get Netscape and PPP going in Windows 95; it took me two or three days to get Netscape and PPP to work in Linux.
However, despite the difficulty of using Linux, I preferred it because of its stability. I had one friend whose Windows 95 driver was constantly giving him BSODs; I told him he should use Linux more. I remember another friend who tried getting Windows 95 working on an AMD motherboard he had; he reinstalled Windows 95 again and again and never got the system to quite work right.
I remember feeling Microsoft was an evil monopoly because you couldn't make a motherboard or other peripheral without getting Microsoft to support it. I remember Microsoft using their monopolistic practices in the 1990s to drive Netscape out of business and hating Microsoft for doing that.
At the same time, I resented all of the companies who wouldn't release hardware specs for their cards so we could use them in Linux, and resented the companies (like RealNetworks) who made video players without Linux support. I felt like a second-class computer citizen. There was so much I couldn't do because I used Linux: I couldn't watch videos in RealPlayer, and I couldn't use Microsoft Office (I spent a good deal of money buying WordPerfect and Applixware's office suite for Linux) or easily read .doc files.
I remember people using StarOffice (since its binary was a free download) and thinking these people were cheap bastards for not getting a real office suite for Linux (I had managed to get a good job in Silicon Valley at this time, so it was nothing to spend $200 or $300 here and there to get an office suite).
Solaris was still big back then, and Solaris admins had the same issues with Microsoft that I did. So I wasn't alone in my dislike of Microsoft.
Well, in the 2000s, a lot changed. The dot-com bubble burst and I no longer worked in the tech industry. Solaris basically died. I got a degree in computational linguistics and moved to Mexico to discover myself.
Netscape was killed by Microsoft's monopolistic practices; by 2002 everyone was using Internet Explorer. Windows XP came out in 2001, and the instability problems that had always plagued Windows 95 were once and for all resolved by having people use a version of Microsoft's server-class code on the desktop.
Linux, however, did not improve its desktop experience.
In the mid-1990s, the big distro to use was Slackware; in 1997 I went from Slackware to RedHat because it was easier to apply security patches in RedHat. RedHat was the dominant distro to use when the dot-com bubble burst in 2001.
RedHat never made a serious effort to make Linux a desktop OS; their bread and butter was selling servers.
In 2003, RedHat realized they couldn't sustain their business model giving their flagship product away, so they did the RHEL/Fedora Core split. I used Fedora Core for a while, then moved from Fedora Core to CentOS when this free RHEL clone became available.
To fill the vacuum for a desktop-oriented Linux, there was first Mandrake (originally a fork of RedHat 5); later on, Mark Shuttleworth made hundreds of millions of dollars and decided to fund his own free desktop Linux, Ubuntu.
Because of fragmentation in Linux (the eternal KDE-vs-GNOME battle), desktop development was divided between two different desktop projects (more, if you count things like XFCE and Blackbox). Free software developers are more interested in "scratching their own itch" than in adding features that benefit users of the software but not themselves; this makes it more difficult to motivate people to help with end-user desktop software.
Linux, simply put, did not have a usable end-user desktop when Windows XP came out in 2001. Microsoft won the race: they were able to make their high-quality desktop use server-class code for the underlying operating system before Linux could make its server-class operating system an end-user desktop.
Linux still does not have a usable end-user desktop today. Issues that don't matter when making a *NIX server matter a lot when making a desktop computer. It matters on the desktop that the underlying libraries and kernel change so much it's hard to make a binary-only desktop program that will continue to work for the foreseeable future. It matters on the desktop that binary-only drivers are not welcome, and that the kernel developers go out of their way to not allow these kinds of drivers.
I have watched Linux for years; I have been using it since 1995, and it was my only real desktop OS from 1995 until 2003. It hasn't changed. It's always been fragmented, with strange bugs and a lack of discipline about giving users the things they expect a desktop OS to have. I thought in 2000 that we would have the year of the Linux desktop. I realize, in 2009, that we're never going to have the year of the Linux desktop.
Firefox shows that people can make open-source, professional-quality desktop software that people will use. About 30% of people surf the internet using Firefox, and it didn't come out until 2004 or so. 1%, maybe 2%, of people surf the internet using Linux, even though Linux has been around since 1991. In a little over four years, Firefox was able to get the desktop adoption Linux can only dream of.
It's sad that Linux is too fragmented to do the same thing. Ubuntu's instability (because they have to use unstable versions of software to support new hardware) is on par with Windows 3.11. I'm glad that Linux has become a viable server OS; I'm sad that Linux will never become a viable desktop OS.
- Sam
[1] I didn't use Windows 95 very much back then; Linux was my primary desktop OS and Windows 95 was on my old 486 so I could keep my skills up to date.