Very useful because the information is almost distribution agnostic, as Arch will stick to upstream as much as possible; or at least that's my impression as a Debian user reading their wiki.
Also: isn't the Arch wiki the new Gentoo wiki? Because that was the wiki early 2000s and, again, I've never used Gentoo!
> Also: isn't the Arch wiki the new Gentoo wiki? Because that was the wiki early 2000s and, again, I've never used Gentoo!
Exactly my thought! 20 years ago, I used Gentoo, and their wiki was the best. At some point the Arch wiki appeared and became better and better. Eventually I got tired of compiling for hours and switched one machine at a time to Arch, and today the Arch wiki is the number one.
Arch and its wiki were already pretty good when it happened, but the real turning point was when the Gentoo wiki got hacked. After that, it never really recovered, and the Arch wiki must have absorbed a lot of that expertise because that's when it really took off.
As I recall, anyway. Can't believe it's been so long.
Gentoo's wiki is still great (& Arch's has been great for a long time), but yes, Arch's is probably improving at a faster rate. Arch is also a little more comprehensive when it comes to mainstream tech that's divergent like init & network management - Gentoo's still good here but openrc & netifrc show their influence throughout.
I get the sense the Arch wiki pages have more detail than the man pages themselves.
The wiki captures the knowledge that developers of said apps assume to be common, but which doesn’t actually make sense unless you are bootstrapped into the paradigm.
Most man pages are written for someone who knows pretty precisely what they want to do, but doesn't recall which knobs to turn in the program to get that done. The Arch wiki instead focuses on someone who has a vague idea of what tools to use but doesn't know how those tools operate.
I've found that with an intermediate understanding, the Arch wiki is so much better that I often won't even check the man pages. But on the occasions where I know the thing pretty well, they can be quite spotty, especially when it's a weird or niche tool among Arch users. So, depending on how you define "more detail", that might be an illusion.
Arch wiki is far better than most man pages. I've referred to Arch for my own non-Arch systems and when building Yocto systems. Most Arch info applies.
In the ancient days I used TLDP to learn about Linux stuff. Arch wiki is now the best doc. The actual shipped documentation on most Linux stuff is usually terrible.
GNU coreutils have man pages that are correct and list all the flags at least, but suffer from GNU jargonisms and usually a lack of any concise overview or example sections. Most man pages are a very short description of what the program does, and an alphabetic list of flags. For something as versatile and important as dd the description reads only "Copy a file, converting and formatting according to the operands" and there's not even one example of a full dd command given. Yes, you can figure it out from the man page, but it's like an 80s reference, not good documentation.
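For what it's worth, here's the kind of worked example the dd man page could use but doesn't (the output path and sizes here are made up for illustration):

```shell
# Write a 1 MiB file of zeroes, in 4 blocks of 256 KiB each.
# Note the unusual operand syntax: it's bs=..., not --bs.
dd if=/dev/zero of=/tmp/test.img bs=256K count=4

# Verify the result: 4 x 262144 bytes = 1048576 bytes.
ls -l /tmp/test.img
```

Two lines like this at the bottom of the man page would save every new user a trip to a search engine.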
man pages for util-linux are my go-to example for bad documentation. Dense, require a lot of implicit knowledge of concepts, make references to 90s or 80s technology that are now neither relevant nor understandable to most users.
Plenty of other projects have typical documentation written by engineers for other engineers who already know this. man pipewire leaves you completely in the dark as to what the thing even does.
Credit to systemd, that documentation is actually comprehensive and useful.
Anecdotally, the Arch wiki expands on the vague man pages, often with examples for cases people actually run into. Wiki pages are also much easier to modify, with the instant gratification of publishing changes immediately. A fix to a project's upstream man pages has to wait to trickle down.
Man pages were always intended to be concise reference material, not tutorials or full docs. More akin to commented header files or a program's --help output, before the latter became common.
(GNU info tried to be a more comprehensive CLI documentation system but never fully caught on.)
The Gentoo wiki was (is in many ways) phenomenal, and I recommend anyone interested in the inner workings of Linux at least walk through a full install from scratch - you learn a lot even just copying the instructions into the terminal.
Gentoo's source based approach was always destined to be less popular than a precompiled distro. Compile times & customization options select for a certain clientele.
All my machines still run Gentoo (I have used it for over 25 years). I just love the package manager. It has become much more low friction with the binary packages and gentoo-kernel(-bin). I regularly visit both the Gentoo and Arch documentation. They even cross reference each other and both are a great resource.
According to my experience, yes, it is. I have used Gentoo (using its wiki to install and configure), then after a few distro hops I was at Arch Linux and the wiki was a blessing and ever since I have found it (>10 years), I never needed anything else. Stuff they have on there applies specifically AND generally. Whereas Gentoo's wiki is usually specific IIRC.
I learned Linux by using Arch back in the days when pacman -Syu was almost certain to break something, and there was a good chance it would break something unique to your install. This was also back when most people were not connected to the internet 24/7 and many did not have internet at all. I updated when I went to the library, which was generally a weekly thing, but sometimes it would be a month or two, and the system breakage that resulted was rococo. Something was lost by Arch becoming stable and not breaking regularly: the breakage was what drove the wiki, and fixing all the things that pacman broke taught you a great deal, and taught you quickly. Stability is not all it's cracked up to be; it has its uses, but it is not the solution to everything.
It is the sort of mentality required to reach the place in computing which Linux has. There's a decent chance you have Linux running on something you own even if you don't run it on your computer, and even if you don't, you do use the internet.
I was hit with that on a remote box in another country! Should have researched the upgrade more before I did it, but it is a personal thing and not work.
However, my IPMI motherboard and FreeBSD's integrated ZFS boot environments might be considered cheating...
If you choose not to upgrade, it is stable. There is no QA department for Linux (or Windows; theirs was let go around 2015), so someone has to endure the instability if there is to be any progress. We should all thank those who run nascent software so those who run stable distros can have stability.
You don't become a mechanic without fixing broken down cars. So in that sense, the shittier the car, the better.
My Linux story is similar. In retrospect I learned it on hard mode, because Gentoo was the first distro I used (as in really used). And Gentoo, especially back around 2004 or so, really gave you fully automatic, armour-piercing, double-barreled footguns.
That you could always just boot from the CD and start again was nice. I think I reinstalled 4-5 times the "first time" before I got it where I wanted to be.
At the same time, I suspect resources like the Arch Wiki are largely responsible for how good AI is at fixing this kind of stuff. So I'm hoping that somehow people realize this and can continue contributing good human-written content (in general).
Definitely, being an unpaid LLM trainer for big corporations while nobody actually reads your work is not very encouraging. I wonder what the future will bring.
I do think we will, at some point, face a knowledge crisis because nobody will be willing to upload the new knowledge to the internet.
Then the LLM companies will notice, and they’ll start to create their own updated private training data.
But that may be a new centralization of knowledge which was already the case before the internet. I wonder if we are going to some sort of equilibrium between LLMs and the web or if we are going towards some sort of centralization / decentralization cycles.
I also have some hope that LLMs will annihilate the commercial web of "generic" content and that may bring back the old web, where the point was the human behind the content (be it a web page or a discussion). But that's what I’d like, not a forecast.
I wouldn't be surprised if LLM companies end up sponsoring certain platforms / news sites, in exchange for being able to use their content of course.
The problem with LLMs is that a single token (or even a single book) isn't really worth that much. It's not like human writing, where we'll pay far more for "Harry Potter" and "The Art of Computer Programming" than some romance trash with three reads on Kindle.
This is perhaps true from the "language model" point of view, but surely from the "knowledge" point of view an LLM is prioritising a few "correct" data sources?
I wonder about this a lot when I ask LLMs niche technical questions. Often there is only one canonical source of truth. Surely it's somehow internally prioritising the official documentation? Or is it querying the documentation in the background and inserting it into the context window?
LLM companies already do this. Both Reddit and Stack Overflow turned to shit (but much more profitable shit) when they sold their archives to the AI companies for lots of money.
I kind of fear the same. At the same time I wonder if structured information will gain usefulness. Something like man pages is already a great resource for humans, but at the same time could be used for autocompletion and for LLMs. Maybe not in the current format, but in the same vein.
But longer-form tutorials or even books with background might suffer more. I wonder how big the market for nice books on IT topics will be in the future. A wiki is probably in the worst place: it can't be changed with an MR the way man pages can be, and you don't get the same reward compared to publishing a book.
> nobody will be willing to upload the new knowledge to the internet
I think there will be differences based on how centralized the repository of knowledge is. Even if textbooks and wikis largely die out, I imagine individuals such as myself will continue to keep brief topic specific "cookbook" style collections for purely personal benefit. There's no reason to be averse to publishing such things to github or the like and LLMs are fantastic at indexing and integrating disparate data sources.
Historically sorting through 10k different personal diaries for relevant entries would have been prohibitive but it seems to me that is no longer the case.
Absolutely. Even though I don’t use arch (btw), the wiki is still a fantastic configuration reference for many packages: systemd, acpi, sensors, NetworkManager; I’ve used it for all of those fairly recently.
You see it referenced everywhere as a fantastic documentation source. I’d love seeing that if I were a contributor
Also, if the wiki isn't correct, someone else will edit it. But with an LLM it's just the LLM and you, and if you correct it, it's not like the fix will automatically reach all the other users.
I just installed Arch (EndeavourOS) and the LLM did not help. The problems were new and the LLM’s answers were out of date. I wasted about 5 hours. Arch’s wiki and EndeavourOS’s forums were much better. YMMV
They may be preferred, but in a lot of cases they’re pretty terrible.
I had a bit of a heated debate with ChatGPT about the best way to restore a broken strange mdadm setup. It was very confidently wrong, and battled its point until I posted terminal output.
Sometimes I feel it’s learnt from the more belligerent side of OSS maintenance!
Why would you bother arguing with an LLM? If you know the answer, just walk away and have a better day. It is not like it will learn from your interaction.
The Gell-Mann amnesia effect? If you can't trust an LLM to assist with troubleshooting in a domain you're very familiar with (mdadm), then why trust it in one you're less familiar with, such as ZFS or k8s?
Arguing with an LLM is silly because you’re dealing with two adversarial effects at once:
- As the context window grows the LLM will become less intelligent [1]
- Once your conversation takes a bad turn, you have effectively “poisoned” the context window, and are asking an algorithm to predict the likely continuation of text that is itself incorrect [2]. (It emulating the “belligerent side of OSS maintenance” is probably quite true!)
If you detect or suspect misunderstanding from an LLM, it is almost always best to remove the inaccuracies and try again. (You could, for example, ask your question again in a new chat, but include your terminal output + clarifications to get ahead of the misunderstanding, similar to how you might ask a fresh Stack Overflow question).
(It’s also a lot less fun to argue with an LLM, because there’s no audience like there is in the comments section with which to validate your rhetorical superiority!)
I think it all comes down to curiosity, and I dare think that that's one of the main reasons why someone will be using Arch instead of the plethora of other distros.
Now, granted, I don't usually ask an LLM for help whenever I have an issue, so I may be missing something, but to me, the workflow is "I have an issue. What do I do?", and you get an answer: "do this". Maybe if you just want stuff to work well enough out of the box while minimizing time doing research, you'll just pick something other than Arch in the first place and be on your merry way.
For me, typically, I just want to fix an annoyance rather than a showstopping problem. And, for that, the Arch Wiki has a tremendous value. I'll look up the subject, and then go read the related pages. This will more often than not open my eyes to different possibilities I hadn't thought about, sometimes even for unrelated things.
As an example, I was looking something up about my mouse the other day and ended up reading about thermal management on my new-to-me ThinkPad (never had one before).
Depends on how AI-pilled you are. I set Claude loose on my terminal and just have it fix shit for me. My python versions got all tuckered and it did it instead of me having to fuck around with that noise.
I learned Linux on Debian first. The X server (X11, or whatever its old name was) was not working, so I had to use the command line. I had a short Debian handbook and worked through it slowly. Before that I had SUSE and a SUSE handbook with a GUI, which was totally useless. I then went on to use Knoppix, Kanotix, sidux, GoboLinux, and eventually ended up with Slackware. These days I tend to use Manjaro, despite the drawback that is systemd. Manjaro kind of feels like a mix between Arch and Slackware. (I compile from source, so I only need a base really for the most part, excluding a few things; I tend to disable most systemd unit files as I don't really need anything systemd offers. Sadly, distributions such as Slackware kind of died - they are not dead, but too slow in updates, no stable releases in years; this is the hallmark of deadness.)
Slackware only does long-term stable releases, but Slackware current is a rolling release that does not really feel like one, because of how Slackware provides a full and complete system as the base. I avoided Slackware current for years because I did not want to deal with the hassle of a rolling release, but it is almost identical in experience to using the release.
I actually got a lot of Linux knowledge from the SUSE handbooks, back when I was still buying boxed sets in the book store because of my slow internet connection in the early 2000s. For Linux content nowadays the Arch wiki is still one of my most used resources, although I have not used Arch in years.
I believe this applies to the entire ecosystem, not just Arch. It's been a long while since something like moving to 64-bit happened. Or swapping out init systems.
Other good examples: Linuxthreads to NTPL (for providing pthreads), XFree86 to Xorg.
I was using Gentoo at the time, which meant recompiling the world (in the first case) or everything GUI (in the second case), with a strict order of operations to not brick your system. Back then, before Arch existed (or at least before it was well known), the Gentoo wiki was known to be a really good resource. At some point it languished and the Arch wiki became the go-to.
(I haven't used Gentoo in well over a decade at this point, but the Arch wiki is useful regardless of when I'm using Arch at home or when I'm using other distros at work.)
I'm on Gentoo without the precompiled packages, except for very large applications. Gentoo wiki is not a match for Arch wiki for its sheer content and organization. But Gentoo wiki contains some stuff that Arch wiki doesn't. For example, what kernel features are needed for a certain application and functionality, and how a setup differs between systemd and other inits. I find both wikis quite informative in combination.
Arch was young in those days but I think fairly well known? We were quite vocal, opinionated, and interjecting our views everywhere by the time of the XFree86/Xorg switch. Perhaps it is just my view from being a part of it, but I remember encountering the Arch reputation everywhere I went. Or maybe it is just the nostalgia influencing me.
Could be. I don't remember Arch being on my radar at that point though. But it wasn't long after I switched from Gentoo to Arch (and then Debian for a decade due to lack of stability before going back to Arch 7 years ago or so).
A few years before the Xorg thing there was also the 2.4 to 2.6 kernel switchover. I think I maybe was using Slackware at that point, and I remember building my own kernel to try out the new shiny thing. You also had to update some userspace packages if I remember correctly: things like modprobe/insmod at the very least.
Oh yeah, you just unlocked a forgotten memory. I was actually lucky there, I had a SoundBlaster Live 5.1 which worked just fine on ALSA (hardware mixing even worked out of the box). But I remember lots of other people complaining on IRC about it.
Most distros were stable well before Arch was, because Arch worked out most of the bugs for them and documented them on its wiki. Arch and other bleeding-edge distros are still a big part of the stability of Linux; even if they don't break all that often anymore, they find a lot of the edge cases before they become issues for the big distros. In 2005 it was not difficult to have a stable Linux install: you may have had to hunt a bit for the right hardware, and it may have taken a while to get things working, but once they were working they tended to stay stable. I can only think of one time Slackware broke things for me since I started using it around 2005: taking on PulseAudio caused me some headaches, but I use Slackware for audio work and am not their target, so this is to be expected. Crux was rock solid for me into the 10s, nearly a decade of use and not even a hiccup.
Arch linux will still happily blow itself up if you skip updates for too long.
It's to the point where if I see 'archlinux-keyring' in my system update, I immediately abort and run through the manual process of updating keys. That's prevented any Arch nuclear disasters for the last couple of years.
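For reference, the "keyring first" dance looks roughly like this; it's a sketch from memory, wrapped in a function so nothing runs until called deliberately - check the wiki before trusting it:

```shell
# Recovery sequence for a stale Arch install whose package signatures
# no longer verify: update the keyring before everything else.
refresh_keys() {
  # Refresh databases and upgrade only the keyring package,
  # so signature checks on the rest can succeed afterwards.
  sudo pacman -Sy archlinux-keyring
  # Re-populate the trusted keys if the keyring update alone wasn't enough.
  sudo pacman-key --init
  sudo pacman-key --populate archlinux
  # Now the full system upgrade should verify cleanly.
  sudo pacman -Su
}
```

(Yes, -Sy followed later by -Su is normally partial-upgrade territory; the point is to do the full -Su immediately after the keyring is fixed.)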
About a year ago, when I installed Arch, my first Linux distro, most things were great. However, while testing some pacman commands, I noticed a bunch of Python-related packages (even though I hadn't installed them myself). Since I needed some disk space, I figured deleting them wouldn't hurt. Unfortunately, I couldn't boot again. I guess the Python-related ones were dependencies of Hyprland and Quickshell.
I didn't really get into custom kernels until I started using Crux. A few years after I started using Arch I got sick of the rolling release and Arch's constant breakages, so I started looking into the alternatives; that brought me to Crux (which Arch was based on) and Slackware (which was philosophically the opposite of Arch without sacrificing the base understanding of the OS). Crux would probably have won out over Slackware if it were not for the switch to 64-bit: confronted with having to recompile everything, I went with the path of least resistance. Crux is what taught me to compile a kernel; in my Arch days I was lucky when it came to hardware and only had to change a few things in the config, which the Arch wiki guided me through.
Crux is a great distro for anyone OK with a source distro, and I think it might be the best source distro; unlike the more common ones, it does not do most of the work for you. I also love its influence from BSD, which came in very handy when I started to explore the BSDs and FreeBSD, which is my fallback for when Patrick dies or steps back. Crux deserves more attention.
Arch always turned me off with its rolling release schedule, and I wasn't that impressed with pacman, to be honest. I used to love Slack, but they lost their way trying to compete with Ubuntu and the like. I remember thinking how ridiculous it was for mplayer to have samba as a dependency, and the community saying a full install was the intended way to run Slack. I ran it as a minimalist without issues until they started wanting to compete in the desktop space.
The best successor I've found is Alpine. It's minimal and secure by design, has an excellent package manager (I much prefer apk to pacman or xbps, or apt and rpm for that matter), has stable and LTS releases while letting people who want to be rolling release do so by running edge. People think it's only for containers but it has every desktop package anyone would need, all up to date and well maintained. Their wiki isn't at Arch's level, but it's pretty damn good in its own right.
I like Alpine because it tries to be relatively simple (openrc, busybox, …). My only issue is programs relying on glibc for no reason (although you can use Flatpak). But I’m using OpenBSD now.
I was never a fan of openbsd, a lot of the security claims are misplaced, bordering on theater. glibc support isn't so bad in Alpine, there are compatibility packages that work for most things if there isn't a flatpak.
Not OP, but I used Arch for a while in 2011, and at some point an update moved /bin to /usr/bin or something like that and gave me an unbootable system. This was a massive inconvenience and it took me many hours to un-hose that system, and I switched to Ubuntu. Then Ubuntu became terrible with snaps and other user-hostile software, so I switched to PopOS, then I got frustrated with out-of-date software and Cosmic being incomplete, and am now on Arch with KDE.
Back then I used Arch because I thought it would be cool and it's what Linux wizards use. Now Arch has gotten older, I've gotten older, and now I'm using Arch again because I've become (more of a) Linux wizard.
The silly move from /bin to /usr/bin broke lots of distros. It probably would have worked out if they'd had cp --reflink=auto --update to help ease migrating files from /bin to /usr/bin, and then just symlinked /bin to /usr/bin. However, then any setup where /usr is a distinct filesystem from / would hard-require an initramfs to set that up before handoff.
The python (i.e. python2) transition was even more silly, though. Breaking changes to the API, and they wanted (and did!) force re-pointing the command to python3? That's still actively breaking stuff today in places forsaken enough to have to support python2 legacy systems.
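The mechanics of that /bin merge are easy to picture; here's a toy re-enactment in a scratch directory (never run this dance against a live root, obviously):

```shell
# Recreate the /usr merge in miniature: copy /bin's contents into
# /usr/bin, then replace /bin with a symlink pointing at usr/bin.
mkdir -p demo/bin demo/usr/bin
echo 'pretend binary' > demo/bin/ls

cp -a demo/bin/. demo/usr/bin/   # merge the contents across
rm -r demo/bin                   # remove the real directory
ln -s usr/bin demo/bin           # relative symlink, as the distros did

ls -l demo/bin/ls                # old path still resolves, via usr/bin
```

The failure mode people hit was exactly the window between `rm` and `ln`: if the update died there, /bin (and your shell) was simply gone.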
Arch + KDE is pretty sweet. It looks gorgeous out of the box, and gives you a system that mostly just works but is still everything you love about Arch
Also not OP, I gave up Arch around 2011 as well after I wasn't able to mount a USB pendrive at the uni, as I was rushing somewhere. This was embarrassing and actually a serious issue, took some time to fix upstream and finding workaround was also annoying. This is when I gave up on it and never looked back, but I did, indeed, learn all about Linux internals from dailying Arch for 3 or so years.
Systemd was the end of Arch for me, my rarely used Arch install was massively broken by its first update in ~6 months largely because of systemd. With some work I got things sorted out and working again only to fall into a cycle of breaking the system as I discovered that systemd was very different from what I was used to and did not like me treating it like SysV. Going 6 months without updates would most likely have caused issues with Arch regardless of how stable it had gotten even without the systemd change, but my subsequent and repeated breaking of the system made me realize I no longer had any interest in learning new system stuff, I just wanted a system that would stay out of my way and let me do the things I wanted to use the system for.
I do miss Arch but there is no way I am going to keep up with updates, I will do them when I discover I can't install anything new and then things will break because it has been so long since my last update. Slackware is far more suited to my nature but it will never be Arch.
This would be back in the 00s. I would guess that Arch got stable around 2010? I was using Slackware as my primary system by then so don't know exactly when it happened, someone else can probably fill in the details. I started using Arch when it was quite new, within the first year or two.
The Arch wiki has rapidly become my go-to source for every time I need a real answer... and honestly it should just become my default for everything Linux. It's astoundingly high quality, some of the best content out there whether or not you're using Arch.
So +1000, I love their work, and all the contributors! It's so, so good, and greatly appreciated.
I also find myself using https://man.archlinux.org/ a lot. It's much more readable/user-friendly than https://man7.org, plus it contains man pages from their `extra` repo, which includes a lot of popular OSS tooling.
I suspect a significant amount of that is due to requiring more than one file (the binary itself) and having to learn about multiple packaging and distribution systems. That's a gigantic wall to climb compared to "put a binary on github"
A lot of developers today want to play the maintainer role as well, whereas the idiomatic way is to publish source and docs about building, and let other people take over distribution. Software like emacs and vim just publishes a tarball.
I should write a tool that converts help output to troff; even if the result wouldn't be as detailed and nice to read as a good man page, it would save me the frustration of having to stab at "will I get usage docs with a -h, a --help, a -help, or running it with no args at all".
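The lazy version of that tool is almost a one-liner. A sketch, where the command name and its help text are made up for the example:

```shell
# Wrap arbitrary --help output in a bare-bones man page skeleton.
# $1 is the command name; the raw help text arrives on stdin.
help2troff() {
  printf '.TH %s 1\n.SH NAME\n%s\n.SH USAGE\n.nf\n' "$1" "$1"
  cat            # pass the help text through verbatim, unfilled
  printf '.fi\n'
}

# Pretend we captured some hypothetical command's --help:
printf 'usage: mycmd [-v] FILE\n  -v  verbose output\n' \
  | help2troff mycmd > mycmd.1
```

You'd then view it with `man ./mycmd.1`. It dodges none of the "-h vs --help vs -help" stabbing, but it does turn whatever you find into something man can page.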
This reminds me of the go CLI being pretty anal about this: you type `go fmt --help`, and it recognises you want help, but instead of showing it, it tells you to use the totally non-standard CLI pattern of `go help fmt` instead.
As others have mentioned, such tools exist. However, I believe they do more harm than help. Good --help output does not make for good --man output. In particular, while man pages are terse, good ones are more than just lists of command line options, and the part of them that are a list of command line options will usually have more detail than --help. The writing of documentation is a place where I often see programmers employ automation inappropriately.
For Rust programs there's https://docs.rs/clap_mangen/0.2.31/clap_mangen/ that will generate a man page out of the help. (I am sure most programming languages have something like this). However, that's only useful if you are compiling the program (maybe distros could patch Rust programs to generate the man page during the build process)
A more general tool would be pretty good. Either for distros to call during build, after building the program proper; or for users to call.
If users are calling directly, it would be useful to, by default, show the regular man page if it exists, and only if it doesn't exist generate and display one out of --help. Also give it the same flags as man etc. In this case, we could do alias man=better_man and pretend this problem is already solved (but it's still better if distros generate it, so that they can display the man page on the web, etc)
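A sketch of that wrapper idea (falling back to a pager for the --help text rather than generating troff, to keep it short; `man -w`, which prints the page's source path, is man-db behaviour):

```shell
# Show the real man page when one exists; otherwise fall back to
# whatever the command itself prints for --help.
better_man() {
  if man -w "$1" >/dev/null 2>&1; then
    man "$1"
  else
    "$1" --help 2>&1 | "${PAGER:-less}"
  fi
}
```

With `alias man=better_man` in your shell rc, the missing-man-page case at least stops being a dead end.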
It always struck me as a missed opportunity not to set a standard of `--man` for man page output from everything. GNU could have done that instead of their `info` nonsense.
It does expect a quite particular format for --help though, IIRC, if you want a good result. It predates the AI craze by a good 20 years, so it reliably either works or doesn't.
> Because installing a man page requires root and a writeable root fs for that matter
That's not true. The user-equivalent of the man pages directory on Linux and BSDs is `~/.local/share/man`. You can have your mandb index it too. You can achieve more complex manpath indexing setups using the `~/.manpath` file.
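To illustrate (the page name is made up; the directory layout mirrors the system-wide man tree):

```shell
# Install a personal man page without root: man-db on most distros
# searches ~/.local/share/man when it's on the manpath.
mkdir -p "$HOME/.local/share/man/man1"

cat > "$HOME/.local/share/man/man1/mytool.1" <<'EOF'
.TH MYTOOL 1
.SH NAME
mytool \- hypothetical example tool
EOF

# If your setup doesn't pick it up automatically:
#   export MANPATH="$HOME/.local/share/man:"
# (the trailing colon keeps the system manpath appended)
```

After that, `man mytool` should find the page like any other.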
I've never used Arch but I can really get the vibe here. Wikis (especially topical ones) are social media of sorts. There was a strong community around the #emacs IRC channel and emacswiki.org back in the day. About 100 people who knew each other quite well. And an Emacs bot that could read from the wiki (pre-modern RAG, I suppose) and answer questions.
I think with the Arch wiki it is even more than that. Before I switched to Arch back then, you would consult the Arch wiki from an unrelated distro, because it was (is) that good. Even the AUR helps you a lot: you can check the raw scripts to see how to compile stuff. I can't come up with a good example, but it felt like reading a vi-specific wiki that helped you with plugin development for Emacs.
AUR is particularly useful because Arch has really simple build scripts. They are bash with some particular function names that you need to define (like "build" and "check") and a few bits of package metadata in variables. Pretty intelligible even if you don't know the format beforehand.
Contrast that with Debian build scripts which I never managed to figure out. It's dozens of layers of helpers for "common cases" with lots of Makefile magic. Completely inscrutable if you aren't already a Debian package maintainer. Very compact though.
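For anyone who hasn't seen one, a minimal PKGBUILD really is just bash; this is a hypothetical skeleton, not any real package:

```shell
# Skeleton PKGBUILD: plain bash variables for metadata, plus the
# well-known function names that makepkg calls in order.
pkgname=hello-sketch
pkgver=1.0
pkgrel=1
pkgdesc="Hypothetical example package"
arch=('any')
license=('MIT')

build() {
  # Normally you'd compile real sources here; we fake a binary instead.
  printf '#!/bin/sh\necho hello\n' > hello
  chmod +x hello
}

package() {
  # makepkg sets $pkgdir; everything installed under it becomes the package.
  install -Dm755 hello "$pkgdir/usr/bin/hello"
}
```

Running `makepkg` in the directory does the rest. That the whole format fits on one screen is a big part of why the AUR has so many contributors.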
The quality of Arch wiki is the reason I could get into Linux. And that pretty much defined my career. So I, probably like many of us, owe a lot to the Arch Linux maintainers.
Their wiki is what sold me on Arch. I ended up there solving most of my problems on other distros, and if they can make such a fine wiki, I figured they could make a great OS (which they did).
Me too, I started with Debian but after a few weeks, I found myself being more on the Arch wiki than the Debian's one so I did the switch and never used any other distro.
I'm sorry to say this but Debian's documentation sucked a lot some years ago.
I came here to post a similar comment. I decided to use Arch because the documentation is amazing. And I wasn't disappointed. It's become my favorite distro.
The ArchWiki is indeed pretty good. I used to prefer the gentoo wiki back in the days, but I think the ArchWiki may be better at this point in time.
It's also interesting to see that many other Linux distributions fail to have any wiki at all, let alone one with high-quality content. This is especially frustrating because Google search has gotten so much worse that finding resources is hard. I have tried to explain this problem to different projects. In particular, ruby-based projects tend to have really low-quality documentation (with some exceptions, e.g. Jeremy Evans' projects tend to have good-quality documentation, but that is a minority if you look at all the ruby projects; even popular ones such as rack, ruby-wasm, or opal have horrible quality or no real documentation at all. And then rubyists wonder why they lost to python ...)
Arch wiki is indeed the most informative and comprehensive of all, so much so that users of any distro should find it useful too. Two other distro wikis with smaller, but useful content are Gentoo's and Debian's. Gentoo's speciality in my opinion, is that it contains some lower level information like the required kernel features, and difference between setups using systemd and other inits. Debian wiki contains some information that's related to standards, development, packaging and quality control. These make them useful, despite the availability of the Arch wiki.
Though not distro wikis, there's also a wealth of information on the Linux documentation site and the kernel newbies site. A lot of useful information is also present on Stack Overflow. I just wish that they hadn't shot themselves in the foot by alienating their contributors like this.
Other documentation sources, like the BSDs', are a bit more organized than Linux's, thanks to their strong emphasis on documentation. I wish Linux documentation were a more integrated single source instead of being scattered around like this; it would have required more effort and discipline regarding documentation. Nevertheless, I guess I should be grateful for these sources and the ability to leverage them. While I do rely on LLMs occasionally for solutions, I'm not very fond of them, because they're often misguided, ill-advised, and lack the tiny bits of insight and wisdom that often accompany human-generated documentation. It would be a disaster if the latter just faded into oblivion due to over-reliance on LLMs.
I used to be in awe of the Arch Wiki, then I found the way FreeBSD and OpenBSD do documentation. The Arch Wiki mostly has explanations of tricks and gotchas, which is fine for a wiki, but sometimes I would prefer that distros provide centralized documentation about the base systems they ship.
I think part of the ArchWiki’s strength is that it treats documentation as first-class infrastructure. There is a shared expectation that if you solve something nontrivial, you upstream it into the wiki in a reasonably neutral, upstream-oriented way. That creates compounding returns over time. It also helps that Arch has a relatively coherent user base with similar assumptions about init systems, packaging, and defaults.
Many other distributions fragment their knowledge across mailing lists, forum posts, bug trackers, and random blog entries. That worked when search engines were good at surfacing niche technical content. With current search quality, especially the SEO noise layer, the absence of a canonical, well-curated wiki becomes very visible.
A thanks from me too! I do not use Arch, but still use the wiki as a primary reference to understand various tools. Two recent examples were CUPS and SANE:
Genuinely, the wiki, and the AUR are the two killer features that keep me on Arch (not that I have any reasons to change). Arch is an incredibly polished distro, and is a real pleasure to use.
I just hope they have robust backups and disaster-recovery plans, as the Gentoo wiki once had a terrible data loss, and it was like the burning of the Library of Alexandria; I feel that sent the distro into a decline. I don't use Arch (I used Gentoo in those times), but these collaborative knowledge bases are too precious to be lost.
It's a wiki. Maybe you lose the edit history and stuff like that, but the actual content which is what matters should be very easy to recreate from those sources.
I'm still somewhat surprised at the implicit quality culture (concise, precise, extensive) of that wiki, because it seems there were no strictly enforced rules on how to create it. Similar-minded people recognized the quality and flocked to make it grow.
I don't even use Arch, but I agree that their Wiki is awesome. Unless my problem is super obscure (and sometimes even then), I can nearly always find an answer there. But the best part is that it seems to be never incorrect, unlike essentially every other result in Google.
I agree. It reads like a cookbook rather than a dictionary of tech specs. No spam getting in the way of getting things running and getting things right; if you need details, you can go to individual package docs from maintainers and project docs from devs - no need for misaligned redundancy. It is also pretty comprehensive, or at least I have not missed anything yet. And up to date. So, in my opinion, it's the best distro documentation I know of. And I like their community process too: the most trustworthy and reliable I have seen so far without a big corporation backing it up, except for maybe Debian. Let's keep the donations going; these good people deserve it!
NixOS. Having a config-defined system is a bit too different at first, but really nice when it comes to system reproducibility, and being able to roll back.
It made maintaining my laptop + workstations the "same" a breeze, although it took a bit to learn and settle into something that works for me. It seems today things are easier for newcomers, but Nix Flakes are still "experimental", and thus the documentation on things might seem confusing or misleading sometimes.
I really admire the maintainers' discipline with respects to grooming quality edits and fostering a welcoming environment. Incredibly patient folks in the interactions I've had.
I used Gentoo back in the day and the wiki was good, I even contributed to it at times. Eventually I switched distro (didn't want to spend all my time compiling), and a few years later I went to look at the wiki and it had become much worse.
Do you know what the story was there, what happened? Why was it deleted?
As a Debian user I find myself more often on the ArchWiki.
Indeed one of the top resources for power users and sysadmins.
The Debian wiki has improved (from a total mess to occasionally helpful content).
Sadly it's orders of magnitude away from the rigorous approach of the ArchWiki.
> Indeed one of the top resources for power users and sysadmins
Back when I was just starting out with Ubuntu, the Arch wiki was super helpful to gain better understanding of various things I came across. I think the wiki in general is useful to anyone who wants to understand things deeper, not just power users and sysadmins :)
I also use the ArchWiki as my personal software configuration journal. I know I'll be back to it when I have to re-install or re-configure something, so I make sure to record any new info I discover; it has worked out super well for me so far.
Not to worry: I try a lot of distros and still use the Arch wiki regardless. There are some things that differ between distros, but it's pretty generally applicable:)
I'll bite. How does a wiki targeted at users of a specific GNU/Linux distribution, a distribution which has made the express decision to be orientated towards technical users and not provide user-friendly tools for its configuration, exemplify how "Linux" (i.e. any GNU/Linux distribution) is broken on desktop?
I agree. Every time I visit the Arch wiki or forums, for that matter, it's typically due to a failure in the way the software works.
For example, instead of the OS noticing that the kernel did not support zstd, it would always generate a zstd-compressed initramfs image and require the user to manually configure a compression method their kernel supported. I don't understand why they thought it was a good idea to break my install over something that should be easy to handle automatically. One could say that there is value in the forum having information on how to fix my system, but this isn't something I should ever have seen in the first place.
It exemplifies how complicated a "combine software to make your own user space" system is.
I've been running Ubuntu this or that since 2007. Desktops, laptops, work computers, personal computers, servers. There has been some BS to deal with, but frankly with common hardware it's exactly the same as any other system. Desktop runtime with web browser support. Except that you can do whatever you want, if you choose.
The idea of Arch was that it's supposed to be hard mode, if that's even true anymore. Any non-tech person I've shown my computer to goes "oo, what is that?" I say, "it's a desktop environment; here's the web browser." And that's all there is to it.
Arch wiki is far better than most man pages. I've referred to Arch for my own non-Arch systems and when building Yocto systems. Most Arch info applies.
In the ancient days I used TLDP to learn about Linux stuff. Arch wiki is now the best doc. The actual shipped documentation on most Linux stuff is usually terrible.
GNU coreutils have man pages that are correct and list all the flags at least, but suffer from GNU jargonisms and usually a lack of any concise overview or example sections. Most man pages are a very short description of what the program does, and an alphabetic list of flags. For something as versatile and important as dd the description reads only "Copy a file, converting and formatting according to the operands" and there's not even one example of a full dd command given. Yes, you can figure it out from the man page, but it's like an 80s reference, not good documentation.
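For what it's worth, here is the kind of example the dd man page never gives - a plain file-to-file copy with an explicit block size. The filenames are invented, and the ISO-to-USB command in the comment is illustrative only (the device name is a placeholder):

```shell
# Create a 1 MiB test file, then copy it with dd using an explicit block size.
# (A more typical real-world use is writing an image to a USB stick, e.g.
#   dd if=image.iso of=/dev/sdX bs=4M status=progress
# where /dev/sdX is a placeholder for the actual device.)
head -c 1048576 /dev/urandom > input.bin
dd if=input.bin of=output.bin bs=64K status=none
cmp input.bin output.bin && echo "copy verified"
```

One worked example like this up front would make the reference list of flags far easier to approach.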
man pages for util-linux are my go-to example for bad documentation. Dense, require a lot of implicit knowledge of concepts, make references to 90s or 80s technology that are now neither relevant nor understandable to most users.
Plenty of other projects have typical documentation written by engineers for other engineers who already know this. man pipewire leaves you completely in the dark as to what the thing even does.
Credit to systemd, that documentation is actually comprehensive and useful.
Anecdotally, the Arch wiki expands on the vague man pages, often with examples for cases people actually use. Wiki pages are also much easier to modify, with the instant gratification of publishing changes immediately; to get something into a project's upstream man pages, you need to wait for it to trickle down.
Man pages were always intended to be concise reference material, not tutorials or full docs. More akin to commented header files or a program's --help output, before the latter became common.
(GNU info tried to be a more comprehensive CLI documentation system but never fully caught on.)
man pages got replaced by --help in many, many cases.
GNU info was an interesting experiment but it got replaced by online wikis.
The Gentoo wiki was (is in many ways) phenomenal, and I recommend anyone interested in the inner workings of Linux at least walk through a full install from scratch - you learn a lot even just copying the instructions into the terminal.
> Also: isn't the Arch wiki the new Gentoo wiki? Because that was the wiki early 2000s and, again, I've never used Gentoo!
It is, didn't Gentoo suffer some sort of data loss which made it lose its popularity?
Gentoo's source based approach was always destined to be less popular than a precompiled distro. Compile times & customization options select for a certain clientele.
All my machines still run Gentoo (I have used it for over 25 years). I just love the package manager. It has become much more low friction with the binary packages and gentoo-kernel(-bin). I regularly visit both the Gentoo and Arch documentation. They even cross reference each other and both are a great resource.
I think the reference was to Gentoo's wiki, which was indeed hacked and lost data iirc.
But yes, comparing distros themselves, Gentoo will not out compete streamlined and prepackaged distros in the broader adoption metrics.
The wikis themselves are largely distro agnostic and exceptionally useful for everyone on Linux though.
According to my experience, yes, it is. I have used Gentoo (using its wiki to install and configure), then after a few distro hops I was at Arch Linux and the wiki was a blessing and ever since I have found it (>10 years), I never needed anything else. Stuff they have on there applies specifically AND generally. Whereas Gentoo's wiki is usually specific IIRC.
> Also: isn't the Arch wiki the new Gentoo wiki? Because that was the wiki early 2000s and, again, I've never used Gentoo!
man came here to say the same.
used gentoo for all of 5 minutes in 2005 but the wiki was amazing and I referenced it repeatedly for other things.
generally heard the same about the arch wiki, too
I learned Linux by using Arch back in the days when pacman -Syu was almost certain to break something, and there was a good chance it would break something unique to your install. This was also back when most people were not connected to the internet 24/7 and many did not have internet at all. I updated when I went to the library, which was generally a weekly thing, but sometimes it would be a month or two, and the system breakage that resulted was rococo. Something was lost by Arch becoming stable and not breaking regularly; the breakage was what drove the wiki, and fixing all the things pacman broke taught you a great deal, and taught you quickly. Stability is not all that it is cracked up to be; it has its uses, but it is not the solution to everything.
>>Something was lost by Arch becoming stable and not breaking regularly
Only a Linux user would consider the instability of a Linux distro to be a good thing.
It is the sort of mentality required to get Linux to the place in computing it now holds. There's a decent chance you have Linux running on something you own even if you don't run it on your computer, and even if you don't, you do use the internet.
If your goal is to learn how it works this was great, a new challenge every day.
Perhaps we need a chaosmonkey Linux distro.
Also FreeBSD did this well recently, migrating libc and libsys in the wrong order so you have no kernel API. That was fun.
I was hit with that on a remote box in another country! Should have researched the upgrade more before I did it, but it is a personal thing and not work.
However, my IPMI motherboard and FreeBSD's integrated ZFS boot environments might be considered cheating...
If you choose not to upgrade, it is stable. There is no QA department for Linux (or windows, they were let go around 2015) so someone has to endure the instability if there is to be any progress. We should all thank those who run nascent software so those who run stable distros can have stability.
You don't become a mechanic without fixing broken down cars. So in that sense, the shittier the car, the better.
My Linux story is similar. In retrospect I learned it on hard mode, because Gentoo was the first distro I used (as in really used). And Gentoo, especially back around 2004 or so, really gave you fully automatic, armour-piercing, double-barreled footguns.
Gentoo foot guns were (are) the best!
That you could always just boot from the CD and start again was nice. I think I reinstalled 4-5 times the "first time" before I got it where I wanted to be.
I've contributed 32 edits (1 new page) in the past 10 years, so despite being stable, there are still many things to add and fix!
Sadly, the edit volume will likely drop as LLMs are now the preferred source for technical Linux info/everything...
At the same time, I suspect resources like the Arch Wiki are largely responsible for how good AI is at fixing this kind of stuff. So I'm hoping that somehow people realize this and can continue contributing good human-written content (in general).
> So I'm hoping that somehow people realize this and can continue contributing good human-written content (in general).
AI walled-gardens break the feedback loop: authors seeing view-counts and seeing "[Solved] thank you!" messages helps morale.
Definitely, being unpaid LLM trainer for big corporations while nobody actually reads your work is not very encouraging. I wonder what the future will bring.
I do think we will, at some point, face a knowledge crisis because nobody will be willing to upload the new knowledge to the internet.
Then the LLM companies will notice, and they’ll start to create their own updated private training data.
But that may be a new centralization of knowledge which was already the case before the internet. I wonder if we are going to some sort of equilibrium between LLMs and the web or if we are going towards some sort of centralization / decentralization cycles.
I also have some hope that LLMs will annihilate the commercial web of "generic" content, and that may bring back the old web, where the point was the human behind the content (be it a web page or a discussion). But that's what I'd like, not a forecast.
I wouldn't be surprised if LLM companies end up sponsoring certain platforms / news sites, in exchange for being able to use their content of course.
The problem with LLMs is that a single token (or even a single book) isn't really worth that much. It's not like human writing, where we'll pay far more for "Harry Potter" and "The Art of Computer Programming" than for some romance trash with three reads on Kindle.
This is perhaps true from the "language model" point of view, but surely from the "knowledge" point of view an LLM is prioritising a few "correct" data sources?
I wonder about this a lot when I ask LLMs niche technical questions. Often there is only one canonical source of truth. Surely it's somehow internally prioritising the official documentation? Or is it querying the documentation in the background and inserting it into the context window?
LLM companies already do this. Both Reddit and Stack Overflow turned to shit (but much more profitable shit) when they sold their archives to the AI companies for lots of money.
I kind of fear the same. At the same time I wonder if structured information will gain usefulness. Something like man pages are already a great resource for humans, but at same time could be used for autocompletion and for LLMs. Maybe not in the current format but in the same vein.
But longer-form tutorials, or even books with background material, might suffer more. I wonder how big the market for nice books on IT topics will be in the future. A wiki is probably in the worst place: it cannot be updated with a merge request the way man pages could be, and you do not get the same reward as publishing a book.
> nobody will be willing to upload the new knowledge to the internet
I think there will be differences based on how centralized the repository of knowledge is. Even if textbooks and wikis largely die out, I imagine individuals such as myself will continue to keep brief topic specific "cookbook" style collections for purely personal benefit. There's no reason to be averse to publishing such things to github or the like and LLMs are fantastic at indexing and integrating disparate data sources.
Historically sorting through 10k different personal diaries for relevant entries would have been prohibitive but it seems to me that is no longer the case.
Absolutely. Even though I don’t use arch (btw), the wiki is still a fantastic configuration reference for many packages: systemd, acpi, sensors, networkmanager I’ve used it for fairly recently.
You see it referenced everywhere as a fantastic documentation source. I’d love seeing that if I were a contributor
Also if it's not correct someone else will edit it. But with the LLM it's just the LLM and you, and if you correct it is not like it will automatically be updated for all the users.
I just installed Arch (EndeavourOS) and LLM did not help. The problems were new and the LLM’s answers were out-of-date. I wasted about 5 hours. Arch’s wiki and EndeavourOS’s forums were much better. YMMV
They may be preferred, but in a lot of cases they’re pretty terrible.
I had a bit of a heated debate with ChatGPT about the best way to restore a broken strange mdadm setup. It was very confidently wrong, and battled its point until I posted terminal output.
Sometimes I feel it’s learnt from the more belligerent side of OSS maintenance!
Why would you bother arguing with an LLM? If you know the answer, just walk away and have a better day. It is not like it will learn from your interaction.
The Gell-Mann effect? If you can't trust an LLM to assist with troubleshooting in a domain you are very familiar with (mdadm), then why trust it in one you know less well, such as ZFS or k8s?
Maybe GP knew the proposed solution couldn't have worked, without knowing the actual solution?
Arguing with an LLM is silly because you’re dealing with two adversarial effects at once:
- As the context window grows, the LLM will become less intelligent [1]
- Once your conversation takes a bad turn, you have effectively "poisoned" the context window, and are asking an algorithm to predict the likely continuation of text that is itself incorrect [2]. (It emulating the "belligerent side of OSS maintenance" is probably quite true!)
If you detect or suspect misunderstanding from an LLM, it is almost always best to remove the inaccuracies and try again. (You could, for example, ask your question again in a new chat, but include your terminal output + clarifications to get ahead of the misunderstanding, similar to how you might ask a fresh Stack Overflow question).
(It’s also a lot less fun to argue with an LLM, because there’s no audience like there is in the comments section with which to validate your rhetorical superiority!)
1 - https://news.ycombinator.com/item?id=44564248 2 - https://news.ycombinator.com/item?id=43991256
> It was very confidently wrong, and battled its point
The "good" news is a lot of newer LLMs are grovelling, obsequious yes-men.
I think it all comes down to curiosity, and I dare think that that's one of the main reasons why someone will be using Arch instead of the plethora of other distros.
Now, granted, I don't usually ask an LLM for help whenever I have an issue, so I may be missing something, but to me, the workflow is "I have an issue. What do I do?", and you get an answer: "do this". Maybe if you just want stuff to work well enough out of the box while minimizing time doing research, you'll just pick something other than Arch in the first place and be on your merry way.
For me, typically, I just want to fix an annoyance rather than a showstopping problem. And, for that, the Arch Wiki has a tremendous value. I'll look up the subject, and then go read the related pages. This will more often than not open my eyes to different possibilities I hadn't thought about, sometimes even for unrelated things.
As an example, I was looking something up about my mouse the other day and ended up reading about thermal management on my new-to-me ThinkPad (never had one before).
Depends on how AI-pilled you are. I set Claude loose on my terminal and just have it fix shit for me. My python versions got all tuckered and it did it instead of me having to fuck around with that noise.
I'm not there yet. Not on my work system anyway.
Seen too many batshit answers from chatgpt when I know the answer but don't remember the exact command.
I learned Linux on Debian first. The X server (x11, or whatever its old name was) was not working, so I had to use the command line. I had a short Debian handbook and worked through it slowly. Before that I had SUSE and a SUSE handbook with a GUI, which was totally useless. I then went on to use Knoppix, Kanotix, sidux, GoboLinux, and eventually ended up with Slackware. These days I tend to use Manjaro, despite the drawback that is systemd. Manjaro kind of feels like a mix between Arch and Slackware. (I compile from source, so for the most part I only really need a base, excluding a few things; I tend to disable most systemd unit files as I don't really need anything systemd offers. Sadly, distributions such as Slackware have kind of died - they are not dead, but too slow in updates, with no stable releases in years, which is the hallmark of deadness.)
Slackware only does long term stable releases but Slackware current is a rolling release that does not really feel like a rolling release because of how Slackware provides a full and complete system as the base system. I avoided Slackware current for years because I did not want to deal with the hassle of rolling release, but it is almost identical in experience to using the release.
> The X server (x11, or whatever its old name was)
It was XFree86 until around mid 00s after which the X.org fork took over.
I actually got a lot of Linux knowledge from the SUSE handbooks, back when I was still buying a boxed copy in the book store because of my slow internet connection in the early 2000s. For Linux content nowadays, the Arch wiki is still one of my most used resources, although I have not used Arch in years.
> Arch becoming stable and not breaking regularly
I believe this to be the entire ecosystem, not just Arch. It's been a long while since something like moving to 64bit happened. Or swapping out init systems.
Other good examples: Linuxthreads to NTPL (for providing pthreads), XFree86 to Xorg.
I was using Gentoo at the time, which meant recompiling the world (in the first case) or everything GUI (in the second case). With a strict order of operations to not brick your system. Back then, before Arch existed (or at least before it was well known), the Gentoo wiki was known to be a really good resource. At some point it languished and the Arch wiki became the goto.
(I haven't used Gentoo in well over a decade at this point, but the Arch wiki is useful regardless of when I'm using Arch at home or when I'm using other distros at work.)
I'm on Gentoo without the precompiled packages, except for very large applications. Gentoo wiki is not a match for Arch wiki for its sheer content and organization. But Gentoo wiki contains some stuff that Arch wiki doesn't. For example, what kernel features are needed for a certain application and functionality, and how a setup differs between systemd and other inits. I find both wikis quite informative in combination.
Arch was young in those days but I think fairly well known? we were quite vocal, opinionated and interjecting our views everywhere by the time of the Xfree86/Xorg switch. Perhaps it is just my view from being a part of it but I remember encountering the Arch reputation everywhere I went. Or maybe it is just the nostalgia influencing me.
Could be. I don't remember Arch being on my radar at that point though. But it wasn't long after I switched from Gentoo to Arch (and then Debian for a decade due to lack of stability before going back to Arch 7 years ago or so).
A few years before the Xorg thing there was also the 2.4 to 2.6 kernel switchover. I think I maybe was using Slackware at that point, and I remember building my own kernel to try out the new shiny thing. You also had to update some userspace packages if I remember correctly: things like modprobe/insmod at the very least.
2.6 was also the switch from OSS to ALSA, which caused some fun; ALSA really was not ready for prime time.
Oh yeah, you just unlocked a forgotten memory. I was actually lucky there, I had a SoundBlaster Live 5.1 which worked just fine on ALSA (hardware mixing even worked out of the box). But I remember lots of other people complaining on IRC about it.
Most distros were stable well before Arch because Arch worked out most of the bugs for them and documented them on their wiki. Arch and other bleeding edge distros are still a big part of the stability of linux even if they don't break all that often anymore, they find a lot of the edge cases before they are issue for the big distros. In 2005 it was not difficult to have a stable linux install, you may have had to hunt a bit for the right hardware and it may have taken awhile to get things working but once things were working they tended to be stable. I can only think of one time Slackware broke things for me since I started using it around 2005, it taking on PulseAudio caused me some headaches but I use Slackware for audio work and am not their target so this is to be expected. Crux was rock solid for me into the 10s, nearly a decade of use and not even a hiccup.
Arch linux will still happily blow itself up if you skip updates for too long.
It's to the point where if I see 'archlinux-keyring' in my system update, I immediately abort and run through the manual process of updating keys. That has prevented any Arch nuclear disasters for the last couple of years.
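The manual key-update dance being referred to is usually some variant of the following - this is an assumption based on common practice on Arch systems, not something the commenter spelled out, and it obviously requires a live Arch install with root:

```shell
# Commonly suggested recovery sequence for a stale archlinux-keyring:
# update the keyring package first, then do the full upgrade.
pacman -Sy archlinux-keyring
pacman -Su

# If package signatures are still rejected, reinitialize and
# repopulate the local pacman keyring from the shipped keys.
pacman-key --init
pacman-key --populate archlinux
```

The key point is ordering: upgrading everything in one shot can fail signature checks against keys the old keyring has never seen.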
> back in the days when pacman -Syu was almost certain to break something and there was a good chance it would break something unique to your install
This was still the case when I switched to arch in like 2016 lol
Not to mention that they broke EAC only a few years ago.
About a year ago, when I installed Arch (my first Linux distro), most things were great. However, while poking around in pacman, I noticed a bunch of Python-related packages (even though I hadn't installed them myself). Since I needed some disk space, I figured deleting them wouldn't hurt. Unfortunately, I couldn't boot again. I guess the Python-related ones were dependencies of Hyprland and Quickshell.
This brings back memories, same here!
I even bookmarked a page to remember how to rebuild the kernel, because I could always expect it to break.
I didn't really get into custom kernels until I started using Crux. A few years after I started using Arch, I got sick of the rolling release and Arch's constant breakages, so I started looking into alternatives; that brought me to Crux (which Arch was based on) and Slackware (which was philosophically the opposite of Arch without sacrificing the base understanding of the OS). Crux would probably have won out over Slackware if it were not for the switch to 64-bit: when confronted with having to recompile everything, I went with the path of least resistance. Crux is what taught me to compile a kernel; in my Arch days I was lucky when it came to hardware and only had to change a few things in the config, which the Arch wiki guided me through.
Crux is a great distro for anyone OK with a source distro, and I think it might be the best source distro: unlike the more common ones, it does not do most of the work for you. I also love its BSD influence, which came in very handy when I started to explore the BSDs, and FreeBSD in particular, which is my fallback for when Patrick dies or steps back. Crux deserves more attention.
Arch always turned me off with its rolling release schedule, and I wasn't that impressed with pacman, to be honest. I used to love Slack, but they lost their way trying to compete with Ubuntu and the like. I remember thinking how ridiculous it was for mplayer to have samba as a dependency, and the community saying a full install was the intended way to run Slack. I ran it as a minimalist without issues until they started wanting to compete in the desktop space.
The best successor I've found is Alpine. It's minimal and secure by design, has an excellent package manager (I much prefer apk to pacman or xbps, or apt and rpm for that matter), has stable and LTS releases while letting people who want to be rolling release do so by running edge. People think it's only for containers but it has every desktop package anyone would need, all up to date and well maintained. Their wiki isn't at Arch's level, but it's pretty damn good in its own right.
I like Alpine because it tries to be relatively simple (OpenRC, BusyBox, …). My only issue is programs relying on glibc for no reason (although you can use Flatpak). But I'm using OpenBSD now.
I was never a fan of openbsd, a lot of the security claims are misplaced, bordering on theater. glibc support isn't so bad in Alpine, there are compatibility packages that work for most things if there isn't a flatpak.
I had somebody's PGP key break something yesterday :) learned about archlinux-keyring.
I started using Arch in 2016 and it was stable back then. Are you describing an earlier era?
Not OP, but I used Arch for a while in 2011, and at some point an update moved /bin to /usr/bin or something like that and gave me an unbootable system. This was a massive inconvenience, it took me many hours to un-hose that system, and I switched to Ubuntu. Then Ubuntu became terrible with snaps and other user-hostile software, so I switched to PopOS; then I got frustrated with out-of-date software and Cosmic being incomplete, and am now on Arch with KDE.
Back then I used Arch because I thought it would be cool and it's what Linux wizards use. Now Arch has gotten older, I've gotten older, and now I'm using Arch again because I've become (more of a) Linux wizard.
The silly move from /bin to /usr/bin broke lots of distros. It probably would have worked out if they'd had `cp --reflink=auto --update` to help ease migrating files from /bin to /usr/bin, then just symlinked /bin to /usr/bin. However, any setup where /usr is a distinct filesystem from / would then hard-require an initramfs to set that up before handoff.
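The merge being described can be sketched in a throwaway temp dir rather than a real root (the paths and file names below are made up for the demo):

```shell
# Sandbox demo of the /bin -> /usr/bin merge idea, not a real usrmerge tool.
set -eu
root=$(mktemp -d)
mkdir -p "$root/bin" "$root/usr/bin"
printf '#!/bin/sh\necho old\n' > "$root/bin/oldtool"

# 1. copy /bin's contents into /usr/bin; --update skips anything already
#    newer there, --reflink=auto avoids duplicating data on CoW filesystems
cp -a --reflink=auto --update "$root/bin/." "$root/usr/bin/"

# 2. swap /bin for a relative symlink so legacy paths keep resolving
rm -rf "$root/bin"
ln -s usr/bin "$root/bin"

ls -l "$root/bin/oldtool"   # still reachable via the old path
```

A real migration additionally has to handle file conflicts, hardlinks, and packages that ship files in both places, which is where distros actually got burned.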
The `python` (meaning Python 2) transition was even sillier though. Breaking changes to the API, and they wanted to (and did!) force re-pointing the command to python3? That's still actively breaking stuff today in places forsaken enough to have to support Python 2 legacy systems.
Arch + KDE is pretty sweet. It looks gorgeous out of the box, and gives you a system that mostly just works but is still everything you love about Arch
Also not OP; I gave up Arch around 2011 as well, after I wasn't able to mount a USB pendrive at the uni while rushing somewhere. This was embarrassing and actually a serious issue; it took some time to fix upstream, and finding a workaround was also annoying. This is when I gave up on it and never looked back, but I did, indeed, learn all about Linux internals from dailying Arch for 3 or so years.
> This was also back in the days when most were not connected to the internet 24/7 and many did not have internet
That does sound significantly longer ago than 2016 ;)
The switch to systemd is the last time I FUBARed my system. 2012 it looks like?? I simply did not even remotely understand what I was doing.
Systemd was the end of Arch for me. My rarely used Arch install was massively broken by its first update in ~6 months, largely because of systemd. With some work I got things sorted out and working again, only to fall into a cycle of breaking the system as I discovered that systemd was very different from what I was used to and did not like me treating it like SysV. Going 6 months without updates would most likely have caused issues with Arch regardless of how stable it had gotten, even without the systemd change, but my subsequent and repeated breaking of the system made me realize I no longer had any interest in learning new system stuff. I just wanted a system that would stay out of my way and let me do the things I wanted to use it for.
I do miss Arch, but there is no way I am going to keep up with updates; I will do them when I discover I can't install anything new, and then things will break because it has been so long since my last update. Slackware is far more suited to my nature, but it will never be Arch.
> I simply did not even remotely understand what I was doing
Why do I miss the stupid unconscious bravery of those days :)
This would be back in the 00s. I would guess that Arch got stable around 2010? I was using Slackware as my primary system by then so don't know exactly when it happened, someone else can probably fill in the details. I started using Arch when it was quite new, within the first year or two.
> Something was lost by Arch becoming stable and not breaking regularly
...a smooth sea never made a skilled sailor
The Arch wiki has rapidly become my go-to source for every time I need a real answer... and honestly it should just become my default for everything Linux. It's astoundingly high quality, some of the best content out there whether or not you're using Arch.
So +1000, I love their work, and all the contributors! It's so, so good, and greatly appreciated.
A lot of pages are very bare or outdated; it absolutely should not be the default for anything besides Arch.
I also find myself using https://man.archlinux.org/ a lot. It's much more readable/user-friendly than https://man7.org plus it contains man-pages from their `extra` repo which contains a lot of popular oss tooling.
unfortunately there's a trend lately where many newer cli tools don't have a man page. they put up a --help and think it suffices
even though there are tools to automatically generate man pages these days
I suspect a significant amount of that is due to requiring more than one file (the binary itself) and having to learn about multiple packaging and distribution systems. That's a gigantic wall to climb compared to "put a binary on github"
A lot of developers today want to play the maintainer role as well, whereas the idiomatic way is to publish the source and docs about building it, and let other people take over distribution. Software like Emacs and Vim just publishes a tarball.
I should write a tool that converts help output to troff. Even if the result wouldn't be as detailed and nice to read as a good man page, it would save me the frustration of having to stab at "will I get usage docs with a -h, a --help, a -help, or running it with no args at all?"
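A rough sketch of such a converter, assuming the command answers `--help` (the `help2troff` name is made up, and a real tool like help2man is far more thorough):

```shell
# Toy help-to-troff converter (sketch, not a replacement for help2man).
help2troff() {
  cmd=$1
  # a fuller version would also try -h, -help, and no-args fallbacks
  text=$("$cmd" --help 2>&1) || true
  printf '.TH "%s" "1" "" "" "generated from --help"\n' "$cmd"
  printf '.SH USAGE\n.nf\n'
  # escape backslashes and guard line starts so troff doesn't parse them
  printf '%s\n' "$text" | sed -e 's/\\/\\e/g' -e 's/^/\\\&/'
  printf '.fi\n'
}

help2troff ls > ls.1   # view the result with: man -l ls.1
```

The output is just the raw usage text wrapped in a no-fill block, which is exactly the "less detailed than a hand-written page" trade-off mentioned above.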
This reminds me of the Go CLI being pretty anal about this: you type `go fmt --help`, and it recognises you want help, but instead of showing the help, it tells you to use the totally non-standard CLI pattern of `go help fmt` instead.
Reminds me of the rage of doing `man gnutool` and getting something complaining about how GNU info was where to go.
c-x alt-meta-shift eat-flaming-death
As others have mentioned, such tools exist. However, I believe they do more harm than help. Good --help output does not make for good --man output. In particular, while man pages are terse, good ones are more than just lists of command line options, and the part of them that are a list of command line options will usually have more detail than --help. The writing of documentation is a place where I often see programmers employ automation inappropriately.
For Rust programs there's https://docs.rs/clap_mangen/0.2.31/clap_mangen/ that will generate a man page out of the help. (I am sure most programming languages have something like this). However, that's only useful if you are compiling the program (maybe distros could patch Rust programs to generate the man page during the build process)
A more general tool would be pretty good. Either for distros to call during build, after building the program proper; or for users to call.
If users are calling directly, it would be useful to, by default, show the regular man page if it exists, and only if it doesn't exist generate and display one out of --help. Also give it the same flags as man etc. In this case, we could do alias man=better_man and pretend this problem is already solved (but it's still better if distros generate it, so that they can display the man page on the web, etc)
It always struck me as a missed opportunity not to set a standard of `--man` for man page output from everything. GNU could have done that instead of their `info` nonsense.
This already exists: https://man.archlinux.org/man/extra/help2man/help2man.1.en
It does expect a quite particular format for --help though, IIRC, if you want a good result. It predates the AI craze by a good 20 years, so it reliably either works or doesn't.
Then again, the built-in help cannot be separated from the binary, so it can't be missing at run time.
I agree. If it can be launched from the command line, it deserves a man page.
Because installing a man page requires root and a writeable root fs for that matter
> Because installing a man page requires root and a writeable root fs for that matter
That's not true. The user-equivalent of the man pages directory on Linux and BSDs is `~/.local/share/man`. You can have your mandb index it too. You can achieve more complex manpath indexing setups using the `~/.manpath` file.
It doesn't need root if you set MANPATH.
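A tiny sketch of the per-user install described above (`hellosketch` is a made-up page name used purely for illustration):

```shell
# Per-user man page install: no root, no writeable root fs needed.
mkdir -p "$HOME/.local/share/man/man1"
printf '.TH HELLOSKETCH 1\n.SH NAME\nhellosketch \\- toy page for illustration\n' \
  > "$HOME/.local/share/man/man1/hellosketch.1"
# man-db on most distros already searches ~/.local/share/man; if yours
# doesn't: export MANPATH="$HOME/.local/share/man:$MANPATH"
# after which `man hellosketch` shows the page
```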
That's great! I didn't know that Arch had online manpages too. I frequently use https://manpages.debian.org/ for similar reasons.
I've never used Arch but I can really get the vibe here. Wikis (especially topical ones) are social media of sorts. There was a strong community around the #emacs IRC channel and emacswiki.org back in the day: about 100 people who knew each other quite well, and an Emacs bot that could read from the wiki (pre-modern RAG, I suppose) and answer questions.
I think with the Arch wiki it is even more than that. Before I switched to Arch back then, you would consult the Arch wiki even on an unrelated distro, because it was (is) that good. Even the AUR repository helps you a lot, by letting you check the raw scripts to see how to compile stuff. I can't make a good comparison, but it felt like reading a vi-specific wiki that helped you with plugin development for Emacs.
AUR is particularly useful because Arch has really simple build scripts. They are bash with some particular function names that you need to define (like "build" and "check") and a few bits of package metadata in variables. Pretty intelligible even if you don't know the format beforehand.
Contrast that with Debian build scripts which I never managed to figure out. It's dozens of layers of helpers for "common cases" with lots of Makefile magic. Completely inscrutable if you aren't already a Debian package maintainer. Very compact though.
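For illustration, a minimal PKGBUILD along the lines described above (the package name, source tarball, and checksums are made up; real PKGBUILDs carry more metadata):

```shell
# Hypothetical PKGBUILD sketch for an imaginary package.
pkgname=hello-sketch
pkgver=1.0
pkgrel=1
pkgdesc="Illustrative example, not a real package"
arch=('x86_64')
license=('MIT')
source=("$pkgname-$pkgver.tar.gz")
sha256sums=('SKIP')

build() {
  cd "$pkgname-$pkgver"
  make
}

check() {
  cd "$pkgname-$pkgver"
  make test
}

package() {
  cd "$pkgname-$pkgver"
  make DESTDIR="$pkgdir" install
}
```

It really is just bash: metadata in variables, plus `build`, `check`, and `package` functions that makepkg calls in order.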
Arch wiki is something special. It is astounding how diverse and detailed (and yet concise) it is.
The quality of Arch wiki is the reason I could get into Linux. And that pretty much defined my career. So I, probably like many of us, owe a lot to the Arch Linux maintainers.
Their wiki is what sold me on Arch. I ended up there solving most of my problems on other distros, and if they can make such a fine wiki, I figured they could make a great OS (which they did).
Yep. Wiki and AUR completeness are hard to pass by.
Me too. I started with Debian, but after a few weeks I found myself more on the Arch wiki than Debian's, so I made the switch and never used any other distro.
I'm sorry to say this but Debian's documentation sucked a lot some years ago.
I was definitely the same way at one point but it's worth mentioning that the wiki remains a valuable resource even if you aren't using Arch itself.
e.g., NixOS just links to the archwiki page here for help with systemd timers: https://nixos.wiki/wiki/Systemd/Timers
I came here to post a similar comment. I decided to use Arch because the documentation is amazing. And I wasn't disappointed. It's become my favorite distro.
The ArchWiki is indeed pretty good. I used to prefer the Gentoo wiki back in the day, but I think the ArchWiki may be better at this point in time.
It's also interesting to see that many other Linux distributions fail to have any wiki at all, let alone one with high-quality content. This is especially frustrating because Google search has gotten so much worse that finding resources is hard. I have tried to explain this problem to different projects; in particular, Ruby-based projects tend to have really low-quality documentation (with some exceptions, e.g. Jeremy Evans's projects tend to have good documentation, but that is a minority if you look at all the Ruby projects; even popular ones such as rack, ruby-wasm or Opal have horrible documentation, or not even any real documentation at all. And then rubyists wonder why they lost to Python ...)
The Arch wiki is indeed the most informative and comprehensive of all, so much so that users of any distro should find it useful. Two other distro wikis with smaller but useful content are Gentoo's and Debian's. Gentoo's speciality, in my opinion, is that it contains some lower-level information, like the required kernel features and the differences between setups using systemd and other inits. The Debian wiki contains some information related to standards, development, packaging and quality control. These make them useful, despite the availability of the Arch wiki.
Though not distro wikis, there's also a wealth of information on the Linux documentation site and the kernel newbies site. A lot of useful information is also present on Stack Overflow. I just wish that they hadn't shot themselves in the foot by alienating their contributors like this.
Other documentation sources, like the BSDs', are a bit more organized than Linux's, thanks to their strong emphasis on documentation. I wish Linux documentation were a more integrated single source, instead of being scattered around like this; it would have required more effort and discipline regarding documentation. Nevertheless, I guess I should be grateful for these sources and the ability to leverage them. While I do rely on LLMs occasionally for solutions, I'm not very fond of them because they're often misguided, ill-advised and lack the tiny bits of insight and wisdom that often accompany human-generated documentation. It would be a disaster if the latter just faded into oblivion due to over-reliance on LLMs.
I used to be in awe of the Arch Wiki; then I found the way FreeBSD and OpenBSD do documentation. I found that the Arch Wiki mostly has tricks and gotcha explanations, which is fine for a wiki. But sometimes I would prefer that distros provide centralized documentation about the base systems they ship.
I think part of the ArchWiki’s strength is that it treats documentation as first-class infrastructure. There is a shared expectation that if you solve something nontrivial, you upstream it into the wiki in a reasonably neutral, upstream-oriented way. That creates compounding returns over time. It also helps that Arch has a relatively coherent user base with similar assumptions about init systems, packaging, and defaults.
Many other distributions fragment their knowledge across mailing lists, forum posts, bug trackers, and random blog entries. That worked when search engines were good at surfacing niche technical content. With current search quality, especially the SEO noise layer, the absence of a canonical, well-curated wiki becomes very visible.
A thanks from me too! I do not use Arch, but still use the wiki as a primary reference to understand various tools. Two recent examples were CUPS and SANE:
https://wiki.archlinux.org/title/CUPS
https://wiki.archlinux.org/title/SANE
I don't use Arch either, but both of those links show as visited in my browser!
What a concentration of knowledge. It's not always my first click for a given problem, but it's often my last.
Genuinely, the wiki, and the AUR are the two killer features that keep me on Arch (not that I have any reasons to change). Arch is an incredibly polished distro, and is a real pleasure to use.
I just hope they have robust backups and disaster-recovery plans. The Gentoo Wiki once had a terrible data loss, and it was like the burning of the Library of Alexandria; I feel that sent the distro into a decline. I don't use Arch (I used Gentoo in those times), but these collaborative knowledge bases are too precious to be lost.
https://news.ycombinator.com/item?id=44900319
I download the Kiwix copy of the Arch wiki every year or two, as my offline source of linux knowledge in case I find myself offline for some reason.
Wouldn't everything be on the internet archive? And common crawl?
Being on the internet archive and being able to pick up from a restored backup are two very different things
It's a wiki. Maybe you lose the edit history and stuff like that, but the actual content which is what matters should be very easy to recreate from those sources.
I am not exaggerating even slightly when I say that the Arch wiki taught me how to use Linux.
I've been dabbling in Linux since 2007, but I never really felt productive in it until I discovered Arch. And its outstanding wiki.
I'm still somehow surprised at the implicit quality culture (concise, precise, extensive) of that wiki, because it seems there were no strictly enforced rules on how to create it. Similar-minded people recognized the quality and flocked to make it grow.
I can see that too, this might be a reason for its success, all the articles are very straightforward and full of paths you may encounter
I don't even use Arch, but I agree that their Wiki is awesome. Unless my problem is super obscure (and sometimes even then), I can nearly always find an answer there. But the best part is that it seems to be never incorrect, unlike essentially every other result in Google.
I agree. It reads like a cookbook rather than a dictionary of tech specs. No spam getting in the way of getting things running and getting things right; if you need details you can go to individual package docs from maintainers and project docs from devs, with no need for misaligned redundancy. It is also pretty comprehensive, or at least I have not missed anything yet. And up to date. So, in my opinion, it is the best distro documentation I know of. And I like their community process too: the most trustworthy and reliable I have seen so far without a big corporation backing it up, except for maybe Debian's. Let's keep the donations going; these good people deserve it!
I don't use Arch anymore, yet I still find myself reading their wiki from time to time. It's a phenomenal resource.
What are you using now?
NixOS. Having a config-defined system is a bit too different at first, but really nice when it comes to system reproducibility, and being able to roll back.
It made maintaining my laptop + workstations the "same" a breeze, although it took a bit to learn and settle into something that works for me. It seems today things are easier for newcomers, but Nix Flakes are still "experimental", and thus the documentation on things might seem confusing or misleading sometimes.
Nix Flakes are around for years and still experimental?
Yup. I wish they graduate soon, they have been great for a long while. I'm not sure what's blocking them today.
I just use them like they are stable and will deal with any breakages later. I doubt any breakage will be too bad and that it'd even affect me.
The word experimental has no meaning here anymore. I forgot the last time that someone brought that up.
I really admire the maintainers' discipline with respect to grooming quality edits and fostering a welcoming environment. Incredibly patient folks in the interactions I've had.
Arch wiki is one of the reasons I stick to Arch as my daily driver
Aside, but it's pretty neat that the author has been semi-regularly posting on their blog for over 20 years.
Neat? I would use other words. Fraudsters rely on marketing. https://danielpocock.com/en/matthias-fsfe-analogous-identity...
That guy has a history of harassment in Debian, and is not a credible source.
The Gentoo forums & wiki were initially the go-to place, until the wiki was deleted.
I used Gentoo back in the day and the wiki was good, I even contributed to it at times. Eventually I switched distro (didn't want to spend all my time compiling), and a few years later I went to look at the wiki and it had become much worse.
Do you know what the story was there, what happened? Why was it deleted?
It was? Did they bring it back because I do see stuff on their forums..
I do prefer gentoo wiki over arch wiki from time to time as things feel less cluttered to me but that's just my opinion.
As a Debian user I find myself more in the Archwiki. Indeed one of the top resources for power users and sysadmins.
The Debian wiki has improved (from a total mess to occasionally helpful content). Sadly it's orders of magnitude away from the rigorous approach of the Arch wiki.
> Indeed one of the top resources for power users and sysadmins
Back when I was just starting out with Ubuntu, the Arch wiki was super helpful to gain better understanding of various things I came across. I think the wiki in general is useful to anyone who wants to understand things deeper, not just power users and sysadmins :)
I've had a Mac rather than a Linux machine at home for the last five or so years. Before that the arch wiki saved the day for me so many times.
Reading this has me looking for a junker laptop on eBay.
It’s one of the best resources out there which helped me learn linux
I also use the ArchWiki as my personal software-configuration journal. I know I'll be back to it when I have to re-install or re-configure something, so I make sure to record any new info I discover; it has worked out super well for me so far.
I switched from Fedora in 2013 because the ArchWiki was answering all of my questions. It’s very, very good.
aaaaand you can download it: https://bbs.archlinux.org/viewtopic.php?id=94201
Also available as a .zim file for Kiwix (which also includes packages of offline wikipedia <100GB). https://browse.library.kiwix.org/viewer#archlinux_en_all_max...
Me too. I tried various distros before; the Arch wiki is the best thing. I learned so much Linux knowledge from it.
Not to worry: I try a lot of distros and still use the Arch wiki regardless. There are some things that differ between distros, but it's pretty generally applicable:)
I for one, find that blue Arch Linux sweater he's wearing to be 10/10. Super cute.
The ArchWiki is great. Lots of useful details for any Linux user.
This wiki exemplifies how broken Linux (on desktop) is and it's weird Linux fans ignore this fact.
I'll bite. How does a wiki targeted at users of a specific GNU/Linux distribution, a distribution which has made the express decision to be orientated towards technical users and not provide user-friendly tools for its configuration, exemplify how "Linux" (i.e. any GNU/Linux distribution) is broken on desktop?
(I use Arch btw)
I agree. Every time I visit the Arch wiki, or the forums for that matter, it's typically due to a failure in the way the software works.
For example, instead of the OS noticing that zstd was not supported, it would always build a zstd-compressed initramfs image and require the user to manually configure a compression method their kernel supported. I don't understand why they thought it was a good idea to break my install over something that should be easy to handle automatically. One could say there is value in the forum having information on how to fix my system, but this isn't something I should ever have seen in the first place.
https://archlinux.org/news/moving-to-zstandard-images-by-def...
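For reference, the manual fix in that situation amounts to a one-line config change (excerpt; assumes the kernel has gzip decompression built in, which nearly all do):

```shell
# /etc/mkinitcpio.conf (excerpt): pick a compressor the running kernel
# actually supports, instead of the zstd default
COMPRESSION="gzip"
# then regenerate all initramfs images (as root):
#   mkinitcpio -P
```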
I wish there was a wiki like that for Windows.
It exemplifies how complicated a "combine software to make your own user space" system is.
I've been running Ubuntu this or that since 2007. Desktops, laptops, work computers, personal computers, servers. There has been some BS to deal with, but frankly with common hardware it's exactly the same as any other system. Desktop runtime with web browser support. Except that you can do whatever you want, if you choose.
The idea of Arch was that it's supposed to be hard mode, if that's even true anymore. Any non-tech person I've shown my computer to is like "oo, what is that?" I say "it's a desktop environment, here's the web browser." And that's all there is to it.