Are there any risks or disadvantages to building software from source, compared to installing a package? Can it mess with my system in any way?
I usually avoid it because I’ve found it to be a faff and it often doesn’t work anyway, but in a couple of cases it has been necessary.
Think about it this way: you’re downloading someone else’s code and running it on your system. The OS doesn’t care: it will give it access to everything your user has access to, but won’t give access to anything else.
So (with the caveat below) the software won’t be able to mess with your system, because your user generally can’t mess with your system. However, you still need to trust the software, since it will be able to access e.g. your saved passwords and SSH keys, install a keylogger, etc. In comparison, binary packages from your distro can be seen as safer, because they have more “eyes” on them, and there is more time between the code being published and you running that code on your system.
Caveat: if you run something like `sudo make install`, then of course the risk is way higher, and the software definitely will be able to mess with your system, up to and including destroying it.
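One common way to reduce that risk (a minimal sketch, assuming an autotools-style project; the exact flags vary by build system) is to install into a prefix your own user owns, so the install step never needs root:

```sh
# Install under ~/.local instead of system paths; no sudo required.
./configure --prefix="$HOME/.local"
make -j"$(nproc)"
make install

# Make sure the installed binaries are on your PATH:
export PATH="$HOME/.local/bin:$PATH"
```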
Gentoo user here… but I’m assuming this is about building on distributions that don’t automate it like Gentoo does.

Disadvantages:
- No easy way to uninstall again. Some build systems generate lists of the files that were installed, or even some uninstall rules, but that requires either keeping the build directory with the source code around or backing up the necessary build files for a proper uninstall. And some build systems have no helpers for uninstalling at all.
- Similarly, updating isn’t guaranteed to remove all traces of the previous version. If the new build overwrites every file of the previous version, fine. But if the updated version no longer needs files that previous versions installed, those usually won’t get removed and will stick around on your system.
- In general, a lack of automation for updates.
- Compiling takes time
- You are responsible for dealing with ABI breakage in dependencies. In most cases the source code you compile will depend on other libraries. Either those come from your distro or you also build them from source, but in both cases you are responsible for rebuilding a package if an update to one of its dependencies breaks the ABI.
- You need build-time dependencies installed and, depending on your distro, also -devel packages. If the source code needs an assembler to build, you have to install that assembler, which wouldn’t be necessary if you installed the binary (you can of course remove those build dependencies again until you need to rebuild). Similarly for -devel packages of libraries from your distro: if the source code depends on a library coming from your distro, it also needs that library’s header files, pkg-config files and other development-relevant files installed, which many distros split out into separate -devel packages that aren’t needed for binaries.
- You have to deal with compile flags and settings. It’s up to you to set the optimization level, target architecture and similar for your compiler in environment variables (see the sketch after this list). Not a big deal, but still something you have to look into at the start.
- You have to deal with compile-time options and dependencies. The build system might tell you what packages are missing, but you have to “translate” its errors into what to install with your package manager and do it yourself. Same for the build system’s feature detection: you have to read the logs and possibly reconfigure the source code after installing some dependencies, if the build system turned off features you want because of missing dependencies.
- Source code and building need disk space, so make sure you have enough free. Same with RAM: Gentoo suggests 2 GB of RAM for each --jobs slot of make/ninja, but that’s for extreme cases; you can usually get away with less than 2 GB per job.
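A minimal sketch of the compile-flags and uninstall points above (assuming a Makefile-based project that follows the usual DESTDIR convention; flags and paths are only examples):

```sh
# Choose optimization/architecture flags yourself (example values).
export CFLAGS="-O2 -march=native -pipe"
export CXXFLAGS="$CFLAGS"

./configure --prefix=/usr/local
make -j"$(nproc)"

# Stage the install first to record exactly which files it creates,
# so you have a list to delete later if there is no "make uninstall":
make DESTDIR="$PWD/staging" install
find staging -type f | sed 's|^staging||' > installed-files.txt

sudo make install
```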
Of course you also gain a lot of advantages…but that wasn’t asked ;)
You can “escape” most of the mentioned disadvantages by using a distro like Gentoo that automates much of this. It’s probably worth a look if you plan on doing this regularly.
Gentoo user here.
Of course I always build every package from source because that’s how Gentoo works.
Well, you get well-optimized software for your specific CPU and architecture that often won’t run on a different CPU. At the cost of lots of time.
For big ones like Firefox or Rust I always choose the prebuilt ones… but everything else is built from source.
Also, another great advantage is being able to customize package features to your liking, like disabling one audio backend or enabling another, and such.
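On Gentoo that roughly looks like this (a sketch; the compiler flags are examples, and the package/USE flags shown are hypothetical for illustration, so check what the package actually offers):

```sh
# /etc/portage/make.conf (excerpt): compiler flags tuned for the CPU
# doing the build, plus parallel build jobs.
COMMON_FLAGS="-O2 -march=native -pipe"
CFLAGS="${COMMON_FLAGS}"
CXXFLAGS="${COMMON_FLAGS}"
MAKEOPTS="-j8"

# /etc/portage/package.use/mpv (hypothetical example): enable one audio
# backend, drop another.
media-video/mpv pulseaudio -jack
```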
> you get well-optimized software for your specific CPU and architecture
That’s really cool. How does that work?
The irony is that big things like Firefox benefit the most from building for your specific CPU variant, especially if you use them frequently.
Yeah, but only when you build with LTO+PGO, which takes even longer.
https://wiki.gentoo.org/wiki/Project:Mozilla/Firefox_Benchmarks_2025_Q1
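On Gentoo that’s roughly a matter of enabling the corresponding USE flags before rebuilding (a sketch; verify the flag names for your ebuild, e.g. with equery uses www-client/firefox):

```sh
# /etc/portage/package.use/firefox
www-client/firefox lto pgo

# Rebuild; expect a much longer compile with LTO+PGO enabled.
emerge --ask --oneshot www-client/firefox
```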
spend an hour building to save 1ms per page load
Look I don’t have heat in the winter so I compile Firefox for various processors to keep my bedroom warm okay?
The only potential downside is that the software is not handled by your package manager, so uninstalling or upgrading can be a pain. But there are ways around it, like source-based package managers or manually building binary packages and then installing them.
The best would be to ask a Gentoo user. :D
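One way to do the “build a binary package” route on Debian/Ubuntu-style systems is checkinstall, which wraps the install step and registers the result with the package manager (a sketch; the package name and version here are made up):

```sh
./configure && make -j"$(nproc)"

# Instead of a bare "sudo make install": build and install a .deb that
# the package manager knows about.
sudo checkinstall --pkgname=myprogram --pkgversion=1.0

# Later, uninstalling is just:
sudo apt remove myprogram
```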
Disadvantage (besides the update procedure mentioned in the other answers here) is that it might take a lot of time, download a lot of dependencies and files, and need additional space on your drive to compile. It can be a hassle to install and set up the required tools and libraries too. Whether it’s worth it depends heavily on the project itself. For example, nobody in their right mind wants to compile their web browser (Firefox, Chromium, whatever) themselves (sorry if I offended someone with that. :D). But a simple and short C program is as simple as running the `make` command (given the dependencies are installed, which they most likely are for simple programs after you’ve compiled a few).

Most of the time you don’t need to compile software, especially if you trust the source or it’s in the official repositories of your distribution.
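As a toy illustration of the “simple C program” case (not from the post, just an example): with a single source file you don’t even need a Makefile, because make’s built-in rules can compile it.

```sh
# Create a one-file C program.
cat > hello.c <<'EOF'
#include <stdio.h>
int main(void) { puts("hello from source"); return 0; }
EOF

# make's implicit rules build hello from hello.c with no Makefile.
make hello CFLAGS="-O2 -Wall"
./hello
```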
> Can it mess with my system in any way?
Depends on what you mean by that.
Just convenience: that’s what packages provide. In most cases there’s no special magic under the hood that you lose by building yourself. For specific projects this is also why stacks use containers: you set the build steps to include the things you need in a pragmatic way, but then you have to mess with static files on a filesystem.
The only disadvantage is that you have to manually update, unless you’ve installed it from the AUR.
The main disadvantage is that it’s less automated, and you don’t get automatic updates without some other package management system in place. If you’re using something like source packages from the AUR, then that solves both of those problems and there are no downsides (beyond the extra computational power/time you spend waiting), as long as the package maintainer does their job correctly.
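For the AUR route, that looks roughly like this (a sketch; the package name is just a placeholder, and helpers like yay or paru automate the same steps):

```sh
# Hypothetical AUR package used for illustration.
git clone https://aur.archlinux.org/some-package.git
cd some-package
makepkg -si    # builds from source, then installs it via pacman

# Because pacman tracked the install, removal and upgrades work as usual:
sudo pacman -R some-package
```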
> Can it mess with my system in any way?
Not… really? I guess if you’re downloading random tarballs off the internet and running make install without checking the integrity or trustworthiness of what you’re downloading then you could get a virus. But if you’re certain the source you’re getting is legitimate, then I suppose the only way building from source could “mess up your system” is if you mess up your system libraries or something whilst trying to install dependencies.
It has security advantages but it is slower and requires your computer to do more work.
What are the security advantages?
- You can disable functionality that you don’t use or want (code that is not used cannot be exploited).
- You can enable hardware/kernel-specific security mitigations.
- You can know what source code corresponds to the generated binary.
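A rough sketch of the first two points (the --disable/--without switches are hypothetical and differ per project, so check ./configure --help; the hardening flags are standard GCC/Clang options):

```sh
# Turn off features you don't want and add hardening flags (example names).
./configure \
    --disable-telemetry \
    --without-jack \
    CFLAGS="-O2 -fstack-protector-strong -D_FORTIFY_SOURCE=2"

make -j"$(nproc)"
```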






