- cross-posted to:
- gaming@lemmy.ml
- linux@lemmy.ml
We are excited to announce that Arch Linux is entering into a direct collaboration with Valve. Valve is generously providing backing for two critical projects that will have a huge impact on our distribution: a build service infrastructure and a secure signing enclave. By supporting work on a freelance basis for these topics, Valve enables us to work on them without being limited solely by the free time of our volunteers.
This opportunity allows us to address some of the biggest outstanding challenges we have been facing for a while. The collaboration will speed up progress that would otherwise take much longer for us to achieve, and will ultimately unblock some of our long-planned endeavors. We are incredibly grateful to Valve for making this possible and for their explicit commitment to help and support Arch Linux.
These projects will follow our usual development and consensus-building workflows. [RFCs] will be created for any wide-ranging changes. Discussions on this mailing list, as well as issue, milestone, and epic planning in our GitLab, will provide transparency and insight into the work. We believe this collaboration will greatly benefit Arch Linux, and we look forward to sharing further development on this mailing list as work progresses.
I am curious about the secure enclave part. Does it mean that they will be signing binaries? Does it mean that we will get Secure Boot support without self-signing? Does it mean that there will be a signing system for anti-cheat???
Package signing is used to make sure you only get packages from sources you trust.
Every Linux distro does it; it’s why you’re asked to accept a key signature when you add a new package source.
For a long time, the keys used for signing were just files on disk, protected only by protecting the server they lived on, so they could technically be stolen and used to sign malicious packages.
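To make that concrete, here is a minimal sketch in Python (hypothetical code using the `cryptography` package; the package contents and key handling are invented for illustration) of sign-and-verify when the signing key is just bytes on disk:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The distro's signing key: here it is just bytes in memory / a file on
# disk, which is the exposure described above. Anyone who can read it
# can sign malicious packages.
signing_key = Ed25519PrivateKey.generate()
public_key = signing_key.public_key()  # shipped to users in the keyring

package = b"contents of some-package-1.0.tar.zst"  # stand-in for a real package
signature = signing_key.sign(package)

# On the user's machine: accept the package only if the signature
# verifies against a key already trusted in the keyring.
public_key.verify(signature, package)  # raises InvalidSignature if tampered with
print("signature OK: package came from a holder of the trusted key")
```

The weakness described above is all in that first step: `signing_key` is ordinary data, so anyone who can read it off the server can produce signatures that users will trust.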
Some advances in chip design and cost reductions later, we now have what is often called a “secure enclave” or “trusted platform module”: in general, a provider of a non-exportable key.
It’s a little chip that holds or manages a cryptographic key such that getting the signing key off the chip is impossible (or exceptionally difficult). That makes it nearly impossible to steal the key without physically stealing the server, which is much easier to prevent (put it in a room with locked doors) and impossible to do without detection, so a forged package becomes vastly less likely.
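A toy model of that property (names invented for illustration; this is not any real TPM or enclave API): the chip offers an operation that *uses* the key, but no operation that *returns* it.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

class ToyEnclave:
    """Toy model of a non-exportable key (hypothetical, not a real TPM API)."""

    def __init__(self) -> None:
        # In real hardware this key is generated on the chip and never
        # leaves the silicon; here it only ever lives inside this object.
        self._key = Ed25519PrivateKey.generate()

    def sign(self, data: bytes) -> bytes:
        # The only private-key operation offered: use the key, never return it.
        return self._key.sign(data)

    def public_key(self):
        # The public half is safe to hand out for verification.
        return self._key.public_key()

    # Deliberately absent: any export_private_key() style method.

enclave = ToyEnclave()
sig = enclave.sign(b"package contents")
enclave.public_key().verify(sig, b"package contents")  # verification still works
```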
There are services that provide the infrastructure needed to do this, but they cost money, and it takes time and money to build it into your system in a way that’s reliable and doesn’t lock you to a vendor if you ever need to switch.
So I believe this is Valve picking up the bill to move Arch’s package infrastructure security up to the top tier.
It was fine before, but that upgrade is expensive for a volunteer- and donation-based project and cheap for a high-profile company that might legitimately worry that its use of Arch on physical hardware attracts more attacker interest.
That sounds awesome. I never understood how a TPM can keep an attacker from getting the keys when the TPM is on the same machine. Does it independently check the signature of the application that asked for the keys?
Depends on the vendor for the specifics. In general, they don’t protect against an attacker who has gained persistent privileged access to the machine, only against theft.
Since the key either can’t leave the TPM or is useless without it, the attacker needs to remain undetected on the server for as long as they want to use it, which is difficult for anyone less sophisticated than an advanced persistent threat. (Some TPMs have one internal key they can never return; they generate a new key and return it encrypted with that internal key. You get the same protection without worrying about storage on the chip.)
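A sketch of that wrapping scheme (toy code with an invented `ToyTPM` interface, not a real TPM command set): the chip keeps one root key forever and hands the host freshly generated keys only in encrypted form.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

class ToyTPM:
    """Toy model of key wrapping (hypothetical interface, not real TPM commands)."""

    def __init__(self) -> None:
        # The one internal root key: generated on-chip, never returned.
        self._root = AESGCM(AESGCM.generate_key(bit_length=256))

    def create_wrapped_key(self) -> bytes:
        # Generate a fresh key and hand it back encrypted ("wrapped") with
        # the root key. The host can store this blob anywhere: without the
        # chip it is just ciphertext.
        new_key = AESGCM.generate_key(bit_length=256)
        nonce = os.urandom(12)
        return nonce + self._root.encrypt(nonce, new_key, None)

    def use_key(self, wrapped: bytes, data: bytes) -> bytes:
        # Unwrapping happens only inside the chip, immediately before use.
        key = self._root.decrypt(wrapped[:12], wrapped[12:], None)
        nonce = os.urandom(12)
        return nonce + AESGCM(key).encrypt(nonce, data, None)

tpm = ToyTPM()
blob = tpm.create_wrapped_key()   # stealing this blob alone gains an attacker nothing
out = tpm.use_key(blob, b"data")  # usable only while you still have the chip
```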
The Apple system, to its credit, does a degree of user and application validation before the keys can be used. That’s generally good for security, but it means that if you want to share a key between users, you probably won’t be using the secure enclave.
Most of the trust checks end up being the TPM proving itself to the remote service doing the checking. For example, when you use your phone’s biometrics to log into a website, part of that handshake is the phone’s TPM proving that it was made by a company to a spec the standards body has validated as secure in the way it claims.
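Roughly, that handshake can be sketched like this (hypothetical names throughout; real protocols such as WebAuthn use certificate chains and signed attestation metadata rather than a single pre-shared public key):

```python
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Device side: an attestation key provisioned at manufacture. Its public
# half is what the manufacturer (and the standard) vouches for.
device_attestation_key = Ed25519PrivateKey.generate()
endorsed_public_key = device_attestation_key.public_key()  # known to the verifier

# 1. The website sends a fresh random challenge so a response can't be replayed.
challenge = os.urandom(32)

# 2. The phone's secure chip signs the challenge, typically only after the
#    biometric check passes.
response = device_attestation_key.sign(challenge)

# 3. The server verifies against the endorsed key: evidence that a genuine,
#    spec-compliant chip (not just any software) handled the login.
endorsed_public_key.verify(response, challenge)
print("attestation verified")
```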
I would guess it’s for secure boot support.