Tsigs

Goal

Not actually about Tor consensus transparency in detail; would like to make it broader. It's just one example of what can be built with the new tool we have.

(The tool is https://www.sigsum.org/)

And we want to figure out: what other things can we do in our vicinity / the Tor space?

Agenda

  1. Transparent signatures
    • Property: discoverable signatures
    • If someone makes such a signature, everyone can figure out that the signature has been made.
    • 2.1. Example usage: filippo's age download verification uses this. Want to talk about why it's valuable.
    • 2.2. The example of Tor consensus transparency, a bit more complicated because there are more moving parts.
  2. What more could we do?

Running notes

(Part 1.)

Transparency logs, used where today?

  • Certificate Transparency
  • ...

Append-only log, 'public ledger'. A thing that you can only add entries to, and anyone can see it. Cryptographically, it's efficient to prove it's append-only. Also efficient to prove that something is included. (Note: log can change, but such change is then detectable. So we're talking about detection, not prevention wrt. append-only property.)

If you put something into a tlog, then you get back a 'proof of inclusion'.

  • Math proof that the item is in the log
  • Plus a couple of signatures (based on metadata about the log and so-called witnesses). Verifier chooses which logs/witnesses to depend on. I.e., you can build your own verification policy. Idea is that witness set should be diverse, to make it 'hard' for them to collude against you.
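The math part of the proof is the standard Merkle audit path. A minimal sketch of the verification arithmetic, using the RFC 6962/9162-style algorithm (sigsum's exact encoding and domain separation may differ; this just shows why inclusion is efficient to check):

```python
import hashlib

def leaf_hash(data: bytes) -> bytes:
    # 0x00 domain-separation prefix for leaves (RFC 6962 style)
    return hashlib.sha256(b"\x00" + data).digest()

def node_hash(left: bytes, right: bytes) -> bytes:
    # 0x01 prefix for interior nodes
    return hashlib.sha256(b"\x01" + left + right).digest()

def root_from_inclusion_proof(index: int, size: int, leaf: bytes, path: list) -> bytes:
    """Recompute the tree head from a leaf hash and its audit path.
    Verification succeeds iff the result equals the published tree head."""
    if index >= size:
        raise ValueError("leaf index out of range")
    fn, sn = index, size - 1
    r = leaf
    for p in path:
        if sn == 0:
            raise ValueError("proof too long")
        if fn % 2 == 1 or fn == sn:
            r = node_hash(p, r)  # current node is a right child
            if fn % 2 == 0:
                # skip levels where the node has no left sibling
                while fn % 2 == 0 and fn != 0:
                    fn >>= 1
                    sn >>= 1
        else:
            r = node_hash(r, p)  # current node is a left child
        fn >>= 1
        sn >>= 1
    if sn != 0:
        raise ValueError("proof too short")
    return r
```

The proof size is logarithmic in the log size, which is what makes "efficient to prove that something is included" concrete.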

If a signature is made with my key, anyone can see that a signature has been made, including me as the signer. This is so-called "key usage transparency". The idea is that you know when you use your key, and you would like to know if someone broke into your computer and made a signature, or stole your key and signed, etc.


(Part 2.1.) This is what filippo does for age. He includes not only an ordinary signature that you can verify if you know the public key; he also includes one of these transparent signatures.

When filippo's monitor finds a signature for his key and he knows it wasn't him, then he can start acting on it (revoke the key, etc).

Has been doing it for a few releases now, it's an early test for end users. We know end users often don't verify anything, but some people do. Would be interested to know what you think about it as a user.

Simon Josefsson is also doing this for some Debian source code release stuff.

TL;DR: view it as an ordinary signature with some additional properties, in particular 'discoverability'. Also note the transparent signature is self-contained, i.e., as a user you never go and talk directly to the tlog. In other words, the proof here is 'offline verifiable', just like a regular signature.
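A toy sketch of what 'offline verifiable' means here (invented field names, not sigsum's actual data formats): every check is a local computation over the bundle, with the verifier's own witness policy as input, and no network round trip to the log:

```python
from dataclasses import dataclass

@dataclass
class Bundle:
    message: bytes
    signature: bytes         # ordinary signature over the message
    tree_head: bytes         # the log's signed tree head (opaque here)
    cosigners: set           # names of witnesses that cosigned the tree head
    inclusion_proof: object  # Merkle audit path (opaque here)

def verify_bundle(bundle, verify_sig, verify_inclusion, trusted_witnesses, quorum):
    """Every check is local; the verifier never contacts the log.
    `verify_sig` and `verify_inclusion` stand in for the real crypto checks."""
    if not verify_sig(bundle.message, bundle.signature):
        return False  # the ordinary signature must check out
    if len(bundle.cosigners & trusted_witnesses) < quorum:
        return False  # policy: enough distinct trusted witnesses cosigned
    return verify_inclusion(bundle.message, bundle.tree_head, bundle.inclusion_proof)
```

The `trusted_witnesses` / `quorum` parameters are the "build your own verification policy" part mentioned earlier.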


(Part 2.2)

Tor consensus, small number of directory authorities, say k. To make a consensus (basically a list of all relays), need a majority of DAs to sign the consensus.

So every DA has a signing key. This key needs to be guarded.

Would be nice to know if someone managed to steal the key, or get onto the computer and ask the HSM for signatures or similar.

So this is where tor consensus transparency comes from, key usage transparency for the keys used by DAs.

Was implemented after the last Tor gathering; Roger and others were interested and started building a proof of concept.

Today 2 out of the k are submitting to a single tlog.

They submit the consensus document without the signature.

Client downloads the Tor consensus: it's the consensus + signatures, and the signatures are not tlogged for $reasons.

There's a monitor that queries the tlog.

There's an archive for all consensus documents (already existed since before, collector.torproject.org), and the monitor scrapes this as well.

What the monitor does:

  • Sends email to a few people, including a metric-alerts list at tor (anyone can subscribe and linus can also set you up to receive emails directly)

Monitor sees something in tlog, then also expects to see it in collector.tpo. And the other way around, if it sees something in collector.tpo it should also be discoverable in the log. Mismatch = something is wrong, alert alert.
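The cross-check itself is just a set difference in both directions. A minimal sketch (hypothetical helper, assuming both sides are reduced to sets of consensus hashes):

```python
def cross_check(tlog_entries: set, collector_entries: set) -> list:
    """Flag consensus hashes that appear on one side but not the other.
    Any output at all means something is wrong: alert alert."""
    alerts = []
    for h in sorted(tlog_entries - collector_entries):
        alerts.append(f"in tlog but not in collector: {h}")
    for h in sorted(collector_entries - tlog_entries):
        alerts.append(f"in collector but not in tlog: {h}")
    return alerts
```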

The case "in collector but not in the log": in a great future, clients and relays verify not only the regular signatures but also the transparent signatures. Then anyone who steals or misuses a key has to put the consensus in a translog, since otherwise it will not be valid. That's how we force the attacker into the translog, which helps us detect that attacks are happening.

Getting to this fail-closed verification at clients will take a while. But we can get started and run this.

A DA that publishes a consensus without translogging is also in breach of the contract, and a monitor can detect that.

It can also be detected that e.g. a DA signs twice in an hour, which is not expected to happen.
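That kind of check is a simple grouping over (DA, hour) pairs. A sketch with invented identifiers:

```python
from collections import Counter

def double_signings(signings):
    """signings: iterable of (da_id, hour_bucket) pairs,
    e.g. ("moria1", "2025-01-07T14"). Returns the (DA, hour)
    pairs that signed more than once in that hour."""
    counts = Counter(signings)
    return sorted(key for key, n in counts.items() if n > 1)
```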

Question: setting aside whether collector is working or not, is it fair to say this transparency system for the consensus serves three use cases?

  1. Alerting people who run DAs when their key gets abused.
  2. Detecting if DAs do something weird with keys, like double signing within an hour.
  3. At some point, protecting clients from using a consensus that is not appropriate, for whatever reason.

Comment: the obvious attack today is that anyone who can produce a valid consensus (with enough keys) can do so without anyone knowing. So this gives visibility into what goes out, as long as consensus users do verify.

Comment: yes without user verification, we don't get all the properties. But you could still detect if a key is used in an inappropriate way that's logged.

Comment: another use case (that is not key usage transparency) is archive transparency. You can't change the archive after the fact.

Question: maybe out of scope, but out of curiosity: how often is there disagreement between DAs on consensus documents?

It happens that one of the k is missing every second week or so. They might have an outage of some sort: connectivity, a hw/sw issue, a time sync issue, or similar. Ops stuff. Or they might not have renewed their medium-term key, which is valid for between 3 and 12 months.

Question: post quantum story to this?

In the current thing we're using, no. We use Ed25519 keys and SHA256; it's an opinionated v1 design. Note: this is not part of a handshake, so record-now-break-later doesn't apply; forging would require an active attack right now. But I would imagine there is a future version with PQ stuff.

Comment: Chrome is working on something like this, Merkle tree certificates.

IETF work in PLANTS group.

What's the biggest overhead in the Tor consensus? Signatures; the other proof material is trimmable so it's not redundant across all DAs.

Question: how much can we grow the consensus without breaking something? It has to be weighted against the benefits of course.

Comment: would distributing this in extrainfo be enough? Why can't that be done? It's a content-addressable hash; the consensus contains it and clients need to fetch it.

But then everyone still has to fetch it, so not gaining anything.

Question: what if it were possible to fetch this for each relay that you need a microdescriptor for, i.e., itemized into smaller chunks?

Need the signatures to verify first before the chunks can be used.

...some CT questions. Comment: privacy-preserving and incrementally deployable certificate transparency in Tor, PETS paper.

Attack vector: DDoS the witnesses so your updates get withheld. No smart ideas for mitigating that yet.

But you already have this problem with the DAs' signing infra.

But we control this! As opposed to 30 witnesses.

Note: it's up to us to decide who the witnesses are. You can even run your own log. So it's not conceptually different, it's just client policy change.

Note: the initial work of Bryan Ford was not as flexible wrt. policy.

Comment: if anyone wants to try running a witness behind tor, that would be very cool. The protocol allows for this, no need to have a stable public IP address.

Links:

  • consensus transparency project: https://gitlab.torproject.org/linus/tor-consensus-transparency


(Part 3. What more can we do?)

Releasing Tor Browser with transparency? (Downloading)

Autoupdate of Tor Browser with transparency?

ctor: since a couple of years ago we have reproducible tor tarballs. It's usually 2 of 3 (david/nickm/ahf) confirming the repro build, and we could log here.

  • Ceremony: now we have a commit we want to release. We start a pipeline in GitLab which generates a tarball based on it. Then locally, we run our local part, which takes the git version, checks it out, builds the tarball, checks it is the same as CI's, and then prompts to make the signature. And we publish the signatures. Once that is done, we poke the Debian maintainers, and a little while after, a forum post goes up with the release details.

(What directory authorities do today wrt translogging: we have a program that takes consensus, makes a hash of it, makes an https request to one or more logs, and then wait for getting a proof back. That is the critical part that requires at least one of these multiple logs to be up and available.)
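A rough sketch of that DA-side flow (the HTTP helpers are injected and hypothetical, not the real sigsum API endpoints):

```python
import hashlib
import time

def submit_consensus(consensus: bytes, log_urls, post, poll, timeout=60):
    """Hash the consensus, submit the hash to one or more logs, then
    poll until at least one log returns an inclusion proof. `post` and
    `poll` are injected HTTP helpers (made-up interface for this sketch)."""
    digest = hashlib.sha256(consensus).hexdigest()
    for url in log_urls:
        post(url, digest)              # submit to every configured log
    deadline = time.time() + timeout
    while time.time() < deadline:
        for url in log_urls:
            proof = poll(url, digest)  # None until the log has merged it
            if proof is not None:
                return proof           # one available log is enough
        time.sleep(1)
    raise TimeoutError("no log returned an inclusion proof in time")
```

This mirrors the availability requirement from the notes: the critical part is that at least one of the configured logs is up and answering.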

https://gitlab.torproject.org/linus/tor/-/tree/tlog-consensus/contrib/dirauth-tools/consensus-transparency

Note: if we remove the monitors, we have achieved nothing. If no one looks at the logs, it's just a complicated way of doing things. Monitoring is often glossed over, but it is the core. The log is not the core, and the witnesses are just the root of trust for ensuring the log is behaving correctly. Monitors keep an eye out for anomalies: are there any claims that are false wrt. what is logged?

Question: who are the people that act as the monitor?

Right now only ln5 in tor consensus transparency. A bit hacky, proof of concept. But if you're adventurous and you're up for pain, set it up and try! Long run we should have something better. It would be best if everyone that has skin in the game runs one monitor themselves.

Question: is it pain because code not great or something else?

Because the code is not great; it's a couple of bash scripts. The reason for that is: if you want to do the simplest version, it's curl, bash, etc., so we wanted to do it as bash scripts before doing it in a proper language. Now everyone can see that there's no magic, just text files, awk, grep, etc.

ahf/geko considers trying to run a monitor.

https://gitlab.torproject.org/linus/consensus-transparency-monitor

Question: could we get this going for the bridge authority? It doesn't vote; it just makes bridge-dictator decisions, so it's a separate consensus. And one would think it makes more sense there, since it's already a single point of failure? And bsd-george would probably be happy to submit?

Question: we have spare servers; is there anything we should run? If you want to run a monitor, ln5 would be happy to help onboard. Also if you want to run a witness (maybe one that submits exclusively over Tor), we are happy to help onboard there too.

https://git.glasklar.is/sigsum/admin/ansible/-/tree/main/roles/litewitness

So lots of opportunities to help with operational things. The gain here is also, you all will throw heat on ln5 when things don't work etc. That is valuable, i.e., bug reports on setup instructions etc.

Question: is this the future, or is there something else emerging in the next 10 years that we will need to switch to?

Value of the system not only depends on monitor, but also the witnesses. The key to real value is witnesses, because without diversity there we're back to where we started with single trusted party.

Fortunately, we didn't give up on finding a common format for witnessing. We've pushed that for 2 years with Google... Google already has CT, the Go checksum database, etc. So they have stuff in production and are in on witnessing. Sigstore is also partly Google-driven, and we're in constant discussion with them. We might be able to converge on one thing in the future.

witness-network.org <-- rgdd is one of the maintainers, the other two are Al Cutter and filippo.

Comment: we had a grant we ended up saying no to; the dependency stuff is very different from ctor. We're now members of the Rust Foundation's supply chain security stuff. This is something we could talk to some of them about. Could be interesting for when we release: publish information about what we've been building. The #1 feedback from 3rd-party packagers is that there are so many dependencies in tor's rust stuff...

Comment: interesting that you mentioned Go checksum db. See also gopherwatch by mjl-. Monitor that sends you email when stuff happens with your go modules.

Comment: 5-7 May there's tlog stuff in Stockholm if anyone is interested, see info:

  • https://git.glasklar.is/sigsum/project/documentation/-/blob/main/archive/2026-03-02-community-meetup-in-may-info

Comment: if anyone wants to see another usage application, we have a 'sign-if-logged' application for the Tillitis TKey. It signs messages only if they are already in a translog, which makes it possible to get translog benefits in legacy systems that don't understand translogs.

  • https://git.glasklar.is/sigsum/apps/sign-if-logged