Channel: Dhole Moments

Don’t Use Session (Signal Fork)


Last year, I outlined the specific requirements that an app needs to have in order for me to consider it a Signal competitor.

Afterwards, I had several people ask me what I think of a Signal fork called Session. My answer then is the same thing I’ll say today:

Don’t use Session.

The main reason I said to avoid Session, all those months ago, was simply their decision to remove forward secrecy, an important security property of cryptographic protocols that they had inherited for free when they forked libsignal.

Lack of forward secrecy puts you in the scope of Key Compromise Impersonation (KCI) attacks, which serious end-to-end encryption apps should prevent if they want to sit at the adults’ table. This is why I don’t recommend Tox.

And that observation alone should have been enough for anyone to run, screaming, in the other direction from Session. After all, removing important security properties from a cryptographic protocol is exactly the sort of thing a malicious government would do (especially if the cover story for such a change involves the introduction of swarms and “onion routing”, which computer criminals might find attractive due to their familiarity with the Tor network).

Unfortunately, some people love to dig their heels in about messaging apps. So let’s take a closer look at Session.

I did not disclose this blog post privately to the Session developers before pressing publish.

I do not feel that cryptographic issues always require coordinated disclosure with the software vendor. As Bruce Schneier argues, full disclosure of security vulnerabilities is a “damned good idea”.

I have separated this blog post into two sections: Security Issues and Gripes.

Security Issues

  1. Insufficient Entropy in Ed25519 Keys
  2. In-Band Negotiation for Message Signatures
  3. Using Public Keys as AES-GCM Keys

Insufficient Entropy in Ed25519 Keys

One of the departures of Session from Signal is the use of Ed25519 rather than X25519 for everything.

Ed25519 keypairs generated from their KeyPairUtilities object have only 128 bits of entropy, rather than the ~253 bits (after clamping) you’d expect from an Ed25519 seed.

fun generate(): KeyPairGenerationResult {
    val seed = sodium.randomBytesBuf(16) // only 16 random bytes (128 bits)
    try {
        return generate(seed)
    } catch (exception: Exception) {
        return generate()
    }
}

fun generate(seed: ByteArray): KeyPairGenerationResult {
    val padding = ByteArray(16) { 0 } // the other 16 bytes are all zeroes
    val ed25519KeyPair = sodium.cryptoSignSeedKeypair(seed + padding)
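For contrast, here’s a minimal sketch (plain Java, java.security only; the class and method names are mine, not Session’s) of what correct seed generation looks like: draw all 32 bytes of the seed from the CSPRNG, with no zero padding.

```java
import java.security.SecureRandom;

public class Ed25519SeedSketch {
    private static final int ED25519_SEED_SIZE = 32;
    private static final SecureRandom RNG = new SecureRandom();

    // Draw the full 256-bit seed from the CSPRNG; no zero padding.
    public static byte[] generateSeed() {
        byte[] seed = new byte[ED25519_SEED_SIZE];
        RNG.nextBytes(seed);
        return seed;
    }
}
```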

As an implementation detail, they encode a recovery key as a “mnemonic” (see also: a gripe about their mnemonic decoding).

Does This Matter?

You might think that zeroing out half of the Ed25519 seed is fine for one of the following reasons:

  1. It’s hashed with SHA-512 before clamping.
  2. Ed25519 only offers 128 bits of security.
  3. Some secret third (and possibly unreasonable) argument.

It’s true that Ed25519 targets the 128-bit security level, if you’re focused on the security of the Elliptic Curve Discrete Logarithm Problem (ECDLP).

Achieving 128 bits of security in this model requires 256-bit secrets, since the best attacks against the ECDLP find a discrete logarithm in roughly √n guesses.

Additionally, 256-bit secrets make the multi-user security of the scheme easy to reason about, whereas 128-bit secrets make it a lot harder. (This mostly comes up in criticism of AES, which has a 128-bit block size.)

When your secret has only 2^128 possible values, your multi-user security is no longer as strong as Ed25519 expects. This smaller probability space for the seed opens the door to batch attacks that should otherwise not be possible against Ed25519.

UPDATE (2025-01-18)

The original version of this blog post went on to say that it should be possible for an attacker to recover the secret key from a public key in only 2^64 work. It also speculated that an adversary like the NSA would find this an attractive opportunity for a “NOBUS” backdoor: only they and adversaries with similar computing resources could hope to exploit it.

There is still an attack here, of course, but I’ll save that for my follow-up post since Session decided to issue a lame response.

In-Band Negotiation for Message Signatures

If you thought the previous issue was mitigated by the use of Ed25519 signatures on each message, don’t worry, the Session developers screwed this up too!

// 2. ) Get the message parts
val signature = plaintextWithMetadata.sliceArray(plaintextWithMetadata.size - signatureSize until plaintextWithMetadata.size)
val senderED25519PublicKey = plaintextWithMetadata.sliceArray(plaintextWithMetadata.size - (signatureSize + ed25519PublicKeySize) until plaintextWithMetadata.size - signatureSize)
val plaintext = plaintextWithMetadata.sliceArray(0 until plaintextWithMetadata.size - (signatureSize + ed25519PublicKeySize))
// 3. ) Verify the signature
val verificationData = (plaintext + senderED25519PublicKey + recipientX25519PublicKey)
try {
    val isValid = sodium.cryptoSignVerifyDetached(signature, verificationData, verificationData.size, senderED25519PublicKey)
    if (!isValid) { throw Error.InvalidSignature }
} catch (exception: Exception) {
    Log.d("Loki", "Couldn't verify message signature due to error: $exception.")
    throw Error.InvalidSignature
}

What this code is doing (after decryption):

  1. Grab the public key from the payload.
  2. Grab the signature from the payload.
  3. Verify that the signature on the rest of the payload is valid… for the public key that was included in the payload.

Congratulations, Session, you successfully reduced the utility of Ed25519 to that of a CRC32!

Art: AJ

How does this reduce the Ed25519 signature to a CRC32?

I was being flippant, but the signature really is entirely counterproductive here.

In a secure protocol, you first establish the public key for your recipient out-of-band. This is generally what handshake protocols are for in messaging apps. In TLS contexts, we use certificates for the same reason.

Then, once you have a trusted public key, you use that to validate a signature on the message. Because you have a separate mechanism and process for figuring out which public key is correct, a valid signature is proof that the message is authentic.
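That correct shape can be sketched in plain Java (15+, which ships Ed25519 support in java.security; the class and method names are mine). The pinned key is an input from your trust-establishment layer, never a field parsed out of the payload:

```java
import java.security.GeneralSecurityException;
import java.security.PublicKey;
import java.security.Signature;

public class PinnedKeyVerifier {
    // Verify a detached Ed25519 signature against a key we already trust.
    // The sender's public key comes from our trust store, NOT from a field
    // inside the (attacker-controllable) message payload.
    public static boolean verify(PublicKey pinnedSenderKey, byte[] message, byte[] signature)
            throws GeneralSecurityException {
        Signature verifier = Signature.getInstance("Ed25519");
        verifier.initVerify(pinnedSenderKey);
        verifier.update(message);
        return verifier.verify(signature);
    }
}
```

A signature that verifies under a key the attacker chose proves nothing; a signature that verifies under a key you pinned proves authenticity.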

What Session did instead was YOLO it: shove the sender’s public key into the message when encrypting it, then blindly trust that key on the decrypt path.

If an attacker can arbitrarily substitute public keys in chosen plaintexts, the Ed25519 signature doesn’t really prove anything beyond “this wasn’t tampered with in-transit”.

Which is the exact same guarantee you’d get out of CRC32. Hence, the flippant remark!

This line from the above snippet is the Achilles heel of this protocol (whitespace added to remove scrollbars):

val senderED25519PublicKey = plaintextWithMetadata.sliceArray(
  plaintextWithMetadata.size - (
    signatureSize + ed25519PublicKeySize
  ) until plaintextWithMetadata.size - signatureSize
)

But it’s also even dumber than that: The encryption that wraps all of this is authenticated encryption (XChaCha20-Poly1305 and XSalsa20-Poly1305 in different places), which means a CRC32-alike isn’t even necessary, so this is just a waste of resources.

In short, this Ed25519 signature is pure security theater. Let’s move on.

Using Public Keys As AES-GCM Keys

I wasn’t entirely sure whether this belongs in the “gripes” section or not, because it’s so blatantly stupid that there’s basically no way Quarkslab would miss it if it mattered.

Update (2025-01-18): It turns out, this one did belong in the gripes section and there is no security issue. I’ll explain more in the follow-up blog post.

When encrypting payloads for onion routing, it uses the X25519 public key… as a symmetric key for AES-GCM. See encryptPayloadForDestination().

val result = AESGCM.encrypt(plaintext, x25519PublicKey)
deferred.resolve(result)

Session also does this inside of encryptHop().

val plaintext = encode(previousEncryptionResult.ciphertext, payload)
val result = AESGCM.encrypt(plaintext, x25519PublicKey)

In case you thought, maybe, that this is just a poorly named HPKE wrapper… nope!

/**
 * Sync. Don't call from the main thread.
 */
internal fun encrypt(plaintext: ByteArray, symmetricKey: ByteArray): ByteArray {
    val iv = Util.getSecretBytes(ivSize)
    synchronized(CIPHER_LOCK) {
        val cipher = Cipher.getInstance("AES/GCM/NoPadding")
        cipher.init(Cipher.ENCRYPT_MODE, SecretKeySpec(symmetricKey, "AES"), GCMParameterSpec(gcmTagSize, iv))
        return ByteUtil.combine(iv, cipher.doFinal(plaintext))
    }
}

This obviously doesn’t encrypt the payload such that only the recipient (who holds the secret key corresponding to that public key) can decrypt it. It means anyone who knows the public key can decrypt the message.
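To make that concrete, here’s a minimal sketch (plain Java, javax.crypto; the names are mine) of AES-GCM keyed with a caller-supplied byte array. If that array is a public key, literally any observer can run the decrypt path:

```java
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class GcmWithPublicKey {
    private static final int IV_SIZE = 12;
    private static final int TAG_BITS = 128;

    // AES-GCM is symmetric: whoever knows `key` can both encrypt and decrypt.
    // If `key` is a 32-byte X25519 *public* key, "whoever" means everyone.
    public static byte[] encrypt(byte[] key, byte[] plaintext) throws Exception {
        byte[] iv = new byte[IV_SIZE];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"),
                new GCMParameterSpec(TAG_BITS, iv));
        byte[] ciphertext = cipher.doFinal(plaintext);
        byte[] out = new byte[IV_SIZE + ciphertext.length];
        System.arraycopy(iv, 0, out, 0, IV_SIZE);
        System.arraycopy(ciphertext, 0, out, IV_SIZE, ciphertext.length);
        return out;
    }

    public static byte[] decrypt(byte[] key, byte[] ivAndCiphertext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key, "AES"),
                new GCMParameterSpec(TAG_BITS, ivAndCiphertext, 0, IV_SIZE));
        return cipher.doFinal(ivAndCiphertext, IV_SIZE, ivAndCiphertext.length - IV_SIZE);
    }
}
```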

I wonder if this impacts their onion routing assumptions?

Why should I trust session?

(…)

When using Session, your messages are sent to their destinations through a decentralised onion routing network similar to Tor (with a few key differences) (…)

Session FAQs

What should Session have done instead?

If you have an X25519 public key, and you want to encrypt data against that X25519 public key (such that only the possessor of the corresponding secret key can decrypt it), you can just use libsodium’s crypto_box_seal() and crypto_box_seal_open() APIs and call it a day.
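If libsodium isn’t handy, the same idea can be sketched with the JDK’s own X25519 support (Java 11+). This is a simplified, HPKE-flavored construction of my own for illustration, not libsodium’s exact crypto_box_seal: generate an ephemeral keypair, do ECDH against the recipient’s public key, and hash the shared secret into an AES-256-GCM key.

```java
import java.security.KeyFactory;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.MessageDigest;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.SecureRandom;
import java.security.spec.X509EncodedKeySpec;
import javax.crypto.Cipher;
import javax.crypto.KeyAgreement;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class SealSketch {
    // Ephemeral-static ECDH: only the holder of the recipient's X25519
    // *secret* key can re-derive the AES key, unlike Session's construction.
    public static byte[][] seal(PublicKey recipientPub, byte[] plaintext) throws Exception {
        KeyPair eph = KeyPairGenerator.getInstance("X25519").generateKeyPair();
        byte[] aesKey = deriveKey(eph.getPrivate(), recipientPub);
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(aesKey, "AES"),
                new GCMParameterSpec(128, iv));
        // Ship {ephemeral public key, IV, ciphertext} to the recipient.
        return new byte[][] { eph.getPublic().getEncoded(), iv, cipher.doFinal(plaintext) };
    }

    public static byte[] open(PrivateKey recipientPriv, byte[][] sealed) throws Exception {
        PublicKey ephPub = KeyFactory.getInstance("X25519")
                .generatePublic(new X509EncodedKeySpec(sealed[0]));
        byte[] aesKey = deriveKey(recipientPriv, ephPub);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(aesKey, "AES"),
                new GCMParameterSpec(128, sealed[1]));
        return cipher.doFinal(sealed[2]);
    }

    private static byte[] deriveKey(PrivateKey priv, PublicKey pub) throws Exception {
        KeyAgreement ka = KeyAgreement.getInstance("X25519");
        ka.init(priv);
        ka.doPhase(pub, true);
        // A real implementation would use a proper KDF; SHA-256 keeps the sketch short.
        return MessageDigest.getInstance("SHA-256").digest(ka.generateSecret());
    }
}
```

A real implementation would also bind the ephemeral public key into the AAD and use an actual KDF, which is part of why you should just call crypto_box_seal() instead of rolling this yourself.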

I’ve seen freshman cryptography projects that screw up less badly, and they tend to encrypt with RSA directly.

Gripes

Some of these aren’t really security issues, but they are things I found annoying as a security engineer who specializes in applied cryptography.

  1. Mnemonic Decoding Isn’t Constant-Time
  2. Unsafe Use of SecureRandom on Android

Mnemonic Decoding Isn’t Constant-Time

The way mnemonics are decoded involves the modulo operator, which implicitly uses integer division (which neither Java, Kotlin, nor Swift implements in constant time).

return wordIndexes.windowed(3, 3) { (w1, w2, w3) ->
    val x = w1 + n * ((n - w1 + w2) % n) + n * n * ((n - w2 + w3) % n)
    if (x % n != w1.toLong()) throw DecodingError.Generic
    val string = "0000000" + x.toString(16)
    swap(string.substring(string.length - 8 until string.length))
}.joinToString(separator = "") { it }

This isn’t a real security problem, but I did find it annoying to see in an app evangelized as “better than Signal” on privacy forums.

Unsafe Use of SecureRandom on Android

The recommended way to get secure random numbers on Android (or any Java or Kotlin software, really) is simply new SecureRandom(). If you’re running a service in a high-demand environment, you can take the extra care to use a thread-local instance of SecureRandom. But a local RNG for a single user doesn’t need that.
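That recommended pattern is only a few lines (a drop-in sketch of a getSecretBytes without the algorithm pin):

```java
import java.security.SecureRandom;

public class Rng {
    // The no-arg constructor selects the platform's strongest default provider
    // (e.g. the /dev/urandom-backed NativePRNG on Android/Linux) instead of
    // pinning the legacy SHA1PRNG algorithm by name.
    private static final SecureRandom RANDOM = new SecureRandom();

    public static byte[] getSecretBytes(int size) {
        byte[] secret = new byte[size];
        RANDOM.nextBytes(secret);
        return secret;
    }
}
```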

What does Session do? They use SHA1PRNG, of course.

public static byte[] getSecretBytes(int size) {
  try {
    byte[] secret = new byte[size];
    SecureRandom.getInstance("SHA1PRNG").nextBytes(secret);
    return secret;
  } catch (NoSuchAlgorithmException e) {
    throw new AssertionError(e);
  }
}

And again here.

SecureRandom secureRandom = SecureRandom.getInstance("SHA1PRNG");

Why would anyone care about this?

On modern Android devices, this isn’t a major concern, but the use of SHA1PRNG used to be a source of vulnerabilities in Android apps. (See also: this slide deck.)

Closing Thoughts

A lot of Session’s design decisions are poorly specified in their whitepaper, and I didn’t look at all of them. For example: how group messaging keys are managed.

When I did try to skim that part of the code, I found a component where you can coerce Android clients into running a moderately expensive Argon2 KDF simply by deleting the nonce from a message.

val isArgon2Based = (intermediate["nonce"] == null)
if (isArgon2Based) {
    // Handle old Argon2-based encryption used before HF16

That’s hilarious.

Cryptography nerds should NOT be finding the software that activists trust with their privacy hilarious.

So if you were wondering what my opinion on Session is, now you know: Don’t use Session. Don’t let your friends use Session.

If you’re curious about the cryptography used by other messaging apps, please refer to this page, which collects my blog posts on the topic.

Addendum (2025-01-18)

The Session team responded to my findings with a blog of their own, but they neglected to cite my blog (or an archived snapshot, even) for their readers to cross-reference.

You will be interested in reading Session Round 2 next.

