Status of cryptonite

I see that cryptonite has been archived and set as read-only on GitHub. I don’t see anything in the README discussing its development status.

Has the project been abandoned? Are there replacement libraries available? Is GitHub - apotheca/botan: Low-level bindings for Botan stable enough for production use yet?

2 Likes

I’m surprised that there doesn’t seem to have been a change to the README before archiving explaining the status, nor a public announcement about it. I’m sure there are very good reasons for archiving, but it will probably leave a lot of people puzzled without further information!

EDIT: Never mind, I misunderstood what this repo was.

The author has quit Haskell and archived all his repositories: Vincent Hanquez 牛角包 (@vincenthz): "I have archived all my #haskell repositories (about 100 of packages/repos). time to let go." | nitter

5 Likes

Maintained fork is at GitHub - kazu-yamamoto/crypton: lowlevel set of cryptographic primitives for haskell
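
For what it’s worth, crypton is intended as a drop-in, API-compatible fork: it keeps cryptonite’s Crypto.* module names, so migrating is usually just a matter of swapping the build dependency from cryptonite to crypton. A minimal sketch (hashWith and SHA256 are the same names cryptonite already exposes):

-- build-depends: crypton, bytestring   (previously: cryptonite, bytestring)
import           Crypto.Hash           (SHA256 (..), hashWith)
import qualified Data.ByteString.Char8 as BS

main :: IO ()
main = print (hashWith SHA256 (BS.pack "hello, crypton"))  -- prints the hex digest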

4 Likes

Oh whoops, thanks. I misread the organization name haskell-crypto as haskell-cryptography. The latter is an ongoing project, led by Hecate and others.

2 Likes

In addition to the crypton fork, there is also another haskell-cryptography project to develop bindings to Botan (an audited third-party open-source cryptography library), which I started as a potential replacement / backend for crypton/ite's handwritten C. It is not ready for deployment yet, though, so use of crypton is still suggested.

3 Likes

I was able to use the cryptonite library for AES-256-CBC symmetric encryption. There is a lot of knowledge and useful material in there, though I think the scope is perhaps pushing up against a kitchen-sink approach. In my opinion, the type safety of crypto in pure Haskell has a huge payoff compared to bindings to other languages.
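
For concreteness, here is a minimal sketch of that kind of usage with the crypton/cryptonite Crypto.Cipher API. The key and IV are taken as arguments and are placeholders as far as this sketch goes; real code needs securely generated values, and plain CBC expects the plaintext to already be padded to the 16-byte block size:

import           Crypto.Cipher.AES   (AES256)
import           Crypto.Cipher.Types (BlockCipher (..), Cipher (..), IV, makeIV)
import           Crypto.Error        (throwCryptoError)
import qualified Data.ByteString     as BS

-- key must be 32 bytes, iv must be 16 bytes, plaintext a multiple of 16 bytes
encryptCBC :: BS.ByteString -> BS.ByteString -> BS.ByteString -> BS.ByteString
encryptCBC key ivBytes plaintext =
    let cipher = throwCryptoError (cipherInit key) :: AES256
        iv     = maybe (error "IV must be 16 bytes") id (makeIV ivBytes)
    in  cbcEncrypt cipher iv plaintext   -- cbcDecrypt is the inverse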

2 Likes

I think the type safety of Haskell is a big payoff even for bindings to other languages, especially bindings to an audited library. I suspect that crypton/ite's lack of provenance contributes to the dearth of significant Haskell cryptography applications; simply put, it would not pass the necessary muster of security and regulatory requirements for many real-world use cases, even if it does return the correct values. Cryptographic provenance is not unlike functional purity in many ways, and I feel its value here cannot be overstated.

This is not to dissuade anyone from pure Haskell crypto* - rather, one of the reasons I am heavily interested in cryptographic typeclasses is to first make it irrelevant whether the backend is pure Haskell or bindings to another language, which would in turn also make it easier to audit (a toy sketch of that idea follows at the end of this post).

* For one, I am interested in it too - I have this notion of cryptographic combinators which I think would work really well for pure Haskell crypto, but I haven’t really developed it yet.
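
To make the backend-agnostic idea concrete, here is a purely illustrative toy - not the haskell-cryptography design, just the general shape - with crypton/cryptonite’s SHA-256 standing in as one backend; an FFI-backed Botan instance would present the same surface to callers:

import           Crypto.Hash     (SHA256 (..), hashWith)
import qualified Data.ByteArray  as BA
import qualified Data.ByteString as BS

-- a backend is any value that knows how to turn bytes into a digest
class HashBackend b where
    hashBytes :: b -> BS.ByteString -> BS.ByteString

-- one instance backed by crypton/cryptonite; CryptonSHA256 is a made-up name
data CryptonSHA256 = CryptonSHA256

instance HashBackend CryptonSHA256 where
    hashBytes _ = BA.convert . hashWith SHA256

-- caller code never needs to know which backend produced the digest
fingerprint :: HashBackend b => b -> BS.ByteString -> BS.ByteString
fingerprint = hashBytes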

3 Likes

I think the aspiration of libraries that wrap an FFI correctly is valid, but it seems that a whole category of uncertainty is brought in by that choice. This opinion is reinforced by my current frustration:

I’m currently trying to work with FFI bindings (to secp256k1) and getting non-deterministic behavior that is breaking my brain. How can cryptographic functions work part of the time? And how can a pointer of one size/format work in one place and not another?

verify :: Text -> Hex64 -> Hex32 -> IO Bool
verify t sig pub = do
    -- the numeric literals in these tuple patterns assert buffer lengths;
    -- a mismatch throws a pattern-match failure in IO
    (hash32, 32) <- getPtr . un32 . hasht $ t      -- 32-byte message hash
    (p64, 64) <- parsePub pub >>= getPtr . un64    -- 64-byte parsed public key
    (sig64, 64) <- getPtr . un64 $ sig             -- 64-byte signature
    (== 1) <$> ecdsaVerify ctx sig64 hash32 p64

-- same as verify, except the public key is used directly as a raw 32-byte value
verify32 :: Text -> Hex64 -> Hex32 -> IO Bool
verify32 t sig pub = do
    (hash32, 32) <- getPtr . un32 . hasht $ t
    (p32, 32) <- getPtr . un32 $ pub
    (sig64, 64) <- getPtr . un64 $ sig
    (== 1) <$> ecdsaVerify ctx sig64 hash32 p32

from here - These functions differ only in the key size and format. One works to validate the signature coming from my Lightning node, the other works for signatures from my own code, and both only work part of the time. I’m feeling out of my depth.

1 Like

Oh boy do I empathize with you here. The FFI layer can definitely get in the way and make it difficult to determine whether a flaw is present in the FFI or the bound library, and I have experienced this with Botan.

However, in this case, your current frustration (at least, the non-determinism) maybe isn’t FFI-related, because (EC)DSA is not (necessarily) a deterministic algorithm :grimacing:

You see, many cryptographic operations are non-deterministic and require a random nonce that must never be reused (because a nonce is a number used only once). Ideally, this value would be exposed as an input, and the resulting function would be pure and deterministic. However, nonce mismanagement has caused many security lapses, and so some libraries remove user nonce handling in one of two ways:

  1. they make a variant that generates the nonce non-deterministically and attaches it to the ciphertext output, thus obviating user handling but also resulting in a different output each time

  2. they make a variant that derives the nonce deterministically from the key and plaintext (e.g., using HMAC) and attaches it to the ciphertext output, resulting in the same output each time

Note that attaching the nonce makes the ciphertext longer, which may also be one of the issues you are experiencing.

This is all well and good, except that sometimes the library doesn’t bother calling the variant something different, so it can be hard to tell exactly which one you are using - and this is on top of the fact that many algorithms already come in families that must be distinguished. This is one of the reasons why test vectors are so important: they help you know exactly which variant you are using.
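
To see the randomized variant in isolation, with no FFI in the picture, here is a small sketch using crypton/cryptonite’s Crypto.PubKey.ECC.ECDSA rather than the secp256k1 bindings above: sign draws a fresh nonce internally, so two signatures over the same message differ, yet both verify. (signWith instead takes the nonce k explicitly and is deterministic - but then k must never be reused with the same key.)

import           Crypto.Hash.Algorithms     (SHA256 (..))
import           Crypto.PubKey.ECC.ECDSA    (sign, verify)
import           Crypto.PubKey.ECC.Generate (generate)
import           Crypto.PubKey.ECC.Types    (CurveName (SEC_p256k1), getCurveByName)
import qualified Data.ByteString.Char8      as BS

main :: IO ()
main = do
    (pub, priv) <- generate (getCurveByName SEC_p256k1)
    let msg = BS.pack "hello"
    sig1 <- sign priv SHA256 msg         -- fresh random nonce drawn internally
    sig2 <- sign priv SHA256 msg         -- different nonce, different signature
    print (sig1 == sig2)                 -- almost certainly False
    print (verify SHA256 pub sig1 msg)   -- True
    print (verify SHA256 pub sig2 msg)   -- True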

This stackexchange post may be relevant to your issues.

4 Likes

Hmmm, thanks for considering it. libsecp256k1 should produce deterministic signatures, and regardless, this is verification, which must be deterministic to be correct. I don’t want to hijack this thread any further, so I’ll just stare at my impossible test results and adjust my opinion about whether or not this is all just a stupid simulation anyway . . .

1 Like

Oof yeah that does sound wrong - best of luck figuring it out.