rustcrypto / elliptic-curves
Collection of pure Rust elliptic curve implementations: NIST P-224, P-256, P-384, P-521, secp256k1, SM2
The `Field` trait implementation has a `todo!()` stub for `Field::sqrt` in the following crates:
We're interested in supporting hash2curve for k256 and p256 but these require arithmetic on field elements (not scalars), and the field elements are not exposed.
I'd like to cut releases of the following crates relatively soon (next few days / week):

- `elliptic-curve` v0.5.0
- `ecdsa` v0.7.0
- `k256` v0.4.0
- `p256` v0.4.0
- `p384` v0.3.0

These will be the first releases supporting an integrated ECDSA implementation as well as initial (and rather rudimentary) support for implementing algorithms generically over elliptic curve groups.
If there's anything else you'd like to get into these releases, let me know; otherwise I plan on doing it soon.
When I say that, I'm also looking for small items rather than more significant changes. There are plenty of bigger issues to address (e.g. eliminating fallible APIs by eliminating the invalid representations that cause them in `PublicKey`/`SecretKey`), but I'd rather not make any bigger items release blockers at this point; we can continue making large and breaking changes in the next release.
Hello, thank you for the wonderful work that has been put into building this library!
As I understand, a `PublicKey` by default is stored in the compressed form (33 bytes).
On obtaining a byte array for the public key, I get an array of length 33:
```rust
let mut rng = rand::thread_rng();
let secret_key = SecretKey::generate(rng);
let public_key = PublicKey::from_secret_key(&secret_key, true)
    .expect("invalid secret key");
// length 33 (compressed form)
let public_key_bytes = public_key.as_bytes();
```
When I try to construct the `PublicKey` back from these bytes:

```rust
let public_key = PublicKey::from_bytes(&public_key_bytes);
```

I get an error:
```text
error[E0277]: arrays only have std trait implementations for lengths 0..=32
   --> ethers-core/src/types/crypto/keys.rs:308:60
    |
308 |         let public_key = PublicKey::from_bytes(&public_key_bytes)
    |                                                ^^^^^^ the trait `std::array::LengthAtMost32` is not implemented for `[u8; 33]`
    |
    = note: required because of the requirements on the impl of `std::convert::AsRef<[u8]>` for `[u8; 33]`
    = note: required because of the requirements on the impl of `std::convert::AsRef<[u8]>` for `&[u8; 33]`
    = note: required by `elliptic_curve::weierstrass::public_key::PublicKey::<C>::from_bytes`
```
Basically, `from_bytes` seems to only accept byte arrays of length up to 32.
What am I missing here? Any help would be appreciated.
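For context, the bound involved can be reproduced without any elliptic-curve code. The sketch below is a stdlib-only stand-in (the `takes_bytes` helper is hypothetical): on toolchains predating const generics (before Rust 1.47), `AsRef<[u8]>` was only implemented for arrays of length 0..=32, so a 33-byte array fails the bound while a slice of it compiles on any toolchain.

```rust
// Hypothetical stand-in for any API bounded on AsRef<[u8]>, such as the
// from_bytes signature in the error above.
fn takes_bytes<B: AsRef<[u8]>>(b: B) -> usize {
    b.as_ref().len()
}

// On Rust < 1.47, `takes_bytes(&public_key_bytes)` with a [u8; 33]
// fails to compile, because AsRef<[u8]> was only implemented for arrays
// of length 0..=32. Coercing to a slice sidesteps the array bound:
//
//     takes_bytes(&public_key_bytes[..])
```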
The mixed addition formulas from the Renes-Costello-Batina paper assume Z2=1, which is not the case for the point at infinity.
As a result, point addition can return an incorrect result if the second point is in affine coordinates:
```rust
let g = ProjectivePoint::generator();
let o = AffinePoint::identity();
let p = g + o;
assert_eq!(p, g); // fails: we get p != g
```
`ecdsa/std` is missing from the `std` feature. As a result, the `Error` type does not impl `std::error::Error` when the `std` feature is selected.
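A sketch of what the fix might look like in the crate's `Cargo.toml` (the feature wiring is assumed from the description above, not taken from the actual manifest):

```toml
[features]
# Forward the std feature to the ecdsa dependency so its Error type
# gains a std::error::Error impl.
std = ["ecdsa/std"]
```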
Presently, the `k256` crate implements Montgomery mulmod for multiplication/reduction. @hdevalence brought up a point in #19 that there may be a method that leverages the special form of the secp256k1 modulus to improve multiplication performance.
Performance of a scalar multiplication for the `k256` crate:

```text
test tests::k256_scalar_mul ... bench:     197,252 ns/iter (+/- 2,838)
```
It would be nice to see a significant performance increase for this.
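To illustrate the idea (this is not the k256 implementation): the secp256k1 modulus is p = 2^256 − 2^32 − 977, so 2^256 ≡ 2^32 + 977 (mod p), and the high half of a double-width product can be folded back in with one small multiplication instead of a general Montgomery reduction. The sketch below shrinks the same trick to a 64-bit toy modulus of the same shape so it fits in stdlib integers:

```rust
// Toy analogue of the secp256k1 special-form reduction at 64-bit scale.
// The toy modulus p = 2^64 - C with C = 2^32 + 977 mirrors the shape of
// the real modulus 2^256 - 2^32 - 977 (illustration only, not the k256
// field implementation).
const C: u128 = (1u128 << 32) + 977;
const P: u128 = (1u128 << 64) - C;

fn mulmod_special(a: u64, b: u64) -> u64 {
    // Double-width product t = hi * 2^64 + lo.
    let mut t = (a as u128) * (b as u128);
    // Fold the high half back in using 2^64 ≡ C (mod p). Each pass
    // shrinks t; a few passes suffice for a full-width product.
    while (t >> 64) != 0 {
        let hi = t >> 64;
        let lo = t & (u64::MAX as u128);
        t = hi * C + lo;
    }
    // Final conditional subtraction into [0, p).
    while t >= P {
        t -= P;
    }
    t as u64
}
```

Checking against a naive 128-bit remainder (e.g. `(a as u128 * b as u128) % P`) confirms the folding preserves the value mod p.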
It feels like, since we know that it is nonzero, there should be some `fn invert(&self) -> NonZeroScalar`, or at least `-> Scalar`, instead of `-> CtOption<Scalar>`, which will never have the `None` variant. Isn't guaranteed invertibility one of the major uses of a `NonZeroScalar` type?
I have a rather fast implementation of the base field for Cortex-M4/M33 microcontrollers, wrapping (stealing) the assembly routines in https://github.com/Emill/P256-cortex-ecdh/blob/master/P256-cortex-m4-ecdh-speedopt-gcc.s. Would there be interest in including platform-specific arithmetic implementations in this crate, or should I focus on a "lean and mean" MCU fork?
Some complications are:

- `[u32; 8]` for the base field
- `build.rs`, possibly overridable
- `qemu-tests` sub-library
- `proptest` in `dev-dependencies` breaks the build for `no_std` due to Cargo feature additivity (`byteorder` with default features, as so often), not sure how to fix this :/
- `hex` is `std`
This relates to other decisions as well, such as in #54: whether to enforce correct but slow constant-time scalar inversion, or open the door to non-constant-time implementations which might be abused.
The `elliptic_curve::scalar::Scalar` type is presently generic around the number of bytes needed to represent it:

```rust
pub struct Scalar<Size: ArrayLength<u8>>(_);
```
However ideally it would be generic around a given curve's order, such that attempting to instantiate a scalar which is outside the order would result in an error.
It seems like doing that might require defining an `Order` type, making `Order` one of the associated types for elliptic curves, and making `Scalar` generic around an `Order`.
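A minimal sketch of what that might look like (all names and the byte-level representation here are hypothetical, not the actual `elliptic-curve` API):

```rust
use core::marker::PhantomData;

// Hypothetical trait capturing a curve's order at the type level.
pub trait Order {
    /// Number of bytes needed to represent a scalar modulo this order.
    const NUM_BYTES: usize;
}

// Curves would expose their order as an associated type.
pub trait Curve {
    type Order: Order;
}

// Scalar generic over a curve (and thereby its order), so out-of-range
// values can be rejected at construction time.
pub struct Scalar<C: Curve> {
    bytes: [u8; 32], // placeholder representation
    _curve: PhantomData<C>,
}

// Illustrative instantiation for a P-256-like curve.
pub struct P256;
pub struct P256Order;
impl Order for P256Order {
    const NUM_BYTES: usize = 32;
}
impl Curve for P256 {
    type Order = P256Order;
}
```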
TBH I'm not sure if this is a bug in `k256` or a Rust miscompilation bug.
We've found that in certain cases `k256` compressed point serialization and deserialization do not round-trip correctly. In particular, the sign of y will, for certain rare points, be flipped.
This is using `k256` 0.10.2, `rustc` 1.58.1:
```rust
use k256::elliptic_curve::group::GroupEncoding;
use k256::elliptic_curve::sec1::FromEncodedPoint;

pub fn deserialize(bytes: &[u8]) -> Option<k256::ProjectivePoint> {
    match k256::EncodedPoint::from_bytes(bytes) {
        Ok(ept) => {
            let apt = k256::AffinePoint::from_encoded_point(&ept);
            if bool::from(apt.is_some()) {
                Some(k256::ProjectivePoint::from(apt.unwrap()))
            } else {
                None
            }
        }
        Err(_) => None,
    }
}

pub fn serialize(pt: k256::ProjectivePoint) -> Vec<u8> {
    pt.to_affine().to_bytes().to_vec()
}

fn main() {
    let bits =
        hex::decode("024b395881d9965c4621459ad2ec12716fa7f669b6108ad3b8b82b91644fb44808").unwrap();
    let pt = deserialize(&bits).unwrap();
    let pt_bytes = serialize(pt);
    assert_eq!(bits, pt_bytes);
}
```
In release mode (only), this fails:

```text
$ cargo run --release
   Compiling k256-bug v0.1.0 (/home/jack/sw/k256-bug)
    Finished release [optimized] target(s) in 0.58s
     Running `target/release/k256-bug`
thread 'main' panicked at 'assertion failed: `(left == right)`
  left: `[2, 75, 57, 88, 129, 217, 150, 92, 70, 33, 69, 154, 210, 236, 18, 113, 111, 167, 246, 105, 182, 16, 138, 211, 184, 184, 43, 145, 100, 79, 180, 72, 8]`,
 right: `[3, 75, 57, 88, 129, 217, 150, 92, 70, 33, 69, 154, 210, 236, 18, 113, 111, 167, 246, 105, 182, 16, 138, 211, 184, 184, 43, 145, 100, 79, 180, 72, 8]`', src/main.rs:31:5
```
So far we have not been able to reproduce this in either debug mode or with coverage guided fuzzing enabled.
One of the goals of the `elliptic-curve` crate is to be able to write code that's generic over (at least Weierstrass) elliptic curve types.
Now that the `p256` and `k256` crates have arithmetic implementations (thanks @str4d and @tuxxy!), an open question is how to write code that's generic over the underlying `AffinePoint` and/or `ProjectivePoint` types.
It seems like `AffinePoint` and `ProjectivePoint` could benefit from having traits which at the very least are bounded by the point arithmetic they must support.
It also seems like there needs to be some trait connecting the two of them which can be used to access the `From` impls and arithmetic (e.g. `Add`) between the two different point types.
Finally, it seems like there needs to be a trait a `weierstrass::Curve` type can impl which provides that curve's `AffinePoint` and `ProjectivePoint` types as associated types.
Curious if people think this is a good idea and what those traits might look like.
cc @tuxxy
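One possible shape for those traits, sketched with toy stand-in point types (none of these names or bounds come from the actual crate, and real bounds would also cover negation, scalar multiplication, etc.):

```rust
use core::ops::Add;

// A weierstrass::Curve-style trait exposing both point types, with the
// projective type bounded by the conversions and mixed arithmetic
// discussed above.
pub trait WeierstrassCurve {
    type AffinePoint: Copy + Into<Self::ProjectivePoint>;
    type ProjectivePoint: Copy
        + Add<Self::ProjectivePoint, Output = Self::ProjectivePoint>
        + Add<Self::AffinePoint, Output = Self::ProjectivePoint>;
}

// Toy types standing in for real points, just to show the trait is
// implementable; the "arithmetic" here is plain integer addition.
#[derive(Clone, Copy, Debug, PartialEq)]
pub struct ToyAffine(pub i64);
#[derive(Clone, Copy, Debug, PartialEq)]
pub struct ToyProjective(pub i64);

impl From<ToyAffine> for ToyProjective {
    fn from(a: ToyAffine) -> Self {
        ToyProjective(a.0)
    }
}
impl Add for ToyProjective {
    type Output = ToyProjective;
    fn add(self, rhs: Self) -> Self {
        ToyProjective(self.0 + rhs.0)
    }
}
impl Add<ToyAffine> for ToyProjective {
    type Output = ToyProjective;
    fn add(self, rhs: ToyAffine) -> Self::Output {
        ToyProjective(self.0 + rhs.0)
    }
}

pub struct ToyCurve;
impl WeierstrassCurve for ToyCurve {
    type AffinePoint = ToyAffine;
    type ProjectivePoint = ToyProjective;
}

// Generic code can then use mixed addition over any curve:
fn mixed_add<C: WeierstrassCurve>(
    p: C::ProjectivePoint,
    q: C::AffinePoint,
) -> C::ProjectivePoint {
    p + q
}
```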
It'd be good to (optionally) expose arithmetic operations on scalars.
@nickray has this for `p256`, feature-guarded: nickray@3e45bde
Not sure if it's OK to upstream this?
In an existing piece of code I have, I am using `secp256k1` (link) to generate a recoverable ECDSA signature, and everything is fine: the signature is correct and accepted.
Now I want to move away from `secp256k1` and instead depend on `k256`, but I have issues producing the correct signature. The nice thing is that it is deterministic, so it's easy to test.
I prepared a minimal example to demonstrate the problem:
```rust
use elliptic_curve::sec1::ToEncodedPoint;
use k256::ecdsa::{recoverable, signature::Signer, signature::Verifier};
use secp256k1::{Message, PublicKey, Secp256k1, SecretKey};
use tiny_keccak::Keccak;

fn main() {
    // Generate signing key and verifying key for k256
    let signing_key = k256::ecdsa::SigningKey::from_bytes(
        &hex::decode("22AABB811EFCA4E6F4748BD18A46B502FA85549DF9FA07DA649C0A148D7D5530")
            .unwrap(),
    )
    .unwrap();
    let verifying_key = signing_key.verifying_key();

    let my_message = "546869732069732061206d657373616765".to_string();
    let decoded = hex::decode(&my_message).unwrap();
    let mut to_hash = format!("Message length: {}", decoded.len())
        .as_bytes()
        .to_vec();
    to_hash.extend(decoded);

    // keccak256 of the message
    let mut keccak = Keccak::new_keccak256();
    let mut hash = [0u8; 32];
    keccak.update(&to_hash);
    keccak.finalize(&mut hash);
    let keccak1 = hex::encode(hash);
    println!("Keccak256: {}", keccak1);

    // Sign (recoverable) with secp256k1
    let message = Message::from_slice(&hash).expect("32 bytes");
    let secp_context = Secp256k1::new();
    // init a SecretKey from k256's SigningKey
    let secp256key = SecretKey::from_slice(signing_key.to_bytes().as_slice()).unwrap();
    let sig = secp_context.sign_recoverable(&message, &secp256key);
    let (rec_id, r_and_s) = sig.serialize_compact();
    let mut data_arr = [0; 65];
    data_arr[0..64].copy_from_slice(&r_and_s[0..64]);
    data_arr[64] = rec_id.to_i32() as u8;
    let secp256k1_signature = hex::encode(data_arr);
    println!("With secp256k1: {} <-- CORRECT", secp256k1_signature);

    // Verify with secp256k1
    let pubkey = PublicKey::from_secret_key(&secp_context, &secp256key);
    let recovered_pubkey = secp_context.recover(&message, &sig).unwrap();
    let pk1 = hex::encode(pubkey.serialize_uncompressed());
    let pk2 = hex::encode(recovered_pubkey.serialize_uncompressed());
    assert_eq!(pk1, pk2);
    let signature = secp256k1::Signature::from_compact(&r_and_s).unwrap();
    assert!(secp_context.verify(&message, &signature, &pubkey).is_ok());
    assert!(secp_context
        .verify(&message, &signature, &recovered_pubkey)
        .is_ok());

    // Sign (recoverable) with k256
    let signature: recoverable::Signature = signing_key.sign(&to_hash);
    let signature_bytes: [u8; 65] = signature.as_ref().try_into().unwrap();
    let k256_signature = hex::encode(signature_bytes);
    println!("With k256: {}", k256_signature);

    // Verify with k256
    let pk3 = hex::encode(verifying_key.to_encoded_point(false).to_bytes());
    assert_eq!(pk1, pk3);
    assert!(verifying_key.verify(&to_hash, &signature).is_ok());
    println!("OK.");

    // The two signatures should be identical
    assert_eq!(secp256k1_signature, k256_signature); // <-- this fails
}
```
And this is my `Cargo.toml`:

```toml
[package]
name = "sign_problem"
version = "0.1.0"
edition = "2021"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
elliptic-curve = "0.11.5"
hex = "0.4.3"
k256 = { version = "0.10.2", features = ["ecdsa", "keccak256"] }
secp256k1 = { version = "0.20.3", features = ["recovery", "rand"] }
serde = "1.0.59"
serde_asn1_der = "0.7.4"
serde_json = "1.0.59"
sha3 = "0.10.1"
tiny-keccak = "1.4"
```
One easy-to-spot difference is that `secp256k1` expects a hash, while `k256` takes a message and applies the hash internally, if I am not mistaken. But by using the `keccak256` feature, the correct hash function should be applied.
I saw #525 but I don't think I need support for signing pre-hashed messages, if keccak256 is computed for me by the library.
Am I doing something wrong?
I'm not able to find any way to get the public key corresponding to a secret key. Is there a way to obtain it?
Hi,
Is there a way to compute and verify P-384 signatures yet?
`p384` has an `ecdsa` feature, but I couldn't get anything done with it.
`ProjectiveArithmetic` is not implemented for `NistP384`. The `arithmetic` feature of the `elliptic_curve` crate isn't set, so I wasn't even able to manually compute a public key.
Is it possible to use it in a similar way as `p256` and `k256`? Or to use it for ECDSA at all? Or is it still a work in progress?
Thanks for your help :)
Project Wycheproof maintains a large set of cryptographic test vectors.
It would be good to at least test `k256` and `p256` against the ECDSA test vectors:
Similar to `JwkParameters` or `AlgorithmParameters`, I propose adding a `Voprf` parameter that adds the suite ID; see the RFC.
It could look like this:

```rust
pub trait Voprf<H: HashMarker> {
    const SUITE_ID: u16;
}
```

`NistP256` & co could then implement this trait.
The name is actually not very good; it should probably be `VoprfCipherSuiteId` or something like that.
Related: #497.
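A self-contained sketch of how that proposal could be implemented (the `HashMarker` stand-in, the stub curve type, and the suite ID value are all illustrative; real suite IDs are defined in the RFC):

```rust
// Stand-in for the real HashMarker trait from the digest ecosystem.
pub trait HashMarker {}
pub struct Sha256Marker;
impl HashMarker for Sha256Marker {}

// The proposed trait, as written in the issue text above.
pub trait Voprf<H: HashMarker> {
    const SUITE_ID: u16;
}

// NistP256 & co could then implement it:
pub struct NistP256;
impl Voprf<Sha256Marker> for NistP256 {
    const SUITE_ID: u16 = 3; // illustrative value only; see the RFC
}
```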
Hi,
Since version 0.5, Secp256k1 signatures cannot be created nor verified due to `DigestPrimitive` not being implemented.
The example code from the documentation fails:

```text
  | let signature: Signature = signing_key.sign(message);
  |                            ^^^^ the trait `ecdsa::hazmat::DigestPrimitive` is not implemented for `Secp256k1`
  |
  = note: required because of the requirements on the impl of `PrehashSignature` for `ecdsa::Signature<Secp256k1>`
  = note: required because of the requirements on the impl of `Signer<ecdsa::Signature<Secp256k1>>` for `SigningKey`
```
I'm struggling to understand how I can sign a byte array (or a slice) containing an already-hashed message.
Given a byte array containing a 32-byte hash, I would like to perform something like:
```rust
let hash_buf = [0u8; 32];
let signing_key = SigningKey::random(&mut OsRng);
let sig: recoverable::Signature = signing_key.sign_digest_buf(&hash_buf);
```
I see that there is `signing_key.sign_digest()`, but that is not what I want, since I don't have a `Digest` implementation here.
Thank you
Occurred during this run:
https://github.com/RustCrypto/elliptic-curves/pull/91/checks?check_run_id=916415645
Relevant info:
```text
2020-07-27T21:17:53.6343449Z thread 'arithmetic::field::tests::fuzzy_negate' panicked at 'assertion failed: res.0[9] >> 22 == 0', k256/src/arithmetic/field/field_10x26.rs:189:9
2020-07-27T21:17:53.6343850Z proptest: Saving this and future failures in /home/runner/work/elliptic-curves/elliptic-curves/k256/proptest-regressions/arithmetic/field.txt
2020-07-27T21:17:53.6344027Z proptest: If this test was run on a CI system, you may wish to add the following line to your copy of the file. (You may need to create it.)
2020-07-27T21:17:53.6344180Z cc d8f4e497110e5a92b4d2ed68e8526777dbf8363cbba2f76a0999b0fb39402eb5
2020-07-27T21:17:53.6344836Z thread 'arithmetic::field::tests::fuzzy_negate' panicked at 'Test failed: assertion failed: res.0[9] >> 22 == 0; minimal failing input: a = FieldElement(FieldElementImpl { value: FieldElement10x26([0, 0, 0, 0, 0, 0, 0, 0, 0, 4194301]), magnitude: 1, normalized: true })
2020-07-27T21:17:53.6345008Z successes: 187
2020-07-27T21:17:53.6345123Z local rejects: 0
2020-07-27T21:17:53.6345237Z global rejects: 0
2020-07-27T21:17:53.6345495Z ', k256/src/arithmetic/field.rs:429:5
2020-07-27T21:17:53.6345719Z failures:
2020-07-27T21:17:53.6345844Z     arithmetic::field::tests::fuzzy_negate
2020-07-27T21:17:53.6346041Z test result: FAILED. 45 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out
```
I used to be able to do `sec1::UncompressedPointSize<NistP256>`, but `UncompressedPointSize` was removed. I might be missing something, but it looks like there isn't a way of getting the size of an uncompressed point at a type/const level.
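For reference, the quantity in question is simple to state: an uncompressed SEC1 point is a 0x04 tag byte followed by the x and y coordinates, so its size is 1 + 2 × the field size. As a plain const sketch (not an `elliptic-curve` API):

```rust
// SEC1 uncompressed encoding: 0x04 || x || y.
const FIELD_BYTES: usize = 32; // NistP256 field element size
const UNCOMPRESSED_POINT_SIZE: usize = 1 + 2 * FIELD_BYTES;
```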
For example, when you use `ProjectivePoint::identity()`, the size that `to_bytes()` will attempt to convert with `clone_from_slice` will only be 1, but it expects 33.
The last of the GLV patents in the US (US7110538B2) expired on September 25th, 2020 (and the European patents have been expired for a while). IANAL, but from what I'm gathering, endomorphisms are no longer patent-encumbered. Relevant story:
`bitcoin-core/secp256k1` is now switching to the endomorphism implementation by default:
I think it would make sense for us to do the same.
Is there any scope to add an ECDSA signing algorithm?
It seems that some Serde serializers have issues when serializing `VerifyingKey`s and `Signature`s. They all serialize fine, but Serde returns an error for certain serializers when deserializing back to the original type.
I've tested it with three serializers, and it seems to be a bit hit or miss whether it works. It's a bit verbose, but I've included tests that reproduce the issue. I would normally just include a Rust Playground link, but they don't support `k256`.
```rust
#[cfg(test)]
mod test {
    use k256::ecdsa::{signature::Signer, Signature, SigningKey};
    use rand::rngs::OsRng;
    use serde::{Deserialize, Serialize};
    use std::fmt::Debug;

    fn ser_deser_test<'a, D, S, T, U>(ser: S, deser: D, val: T)
    where
        D: Fn(&U) -> Result<T, String>,
        S: Fn(&T) -> Result<U, String>,
        T: Debug + Deserialize<'a> + PartialEq + Serialize,
    {
        let ser = ser(&val).unwrap();
        let deser = deser(&ser).expect("Deserialization failure");
        assert_eq!(val, deser);
    }

    fn create_fake_sig() -> Signature {
        let k = SigningKey::random(&mut OsRng);
        let msg: Vec<_> = (0..100).collect();
        k.sign(&msg)
    }

    #[test]
    fn json_signing_key() {
        ser_deser_test(
            |val| serde_json::to_string(val).map_err(|err| err.to_string()),
            |str| serde_json::from_str(str).map_err(|err| err.to_string()),
            SigningKey::random(&mut OsRng).verifying_key(),
        );
    }

    #[test]
    fn json_signature() {
        ser_deser_test(
            |val| serde_json::to_string(val).map_err(|err| err.to_string()),
            |str| serde_json::from_str(str).map_err(|err| err.to_string()),
            create_fake_sig(),
        );
    }

    #[test]
    fn toml_signing_key() {
        ser_deser_test(
            |val| toml::to_string(val).map_err(|err| err.to_string()),
            |str| toml::from_str(str).map_err(|err| err.to_string()),
            SigningKey::random(&mut OsRng).verifying_key(),
        );
    }

    #[test]
    fn toml_signature() {
        ser_deser_test(
            |val| toml::to_string(val).map_err(|err| err.to_string()),
            |str| toml::from_str(str).map_err(|err| err.to_string()),
            create_fake_sig(),
        );
    }

    #[test]
    fn bincode_signing_key() {
        ser_deser_test(
            |val| bincode::serialize(val).map_err(|err| err.to_string()),
            |bytes| bincode::deserialize(bytes).map_err(|err| err.to_string()),
            SigningKey::random(&mut OsRng).verifying_key(),
        );
    }

    #[test]
    fn bincode_signature() {
        ser_deser_test(
            |val| bincode::serialize(val).map_err(|err| err.to_string()),
            |bytes| bincode::deserialize(bytes).map_err(|err| err.to_string()),
            create_fake_sig(),
        );
    }
}
```
Dependencies:

```toml
[dependencies]
k256 = { version = "0.10.4", features = ["serde", "pem"] }
bincode = "1.3.3"
serde_json = "1.0.59"
rand = { version = "0.8.5", features = ["getrandom"] }
toml = "0.5.8"
serde = "1.0"
serde_yaml = "0.8"
```
With results:

```text
test test::bincode_signing_key ... ok
test test::json_signature ... ok
test test::bincode_signature ... ok
test test::json_signing_key ... FAILED ("panicked at 'Deserialization failure: invalid type: sequence, expected a borrowed byte array at line 1 column 1")
test test::toml_signing_key ... FAILED ("panicked at 'Deserialization failure: "expected a right bracket, found a comma at line 1 column 4")
test test::toml_signature ... FAILED ("panicked at 'Deserialization failure: "expected an equals, found eof at line 1 column 131")
```
I'm not very experienced with cryptography, so let me know if I'm doing something that I shouldn't be. 🙂
Now that both the `p256` and `k256` crates have point/field arithmetic implementations, it'd be good to cut a new release.
I'm fine with cutting a new release based on what's currently on `master` (and following up on issues like #22 after that). If there aren't any objections, I can go ahead and do that.
I was previously using `p256::AffinePoint` in version 0.4 to represent pubkeys. I suppose that's still possible, but it looks like there's now more explicit support for ECDH operations.
My issue is that `elliptic_curve::ecdh::PublicKey` is defined to be an `EncodedPoint`, which means that no validation is performed upon deserialization (at least as far as I can tell from the `from_bytes` function). However, SECG page 12 (and also the HPKE spec) says that when doing uncompressed point deserialization, you need to check that the coordinates represent a point on the curve.
Not sure what the best fix is here, although a simple one would be to just make `type PublicKey = AffinePoint` instead. This would work because the constructor of `AffinePoint` does the ECDH "partial validation" in SECG section 3.2.3. Namely, it checks that the point is on the curve and that it's not O. For non-prime-order curves, it might be desirable to make `PublicKey` its own type so that a user can opt into partial or full validation (the distinction doesn't exist for prime-order curves, because all that full validation additionally checks is membership in the prime subgroup, which is trivially the case in prime-order curves).
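The SECG 3.2.3 "partial validation" mentioned above can be sketched over a toy short-Weierstrass curve (the parameters below are tiny illustrative values, not a real curve): check that a decoded (x, y) pair satisfies the curve equation y² = x³ + ax + b (mod p). The identity O has no affine coordinates, so rejecting its encoding is handled separately before this check.

```rust
// Toy on-curve check per SECG 3.2.3 over small parameters. Assumes p
// fits in u64 so the u128 intermediates below cannot overflow.
fn is_on_curve(x: u64, y: u64, p: u64, a: u64, b: u64) -> bool {
    let (xp, yp, pp, ap, bp) = (x as u128, y as u128, p as u128, a as u128, b as u128);
    // Curve equation: y^2 = x^3 + a*x + b (mod p), reduced step by step.
    let lhs = yp * yp % pp;
    let x2 = xp * xp % pp;
    let x3 = x2 * xp % pp;
    let rhs = (x3 + ap * xp % pp + bp) % pp;
    lhs == rhs
}
```

On the toy curve y² = x³ + 1 over GF(23), the point (2, 3) passes (9 ≡ 8 + 1) while (2, 4) is rejected.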
In order to use this library in my library, it would be amazing if at least `PublicKey` implemented `Serialize` and `Deserialize`. The underlying `elliptic-curve` crate seems to have a `serde` feature, but I don't know if this is enough to implement these traits for all curves. Currently I'm working with a very ugly workaround.
I'm writing a function to verify a P-256 signature for a build target that doesn't have `getrandom` and is `no_std`:

```text
error: target is not supported, for more information see: https://docs.rs/getrandom/#unsupported-targets
   --> /Users/xxx/.cargo/registry/src/github.com-1ecc6299db9ec823/getrandom-0.2.3/src/lib.rs:224:9
    |
224 | /         compile_error!("target is not supported, for more information see: \
225 | |                         https://docs.rs/getrandom/#unsupported-targets");
    | |_________________________________________________________________________^
```
with the dependency `p256 = { version = "0.9.0", features = ["ecdsa"], default-features = false }`.
Is there any way to get rid of the `getrandom` dependency? I just need the verify functionality, which shouldn't use random numbers.
It's been pretty convenient having the `elliptic-curve` crate in the same repo as the curve implementations so far.
However, all of the other trait crates are presently in https://github.com/RustCrypto/traits
Aside from the consistency issue, there's a notable practical one: if we land RustCrypto/signatures#96 to reverse the relationship between the `ecdsa` crate and the curve implementation crates, there's now a circular dependency between crates in https://github.com/RustCrypto/elliptic-curves and the `ecdsa` crate in https://github.com/RustCrypto/signatures:

- `ecdsa` depends on `elliptic-curve`
- `k256`, `p256`, and `p384` depend on both the `elliptic-curve` and `ecdsa` crates

This would make it difficult to use `[patch.crates-io]` directives to make incremental progress on unreleased versions of crates.
The most straightforward solution I can think of is to eliminate this circular dependency by moving `elliptic-curve` into the traits repo. That would allow both `ecdsa` and the curve implementations to use a `[patch.crates-io]` directive resolving to the `elliptic-curve` in https://github.com/RustCrypto/traits
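The `[patch.crates-io]` workflow being referred to looks roughly like this in a downstream crate's `Cargo.toml` (the git source shown is illustrative of the proposed layout):

```toml
# Override the released elliptic-curve with the in-progress copy in the
# traits repo while iterating on unreleased versions.
[patch.crates-io]
elliptic-curve = { git = "https://github.com/RustCrypto/traits" }
```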
Clone the following repo https://github.com/ggutoski/k256_test and run `cargo test` a few times until the test fails. Output:
```text
thread 'tests::k256' panicked at 'assertion failed: `(left == right)`
  left: `31`,
 right: `32`', /Users/gus/.cargo/registry/src/github.com-1ecc6299db9ec823/generic-array-0.14.4/src/lib.rs:559:9
stack backtrace:
   0: rust_begin_unwind
             at /rustc/2fd73fabe469357a12c2c974c140f67e7cdd76d0/library/std/src/panicking.rs:493:5
   1: core::panicking::panic_fmt
             at /rustc/2fd73fabe469357a12c2c974c140f67e7cdd76d0/library/core/src/panicking.rs:92:14
   2: <&generic_array::GenericArray<T,N> as core::convert::From<&[T]>>::from
             at /Users/gus/.cargo/registry/src/github.com-1ecc6299db9ec823/generic-array-0.14.4/src/lib.rs:559:9
   3: <T as core::convert::Into<U>>::into
             at /Users/gus/.rustup/toolchains/stable-x86_64-apple-darwin/lib/rustlib/src/rust/library/core/src/convert/mod.rs:539:9
   4: generic_array::GenericArray<T,N>::from_slice
             at /Users/gus/.cargo/registry/src/github.com-1ecc6299db9ec823/generic-array-0.14.4/src/lib.rs:541:9
   5: k256_test::tests::k256
             at ./src/main.rs:50:19
   6: k256_test::tests::k256::{{closure}}
             at ./src/main.rs:26:5
   7: core::ops::function::FnOnce::call_once
             at /Users/gus/.rustup/toolchains/stable-x86_64-apple-darwin/lib/rustlib/src/rust/library/core/src/ops/function.rs:227:5
   8: core::ops::function::FnOnce::call_once
             at /rustc/2fd73fabe469357a12c2c974c140f67e7cdd76d0/library/core/src/ops/function.rs:227:5
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
test tests::k256 ... FAILED
```
This test creates ECDSA signatures in another library and then imports them into k256 via `FieldBytes::from_slice`. I don't know whether it's possible to trigger this failure using only k256. My guess is the only problem in k256 is that arguments passed to `from_slice` are not properly sanitized.
EDIT: It might be my fault for giving malformed bytes. Is there a helper function to sanitize bytes before giving them to `FieldBytes::from_slice`?
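Whether or not it is the root cause here, the length check being asked about can be done with the standard library before handing bytes to `from_slice` (the helper name below is made up):

```rust
use std::convert::TryFrom;

// Length-checked conversion: returns None instead of panicking when
// the input is not exactly 32 bytes, unlike GenericArray::from_slice,
// which asserts on a length mismatch.
fn checked_field_bytes(bytes: &[u8]) -> Option<[u8; 32]> {
    <[u8; 32]>::try_from(bytes).ok()
}
```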
```text
$ cargo build
   Compiling libsecp256k1 v0.7.0 (/home/ubuntu/libsecp256k1)
error: could not compile libsecp256k1

Caused by:
  process didn't exit successfully: rustc --crate-name libsecp256k1 --edition=2018 src/lib.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type lib --emit=dep-info,metadata,link -C embed-bitcode=no -C debuginfo=2 --cfg 'feature="default"' --cfg 'feature="hmac"' --cfg 'feature="hmac-drbg"' --cfg 'feature="sha2"' --cfg 'feature="static-context"' --cfg 'feature="std"' --cfg 'feature="typenum"' -C metadata=1357dc0dc97834f0 -C extra-filename=-1357dc0dc97834f0 --out-dir /home/ubuntu/libsecp256k1/target/debug/deps -C incremental=/home/ubuntu/libsecp256k1/target/debug/incremental -L dependency=/home/ubuntu/libsecp256k1/target/debug/deps --extern arrayref=/home/ubuntu/libsecp256k1/target/debug/deps/libarrayref-dfd5feafb83e6c19.rmeta --extern base64=/home/ubuntu/libsecp256k1/target/debug/deps/libbase64-259e0c79731ac582.rmeta --extern digest=/home/ubuntu/libsecp256k1/target/debug/deps/libdigest-24716e0bdbc1377d.rmeta --extern hmac_drbg=/home/ubuntu/libsecp256k1/target/debug/deps/libhmac_drbg-a16a258c68f0b580.rmeta --extern libsecp256k1_core=/home/ubuntu/libsecp256k1/target/debug/deps/liblibsecp256k1_core-1af814787eb8552a.rmeta --extern rand=/home/ubuntu/libsecp256k1/target/debug/deps/librand-cd9b6b6d26de7e5e.rmeta --extern serde=/home/ubuntu/libsecp256k1/target/debug/deps/libserde-ac0ca606ed95f59c.rmeta --extern sha2=/home/ubuntu/libsecp256k1/target/debug/deps/libsha2-7d35ecf8ad6acdc9.rmeta --extern typenum=/home/ubuntu/libsecp256k1/target/debug/deps/libtypenum-81ca94bc9a0f2863.rmeta (signal: 9, SIGKILL: kill)
```
Further errors:

```text
error: environment variable `OUT_DIR` not defined
  --> src/lib.rs:53:59
   |
53 |     unsafe { ECMultContext::new_from_raw(include!(concat!(env!("OUT_DIR"), "/const.rs"))) };
   |                                                           ^^^^^^^^^^^^^^^
   |
   = note: this error originates in the macro `env!` (in Nightly builds, run with -Z macro-backtrace for more info)

error: environment variable `OUT_DIR` not defined
  --> src/lib.rs:59:62
   |
59 |     unsafe { ECMultGenContext::new_from_raw(include!(concat!(env!("OUT_DIR"), "/const_gen.rs"))) };
   |                                                              ^^^^^^^^^^^^^^^
   |
   = note: this error originates in the macro `env!` (in Nightly builds, run with -Z macro-backtrace for more info)

error: aborting due to 2 previous errors
```
This is a tracking issue for additional elliptic curves we could potentially implement.
Note that we are presently focusing on short Weierstrass curves which can be implemented using the `primeorder` crate. There are many other Rust projects implementing newer curve forms, such as `bls12_381`, `curve25519-dalek`, and `ed448-goldilocks`.
I'm updating from k256/p256 0.9 to 0.10, and the `as_scalar_bytes()` method has been removed with the switch to the new scalar backing type. Ideally I'm looking for a way to access the big-endian byte representation without creating copies that aren't zeroized, but the `FieldBytes` type is just a `GenericArray`. Is there a good way to do this?
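One pattern that may help (a sketch; in real code you'd reach for `zeroize::Zeroizing`, which these crates integrate with via the `zeroize` feature, rather than this hand-rolled stand-in): copy the bytes from `Scalar::to_bytes` into a wrapper that wipes itself on drop, so the copy never outlives its use in cleartext.

```rust
// Minimal stand-in for `zeroize::Zeroizing` over a 32-byte buffer:
// the contents are wiped when the wrapper goes out of scope.
struct ZeroizeOnDrop(pub [u8; 32]);

impl Drop for ZeroizeOnDrop {
    fn drop(&mut self) {
        for b in self.0.iter_mut() {
            // volatile writes keep the compiler from optimizing the wipe away
            unsafe { core::ptr::write_volatile(b, 0) };
        }
    }
}
```

Usage would look something like `let bytes = ZeroizeOnDrop(scalar.to_bytes().into());` — the only long-lived unwiped copy is then the scalar itself.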
Hello,
I have a question about a Rust idiom.
I found the following guidance on the `new` constructor:
"`new` is the constructor convention in Rust, and users expect it to exist"
from https://rust-lang.github.io/api-guidelines/interoperability.html
IMHO, your EC APIs do not follow this idiom for the `new` constructor:
`pub fn Signer::new(secret_key: &SecretKey) -> Result<Self, Error>`
This signature tells me that construction may or may not succeed.
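For reference, the guideline's convention and fallible construction are usually reconciled by a different name; a sketch with hypothetical stand-in types (not this crate's actual API):

```rust
// Hypothetical stand-ins illustrating the naming convention at issue:
// per the API guidelines, `new` is expected to be infallible, so a
// fallible constructor is conventionally named `try_new` (or exposed
// as a `TryFrom` impl) instead.
struct SecretKey([u8; 32]);

struct Signer {
    key: [u8; 32],
}

impl Signer {
    /// Fallible constructor: the `try_` prefix signals that it can fail.
    fn try_new(secret_key: &SecretKey) -> Result<Self, &'static str> {
        if secret_key.0 == [0u8; 32] {
            Err("invalid secret key: zero scalar")
        } else {
            Ok(Signer { key: secret_key.0 })
        }
    }
}
```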
When running the following code with the Python3 cryptography library:
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import ec, utils
from cryptography.hazmat.primitives import hashes
attest = ec.derive_private_key(0x32c9f75d73b5c5ce890daa575ad508429acd95d5d3d7773ff2d09cd0a5f3d6cd, ec.SECP256R1(), default_backend())
data = bytes([163, 121, 166, 246, 238, 175, 185, 165, 94, 55, 140, 17, 128, 52, 226, 117, 30, 104, 47, 171, 159, 45, 48, 171, 19, 210, 18, 85, 134, 206, 25, 71, 65, 0, 0, 0, 0, 248, 160, 17, 243, 140, 10, 77, 21, 128, 6, 23, 17, 31, 158, 220, 125, 0, 40, 32, 27, 13, 94, 181, 116, 99, 237, 92, 107, 88, 61, 77, 89, 195, 249, 80, 51, 218, 175, 2, 206, 23, 3, 198, 183, 94, 25, 106, 48, 166, 10, 0, 0, 0, 0, 0, 0, 0, 0, 165, 1, 2, 3, 38, 32, 1, 33, 88, 32, 135, 246, 4, 67, 55, 184, 84, 105, 81, 144, 148, 109, 22, 144, 253, 104, 167, 81, 89, 93, 84, 187, 187, 77, 134, 73, 77, 206, 241, 144, 133, 211, 34, 88, 32, 86, 111, 67, 255, 4, 111, 211, 242, 129, 66, 116, 151, 203, 35, 108, 24, 37, 217, 200, 244, 230, 38, 93, 112, 230, 168, 131, 232, 207, 157, 65, 16, 78, 187, 155, 59, 173, 254, 35, 48, 209, 67, 146, 109, 243, 103, 182, 99, 76, 244, 154, 222, 205, 14, 4, 197, 222, 176, 30, 4, 143, 68, 110, 132])
sig = attest.sign(data, ec.ECDSA(hashes.SHA256()))
output = utils.decode_dss_signature(sig)
print("Signature: r: " + str(hex(output[0])) + " s: " + str(hex(output[1])))
I get this output:
Signature: r: 0x85e35de0bad8c6ffc347f58bebc051701ad08d8a6bc662401250e08d32334c48 s: 0x176743776c0ea6f8aa0d3ff87a17be98a4d31c2d069f982374fb2ce56b7eafa7
When running this code
const ATTESTATION_KEY: [u8; 32] = [
0x32, 0xc9, 0xf7, 0x5d, 0x73, 0xb5, 0xc5, 0xce, 0x89, 0x0d, 0xaa, 0x57, 0x5a, 0xd5, 0x08, 0x42,
0x9a, 0xcd, 0x95, 0xd5, 0xd3, 0xd7, 0x77, 0x3f, 0xf2, 0xd0, 0x9c, 0xd0, 0xa5, 0xf3, 0xd6, 0xcd,
];
#[allow(non_upper_case_globals)]
const data: &[u8] = &[163, 121, 166, 246, 238, 175, 185, 165, 94, 55, 140, 17, 128, 52, 226, 117, 30, 104, 47, 171, 159, 45, 48, 171, 19, 210, 18, 85, 134, 206, 25, 71, 65, 0, 0, 0, 0, 248, 160, 17, 243, 140, 10, 77, 21, 128, 6, 23, 17, 31, 158, 220, 125, 0, 40, 32, 27, 13, 94, 181, 116, 99, 237, 92, 107, 88, 61, 77, 89, 195, 249, 80, 51, 218, 175, 2, 206, 23, 3, 198, 183, 94, 25, 106, 48, 166, 10, 0, 0, 0, 0, 0, 0, 0, 0, 165, 1, 2, 3, 38, 32, 1, 33, 88, 32, 135, 246, 4, 67, 55, 184, 84, 105, 81, 144, 148, 109, 22, 144, 253, 104, 167, 81, 89, 93, 84, 187, 187, 77, 134, 73, 77, 206, 241, 144, 133, 211, 34, 88, 32, 86, 111, 67, 255, 4, 111, 211, 242, 129, 66, 116, 151, 203, 35, 108, 24, 37, 217, 200, 244, 230, 38, 93, 112, 230, 168, 131, 232, 207, 157, 65, 16, 78, 187, 155, 59, 173, 254, 35, 48, 209, 67, 146, 109, 243, 103, 182, 99, 76, 244, 154, 222, 205, 14, 4, 197, 222, 176, 30, 4, 143, 68, 110, 132];
print!("ATTESTATION_KEY: 0x");
for d in ATTESTATION_KEY.iter() {
print!("{:02x}", d);
}
println!("");
let secret_key = SecretKey::from_bytes(&ATTESTATION_KEY).unwrap();
let signer = SigningKey::from(&secret_key);
let sig = signer.sign(data);
println!("sig: {:?}", sig);
print!("r: 0x");
for d in sig.r().as_ref().to_bytes().iter() {
print!("{:02x}", d);
}
println!("");
print!("s: 0x");
for d in sig.s().as_ref().to_bytes().iter() {
print!("{:02x}", d);
}
println!("");
I see this:
ATTESTATION_KEY: 0x32c9f75d73b5c5ce890daa575ad508429acd95d5d3d7773ff2d09cd0a5f3d6cd
sig: ecdsa::Signature<NistP256>([67, 125, 8, 103, 193, 16, 122, 186, 244, 233, 1, 154, 248, 252, 223, 74, 146, 92, 138, 232, 168, 162, 59, 76, 245, 85, 132, 94, 77, 161, 236, 163, 176, 12, 147, 133, 176, 234, 179, 216, 31, 177, 101, 31, 175, 34, 140, 130, 27, 149, 176, 187, 198, 2, 110, 27, 157, 165, 86, 234, 55, 245, 44, 143])
r: 0x437d0867c1107abaf4e9019af8fcdf4a925c8ae8a8a23b4cf555845e4da1eca3
s: 0xb00c9385b0eab3d81fb1651faf228c821b95b0bbc6026e1b9da556ea37f52c8f
From what I can tell, the Rust and Python code should be doing the same thing, but I get different `r` and `s` values at the end.
Currently, using the arithmetic feature requires atomics. Compiling on a platform without atomics (such as RV32IMC) results in this error:
error[E0432]: unresolved imports `core::sync::atomic::AtomicBool`, `core::sync::atomic::AtomicI16`, `core::sync::atomic::AtomicI32`, `core::sync::atomic::AtomicI8`, `core::sync::atomic::AtomicIsize`, `core::sync::atomic::AtomicPtr`, `core::sync::atomic::AtomicU16`, `core::sync::atomic::AtomicU32`, `core::sync::atomic::AtomicU8`, `core::sync::atomic::AtomicUsize`
--> /home/alistair/.cargo/registry/src/github.com-1ecc6299db9ec823/radium-0.3.0/src/lib.rs:29:11
|
29 | self, AtomicBool, AtomicI16, AtomicI32, AtomicI8, AtomicIsize, AtomicPtr, AtomicU16, AtomicU32,
| ^^^^^^^^^^ ^^^^^^^^^ ^^^^^^^^^ ^^^^^^^^ ^^^^^^^^^^^ ^^^^^^^^^ ^^^^^^^^^ ^^^^^^^^^ no `AtomicU32` in `sync::atomic`
| | | | | | | |
| | | | | | | no `AtomicU16` in `sync::atomic`
| | | | | | no `AtomicPtr` in `sync::atomic`
| | | | | no `AtomicIsize` in `sync::atomic`
| | | | no `AtomicI8` in `sync::atomic`
| | | no `AtomicI32` in `sync::atomic`
| | no `AtomicI16` in `sync::atomic`
| no `AtomicBool` in `sync::atomic`
30 | AtomicU8, AtomicUsize, Ordering,
| ^^^^^^^^ ^^^^^^^^^^^ no `AtomicUsize` in `sync::atomic`
| |
| no `AtomicU8` in `sync::atomic`
error: aborting due to previous error
For more information about this error, try `rustc --explain E0432`.
error: could not compile `radium`.
To learn more, run the command again with --verbose.
make: *** [Makefile:141: flash-opentitan] Error 101
`cargo tree` reports that Radium comes from:
├── ecdsa v0.8.0
│ ├── elliptic-curve v0.6.0
│ │ ├── bitvec v0.18.3
│ │ │ ├── funty v1.0.1
│ │ │ ├── radium v0.3.0
│ │ │ └── wyz v0.2.0
I have opened an issue for Radium (ferrilab/radium#3), but it would also be great if these libraries could stop depending on it.
This is a tracking issue for supporting the new Schnorr signature algorithm proposed in BIP-340:
https://github.com/bitcoin/bips/blob/master/bip-0340.mediawiki
Hi! This might be an entirely silly question, but is there any reason why the basepoint() functions in the p256 crate cannot be `const fn`?
Edit: Also, a somewhat tangential question: is there any reason why these implementations do not have precomputed multiples of the basepoint as tables for speeding up scalar multiplication?
Thanks for all your hard work in building and maintaining these libraries!
This might be a dumb question but I couldn't find a neat way to achieve this probably due to my lack of knowledge in Rust.
let alice_secret = EphemeralSecret::random(&mut OsRng);
let bob_secret = EphemeralSecret::random(&mut OsRng);
let bob_pk_bytes = EncodedPoint::from(bob_secret.public_key());
let bob_public = PublicKey::from_sec1_bytes(bob_pk_bytes.as_ref()).expect("invalid public key");
let alice_shared = alice_secret.diffie_hellman(&bob_public);
let a = alice_shared.as_bytes().as_slice(); // [f2, 422, ...]
let shared_secret_hex_string = ?
How can I properly make a hexadecimal string out of this byte array, i.e. the "secret" as in `8a03e0dab7e8...`?
Does this library have an API that I can't find or is there some idiomatic way to do the conversion in Rust?
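No special API is needed for this step; a minimal sketch using only the standard library (the `hex` crate's `hex::encode` does the same in one call):

```rust
// Convert a byte slice to a lowercase hexadecimal string using only
// core formatting; equivalent to `hex::encode` from the `hex` crate.
fn to_hex(bytes: &[u8]) -> String {
    bytes.iter().map(|b| format!("{:02x}", b)).collect()
}
```

With the snippet above, this would be `let shared_secret_hex_string = to_hex(alice_shared.as_bytes().as_slice());`.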
I'm not familiar with SEC-1 encoding for public keys, so I hope I don't get this horribly wrong, but as far as I understand, SEC-1 encoding is a DER-encoded format.
Is it also possible to create a VerificationKey from a 65 byte 0x04 prefixed public key (Ethereum format) or 0x02/0x03 prefixed compressed public key (Cosmos SDK format)?
See also https://github.com/bitcoin-core/secp256k1/blob/24d1656/include/secp256k1.h#L329-L348
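For background on the encodings mentioned (SEC-1 point encoding is actually a plain tag-prefixed byte string, not DER): a sketch of the prefix rules for a curve with 32-byte field elements. The enum is a made-up illustration, not this crate's API.

```rust
// SEC-1 point encodings for a curve with 32-byte field elements:
//   0x04 || x || y  -> 65 bytes, uncompressed (the "Ethereum format")
//   0x02/0x03 || x  -> 33 bytes, compressed (tag encodes y's parity)
//   0x00            ->  1 byte,  the point at infinity
#[derive(Debug, PartialEq)]
enum Sec1Form {
    Identity,
    Compressed,
    Uncompressed,
}

fn classify_sec1(bytes: &[u8]) -> Option<Sec1Form> {
    match (bytes.first().copied(), bytes.len()) {
        (Some(0x00), 1) => Some(Sec1Form::Identity),
        (Some(0x02), 33) | (Some(0x03), 33) => Some(Sec1Form::Compressed),
        (Some(0x04), 65) => Some(Sec1Form::Uncompressed),
        _ => None,
    }
}
```

Both the 65-byte and 33-byte forms are valid SEC-1, so a parser that accepts SEC-1 input should handle either.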
Using the following dependencies:
[dependencies.elliptic-curve]
version="*"
features = ["pem", "arithmetic", "std", "zeroize", "pkcs8"]
[dependencies.k256]
version="*"
features=["ecdsa", "std", "pkcs8", "pem", "arithmetic"]
And the following trivial test:
#[test]
fn test_a_key() {
use k256::pkcs8::FromPrivateKey;
use k256::SecretKey;
let key = "-----BEGIN PRIVATE KEY-----
A PRIVATE KEY WHOSE CONTENTS ARE NOT IN THIS MESSAGE
-----END PRIVATE KEY-----";
let keypair = SecretKey::from_pkcs8_pem(key).unwrap();
}
I get the following panic:
test test_a_key ... FAILED
failures:
---- test_a_key stdout ----
thread 'test_a_key' panicked at 'called `Result::unwrap()` on an `Err` value: Asn1(Error { kind: UnknownOid { oid: ObjectIdentifier(1.3.132.0.10) }, position: None })', src/lib.rs:70:50
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
I've tried tracing through the code (I'm a Rust noob, so forgive me if I've missed obvious points), and it's my understanding that the above is definitely not intended; that is to say, the `SecretKey` impl for `k256` specifies that the OID in question (`1.3.132.0.10`) should always be known. Is this a bug, or have I attempted to use `from_pkcs8_pem` incorrectly here?
Hi, it's my first time installing `k256`, by inserting `k256 = "0.7.2"` into my Cargo.toml.
error[E0277]: the trait bound `Secp256k1: CheckSignatureBytes` is not satisfied
--> /home/username/.cargo/registry/src/github.com-1ecc6299db9ec823/k256-0.7.2/src/ecdsa/recoverable.rs:253:26
|
253 | impl From<Signature> for super::Signature {
| ^^^^^^^^^^^^^^^^ the trait `CheckSignatureBytes` is not implemented for `Secp256k1`
|
::: /home/username/.cargo/registry/src/github.com-1ecc6299db9ec823/ecdsa-0.10.2/src/lib.rs:138:33
|
138 | pub struct Signature<C: Curve + CheckSignatureBytes>
| ------------------- required by this bound in `ecdsa::Signature`
error[E0277]: the trait bound `Secp256k1: CheckSignatureBytes` is not satisfied
--> /home/username/.cargo/registry/src/github.com-1ecc6299db9ec823/k256-0.7.2/src/ecdsa/recoverable.rs:80:27
|
80 | pub fn new(signature: &super::Signature, recovery_id: Id) -> Result<Self, Error> {
| ^^^^^^^^^^^^^^^^^ the trait `CheckSignatureBytes` is not implemented for `Secp256k1`
|
::: /home/username/.cargo/registry/src/github.com-1ecc6299db9ec823/ecdsa-0.10.2/src/lib.rs:138:33
|
138 | pub struct Signature<C: Curve + CheckSignatureBytes>
| ------------------- required by this bound in `ecdsa::Signature`
error[E0277]: the trait bound `Secp256k1: CheckSignatureBytes` is not satisfied
--> /home/username/.cargo/registry/src/github.com-1ecc6299db9ec823/k256-0.7.2/src/ecdsa/recoverable.rs:254:32
|
254 | fn from(sig: Signature) -> Self {
| ^^^^ the trait `CheckSignatureBytes` is not implemented for `Secp256k1`
|
::: /home/username/.cargo/registry/src/github.com-1ecc6299db9ec823/ecdsa-0.10.2/src/lib.rs:138:33
|
138 | pub struct Signature<C: Curve + CheckSignatureBytes>
| ------------------- required by this bound in `ecdsa::Signature`
error: aborting due to 3 previous errors
For more information about this error, try `rustc --explain E0277`.
error: could not compile `k256`
Running the `cargo build` command produces the above error.
Does anyone know the solution?
This is a tracking issue for various performance optimizations that could potentially be applied to the `k256` and `p256` crates. It's not entirely clear whether any of these will actually improve performance, and some are mutually exclusive.
Note: these optimizations could be applied to secp256k1 public key recovery as well (i.e. `k256::ecdsa::recoverable::Signature::recover_verify_key`).
Both `k256` and `p256` are presently using constant-time algorithms for scalar inversions. `p256` implements `Scalar::invert_vartime` using Stein's algorithm; the same algorithm could be applied to `k256` as well.
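As an aside on what "variable-time" means here: a sketch of modular inversion over small integers using the classic extended Euclidean algorithm (the crates' `invert_vartime` uses the binary/Stein variant over 256-bit scalars, but the input-dependent control flow is the same idea).

```rust
// Variable-time modular inversion via the extended Euclidean algorithm,
// over i128 for readability. The loop count and branches depend on the
// input values, which is fine when the input is public (as in ECDSA
// verification) but would leak secrets through timing if used on
// private scalars. Returns a^-1 mod m, if it exists.
fn invert_vartime(a: i128, m: i128) -> Option<i128> {
    let (mut r0, mut r1) = (m, a.rem_euclid(m));
    let (mut t0, mut t1) = (0i128, 1i128);
    while r1 != 0 {
        let q = r0 / r1;
        let r = r0 - q * r1;
        r0 = r1;
        r1 = r;
        let t = t0 - q * t1;
        t0 = t1;
        t1 = t;
    }
    // gcd(a, m) must be 1 for an inverse to exist
    if r0 == 1 { Some(t0.rem_euclid(m)) } else { None }
}
```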
ECDSA verification involves computing `aP + bQ`, which is presently performed in both `k256` and `p256` using two scalar multiplications and a point addition.
For signature verification, this can be reduced to one scalar multiplication by using Shamir's trick, which performs the additions inside the double-and-add chain instead of outside it. With a precomputed `P + Q`, the performance is only slightly slower than a single scalar multiplication.
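To sketch the idea (an illustration, not the crates' code): replace the curve group with the additive group of integers mod `m`, so "point addition" is addition mod `m` and doubling is multiplication by 2 mod `m`. Shamir's trick scans the bits of both scalars in a single chain, using the precomputed `P + Q` whenever both bits are set.

```rust
// Shamir's trick for a*P + b*Q, modeled in the additive group of
// integers mod m (m < 2^63 to avoid overflow). One doubling per bit,
// at most one addition per bit, instead of two separate chains.
fn shamir(a: u64, b: u64, p: u64, q: u64, m: u64) -> u64 {
    let pq = (p + q) % m; // precomputed P + Q
    let mut acc = 0u64;
    for i in (0..64).rev() {
        acc = (acc * 2) % m; // "point doubling"
        match ((a >> i) & 1, (b >> i) & 1) {
            (1, 1) => acc = (acc + pq) % m, // both bits set: add P + Q
            (1, 0) => acc = (acc + p) % m,
            (0, 1) => acc = (acc + q) % m,
            _ => {}
        }
    }
    acc
}
```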
Some generic arithmetic for this is implemented in the `group` crate.
Semi-related: this resulted in a ~30% speedup for Zcash trial decryption: zcash/librustzcash#332
Hi, I signed a message with Metamask on the Kovan network.
Now I'm trying to verify this signature, based on this test:
elliptic-curves/k256/src/ecdsa/recoverable.rs, line 342 in 90412fe
#[test]
fn test_verify() {
use hex_literal::hex;
use k256::{
ecdsa::{recoverable::Signature},
EncodedPoint,
};
let account = hex!("63f9a92d8d61b48a9fff8d58080425a3012d05c8");
let message = b"0x63f9a92d8d61b48a9fff8d58080425a3012d05c82kl75awln2l";
let signature = hex!("7097b9a0810d13e60182f261708c2c260e86ea09853cd48b184a5a4b4ea08c02087bd86b8cdee3aec88ac365ec533c9a111e5b5586c4941a790ca46970c3f3801c");
//let signature = hex!("46c05b6368a44b8810d79859441d819b8e7cdc8bfd371e35c53196f4bcacdb5135c7facce2a97b95eacba8a586d87b7958aaf8368ab29cee481f76e871dbd9cb01");
println!("signature len: {}", signature.len());
let sig = Signature::try_from(&signature[..]);
println!("sig: {:?}", sig);
let pk = sig.unwrap().recover_verify_key(message).unwrap();
assert_eq!(&account, EncodedPoint::from(&pk).as_bytes());
}
Running the tests, I get an error at `sig.unwrap()`, as `sig` is an `Err` here.
(The commented-out signature works with `Signature::try_from(&signature[..])`.)
cargo test web3::tests::test_verify -- --nocapture
running 1 test
signature len: 65
sig: Err(signature::Error {})
thread 'web3::tests::test_verify' panicked at 'called `Result::unwrap()` on an `Err` value: signature::Error {}', src/web3.rs:75:22
What I've found so far is that the failure originates here:
elliptic-curves/k256/src/ecdsa/recoverable.rs, line 280 in 90412fe
I've read that this recovery id can also be based on the chain id, but I have no idea how to get the right byte here.
Maybe someone has an idea how to solve it, or must values other than 0 and 1 also be allowed?
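One likely culprit (a guess, since I can't check this exact signature): Ethereum tooling emits the trailing recovery byte as v = 27 or 28 — here the last byte is 0x1c = 28 — while this crate's recovery `Id` accepts only 0 or 1. A sketch of normalizing it before building the recoverable signature (`normalize_recovery_id` is a hypothetical helper, not this crate's API):

```rust
// Ethereum encodes the recovery id as v = 27 or 28 (and under EIP-155
// as v = chain_id * 2 + 35 + recid), while recoverable signatures here
// expect a raw recovery id of 0 or 1. Map one onto the other:
fn normalize_recovery_id(v: u8) -> Option<u8> {
    match v {
        0 | 1 => Some(v),       // already a raw recovery id
        27 | 28 => Some(v - 27), // legacy Ethereum encoding
        _ => None,               // EIP-155 handling would go here
    }
}
```

Note also that Metamask's `personal_sign` signs `"\x19Ethereum Signed Message:\n" + len(message) + message` rather than the raw message, so the recovered key will differ unless the same prefix is applied before hashing.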
Thanks,
Christian
Is there a plan to expose `FieldElement` for P-256?
(Motivation: I am looking to implement a trait in Curv.)
While working on voprf and opaque-ke, I noticed that a lot of implementations and traits could be removed if curve25519-dalek supported the necessary traits from elliptic-curves, like `Curve` and `ProjectiveArithmetic`. Using some of these has already uncovered serious bugs and other issues.
It might be possible to implement some of it through PRs or wrappers for ed25519 and x25519, but it will be much harder for ristretto255. Among other problems in the library, the dependency maintenance story isn't ideal, as seen in the many PRs to update rand.
Specifically, voprf is interested in support for ristretto255 arithmetic, including hash2curve. For opaque-ke, x25519 and ristretto255 support for DH is a requirement.
I'm no cryptographer myself and sadly can only contribute to, rather than actually implement, something like this.
The `k256` crate uses lazy normalization of field elements. While not a user-facing concern, as we deliberately encapsulate `FieldElement`, there is a potential for bugs in `k256` itself in the event one "forgets" to normalize a field element prior to returning it as the result of some computation. See #530 as an example (although there have been others).
One potential way to eliminate this class of bugs is to use separate types for normalized vs. non-normalized field elements. For example, we could introduce something like `LazyFieldElement`, on which all of the arithmetic operations are defined, while retaining `FieldElement` for the normalized form and defining all serialization operations on that. `LazyFieldElement::normalize` could return `FieldElement`, and we could additionally have bidirectional `From` conversions between the two.
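A toy model of that split, using arithmetic mod a small prime in place of the real secp256k1 field (the type names follow the proposal; the representation and modulus are purely illustrative):

```rust
// Toy model: arithmetic lives only on the lazy type, reduction happens
// exactly once in `normalize`, and only the normalized type would get
// serialization and equality. P stands in for the field modulus.
const P: u64 = 0xFFFF_FFFB; // 2^32 - 5, a small prime

#[derive(Clone, Copy, Debug, PartialEq)]
struct FieldElement(u64); // invariant: value < P

#[derive(Clone, Copy, Debug)]
struct LazyFieldElement(u64); // may exceed P until normalized

impl core::ops::Add for LazyFieldElement {
    type Output = LazyFieldElement;
    fn add(self, rhs: Self) -> LazyFieldElement {
        LazyFieldElement(self.0 + rhs.0) // no reduction: stays lazy
    }
}

impl LazyFieldElement {
    fn normalize(self) -> FieldElement {
        FieldElement(self.0 % P) // the only place reduction occurs
    }
}

impl From<FieldElement> for LazyFieldElement {
    fn from(fe: FieldElement) -> Self {
        LazyFieldElement(fe.0)
    }
}
```

The type system then enforces the invariant: there is no way to serialize or compare a value without having called `normalize()` first, so "forgot to normalize" becomes a compile error rather than a latent bug.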
Things would be a bit tricky with regard to the `ff::Field` traits. I imagine that for compatibility reasons they would need to be impl'd on `FieldElement`, converting to a `LazyFieldElement` to perform the arithmetic and then calling `normalize()` to get a `FieldElement` again as a result, which would mean that the performance benefits of lazy normalization wouldn't be available under such an API. However, that is the only safe usage pattern, since the traits are designed to abstract over different fields/field implementations, and not all of them have lazy normalization.
cc @fjarri