
hampi's People

Contributors

bryancoxwell, gabhijit, gth828r, its-just-nans, nathaniel-bennett, nplrkn


hampi's Issues

potential issue in handling 2's complement-binary-integer

From @huixue

I think there are 2 issues:

1: handling 2's-complement-binary-integer VS non-negative-binary-integer
X.691 section 10.7 "Encoding of a semi-constrained whole number" is trying to encode into non-negative-binary-integer, as in 10.7.4.
X.691 section 10.8 "Encoding of an unconstrained whole number" is trying to encode into 2's-complement-binary-integer, as in 10.8.3.

However, in your implementation, decode_unconstrained_whole_number() and decode_semi_constrained_whole_number() are both using data.decode_bits_as_integer(), which essentially uses

self.bits[self.offset..self.offset + bits].load_be::<u128>() as i128

which treats the bits as a u128 and converts the result; there is no handling of negative numbers, which can occur in the 2's-complement-binary-integer form used by the 10.8 unconstrained whole number.

Am I missing something?
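For illustration, a 2's-complement interpretation of an n-bit load can be obtained by sign extension. This is only a sketch of the missing step, not the crate's actual code; `raw` stands in for whatever `load_be` returns:

```rust
// Sketch only: interpret the low `bits` bits of `raw` as a 2's-complement
// integer by sign-extending. Not the crate's implementation.
fn decode_2s_complement(raw: u128, bits: u32) -> i128 {
    assert!((1..=128).contains(&bits));
    // Move the sign bit up to bit 127, then arithmetic-shift back down so
    // the sign bit is replicated across the high bits.
    ((raw << (128 - bits)) as i128) >> (128 - bits)
}
```

With this, an 8-bit load of `0xFF` decodes to `-1` rather than `255`.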

`encode_integer()` input limit

Here's the encode_integer() signature:

pub fn encode_integer(
    data: &mut AperCodecData,
    lb: Option<i128>,
    ub: Option<i128>,
    is_extensible: bool,
    value: i128,
    extended: bool,
) -> Result<(), AperCodecError> {...}

The input value is an i128.

What if the to-be-encoded integer is beyond the i128 range?

Using the values from the PDU (S1AP)

I believe something like this can be used:
    use asn1_codecs::{aper::AperCodec, PerCodecData};

    let decode_str = "0004001A00000300000005C0098134A200080004800411EA000240020000";
    let decode_hex = hex::decode(decode_str).unwrap();
    let mut codec_data = PerCodecData::from_slice_aper(&decode_hex);
    let s1ap_pdu = s1ap::S1AP_PDU::aper_decode(&mut codec_data);

    eprintln!("s1ap_pdu: {:#?}", s1ap_pdu.unwrap());

This is the generated output -

s1ap_pdu: InitiatingMessage(
    InitiatingMessage {
        procedure_code: ProcedureCode(
            4,
        ),
        criticality: Criticality(
            0,
        ),
        value: Id_HandoverCancel(
            HandoverCancel {
                protocol_i_es: HandoverCancelProtocolIEs(
                    [
                        HandoverCancelProtocolIEs_Entry {
                            id: ProtocolIE_ID(
                                0,
                            ),
                            criticality: Criticality(
                                0,
                            ),
                            value: Id_MME_UE_S1AP_ID(
                                MME_UE_S1AP_ID(
                                    159462562,
                                ),
                            ),
                        },
                        HandoverCancelProtocolIEs_Entry {
                            id: ProtocolIE_ID(
                                8,
                            ),
                            criticality: Criticality(
                                0,
                            ),
                            value: Id_eNB_UE_S1AP_ID(
                                ENB_UE_S1AP_ID(
                                    266730,
                                ),
                            ),
                        },
                        HandoverCancelProtocolIEs_Entry {
                            id: ProtocolIE_ID(
                                2,
                            ),
                            criticality: Criticality(
                                1,
                            ),
                            value: Id_Cause(
                                RadioNetwork(
                                    CauseRadioNetwork(
                                        0,
                                    ),
                                ),
                            ),
                        },
                    ],
                ),
            },
        ),
    },
)

You can look at examples/tests/13-ngap.rs for some of this usage. Maybe I will use this as a test for s1ap decode in examples/tests/12-s1ap.rs.

Originally posted by @gabhijit in #76 (comment)

asn1 compiler failing to parse O-RAN KPM v3 specification

I am running asn1-compiler 0.5.8 to generate code for the O-RAN KPM v3 specification, and it is failing with the following:

[2023-08-07T01:45:33Z INFO  asn1_compiler::compiler] Processing file: "kpm.asn1"
[2023-08-07T01:45:33Z WARN  asn1_compiler::parser::asn::types::constructed::choice] No tokens consumed in 1 iterations of the loop
[2023-08-07T01:45:33Z WARN  asn1_compiler::parser::asn::types::constructed::choice] No tokens consumed in 2 iterations of the loop
[2023-08-07T01:45:33Z ERROR asn1_compiler::parser::asn::defs] Failed to parse a definition at Token: Token { type: Identifier, span: Span { start: LineColumn { line: 35, column: 0 }, end: LineColumn { line: 35, column: 13 } }, text: "BinRangeValue" }
Error: Custom { kind: InvalidInput, error: "Parsing Error: Failed to parse a definition at Token: Token { type: Identifier, span: Span { start: LineColumn { line: 35, column: 0 }, end: LineColumn { line: 35, column: 13 } }, text: \"BinRangeValue\" }" }

I traced the problem down to new CHOICE types being defined with REAL variants. For example, this is the definition that initially causes a failure:

BinRangeValue ::= CHOICE {
	valueInt				INTEGER,
	valueReal			REAL,
	...
}

Commenting out the `valueReal REAL,` line allows the parser to get past the issue for this definition. There are three such definitions in the file, and commenting out all three results in a successfully generated Rust file.

This is the command that I am running:

hampi-rs-asn1c --module kpm.rs --codec aper --derive debug --derive clone --derive eq-partial-eq -- kpm.asn1 e2sm_common_ies.asn1

Here are the two asn.1 files needed to try this:
e2sm_common_ies.asn1.txt
kpm.asn1.txt

Here is the original spec if it is of interest:
O-RAN.WG3.E2SM-KPM-R003-v03.00.docx

Support ALL EXCEPT constructs

The following construct is not supported

IdentifierString ::= VisibleString (FROM (ALL EXCEPT " "))

It is used in CCSDS SLE specs (see #65) .

An error occurs in LTE RRC Protocol compilation

Hi
Using the following command, I generated the 36331-h20.asn file

python parse_spec.py 36331-h20.docx -o 36331-h20.asn

then I compiled the 36331-h20.asn file using the following command

hampi-rs-asn1c -m lte_rrc.rs --codec uper --derive all -- 36331-h20.asn

But the following error occurs:

INFO asn1_compiler::compiler] Processing file: "36331-h20.asn"
Error: Custom { kind: InvalidInput, error: "Tokenize Error (16) at Line: 6143, Column: 68" }

36331-h20.zip

missing "pub", occasionally

In the generated ngap.rs, some members need "pub" to be usable.

Criticality is generated without "pub" on its members.

#[derive(Debug, AperCodec)]
#[asn(type = "ENUMERATED", lb = "0", ub = "2")]
pub struct Criticality(pub u8);
impl Criticality {
const REJECT: u8 = 0u8; // XXX: missing "pub"
const IGNORE: u8 = 1u8; // XXX: missing "pub"
const NOTIFY: u8 = 2u8; // XXX: missing "pub"
}

// "CauseProtocol" has the same problem in its generation.
// "TimeToWait" too.

Unimplemented builtin type causes infinite loop in compiler

Encountered this issue while trying to compile https://github.com/boundary/wireshark/blob/master/asn1/ulp/SUPL.asn#L671

The compiler gets stuck in an infinite loop in parse_choice_type if the type is not recognised as a built-in type in parse_type.

The compiler should error if this occurs.

I will try adding UTCTime to BASE_TYPES to try and finish compiling the file.

However, this will happen again for any other unsupported built-in type, so the infinite loop issue will need fixing.

Supporting `UPER` encoding in the `codecs` trait.

As the support for PER encoding is in place (Aligned PER), support for Unaligned PER can be added with little effort.

A test case for decoding and encoding of at-least one of the supported protocols (like NR-RRC) should be included once this support is in place.

Ideally this can be integrated with the compiler after #29 is added - which takes the codecs as a command line switch - so that we do not unnecessarily generate codecs that are not required.

BER Codec support

To be able to support the BER codec we need to support the following -

  • Support for parsing the tags
  • Support for resolving the tags to the type
  • Support for attributes generation (this can be controlled by a CLI switch)
  • Support for the actual codec (utilizing tags in the struct attributes)

asn1_codecs_derive REAL support is not implemented

It looks like in #100 REAL support was added in the codecs, but I overlooked the need to modify asn1_codecs_derive. Encode and decode currently both result in todo!() calls.

We need to implement REAL support for asn1_codecs_derive.

How to determine if decode failure is due to needing more bytes

The asn1c library provides a way to detect when an array of bytes is too short for decoding the data. This might happen if you are reading in multiple concatenated messages and you don't know the number of bytes of each message.

Thus in asn1c you can just keep reading more bytes until you have the complete message and decoding succeeds.

I don't see a way to do this with the current decoder as the PerCodecError type doesn't seem to have a way to indicate that more bytes are required.
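One possible shape for this, sketched with a hypothetical error type (the names here are illustrative, not the actual PerCodecError API): an error variant that records how many more bytes are needed, so the caller knows to keep reading.

```rust
// Hypothetical sketch: an error variant that distinguishes truncated input
// from genuinely malformed input. Not the crate's actual error type.
#[derive(Debug, PartialEq)]
enum DecodeError {
    NeedMoreBytes { needed: usize },
    Malformed,
}

// Toy length-prefixed decoder: the first byte is the payload length.
fn decode_frame(buf: &[u8]) -> Result<Vec<u8>, DecodeError> {
    let &len = buf.first().ok_or(DecodeError::NeedMoreBytes { needed: 1 })?;
    let len = len as usize;
    if buf.len() < 1 + len {
        // Truncated: tell the caller exactly how many more bytes to fetch.
        return Err(DecodeError::NeedMoreBytes { needed: 1 + len - buf.len() });
    }
    Ok(buf[1..1 + len].to_vec())
}
```

A caller reading from a stream could then loop on `NeedMoreBytes`, appending input until decoding succeeds or a `Malformed` error is returned.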

Use `anyhow` for asn1-compiler

Right now we are using std::io::Error as a catch-all error type, falling back to std::io::ErrorKind::Other wherever nothing else applies. It will be a good idea to start using anyhow inside the asn1-compiler crate.

Possible to access value of key for variants?

Thank you for your excellent work on this project!

I was wondering if it is possible to access the value of the key meta value for the asn attribute which gets defined in the generated code?

For example, in ngap code that I generated, I see this:

pub enum InitiatingMessageValue {
    #[asn(key = 64)]
    Id_AMFCPRelocationIndication(AMFCPRelocationIndication),
    #[asn(key = 0)]
    Id_AMFConfigurationUpdate(AMFConfigurationUpdate),
    #[asn(key = 1)]
    Id_AMFStatusIndication(AMFStatusIndication),
    #[asn(key = 2)]
...

When defining an NGAP_PDU, I need to explicitly set the procedure_code for an InitiatingMessage. Right now, I am just manually filling in the literal value like 64, but I am wondering if there is some way to just reference a key value for Id_AMFCPRelocationIndication from the generated code?
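The generated code does not currently expose these, but one way it could (purely a sketch of possible codegen output, not what hampi emits today) is an associated constant per message type:

```rust
// Sketch: if codegen emitted an associated const alongside each message
// type, callers could reference the key instead of hard-coding 64.
// Hypothetical names; not the actual generated code.
pub struct AMFCPRelocationIndication;

impl AMFCPRelocationIndication {
    /// Value of `#[asn(key = 64)]` for this variant.
    pub const KEY: u8 = 64;
}
```

A caller filling in the procedure_code could then write `AMFCPRelocationIndication::KEY` instead of the literal 64.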

thread 'main' panicked

hello,

I tried running hampi against this freshly exported x2ap specification:

https://github.com/tramex/types_lte_3gpp/blob/main/asn/36.423_spec_X2AP.asn

You can test by using this command in the repo:

cargo run -- --codec uper --module code.re -- 36.423_spec_X2AP.asn

but I ended up having this error

thread 'main' panicked at asn-compiler/src/resolver/asn/values.rs:28:66:
called `Result::unwrap()` on an `Err` value: ParseIntError { kind: InvalidDigit }
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

Is that normal?

How to call hampi-rs-asn1c?

If I understand correctly, the hampi-rs-asn1c binary is supposed to generate some Rust source code from ASN.1 specifications.

However I don't manage to run that binary. When I try this:
target/debug/hampi-rs-asn1c --module abc --codec uper examples/specs/example/example.asn

I get this:

error: Found argument 'examples/specs/example/example.asn' which wasn't expected, or isn't valid in this context

Can you please provide some examples?

Building `asn1-compiler` crate with `nightly`

Right now we are using rustfmt programmatically to format the generated code. However the recommended way is to use it as a binary and not use it as a library.

To be able to do this we should be able to call rustfmt as a binary on the generated code inside the generator module. This would help in building the crates with nightly as well.

Optional extension fields are assumed to be present in sequences

When deserializing a sequence preamble, the code currently seems to assume that extension fields will be present when reading the optional-presence flags. If the serialization code doesn't include the extension fields, then this results in an error.

As an example, consider the following sequence: https://github.com/gabhijit/hampi/blob/master/examples/specs/e2sm/E2SM-KPM.asn#L55

There are 24 optional fields, but 3 of them are extension fields. The deserializer will fail if not all of the fields are present, with the following message:

Error { cause: BufferTooShort, msg: "PerCodec:GetBitError:Requested Bit 24, Remaining bits 22", context: [] }

Here is an example serialized E2SM-KPM-IndicationMessage that does not include the extension fields:

080000000100000000002043514901200000

You can add fake serialized extension fields by appending another 00 to the end of the message, and then deserialization will work fine.
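As a sketch of the intended accounting (illustrative only, not the crate's code): the root preamble of an APER SEQUENCE carries one presence bit per root optional field, plus one extension bit when the type is extensible; extension-addition fields are announced by a separate bitmap that appears only when the extension bit is set.

```rust
// Sketch: how many preamble bits the root of a SEQUENCE should consume.
// Extension-addition fields are NOT part of this count; they get their own
// bitmap later, and only when the extension bit is set.
fn root_preamble_bits(extensible: bool, root_optional_fields: usize) -> usize {
    (extensible as usize) + root_optional_fields
}
```

For the sequence above, 24 optional fields of which 3 are extension additions would give a root preamble of 21 presence bits plus the extension bit, not 24.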

Enumerated value panics due to empty named_root_values

Decoding an S1AP message

Hello
Is there an example that shows how to decode a message encoded with APER codec? For example, take the following message (which is an S1AP message) and return its Information Elements such as MME-UE-S1AP-ID or Cause.

msg = "0004001A00000300000005C0098134A200080004800411EA000240020000"

`UTCTime` support

The 3GPP LPP specification uses a number of UTCTime fields.

Currently after running hampi-rs-asn1c you get a Rust file that can't compile due to errors like:

error: This ASN.1 Type is not supported.
    --> src/lpp_bindings.rs:6507:14
     |
6507 | #[asn(type = "UTCTime")]
     |              ^^^^^^^^^

A quick fix is to replace all instances of UTCTime with UTF8String since time is just a formatted string.

It would be useful if the asn macro recognised UTCTime as a character string in https://github.com/gabhijit/hampi/blob/master/codecs_derive/src/per/mod.rs#L34

I think you should also add IA5String in there too since it is just a superset of Printable/VisibleString types, as well as NumericString
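The replacement workaround mentioned above could be scripted before compiling. A hedged sketch, where "lpp.asn" is a placeholder name and the one-line stand-in file is fabricated here just so the command can be demonstrated:

```shell
# Sketch of the workaround: rewrite UTCTime to UTF8String in a copy of the
# spec before running hampi-rs-asn1c. "lpp.asn" is a placeholder; in real
# use you would start from the actual LPP specification file.
printf 'utcTime    UTCTime    OPTIONAL\n' > lpp.asn
sed 's/UTCTime/UTF8String/g' lpp.asn > lpp_patched.asn
```

The patched file is then passed to hampi-rs-asn1c instead of the original.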

Fragmentation procedure missing

From @huixue

The more I look into the decoder, I think there's something missing in the handling of X.691 section 10.9.3.8.4, the "fragmentation procedure". Sections 10.9.3.8.1-10.9.3.8.4 describe the array of fragments that should be produced when the length is too big (over 16K): the content is fragmented, and a length is added to each fragment (some of these are just a number 1-4 indicating the count of 16K chunks).

If you take a look at the attached INTEGER_aper.c, generated by the asn1c compiler, lines 292-301 encode a series of fragments using a loop. Lines 126-142 do the corresponding decoding.

I don't see this part in the decoder. Am I missing something?
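The fragment-size schedule described above can be sketched as follows (illustrative only; the real encoder also has to emit the length markers themselves):

```rust
// Sketch: X.691 fragmentation splits content of 16K or more into fragments
// of 1..=4 multiples of 16K, followed by a final fragment shorter than 16K
// (possibly empty). This computes only the fragment sizes.
const K16: usize = 16 * 1024;

fn fragment_sizes(mut remaining: usize) -> Vec<usize> {
    let mut sizes = Vec::new();
    while remaining >= K16 {
        let multiples = (remaining / K16).min(4); // the marker encodes 1..=4
        sizes.push(multiples * K16);
        remaining -= multiples * K16;
    }
    sizes.push(remaining); // final fragment, length < 16K
    sizes
}
```

So 70,000 bytes would be emitted as one 64K fragment followed by a 4,464-byte remainder, each with its own length determinant.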

Problem with encode_printable_string()

Hello, congrats on v0.2.0! I'm just trying it out and I'm seeing a problem with printable string coding.

To repro it, try adding this test to the end of codecs/mod.rs. The test fails on the tip, but passes on an old branch I had lying around.

    use crate::aper::{decode::decode_printable_string, encode::encode_printable_string};

    #[test]
    fn printable_string() {
        let mut data = AperCodecData::new();
        let s1 = "hello".to_string();
        encode_printable_string(&mut data, Some(1), Some(150), true, &s1, false).unwrap();
        let s2 = decode_printable_string(&mut data, Some(1), Some(150), true).unwrap();
        assert_eq!(s1, s2);
    }

Failure message:

thread 'aper::tests::printable_string' panicked at 'assertion failed: `(left == right)`
  left: `"hello"`,
 right: `"\u{0}\u{0}\u{0}\u{0}\u{0}"`', codecs/src/aper/mod.rs:465:9

Consistent tracing in APER `encode` and `decode`

Right now there is some tracing available in the APER codecs, but the tracing is not consistent, and it becomes a challenge to debug which values are decoded, whether the same values are encoded, and where exactly the issue is.

As a first step towards fixing this, the tracing should be consistent in the encode and decode modules. As far as possible the tracing should follow the following format -

  1. Upon entering - trace the name of the function (and, if it makes sense, the passed parameters)
  2. Any intermediate tracing should use a "--->>" starter symbol, which we know is to be ignored. Ideally all such trace! calls should go; they should be covered by test cases.
  3. Just before exiting - dump the offset in the buffer.

Fixing this issue is important before we can fix #7. In fact, this issue is directly a consequence of trying to fix #7.

Handling of Sequence Extensions in `AperCodec` decode.

For decoding, the SequenceAdditionGroup and Component from the sequence extensions need to be supported. This needs to be handled further upstream in the parser and resolver modules, where we need to 'resolve' to a proper type for the extensions (right now all the components get merged into root_components, which is not entirely correct).

Parse EXPORTS

The SUPL specification uses the EXPORTS section. Currently the compiler fails with an unexpected token error because of this.

The asn1c documentation indicates they just parse but ignore the EXPORTS list.

I quickly modified the compiler to do that and it seems to do the job as it no longer fails to compile the file.

An example of the SUPL ASN file is available at https://github.com/boundary/wireshark/blob/master/asn1/ulp/SUPL.asn#L16

Parse SEQUENCE SIZE ( ... ) OF ...

The SUPL specification uses the following syntax a few times:
SEQUENCE SIZE (1.. 1024) OF ReportData

Specifically the SIZE ( ... ) part is not enclosed in additional braces.

The asn1c compiler is able to parse this fine, but the compiler here fails to parse it.

I wasn't able to figure out how to modify the parser code so I just modified the file to add the extra braces like:
SEQUENCE (SIZE (1.. 1024)) OF ReportData
and I was able to compile the file.

An example SUPL ASN file is at https://github.com/boundary/wireshark/blob/master/asn1/ulp/SUPL.asn#L645

Provide `Default` deriving

Adding an option to derive Default would make it easier to fill out large structures which are mostly left at None or 0.
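What this would enable, sketched with a hand-written struct (hypothetical names, not generated code):

```rust
// Sketch: with Default derived, a mostly-optional struct can be built with
// struct-update syntax instead of spelling out every None field.
#[derive(Debug, Default, PartialEq)]
struct SetupRequest {
    id: u32,
    name: Option<String>,
    cause: Option<u8>,
}
```

Usage would then be `let req = SetupRequest { id: 7, ..Default::default() };` with every unmentioned field taking its zero/None default.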

Support for `COMPONENTS OF` Notation in asn1-compiler

As described in the X.680 specification, the notation COMPONENTS OF is used to define the inclusion, at this point in the list of components, of all the component types.

When I tried to compile such a type definition from MHEG-5.asn1, the compiler failed and threw an error when parsing ApplicationClass. When I commented out the line COMPONENTS OF GroupClass, parsing got past it and threw an error on the next similar line.

I am looking forward to the support for this notation.

Refactoring of Type Attributes handling for `INTEGER` and `ENUMERATED` Types

Right now there is a lot of code duplication in the way type attributes are handled inside individual decode modules; e.g. similar code appears in different modules and could be abstracted out into a single function called from each of them -

let sz_lb = ... { 
...
};

let sz_ub = ... {
...
};

let sz_ext = ...{
...
};

This code can be abstracted out to a single module like codecs_derive/src/aper/constraints.rs by defining functions there and letting the other modules call them.
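A sketch of the proposed shared helper (the names and attribute shapes here are assumptions, not the crate's actual code):

```rust
// Sketch: collect the three size attributes in one place so each decode
// module calls a single helper instead of repeating the three blocks.
#[derive(Debug, PartialEq)]
struct SizeConstraints {
    lb: Option<i128>,
    ub: Option<i128>,
    extensible: bool,
}

fn size_constraints(lb: Option<&str>, ub: Option<&str>, extensible: bool) -> SizeConstraints {
    SizeConstraints {
        lb: lb.map(|s| s.parse().expect("sz_lb must be an integer literal")),
        ub: ub.map(|s| s.parse().expect("sz_ub must be an integer literal")),
        extensible,
    }
}
```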

ASN.1 compiler deriving Eq support for ASN.1 REAL values causes Rust compiler errors

Now that the ASN.1 REAL type is supported as an f64, there is a new issue.

The derive macros for the code generation tool are set at a global level. Deriving the Eq trait is likely common for some of the generated types; however, f64 does not support the Eq trait, only PartialEq.

There needs to be some way to generate code such that the Eq trait is not generated for REAL valued types.

Alignment error parsing a NGAP (APER) NG Setup Request

Hi! Thank you for the Hampi crate, it's very useful to have a Rust-based ASN.1 compiler as an alternative to asn1c and FFI :)

I wanted to ask if I have just found a bug in the parsing of a NGAP NGSetupRequest message (which is APER-encoded):

    let buf: [u8; 57] = [
        0x00, 0x15, 0x00, 0x35, 0x00, 0x00, 0x04, 0x00, 0x1b, 0x00, 0x08, 0x00, 0x02, 0xf8, 0x39,
        0x03, 0x80, 0x00, 0x04, 0x00, 0x52, 0x40, 0x09, 0x03, 0x00, 0x4e, 0x65, 0x72, 0x76, 0x69,
        0x6f, 0x6e, 0x00, 0x66, 0x00, 0x10, 0x00, 0x00, 0x00, 0x00, 0x01, 0x00, 0x02, 0xf8, 0x39,
        0x00, 0x00, 0x10, 0x08, 0x00, 0x00, 0x01, 0x00, 0x15, 0x40, 0x01, 0x40,
    ];

    let mut codec_data = PerCodecData::from_slice_aper(&buf);
    let ngap_pdu = ngap::NGAP_PDU::aper_decode(&mut codec_data).expect("Error decoding NGAP PDU");
    println!("Decoded NGAP PDU: {:?}", ngap_pdu);

This results in the following error, complaining that the first byte of the gNB ID field (byte 15, first byte on second line) should not be 0x03.

Error { cause: InvalidAlignment, msg: "3 Padding bits at Offset 125 not all '0'.", context: [] }

In fact, changing that byte to 0x00 will make it parse successfully (although I have not verified the contents), but is not what we want since the byte sequence is supposed to be valid to begin with. The byte sequence is taken from a valid 5G NGAP trace that e.g. Wireshark and Open5GS will parse fine, attached.

The gNB ID field is defined in ASN.1 as BIT STRING (SIZE(22..32)). I believe the 0x03 is the length of the gNB ID that follows it (0x80 0x00 0x04, which is what Wireshark displays).

Still not very sure if this is a bug, but if it is, could it be fixed? Otherwise, how should I work around it?

pcap attachment

Context for error message

Hello again @gabhijit. Wanted to run an idea past you before diving in.

It would be helpful if error messages provided context saying how far we had got through the encode / decode when we hit the problem. To achieve this I am thinking that the context could be added as the error is propagated.

Imagine that the starting point is this (which might be macro derived):

impl AperCodec for Tac {
    fn decode(data: &mut AperCodecData) -> Result<Self::Output, AperCodecError> {
        Ok(Self(aper::decode::decode_octetstring(data, Some(3), Some(3), false)?))
    }
}

Then, to add the error context, you would wrap that in a block with a map_err at the end, like this...

impl AperCodec for Tac {
    fn decode(data: &mut AperCodecData) -> Result<Self::Output, AperCodecError> {
        {
            Ok(Self(aper::decode::decode_octetstring(data, Some(3), Some(3), false)?))
        }.map_err(|e: AperCodecError| e.add_context("Tac")) 
    }
}

In which case we would need:

impl Error {
    pub fn add_context(self, context_elem: &str) -> Self {...}
}
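A minimal sketch of the add_context idea on a stand-in error type (the real AperCodecError has more fields; names here are illustrative):

```rust
// Sketch only: a stand-in error type showing how context could accumulate
// as the error propagates up through nested decode calls.
#[derive(Debug)]
pub struct CodecError {
    msg: String,
    context: Vec<String>,
}

impl CodecError {
    pub fn new(msg: &str) -> Self {
        CodecError { msg: msg.to_string(), context: Vec::new() }
    }

    pub fn add_context(mut self, context_elem: &str) -> Self {
        self.context.push(context_elem.to_string());
        self
    }
}
```

Each map_err along the call chain appends one element, so the final error carries the path from the innermost primitive up to the top-level PDU.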

Do you think this would fly? As usual, I would be happy to send a PR for codecs but (because I am working with a different upper layer) would prefer not to include the work to codecs_derive to make use of it.

Adding CLI switches to the compiler module

Right now, while generating the code, we by default generate #[derive(AperCodec)] on the generated structs. Ideally the codecs to be generated should be taken as a command line switch (something like --codec aper), with aper being the only supported codec right now.

Also, We may want to additionally control the following through command line switches -

  • Visibility of the struct members (right now the default is pub; maybe we want pub(super) or pub(crate)?)
  • Take the module name as a parameter --module modulename (right now we are simply redirecting output, e.g. > ngap.rs)

There might be others too - but a basic infra for having CLI switches with a couple of switches supported will be a good start.

Collections fail to decode consistently

Hello, I have encountered an issue in which messages containing collections (vectors in this case) of submessages fail to decode properly when the number of elements in the collection changes (for test purposes, I have experimented with creating a collection of identical elements, to ensure that it is not something unique to a given element that causes failure).

I have tried to provide as much context as possible, but it is difficult to produce a minimal reproducible example in this case. I will comment more on this at the end of this post.

I have a data type which abides by the form:

#[derive(asn1_codecs_derive :: UperCodec, Debug)]
#[asn(type = "SEQUENCE", extensible = true)]
pub struct GNSS_SSR_OrbitCorrections_r15 {
    pub epoch_time_r15: GNSS_SystemTime,
    pub ssr_update_interval_r15: INTEGER_580,
    pub satellite_reference_datum_r15: ENUMERATED_581,
    pub iod_ssr_r15: INTEGER_582,
    pub ssr_orbit_correction_list_r15: SSR_OrbitCorrectionList_r15, // <- where I believe the problem lies
}

I do not believe that many of these fields are relevant, but know that SSR_OrbitCorrectionList_r15 is a collection which takes the form:

#[derive(asn1_codecs_derive :: UperCodec, Debug)]
#[asn(type = "SEQUENCE-OF", sz_extensible = false, sz_lb = "1", sz_ub = "64")]
pub struct SSR_OrbitCorrectionList_r15(pub Vec<SSR_OrbitCorrectionSatelliteElement_r15>);

To ensure consistency, elements of SSR_OrbitCorrectionList_r15 (type SSR_OrbitCorrectionSatelliteElement_r15) are duplicates of one another. SSR_OrbitCorrectionSatelliteElement_r15 contains primitive data types.

Upon encountering a decode failure, I wrote a test to verify that encoding and subsequent decoding of a message of a collection type is successful

// verify that encoding and subsequent decoding of a message of a collection type is successful
pub fn encode_decode_sanity_check_orbit_correction_list(
    orbit_corrections: GNSS_SSR_OrbitCorrections_r15,
) -> Result<()> {
    let mut data: PerCodecData = PerCodecData::new_uper();

    orbit_corrections
        .ssr_orbit_correction_list_r15
        .uper_encode(&mut data)?;

    let bytes = data.into_bytes();
    println!("bytes.len(): {}", bytes.len());

    let mut decode_data: PerCodecData = PerCodecData::from_slice_uper(&bytes);

    let msg = SSR_OrbitCorrectionList_r15::uper_decode(&mut decode_data)?;

    Ok(())
}

This function SOMETIMES returns Ok(()). Most of the time it returns
Error: PerCodec:GetBitError:Requested Bit <something>, Remaining bits <something else>

By modifying ONLY the number of elements in SSR_OrbitCorrectionList_r15, (Vec<SSR_OrbitCorrectionSatelliteElement_r15>), I can achieve different results:

  • sometimes the decoding fails with the error described above
  • sometimes the decoding succeeds but the data has changed from what I encoded
  • sometimes the round trip succeeds nominally, though this is rare

I suspect that this problem relates to collections because I have performed this "roundtrip" process on all other subfields of the parent GNSS_SSR_OrbitCorrections_r15 structure and have not encountered this problem.

An additional piece of information you may find useful is that if I attempted to encode then immediately decode each element of the vector, each of these roundtrips would succeed. It is only when I try to encode the vector SSR_OrbitCorrectionList_r15, or the vector's parent GNSS_SSR_OrbitCorrections_r15 that failure occurs.

Please let me know how I can be of assistance. If the information I have provided is insufficient, I would be happy to send you further information related to the bindings I am using, tests, etc.

Codec support for `REAL` type

#98 implements support for the REAL ASN.1 type in the compiler. The current codec support for the REAL type is just a todo!(). Proper codec support for the REAL type should be added in asn-codecs and asn-codecs-derive.

adding bounds to data types causes UPER encoding to fail (range start index X out of range for slice of length Y)

Hello, I have found that attempting to UPER-encode primitive values (perhaps more complex structs too) causes an indexing error within mod.rs.

Please find a minimal reproducible example below:

asn1 bindings:

#[derive(asn1_codecs_derive :: UperCodec, Debug)]
#[asn(type = "INTEGER", lb = "0", ub = "63")]
pub struct INTEGER_20(pub u8);
let mut data: PerCodecData = PerCodecData::new_uper();
let my_int: INTEGER_20 = INTEGER_20(3);
my_int.uper_encode(&mut data)?;

output:

thread 'main' panicked at 'range start index 125 out of range for slice of length 16', library/core/src/slice/index.rs:52:5
stack backtrace:
   0: rust_begin_unwind
             at /rustc/a55dd71d5fb0ec5a6a3a9e8c27b2127ba491ce52/library/std/src/panicking.rs:584:5
   1: core::panicking::panic_fmt
             at /rustc/a55dd71d5fb0ec5a6a3a9e8c27b2127ba491ce52/library/core/src/panicking.rs:142:14
   2: core::slice::index::slice_start_index_len_fail_rt
             at /rustc/a55dd71d5fb0ec5a6a3a9e8c27b2127ba491ce52/library/core/src/slice/index.rs:52:5
   3: core::ops::function::FnOnce::call_once
             at /rustc/a55dd71d5fb0ec5a6a3a9e8c27b2127ba491ce52/library/core/src/ops/function.rs:248:5
   4: core::intrinsics::const_eval_select
             at /rustc/a55dd71d5fb0ec5a6a3a9e8c27b2127ba491ce52/library/core/src/intrinsics.rs:2696:5
   5: core::slice::index::slice_start_index_len_fail
             at /rustc/a55dd71d5fb0ec5a6a3a9e8c27b2127ba491ce52/library/core/src/slice/index.rs:42:9
   6: <core::ops::range::RangeFrom<usize> as core::slice::index::SliceIndex<[T]>>::index
             at /rustc/a55dd71d5fb0ec5a6a3a9e8c27b2127ba491ce52/library/core/src/slice/index.rs:390:13
   7: core::slice::index::<impl core::ops::index::Index<I> for [T]>::index
             at /rustc/a55dd71d5fb0ec5a6a3a9e8c27b2127ba491ce52/library/core/src/slice/index.rs:18:9
   8: core::array::<impl core::ops::index::Index<I> for [T; N]>::index
             at /rustc/a55dd71d5fb0ec5a6a3a9e8c27b2127ba491ce52/library/core/src/array/mod.rs:286:9
   9: asn1_codecs::per::common::encode::encode_internal::encode_constrained_whole_number_common
             at /Users/ayden.chubbic/.cargo/registry/src/github.com-1ecc6299db9ec823/asn1-codecs-0.5.1/src/per/common/encode/encode_internal.rs:128:26
  10: asn1_codecs::per::common::encode::encode_integer_common
             at /Users/ayden.chubbic/.cargo/registry/src/github.com-1ecc6299db9ec823/asn1-codecs-0.5.1/src/per/common/encode/mod.rs:88:13
  11: asn1_codecs::per::uper::encode::encode_integer
             at /Users/ayden.chubbic/.cargo/registry/src/github.com-1ecc6299db9ec823/asn1-codecs-0.5.1/src/per/uper/encode/mod.rs:74:5
  12: <generate_lpp::lpp_bindings::INTEGER_20 as asn1_codecs::per::uper::UperCodec>::uper_encode
             at ./src/lpp_bindings.rs:7924:10
  13: generate_lpp::load_lpp_data_into_file
             at ./src/main.rs:62:5
  14: generate_lpp::main
             at ./src/main.rs:22:5
  15: core::ops::function::FnOnce::call_once
             at /rustc/a55dd71d5fb0ec5a6a3a9e8c27b2127ba491ce52/library/core/src/ops/function.rs:248:5

NOTE: removal of lb = "0", ub = "63" resolves the immediate error (encoding no longer panics), but subsequent decoding fails. E.g. running

let bytes = data.into_bytes();
let mut file = OpenOptions::new().write(true).open(filepath.as_path())?;

let decoded_bytes = std::fs::read(filepath)?;
let mut data: PerCodecData = PerCodecData::from_slice_uper(&decoded_bytes);
let msg = <bindings_file>::INTEGER_20::uper_decode(&mut data)?;
println!("msg contents: {}", msg.0);

yields inaccurate numbers

Add support for serde Serialize, Deserialize

In generated rust, such as ngap.rs, add Serialize, Deserialize for serde use
e.g.

#[derive(Debug, AperCodec)]
#[asn(type = "SEQUENCE", extensible = true)]
pub struct UEContextReleaseRequest {
    pub protocol_i_es: UEContextReleaseRequestProtocolIEs,
}

would be

#[derive(Debug, AperCodec, Serialize, Deserialize)]
#[asn(type = "SEQUENCE", extensible = true)]
pub struct UEContextReleaseRequest {
    pub protocol_i_es: UEContextReleaseRequestProtocolIEs,
}

Essentially, is there a way to control what gets derived during codegen?

The reason is that the generated Rust is very likely to form part of an application's data-structure definitions, which should ideally be usable with serde.
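One way such a knob could look is sketched below. This is purely illustrative; the names (emit_struct, extra_derives) are hypothetical and not part of hampi's actual codegen API. The idea is that the generator keeps its mandatory derives and appends any caller-supplied ones.

```rust
// Hypothetical codegen helper: always derive Debug + AperCodec, then
// append user-requested derives such as Serialize/Deserialize.
fn emit_struct(name: &str, extra_derives: &[&str]) -> String {
    let mut derives = vec!["Debug", "AperCodec"];
    derives.extend_from_slice(extra_derives);
    format!(
        "#[derive({})]\n#[asn(type = \"SEQUENCE\", extensible = true)]\npub struct {} {{ /* fields */ }}",
        derives.join(", "),
        name
    )
}

fn main() {
    let code = emit_struct("UEContextReleaseRequest", &["Serialize", "Deserialize"]);
    assert!(code.starts_with("#[derive(Debug, AperCodec, Serialize, Deserialize)]"));
    println!("{code}");
}
```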

decode_printable_string() always returns empty string

Hi there @gabhijit,

decode_printable_string() returns Ok(String::new()) when it should return Ok(out). See https://github.com/gabhijit/hampi/blob/ca9875e8f85065dcf6af01cd0686fc1388585a1a/codecs/src/aper/decode/decode_charstrings.rs#L97
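The bug class is easy to reproduce in isolation. The sketch below (my own minimal reconstruction, not the codec's actual code) builds the output string but, as in the linked line, a buggy version would return a fresh empty String instead of the accumulator.

```rust
// Minimal illustration of the reported bug class: the function must
// return the accumulated `out`, not a new empty String.
fn collect_printable(bytes: &[u8]) -> Result<String, String> {
    let mut out = String::new();
    for &b in bytes {
        if b.is_ascii_graphic() || b == b' ' {
            out.push(b as char);
        } else {
            return Err(format!("not printable: {b:#x}"));
        }
    }
    Ok(out) // the reported bug was returning Ok(String::new()) here
}

fn main() {
    assert_eq!(collect_printable(b"internet").unwrap(), "internet");
    assert!(collect_printable(&[0x00]).is_err());
    println!("ok");
}
```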

I'd normally submit a pull request, but I'm also hitting a build problem that I haven't had time to debug yet, about const_fn being removed in a crate pulled in underneath rustfmt. Any ideas how to fix that?

58 | #![cfg_attr(extprim_channel="unstable", feature(llvm_asm, test, specialization, const_fn))]
   |                                                                                 ^^^^^^^^ feature has been removed
   |
   = note: split into finer-grained feature gates

Issue with ASN1 BIT STRING type

Hi, first of all thank you for providing this helpful tool.

I currently encounter an issue when looking at the decoded output of elements in S1AP (F1AP, NGAP) that are based on the BIT STRING type.

Here an example snippet of a decoded S1AP PDU:

transport_layer_address: TransportLayerAddress(
                                                        BitVec<u8, bitvec::order::Msb0> {
                                                            addr: 0x00005640288a4050,
                                                            head: 000,
                                                            bits: 32,
                                                            capacity: 64,
                                                        } [

Here is the Hex string of the S1AP message that I try to decode:

00090080c50000060000000200070008000200970042000a1805c81a406002cd29c0001800770000340072450009270f80ac100a650000000763277efa7b4f0207420149062000f1100001003d5201c101090908696e7465726e65740501ac11f1ff5e04fefe9b6927208080211003010010810608080808830608080404000d04080808080010020578500bf600f1100001011a984cf8172164020000006b000518000c0000004900200409ae936c4af5643f94a220c336c6354eb82331071245bb367a3b8fcca4c596
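For context on interpreting this field: a TransportLayerAddress BIT STRING of exactly 32 bits carries an IPv4 address. The sketch below is my own illustration (it does not use the bitvec type from the decoded output) of rendering such a 32-bit payload, assuming the bits arrived MSB-first as 4 bytes.

```rust
// Render a 32-bit transport layer address payload as dotted-quad IPv4.
// Assumes the BIT STRING content is exactly 4 MSB-first bytes.
fn ipv4_from_bits(bytes: &[u8]) -> Option<String> {
    if bytes.len() != 4 {
        return None; // not a 32-bit (IPv4) address
    }
    Some(format!("{}.{}.{}.{}", bytes[0], bytes[1], bytes[2], bytes[3]))
}

fn main() {
    assert_eq!(ipv4_from_bits(&[172, 16, 10, 101]).unwrap(), "172.16.10.101");
    assert!(ipv4_from_bits(&[1, 2, 3]).is_none());
    println!("ok");
}
```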

Incorrect encode of INTEGER(0..4294967295)

Constrained 32-bit integer length wrongly encoded in 3 bits instead of 2.

For example, INTEGER(0..4294967295) value 0x10203040 encodes to 0x6010203040 (leading bits 011) when it should be 0xC010203040 (leading bits 11).
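The arithmetic behind the expected encoding can be checked directly. Per X.691, when a constrained integer's range exceeds 64K the value is encoded in the minimum number of octets, preceded by a length count that is itself a constrained whole number in 1..=max_octets, occupying ceil(log2(max_octets)) bits. This sketch (my own, not the codec's code) shows why the length field here should be 2 bits, so a 4-octet value like 0x10203040 should lead with the bits 11.

```rust
// Bits occupied by the length count for a range needing up to
// `max_octets` octets (the count is constrained to 1..=max_octets).
fn length_field_bits(max_octets: u32) -> u32 {
    32 - (max_octets - 1).leading_zeros()
}

fn main() {
    // INTEGER(0..4294967295): at most 4 octets, so a 2-bit length field.
    // A 4-octet value encodes length index 3 = binary 11, giving a
    // leading byte of 0xC0.., not 0x60.. as the buggy 3-bit field does.
    assert_eq!(length_field_bits(4), 2);
    assert_eq!(length_field_bits(2), 1);
    println!("ok");
}
```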

Support for Tag resolution in the `resolver`

A tag needs to be properly assigned to a type before it can be used in BER/DER encoding. We have support for tags in the parser crate, where they can be parsed. Similar support for tags needs to be added to the Asn1resolver structure as well.

See also: #75

NAS-5G

Hi!

Is it possible to use hampi to process the ASN.1 specs for NAS 5G and then use the asn1-compiler to build a Rust data model that can encode/decode NAS 5G messages?
If not, is there any alternative for doing this?

Thanks!!
