
up-transport-zenoh-rust's Introduction

up-transport-zenoh-rust

uProtocol transport implementation for Zenoh in Rust

Build

# Check clippy
cargo clippy --all-targets
# Build
cargo build
# Run test
cargo test
# Test coverage
cargo tarpaulin -o lcov -o html --output-dir target/tarpaulin

Examples

The examples of up-transport-zenoh-rust can be found under the examples folder. The commands below assume a debug build.1

# Publisher
./target/debug/examples/publisher
# Subscriber
./target/debug/examples/subscriber
# Notifier
./target/debug/examples/notifier
# Notification Receiver
./target/debug/examples/notification_receiver
# RPC Server
./target/debug/examples/rpc_server
# RPC Client
./target/debug/examples/rpc_client
# L2 RPC Client
./target/debug/examples/l2_rpc_client

For advanced Zenoh configuration, you can either use -h to see more details or pass a configuration file with -c. An example configuration file is under the config folder.
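If you want to apply such a configuration programmatically, a minimal sketch is shown below. It assumes a constructor along the lines of UPTransportZenoh::new(config, uri) and an illustrative config file name; check the crate documentation for the exact signature.

use up_transport_zenoh::UPTransportZenoh;
use zenoh::config::Config;

#[tokio::main]
async fn main() {
    // Load an advanced Zenoh configuration from a file (path is illustrative).
    let zenoh_config = Config::from_file("config/DEFAULT_CONFIG.json5")
        .expect("failed to load Zenoh config");

    // Construct the transport with that configuration and the local entity's URI
    // (constructor name/signature assumed; consult the crate docs).
    let _transport = UPTransportZenoh::new(zenoh_config, "//my_authority/ABCD/1/0")
        .await
        .expect("failed to create transport");
}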

Note

The implementation follows the spec defined in up-l1/zenoh.

Footnotes

  1. Some PC configurations cannot connect locally. Add multicast to the lo interface using $ sudo ip link set dev lo multicast on

up-transport-zenoh-rust's People

Contributors

dmacattack, eclipse-uprotocol-bot, evshary, plevasseur, stevenhartley


up-transport-zenoh-rust's Issues

Ability to record all messages and message types over all UAuthority

Hey @evshary -- I think in support of the up-zenoh-recorder we'll need the ability to listen for all messages coming over all authorities.

It seems to me we could accomplish this by handing in an empty UUri, i.e. one where even the UAuthority is empty for now, when calling register_listener().

i.e.

let recording_uuri = UUri::default();
recorder_up_client.register_listener(recording_uuri, my_recording_listener);

I think this would be able to handle the use-case of passively listening for Publish, Notification, and Request message types.
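As an illustration, a minimal sketch of such a recording listener might look like the following. It assumes a register_listener signature roughly like the current up-rust one (source filter, optional sink filter, listener) and that a wildcard filter is accepted, which is exactly the open question here.

use std::sync::Arc;
use async_trait::async_trait;
use up_rust::{UListener, UMessage, UStatus, UTransport, UUri};

// A recording listener that simply logs every message it receives.
struct RecordingListener;

#[async_trait]
impl UListener for RecordingListener {
    async fn on_receive(&self, msg: UMessage) {
        println!("recorded: {:?}", msg.attributes);
    }
}

// Registration sketch: `transport` is any UTransport implementation; the
// wildcard filters are assumed to mean "accept everything".
async fn register_recorder(transport: &dyn UTransport) -> Result<(), UStatus> {
    let any = UUri::any();
    transport
        .register_listener(&any, Some(&any), Arc::new(RecordingListener))
        .await
}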

Publish & Notification

Publish and Notification are straightforward, since they use the Zenoh Subscriber pattern.

Request

For the Request message type we would just need to be sure not to send a Response, so as not to "compete" with the actual service listening on that UUri.

Response

However, I'm unclear on how we would be able to passively listen to the Response message type, since in Zenoh terms, when we handle a Request message we call reply on the query. It seems to me there's nothing "externally visible" for the uRecorder to see when that happens. But maybe I'm wrong and my test case is wrong.

  • Could you help me understand if the uRecorder would be able to see the Response as well when it's sent up by a service?

Test Code

I opened a draft PR over here to show the test code which is failing.

Add Notification examples

We already have Pub/Sub and RpcServer/Client examples; it would be good to have a Notification example as well.
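A sketch of what such a notifier example might do is below. The URIs are illustrative and the UMessageBuilder calls are assumed from current up-rust; the exact API should be checked against the crate.

use up_rust::{UMessageBuilder, UPayloadFormat, UTransport, UUri};

// Build a Notification (origin topic -> destination uE) and send it over the transport.
async fn send_notification(transport: &dyn UTransport) {
    let origin = UUri::try_from("//notifier/1/1/8001").expect("valid origin URI");
    let destination = UUri::try_from("//receiver/2/1/0").expect("valid destination URI");

    let message = UMessageBuilder::notification(origin, destination)
        .build_with_payload("hello from notifier", UPayloadFormat::UPAYLOAD_FORMAT_TEXT)
        .expect("failed to build notification");

    transport.send(message).await.expect("failed to send notification");
}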

Undeclare Queryable

When a listener for RPC requests is being unregistered, it seems like the corresponding Zenoh Queryable is not being undeclared.

Does this represent a memory leak?
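For reference, a minimal sketch of the Zenoh side (0.11-style API, key expression illustrative): the Queryable handle returned by declare_queryable() would need to be kept so that unregister_listener() can undeclare it explicitly.

use zenoh::prelude::r#async::*;

// Keep the Queryable handle so it can be undeclared when the RPC listener is unregistered.
async fn register_and_unregister(session: &zenoh::Session) {
    let queryable = session
        .declare_queryable("up/example/**")
        .res()
        .await
        .expect("failed to declare queryable");

    // ... serve queries while the listener is registered ...

    // On unregister_listener(), explicitly undeclare so Zenoh releases the resource.
    queryable
        .undeclare()
        .res()
        .await
        .expect("failed to undeclare queryable");
}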

Use different/latest up-rust version

With the recent changes to accommodate current up-rust status, would it be possible to now change the up-rust dependency in Cargo.toml to just use 'latest' (i.e., remove the rev-pin)?

The reason for the ask is that the current rev forces use of an older up-rust version, which does not contain the recent improvements we made there...

Move the examples to another repository

We need to move the examples to another repository, just as up-zenoh-example-cpp does.
@stevenhartley Could you please create a repository up-zenoh-example-rust? To be honest, I'm not sure whether you prefer to have a separate repository for each language binding's examples or not. Let me know where you want to put the Zenoh Rust examples.

The requirement of source while using RPC

Hi @PLeVasseur and @stevenhartley

Since we didn't have much time to discuss the topic yesterday, it might be clearer to create an independent issue about it.
As far as I know, the requirement for a "source" comes from the need to route the response back to the sender.
However, I'm still a little confused about this. Let me list my questions so we can get on the same page.

  1. Why do we need to know the "source"?
    When using RPC, I think uProtocol application developers don't need to know how to send back the response.
    That should be done by the UPClient, keeping the upper layer agnostic.
    Taking Zenoh as an example, the RPC client uses get to send a query, and the RPC server's queryable callback will be triggered and receive the query.
    https://github.com/ZettaScaleLabs/up-client-zenoh-rust/blob/5faef310caa0c5e2b3f6bcc4494720e418e194f0/src/lib.rs#L333
    Then the RPC server can reply to the query directly.
    https://github.com/ZettaScaleLabs/up-client-zenoh-rust/blob/5faef310caa0c5e2b3f6bcc4494720e418e194f0/src/lib.rs#L204
    Zenoh takes care of how to send the response back.

  2. About the uStreamer implementation
    The following architecture is what I imagine the implementation will look like.
    I guess we don't need to deal with UURI routing here, but I might be missing something.
    Maybe we can align on the implementation and see where the gaps are.
    (diagram: uStreamer architecture)

  3. The difference between UUID & UURI
    Although I asked about it yesterday, I'm still quite confused.
    If every entity has its own UURI, why do we still need UUID?
    I thought the UUID was the identity of the entity, while the UURI is the topic with which different entities talk to each other.

  4. How to assign UURI to the entity
    If we assume every entity has its own UURI, then the question becomes how users initialize their own UURI.
    Let's take the RPC client as an example; here is the example code:
    https://github.com/ZettaScaleLabs/up-client-zenoh-rust/blob/5faef310caa0c5e2b3f6bcc4494720e418e194f0/examples/rpc_client.rs#L31
    Right now we only declare the UURI registered by the server, but we don't have any API for clients to create their own UURI.

register_listener() with UUri::any() as origin filter

I've run into an issue when using register_listener() in up-transport-zenoh. More specifically, the origin_filter parameter of that method:
Seeing a parameter with that name in the context of a register_listener() method, I was happily passing in a UUri::any(), wanting to listen to any source - and got an error message saying "Wrong combination of source UUri and sink UUri".

After digging, I found that this message comes out of get_listener_message_type, which complains because the any-UUri I'm passing in as origin does not contain resource_id == 0.
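For illustration, a minimal reproduction sketch (assuming a register_listener signature with source filter, optional sink filter, and listener; names are placeholders):

use std::sync::Arc;
use up_rust::{UListener, UTransport, UUri};

// `transport` is an up-transport-zenoh instance, `listener` any UListener implementation.
async fn reproduce(transport: &dyn UTransport, listener: Arc<dyn UListener>) {
    let origin_filter = UUri::any();
    let result = transport.register_listener(&origin_filter, None, listener).await;
    // Fails with: "Wrong combination of source UUri and sink UUri"
    assert!(result.is_err());
}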

Now, there are several points I want to raise about this:

  1. User expectation
    When I'm passing an "any URI" into a register_listener function as an origin filter, I really expect this to work and the listener to listen to any URI. There's not much more to it, really - this threw me for hours, and I really didn't expect the error to be related to the origin filter at all.

  2. Architectural layering
    As I see it, message types ("Request", "Notification", etc.) - the failed identification of which is the root cause of the error - are a uProtocol Level 2 concern, whereas UTransport::register_listener() is uProtocol Level 1. So having Level 2 concerns shine through into Level 1 methods seems like a design leak.

  3. User experience
    Whatever else we might agree to do here, I would really like to suggest somewhat more specific error message texts for get_listener_message_type() ;-)

WDYT?

Handle `Notification`-style `Publish` messages correctly in `send()`

FYI @evshary -- I'm sensing we will need to modify up-client-zenoh-rust's UTransport::send() so it can send Notification messages with their Zenoh KeyExpr set to their sink instead of their source. I'm referring specifically to this part of the code.

Notifications are not an explicit message type, but are lumped in with Publish.

You can discriminate them based on whether there's a sink or not:

  • No sink => it's actually Publish
  • Has sink => it's actually Notification

(Does this seem safe to do @sophokles73? Any other way to discriminate between them?)

I thought of this when reading Kai's comment.
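A sketch of what that discrimination could look like inside send() (purely illustrative; the helper name is made up and the rule is taken from the bullets above):

use up_rust::UUri;

// Rule proposed above: a "Publish"-style message that carries a sink UUri is
// really a Notification and should be keyed on its sink rather than its source.
fn is_notification(sink: Option<&UUri>) -> bool {
    sink.is_some()
}

// In UTransport::send(), roughly (pseudocode):
//   let key_expr = if is_notification(attributes.sink.as_ref()) {
//       to_zenoh_key(&attributes.sink)    // Notification: key on the sink
//   } else {
//       to_zenoh_key(&attributes.source)  // Publish: key on the source
//   };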

Update RPC implementation to handle message `UUID`s correctly

Highlighting a comment I made on a previous PR:

I think this RPC flow still needs some thought though.

We should not be creating a new UUIDBuilder here and getting the reqid from it. I believe this is where the UUIDBuilder that's been constructed already for the uE sending the request should be used.

As stated over here in the UUID spec, the rand_b portion of the UUID is used to uniquely identify the uE, so it should be "stable", and thus we should use the UUIDBuilder already associated with this uE.

I suppose that may mean that on construction, you'd need to construct a UUIDBuilder and then use it within RpcClient::invoke_method().
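A sketch of the idea (assuming the instance-based UUIDBuilder API discussed here): construct the builder once per uE and reuse it for every request ID.

use up_rust::{UUIDBuilder, UUID};

// One UUIDBuilder per uE, created at construction time, so the rand_b portion
// stays stable across all requests sent by this uE.
struct RpcClientSketch {
    uuid_builder: UUIDBuilder,
    // ... transport handle, own UUri, etc. elided ...
}

impl RpcClientSketch {
    fn new() -> Self {
        Self {
            uuid_builder: UUIDBuilder::new(),
        }
    }

    // Used by invoke_method() to obtain the reqid instead of creating a fresh builder.
    fn next_request_id(&self) -> UUID {
        self.uuid_builder.build()
    }
}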

WDYT?

test_blocking_user_callback fails

Environment:

$ lsb_release -a
LSB Version: core-11.1.0ubuntu2-noarch:printing-11.1.0ubuntu2-noarch:security-11.1.0ubuntu2-noarch
Distributor ID: Ubuntu
Description: Ubuntu 20.04.6 LTS
Release: 20.04
Codename: focal

Setup

Build the source:

cargo build

Run the tests:

cargo test

Problem observed

Zenoh seemingly fails to get the response. I assume it might be related to network configuration, or similar.

Skipping this test shows that the reply is also not received.

Logs

$ RUST_LOG=trace cargo test 
    Finished test [unoptimized + debuginfo] target(s) in 0.13s
     Running unittests src/lib.rs (target/debug/deps/up_transport_zenoh-d777a23230707f5a)

running 23 tests
test tests::test_get_listener_message_type::publish_message ... ok
test tests::test_get_listener_message_type::impossible_scenario_1 ... ok
test tests::test_get_listener_message_type::request_message ... ok
test tests::test_get_listener_message_type::impossible_scenario_4 ... ok
test tests::test_get_listener_message_type::notification_message ... ok
test tests::test_get_listener_message_type::impossible_scenario_3 ... ok
test tests::test_get_listener_message_type::all_messages_to_a_device ... ok
test tests::test_get_listener_message_type::listen_to_notification_and_response_message ... ok
test tests::test_get_listener_message_type::impossible_scenario_2 ... ok
test tests::test_get_listener_message_type::response_message ... ok
test tests::test_new_up_transport_zenoh::fails_with_invalid_uuri ... ok
test tests::test_new_up_transport_zenoh::succeeds_with_valid_uuri ... ok
test tests::test_to_zenoh_key_string::receive_all_notifications ... ok
test tests::test_new_up_transport_zenoh::fails_with_empty_uauthority ... ok
test tests::test_to_zenoh_key_string::send_notification ... ok
test tests::test_to_zenoh_key_string::subscribe_messages ... ok
test tests::test_to_zenoh_key_string::send_publish ... ok
test tests::test_to_zenoh_key_string::receive_all_messages_to_a_device ... ok
test tests::test_to_zenoh_key_string::receive_all_requests ... ok
test tests::test_to_zenoh_key_string::send_request ... ok
test uri_provider::tests::test_local_uri_provider::publish_notification_resource_id ... ok
test tests::test_new_up_transport_zenoh::fails_with_non_zero_resource_id ... ok
test uri_provider::tests::test_local_uri_provider::rpc_resource_id ... ok

test result: ok. 23 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.95s

     Running tests/blocking_callback.rs (target/debug/deps/blocking_callback-ecac795b12ccfd4d)

running 1 test
2024-08-05T21:21:20.307973Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::session: Config: Config { id: 507dda98534fff27fcca9e38e8bdaa29, metadata: Null, mode: None, connect: ConnectConfig { timeout_ms: None, endpoints: [], exit_on_failure: None, retry: None }, listen: ListenConfig { timeout_ms: None, endpoints: [], exit_on_failure: None, retry: None }, scouting: ScoutingConf { timeout: None, delay: None, multicast: ScoutingMulticastConf { enabled: None, address: None, interface: None, ttl: None, autoconnect: None, listen: None }, gossip: GossipConf { enabled: None, multihop: None, autoconnect: None } }, timestamping: TimestampingConf { enabled: None, drop_future_timestamp: None }, queries_default_timeout: None, routing: RoutingConf { router: RouterRoutingConf { peers_failover_brokering: None }, peer: PeerRoutingConf { mode: None } }, aggregation: AggregationConf { subscribers: [], publishers: [] }, transport: TransportConf { unicast: TransportUnicastConf { accept_timeout: 10000, accept_pending: 100, max_sessions: 1000, max_links: 1, lowlatency: false, qos: QoSUnicastConf { enabled: true }, compression: CompressionUnicastConf { enabled: false } }, multicast: TransportMulticastConf { join_interval: Some(2500), max_sessions: Some(1000), qos: QoSMulticastConf { enabled: false }, compression: CompressionMulticastConf { enabled: false } }, link: TransportLinkConf { protocols: None, tx: LinkTxConf { sequence_number_resolution: U32, lease: 10000, keep_alive: 4, batch_size: 65535, queue: QueueConf { size: QueueSizeConf { control: 1, real_time: 1, interactive_high: 1, interactive_low: 1, data_high: 2, data: 4, data_low: 2, background: 1 }, congestion_control: CongestionControlConf { wait_before_drop: 1000 }, backoff: 100 }, threads: 4 }, rx: LinkRxConf { buffer_size: 65535, max_message_size: 1073741824 }, tls: TLSConf { root_ca_certificate: None, server_private_key: None, server_certificate: None, client_auth: None, client_private_key: None, client_certificate: None, server_name_verification: None, root_ca_certificate_base64: None, server_private_key_base64: None, server_certificate_base64: None, client_private_key_base64: None, client_certificate_base64: None }, unixpipe: UnixPipeConf { file_access_mask: None } }, shared_memory: SharedMemoryConf { enabled: false }, auth: AuthConf { usrpwd: UsrPwdConf { user: None, password: None, dictionary_file: None }, pubkey: PubKeyConf { public_key_pem: None, private_key_pem: None, public_key_file: None, private_key_file: None, key_size: None, known_keys_file: None } } }, adminspace: AdminSpaceConf { enabled: false, permissions: PermissionsConf { read: true, write: false } }, downsampling: [], access_control: AclConfig { enabled: false, default_permission: Deny, rules: None }, plugins_loading: PluginsLoading { enabled: false, search_dirs: None }, plugins: Object {} }
2024-08-05T21:21:20.308122Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime: Zenoh Rust API v0.11.0
2024-08-05T21:21:20.308137Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime: Using ZID: 507dda98534fff27fcca9e38e8bdaa29
2024-08-05T21:21:20.308245Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::interceptor::access_control: Access control is disabled
2024-08-05T21:21:20.361886Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::gossip: [Gossip] Add node (self) 507dda98534fff27fcca9e38e8bdaa29
2024-08-05T21:21:20.361982Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::router: New Face{0, 507dda98534fff27fcca9e38e8bdaa29}
2024-08-05T21:21:20.362057Z TRACE test_blocking_user_callback ThreadId(02) zenoh::session: queryable(@/session/507dda98534fff27fcca9e38e8bdaa29/**)
2024-08-05T21:21:20.362106Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Try to add listener: tcp/[::]:0: ConnectionRetryConf { exit_on_failure: true, period_init_ms: 1000, period_max_ms: 4000, period_increase_factor: 2.0 }
2024-08-05T21:21:20.362333Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Listener added: tcp/[::]:38907
2024-08-05T21:21:20.362374Z TRACE                       acc-0 ThreadId(21) zenoh_link_tcp::unicast: Ready to accept TCP connections on: [::]:38907
2024-08-05T21:21:20.362556Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fd84:dbc5:7dc8:8dc3:679:3e8d:de1a:51c3]:38907
2024-08-05T21:21:20.362565Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fd84:dbc5:7dc8:8dc3:2164:f52f:bdf2:b748]:38907
2024-08-05T21:21:20.362569Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fe80::31fd:25cb:795e:9a4f]:38907
2024-08-05T21:21:20.362572Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/192.168.68.124:38907
2024-08-05T21:21:20.362648Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: UDP port bound to 224.0.0.224:7446
2024-08-05T21:21:20.362661Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Joined multicast group 224.0.0.224 on interface 192.168.68.124
2024-08-05T21:21:20.362666Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: zenohd listening scout messages on 224.0.0.224:7446
2024-08-05T21:21:20.362705Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: UDP port bound to 192.168.68.124:53094
2024-08-05T21:21:20.362792Z DEBUG                       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Waiting for UDP datagram...
2024-08-05T21:21:20.362853Z TRACE                       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Send Scout(Scout { version: 8, what: WhatAmIMatcher(131), zid: None }) to 224.0.0.224:7446 on interface 192.168.68.124
2024-08-05T21:21:21.065024Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::session: Config: Config { id: 9a81c1c00774e54c83d45e715a7b6b66, metadata: Null, mode: None, connect: ConnectConfig { timeout_ms: None, endpoints: [], exit_on_failure: None, retry: None }, listen: ListenConfig { timeout_ms: None, endpoints: [], exit_on_failure: None, retry: None }, scouting: ScoutingConf { timeout: None, delay: None, multicast: ScoutingMulticastConf { enabled: None, address: None, interface: None, ttl: None, autoconnect: None, listen: None }, gossip: GossipConf { enabled: None, multihop: None, autoconnect: None } }, timestamping: TimestampingConf { enabled: None, drop_future_timestamp: None }, queries_default_timeout: None, routing: RoutingConf { router: RouterRoutingConf { peers_failover_brokering: None }, peer: PeerRoutingConf { mode: None } }, aggregation: AggregationConf { subscribers: [], publishers: [] }, transport: TransportConf { unicast: TransportUnicastConf { accept_timeout: 10000, accept_pending: 100, max_sessions: 1000, max_links: 1, lowlatency: false, qos: QoSUnicastConf { enabled: true }, compression: CompressionUnicastConf { enabled: false } }, multicast: TransportMulticastConf { join_interval: Some(2500), max_sessions: Some(1000), qos: QoSMulticastConf { enabled: false }, compression: CompressionMulticastConf { enabled: false } }, link: TransportLinkConf { protocols: None, tx: LinkTxConf { sequence_number_resolution: U32, lease: 10000, keep_alive: 4, batch_size: 65535, queue: QueueConf { size: QueueSizeConf { control: 1, real_time: 1, interactive_high: 1, interactive_low: 1, data_high: 2, data: 4, data_low: 2, background: 1 }, congestion_control: CongestionControlConf { wait_before_drop: 1000 }, backoff: 100 }, threads: 4 }, rx: LinkRxConf { buffer_size: 65535, max_message_size: 1073741824 }, tls: TLSConf { root_ca_certificate: None, server_private_key: None, server_certificate: None, client_auth: None, client_private_key: None, client_certificate: None, server_name_verification: None, root_ca_certificate_base64: None, server_private_key_base64: None, server_certificate_base64: None, client_private_key_base64: None, client_certificate_base64: None }, unixpipe: UnixPipeConf { file_access_mask: None } }, shared_memory: SharedMemoryConf { enabled: false }, auth: AuthConf { usrpwd: UsrPwdConf { user: None, password: None, dictionary_file: None }, pubkey: PubKeyConf { public_key_pem: None, private_key_pem: None, public_key_file: None, private_key_file: None, key_size: None, known_keys_file: None } } }, adminspace: AdminSpaceConf { enabled: false, permissions: PermissionsConf { read: true, write: false } }, downsampling: [], access_control: AclConfig { enabled: false, default_permission: Deny, rules: None }, plugins_loading: PluginsLoading { enabled: false, search_dirs: None }, plugins: Object {} }
2024-08-05T21:21:21.065155Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime: Zenoh Rust API v0.11.0
2024-08-05T21:21:21.065171Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime: Using ZID: 9a81c1c00774e54c83d45e715a7b6b66
2024-08-05T21:21:21.065202Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::interceptor::access_control: Access control is disabled
2024-08-05T21:21:21.133267Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::gossip: [Gossip] Add node (self) 9a81c1c00774e54c83d45e715a7b6b66
2024-08-05T21:21:21.133339Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::router: New Face{0, 9a81c1c00774e54c83d45e715a7b6b66}
2024-08-05T21:21:21.133392Z TRACE test_blocking_user_callback ThreadId(02) zenoh::session: queryable(@/session/9a81c1c00774e54c83d45e715a7b6b66/**)
2024-08-05T21:21:21.133426Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Try to add listener: tcp/[::]:0: ConnectionRetryConf { exit_on_failure: true, period_init_ms: 1000, period_max_ms: 4000, period_increase_factor: 2.0 }
2024-08-05T21:21:21.133551Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Listener added: tcp/[::]:42003
2024-08-05T21:21:21.133583Z TRACE                       acc-0 ThreadId(21) zenoh_link_tcp::unicast: Ready to accept TCP connections on: [::]:42003
2024-08-05T21:21:21.133727Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fd84:dbc5:7dc8:8dc3:679:3e8d:de1a:51c3]:42003
2024-08-05T21:21:21.133734Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fd84:dbc5:7dc8:8dc3:2164:f52f:bdf2:b748]:42003
2024-08-05T21:21:21.133738Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fe80::31fd:25cb:795e:9a4f]:42003
2024-08-05T21:21:21.133741Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/192.168.68.124:42003
2024-08-05T21:21:21.133806Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: UDP port bound to 224.0.0.224:7446
2024-08-05T21:21:21.133815Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: Joined multicast group 224.0.0.224 on interface 192.168.68.124
2024-08-05T21:21:21.133820Z  INFO test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: zenohd listening scout messages on 224.0.0.224:7446
2024-08-05T21:21:21.133848Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::runtime::orchestrator: UDP port bound to 192.168.68.124:53700
2024-08-05T21:21:21.133922Z TRACE                       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Send Scout(Scout { version: 8, what: WhatAmIMatcher(131), zid: None }) to 224.0.0.224:7446 on interface 192.168.68.124
2024-08-05T21:21:21.134007Z DEBUG                       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Waiting for UDP datagram...
2024-08-05T21:21:21.365652Z TRACE                       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Send Scout(Scout { version: 8, what: WhatAmIMatcher(131), zid: None }) to 224.0.0.224:7446 on interface 192.168.68.124
2024-08-05T21:21:21.837214Z TRACE test_blocking_user_callback ThreadId(02) zenoh::session: subscribe(ke`up/nonblock_pub/1/1/8000/{}/{}/{}/{}`)
2024-08-05T21:21:21.837348Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::pubsub: Declare subscription Face{0, 9a81c1c00774e54c83d45e715a7b6b66}
2024-08-05T21:21:21.837465Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::resource: Register resource up/nonblock_pub/1/1/8000/{}/{}/{}/{}
2024-08-05T21:21:21.837524Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::pubsub: Register subscription up/nonblock_pub/1/1/8000/{}/{}/{}/{} for Face{0, 9a81c1c00774e54c83d45e715a7b6b66}
2024-08-05T21:21:21.837618Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::pubsub: compute_data_route(up/nonblock_pub/1/1/8000/{}/{}/{}/{}, 0, Router)
2024-08-05T21:21:21.837708Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::pubsub: compute_data_route(up/nonblock_pub/1/1/8000/{}/{}/{}/{}, 0, Peer)
2024-08-05T21:21:21.837749Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::pubsub: compute_data_route(up/nonblock_pub/1/1/8000/{}/{}/{}/{}, 0, Client)
2024-08-05T21:21:22.135809Z TRACE                       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Send Scout(Scout { version: 8, what: WhatAmIMatcher(131), zid: None }) to 224.0.0.224:7446 on interface 192.168.68.124
2024-08-05T21:21:22.839540Z TRACE test_blocking_user_callback ThreadId(02) zenoh::publication: write(ke`up/nonblock_pub/1/1/8000/{}/{}/{}/{}`, [...])
2024-08-05T21:21:22.839623Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::pubsub: Route data for res up/nonblock_pub/1/1/8000/{}/{}/{}/{}
2024-08-05T21:21:22.839657Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::pubsub: compute_data_route(up/nonblock_pub/1/1/8000/{}/{}/{}/{}, 0, Client)
2024-08-05T21:21:22.839774Z TRACE test_blocking_user_callback ThreadId(02) zenoh::publication: write(ke`up/nonblock_pub/1/1/8000/{}/{}/{}/{}`, [...])
2024-08-05T21:21:22.839788Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::pubsub: Route data for res up/nonblock_pub/1/1/8000/{}/{}/{}/{}
2024-08-05T21:21:22.839799Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::pubsub: compute_data_route(up/nonblock_pub/1/1/8000/{}/{}/{}/{}, 0, Client)
2024-08-05T21:21:23.367286Z TRACE                       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Send Scout(Scout { version: 8, what: WhatAmIMatcher(131), zid: None }) to 224.0.0.224:7446 on interface 192.168.68.124
2024-08-05T21:21:23.841351Z TRACE test_blocking_user_callback ThreadId(02) zenoh::session: unsubscribe(Subscriber { id: 1, key_expr: ke`up/nonblock_pub/1/1/8000/{}/{}/{}/{}` })
2024-08-05T21:21:23.841436Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::pubsub: Undeclare subscription Face{0, 9a81c1c00774e54c83d45e715a7b6b66}
2024-08-05T21:21:23.841508Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::pubsub: Unregister client subscription up/nonblock_pub/1/1/8000/{}/{}/{}/{} for Face{0, 9a81c1c00774e54c83d45e715a7b6b66}
2024-08-05T21:21:23.841591Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::pubsub: compute_data_route(up/nonblock_pub/1/1/8000/{}/{}/{}/{}, 0, Router)
2024-08-05T21:21:23.841644Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::pubsub: compute_data_route(up/nonblock_pub/1/1/8000/{}/{}/{}/{}, 0, Peer)
2024-08-05T21:21:23.841665Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::routing::hat::p2p_peer::pubsub: compute_data_route(up/nonblock_pub/1/1/8000/{}/{}/{}/{}, 0, Client)
2024-08-05T21:21:23.841721Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::resource: Unregister resource up/nonblock_pub/1/1/8000/{}/{}/{}/{}
2024-08-05T21:21:23.841742Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::resource: Unregister resource up/nonblock_pub/1/1/8000/{}/{}/{}
2024-08-05T21:21:23.841757Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::resource: Unregister resource up/nonblock_pub/1/1/8000/{}/{}
2024-08-05T21:21:23.841767Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::resource: Unregister resource up/nonblock_pub/1/1/8000/{}
2024-08-05T21:21:23.841781Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::resource: Unregister resource up/nonblock_pub/1/1/8000
2024-08-05T21:21:23.841793Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::resource: Unregister resource up/nonblock_pub/1/1
2024-08-05T21:21:23.841805Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::resource: Unregister resource up/nonblock_pub/1
2024-08-05T21:21:23.841820Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::resource: Unregister resource up/nonblock_pub
2024-08-05T21:21:23.841835Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::resource: Unregister resource up
2024-08-05T21:21:23.841922Z TRACE test_blocking_user_callback ThreadId(02) zenoh::session: close()
2024-08-05T21:21:23.841978Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::runtime: Runtime::close())
2024-08-05T21:21:23.842277Z TRACE test_blocking_user_callback ThreadId(02) zenoh_transport::unicast::manager: TransportManagerUnicast::clear())
2024-08-05T21:21:23.842661Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::tables: Close Face{0, 9a81c1c00774e54c83d45e715a7b6b66}
2024-08-05T21:21:23.842880Z TRACE test_blocking_user_callback ThreadId(02) zenoh::session: close()
2024-08-05T21:21:23.842900Z TRACE test_blocking_user_callback ThreadId(02) zenoh::net::runtime: Runtime::close())
2024-08-05T21:21:23.842984Z TRACE test_blocking_user_callback ThreadId(02) zenoh_transport::unicast::manager: TransportManagerUnicast::clear())
2024-08-05T21:21:23.843137Z DEBUG test_blocking_user_callback ThreadId(02) zenoh::net::routing::dispatcher::tables: Close Face{0, 507dda98534fff27fcca9e38e8bdaa29}
test test_blocking_user_callback ... FAILED

failures:

---- test_blocking_user_callback stdout ----
thread 'test_blocking_user_callback' panicked at tests/blocking_callback.rs:82:5:
assertion `left == right` failed
  left: ""
 right: "Pub 1"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
    test_blocking_user_callback

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 3.55s

error: test failed, to rerun pass `--test blocking_callback`

skipping test_blocking_user_callback

$ RUST_LOG=trace cargo test -- --skip test_blocking_user_callback
    Finished test [unoptimized + debuginfo] target(s) in 0.11s
     Running unittests src/lib.rs (target/debug/deps/up_transport_zenoh-d777a23230707f5a)

running 23 tests
test tests::test_get_listener_message_type::impossible_scenario_3 ... ok
test tests::test_get_listener_message_type::impossible_scenario_2 ... ok
test tests::test_get_listener_message_type::all_messages_to_a_device ... ok
test tests::test_get_listener_message_type::publish_message ... ok
test tests::test_get_listener_message_type::impossible_scenario_1 ... ok
test tests::test_get_listener_message_type::notification_message ... ok
test tests::test_get_listener_message_type::impossible_scenario_4 ... ok
test tests::test_get_listener_message_type::request_message ... ok
test tests::test_get_listener_message_type::response_message ... ok
test tests::test_get_listener_message_type::listen_to_notification_and_response_message ... ok
test tests::test_new_up_transport_zenoh::succeeds_with_valid_uuri ... ok
test tests::test_new_up_transport_zenoh::fails_with_non_zero_resource_id ... ok
test tests::test_to_zenoh_key_string::receive_all_messages_to_a_device ... ok
test tests::test_to_zenoh_key_string::subscribe_messages ... ok
test tests::test_new_up_transport_zenoh::fails_with_invalid_uuri ... ok
test uri_provider::tests::test_local_uri_provider::publish_notification_resource_id ... ok
test tests::test_new_up_transport_zenoh::fails_with_empty_uauthority ... ok
test tests::test_to_zenoh_key_string::send_publish ... ok
test tests::test_to_zenoh_key_string::receive_all_notifications ... ok
test tests::test_to_zenoh_key_string::receive_all_requests ... ok
test uri_provider::tests::test_local_uri_provider::rpc_resource_id ... ok
test tests::test_to_zenoh_key_string::send_notification ... ok
test tests::test_to_zenoh_key_string::send_request ... ok

test result: ok. 23 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.91s

     Running tests/blocking_callback.rs (target/debug/deps/blocking_callback-ecac795b12ccfd4d)

running 0 tests

test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 1 filtered out; finished in 0.00s

     Running tests/l2_rpc.rs (target/debug/deps/l2_rpc-051c33b42967c658)

running 1 test
2024-08-05T21:24:58.471361Z DEBUG test_l2_rpc ThreadId(02) zenoh::session: Config: Config { id: fb11b636718af048f21860e153484f7c, metadata: Null, mode: None, connect: ConnectConfig { timeout_ms: None, endpoints: [], exit_on_failure: None, retry: None }, listen: ListenConfig { timeout_ms: None, endpoints: [], exit_on_failure: None, retry: None }, scouting: ScoutingConf { timeout: None, delay: None, multicast: ScoutingMulticastConf { enabled: None, address: None, interface: None, ttl: None, autoconnect: None, listen: None }, gossip: GossipConf { enabled: None, multihop: None, autoconnect: None } }, timestamping: TimestampingConf { enabled: None, drop_future_timestamp: None }, queries_default_timeout: None, routing: RoutingConf { router: RouterRoutingConf { peers_failover_brokering: None }, peer: PeerRoutingConf { mode: None } }, aggregation: AggregationConf { subscribers: [], publishers: [] }, transport: TransportConf { unicast: TransportUnicastConf { accept_timeout: 10000, accept_pending: 100, max_sessions: 1000, max_links: 1, lowlatency: false, qos: QoSUnicastConf { enabled: true }, compression: CompressionUnicastConf { enabled: false } }, multicast: TransportMulticastConf { join_interval: Some(2500), max_sessions: Some(1000), qos: QoSMulticastConf { enabled: false }, compression: CompressionMulticastConf { enabled: false } }, link: TransportLinkConf { protocols: None, tx: LinkTxConf { sequence_number_resolution: U32, lease: 10000, keep_alive: 4, batch_size: 65535, queue: QueueConf { size: QueueSizeConf { control: 1, real_time: 1, interactive_high: 1, interactive_low: 1, data_high: 2, data: 4, data_low: 2, background: 1 }, congestion_control: CongestionControlConf { wait_before_drop: 1000 }, backoff: 100 }, threads: 4 }, rx: LinkRxConf { buffer_size: 65535, max_message_size: 1073741824 }, tls: TLSConf { root_ca_certificate: None, server_private_key: None, server_certificate: None, client_auth: None, client_private_key: None, client_certificate: None, server_name_verification: None, root_ca_certificate_base64: None, server_private_key_base64: None, server_certificate_base64: None, client_private_key_base64: None, client_certificate_base64: None }, unixpipe: UnixPipeConf { file_access_mask: None } }, shared_memory: SharedMemoryConf { enabled: false }, auth: AuthConf { usrpwd: UsrPwdConf { user: None, password: None, dictionary_file: None }, pubkey: PubKeyConf { public_key_pem: None, private_key_pem: None, public_key_file: None, private_key_file: None, key_size: None, known_keys_file: None } } }, adminspace: AdminSpaceConf { enabled: false, permissions: PermissionsConf { read: true, write: false } }, downsampling: [], access_control: AclConfig { enabled: false, default_permission: Deny, rules: None }, plugins_loading: PluginsLoading { enabled: false, search_dirs: None }, plugins: Object {} }
2024-08-05T21:24:58.471538Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime: Zenoh Rust API v0.11.0
2024-08-05T21:24:58.471560Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime: Using ZID: fb11b636718af048f21860e153484f7c
2024-08-05T21:24:58.471673Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::interceptor::access_control: Access control is disabled
2024-08-05T21:24:58.563716Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::hat::p2p_peer::gossip: [Gossip] Add node (self) fb11b636718af048f21860e153484f7c
2024-08-05T21:24:58.563817Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::router: New Face{0, fb11b636718af048f21860e153484f7c}
2024-08-05T21:24:58.563895Z TRACE test_l2_rpc ThreadId(02) zenoh::session: queryable(@/session/fb11b636718af048f21860e153484f7c/**)
2024-08-05T21:24:58.563951Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Try to add listener: tcp/[::]:0: ConnectionRetryConf { exit_on_failure: true, period_init_ms: 1000, period_max_ms: 4000, period_increase_factor: 2.0 }
2024-08-05T21:24:58.564226Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Listener added: tcp/[::]:37579
2024-08-05T21:24:58.564265Z TRACE       acc-0 ThreadId(21) zenoh_link_tcp::unicast: Ready to accept TCP connections on: [::]:37579
2024-08-05T21:24:58.564504Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fd84:dbc5:7dc8:8dc3:679:3e8d:de1a:51c3]:37579
2024-08-05T21:24:58.564520Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fd84:dbc5:7dc8:8dc3:2164:f52f:bdf2:b748]:37579
2024-08-05T21:24:58.564525Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fe80::31fd:25cb:795e:9a4f]:37579
2024-08-05T21:24:58.564531Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/192.168.68.124:37579
2024-08-05T21:24:58.564650Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: UDP port bound to 224.0.0.224:7446
2024-08-05T21:24:58.564667Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Joined multicast group 224.0.0.224 on interface 192.168.68.124
2024-08-05T21:24:58.564673Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: zenohd listening scout messages on 224.0.0.224:7446
2024-08-05T21:24:58.564718Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: UDP port bound to 192.168.68.124:47119
2024-08-05T21:24:58.564825Z TRACE       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Send Scout(Scout { version: 8, what: WhatAmIMatcher(131), zid: None }) to 224.0.0.224:7446 on interface 192.168.68.124
2024-08-05T21:24:58.564971Z DEBUG       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Waiting for UDP datagram...
2024-08-05T21:24:59.268144Z DEBUG test_l2_rpc ThreadId(02) zenoh::session: Config: Config { id: 5985ac09debe474058a8d6a0a3715798, metadata: Null, mode: None, connect: ConnectConfig { timeout_ms: None, endpoints: [], exit_on_failure: None, retry: None }, listen: ListenConfig { timeout_ms: None, endpoints: [], exit_on_failure: None, retry: None }, scouting: ScoutingConf { timeout: None, delay: None, multicast: ScoutingMulticastConf { enabled: None, address: None, interface: None, ttl: None, autoconnect: None, listen: None }, gossip: GossipConf { enabled: None, multihop: None, autoconnect: None } }, timestamping: TimestampingConf { enabled: None, drop_future_timestamp: None }, queries_default_timeout: None, routing: RoutingConf { router: RouterRoutingConf { peers_failover_brokering: None }, peer: PeerRoutingConf { mode: None } }, aggregation: AggregationConf { subscribers: [], publishers: [] }, transport: TransportConf { unicast: TransportUnicastConf { accept_timeout: 10000, accept_pending: 100, max_sessions: 1000, max_links: 1, lowlatency: false, qos: QoSUnicastConf { enabled: true }, compression: CompressionUnicastConf { enabled: false } }, multicast: TransportMulticastConf { join_interval: Some(2500), max_sessions: Some(1000), qos: QoSMulticastConf { enabled: false }, compression: CompressionMulticastConf { enabled: false } }, link: TransportLinkConf { protocols: None, tx: LinkTxConf { sequence_number_resolution: U32, lease: 10000, keep_alive: 4, batch_size: 65535, queue: QueueConf { size: QueueSizeConf { control: 1, real_time: 1, interactive_high: 1, interactive_low: 1, data_high: 2, data: 4, data_low: 2, background: 1 }, congestion_control: CongestionControlConf { wait_before_drop: 1000 }, backoff: 100 }, threads: 4 }, rx: LinkRxConf { buffer_size: 65535, max_message_size: 1073741824 }, tls: TLSConf { root_ca_certificate: None, server_private_key: None, server_certificate: None, client_auth: None, client_private_key: None, client_certificate: None, server_name_verification: None, root_ca_certificate_base64: None, server_private_key_base64: None, server_certificate_base64: None, client_private_key_base64: None, client_certificate_base64: None }, unixpipe: UnixPipeConf { file_access_mask: None } }, shared_memory: SharedMemoryConf { enabled: false }, auth: AuthConf { usrpwd: UsrPwdConf { user: None, password: None, dictionary_file: None }, pubkey: PubKeyConf { public_key_pem: None, private_key_pem: None, public_key_file: None, private_key_file: None, key_size: None, known_keys_file: None } } }, adminspace: AdminSpaceConf { enabled: false, permissions: PermissionsConf { read: true, write: false } }, downsampling: [], access_control: AclConfig { enabled: false, default_permission: Deny, rules: None }, plugins_loading: PluginsLoading { enabled: false, search_dirs: None }, plugins: Object {} }
2024-08-05T21:24:59.268278Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime: Zenoh Rust API v0.11.0
2024-08-05T21:24:59.268292Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime: Using ZID: 5985ac09debe474058a8d6a0a3715798
2024-08-05T21:24:59.268324Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::interceptor::access_control: Access control is disabled
2024-08-05T21:24:59.355041Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::hat::p2p_peer::gossip: [Gossip] Add node (self) 5985ac09debe474058a8d6a0a3715798
2024-08-05T21:24:59.355108Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::router: New Face{0, 5985ac09debe474058a8d6a0a3715798}
2024-08-05T21:24:59.355162Z TRACE test_l2_rpc ThreadId(02) zenoh::session: queryable(@/session/5985ac09debe474058a8d6a0a3715798/**)
2024-08-05T21:24:59.355196Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Try to add listener: tcp/[::]:0: ConnectionRetryConf { exit_on_failure: true, period_init_ms: 1000, period_max_ms: 4000, period_increase_factor: 2.0 }
2024-08-05T21:24:59.355312Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Listener added: tcp/[::]:36485
2024-08-05T21:24:59.355323Z TRACE       acc-0 ThreadId(21) zenoh_link_tcp::unicast: Ready to accept TCP connections on: [::]:36485
2024-08-05T21:24:59.355491Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fd84:dbc5:7dc8:8dc3:679:3e8d:de1a:51c3]:36485
2024-08-05T21:24:59.355498Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fd84:dbc5:7dc8:8dc3:2164:f52f:bdf2:b748]:36485
2024-08-05T21:24:59.355501Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/[fe80::31fd:25cb:795e:9a4f]:36485
2024-08-05T21:24:59.355505Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Zenoh can be reached at: tcp/192.168.68.124:36485
2024-08-05T21:24:59.355575Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: UDP port bound to 224.0.0.224:7446
2024-08-05T21:24:59.355584Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: Joined multicast group 224.0.0.224 on interface 192.168.68.124
2024-08-05T21:24:59.355588Z  INFO test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: zenohd listening scout messages on 224.0.0.224:7446
2024-08-05T21:24:59.355619Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::runtime::orchestrator: UDP port bound to 192.168.68.124:45441
2024-08-05T21:24:59.355705Z TRACE       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Send Scout(Scout { version: 8, what: WhatAmIMatcher(131), zid: None }) to 224.0.0.224:7446 on interface 192.168.68.124
2024-08-05T21:24:59.355815Z DEBUG       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Waiting for UDP datagram...
2024-08-05T21:24:59.566302Z TRACE       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Send Scout(Scout { version: 8, what: WhatAmIMatcher(131), zid: None }) to 224.0.0.224:7446 on interface 192.168.68.124
2024-08-05T21:25:00.058082Z TRACE test_l2_rpc ThreadId(02) zenoh::session: queryable(up/*/*/*/0/l2_responder/2/1/A0)
2024-08-05T21:25:00.058182Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::dispatcher::queries: Register queryable Face{0, fb11b636718af048f21860e153484f7c}
2024-08-05T21:25:00.058277Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::dispatcher::resource: Register resource up/*/*/*/0/l2_responder/2/1/A0
2024-08-05T21:25:00.058332Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::hat::p2p_peer::queries: Register queryable up/*/*/*/0/l2_responder/2/1/A0 (face: Face{0, fb11b636718af048f21860e153484f7c})
2024-08-05T21:25:00.058395Z TRACE test_l2_rpc ThreadId(02) zenoh::net::routing::hat::p2p_peer::queries: compute_query_route(up/*/*/*/0/l2_responder/2/1/A0, 0, Router)
2024-08-05T21:25:00.058461Z TRACE test_l2_rpc ThreadId(02) zenoh::net::routing::hat::p2p_peer::queries: compute_query_route(up/*/*/*/0/l2_responder/2/1/A0, 0, Peer)
2024-08-05T21:25:00.058493Z TRACE test_l2_rpc ThreadId(02) zenoh::net::routing::hat::p2p_peer::queries: compute_query_route(up/*/*/*/0/l2_responder/2/1/A0, 0, Client)
2024-08-05T21:25:00.357516Z TRACE       net-0 ThreadId(20) zenoh::net::runtime::orchestrator: Send Scout(Scout { version: 8, what: WhatAmIMatcher(131), zid: None }) to 224.0.0.224:7446 on interface 192.168.68.124
2024-08-05T21:25:01.060100Z TRACE test_l2_rpc ThreadId(02) zenoh::session: get(up/l2_requester/1/1/0/l2_responder/2/1/A0, BestMatching, QueryConsolidation { mode: Auto })
2024-08-05T21:25:01.060222Z TRACE test_l2_rpc ThreadId(02) zenoh::session: Register query 0 (nb_final = 2)
2024-08-05T21:25:01.060257Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::dispatcher::queries: Route query Face{0, 5985ac09debe474058a8d6a0a3715798}:0 for res up/l2_requester/1/1/0/l2_responder/2/1/A0
2024-08-05T21:25:01.060282Z TRACE test_l2_rpc ThreadId(02) zenoh::net::routing::hat::p2p_peer::queries: compute_query_route(up/l2_requester/1/1/0/l2_responder/2/1/A0, 0, Client)
2024-08-05T21:25:01.060324Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::dispatcher::queries: Send final reply Face{0, 5985ac09debe474058a8d6a0a3715798}:0 (no matching queryables or not master)
2024-08-05T21:25:01.060341Z TRACE test_l2_rpc ThreadId(02) zenoh::session: recv ResponseFinal ResponseFinal { rid: 0, ext_qos: QoS { priority: Data, congestion: Block, express: false }, ext_tstamp: None }
2024-08-05T21:25:01.060385Z TRACE test_l2_rpc ThreadId(02) zenoh::session: recv ResponseFinal ResponseFinal { rid: 0, ext_qos: QoS { priority: Data, congestion: Block, express: false }, ext_tstamp: None }
2024-08-05T21:25:01.060400Z TRACE test_l2_rpc ThreadId(02) zenoh::session: Close query 0
2024-08-05T21:25:01.060432Z ERROR test_l2_rpc ThreadId(02) up_transport_zenoh::rpc: Error while receiving Zenoh reply
2024-08-05T21:25:01.060593Z TRACE test_l2_rpc ThreadId(02) zenoh::session: close()
2024-08-05T21:25:01.060722Z TRACE test_l2_rpc ThreadId(02) zenoh::net::runtime: Runtime::close())
2024-08-05T21:25:01.060901Z TRACE test_l2_rpc ThreadId(02) zenoh_transport::unicast::manager: TransportManagerUnicast::clear())
2024-08-05T21:25:01.061143Z DEBUG test_l2_rpc ThreadId(02) zenoh::net::routing::dispatcher::tables: Close Face{0, 5985ac09debe474058a8d6a0a3715798}
test test_l2_rpc ... FAILED

failures:

---- test_l2_rpc stdout ----
thread 'test_l2_rpc' panicked at tests/l2_rpc.rs:103:10:
called `Result::unwrap()` on an `Err` value: RpcError(UStatus { code: INTERNAL, message: Some("Error while receiving Zenoh reply"), details: [], special_fields: SpecialFields { unknown_fields: UnknownFields { fields: None }, cached_size: CachedSize { size: 0 } } })
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
    test_l2_rpc

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 2.61s

error: test failed, to rerun pass `--test l2_rpc`

Update `UpClientZenoh` to take in constructor enough info that we know the `source` / response UUri when `invokeMethod()` is called

Hey @evshary 👋

I think I brought this up before, but it got lost in the shuffle.

When we create a UpClientZenoh, we need to provide it with enough info that, when RpcClient::invoke_method() is called, it is able to build the source UUri within the function call.
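A sketch of the proposal (field and type names illustrative, not the actual implementation):

use up_rust::UUri;

// Give the client its own UUri at construction time so that invoke_method()
// can fill in `source` (and thus the response address) without extra input.
pub struct UpClientZenohSketch {
    source_uuri: UUri,
    // ... Zenoh session and other transport state elided ...
}

impl UpClientZenohSketch {
    pub fn new(source_uuri: UUri) -> Self {
        Self { source_uuri }
    }

    // What invoke_method() would use as the `source` of each request.
    pub fn response_uuri(&self) -> UUri {
        self.source_uuri.clone()
    }
}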

Here are a few links to the up-client-android-java codebase with some pointers on how they went about it:

Could you check into this and propose a solution?

ttl for RpcClient

Ensure that the TTL from CallOptions is passed in the UAttributes, and that the futures are completed exceptionally when the message times out (i.e., we did not get a reply).
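A sketch of that behaviour using a timeout around the pending reply future (names and channel type are illustrative, not the crate's actual internals):

use std::time::Duration;
use tokio::{sync::oneshot, time::timeout};
use up_rust::UMessage;

// Wait for the reply up to `ttl_ms` milliseconds (taken from CallOptions and
// carried in the UAttributes); otherwise complete the future exceptionally.
async fn await_reply(
    ttl_ms: u32,
    reply_rx: oneshot::Receiver<UMessage>,
) -> Result<UMessage, String> {
    match timeout(Duration::from_millis(u64::from(ttl_ms)), reply_rx).await {
        Ok(Ok(reply)) => Ok(reply),
        Ok(Err(_)) => Err("reply channel closed without a response".to_string()),
        Err(_) => Err(format!("no reply received within {ttl_ms} ms")),
    }
}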
