Comments (3)
@sloretz can you also open a PR against system_tests to update the requester_py.py file to use this new feature. I would expect this to reduce the flakiness of these tests significantly.
Thanks!
from rclpy.
@ros2/team Mind giving some feedback on this idea?

TL;DR I want to eliminate the `GraphListener` using coroutines.

Currently #127 implements `wait_for_service` using a `GraphListener` class. `GraphListener` has its own thread and waits on node graph guard conditions plus timers for the `wait_for_service` timeout. This enables a user to call `client.wait_for_service()` inside of a callback without blocking forever due to starvation of a `SingleThreadedExecutor`.
```python
def my_callback(self, msg):
    if self.client.wait_for_service():
        ...  # do something with the service
```
Alternatively, `GraphListener` can be eliminated using the coroutine support in #135. `wait_for_service()` would return a `Future` object. If a user wants to wait in a callback they must make it a coroutine and `await` the `Future`. The executor would wait on the node graph guard conditions and execute callbacks like normal. The `SingleThreadedExecutor` would not be blocked by the user's callback because the future yields back to the executor if the service isn't ready.
```python
async def my_callback(self, msg):
    result = await self.client.wait_for_service()
    if result:
        ...  # do something with the service
```
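To make the "yields back to the executor" behavior concrete, here is a minimal, self-contained sketch of the mechanism. The `Future` and `SingleThreadedExecutor` classes below are toy stand-ins, not the rclpy implementations; they only show how awaiting a future lets a single-threaded executor keep running other callbacks instead of blocking:

```python
import collections

class Future:
    """Toy future: resolved externally, awaitable from a coroutine."""
    def __init__(self):
        self.done = False
        self.result = None

    def __await__(self):
        # Yield control back to the executor until the future resolves.
        while not self.done:
            yield self
        return self.result

class SingleThreadedExecutor:
    """Toy executor: round-robins ready coroutines on one thread."""
    def __init__(self):
        self.ready = collections.deque()

    def spawn(self, coro):
        self.ready.append(coro)

    def spin_once(self):
        if not self.ready:
            return
        coro = self.ready.popleft()
        try:
            coro.send(None)  # run until the coroutine awaits or finishes
        except StopIteration:
            return           # coroutine finished
        self.ready.append(coro)  # suspended at an await: requeue it

log = []
service_ready = Future()

async def my_callback(msg):
    log.append(f"{msg}: waiting")
    result = await service_ready   # suspends without blocking the thread
    log.append(f"{msg}: got {result}")

async def other_callback():
    log.append("other callback ran")

executor = SingleThreadedExecutor()
executor.spawn(my_callback("msg1"))
executor.spawn(other_callback())

executor.spin_once()   # my_callback suspends at the await
executor.spin_once()   # other_callback still runs: no starvation
service_ready.done = True
service_ready.result = "service available"
executor.spin_once()   # my_callback resumes and finishes
print(log)
```

The key point is that `my_callback` suspends at the `await`, so `other_callback` still gets a turn on the same thread, and no separate `GraphListener` thread is needed to resolve the future.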
The trouble happens when trying to offer both options. The executor cannot wait on the node graph guard conditions because `rcl_wait` says the same entity cannot be waited on at the same time in two wait sets. It is possible to make a second method `wait_for_service_async` that returns a `Future` and uses the `GraphListener`, but it means the same pattern has to be used anywhere a `Future` is returned. `Client.call()` and `Client.call_async()` would need a `ClientListener` managing a separate thread to wait on clients.
I see a few options. I like the first option the most.

1. Offer only `wait_for_service`, which returns a `Future`. Eliminate `GraphListener`. If a user wants to block in a callback they must know about Python coroutines.
2. Offer both `wait_for_service` and `wait_for_service_async`. Don't eliminate `GraphListener`. Create `ClientListener` in the future.
3. Offer both `wait_for_service` and `wait_for_service_async`. Eliminate `GraphListener`. Don't try to protect the user from starvation. If they block in a callback without using a coroutine, that's their problem.
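To illustrate the starvation hazard that option 3 stops protecting against, here is a toy demonstration (hypothetical callback names and a toy executor, not rclpy API): a callback that blocks while polling for a condition inside a single-threaded executor starves the very callback that would satisfy that condition.

```python
import collections

class SingleThreadedExecutor:
    """Toy executor: runs queued callbacks one at a time on one thread."""
    def __init__(self):
        self.queue = collections.deque()

    def add_callback(self, cb):
        self.queue.append(cb)

    def spin(self):
        while self.queue:
            self.queue.popleft()()

flag = {"service_ready": False}
results = []

def blocking_callback():
    # Spin-waits for the service. The flag can only be set by another
    # callback on this same thread, which can never run while we loop.
    for _ in range(1000):  # bounded so the demo terminates
        if flag["service_ready"]:
            results.append("service found")
            return
    results.append("starved")

def graph_event_callback():
    # Would have announced the service, but it never gets a turn in time.
    flag["service_ready"] = True

executor = SingleThreadedExecutor()
executor.add_callback(blocking_callback)
executor.add_callback(graph_event_callback)
executor.spin()
print(results)  # blocking_callback starved even though the event was queued
```

With a real blocking `wait_for_service` (no bound), the loop would never return at all; the `GraphListener` thread in #127 exists precisely so the wait can complete on a thread the callback isn't occupying.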
I think it would be good to eliminate the `GraphListener` to reduce complexity. IMO it is reasonable to expect users to know or learn about coroutines for that specific use case. But I think offering both methods as described in option 3 would be nice for users: it allows them to use a synchronous call if they want to, and if they use it incorrectly it simply might not work. So my preference is option 3, slightly ahead of option 1.