asynchronics / asynchronix
High-performance asynchronous computation framework for system simulation
License: Apache License 2.0
The current implementation makes it possible to easily cancel a scheduled event using its SchedulerKey, provided that the event is scheduled for a future time. When the event is scheduled for the current time but has not yet been processed, however, the user must resort to cumbersome workarounds to discard the event when it is processed (see the epoch-based workaround in the current espresso machine example).
An implementation that allows an event to be cancelled at any time before it is processed would be a sizable quality-of-life improvement.
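The epoch-based workaround mentioned above can be sketched without the framework. All names here (`Controller`, `ScheduledEvent`) are illustrative, not asynchronix API: the model keeps an epoch counter, each scheduled event captures the epoch current at scheduling time, and the handler silently discards events whose epoch is stale.

```rust
/// Minimal, framework-free sketch of the epoch-based workaround
/// (names are illustrative, not part of the asynchronix API).
struct Controller {
    epoch: u64, // bumped whenever pending events must be invalidated
}

struct ScheduledEvent {
    epoch: u64, // the controller's epoch at scheduling time
}

impl Controller {
    fn new() -> Self {
        Self { epoch: 0 }
    }

    /// Capture the current epoch at scheduling time.
    fn schedule(&self) -> ScheduledEvent {
        ScheduledEvent { epoch: self.epoch }
    }

    /// "Cancel" all events scheduled so far by moving to a new epoch.
    fn cancel_pending(&mut self) {
        self.epoch += 1;
    }

    /// Process an event; returns `false` if it was stale and discarded.
    fn process(&mut self, ev: ScheduledEvent) -> bool {
        if ev.epoch != self.epoch {
            return false; // scheduled before the last cancellation: discard
        }
        true // up to date: the event's effect would run here
    }
}
```

The cost of this pattern is exactly the quality-of-life issue raised above: every model that needs same-time cancellation has to carry its own epoch bookkeeping.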
It is currently possible to schedule periodic events by having a method re-schedule itself at a later time.
Because periodic scheduling is a very common pattern in simulation, however, it would be good to be able to do this with less boilerplate, by issuing a single scheduling request that takes the period as an argument.
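The self-rescheduling pattern can be illustrated with a toy discrete-event queue (plain Rust, not asynchronix): each firing of the handler re-inserts its own next occurrence, which is precisely the boilerplate a dedicated periodic API would fold into a single call.

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

/// Toy discrete-event loop: each queue entry is a firing time.
/// A periodic event is emulated by re-scheduling itself after each
/// firing; returns the times at which the event fired.
fn run_periodic(start: u64, period: u64, horizon: u64) -> Vec<u64> {
    // Min-heap of pending event times (Reverse flips the max-heap order).
    let mut queue = BinaryHeap::new();
    queue.push(Reverse(start));

    let mut fired = Vec::new();
    while let Some(Reverse(t)) = queue.pop() {
        if t > horizon {
            break; // past the simulation horizon: stop
        }
        fired.push(t);
        // Self-rescheduling: the handler requests its own next occurrence.
        queue.push(Reverse(t + period));
    }
    fired
}
```

With a hypothetical `schedule_periodic_event(period, ...)`, the re-insertion line inside the handler would disappear entirely.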
What is the best way to implement sending messages to an output with a delay? The docs suggest a self-scheduling mechanism, e.g.:
scheduler.schedule_event(Duration::from_secs(1), Self::push_msg, msg);
with a corresponding handler
async fn push_msg(&mut self, msg: Msg) {
    self.output.send(msg).await;
}
but I feel this might get slightly tedious in complex models. Ideally, I would like to simply write
self.output.send_in(Duration::from_secs(1), msg);
Any suggestions on how I should do this?
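One framework-agnostic way to approximate the desired `send_in` ergonomics is a small wrapper that records messages together with a deadline, so the model can flush everything that is due from a single self-scheduled "tick" handler rather than writing one dedicated handler per delayed message. The type below (`DelayedOutput`) is a hedged sketch, not part of the asynchronix API:

```rust
use std::time::Duration;

/// Illustrative sketch (not the asynchronix API): pairs each message
/// with a deadline so delayed sends share one flush point.
struct DelayedOutput<T> {
    // (deadline relative to simulation start, message)
    pending: Vec<(Duration, T)>,
}

impl<T> DelayedOutput<T> {
    fn new() -> Self {
        Self { pending: Vec::new() }
    }

    /// The `send_in`-style call the question asks for: record the
    /// message with its deadline instead of sending immediately.
    fn send_in(&mut self, now: Duration, delay: Duration, msg: T) {
        self.pending.push((now + delay, msg));
    }

    /// Drain all messages whose deadline has passed; the caller then
    /// forwards them to the real output from its tick handler.
    fn due(&mut self, now: Duration) -> Vec<T> {
        let (ready, later): (Vec<_>, Vec<_>) =
            self.pending.drain(..).partition(|(t, _)| *t <= now);
        self.pending = later;
        ready.into_iter().map(|(_, m)| m).collect()
    }
}
```

This still needs one scheduled tick per deadline, but it centralizes the boilerplate in one place instead of spreading `push_msg`-style handlers across the model.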
There is currently no support for real-time execution, which is necessary in a number of scenarios such as Hardware-in-the-Loop simulation.
Ideally, custom clocks should also be supported to allow synchronization from time sources other than the system, or for more exotic needs such as clocks with scaled real-time.
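What a pluggable clock abstraction could look like can be sketched as a small trait; the trait and both implementations below are illustrative assumptions, not the asynchronix API:

```rust
use std::time::{Duration, Instant};

/// Hedged sketch of a pluggable simulation clock (illustrative names,
/// not the asynchronix API): the executor calls `wait_until` before
/// processing events scheduled at `deadline`.
trait SimClock {
    /// Block (or not) until the given simulation deadline is reached.
    fn wait_until(&mut self, deadline: Duration);
}

/// As-fast-as-possible execution: never waits.
struct NoClock;

impl SimClock for NoClock {
    fn wait_until(&mut self, _deadline: Duration) {}
}

/// Scaled real time: one simulated second takes `1.0 / scale` wall
/// seconds, so `scale = 2.0` runs the simulation at twice real time.
struct ScaledRealTime {
    start: Instant,
    scale: f64,
}

impl SimClock for ScaledRealTime {
    fn wait_until(&mut self, deadline: Duration) {
        let wall_deadline = self.start + deadline.div_f64(self.scale);
        let now = Instant::now();
        if wall_deadline > now {
            std::thread::sleep(wall_deadline - now);
        }
    }
}
```

A Hardware-in-the-Loop setup would supply its own `SimClock` implementation synchronized to the external time source.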
I'd like to self-schedule an input from within that same input. Something like this:
#[derive(Default)]
struct Fetcher {}

impl Fetcher {
    fn blackbox(&mut self) -> bool {
        rand::thread_rng().gen_bool(0.5) // requires `use rand::Rng;`
    }

    async fn on_fetch(&mut self, _: (), scheduler: &Scheduler<Self>) {
        let b = self.blackbox();
        println!("fetching!\n{}", if b { "success" } else { "failure, retrying in 1s" });
        if b {
            scheduler.schedule_in(Duration::from_secs(5), Self::on_fetch, ());
        } else {
            scheduler.schedule_in(Duration::from_secs(1), Self::on_fetch, ());
        }
    }
}

impl Model for Fetcher {}
Rust doesn't allow direct recursion in async functions, so this will give a compiler error. Generally, one workaround is to return a BoxFuture (instead of an opaque Future). Note that BoxFuture takes a lifetime parameter, and that .boxed() comes from the futures crate's FutureExt trait.
// --snip--
// With `use futures::future::{BoxFuture, FutureExt};` in scope:
fn on_fetch<'a>(&'a mut self, _: (), scheduler: &'a Scheduler<Self>) -> BoxFuture<'a, ()> {
    async move {
        let b = self.blackbox();
        println!("fetching!\n{}", if b { "success" } else { "failure, retrying in 1s" });
        if b {
            scheduler.schedule_in(Duration::from_secs(5), Self::on_fetch, ());
        } else {
            scheduler.schedule_in(Duration::from_secs(1), Self::on_fetch, ());
        }
    }
    .boxed()
}
// --snip--
This messes up the code quite a bit, so I was looking for a nicer way to do this. The only alternative I found is to create a dedicated loopback output
#[derive(Default)]
struct Fetcher {
    loopback: Output<bool>,
}

impl Fetcher {
    // --snip--
    async fn on_fetch(&mut self, _: (), _scheduler: &Scheduler<Self>) {
        let b = self.blackbox();
        self.loopback.send(b).await;
    }

    async fn on_loopback(&mut self, b: bool, scheduler: &Scheduler<Self>) {
        println!("fetching!\n{}", if b { "success" } else { "failure, retrying in 1s" });
        if b {
            scheduler.schedule_in(Duration::from_secs(5), Self::on_fetch, ());
        } else {
            scheduler.schedule_in(Duration::from_secs(1), Self::on_fetch, ());
        }
    }
}
// --snip--
But this is hardly any better: it requires the simulation bench to connect the loopback output back to the model, and it assumes that this output is not connected to anything else.
Is there a better way to do this with asynchronix?
Are there any plans to provide wasm32-wasi target support, presumably with a single-threaded simulation fallback? This would improve portability (for developers) and security (for clients) in non-performance-critical testing scenarios.
Currently, the espresso machine example compiles successfully with
cargo build --package asynchronix --target wasm32-wasi --example espresso_machine
but running the generated espresso_machine.wasm on a WASM VM fails when the executor tries to spawn threads unchecked. As far as I know, all uses of std APIs not supported by WASI (such as threading) would need to be made target-dependent with macros.
I am not sure how much asynchronix depends on multi-threading and unsafe code, and hence how hard it would be to introduce fallbacks for WASI.
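Such a fallback could be gated on the target family with cfg attributes. The sketch below is a generic illustration of the pattern, not asynchronix internals: spawn worker threads where std threading exists, and run tasks sequentially on the calling thread for wasm targets.

```rust
/// Illustrative sketch of target-dependent execution (not asynchronix
/// internals): use std threads on native targets, and fall back to
/// sequential in-place execution on wasm targets such as wasm32-wasi.
#[cfg(not(target_family = "wasm"))]
fn run_tasks(tasks: Vec<Box<dyn FnOnce() -> u32 + Send>>) -> Vec<u32> {
    // Spawn one thread per task, then join them in order.
    let handles: Vec<_> = tasks
        .into_iter()
        .map(std::thread::spawn)
        .collect();
    handles.into_iter().map(|h| h.join().unwrap()).collect()
}

#[cfg(target_family = "wasm")]
fn run_tasks(tasks: Vec<Box<dyn FnOnce() -> u32 + Send>>) -> Vec<u32> {
    // Single-threaded fallback: execute tasks in order, inline.
    tasks.into_iter().map(|t| t()).collect()
}
```

Both variants expose the same signature, so callers compile unchanged on either target; the real work would be applying this split wherever the executor currently spawns threads unconditionally.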