aioreactive's People

Contributors: ahirner, alhimik45, davkhech, dbrattli, dependabot[bot], francipvb, vodik

aioreactive's Issues

My first AsyncObservable

First off, thanks for putting this library together. I've been playing around with it and it's been real slick so far.

In light of that, I think I'm not understanding something. I'm trying to create an Observable and I'm wondering if I'm missing something. Here's what I'm doing right now:

class MyFirstObservable(AsyncObservable):
    def __init__(self, delay):
        self.observers = []

    async def __asubscribe__(self, observer):
        self.observers += [observer]

Now, when Observers subscribe, I will add them to my list. But it feels like maintaining a list of who is subscribed should be built in? When I have events to pass out, I will iterate through my observers and give them events to check out, but I'm just wondering what the intent was in not aggregating observers by default?
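
A minimal sketch of the dispatch loop described above, assuming the old-style AsyncObservable/observer API used elsewhere in this tracker (asend on the observer); push is a hypothetical helper name, untested:

from aioreactive.core import AsyncObservable  # import path as used in other issues here


class MyFirstObservable(AsyncObservable):
    def __init__(self):
        self.observers = []

    async def __asubscribe__(self, observer):
        self.observers += [observer]

    async def push(self, value):
        # Hand the value to every currently subscribed observer via its asend coroutine.
        for observer in self.observers:
            await observer.asend(value)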

aiohttp streaming POST

So I was playing around with doing HTTP requests, but I cannot seem to get this to work:

import logging as log
import aiohttp
import json
import asyncio
from collections import deque
import re
from aioreactive.core import FuncSink, start
from aioreactive.producer import Producer, AsyncStream
from aioreactive.producer import op

from malefico.core import recordio as extract

def decode(msg):
   return json.loads(msg.decode("ascii"))

async def subscribe(test):
   headers = {'content-type': 'application/json'}
   url = 'http://localhost:5050/api/v1/scheduler'
   d = {
       "type": "SUBSCRIBE",
       "subscribe": {
           "framework_info": {
               "user":  "username",
               "name":  "Example HTTP Framework"
           }
       }
   }

   async with aiohttp.ClientSession(headers=headers) as session:
       async with session.post(url, data=json.dumps(
               d), timeout=100000) as resp:
           return Producer.from_iterable(resp.content)

async def main():
   url = 'http://localhost:5050/api/v1/scheduler'
   d = {
       "type": "SUBSCRIBE",
       "subscribe": {
           "framework_info": {
               "user":  "username",
               "name":  "Example HTTP Framework"
           }
       }
   }
   stream = AsyncStream()

   headers = {'content-type': 'application/json'}
   xs = (stream
         | op.flat_map(subscribe)
         | extract
         | op.map(decode)
         )
   async with start(xs) as ys:
       await stream.asend(1)
       async for value in ys:
           print(value)


if __name__ == '__main__':
   loop = asyncio.get_event_loop()
   loop.set_debug(True)
   asyncio.ensure_future(main(), loop=loop)
   loop.run_forever()
   loop.close()

The weird thing is that if I move the POST call out of the flat_map (say, into the actual subscribe call) and move Producer.from_iterable(resp.content) into another flat_map, it starts working.

Any ideas?

Also, it's a really great library and fun to play with; please let me know if I am using it in a completely wrong way.
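
A hedged sketch of the restructuring described above. The likely reason the original version fails is that the async with blocks inside subscribe() close the response (and session) before the Producer has consumed resp.content, so keeping them open for the lifetime of the stream is what makes the reordered variant work. Names follow the Producer/op API used in this issue; untested:

async def main():
    headers = {'content-type': 'application/json'}
    url = 'http://localhost:5050/api/v1/scheduler'
    d = {
        "type": "SUBSCRIBE",
        "subscribe": {
            "framework_info": {
                "user": "username",
                "name": "Example HTTP Framework"
            }
        }
    }
    stream = AsyncStream()

    async with aiohttp.ClientSession(headers=headers) as session:
        # Do the POST up front and keep the response open while the stream runs.
        async with session.post(url, data=json.dumps(d), timeout=100000) as resp:

            async def body(_):
                return Producer.from_iterable(resp.content)

            xs = (stream
                  | op.flat_map(body)
                  | extract
                  | op.map(decode))

            async with start(xs) as ys:
                await stream.asend(1)
                async for value in ys:
                    print(value)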

Errors with adispose()


<CoroWrapper Merge.Sink.cancel() running at /usr/local/lib/python3.6/site-packages/aioreactive/operators/merge.py:36, created at /usr/local/lib/python3.6/asyncio/coroutines.py:84> was never yielded from
Coroutine object created at (most recent call last):
...
  File "/usr/local/lib/python3.6/site-packages/aioreactive/core/disposables.py", line 24, in adispose
    await disposable.adispose()
  File "/usr/local/lib/python3.6/asyncio/coroutines.py", line 109, in __next__
    return self.gen.send(None)
  File "/usr/local/lib/python3.6/site-packages/aioreactive/core/disposables.py", line 14, in adispose
    await self._dispose()
  File "/usr/local/lib/python3.6/asyncio/coroutines.py", line 109, in __next__
    return self.gen.send(None)
  File "/usr/local/lib/python3.6/site-packages/aioreactive/core/streams.py", line 66, in adispose
    self.cancel()
  File "/usr/local/lib/python3.6/asyncio/coroutines.py", line 84, in debug_wrapper
    return CoroWrapper(gen, None)

Errors on examples/autocomplete

Hi.
I tried to run the autocomplete example, but it did not work and failed with the following error.
(I fixed the imports before running.)

WebSocket opened
<CoroWrapper WebSocketResponse.send_str() running at /.../python-3.6.4/lib/python3.6/site-packages/aiohttp/web_ws.py:247, created at /lib/python3.6/asyncio/coroutines.py:85> was never yielded from
Coroutine object created at (most recent call last, truncated to 10 last lines):
  File "/.../python-3.6.4/lib/python3.6/site-packages/aioreactive/core/bases.py", line 40, in asend
    await self.asend_core(value)
  File "/lib/python3.6/asyncio/coroutines.py", line 110, in __next__
    return self.gen.send(None)
  File "/.../python-3.6.4/lib/python3.6/site-packages/aioreactive/core/streams.py", line 41, in asend_core
    await self._observer.asend(value)
  File "/lib/python3.6/asyncio/coroutines.py", line 110, in __next__
    return self.gen.send(None)
  File "/.../python-3.6.4/lib/python3.6/site-packages/aioreactive/core/bases.py", line 40, in asend
    await self.asend_core(value)
  File "/lib/python3.6/asyncio/coroutines.py", line 110, in __next__
    return self.gen.send(None)
  File "/.../python-3.6.4/lib/python3.6/site-packages/aioreactive/core/observers.py", line 98, in asend_core
    await self._send(value)
  File "/lib/python3.6/asyncio/coroutines.py", line 110, in __next__
    return self.gen.send(None)
  File "autocomplete.py", line 58, in asend
    ws.send_str(value)
  File "/lib/python3.6/asyncio/coroutines.py", line 85, in debug_wrapper
    return CoroWrapper(gen, None)

I would like an example of passing a coroutine to map; what is the normal way to do this?
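
Judging from the warning above, a likely fix is to await the websocket call inside the observer in autocomplete.py: in aiohttp 3.x, WebSocketResponse.send_str() is a coroutine, so calling it without await only creates a coroutine object that is never run. A minimal sketch (class and attribute names are assumptions, not the actual example code):

from aiohttp import web


class WsObserver:
    """Hypothetical stand-in for the observer in autocomplete.py."""

    def __init__(self, ws: web.WebSocketResponse) -> None:
        self.ws = ws

    async def asend(self, value: str) -> None:
        # send_str must be awaited; a bare ws.send_str(value) produces the
        # "was never yielded from" warning shown above.
        await self.ws.send_str(value)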

Cannot run filter example

"""Example to show how to split a stream into two substreams."""
import asyncio

from aioreactive.core import subscribe, AsyncAnonymousObserver

from aioreactive.core import AsyncObservable
from aioreactive.operators import pipe as op


async def main():
    xs = AsyncObservable.from_iterable(range(10))

    # Split into odds and evens
    odds = xs | op.filter(lambda x: x % 2 == 1)
    evens = xs | op.filter(lambda x: x % 2 == 0)

    async def mysink(value):
        print(value)

    await subscribe(odds, AsyncAnonymousObserver(mysink))
    await subscribe(evens, AsyncAnonymousObserver(mysink))


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    loop.close()

Upon running:
AttributeError: module 'aioreactive.operators.pipe' has no attribute 'filter'

rx.delay frequently destroys tasks

Hi, I'm unsure if this is the intended behaviour, but rx.delay causes asyncio to frequently throw the following error.

If I'm not mistaken, rx.delay(x: seconds) is supposed to be an operator that delays the parent for x seconds.

This is how I think rx.delay should behave:

import asyncio

import aioreactive as rx
from expression import pipe  # imports added for completeness; assumes the expression-based API seen in the error output below


def setup():
    return pipe(
        rx.interval(0, 2),
        rx.flat_map(many),
        rx.subscribe_async(observer),
    )


async def observer(x):
    print(x)
async def delayed(x, y):
    await asyncio.sleep(y * 0.1)
    return (x, y)


def many(x: int):
    return rx.merge_seq(
        [
            pipe(
                delayed(x, 1),
                rx.from_async,
            ),
            pipe(
                delayed(x, 2),
                rx.from_async,
            ),
        ]
    )

However, when we use rx.delay instead of calling asyncio.sleep:

async def delayed(x, y):
    return (x, y)


def many(x: int):
    return rx.merge_seq(
        [
            pipe(
                delayed(x, 1),
                rx.from_async,
                rx.delay(0.1),
            ),
            pipe(
                delayed(x, 2),
                rx.from_async,
                rx.delay(0.2),
            ),
        ]
    )

the following exceptions are thrown:

(0, 1)
(0, 2)
ERROR:asyncio:Task was destroyed but it is pending!
task: <Task pending name='Task-17' coro=<delay.<locals>.subscribe_async.<locals>.worker.<locals>.loop() done, defined at /Users/shawnkoh/repos/ninjacado/.venv/lib/python3.10/site-packages/expression/core/fn.py:59> wait_for=<Future pending cb=[Task.task_wakeup()]>>
ERROR:asyncio:Task was destroyed but it is pending!
task: <Task pending name='Task-18' coro=<delay.<locals>.subscribe_async.<locals>.worker.<locals>.loop() done, defined at /Users/shawnkoh/repos/ninjacado/.venv/lib/python3.10/site-packages/expression/core/fn.py:59> wait_for=<Future pending cb=[Task.task_wakeup()]>>
(1, 1)
(1, 2)
(2, 1)
(2, 2)
(3, 1)
(3, 2)
ERROR:asyncio:Task was destroyed but it is pending!
task: <Task pending name='Task-43' coro=<delay.<locals>.subscribe_async.<locals>.worker.<locals>.loop() done, defined at /Users/shawnkoh/repos/ninjacado/.venv/lib/python3.10/site-packages/expression/core/fn.py:59> wait_for=<Future pending cb=[Task.task_wakeup()]>>
ERROR:asyncio:Task was destroyed but it is pending!
task: <Task pending name='Task-44' coro=<delay.<locals>.subscribe_async.<locals>.worker.<locals>.loop() done, defined at /Users/shawnkoh/repos/ninjacado/.venv/lib/python3.10/site-packages/expression/core/fn.py:59> wait_for=<Future pending cb=[Task.task_wakeup()]>>
ERROR:asyncio:Task was destroyed but it is pending!
task: <Task pending name='Task-56' coro=<delay.<locals>.subscribe_async.<locals>.worker.<locals>.loop() done, defined at /Users/shawnkoh/repos/ninjacado/.venv/lib/python3.10/site-packages/expression/core/fn.py:59> wait_for=<Future pending cb=[Task.task_wakeup()]>>
ERROR:asyncio:Task was destroyed but it is pending!
task: <Task pending name='Task-57' coro=<delay.<locals>.subscribe_async.<locals>.worker.<locals>.loop() done, defined at /Users/shawnkoh/repos/ninjacado/.venv/lib/python3.10/site-packages/expression/core/fn.py:59> wait_for=<Future pending cb=[Task.task_wakeup()]>>
(4, 1)
(4, 2)

from_iterable broken, or is it my mistake?

I've written a merge_sorted function using the aioreactive idiom (it returns an async iterable), and it works fine with async for (see attached notebook).
However, when I try to convert it into a source using from_iterable, it doesn't work at all (also in the attached notebook). Is this a bug in from_iterable, or am I doing it wrong?
Thanks a lot!
aioreactive merge_sorted.zip

buffer operator?

Hi, I've been implementing a state management library (http://github.com/nardi/pobx) using RxPY, and am now trying to build an async version using this library. However, in my RxPY implementation I make use of the buffer operator to propagate a bunch of values all at once. Are there any plans to implement this operator here too, or should I try to get my own version running? FWIW, in RxPY buffer is in turn implemented using the window operator, which is also not available here, so it might be a bit of work.

Is there an operator to connect an Observable to an Observer, and if not, which one would you prefer?

Hi Dag,

I think the pipe syntax for chaining operators is really cool, and am missing something similar to complete the chain.

That is, I find myself writing code like xs = source | op.map(f1) | op.map(f2) | ... and then having to complete it with a clunky

async def f():
    subscription = await subscribe(xs, AsyncAnonymousObserver(sink_fun))

when I'd much rather just say something like

f = xs > sink_fun

where the > operator means we're coupling the source with an Observer that can consume it, returning a regular coroutine.

Is there something like that in aioreactive, and if not, do you have any immediate plans to write it?

If the answer to both of the above is 'no', I'll have a go at writing one this week and would be grateful for any hints :)

Thanks a lot!
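
For reference, a plain-function version of the coupling described above (not the > operator itself), reusing the subscribe/AsyncAnonymousObserver API shown in this issue; a hypothetical sketch, untested:

from aioreactive.core import subscribe, AsyncAnonymousObserver


def consume(xs, sink_fun):
    # Couple a source with an observer built from a plain async sink function.
    # subscribe(...) returns a coroutine, so the caller simply awaits the result:
    #     await consume(xs, sink_fun)
    return subscribe(xs, AsyncAnonymousObserver(sink_fun))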

Implementing reduce

Hello,

Is there a reduce operator yet?

If not, let me know and I will try to implement it.

documentation for creating observable streams

I've got a function:

def get_producer():
    tool = Tool()

    async def start(listener):
        def on_event(event):
            listener(event)
        tool.on_scan_result = on_event
        tool.start_scan()
        await tool.transport.drain()

    async def stop():
        tool.stop_scanning()
        await tool.transport.drain()

    return { "start": start, "stop": stop } # my old api.
    return AsyncObservable( # what goes here? )

And I'd like to be able to turn it into an observable stream that automatically starts/stops scanning when observers are added/removed, but I'm not sure how.
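
A hedged sketch along the lines of the first issue in this tracker (subclassing AsyncObservable and implementing __asubscribe__), with scanning started on subscription. Tool and the listener callback come from the snippet above; everything else is an assumption, and stopping on unsubscribe is omitted because it depends on the subscription/disposable contract of the installed version:

import asyncio

from aioreactive.core import AsyncObservable  # import path as used in other issues here


class ScanObservable(AsyncObservable):
    """Hypothetical wrapper around the Tool class from the snippet above (untested sketch)."""

    def __init__(self, tool):
        self.tool = tool

    async def __asubscribe__(self, observer):
        def on_event(event):
            # Forward scan results to the observer; asend is a coroutine, so
            # schedule it from this synchronous callback.
            asyncio.ensure_future(observer.asend(event))

        self.tool.on_scan_result = on_event
        self.tool.start_scan()
        await self.tool.transport.drain()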

Retry operator

Hi,

I was wondering if you could give me a tip on how to implement a retry operator. I tried doing something similar to RxPY, but could not figure it out.

Thanks

Is this project abandoned?

I would rather use this library than RxPY, so if you're not planning to develop it anymore I would be interested in continuing it :)

AttributeError: 'NoneType' object has no attribute 'subscribe_async' in aioreactive/combine.py", line 99, in update

The complete code is longer, but this is a summary.

This exception does not represent a real problem because my application finishes executing all the functions correctly, and it only happens with the first "asend" of the subject incoming_data.

type_data = Dict[str, Any]
incoming_data: rx.AsyncObservable['type_data'] = rx.AsyncSubject()


async def webhook(request):
    payload = await request.json()
    await incoming_data.asend(payload)
    return web.Response(status=200)


async def on_startup(app):

    rs: RequestSender = RequestSender()
    mb: MessageBuilder = MessageBuilder()

    obs_request_sender = pipe(incoming_data,
                              rx.map(lambda x_d: mb.adapt_message(ticket=x_d[1]["ticket"],
                                                                  model_message=x_d[1]["model_message"],
                                                                  destiny='guazuapp',
                                                                  data=x_d[1])),

                              rx.flat_map_async(lambda x_d: rs.producer(messages_to_send=x_d[1]['messages_to_send'])))

   
    await obs_request_sender.subscribe_async(rx.AsyncAnonymousObserver(on_next_obs_request_sender, a_throw))


async def on_next_obs_request_sender(payload):
    print(payload)
    
async def a_throw(ex: Exception) -> None:
    print(ex)


async def aio_app():
    app = web.Application()
    app.on_startup.append(on_startup)
    app.add_routes([web.post('/middleware_receptor', webhook)])
    return app


def main():
    port = os.environ.get("PORT", 5055)
    web.run_app(aio_app(), host="localhost", port=int(port))


if __name__ == "__main__":
    main()

class RequestSender:
  
    n_workers:int = 3
    queue = asyncio.Queue()
    
    @log_function_runtime
    async def producer(self, messages_to_send: List[Dict[str, str]]):
        for dict_message in messages_to_send:
            self.queue.put_nowait(dict_message)
        # Create (n_workers:int) worker tasks to process the queue concurrently.
        tasks = []
        
        session  = aiohttp.ClientSession()
        for i in range(self.n_workers):
            task = asyncio.create_task(self.request_worker(session))
            tasks.append(task)
            # Wait until the queue is fully processed.
        await self.queue.join()
        
        # Cancel our worker tasks.
        for task in tasks:
            task.cancel()
        # Wait until all worker tasks are cancelled.
        await asyncio.gather(*tasks, return_exceptions=True)
        await session.close()
        
    @log_function_runtime
    async def request_worker(self, session):
        while True:
            dict_message:Dict[str, str] = await self.queue.get()
            destiny:str = dict_message.get("destiny")
            body_message = dict_message.pop(destiny)
            function_name= destiny
            await self.request_post(function_name, body_message, session, dict_message)
            self.queue.task_done()

    async def request_post(self, function_name: str, body_message: str, session, dict_message:Dict[str, Any]):
        async with session.post(url=url, headers=headers, data=body_message) as response_post:
            response = await response_post.text()
            print(f"response: {response}")
            return rx.single(response)

This is all the trace I have

Task exception was never retrieved
future: <Task finished name='Task-4' coro=<start_immediate.<locals>.runner() done, defined at /usr/local/lib/python3.10/site-packages/expression/core/aiotools.py:86> exception=AttributeError("'NoneType' object has no attribute 'subscribe_async'")>
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/expression/core/aiotools.py", line 87, in runner
    return await computation
  File "/usr/local/lib/python3.10/site-packages/aioreactive/combine.py", line 146, in worker
    await message_loop(initial_model)
  File "/usr/local/lib/python3.10/site-packages/aioreactive/combine.py", line 141, in message_loop
    model = await update(msg, model)
  File "/usr/local/lib/python3.10/site-packages/aioreactive/combine.py", line 99, in update
    inner = await xs.subscribe_async(obv(model.key))
AttributeError: 'NoneType' object has no attribute 'subscribe_async'
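
A hedged reading of the trace: rx.flat_map_async subscribes to whatever the mapped function returns, and RequestSender.producer falls off the end without a return statement, so combine.py receives None and the subscribe_async call fails. A minimal illustration of the constraint (the rx alias and semantics are assumptions based on the snippet above; untested):

import aioreactive as rx  # alias assumed from the snippet above


async def mapper_ok(x):
    return rx.single(x)  # returns an observable, so flat_map_async can subscribe to it


async def mapper_bad(x):
    print(x)  # implicit return None -> "'NoneType' object has no attribute 'subscribe_async'"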

Package Version


aiohttp 3.8.1
aiohttp-jinja2 1.5
aioreactive 0.16.0
aiosignal 1.2.0
async-timeout 4.0.2
attrs 22.1.0
autopep8 1.6.0
certifi 2022.6.15
charset-normalizer 2.1.0
expression 2.0.1
frozenlist 1.3.0
greenlet 1.1.2
idna 3.3
Jinja2 3.1.2
MarkupSafe 2.1.1
multidict 6.0.2
mysql-connector-python 8.0.29
pip 22.0.4
protobuf 4.21.4
pycodestyle 2.9.1
requests 2.28.1
ring 0.9.1
Rx 3.2.0
setuptools 58.1.0
six 1.16.0
SQLAlchemy 1.4.39
toml 0.10.2
typing_extensions 4.1.1
urllib3 1.26.11
watchdog 2.1.9
wheel 0.37.1
wirerope 0.4.5
yarl 1.7.2

Python 3.11 support?

pip install using python 3.11 returns the following error:

ERROR: Ignored the following versions that require a different python version: 0.16.0 Requires-Python >=3.9,<3.11
ERROR: Could not find a version that satisfies the requirement aioreactive==0.17.0 (from versions: 0.2.0, 0.2.1, 0.3.0, 0.5.0, 0.6.0, 0.7.0, 0.8.0, 0.9.0, 0.10.0, 0.11.0, 0.12.0, 0.13.0, 0.14.0, 0.15.0)
ERROR: No matching distribution found for aioreactive==0.17.0

Error example autocomplete , web.Application instance initialized with different loop

Summarized traceback:
...
File "/home/denis/Documents/env_aioreactive/lib/python3.9/site-packages/aiohttp/web.py", line 321, in _run_app
await runner.setup()
File "/home/denis/Documents/env_aioreactive/lib/python3.9/site-packages/aiohttp/web_runner.py", line 279, in setup
self._server = await self._make_server()
File "/home/denis/Documents/env_aioreactive/lib/python3.9/site-packages/aiohttp/web_runner.py", line 373, in _make_server
self._app._set_loop(loop)
File "/home/denis/Documents/env_aioreactive/lib/python3.9/site-packages/aiohttp/web_app.py", line 223, in _set_loop
raise RuntimeError(
RuntimeError: web.Application instance initialized with different loop

Process finished with exit code 1

Package versions


aiohttp 3.8.1
aiohttp-jinja2 1.5
aioreactive 0.16.0
aiosignal 1.2.0
async-timeout 4.0.2
attrs 22.1.0
charset-normalizer 2.1.0
expression 2.2.0
frozenlist 1.3.0
idna 3.3
Jinja2 3.1.2
MarkupSafe 2.1.1
multidict 6.0.2
pip 21.3.1
setuptools 60.2.0
typing_extensions 4.3.0
wheel 0.37.1
yarl 1.7.2

How can I install this?

Hi there!

This lib looks neat! I'm really excited to give it a try!

Is there any info on how to install this?

I think it would be cool if there was an Install or Getting Started section in README.md. :)

Latest release?

PyPI shows 0.16.0 as the latest release (Mar 22, 2022).
GitHub shows 0.17.0 as the latest tag (Mar 23, 2032).
The last master branch commit was Jul 14, 2022.

Which version should I be using?

Concat with async iterables

I'm trying to adapt the concat operator to implement the catch_exception op, but I just noticed that concat fails with an async iterable. I'm using the following observable:

async def asynciter():
    for i in range(5):
        await asyncio.sleep(1)
        yield i
xs = from_async_iterable(asynciter())

I have a fork with a working catch_exception and retry here, as well as the tests mentioned: tr11@6219f9a

split.py example fails with InvalidStateError

  1. Create a new Python 3.9 environment.
  2. pip install aioreactive
  3. Download and run split.py from examples/streams
    Expected result: see values being printed out.
    Actual result:
Exception in callback MailboxProcessor.__process_events()
handle: <Handle MailboxProcessor.__process_events()>
Traceback (most recent call last):
  File "C:\Users\Ian\.conda\envs\aioreactive\lib\asyncio\events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "C:\Users\Ian\.conda\envs\aioreactive\lib\site-packages\expression\core\mailbox.py", line 152, in __process_events
    cont(msg)
  File "C:\Users\Ian\.conda\envs\aioreactive\lib\site-packages\expression\core\aiotools.py", line 45, in done
    future.set_result(value)
asyncio.exceptions.InvalidStateError: invalid state
Exception in callback MailboxProcessor.__process_events()
handle: <Handle MailboxProcessor.__process_events()>
Traceback (most recent call last):
  File "C:\Users\Ian\.conda\envs\aioreactive\lib\asyncio\events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "C:\Users\Ian\.conda\envs\aioreactive\lib\site-packages\expression\core\mailbox.py", line 152, in __process_events
    cont(msg)
  File "C:\Users\Ian\.conda\envs\aioreactive\lib\site-packages\expression\core\aiotools.py", line 45, in done
    future.set_result(value)
asyncio.exceptions.InvalidStateError: invalid state

Are there any more radical interface changes coming such as those of 27 days ago?

Dag,

The wave of commits 27 days ago aligned the API more with RxPY and introduced quite a number of breaking API changes (subscribe instead of start, etc.).

Could you please tell us whether you plan any more major changes like that in the near future, and when you plan to push this latest wave of changes to PyPI?

Thanks a lot!

Iterable AsyncMultiStream?

Hi! I'm wondering why AsyncMultiStream doesn't inherit from AsyncStreamIterable the way AsyncSingleStream does. To me it looks pretty reasonable to iterate hot observables, because each observer already gets an individual AsyncSingleStream during subscription.

AsyncIteratorObserver takes constructor arguments

Hello, I noticed that the examples in the README using AsyncIteratorObserver do not work.

Here is how it is in the docs:

obv = rx.AsyncIteratorObserver()
async with await xs.subscribe_async(obv) as subscription:
    async for x in obv:
        # do stuff with x

But from my findings, it works like this:

obv = rx.AsyncIteratorObserver(xs)
async with await xs.subscribe_async() as subscription:
    async for x in obv:
        # do stuff with x

If I don't do it like this, I get an error about missing parameters.

In the test code I found how to use it:

obv = rx.AsyncIteratorObserver(xs)

from_iterable does not handle exceptions in iterables

import asyncio
from aioreactive.core import AsyncObservable, run


async def generator():
    # Also fails with sync generators
    for i in range(10):
        if i > 2:
            print("Let's raise")
            raise ValueError(i)
            # stream hangs here

        await asyncio.sleep(.1)
        print('Yield', i)
        yield i


async def main():
    iterable = generator()
    observer = AsyncObservable.from_iterable(iterable)
    await run(observer, timeout=None)


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    loop.close()

So the flow simply hangs after the exception is thrown in the generator.

Stream Fork

I was wondering if there is a nice way to "fork" a stream, meaning to use the filter operator to get a number of streams from one stream, based on a certain condition for example.

Also please let me know if issues are the wrong place to ask these questions.

AsyncRx.distinct_until_changed, wrong signature

In AsyncRx, the signature

def distinct_until_changed(self) -> AsyncObservable[_TSource]:

should be

def distinct_until_changed(self) -> AsyncRx[_TSource]:

The actual return value is correct; only the signature is wrong.

ImportError: cannot import name 'match' from 'expression.core'

When using aioreactive, I get an ImportError: cannot import name 'match' from 'expression.core'. This is due to the match module having been removed from Expression.

Trace:

  File "/home/duranda/devel/aioreactive-test/venv/lib/python3.11/site-packages/aioreactive/__init__.py", line 545, in filter
    from .filtering import filter as _filter
  File "/home/duranda/devel/aioreactive-test/venv/lib/python3.11/site-packages/aioreactive/filtering.py", line 4, in <module>
    from expression.core import (
ImportError: cannot import name 'match' from 'expression.core' (/home/duranda/devel/healthme-api/venv/lib/python3.11/site-packages/expression/core/__init__.py)

from_iterable crashes with streams of tuples

It seems that debug code is not able to format tuples and namedtuples properly. A snippet that reproduces the problem:

import asyncio
from aioreactive.core import subscribe, AsyncAnonymousObserver
from aioreactive.core import AsyncObservable
from collections import namedtuple

async def main():
    Foo = namedtuple("Foo", ["name", "value"])
    foos = [Foo("one", 1), Foo("two", 2)]
    xs = AsyncObservable.from_iterable(foos)

    async def mysink(value):
        print(value)

    await subscribe(xs, AsyncAnonymousObserver(mysink))

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    loop.close()

AsyncAnonymousObserver exception was never retrieved
future: <AsyncAnonymousObserver finished 
exception=TypeError('not all arguments converted during string formatting',)>
Traceback (most recent call last):
  File ".../lib/python3.6/site-packages/aioreactive/operators/from_iterable.py", line 32, in worker
    log.debug("sync_worker. asending: %s" % value)
TypeError: not all arguments converted during string formatting
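
The failure is a quirk of %-formatting rather than the tuple values themselves: the % operator treats a tuple on its right-hand side as the full argument list, so a two-field namedtuple does not match the single %s in the debug string. A small self-contained illustration:

from collections import namedtuple

Foo = namedtuple("Foo", ["name", "value"])
value = Foo("one", 1)

try:
    "sync_worker. asending: %s" % value  # tuple expands to two args for one placeholder
except TypeError as ex:
    print(ex)  # not all arguments converted during string formatting

print("sync_worker. asending: %s" % (value,))  # wrapping in a 1-tuple works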

Observable from an async_generator?

How can I create an AsyncObservable from an async_generator using aioreactive?

async def records():
    for i in range(0, 3):
        yield i
source = rx.from_(records())
source.subscribe(
    on_next=lambda x: logger.info(f"on_next: {x=}"),
    on_error=lambda x: logger.error(f"error: {x=}"),
    on_completed=lambda: logger.info("completed")
)

results in an Exception: TypeError("'async_generator' object is not iterable")
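
The snippet above uses the RxPY-style rx.from_/subscribe API. A hedged workaround using the old-style aioreactive API that appears in other issues in this tracker is to materialize the async generator first and wrap the plain list; from_async_iterable, mentioned in the concat issue above, would be the streaming alternative if your version provides it. Untested sketch:

import asyncio

from aioreactive.core import AsyncObservable, subscribe, AsyncAnonymousObserver


async def records():
    for i in range(0, 3):
        yield i


async def main():
    items = [i async for i in records()]  # drain the async generator up front
    xs = AsyncObservable.from_iterable(items)

    async def mysink(value):
        print(value)

    await subscribe(xs, AsyncAnonymousObserver(mysink))


if __name__ == "__main__":
    asyncio.run(main())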

README unclear

[screenshot of the README example omitted]

See screenshot: the README suggests using subscribe_async but actually drops the _async part in the example. Is this intended?

distinct_until_changed, not so nice signature?

It has the signature

def distinct_until_changed(
    source: AsyncObservable[_TSource],
) -> AsyncObservable[_TSource]

but I would have expected
def distinct_until_changed() -> Callable[[AsyncObservable[_TSource]], AsyncObservable[_TSource]]

It seems like a lot of signatures here and there could use some clean-up.
