
django-eventstream's Introduction

Django EventStream

EventStream provides API endpoints for your Django application that can push data to connected clients. Data is sent using the Server-Sent Events protocol (SSE), in which data is streamed over a never-ending HTTP response.

For example, you could create an endpoint, /events/, that a client could connect to with a GET request:

GET /events/ HTTP/1.1
Host: api.example.com
Accept: text/event-stream

The client would receive a streaming HTTP response with content looking like this:

HTTP/1.1 200 OK
Transfer-Encoding: chunked
Connection: Transfer-Encoding
Content-Type: text/event-stream

event: message
data: {"foo": "bar"}

event: message
data: {"bar": "baz"}

...
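The wire format above is simple enough to sketch in a few lines of plain Python. This is an illustration of SSE framing, not django-eventstream's internal encoder (the function name is ours):

```python
import json

def encode_sse_event(event_type, data, event_id=None):
    """Encode one event in Server-Sent Events wire format."""
    lines = []
    if event_id is not None:
        lines.append("id: %s" % event_id)
    lines.append("event: %s" % event_type)
    # multi-line payloads become multiple data: lines
    for chunk in json.dumps(data).split("\n"):
        lines.append("data: %s" % chunk)
    # a blank line terminates the event
    return "\n".join(lines) + "\n\n"

print(encode_sse_event("message", {"foo": "bar"}), end="")
# event: message
# data: {"foo": "bar"}
```

A server keeps the response open and writes one such frame per event; the client library splits the stream on blank lines.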

Features:

  • Easy to consume from browsers or native applications.
  • Reliable delivery. Events can be persisted to your database, so clients can recover if they get disconnected.
  • Per-user channel permissions.

Dependencies

  • Django >=5
  • PyJWT >=1.5, <3
  • gripcontrol >=4.0, <5
  • django_grip >=3.0, <4
  • six >=1.10, <2

Optional Dependencies

If you're using the Django Rest Framework version of the module:

  • Django Rest Framework ==3.15 (for using the ViewSet and router.register).

These dependencies will be installed automatically when you install this package.

Setup

Without Django Rest Framework

First, install this module and the daphne module:

pip install django-eventstream daphne

Add the daphne and django_eventstream apps to settings.py:

INSTALLED_APPS = [
    ...
    "daphne",
    "django_eventstream",
]

Add an endpoint in urls.py:

from django.urls import path, include
import django_eventstream

urlpatterns = [
    ...
    path("events/", include(django_eventstream.urls), {"channels": ["test"]}),
]

That's it! If you run python manage.py runserver, clients will be able to connect to the /events/ endpoint and get a stream.

With Django Rest Framework

First, install this module with REST dependencies and the daphne module:

pip install django-eventstream[DRF] daphne

Note: [DRF] is an optional dependency specifier that also installs the Django Rest Framework module.

Add the daphne and django_eventstream apps to settings.py:

INSTALLED_APPS = [
    ...
    "daphne",
    "django_eventstream",
]

Add an endpoint in urls.py:

from django.urls import path, include
from rest_framework.routers import DefaultRouter
from django_eventstream.views import EventsViewSet, configure_events_view_set

router = DefaultRouter()

# In these examples, we use the `configure_events_view_set` function to create a view set with the channels and messages_types that we want to use.
router.register('events1', configure_events_view_set(channels=["channel1", "channel2",...], messages_types=["message","info",...]), basename='events1')
# Or you can create a view set and pass it directly to the router. Channels can then be set by URL or by query parameters. messages_types can be set by query parameters, or the default value "message" is used.
router.register('events2', EventsViewSet, basename='events2')

urlpatterns = [
    ...
    path("", include(router.urls)), # Here we register the router urls
]

Note: These configurations are only applicable when using the Django Rest Framework version of the module (i.e., when the Django Rest Framework is installed).

Sending events

To send data to clients, call send_event:

from django_eventstream import send_event

#send_event(<channel>, <event_type>, <event_data>)
send_event("test", "message", {"text": "hello world"})

The first argument is the channel to send on, the second is the event type, and the third is the event data. The data will be JSON-encoded using DjangoJSONEncoder.
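DjangoJSONEncoder mainly matters when the event data contains values that plain JSON can't serialize, such as dates. A stdlib-only sketch of the same idea (ISOJSONEncoder below is an illustrative stand-in, not Django's actual class):

```python
import datetime
import json

class ISOJSONEncoder(json.JSONEncoder):
    """Illustrative stand-in for DjangoJSONEncoder: dates become ISO 8601 strings."""
    def default(self, o):
        if isinstance(o, (datetime.datetime, datetime.date)):
            return o.isoformat()
        return super().default(o)

print(json.dumps({"created": datetime.date(2024, 1, 2)}, cls=ISOJSONEncoder))
# {"created": "2024-01-02"}
```

Without the custom encoder, json.dumps would raise a TypeError on the date value.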

Note: In a basic setup, send_event must be called from within the server process (e.g. from a view). It won't work if called from a separate process, such as the shell or a management command. To publish from another process, you could use Redis or another message queue to hand the event off to the server process, which then calls send_event.

Be aware that if you send a message with the event type error (or any type used by the SSE protocol, like open, error, or close), the message may be interpreted as an error or control event by the client.

Disable the browsable API SSE view for production [DRF]

To disable the Browsable API view SSE for production, you can modify the settings.py of the project by adding the following lines as recommended by Django REST Framework:

REST_FRAMEWORK = {
    'DEFAULT_RENDERER_CLASSES': (
        # Add here the renderer classes you want to use
    ),
}

Next, list the renderers you want to use in DEFAULT_RENDERER_CLASSES. You will need django_eventstream.renderers.SSEEventRenderer to enable SSE functionality. If you also want to use the Browsable API view, add django_eventstream.renderers.BrowsableAPIEventStreamRenderer; if not, leave it out.

/!\ Be careful not to list the eventstream renderers before JSONRenderer and BrowsableAPIRenderer (or your other renderers), otherwise the API will probably not work as expected.

REST_FRAMEWORK = {
    'DEFAULT_RENDERER_CLASSES': [
        'rest_framework.renderers.JSONRenderer',
        'rest_framework.renderers.BrowsableAPIRenderer',
        'django_eventstream.renderers.SSEEventRenderer',
        'django_eventstream.renderers.BrowsableAPIEventStreamRenderer'
         # Add other renderers as needed
    ]
}

Deploying

After following the instructions in the previous section, you'll be able to develop and run locally using runserver. However, you should not use runserver when deploying, and instead launch an ASGI server such as Daphne, e.g.:

daphne your_project.asgi:application

See How to deploy with ASGI.

WSGI mode can work too, but only in combination with a GRIP proxy. See Multiple instances and scaling.

Multiple instances and scaling

If you need to run multiple instances of your Django app for high availability or scalability, or need to send events from management commands, then you can introduce a GRIP proxy such as Pushpin or Fastly Fanout into your architecture. Otherwise, events originating from an instance will only be delivered to clients connected to that instance.

For example, to use Pushpin with your app, you need to do three things:

  1. In your settings.py, add the GripMiddleware and set GRIP_URL to reference Pushpin's private control port:
MIDDLEWARE = [
    "django_grip.GripMiddleware",
    ...
]
GRIP_URL = 'http://localhost:5561'

The middleware is part of django-grip, which should have been pulled in automatically as a dependency of this module.

  2. Configure Pushpin to route requests to your app, by adding something like this to Pushpin's routes file (usually /etc/pushpin/routes):
* localhost:8000 # Replace `localhost:8000` with your app's URL and port
  3. Configure your consuming clients to connect to the Pushpin port (by default this is port 7999). Pushpin will forward requests to your app and handle streaming connections on its behalf.

If you would normally use a load balancer in front of your app, it should be configured to forward requests to Pushpin instead of your app. For example, if you are using Nginx you could have configuration similar to:

location /api/ {
    proxy_pass http://localhost:7999;
}

The location block above will pass all requests coming on /api/ to Pushpin.

Event storage

By default, events aren't persisted anywhere, so if clients get disconnected or if your server fails to send data, then clients can miss messages. For reliable delivery, you'll want to enable event storage.

First, set up the database tables:

python manage.py migrate

Then, set a storage class in settings.py:

EVENTSTREAM_STORAGE_CLASS = 'django_eventstream.storage.DjangoModelStorage'

That's all you need to do. When storage is enabled, events are written to the database before they are published, and they persist for 24 hours. If clients get disconnected, intermediate proxies go down, or your own server goes down or crashes at any time, even mid-publish, the stream will automatically be repaired.
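The repair works because events carry IDs: a reconnecting client reports the last ID it saw, and the server replays everything after it. The lookup can be sketched in plain Python (an illustration of the idea, not the actual storage API):

```python
def events_after(stored_events, last_id):
    """Return events with an ID greater than last_id, in order.

    stored_events: list of (id, data) tuples, ascending by id.
    """
    return [(i, d) for (i, d) in stored_events if i > last_id]

log = [(1, "a"), (2, "b"), (3, "c")]
print(events_after(log, 1))  # [(2, 'b'), (3, 'c')]
```

If the gap is larger than what storage retains, the client instead receives a stream-reset event (see "Receiving in the browser" below) and must reinitialize.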

To enable storage selectively by channel, implement a channel manager and override is_channel_reliable.
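The decision logic for such an override can be as simple as a prefix check. A minimal sketch, where the channel prefixes are hypothetical and the function stands in for the method body you would put on your channel manager:

```python
# hypothetical prefixes for channels that should be persisted
RELIABLE_PREFIXES = ("orders-", "jobs-")

def is_channel_reliable(channel, prefixes=RELIABLE_PREFIXES):
    """Sketch of the predicate a channel manager override might use."""
    return any(channel.startswith(p) for p in prefixes)

print(is_channel_reliable("orders-42"))   # True
print(is_channel_reliable("presence-1"))  # False
```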

Receiving in the browser

Include client libraries on the frontend:

<script src="{% static 'django_eventstream/eventsource.min.js' %}"></script>
<script src="{% static 'django_eventstream/reconnecting-eventsource.js' %}"></script>

Listen for data:

var es = new ReconnectingEventSource('/events/');

es.addEventListener('message', function (e) {
    console.log(e.data);
}, false);

es.addEventListener('stream-reset', function (e) {
    // ... client fell behind, reinitialize ...
}, false);

Authorization

Declare a channel manager class with your authorization logic:

from django_eventstream.channelmanager import DefaultChannelManager

class MyChannelManager(DefaultChannelManager):
    def can_read_channel(self, user, channel):
        # require auth for prefixed channels
        if channel.startswith('_') and user is None:
            return False
        return True

Configure settings.py to use it:

EVENTSTREAM_CHANNELMANAGER_CLASS = 'myapp.channelmanager.MyChannelManager'

Whenever permissions change, call channel_permission_changed. This will cause clients to be disconnected if they lost permission to the channel.

from django_eventstream import channel_permission_changed

channel_permission_changed(user, '_mychannel')

Note: OAuth may not work with the AuthMiddlewareStack from Django Channels. See this token middleware.

Routes and channel selection

The channels the client listens to are specified using Django view keyword arguments on the routes. Alternatively, if no keyword arguments are specified, then the client can select the channels on its own by providing one or more channel query parameters in the HTTP request.

Examples:

# specify fixed list of channels
path('foo/events/', include(django_eventstream.urls), {'channels': ['foo']})

# specify a list of dynamic channels using formatting based on view keywords
path('objects/<obj_id>/events/', include(django_eventstream.urls),
    {'format-channels': ['object-{obj_id}']})

# client selects a single channel using a path component
path('events/<channel>/', include(django_eventstream.urls))

# client selects one or more channels using query parameters
path('events/', include(django_eventstream.urls))

Note that if view keywords or a channel path component are used, the client cannot use query parameters to select channels.

If even more advanced channel mapping is needed, implement a channel manager and override get_channels_for_request.
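The format-channels option substitutes view keyword arguments into the channel templates; the substitution itself behaves like ordinary str.format. A sketch of the idea (not the library's internals; the function name is ours):

```python
def resolve_channels(format_channels, view_kwargs):
    """Expand channel templates like 'object-{obj_id}' with the view kwargs."""
    return [tmpl.format(**view_kwargs) for tmpl in format_channels]

print(resolve_channels(["object-{obj_id}"], {"obj_id": "42"}))  # ['object-42']
```

So a request to objects/42/events/ with {'format-channels': ['object-{obj_id}']} would subscribe the client to the object-42 channel.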

Cross-Origin Resource Sharing (CORS) Headers

There are settings available to set response headers Access-Control-Allow-Origin, Access-Control-Allow-Credentials, and Access-Control-Allow-Headers, which are EVENTSTREAM_ALLOW_ORIGINS, EVENTSTREAM_ALLOW_CREDENTIALS, and EVENTSTREAM_ALLOW_HEADERS, respectively.

Examples:

EVENTSTREAM_ALLOW_ORIGINS = ['http://example.com', 'https://example.com']
EVENTSTREAM_ALLOW_CREDENTIALS = True
EVENTSTREAM_ALLOW_HEADERS = 'Authorization'

Note that EVENTSTREAM_ALLOW_HEADERS only takes a single string value and does not process a list.

If more advanced CORS capabilities are needed, see django-cors-headers.

django-eventstream's People

Contributors

ahouy, bjmnbraun, bogdan-calapod, dependabot[bot], enzofrnt, erfantarighi, hugolundin, iftimie, jkarneges, jonhuber, maikksmt, noamkush, rickyv33, svvitale, syifarahmat


django-eventstream's Issues

grip proxy not working

I see the events being created in the django_eventstream_event table. I set GRIP_URL to point to my Pushpin backend at http://pushpin:5561, and I have daphne running.

However looking at the access log of my pushpin server there is absolutely nothing happening on there. And the client that is connected to daphne does not receive any messages.

Is there any way I can debug this? The setup works fine if I just use runserver but that obviously doesn't work in production.

SSL Issue

When ssl was implemented on our platform, it stopped receiving events.

We've used channels and daphne daphne app4gives.asgi:application to run our application

server_addr not set

I don't know if this is the right place. But I have problems with connecting to the SSE endpoint. Sometimes it works sometimes it doesn't. I get this error

"server": self.server_addr,
    AttributeError: 'NoneType' object has no attribute 'value'

Do you maybe know what is going on and how I can fix it?

Every other event pattern listening to the stream

Hi,

I followed your instructions on the setup using Django channels 2. Everything seems to be working great and I have my /events/ stream working. I started sending events to the stream and I noticed something.
I’m sending JSON objects like these (all with same event type), separately:

{"message": "1"}
{"message": "2"}
{"message": "3"}
{"message": "4"}
{"message": "5"}

However, the events I listen from the stream are sent in an every other event pattern, like:

{"message": "1"}
{"message": "3"}
{"message": "5"}

I couldn’t find any similar issue, and I can’t understand why I’m getting this pattern every time.
Could you help me understand what is the origin of the problem?
Could you also suggest a way/tool to debug whats happening in the stream, because I can’t have any information about what is happening after I call the send_event function, and this is making the debug almost impossible...

I'm using Django 3. Could that be a problem? I didn't see any specification of the required Django version.

Read/unread events

Hi, I'm trying to change how the event persistence works. The goal is to keep track of unread events so that the user can retrieve unread messages. I'd appreciate any opinion on what I'm doing.

In my project each user has a dedicated channel identified by user pk. I was able to accomplish this with the following code:

urls.py:

urlpatterns = [
    ...
    path('events/<str:channel>/', include(django_eventstream.urls)),
    ...

channelmanager.py subclass:

class ChannelManager(DefaultChannelManager):
    def can_read_channel(self, user, channel):
        return str(user.pk) == channel

To keep track of read/unread status I added a model and a post_save signal.

models.py

class EventTracker(models.Model):
    event = models.ForeignKey(Event, on_delete=models.CASCADE)
    read_date = models.DateTimeField(db_index=True, default=None, blank=True, null=True)

@receiver(post_save, sender=Event)
def eventstream_post_save(sender, instance, raw, using, update_fields, **kwargs):
    if instance.type == 'message':
        EventTracker.objects.create(event=instance, read_date=None)

And this is the context processor I use to pass unread messages counter to the template:

def messages(request):
    unread = EventTracker.objects.filter(
        event__channel=request.user.pk, event__type="message", read_date__isnull=True
    )

    return {'unread_messages_count': unread.count()}

It's a draft and I didn't test it well, but it seems to work. I have a couple of questions:

Questions:

  • what do you think about this approach?
  • I'd need to set EVENT_TIMEOUT to, for example 1 year. Would it be possible to change the current def trim_event_log(self) signature to def trim_event_log(self, event_timeout=EVENT_TIMEOUT), and use this value to get the cutoff?
  • would it be possible to make Event model configurable in settings.py, as you did with EVENTSTREAM_STORAGE_CLASS? I'm using postgresql and was thinking to change the data field from TextField as currently is to JSONField.

If you consider these changes useful I and if you want, I could create a pull request.

Thanks

minor improvements for chat example

The chat example is obviously not meant to be a fully polished app, but still there may be some small UI improvements we could make:

  • Maybe text should come up from the bottom?
  • On mobile, rubber band vertical scrolling should only affect the inner message log area and not the whole screen.
  • Display the user's own username somewhere.
  • Display some "enter message:"-style prompt at the bottom or alternative visual, so it's obvious where to type.
  • Add a link to the code.

Is it possible to update subscribed channels?

Hello,

My problem is that I want my clients to receive data from the server, but the channels they have subscribed to may change over time. So I've looked into django_grip and gripcontrol, but I'm not sure if it is possible to update the subscribed channels of an SSE connection as you would with a WebSocket.
Or should I use WebSockets? My issue would then be passing the authentication token.

Thank you very much for your help.

Have a great day !

postgresql backend support

Hello django-eventstream devs,

in my setup there seems to be an issue with django-eventstream and postgresql support.

If I use sqlite as my main database backend, the events are pushed fine. I then migrated my app to a production system and changed my DB backend to postgres. Now the events are only pushed on shutdown of the daphne instance, which is really strange.

So to reproduce:

  1. Deploy Django App with postgres backend and daphne as our asgi webserver
  2. Try to send some event to your users
    • The user doesn't receive the updates, but it seems that the messages are somehow queued.
  3. Stop the daphne worker - immediately the user receives the update from the queue.

Just to repeat: with sqlite all is working as expected.

Any hint how to resolve that?

send_event doesn't work with pushpin inside container

Hi, I have a multi container application setup and I can't configure communication between processes via Pushpin.
Container setup is next:

  • gunicorn django container - app
  • daphne django container - async_app
  • nginx container - nginx
  • celery worker container - celery
  • pushpin container - pushpin

In my django project I installed the django-eventstream app and configured everything according to your guide. I've set GRIP_URL to 'http://pushpin:5561'.
I've configured routing

from django.conf.urls import url
from channels.routing import URLRouter
from channels.http import AsgiHandler
from channels.auth import AuthMiddlewareStack
import django_eventstream

urlpatterns = [
    url(r'^account/(?P<account_id>\w+)/events/', AuthMiddlewareStack(
        URLRouter(django_eventstream.routing.urlpatterns)
    ), {'format-channels': ['account-{account_id}']}),
    url(r'', AsgiHandler),
]

nginx configured to proxy /events requests to daphne container

location ~* .account/[0-9]/events/$ {
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";

            client_max_body_size 500m;
            proxy_pass http://async_app;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $remote_addr;
            proxy_set_header Host $host;
        }

And I've also mounted a config folder into the pushpin container, where I put the routes file:

* async_app_1:80

I've initialized eventsource on front-end:

var es = new ReconnectingEventSource('/account/' + currentUser.accountId + '/events/');

        es.addEventListener('message', function (e) {
            var data = JSON.parse(e.data);
            console.log('Received message');
            console.log(data.text);
        }, false); 

        es.onopen = function () {
            console.log('connected');
        };

And in my celery task I call send_event:

def calculate():
    send_event(f'account-{account_id}', 'message', {'text': 'Started calculating'})
    calculate()
    send_event(f'account-{account_id}', 'message', {'text': 'Ended calculating'})

When I load the page I see the connection being successfully established, and it receives a keep-alive message from time to time. But when I trigger my celery task, the pushpin container log shows that both messages were published but no one received them:

[INFO] 2020-06-08 11:05:03.351 [handler] control: POST /publish/ code=200 10 items=1
[INFO] 2020-06-08 11:05:03.357 [handler] publish channel=events-account-2 receivers=0
[INFO] 2020-06-08 11:05:03.926 [handler] control: POST /publish/ code=200 10 items=1
[INFO] 2020-06-08 11:05:03.926 [handler] publish channel=events-account-2 receivers=0

I've double-checked my django settings and searched the error log for any clue, but everything seems to be working fine except that the client doesn't get subscribed to the message channel for some reason.

Too many open files

Hi,

We are using django-eventstream for sending out events to clients. You can think of our workflow as a celery-like use case, but a very simple one. Things were working flawlessly until we hit the 'too many open files' error (Redhat 7.4). We tracked which processes were opening the files using 'lsof' and found python was spawning several threads which loaded the required libraries (mostly .so files). We are using gunicorn as our server, which spawns uvicorn workers. We tried to fall back to 'runserver', but faced the same issue.

On trying out the 'time' and 'chat' examples, we saw the same behavior. On every refresh of the page (same machine, same browser, same tab) a new thread is spawned and 'lsof' lists an increment of about 2k files on every refresh of the page.

We tried to recreate the same issue on two other machines with the same OS. We saw the same behavior, except on one machine. That one was a laptop with 4GB of RAM; the rest are servers with 256GB of RAM. Interestingly, everything works absolutely fine on the laptop, but not on the servers. Maybe because of the relative sparsity of resources, the OS is closing the files on the laptop but not on the servers, which is causing the 'too many open files' error?

Any idea how to resolve this issue?
Cheers!

chat example input bug

When using the chat example on Safari on iOS, submitting a message by tapping return on the soft-keyboard rather than tapping the submit button in the HTML content area causes a flickering effect.

Option for blocking publish

Feature request: Support for the "blocking" option from django_grip in eventstream.

Rationale:
We are running into an issue where some messages are not sent from celery workers. I suspect what is happening is that the celery worker process is exiting (including daemon threads) before the event is published by the daemon thread in pubcontrol.

We have logging in place just before calling eventstream.send_event, and the call is being made by the celery workers, but the message is not received by the client. This is an intermittent problem. We observed this with a self-hosted pushpin proxy, and continue to see it using fanout.io.

asgi + wsgi server: events not sent

Hello,

I've set up an application with django_eventstream and it works fine running locally with ./manage.py runserver.
When I build my application behind uwsgi (for the wsgi endpoints) and uvicorn (for the asgi endpoints) on two distinct ports, clients can connect to the event stream but don't receive any events.

I think I've narrowed down the issue: when a client connects to the asgi app, the listener gets registered in the ListenerManager. But when the wsgi app sends an event, the ListenerManager reports that it has no listeners.
Printing the ListenerManager in these cases shows that the wsgi app and the asgi app do not use the same ListenerManager:

<django_eventstream.consumers.ListenerManager object at 0x7f3e4ab136d0>
<django_eventstream.consumers.ListenerManager object at 0x7ff74499d8e0>

So it's likely that I didn't understand how to serve my asgi + wsgi app in production.

This is how I serve my app:

  uvicorn passe_un_dessin.asgi:application --host 0.0.0.0 --port 9002 & \
  uwsgi --http :9001 --wsgi-file passe_un_dessin/wsgi.py --master --processes 4 --threads 2

passe_un_dessin.settings :

WSGI_APPLICATION = "passe_un_dessin.wsgi.application"
ASGI_APPLICATION = "passe_un_dessin.routing.application"

passe_un_dessin.asgi :

import os
import django
from channels.routing import get_default_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "passe_un_dessin.settings.prod")
django.setup()
application = get_default_application()

passe_un_dessin.routing :

from channels.routing import ProtocolTypeRouter, URLRouter
import core.routing

application = ProtocolTypeRouter({"http": URLRouter(core.routing.urlpatterns)})

core.routing:

import django_eventstream
from channels.http import AsgiHandler
from channels.routing import URLRouter
from django.conf.urls import url

urlpatterns = [
    url(r"^events/", URLRouter(django_eventstream.routing.urlpatterns),),
    url(r"", AsgiHandler),
]

I've installed channels and django-eventstream.

Can you see what I'm doing wrong?

Firefox connection was interrupted while the page was loading

I configured a simple project following Setup with Channels. As stated, I tried to connect to 127.0.0.1:8000/events/ and using Chrome everything worked fine. I also added the javascript code to receive events in the browser following the relevant docs and still no problems using Chrome.

When I connect to /events/ with Firefox (68.9.0esr) I get this download window
image

and trying to receive events in the browser I get this in the Firefox web console:

The connection to http://127.0.0.1:8000/events/ was interrupted while
the page was loading. reconnecting-eventsource.js:58:21

The command I use to start Daphne is:

$ daphne project.asgi:application

django-eventstream version: latest available commit (745033a)
Django version: 3.0.4

Any idea why and a possible fix/workaround?

PyJWT dependency version

Hi, I'm using django-eventstream in a project. Now I have to add djoser as a new dependency but pip says there is a version conflict on PyJWT.

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed.
This behaviour is the source of the following dependency conflicts.
django-eventstream 4.1.0 requires PyJWT<2,>=1.5,
but you have pyjwt 2.1.0 which is incompatible.

So, pip installed PyJWT 2.1.0 anyway and django-eventstream still works. Or at least, I only use send_event and it still works.

I wonder if I can safely keep 2.1.0 or if there would be some issue with something else (I'll have to use a channel manager class with custom authorization logic). After all if setup.py specifies PyJWT>=1.5,<2 maybe there is a reason.

Any opinion?

PS.
if it matters, PyJWT was at v1.7.1 before pip install djoser.

Is it possible to use gzip compression?

Hello, the application where I'm using django-eventstream sends every few seconds a good amount of text data using send_event. I wonder if it'd be possible to send compressed messages to the clients instead of full responses (~100kb).

I need to reduce the outgoing bandwidth, thanks for any advice you could give.

Troubles getting user with django Oauth2

Hello,
I've an issue making authentication work in the channels. I made a fairly simple channel manager :

from logging import info
from django_eventstream.channelmanager import DefaultChannelManager

class MyChannelManager(DefaultChannelManager):
    def can_read_channel(self, user, channel):
        info(user)
        # Require Auth
        if user is None:
           return False
        return True

The problem is, I'm using Oauth2 for authentication, and the user given in the can_read_channel is always None. In my views, I get the user with request.user without problems by passing the token in the headers but not in the channel manager.

Do I need some extra step to get the user? I've followed the tutorial given by the readme, so I have:

urlpatterns = [
    url(r'^v1/gateways-events/gateway/(?P<obj_id>\w+)/', AuthMiddlewareStack(
        URLRouter(django_eventstream.routing.urlpatterns)
    ), {'format-channels': ['gateway-{obj_id}']}),
    url(r'', AsgiHandler),
]

In a routing.py file


from channels.routing import ProtocolTypeRouter, URLRouter
import apps.myhardware.routing

application = ProtocolTypeRouter({
    'http': URLRouter(apps.myhardware.routing.urlpatterns),
})

In the main routing.py file...

Thanks in advance for your help !

TypeError: url() got an unexpected keyword argument 'channels'

hello, I am configuring the project following your explanations, but at some point I got this error:

TypeError: url() got an unexpected keyword argument 'channels'

triggered by this line :
url(r'^events/', AuthMiddlewareStack(URLRouter(django_eventstream.routing.urlpatterns)), channels=['test']),

I have no other clues because I am new to this project. I've checked my other routing.py file which calls this one, and installed channels and django-eventstream.

Skipping delivery to certain users on a given channel

I found a skip_user_ids argument in

def send_event(channel, event_type, data, skip_user_ids=[]):
	from .event import Event

which gets passed on to this

def publish_event(channel, event_type, data, pub_id, pub_prev_id,
		skip_user_ids=[]):
	from django_grip import publish

	content_filters = []
	if pub_id:
		event_id = '%I'
		content_filters.append('build-id')
	else:
		event_id = None
	content = sse_encode_event(event_type, data, event_id=event_id, escape=bool(pub_id))
	meta = {}
	if skip_user_ids:
		meta['skip_users'] = ','.join(skip_user_ids)
	publish(
		'events-%s' % quote(channel),
		HttpStreamFormat(content, content_filters=content_filters),
		id=pub_id,
		prev_id=pub_prev_id,
		meta=meta)

only to be saved in meta and never used again.
Is this feature planned for the near future?
If not, I would like to spend some time adding it. Can anyone point me in the right direction?

send_event function does not work on linux server

Hi guys,
I'm using django-eventstream over Django channels to send events to my client app.
On my local machine, the events are sent correctly to the client,
but when I upload the app to my Linux server, the connection just gets opened and receives 'keep-alive' messages; the events don't reach the client at all.
I use daphne to deploy my ASGI app and Nginx as my gateway.
When I use python manage.py runserver (on the Linux server), the client receives the messages.
I don't see the events being sent in the daphne logs either.
Does anyone have a clue why this is happening?
Thanks!

AttributeError: 'NoneType' object has no attribute 'get_current_id'

hello, I have the following error: AttributeError: 'NoneType' object has no attribute 'get_current_id'
Here are the logs:

Traceback (most recent call last):
  File "/home/vincentnahmias/Workspace/venvs/testbed/lib/python3.6/site-packages/django/core/handlers/exception.py", line 41, in inner
    response = get_response(request)
  File "/home/vincentnahmias/Workspace/venvs/testbed/lib/python3.6/site-packages/django/core/handlers/base.py", line 187, in _get_response
    response = self.process_exception_by_middleware(e, request)
  File "/home/vincentnahmias/Workspace/venvs/testbed/lib/python3.6/site-packages/django/core/handlers/base.py", line 185, in _get_response
    response = wrapped_callback(request, *callback_args, **callback_kwargs)
  File "/home/vincentnahmias/Workspace/IOT-Testbed-Dashboard/Dashboard/views.py", line 13, in home
    context['last_id'] = get_current_event_id(['test'])
  File "/home/vincentnahmias/Workspace/venvs/testbed/lib/python3.6/site-packages/django_eventstream/eventstream.py", line 108, in get_current_event_id
    cur_ids[channel] = str(storage.get_current_id(channel))
AttributeError: 'NoneType' object has no attribute 'get_current_id'

I just copy-pasted the function from your project's views.py into mine, so I don't really know what the cause is.

EventsConsumer doesn't work with channels >=2.1.6

When using Django Channels >= 2.1.6, EventsConsumer doesn't work. Upon sending a GET request to the SSE endpoint (e.g. with curl), no response data is received back and the connection is left open. EventsConsumer never replies and no events are delivered. This didn't happen with channels 2.1.5, where it works fine.

The implementation of AsyncHttpConsumer was changed in 2.1.6, which seems to be the cause of this issue. As a result, EventsConsumer.disconnect() gets called and the consumer is left in this dangling state. A dirty workaround is to await the self.stream task at the end of EventsConsumer.handle().
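The workaround described above can be sketched outside of Channels. The idea is simply that the handler coroutine must not return while the streaming task is still alive (assuming, as in `EventsConsumer`, that the long-lived stream runs as a separate task); the names below are illustrative, not the library's actual internals:

```python
import asyncio

async def stream(send):
    # stand-in for the long-lived SSE writing loop
    for i in range(3):
        send('event %d' % i)
        await asyncio.sleep(0)

async def handle(send):
    # The workaround: spawn the stream as a task, then await it at the
    # end of handle() so the handler does not return (and the consumer
    # does not get torn down) while streaming is in progress.
    stream_task = asyncio.ensure_future(stream(send))
    await stream_task

events = []
asyncio.run(handle(events.append))
print(events)  # ['event 0', 'event 1', 'event 2']
```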

multi sites support

Is it possible to support domain wildcards? For example:
EVENTSTREAM_ALLOW_ORIGIN = '*.your-domain.com'
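django-eventstream does not document wildcard origins, but a matcher for such a hypothetical setting could be sketched with the standard library's `fnmatch` (the helper name below is illustrative, not part of the library):

```python
from fnmatch import fnmatch

def origin_allowed(origin, allowed_pattern):
    """Hypothetical helper: match a request's Origin host against a wildcard pattern."""
    host = origin.split('://', 1)[-1]
    return fnmatch(host, allowed_pattern)

print(origin_allowed('https://app.your-domain.com', '*.your-domain.com'))  # True
print(origin_allowed('https://evil.com', '*.your-domain.com'))             # False
```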

Connected clients

Is it possible to get the connected clients for a channel? I'd need related client info (IP address, number of connected clients, etc.).

Basically, what I really need is: if I have more than n connected clients, I should be able to limit the number of outgoing messages. Tangential reference ticket: #67 (any opinion on that?)

Default number of limit on channel numbers

From the source code, I can see that the default limit on the number of channels is 10.

My question is: if I do not override this value, does that mean at most 10 channels can be used?

def __init__(self, http_request=None, channel_limit=10, view_kwargs={})

use JS from CDNs

Rather than keeping local copies of JS files in this repository that could go out of date, we should consider referencing the files from a CDN. Maybe with a template tag that allows choosing a certain version, or default to latest.

Sending a message from a celery worker

hello!

maybe one of you got an idea how to resolve this issue.

I try to send a message to the user's eventstream via a Django signal. As long as the signal is fired from the main Django process, the message is sent to the client. Unfortunately, some background jobs should also use this signal. It triggers correctly in the celery worker, but the sent message is never received by the client.

Maybe someone got a hint in the right direction?

Thank you!
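One setup that would make cross-process publishing work is relaying events through a GRIP proxy such as Pushpin, configured identically in the web process and the worker. This is a sketch, not a confirmed fix for the issue above; the URL is Pushpin's default publish endpoint as shown in the django-eventstream README, and should be adjusted for your deployment:

```python
# settings.py
# Both the Django web process and the celery worker read this setting, so
# send_event() from either process publishes to the same proxy, which is
# what holds the open client connections.
GRIP_URL = 'http://localhost:5561'
```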

persistence configurability

It would be useful to be able to configure persistence by channel or message, rather than all or nothing.

Documentation for production deployment

Hi,
I am trying to figure out how to deploy fanout/eventstream to production. I have a beta server on Elastic Beanstalk under https://beta..io . It has all the certificates provisioned by AWS.

So I've created a production realm in fanout.io for the production version of my site. I created fanout..io and now I wanted to clarify the recommended approach.

Should I essentially have two clusters, one for real-time traffic with fanout and one for my production app? Or do I just put my service behind fanout as a reverse proxy?

I am a bit confused as to how I should set it up.

Then the question of SSL comes to mind: is there a pragmatic way for me to refresh my certificates with Fanout, using letsencrypt for example, or do I have to update the certificates manually?

I couldn't find any deployment instructions in the docs, so I decided to bring this up here, in case others are looking for similar instructions.

Exception inside application: Object of type 'dict_keys' is not JSON serializable

Error message:

[2018/09/01 01:13:11] HTTP GET /messages/default/ 200 [0.08, 127.0.0.1:61023]
in EventsConsumer.py line 147: request= <AsgiRequest: GET '/events/?channel=room-default'>
in eventresponse.py line 73: es_meta= {'iss': 'es', 'exp': 1535767991, 'channels': dict_keys(['room-default']), 'user': '2'}
(note: the channels value above is a dict_keys view rather than a list, which is what fails to serialize)


2018-09-01 01:13:12,302 - ERROR - server - Exception inside application: Object of type 'dict_keys' is not JSON serializable
  File "venv/lib/python3.6/site-packages/channels/sessions.py", line 175, in __call__
    return await self.inner(receive, self.send)
  File "venv/lib/python3.6/site-packages/channels/middleware.py", line 41, in coroutine_call
    await inner_instance(receive, send)
  File "venv/lib/python3.6/site-packages/channels/generic/http.py", line 26, in __call__
    await self.handle(b"".join(body))
  File "venv/lib/python3.6/site-packages/django_eventstream/consumers.py", line 148, in handle
    response = event_response.to_http_response(request)
  File "venv/lib/python3.6/site-packages/django_eventstream/eventresponse.py", line 74, in to_http_response
    params['es-meta'] = jwt.encode(es_meta, settings.SECRET_KEY.encode('utf-8'))
  File "venv/lib/python3.6/site-packages/jwt/api_jwt.py", line 62, in encode
    cls=json_encoder
  File "/anaconda/lib/python3.6/json/__init__.py", line 238, in dumps
    **kw).encode(obj)
  File "/anaconda/lib/python3.6/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/anaconda/lib/python3.6/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "/anaconda/lib/python3.6/json/encoder.py", line 180, in default
    o.__class__.__name__)
  Object of type 'dict_keys' is not JSON serializable
[2018/09/01 01:13:12] HTTP GET /events/?channel=room-default 500 [0.90, 127.0.0.1:61023]
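The failure is reproducible in plain Python: `dict_keys` views are not JSON serializable, so a fix along these lines (converting the view to a list before it is placed into `es_meta`) would be expected to work:

```python
import json

channels = {'room-default': object()}  # channel map, as in es_meta

# json cannot encode a dict_keys view:
try:
    json.dumps({'channels': channels.keys()})
    raised = False
except TypeError:
    raised = True
print(raised)  # True

# Converting to a list first serializes fine:
print(json.dumps({'channels': list(channels.keys())}))  # {"channels": ["room-default"]}
```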

Environment:

(venv) ➜ pip freeze
asgiref==2.3.2
async-timeout==3.0.0
attrs==18.1.0
autobahn==18.8.1
Automat==0.7.0
certifi==2018.8.24
channels==2.1.3
chardet==3.0.4
constantly==15.1.0
daphne==2.2.2
Django==2.1
django-eventstream==2.2.0
django-grip==1.8.0
djangorestframework==3.8.2
gripcontrol==3.3.0
hyperlink==18.0.0
idna==2.7
incremental==17.5.0
psycopg2-binary==2.7.5
pubcontrol==2.4.2
PyHamcrest==1.9.0
PyJWT==1.6.4
pytz==2018.5
requests==2.19.1
six==1.11.0
Twisted==18.7.0
txaio==18.7.1
urllib3==1.23
Werkzeug==0.14.1
zope.interface==4.5.0

Client side code:

conn_eventstream: function(){
                  let uri = '/events/?channel=room-' + encodeURIComponent('{{ room_name }}');
                  let es = new ReconnectingEventSource(uri);
                  console.log('stream opened. room_name=' + uri);

                  es.addEventListener('open', function () {
                      console.log('Event stream opened')
                  }, false);

                  es.addEventListener('error', function () {
                      console.log('Error. Connection lost. Trying to reconnect')
                  }, false);

                  es.addEventListener('message', function (evt) {
                      console.log(('Message from server: ' + evt.data))
                  }, false)
              }
          }

django-eventstream vs django-grip

I had started experimenting with django-grip. What's the difference between the two libraries?
I'm also currently getting CORS issues with django-grip.

    <article hx-sse="connect localhost:8000/events/{{ item.id }}">
        {{  item.body | safe }} 
        {% if media %}
            <img src="{{ media.image }}" />
        {% endif %}
        <a href="{% url 'create-comment' item.id %}">reply</a>
    </article>

My view to subscribe to an item:

    # imports added for completeness
    from django.http import HttpResponse
    from django_grip import set_hold_stream

    def subscribe(request, item_id):  # view wrapper added for context
        client_details = {"id": request.user.pk, "item": item_id}
        set_hold_stream(request, f"items/{item_id}")
        return HttpResponse("connected", content_type="text/event-stream")

session cookie missing from reconnect requests

I am currently evaluating fanout in a development environment using ngrok. I use curl to connect to my django application with a valid session:
curl -i https://my-realm.fanoutcdn.com/eventstream/ --cookie "sessionid=my-valid-session-key"

I have a custom manager which assigns channels based on the user. On the first request, this works fine. It seems that there are later requests which lack the session cookie to authenticate the user. If I put a breakpoint in my get_channels_for_request method, I can see that request.COOKIES contains the session id in the first request, and is an empty dictionary in later requests.

From my log files (nginx proxying to django runserver):

[11/Aug/2018:14:54:41 +0000] "GET /eventstream/ HTTP/1.1" 200 2077 "-" "curl/7.47.0" 0.322 2827 nate -  
127.0.0.1 - - [11/Aug/2018:14:54:42 +0000] "GET /eventstream/?recover=true&link=next HTTP/1.1" 200 0 "-" "-" 0.025 756 - -
127.0.0.1 - - [11/Aug/2018:14:54:43 +0000] "GET /eventstream/?recover=true&link=next HTTP/1.1" 200 0 "-" "-" 0.024 756 - -
127.0.0.1 - - [11/Aug/2018:14:56:44 +0000] "GET /eventstream/?recover=true&link=next HTTP/1.1" 200 0 "-" "-" 0.032 756 - -
127.0.0.1 - - [11/Aug/2018:14:58:45 +0000] "GET /eventstream/?recover=true&link=next HTTP/1.1" 200 0 "-" "-" 0.031 756 - -

On the first request, the client is getting the correct channels (and if I send an event very quickly, I can see it in my terminal). On later requests, the user in the ChannelManager is None, so the client is not subscribed to any channels. Note that by later requests I am referring to my nginx logs, as the original curl request has remained open.

This appears to make it impossible to have user-specific channels.

Events Only Sent On Server Exit

I searched through the issues but couldn't find anything quite like this - though #49 may be related.
I have setup django-eventstream per the readme (with channels setup) and the client is able to connect to event channels. It is a very similar setup to the chat example.

However, calling send_event seemingly does nothing: no events are sent (at least it seems that way) until the server exits. Whenever the server exits (or restarts, in the case of the hot-reloading Django dev server), all of the events that should have been sent are sent at once and received by the client. I cannot find another way to make the server finally send all these events that it is (presumably) keeping cached, aside from restarting it. It seems as if there is some sort of sending buffer that needs to be flushed?

This has been tried with Daphne, Daphne with HTTP/2 support, and Django's runserver; interestingly, all produce the same results.
To be thorough (though it seems as though events are never sent by the server), I've also swapped between a webpack dev server and an nginx server on the frontend, both producing the same result.

Versions:
Django-Eventstream: 3.1.0
Django: 3.0.3
Daphne: 2.5.0
Python: 3.7.5

Is there some sort of hidden buffer setting? I searched through the code for this repo and the related GRIP repos but didn't see any smoking gun...

http_request is missing sessions attribute

Django documentation for contrib AuthMiddleware and SessionMiddleware specify that request.user and request.session are available within views.

django-eventstream populates request.user only, which means that a custom ChannelManager's get_channels_for_request has access to request.user but has to use request.scope["session"] to get at the session.
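A hypothetical channel manager illustrating the workaround. The real base class is `django_eventstream.channelmanager.DefaultChannelManager` and the method signature below matches `get_channels_for_request`; the session key and channel naming are illustrative only:

```python
from types import SimpleNamespace

class SessionChannelManager:
    """Sketch of get_channels_for_request reading the session from the ASGI scope."""

    def get_channels_for_request(self, request, view_kwargs):
        # request.session is not populated here, but the scope carries
        # the session middleware's result:
        session = request.scope.get('session', {})
        user_id = session.get('_auth_user_id')  # the key Django's auth uses
        return {'user-%s' % user_id} if user_id else set()

# demo with a stand-in request object
req = SimpleNamespace(scope={'session': {'_auth_user_id': '7'}})
print(SessionChannelManager().get_channels_for_request(req, {}))  # {'user-7'}
```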

has been blocked by CORS policy: Request header field cache-control is not allowed by Access-Control-Allow-Headers in preflight response.


settings.py

EVENTSTREAM_ALLOW_ORIGIN = "*"
EVENTSTREAM_ALLOW_CREDENTIALS = False

# this is for corsheaders (pip)
CORS_ORIGIN_ALLOW_ALL = True

Is something wrong with the settings? Or is it causing a CORS issue because it's ASGI?
The other APIs are working fine; only /events/ does not work.
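A plausible fix, assuming django-cors-headers (the corsheaders app above) handles the preflight, is to allow the `Cache-Control` request header explicitly; `default_headers` is that library's documented helper. This is a sketch, not a confirmed resolution of the issue:

```python
# settings.py
from corsheaders.defaults import default_headers

# SSE clients and EventSource polyfills often send Cache-Control on the
# request, so it must appear in Access-Control-Allow-Headers:
CORS_ALLOW_HEADERS = list(default_headers) + ['cache-control']
```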

send_event not working with Django 2.1.3

I use channels==2.1.5 and Django==2.1.3, with which send_event fails.

The working combination is channels==2.1.2 and Django==1.11.15.

I found the publish method at line 159 of pubcontrol.py:

def publish(self, channel, item, blocking=False, callback=None):

In this function, I found that self.clients is empty. Is this why no events are sent?

*** connection lost, reconnecting... *** connected

I am trying to integrate django-eventstream with my application.
I took the chat example and put it in as a new app in my application under url "/chat/". I downloaded ngrok and ran it on port 8003 as this is where my application is.

Added 'django_eventstream', to installed apps
added 'django_grip.GripMiddleware', as my middleware

Got my Grip URL and Event Stream URLS defined
GRIP_URL = 'https://api.fanout.io/realm/"myid"?iss="myid"&key=base64:"mykey"'
Set up a model storage class
EVENTSTREAM_STORAGE_CLASS = 'django_eventstream.storage.DjangoModelStorage'

I also updated the allowed origin to the one returned to me when I ran ngrok.

Now when I launch the chat I keep getting this error. How do I go about troubleshooting this?
*** connection lost, reconnecting...
*** connected

Also I do not see anything on ngrok log.

Connections ttl opn rt1 rt5 p50 p90
0 0 0.00 0.00 0.00 0.00

Messages persist as they should. Am I missing anything in the configuration? I am not sure where to look next.

django_grip>=2.0.0,<3

With the release of Django 3.0, available_attrs was removed from django.utils.decorators.
While installing django-eventstream, django-grip version 2.2 will be used, which leads to the following error:

ImportError: cannot import name 'available_attrs' from 'django.utils.decorators' (/usr/local/lib/python3.7/dist-packages/django/utils/decorators.py)

install_requires should include version 3 of django_grip.

Best regards

Channel seems closed after receiving two messages

Hello, I am working on an IoT system monitoring web application. Here is my problem: I continuously receive data on a Django server, and each time I receive a new message I want to send it to my client through Django Channels. Here is my server code:

# imports added for completeness; runCoap is a method on a class, hence `self`
import asyncio
import json

from aiocoap import Context, Message, GET
from django_eventstream import send_event

async def runCoap(self):
    protocol = await Context.create_client_context()
    while True:
        requestTemp = Message(code=GET, uri='coap://129.6.60.38/other/sensors/temperature')
        requestHumidity = Message(code=GET, uri='coap://129.6.60.38/other/sensors/humidity')
        try:
            responseTemp = await protocol.request(requestTemp).response
            responseHumidity = await protocol.request(requestHumidity).response
            await asyncio.sleep(1)
        except Exception as e:
            print('Failed to fetch resource:')
            print(e)
        else:
            payloadTemp = responseTemp.payload.decode("utf-8")
            payloadHumidity = responseHumidity.payload.decode("utf-8")
            a = len(payloadTemp)
            b = len(payloadHumidity)
            # strip trailing metadata before parsing the JSON payload
            payloadTemp = json.loads(payloadTemp[:a - 77])
            payloadHumidity = json.loads(payloadHumidity[:b - 77])
            print("-----------------------------------------------")
            send_event('test', 'message', payloadTemp)
            send_event('test', 'message', payloadHumidity)

and here is my client code :

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN">
{% load staticfiles %}

<html lang="en">
  <head>
    {% load render_bundle from webpack_loader %}
    <script src="{% static 'django_eventstream/json2.js' %}"></script>
    <script src="{% static 'django_eventstream/eventsource.min.js' %}"></script>
    <script src="{% static 'django_eventstream/reconnecting-eventsource.js' %}"></script>
    <script>
      var logMessage = function (s) {
        console.log(s)
      };

      var start = function () {
        logMessage('connecting...');

        {% if last_id %}
          var es = new ReconnectingEventSource('{{ url|safe }}', {lastEventId: '{{ last_id }}'});
        {% else %}
          var es = new ReconnectingEventSource('{{ url|safe }}');
        {% endif %}

        es.onopen = function () {
          logMessage('connected');
        };

        es.onerror = function () {
          logMessage('connection error');
        };

        es.addEventListener('stream-reset', function (e) {
          e = JSON.parse(e.data);
          logMessage('stream reset: ' + JSON.stringify(e.channels));
        }, false);

        es.addEventListener('stream-error', function (e) {
          // hard stop
          es.close();
          e = JSON.parse(e.data);
          logMessage('stream error: ' + e.condition + ': ' + e.text);
        }, false);

        es.addEventListener('message', function (e) {
          logMessage('message: ' + e.data);
        }, false);
      };
    </script>

  </head>
  <body onload="start();">
    <div id= "Dashboard">
      {% render_bundle "IOT_Testbed_Dashboard" %}
      {% block content %}
      {% endblock %}
    </div>
  </body>
</html>

The issue is I only see two messages printed on my client side. It looks like after receiving those messages the script stops running, or something; I don't really know what to do (I'm a newbie at JavaScript/front-end programming).

EventSource constantly losing connection

I was trying out django-eventstream on a Raspberry Pi and installed pushpin and all its dependencies directly from the PPA.

Pushpin works if I use curl or pushpin-publish, but with the chat example, I cannot get it to work.
I just constantly get

*** connected
*** connection lost, reconnecting...

messages.

Django 1.11 support on python 3

The way that django-eventstream requires channels in setup.py prevents dependency resolution for django 1.11 on python 3. I am upgrading a project which uses pip-tools for dependency management and get the following output:

There are incompatible versions in the resolved dependencies:
  Django<2 (from -r requirements/requirements.in (line 33))
  Django>=2.2 (from channels==2.4.0->django-eventstream==3.1.0->-r requirements/requirements.in (line 34))

pip-tools does not currently have an option to override that. jazzband/pip-tools#561

The way this is defined effectively makes django-eventstream only compatible with django>=2.2 on python 3. Since pip doesn't have strict dependency resolution this probably goes unnoticed in most cases.

I would be happy to offer a PR, but I don't see a good solution without forcing users to explicitly install channels, which would likely break things for many users. It seems like this might be addressed with extras_require, but I am not seeing how to make it dependent on the Django version.

view_kwargs not passed to ChannelManager

I have defined a path as
path(r'recharge_room/dashboard/events/<int:pk>)', include(django_eventstream.urls), {'format-channels': ['room-{pk}']}), in my project's urls.py.
When trying to connect, I received a 200 response: {"condition": "bad-request", "text": "Invalid request: No channels specified."}. Thus, I started looking for the issue. I had the DefaultChannelManager print the view_kwargs it receives in get_channels_for_request, and they only include my pk.
Does someone know what I am doing wrong?
Any help would be greatly appreciated.

Setup instructions with Channels 3

Channels 3 is out and its migration guide warns us of a few breaking changes and deprecated classes, which the setup instructions here use.

Is this lib compatible with Channels 3?
Is the example code different?

About a minute's delay between send_event and pushpin logging the published event

I've recently set up pushpin as a GRIP proxy in front of my existing app configured as instructed.

Things seem to work well, except that events take a long time to be published. There is a delay of between 10 seconds and 2 minutes between the moment send_event returns and the moment pushpin logs the published event. After pushpin has logged the event, the subscribed client receives it almost immediately.

What could cause such a delay?

Connection Error when running examples

I followed the instructions given in the README for both the basic and the chat examples, but I continuously get "Connected" and "Connection Lost" messages. I am running on an Ubuntu machine.
