
Comments (15)

NachoAlesLopez commented on August 20, 2024

As @Tedezed has already mentioned, a solution to this issue is to create a channel for each node. This lets each node run its own job queue, so the nodes won't be executing the same jobs. The only problem with this solution, as far as I know, is that you can't restrict the execution of channels, only increase the capacity of the queue for each channel.

I have implemented a feature that ignores channels that are not declared in the configuration file. By adding ignore_unknown_channels = True, any channel that is not found in the "channels" attribute will be ignored during the notification phase of the channel manager. If everyone is OK with this, I can create a pull request with the feature, although I've only tested it with Odoo 10.
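As a rough sketch of the per-node setup (the channel names here are only illustrative, and the exact placement of the options should be checked against the fork), each node would declare only its own channel plus the new flag:

  • Node A configuration:
    channels = root.node_a:2
    ignore_unknown_channels = True

  • Node B configuration:
    channels = root.node_b:2
    ignore_unknown_channels = True

Jobs would then be enqueued on root.node_a or root.node_b depending on where they should run, and each node would simply skip the channel it does not declare.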


sbidoul commented on August 20, 2024

If the concept of node must be added, I'm not sure it should be linked to channels.

Perhaps it's better to make it a first-class concept and have jobs dispatched from the root channel to nodes, with the job runner knowing how many jobs are running on each node at any time.

It could also be seen as an extension of the definition of the capacity of the root channel. Instead of a single capacity value for the root channel, it could be a list of (node, capacity) tuples.
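Purely as an illustration of that idea (no such syntax exists today, this is just what the proposal could look like), the root channel configuration might become something along the lines of:

    channels = root:[(node-a, 3), (node-b, 3)]

with the job runner keeping a per-node count of running jobs and only dispatching to a node that still has free slots.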


NachoAlesLopez commented on August 20, 2024

If anyone is still interested in this: I've uploaded the change I talked about to my fork. The implementation is on branch 10.0, in this commit.


guewen commented on August 20, 2024

The first thing to avoid is executing the jobs on the same running Odoo instance as the one processing HTTP requests for users. Usually we spawn a second node dedicated to the jobs.

If that is not enough, you can configure on which host the jobs are executed (see #51), which should allow you to send the requests to a proxy doing, for instance, round robin between different Odoo nodes (disclaimer: never done that). There is no way to have different nodes per type of work, but if needed it could probably be done at the channel level (channel root.X -> node A, root.Y -> node B).
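If I recall correctly, that configuration lives in the jobrunner options, so pointing it at a proxy could look roughly like this (the hostname is made up and the option names should be double-checked against the queue_job jobrunner documentation):

    [queue_job]
    scheme = http
    host = odoo-jobs-proxy.example.internal
    port = 8069

The proxy would then spread the runner's "run job" HTTP requests across the Odoo nodes, for instance with round robin.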

I would be interested in some stats about your queue (number of jobs per day / per minute, peaks, average duration of a job, ...).


Tedezed commented on August 20, 2024

Hi @guewen

I am already using several Odoo nodes behind a load balancer, plus one node for the jobs. I will try to divide the jobs by channel, although I am worried that the queue is common to all nodes.


guewen commented on August 20, 2024

Interesting. What do you mean by "the queue is common"?


Tedezed commented on August 20, 2024

For example, take two channels, root1 and root2, each on a different node and each executing one job at a time.
My intuition is that it works in the following way:
[diagram of the jobs in the queue]
The node odoo-node-job-1 will start executing jobs, but until it reaches job03 the next node, odoo-node-job-2, will not start executing its own jobs, as if they shared a single process queue.


guewen commented on August 20, 2024

The queue is common, but if you set the capacity of the root channel equal to the sum of its children's capacities it should be OK. If we consider that we can define one node per channel, and that the node configured on a channel takes precedence over the node defined on its parent, it should work.

Example:

root: Main Node, 6 slots
root.sale: Node A, 3 slots
root.ecommerce: Node B, 3 slots

With this configuration, we should have 3 jobs executed on Node A and the same on Node B at the same time.

Now, it's true that if you have jobs in the root channel, they would use slots that would then be unavailable to Nodes A and B. But I guess it is only a matter of configuring the channels sensibly.

Example (and be sure that no job function is assigned to the "root" channel):

root: Main Node, 8 slots
root.misc: Main Node, 2 slots
root.sale: Node A, 3 slots
root.ecommerce: Node B, 3 slots
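Note that with the current implementation there is a single jobrunner, so all of these channels would in practice be declared in one channels option on the node running it; using the existing syntax that would be roughly:

    channels = root:8,root.misc:2,root.sale:3,root.ecommerce:3

The per-node assignment shown in the examples above is only what the proposed extension could look like.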


Tedezed commented on August 20, 2024

I think I'm doing something wrong

I configure two nodes with:

  • Node1:
    longpolling_port = 8072
    xmlrpc_port = 8069
    channels = root: 4

  • Node2:
    longpolling_port = 8099
    xmlrpc_port = 8088
    channels = root.sales: 2

I also create two jobs, each with a different channel, and what happens is that both jobs run on both nodes, even though the nodes have totally different ports.

Here is what I see: [screenshot]


guewen commented on August 20, 2024

You can't start the jobrunner on two different nodes. The concept we were discussing in the previous comments does not exist; sorry if I confused you.
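To make the current behaviour concrete, a minimal sketch (the exact option values should be adapted to your setup): only one node loads queue_job server-wide and therefore runs the jobrunner, and that node declares all the channels.

  • Job node (runs the jobrunner):
    server_wide_modules = web,queue_job
    channels = root:4,root.sales:2

  • Other nodes (HTTP only, no jobrunner):
    server_wide_modules = web

The other nodes can still enqueue jobs; they just never run the jobrunner themselves.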


chand3040 commented on August 20, 2024

So, looking at the conversation, it seems that the Odoo job queue is not horizontally scalable.
Is it possible to take this forward and request the feature in OCA? It would be one of the most helpful features for scaling.


Tedezed commented on August 20, 2024

@chand3040 At the moment you can scale Odoo, but not the nodes running jobs.

Both types of machines need to consult the same database.
For example, you would have Odoo scaling horizontally to handle client requests and a single node to process the jobs.


chand3040 commented on August 20, 2024

@Tedezed: Thanks for pointing that out, I understand what you are saying: keep one machine only for the queue, not serving any other HTTP/user requests. But since the database connection is the same for every machine, I think it would be more helpful if we allowed every machine to take care of the job queue, so that the work gets distributed and a single machine is not a single point of failure.


aliencrash commented on August 20, 2024

Hi,
I'm interested in this. Before seeing this post I was thinking more of a host-based job queue; this is for a project we are doing based on K8s. The suggestion here is to use channels, but what about a host-based solution?


github-actions commented on August 20, 2024

There hasn't been any activity on this issue in the past 6 months, so it has been marked as stale and it will be closed automatically if no further activity occurs in the next 30 days.
If you want this issue to never become stale, please ask a PSC member to apply the "no stale" label.

