
buoy's People

Contributors

lpgauth, rkallos, saleyn


buoy's Issues

stream response body of async_get

Hi Louis-Philippe,

buoy sounds really interesting.

Is it possible to stream the response body with buoy? Your SlideShare presentation suggests otherwise, but I couldn't find anything about it in the documentation.

Niko.

Some performance issue

Hello,

In our flow we send very large request bodies (we buffer a lot of events and send them in bulk via a single HTTP call to one of our services). Buoy is great, but we noticed at least a theoretical issue:

We are using the async interface, and the response message we receive includes the request as well. This means the huge request binaries are still referenced, so they remain in memory and are not released once they have been sent to the server.

Do you think it would be possible to add an option to not include the request in the response message?

Silviu
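Absent such an option, one possible workaround (a hypothetical sketch, not part of buoy's API) is to issue each bulk request from a short-lived process, so that the large binary and any response message referencing it become collectable when the process exits. It reuses the buoy:post/3 call shape shown elsewhere on this page:

```erlang
%% Hypothetical workaround sketch: send each bulk request from a
%% short-lived process. When that process exits, the large request
%% binary and any response message referencing it can be garbage
%% collected, regardless of whether the request is echoed back.
send_bulk(Url, Headers, LargeBody) ->
    Parent = self(),
    spawn(fun() ->
        Result = buoy:post(Url, Headers, LargeBody),
        %% forward only the result, not the request body
        Parent ! {bulk_result, Result}
    end).
```

The trade-off is an extra process and message per bulk send, which is usually negligible next to the body itself.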

Backlog is not initialized

I can't figure out why this is happening, but there are no crash logs or any errors. This works locally but not on a production machine. Here is a log of what I run:

([email protected])4> Url2 = buoy_utils:parse_url(<<"http://localhost:7949/recommend">>).
{buoy_url,<<"localhost:7949">>,<<"localhost">>,
          <<"/recommend">>,7949,http}
([email protected])5> buoy_pool:start(Url2, [{pool_size, 16}]).
ok
([email protected])6> buoy:post(Url2, [{<<"content-type">>, <<"application/json">>}], <<"{}">>).
** exception error: bad argument
     in function  ets:update_counter/3
        called as ets:update_counter(shackle_backlog,http_localhost_7949_13,
                                     [{2,0},{2,1,1024,1024}])
     in call from shackle_backlog:check/2 (/opt/bidbox/_build/default/lib/shackle/src/shackle_backlog.erl, line 73)
     in call from shackle_pool:server/1 (/opt/bidbox/_build/default/lib/shackle/src/shackle_pool.erl, line 96)
     in call from shackle:call/3 (/opt/bidbox/_build/default/lib/shackle/src/shackle.erl, line 52)

Some more info:

([email protected])9> ets:tab2list(shackle_backlog).
[]
([email protected])10> ets:tab2list(shackle_pool).
[{http_localhost_7949,{pool_options,1024,buoy_client,16,
                                    random}}]
([email protected])11> ets:tab2list(buoy_pool).
[{{http,<<"localhost">>,7949},http_localhost_7949}]

and some system info:

([email protected])13> erlang:system_info(system_version).
"Erlang/OTP 19 [erts-8.3] [source] [64-bit] [smp:16:16] [ds:16:16:10] [async-threads:10] [hipe] [kernel-poll:true]\n"

Concurrent requests

This might be a silly question, but what does "concurrent requests per connection" mean? I thought HTTP/1.1 allowed only one request at a time over a keep-alive connection?

Possible bug

Hello, I was wondering whether buoy_client line 64 should not be:

{ok, [{request_id(RequestsIn), Response}], State#state

instead of:

{ok, [{RequestsIn, Response}], State#state

Otherwise I cannot see how the RequestId from handle_request can correctly match the response.

Silviu

Connection notifications

Hello,

It would be nice to have, at the pool level, some sort of notification sent when there is no connection, or when at least one connection comes up, e.g. connection_down and connection_up.

What I'm trying to fix:

When the server is stopped and there are lots of pending requests (which I retry until they succeed), I could watch the connectivity status and hold them back until there is at least one buoy process connected.

This way I can improve the behavior of the app when the server I'm sending the requests to is down.

A similar feature is available in the gun HTTP client as well.

Silviu

What's the purpose of "pool"?

Could you please explain why I have to do:

1> Url = buoy_utils:parse_url(<<"http://example.com">>).
{buoy_url,<<"example.com">>,<<"example.com">>,<<"/">>,80, http}
2> ok = buoy_pool:start(Url, [{pool_size, 1}]).
3> {ok, Resp} = buoy:get(Url, #{timeout => 500}).

rather than simply:

{ok, Resp} = buoy:get(<<"http://example.com">>, #{timeout => 500}).

Why should I parse the URL first, and what is the purpose of creating a pool?

I have ~5000 different URLs pointing at the same server (with different paths).
I would like to stress test the server with this URL list.

What's the best way to use buoy for this use-case?

Thanks!
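On the ~5000-URL question, the pool appears to be keyed by scheme, host, and port, so a single pool can serve every path on the same server. A hedged sketch, assuming the buoy_utils:parse_url/1 and buoy:get/2 calls shown above and a hypothetical pool_size of 64:

```erlang
%% Sketch: parse every URL, start one pool (they all share the same
%% scheme/host/port, so the first URL's pool covers the rest), then
%% issue the requests. pool_size 64 is an arbitrary example value.
stress(Urls) ->
    BuoyUrls = [buoy_utils:parse_url(U) || U <- Urls],
    ok = buoy_pool:start(hd(BuoyUrls), [{pool_size, 64}]),
    [buoy:get(BuoyUrl, #{timeout => 500}) || BuoyUrl <- BuoyUrls].
```

For a real stress test you would likely fan the gets out across processes rather than run them sequentially in one list comprehension.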

How to use this?

Hey, I see the version on hex is quite old. When I try to add it like this:

      {:buoy, "~> 0.1", git: "https://github.com/lpgauth/buoy.git"},

I get this error:

Dependencies have diverged:
* metal (https://github.com/lpgauth/metal.git)
  different specs were given for the metal app:

  > In deps/shackle/rebar.config:
    {:metal, ~r/.*/, [env: :prod, override: true, git: "https://github.com/lpgauth/metal.git", tag: "0.1.1", manager: :rebar3]}

  > In deps/foil/rebar.config:
    {:metal, "0.1.1", [env: :prod, repo: "hexpm", hex: "metal", override: true, manager: :rebar3]}

  Ensure they match or specify one of the above in your deps and set "override: true"
** (Mix) Can't continue due to errors on dependencies

How might I fix this?
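One approach worth trying, following the hint at the end of the error message itself rather than any confirmed fix: pin metal in your own deps with override: true so both transitive specs defer to it.

```elixir
# mix.exs — following the error's own suggestion: choose one of the
# two conflicting metal specs and declare it at the top level with
# override: true so shackle's and foil's specs both defer to it.
defp deps do
  [
    {:buoy, "~> 0.1", git: "https://github.com/lpgauth/buoy.git"},
    {:metal, "0.1.1", override: true}
  ]
end
```

After editing, `mix deps.get` should resolve metal to the single pinned version.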
