surf's Introduction

Surf

Surf the web

Built with 🌊 by The http-rs team

Surf the web - HTTP client framework

Surf is a Rust HTTP client built for ease-of-use and multi-HTTP-backend flexibility. Whether it's a quick script, or a cross-platform SDK, Surf will make it work.

  • Extensible through a powerful middleware system
  • Multiple HTTP back-ends to choose from
  • Reuses connections through a configurable Client interface
  • Fully streaming requests and responses
  • TLS enabled by default (native-tls or rustls)
  • Built on async-std (with optional tokio support)

Examples

let mut res = surf::get("https://httpbin.org/get").await?;
dbg!(res.body_string().await?);

It's also possible to skip the intermediate Response, and access the response type directly.

dbg!(surf::get("https://httpbin.org/get").recv_string().await?);

Sending and receiving JSON is easy too.

#[derive(Deserialize, Serialize)]
struct Ip {
    ip: String
}

let uri = "https://httpbin.org/post";
let data = &Ip { ip: "129.0.0.1".into() };
let res = surf::post(uri).body_json(data)?.await?;
assert_eq!(res.status(), 200);

let uri = "https://api.ipify.org?format=json";
let Ip { ip } = surf::get(uri).recv_json().await?;
assert!(ip.len() > 10);

And even creating streaming proxies is no trouble at all.

let req = surf::get("https://img.fyi/q6YvNqP").await?;
let body = surf::http::Body::from_reader(req, None);
let res = surf::post("https://box.rs/upload").body(body).await?;

Setting configuration on a client is also straightforward.

use std::convert::TryInto;
use std::time::Duration;
use surf::{Client, Config};
use surf::Url;

let client: Client = Config::new()
    .set_base_url(Url::parse("http://example.org")?)
    .set_timeout(Some(Duration::from_secs(5)))
    .try_into()?;

let mut res = client.get("/").await?;
println!("{}", res.body_string().await?);

Features

The following features are available. The default features are curl-client, middleware-logger, and encoding. An example of selecting a different backend follows the list.

  • curl-client (default): use curl (through isahc) as the HTTP backend.
  • h1-client: use async-h1 as the HTTP backend with native TLS for HTTPS.
  • h1-client-rustls: use async-h1 as the HTTP backend with rustls for HTTPS.
  • hyper-client: use hyper (hyper.rs) as the HTTP backend.
  • wasm-client: use window.fetch as the HTTP backend.
  • middleware-logger (default): enables logging requests and responses using a middleware.
  • encoding (default): enables support for body encodings other than utf-8.
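
To pick a non-default backend, disable the default features and enable the one you want in Cargo.toml. A sketch (adjust the version to the one you actually depend on):

[dependencies]
surf = { version = "2", default-features = false, features = ["h1-client-rustls"] }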

Installation

Install OpenSSL -

  • Ubuntu - sudo apt install libssl-dev
  • Fedora - sudo dnf install openssl-devel

Make sure your Rust toolchain is up to date: rustup update

With cargo add installed:

$ cargo add surf

Safety

This crate makes use of a single instance of unsafe in order to make the WASM backend work despite the Send bounds. This is safe because WASM targets currently have no access to threads. Once they do, we'll be able to drop this implementation, use a parked thread instead, and move to full multi-threading in the process.

Contributing

Want to join us? Check out our "Contributing" guide and take a look at some of the open issues.

See Also

Thanks

Special thanks to prasannavl for donating the crate name, and to sagebind for creating an easy-to-use async curl client that saved us countless hours.

License

MIT OR Apache-2.0

surf's People

Contributors

azanellato, brightly-salty, cjpearce, dignifiedquire, fairingrey, fishrock123, goto-bus-stop, hipstermojo, icewind1991, jbr, jkelleyrtp, joshtriplett, k-nasa, killercup, licenser, link2xt, misuzu, mustafapc19, philiptrauner, rossdylan, rtyler, shuhei, taiki-e, tako8ki, tirr-c, tmcw, utsavm9, yoshuawuyts, zkat, zyctree


surf's Issues

Support persistent connections

When working with HTTP, clients can send a Keep-Alive header and issue multiple HTTP requests over a single TCP connection. If Surf is going to be used in highly concurrent environments, it's pretty important that it supports this for performance reasons, at least on some backends. See the sketch below.
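
As a point of reference, connection reuse in Surf goes through the Client interface mentioned in the README above. A sketch of issuing several requests through one shared client (assuming a surf version where awaiting client.get(...) yields a Response, as in the README examples):

let client = surf::Client::new();

for path in ["/a", "/b", "/c"] {
    // Reusing one Client lets the backend keep the underlying connection alive
    // between requests, instead of building a fresh client for every surf::get() call.
    let mut res = client.get(format!("https://example.org{}", path)).await?;
    println!("{} -> {}", path, res.status());
}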

http-client interface

We should have an abstract interface so we're able to swap out the request backends.

Basic version

This is probably the first interface we should aim for. Request goes in, response goes out.

trait HttpClient {
    type Error;
    fn request(req: Request) -> BoxFuture<'static, Result<Response, Self::Error>>;
}

With reusable connections

I'm less sure about this one. We'll cross that bridge when we get there, but the idea is to be able to pass a reusable connection into the client to make a connection before proceeding. Useful for connection pooling.

trait HttpClient {
    type RequestError;
    type ConnectionError;
    type Connection;

    fn connect(&self) -> BoxFuture<'static, Result<Self::Connection, Self::ConnectionError>>;
    fn request(conn: Self::Connection, req: Request) -> BoxFuture<'static, Result<Response, Self::RequestError>>;
}

Example: GitHub SDK

I was talking to @davidbarsky today, and something that would be great to document is how to build an SDK fronted by some sort of login function. This should probably become a proper example (perhaps for GitHub or something?) but here's an initial sketch:

Usage

// initialize the sdk. If no local credentials are stored, call the callback and get a username + password out
let sdk = Sdk::connect(async || {
    let username = dialoguer::Input::new("Username").interact()?;
    let password = dialoguer::PasswordInput::new("Password").interact()?;
    Ok((username, password))
}).await?;

// If we were able to get the sdk setup, we can now interact with the API!
sdk.upload("/tmp/chashu.png").await?;

Implementation

struct Sdk {
    client: surf::Client,
    api_token: String,
}

#[derive(serde::Deserialize, serde::Serialize)]
struct Credentials {
    api_token: String
}

impl Sdk {
    /// 
    /// # Example
    /// ```
    /// let sdk = Sdk::connect(async || {
    ///     let username = Input::new("Username").interact()?;
    ///     let password = PasswordInput::new("Password").interact()?;
    ///     Ok((username, password))
    /// }).await?;
    ///
    /// sdk.upload("/tmp/chashu.png").await?;
    /// ```
    pub async fn login(login_fn: impl Fn() -> (String, String)) -> Result<Self, LoginError> {
        // Try to get the credentials locally
        let creds: Result<Credentials, LoginError> = try {
            let file = directories::BaseDirs::new().unwrap().data_dir().join("my_sdk/store.json");
            let string = async_std::fs::read_to_string(file).await?;
            let creds: Credentials = serde_json::from_str(&string)?;
            creds
        };

        // Either we had the credentials locally, or we need to log in via the callback
        match creds {
            Ok(creds) => {
                // TODO: save api_token locally to disk
                Ok(Self {
                    api_token: creds.api_token,
                    client: surf::Client::new(),
                })
            }
            Err(_) => {
                let (username, password) = login_fn();
                let sdk = Self::login_with_creds(username, password).await?;
                // TODO: save sdk.api_token locally to disk
                Ok(sdk)
            }
        }
    }

    /// method to login with credentials. Usually called from the callback in 
    async fn login_with_creds(username: String, password: String) -> Result<Self, LoginError> {
        #[derive(serde::Serialize)]
        struct Request {
            username: String,
            password: String,
        }

        #[derive(serde::Deserialize)]
        struct Response {
            token: String,
        }

        let client = surf::Client::new();

        let creds: Response = client.post("my-app.com/api/login")
            .json(Request { username, password })
            .recv_json()
            .await?;

        Ok(Self {
            client,
            api_token: creds.token
        })
    }

    async fn store_creds(creds: Credentials) -> Result<(), Error> {
        let file = directories::BaseDirs::new().unwrap().data_dir().join("my_sdk/store.json");
        mkdirp(file.parent().unwrap())?;
        let buf = serde_json::to_vec(&creds)?;
        async_std::fs::write(&file, &buf).await?;
        Ok(())
    }

    pub async fn upload(&self, file: impl AsRef<Path>) -> Result<(), Error> {
        // upload file to some api using the token and http client
    }
}

surf::Exception does not implement std::error::Error

This is an unfriendly choice for errors returned by a library because it makes them not work with ?.

async fn repro() -> Result<(), failure::Error> {
    let _ = surf::get("https://www.rust-lang.org").await?; // doesn't work
    Ok(())
}
async fn repro() -> anyhow::Result<()> {
    let _ = surf::get("https://www.rust-lang.org").await?; // doesn't work
    Ok(())
}

Application-focused error types like failure::Error are built on impl<T: std::error::Error> From<T> which is why it matters that library errors implement std::error::Error.
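
Until that changes, a workaround sketch is to convert explicitly instead of relying on ?. This assumes surf::Exception is a boxed standard error (Box<dyn std::error::Error + Send + Sync>), as it was in surf 1.x:

async fn repro() -> Result<(), failure::Error> {
    // failure::Error::from_boxed_compat accepts a boxed std error,
    // so map the surf error through it instead of using `?` directly.
    let _ = surf::get("https://www.rust-lang.org")
        .await
        .map_err(failure::Error::from_boxed_compat)?;
    Ok(())
}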

curl client does not receive the full body

It seems like the curl-based client does not wait until all the data is received, which can make deserialization fail when the response is too large.

The code is roughly this, but this is not the actual URL, as it's a private GitLab instance:

    let response: Vec<Project> = surf::get("http://gitlab.com/api/v4/projects")
        .recv_json()
        .await
        .unwrap();

The response suddenly ends like this:

..."star_count":100,"forks_count":2,"last_activity_at":"2019-07-02T08:41:28.976Z","namespace":{"id":999,"name":"r

Querying the same URL with the curl command line tool shows the whole body.

Surf get and post requests do not work

When I tried to make a GET request with surf on https://static.crates.io/crates/{crate}/{crate}-{version}.crate, for example, I get CloudFront responding 501 Not Yet Implemented, "The request could not be satisfied". CloudFront is not the only website that doesn't like surf's GET requests.

I also tried to make surf POST requests; those only worked with the curl-client, as the hyper-client does not send the body to the server. There is already an open issue about this: #46.

So I ended up doing a crazy thing that worked: I declared two dependencies on the same package and used the one with curl for the POST requests and the one with hyper for the GET requests.

[dependencies.surf]
git = "https://github.com/badboy/surf.git"
branch = "hyper-client"
default-features = false
features = ["hyper-client"]

[dependencies.surf-curl]
package = "surf"
git = "https://github.com/rustasync/surf.git"
branch = "master"

Get and Post raw requests

Get requests with the curl feature (does not work)

A typical get request with surf:

GET /crates/pcsc/pcsc-2.2.0.crate HTTP/1.1
Host: 127.0.0.1:8000
Accept: */*
Accept-Encoding: deflate, gzip
user-agent: curl/7.54.0 isahc/0.7.3
transfer-encoding: chunked
Expect: 100-continue

0

And here is one working with the curl command line:

GET /crates/pcsc/pcsc-2.2.0.crate HTTP/1.1
Host: 127.0.0.1:8000
User-Agent: curl/7.54.0
Accept: */*

Post requests with the hyper feature (does not work)

Here is one post request with surf:

POST /indexes/bf16b15f/documents HTTP/1.1
x-meili-api-key: ckxGd3f97DXVKAhfkZ2mBTSc
content-type: application/json
host: 127.0.0.1:8000
transfer-encoding: chunked

0

And here is one using the curl command:

POST /indexes/bf16b15f/documents HTTP/1.1
Host: 127.0.0.1:8000
User-Agent: curl/7.54.0
Accept: */*
Content-Length: 7
Content-Type: application/x-www-form-urlencoded

"Hello"

InvalidData errors are unhelpful

When deserializing JSON fails, a std::io::ErrorKind::InvalidData error is returned. While this is technically accurate, it isn't very useful. serde provides very helpful error messages, so it would be nice if those were available somehow.

surf::Client's trait bounds make it hard to pass into other types

Last time I injected surf::Client I had to add two generic types and copy-paste trait bounds verbatim to both struct and impl block declarations:

struct MyStruct<C, E> / impl<C, E> for MyStruct<C, E>
where
    C: surf::middleware::HttpClient<Error = E>,
    E: std::error::Error + Send + Sync + 'static,
...

Related to #54

Something unwraps "Too many open files" errors

Bug Report

Your Environment

Software           Version(s)
surf               master
rustc              1.38.0-nightly (534b42394 2019-08-09)
Operating System   Linux
Client Type        chttp

Expected Behavior

All connection failures should be reported to the user to handle.

Current Behavior

thread 'async-task-driver' panicked at 'client failed to initialize: Io(Os { code: 24, kind: Other, message: "Too many open files" })', src/libcore/result.rs:1084:5

Code Sample

https://github.com/async-std/async-crawler/blob/master/src/main.rs#L52

Set crawl depth to 3 on a machine with a 1024 open files limit.

What's the type of `surf::Client::new()`?

Hello, I'm trying to create a client instance and pass it to functions for reuse:

let client = surf::Client::new();
get_item(client, url);

But what type should I use for the function parameter?

async fn get_item(client: ???, url: String) -> ...
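
For newer surf releases, where Client is no longer generic over its backend (as in the Config example in the README above), the client can simply be passed by reference. A sketch:

async fn get_item(client: &surf::Client, url: String) -> surf::Result<String> {
    // The shared client is borrowed; each call builds and sends a new request.
    client.get(url).recv_string().await
}

For surf 1.x, where Client is generic over the backend, the signature needs the trait bounds quoted in the trait-bounds issue above instead.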

How can I get the status code when using Request?

Hello folks!
I have a function that looks like this:

   pub async fn get<T: DeserializeOwned>(&self, request_builder: &RequestBuilder) -> Vec<T> {
       let request = request_builder.build();
       let url = request.url().clone().into_string();
       let response_str = request.recv_string().await.unwrap();

       serde_json::from_str::<Vec<T>>(&response_str).unwrap()
   }

I've been looking around the surf API surface and trying different methods like recv_string or recv_json, but if I do not use surf::Client.get I can't receive a Response object to check the status code and retrieve the body from there.

I need to use Request because I am using a RequestBuilder, and the idea is to do something like request.send and receive a Result with the response/error. Is this possible?

Thanks!
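
As a reference sketch for a newer surf, where Client::send and Response::body_json are available: sending the built Request yields a Response, which exposes both the status code and the body.

async fn get_items<T: serde::de::DeserializeOwned>(
    client: &surf::Client,
    request: surf::Request,
) -> surf::Result<Vec<T>> {
    // Sending the request returns a Response; status and body are both available on it.
    let mut res = client.send(request).await?;
    let status = res.status();
    let items: Vec<T> = res.body_json().await?;
    println!("{} -> {} items", status, items.len());
    Ok(items)
}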

How to get the request body

Hello guys. I am checking the source code of surf::Request and I am trying to find a way to inspect the body bytes / body string without moving ownership, because I need to send some signed headers along with the content and then send the same request using the HTTP client.

The method I'm trying to do would be something like this:

 pub(crate) async fn sign(request: &surf::Request<impl HttpClient>) {

        let host = request.url();
        let verb = request.method()
                        .to_string()
                        .to_uppercase();

      //Need to read request without moving

   }

I see there are a bunch of methods to set the body, but I can't find any to read the bytes that are already inside the body. Is there any way to do this?

Thanks!

querystring encoding

There should be a way to use serde to encode querystrings after setting the url. e.g. using serde-urlencoded:

#[derive(Serialize)]
struct Params { name: String }

let params = Params { name: "nori".to_string() };
let res = surf::get("https://my-url.com")
    .query(params)
    .await?;
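
Until such an API exists, one workaround sketch is to encode the query with serde-urlencoded by hand and append it to the URL:

#[derive(serde::Serialize)]
struct Params { name: String }

let params = Params { name: "nori".to_string() };
// Encode the struct as a query string and splice it into the URL manually.
let qs = serde_urlencoded::to_string(&params)?;
let res = surf::get(format!("https://my-url.com/?{}", qs)).await?;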

How to set request timeout?

Didn't find anything on timeouts in docs or sources. Is there a way to get an error after waiting for the response after the given duration?
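
Newer surf versions expose a per-client timeout through Config, as shown in the README above; on versions without it, the request future can be wrapped externally. Both sketches:

use std::convert::TryInto;
use std::time::Duration;

// Newer surf: configure the timeout on the client itself.
let client: surf::Client = surf::Config::new()
    .set_timeout(Some(Duration::from_secs(5)))
    .try_into()?;

// Older surf: wrap the request future in async-std's timeout.
let res = async_std::future::timeout(
    Duration::from_secs(5),
    surf::get("https://httpbin.org/delay/10"),
)
.await??; // outer ? = timed out, inner ? = HTTP error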

Making the `Client` to use dynamic dispatch

Instead of passing the internal implementation in Client<T: HttpClient>, why not just hide it behind dynamic dispatch and a smart pointer? In the end, the performance hit would not be that bad compared to the time taken by the network traffic itself.

So, instead of

#[derive(Debug, Default)]
pub struct Client<C: HttpClient> {
    client: C,
}

we'd have

#[derive(Debug, Default)]
pub struct Client {
    client: Box<dyn HttpClient>,
}

Would be nice in some of our cases, where we need to keep the Client in a struct and we might need to switch the underlying implementation, e.g. in cases where we build for wasm. With dynamic dispatch, switching the implementation wouldn't change all the type signatures.

Difficult to use closure as middleware

You can't trivially set a closure as a middleware on a request even though closures implement Middleware; the obvious doesn't work by itself:

surf::post(...)
    .middleware(|request, client, next| { // error on `next`: type annotation needed
        // I want to log the full request, the logger middleware doesn't print much
        trace!("request {:?}", request);
        next.run(request, client)
    })
...

However, because the concrete HttpClient types are not importable, there's no way to give a type for Next that satisfies the compiler:

    .middleware(|request, client, next: Next<'_, /* what the heck do I put here? `impl HttpClient` isn't allowed */> | { .. })

It's not too difficult to write a function that allows you to omit the type annotation, but it's not the greatest experience:

fn apply_middleware_fn<C: HttpClient, F>(request: surf::Request<C>, middleware: F) -> surf::Request<C>
    where F: Send + Sync + 'static + for<'a> Fn(Request, C, Next<'a, C>) -> BoxFuture<'a, Result<Response, Exception>> 
{
    request.middleware(middleware)
}

apply_middleware_fn(request, |request, client, next| { ... });

This could be fixed pretty simply by providing this wrapper as a method on Request:

impl<C: HttpClient> Request<C> {

    pub fn middleware_fn<F>(self, middleware_fn: F) -> Self
        where F: Send + Sync + 'static + for<'a> Fn(Request, C, Next<'a, C>) -> BoxFuture<'a, Result<Response, Exception>>
    {
        self.middleware(middleware_fn)
    }
}

This should allow the above usage to work as-is without naming types in the closure.

hyper-client does not actually build

Using

surf = { version = "1.0.1", features = ["hyper-client"], default-features = false }

I get the following compile error:

   Compiling surf v1.0.1
error[E0432]: unresolved import `hyper_tls`
 --> /home/developer/.cargo/registry/src/github.com-1ecc6299db9ec823/surf-1.0.1/src/http_client/hyper.rs:9:5
  |
9 | use hyper_tls::HttpsConnector;
  |     ^^^^^^^^^ use of undeclared type or module `hyper_tls`

error[E0432]: unresolved import `native_tls`
  --> /home/developer/.cargo/registry/src/github.com-1ecc6299db9ec823/surf-1.0.1/src/http_client/hyper.rs:10:5
   |
10 | use native_tls::TlsConnector;
   |     ^^^^^^^^^^ use of undeclared type or module `native_tls`

error[E0034]: multiple applicable items in scope
  --> /home/developer/.cargo/registry/src/github.com-1ecc6299db9ec823/surf-1.0.1/src/http_client/hyper.rs:76:18
   |
76 |                 .map(|chunk| chunk.map(|chunk| chunk.to_vec()))
   |                  ^^^ multiple `map` found
   |
   = note: candidate #1 is defined in an impl of the trait `futures_util::future::FutureExt` for the type `_`
   = help: to disambiguate the method call, write `futures_util::future::FutureExt::map(Compat01As03::new(body), |chunk| chunk.map(|chunk| chunk.to_vec()))` instead
   = note: candidate #2 is defined in an impl of the trait `futures_util::stream::StreamExt` for the type `_`
   = help: to disambiguate the method call, write `futures_util::stream::StreamExt::map(Compat01As03::new(body), |chunk| chunk.map(|chunk| chunk.to_vec()))` instead
note: candidate #3 is defined in the trait `std::iter::Iterator`
   = help: to disambiguate the method call, write `std::iter::Iterator::map(Compat01As03::new(body), |chunk| chunk.map(|chunk| chunk.to_vec()))` instead

error: aborting due to 3 previous errors

which makes sense considering both of the TLS crates are marked as optional, but are never activated in surf's Cargo.toml.

update to futures 0.3

Now that async/await is stable and futures 0.3 and async-std 1.0 are released, it would be nice to be able to use surf with all these new stable versions of those projects.

`headers` submodule

We should add a headers submodule with common header types, and a new HeadersIter iterator to replace the visitor pattern we currently have in Response.headers.

chttp backend

Add a chttp backend. This allows us to interface with Curl asynchronously, and should shave about 5mb off our resulting binaries (+ should speed up compilation significantly.)

If we can get this to build reliably we may even want to consider enabling this by default. Our builds take looong right now, and I'm generally not confident in the quality of our HTTP layer. Using curl seems like it would be more in line with how Rust also defaults to the system allocator: easy to swap out, but uses the layer provided by the system by default.

Provide mock `HttpClient`

This would make testing anything using a REST API simpler as one could just mock out the responses.

yup_hyper_mock lets you create a mock Connector implementation that responds to various hostnames:

mock_connector!(MockRedirectPolicy {
    "http://127.0.0.1" =>       "HTTP/1.1 301 Redirect\r\n\
                                 Location: http://127.0.0.2\r\n\
                                 Server: mock1\r\n\
                                 \r\n\
                                "
    "http://127.0.0.2" =>       "HTTP/1.1 302 Found\r\n\
                                 Location: https://127.0.0.3\r\n\
                                 Server: mock2\r\n\
                                 \r\n\
                                "
    "https://127.0.0.3" =>      "HTTP/1.1 200 OK\r\n\
                                 Server: mock3\r\n\
                                 \r\n\
                                "
});

Since we have access to the full request in HttpClient we have a lot more flexibility with matching requests; ideally it should be able to match on paths and body contents.
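
A self-contained sketch of the idea against the basic request-in/response-out interface proposed earlier on this page; the Request/Response types here are placeholders for illustration, not surf's actual types:

use std::collections::HashMap;
use std::future::Future;
use std::pin::Pin;

type BoxFuture<'a, T> = Pin<Box<dyn Future<Output = T> + Send + 'a>>;

// Placeholder types standing in for the real request/response types.
#[derive(Clone)]
struct Request { url: String }
#[derive(Clone, Debug)]
struct Response { status: u16, body: String }

trait HttpClient {
    type Error;
    fn request(&self, req: Request) -> BoxFuture<'static, Result<Response, Self::Error>>;
}

// A mock backend: match on the request URL and return a canned response.
struct MockClient { responses: HashMap<String, Response> }

impl HttpClient for MockClient {
    type Error = String;

    fn request(&self, req: Request) -> BoxFuture<'static, Result<Response, Self::Error>> {
        let res = self
            .responses
            .get(&req.url)
            .cloned()
            .ok_or_else(|| format!("no mock registered for {}", req.url));
        Box::pin(async move { res })
    }
}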

How to read response as a continuous stream?

I'm trying to make a GET request where the response is a (slow) continuous stream. However, reading the response results in a core dump.

async fn connect_to_server(&mut self) -> Result<()> {
    let mut reader = surf::get("http://...[server-address]").await?;

    println!("{:?}", reader.status());

    let mut buf = [0_u8; 1024];

    loop {
        let n = reader.read(&mut buf).await?;
        println!("Read data {:?} {}", &buf[..16], n);
    }

    Ok(())
}

stdout says:

200
Read data [13, 13, 23, ...] 12
Read data [15, 99, 121, ...] 887
Read data [15, 99, 121, ...] 0
thread 'async-task-driver' panicked at 'Receiver::next_message called after `None`', src/libcore/option.rs:1166:5
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace.
Aborted (core dumped)

Any ideas how to solve this? Here's the relevant part of the backtrace:

  11: core::option::Option<T>::expect
             at /rustc/4cf7673076e6975532213e494dd3f7f9d8c2328e/src/libcore/option.rs:345
  12: futures_channel::mpsc::Receiver<T>::next_message
             at /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-channel-preview-0.3.0-alpha.17/src/mpsc/mod.rs:842
  13: <futures_channel::mpsc::Receiver<T> as futures_core::stream::Stream>::poll_next
             at /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-channel-preview-0.3.0-alpha.17/src/mpsc/mod.rs:912
  14: <sluice::pipe::chunked::Reader as futures_io::if_std::AsyncRead>::poll_read
             at /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/sluice-0.4.1/src/pipe/chunked.rs:83
  15: <sluice::pipe::PipeReader as futures_io::if_std::AsyncRead>::poll_read
             at /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/sluice-0.4.1/src/pipe/mod.rs:43
  16: <chttp::body::Body as futures_io::if_std::AsyncRead>::poll_read
             at /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/chttp-0.5.5/src/body.rs:157
  17: <alloc::boxed::Box<T> as futures_io::if_std::AsyncRead>::poll_read
             at /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-io-preview-0.3.0-alpha.17/src/lib.rs:352
  18: <surf::http_client::Body as futures_io::if_std::AsyncRead>::poll_read
             at /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/surf-1.0.1/src/http_client/mod.rs:83
  19: <&mut T as futures_io::if_std::AsyncRead>::poll_read
             at /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-io-preview-0.3.0-alpha.17/src/lib.rs:352
  20: <surf::response::Response as futures_io::if_std::AsyncRead>::poll_read
             at /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/surf-1.0.1/src/response.rs:246
  21: <&mut T as futures_io::if_std::AsyncRead>::poll_read
             at /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-io-preview-0.3.0-alpha.17/src/lib.rs:352

Add Request.body_file method

Once runtime has a File struct we should have a req.body_file(path) method that streams out the file and uses mime_guess to auto-set the mime type. Would make it really easy to send files over the wire ✨

Make `Next` cloneable?

Sometimes middleware needs to make requests multiple times e.g. retry middleware. However, Next::run consumes self, so we can make only one actual request per call.

Next has immutable references only, so it's possible to make it derive Clone/Copy. Would it be okay to make it cloneable/copyable?

Synthetic Responses

While it's possible to craft new Requests with Request::new(), it's not currently possible to synthesize a new Response that can be returned from middlewares.

Being able to do this is important for my caching middleware, because Responses need to be looked up in the cache and returned. In this case, I'm not even running next.run() when a cache hit is verified.

Cannot do anything without `native-client` feature

Since the clients are all pub(crate) and the one_off module depends on the native-client feature, we can't do anything without the native-client feature. We should make the other clients public or have the one_off module able to use them.

Failed to compile hello-world example (AsyncRead is not implemented for std::io::Empty)

I tried to compile the examples/hello-world.rs example on Rust 1.39 beta and got an error while compiling surf:

error[E0277]: the trait bound `std::io::Empty: futures_io::if_std::AsyncRead` is not satisfied
  --> /home/akviatko/.cargo/registry/src/github.com-1ecc6299db9ec823/surf-1.0.2/src/http_client/mod.rs:64:21
   |
64 |             reader: Box::new(std::io::empty()),
   |                     ^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `futures_io::if_std::AsyncRead` is not implemented for `std::io::Empty`
   |
   = note: required for the cast to the object type `dyn futures_io::if_std::AsyncRead + std::marker::Send + std::marker::Unpin`

error: aborting due to previous error

For more information about this error, try `rustc --explain E0277`.
error: could not compile `surf`.

To learn more, run the command again with --verbose.
  • rust toolchain: beta-x86_64-unknown-linux-gnu
  • rustc 1.39.0-beta.2 (5752b6348 2019-09-27)

I also created a repo with an example project that does not compile. It has a Cargo.lock, which might help to investigate the bug.

Cannot convert Exception to failure::Error

async fn main() -> failure::Fallible<()> {
    surf::Client::new()
        .get("https://google.com")
        .recv_json::<()>()
        .await?;

    Ok(())
}

yields

the size for values of type `dyn std::error::Error + std::marker::Send + std::marker::Sync` cannot be known at compilation time

query string does not get sent

Hi!

I am using surf and the set_query() function does not seem to actually set the query string. I have checked the tests and, apart from checking the url().query() result (which works fine), I couldn't find any real examples.

Expected

https://httpbin.org/get?a=1535&b=https%3A%2F%2Fblog.x5ff.xyz

{
  "args": {
    "a": "1535", 
    "b": "https://blog.x5ff.xyz"
  }, 
  "headers": {
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8", 
    "Accept-Encoding": "gzip, deflate, br", 
    "Accept-Language": "en-US,en;q=0.9,de;q=0.8", 
    "Cookie": "freeform=", 
    "Dnt": "1", 
    "Host": "httpbin.org", 
    "Upgrade-Insecure-Requests": "1", 
    "User-Agent": "Mozilla/5.0 (X11; Fedora; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36"
  }, 
  "origin": "your IP", 
  "url": "https://httpbin.org/get?a=1535&b=https%3A%2F%2Fblog.x5ff.xyz"
}

Actual

Output of https://github.com/celaus/surf-query-string

Query string: Some("a=1535&b=https%3A%2F%2Fblog.x5ff.xyz")
Response: {
  "args": {}, 
  "headers": {
    "Accept": "*/*", 
    "Accept-Encoding": "deflate, gzip", 
    "Content-Length": "0", 
    "Host": "httpbin.org", 
    "User-Agent": "curl/7.65.1-DEV chttp/0.5.5"
  }, 
  "origin": "your IP", 
  "url": "https://httpbin.org/get"
}

active toolchain
----------------

nightly-x86_64-unknown-linux-gnu (default)
rustc 1.39.0-nightly (4cf767307 2019-08-18)

(on Fedora, if that matters)

Error while compiling: futures_io::if_std::AsyncRead is not implemented for std::io::Empty

Hi! I was poking around the Rust async ecosystem out of curiosity, and when I tried to install surf in an empty project I got this error:

   Compiling surf v1.0.2
error[E0277]: the trait bound `std::io::Empty: futures_io::if_std::AsyncRead` is not satisfied
  --> /Users/cdickinson/.cargo/registry/src/github.com-1ecc6299db9ec823/surf-1.0.2/src/http_client/mod.rs:64:21
   |
64 |             reader: Box::new(std::io::empty()),
   |                     ^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `futures_io::if_std::AsyncRead` is not implemented for `std::io::Empty`
   |
   = note: required for the cast to the object type `dyn futures_io::if_std::AsyncRead + std::marker::Send + std::marker::Unpin`

error: aborting due to previous error

For more information about this error, try `rustc --explain E0277`.
error: could not compile `surf`.

To learn more, run the command again with --verbose.

The verbose output is here:

     Running `rustc --edition=2018 --crate-name surf /Users/cdickinson/.cargo/registry/src/github.com-1ecc6299db9ec823/surf-1.0.2/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type lib --emit=dep-info,metadata,link -C debuginfo=2 --cfg 'feature="curl-client"' --cfg 'feature="default"' --cfg 'feature="isahc"' --cfg 'feature="js-sys"' --cfg 'feature="middleware-logger"' --cfg 'feature="native-client"' --cfg 'feature="wasm-bindgen"' --cfg 'feature="wasm-bindgen-futures"' --cfg 'feature="wasm-client"' --cfg 'feature="web-sys"' -C metadata=1c3e514c0a858c33 -C extra-filename=-1c3e514c0a858c33 --out-dir /Users/cdickinson/projects/personal/ds/target/debug/deps -L dependency=/Users/cdickinson/projects/personal/ds/target/debug/deps --extern futures=/Users/cdickinson/projects/personal/ds/target/debug/deps/libfutures-365de1e9e66eb651.rmeta --extern http=/Users/cdickinson/projects/personal/ds/target/debug/deps/libhttp-bbece310474301d6.rmeta --extern isahc=/Users/cdickinson/projects/personal/ds/target/debug/deps/libisahc-c4452edccdc968ce.rmeta --extern log=/Users/cdickinson/projects/personal/ds/target/debug/deps/liblog-d5fa1487881b71e9.rmeta --extern mime=/Users/cdickinson/projects/personal/ds/target/debug/deps/libmime-6b951b76af8ba87e.rmeta --extern mime_guess=/Users/cdickinson/projects/personal/ds/target/debug/deps/libmime_guess-fba49381eb3931b8.rmeta --extern serde=/Users/cdickinson/projects/personal/ds/target/debug/deps/libserde-600cba9868aa7ea8.rmeta --extern serde_json=/Users/cdickinson/projects/personal/ds/target/debug/deps/libserde_json-12247f39cf48300d.rmeta --extern serde_urlencoded=/Users/cdickinson/projects/personal/ds/target/debug/deps/libserde_urlencoded-9519514955e6e3f5.rmeta --extern url=/Users/cdickinson/projects/personal/ds/target/debug/deps/liburl-691a7168f39f84f4.rmeta --cap-lints allow -L native=/Users/cdickinson/projects/personal/ds/target/debug/build/libnghttp2-sys-d38528ac152f3c05/out/i/lib`
error[E0277]: the trait bound `std::io::Empty: futures_io::if_std::AsyncRead` is not satisfied
  --> /Users/cdickinson/.cargo/registry/src/github.com-1ecc6299db9ec823/surf-1.0.2/src/http_client/mod.rs:64:21
   |
64 |             reader: Box::new(std::io::empty()),
   |                     ^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `futures_io::if_std::AsyncRead` is not implemented for `std::io::Empty`
   |
   = note: required for the cast to the object type `dyn futures_io::if_std::AsyncRead + std::marker::Send + std::marker::Unpin`

error: aborting due to previous error

For more information about this error, try `rustc --explain E0277`.
error: could not compile `surf`.

The dep is installed like so:

[dependencies]
surf = "1.0.2"

And my toolchain info is:

→ rustup toolchain list
stable-x86_64-apple-darwin
nightly-2018-12-27-x86_64-apple-darwin
nightly-x86_64-apple-darwin (default)

A quick google didn't turn up any results, so it might be that I'm just holding it wrong! Any pointers would be much appreciated – & thanks for your great work on the async ecosystem 💖

unsatisfied trait bound

I am currently using surf v1.0.2 with rustc 1.40.0-nightly (c27f7568b 2019-10-13).

Here's the error I get when trying to run the example from the README.

error[E0277]: the trait bound `std::io::Empty: futures_io::if_std::AsyncRead` is not satisfied
  --> /home/eric/.cargo/registry/src/github.com-1ecc6299db9ec823/surf-1.0.2/src/http_client/mod.rs:64:21
   |
64 |             reader: Box::new(std::io::empty()),
   |                     ^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `futures_io::if_std::AsyncRead` is not implemented for `std::io::Empty`
   |
   = note: required for the cast to the object type `dyn futures_io::if_std::AsyncRead + std::marker::Send + std::marker::Unpin`

error: aborting due to previous error

For more information about this error, try `rustc --explain E0277`.
error: could not compile `surf`.

WASM Polish

#25 introduces a minimal WASM backend for Surf. In order to make it production ready, we should:

  • support different methods on request
  • support body uploads
  • remove all unwraps
  • set headers on request
  • get headers from response
  • create an internal fetch client abstraction to get ready to move the bulk of the code to gloo

Thanks!

Response method body_string may use wrong character encoding

The body_string method on the Response type is as follows:

    pub async fn body_string(&mut self) -> Result<String, Exception> {
        let bytes = self.body_bytes().await?;
        Ok(String::from_utf8(bytes).map_err(|e| io::Error::new(io::ErrorKind::InvalidData, e))?)
    }

This just grabs the body as a byte array, and then decodes it with UTF-8. If it wasn't originally encoded in UTF-8, this is wrong. What I think most people would expect body_string to do (I certainly did) is extract the character encoding from the charset parameter of the Content-Type header and decode the bytes to a string using that encoding.

While there is a note that "If the body cannot be interpreted as valid UTF-8, an Err is returned." I found this confusing. I didn't think it meant that the declared encoding was ignored; I thought it actually meant if the body could not be represented in UTF-8 (perhaps because it was binary data not matching any, or the declared, encoding). I am not sure whether this method, as currently written, is really all that useful? I found it rather misleading myself.
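
A workaround sketch in the meantime: take the raw body bytes plus the Content-Type header value and decode with the encoding_rs crate instead of assuming UTF-8 (the header parsing here is deliberately simplified):

fn decode_body(bytes: &[u8], content_type: &str) -> String {
    // Pull the charset parameter out of e.g. "text/html; charset=ISO-8859-1".
    let charset = content_type
        .split(';')
        .filter_map(|part| part.trim().strip_prefix("charset="))
        .next()
        .unwrap_or("utf-8");
    // Fall back to UTF-8 when the label is unknown or missing.
    let encoding = encoding_rs::Encoding::for_label(charset.as_bytes())
        .unwrap_or(encoding_rs::UTF_8);
    let (text, _, _) = encoding.decode(bytes);
    text.into_owned()
}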

How to send multiple http requests with surf & async-std ?

Hello,

How can I send N HTTP requests with at most m TCP connections (to limit file descriptors) with surf & async-std?

I imagined something like:

let urls: Vec<String> = ...;
let client = surf::Client::new();

stream::???(urls)    
    .map(GET with client)
    .limit(m);

(I read surf & async-std docs but still don't know how to do it.)
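
One sketch using the futures 0.3 stream combinators: buffer_unordered(m) keeps at most m requests in flight at a time, which also bounds the number of simultaneously open connections (this assumes a surf version where Client can be cloned across tasks):

use futures::stream::{self, StreamExt};

async fn fetch_all(urls: Vec<String>, m: usize) -> Vec<surf::Result<String>> {
    let client = surf::Client::new();
    stream::iter(urls)
        .map(|url| {
            let client = client.clone();
            // Each URL becomes a future; buffer_unordered drives at most `m` of them at once.
            async move { client.get(url).recv_string().await }
        })
        .buffer_unordered(m)
        .collect()
        .await
}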

design doc

Surf Architecture

This is a (quick) design document for Surf, an HTTP client with middleware
support in Rust.

Goals

  • concise API for most common cases
  • builder API for more complex cases
  • backend agnostic (e.g. swap Hyper for libcurl or browser fetch)
  • extendable using middleware
    • enable some middleware by default (compression, redirects, etc)
    • provide a way to opt-out of default middleware

API Example

// text
let res = xhr::get("/endpoint").send().await?;
dbg!(res);

// json
#[derive(Deserialize)]
struct User { name: String }
let user: User = xhr::get("/endpoint").send_json().await?;
dbg!(user);

Rust types are nice in that send_json can take a type parameter (User in
this case), which allows it to try to deserialize the response into that struct;
if it can't, it'll return an error. This allows for super concise JSON APIs!

Another implication of the way this is done is that the struct definitions can
be shared between servers & clients. This means that we can guarantee a server
and corresponding client can work with each other, even if the wire protocol is
something like JSON. And this can be further enhanced using something like
session types.

Architecture

  • surf as the top-level framework crate with sane defaults & middleware
  • http-client to provide the swappable backend layer
    • the middleware layer probably should live here too
    • contains types also
  • a collection of useful middleware in the same repo as surf
 -----------------
| Surf middleware | = `cargo add surf`
| Surf core       |
 -----------------
       ^
       |-----------------
                        |
 -------------------    |
| http-client-curl  |   |
| http-client-fetch |   |
 -------------------    |
| http-client-hyper | --|
| http-client       | = `cargo add http-client`
 -------------------

Middleware

let user: User = xhr::get("/endpoint")
    .middleware(some_middleware::new("value"))
    .send_json().await?;

We should take a look at make-fetch-happen and start by building out the features it implements. We should probably define some sort of prioritization here (:

Connection Pooling

A core feature of surf should be the ability to perform connection pooling.
I'm not sure yet how that works, as I've never used & researched it. But it
seems very important, and we should come up with a design here. I'm thinking
in terms of API it may be something like this:

let pool = surf::Pool::new()
    .max_sockets(512)
    .max_free_sockets(256)
    .build();

let res = pool.get("some_url").send().await?;

I somehow suspect that a similar API may be relevant for performing H2 requests
down the line too, as it's yet another form of multiplexing. It's a bit early
right now, but if we can get the API right we can add this functionality at a
later stage without too many surface changes.

Conclusion

This is about it for now. I hope I covered the rough outlines here!

References

Allow ?Sized arguments to request methods

I'm migrating from reqwest because I want my library to be usable in WASM or native, and one of the things that has bitten me is that Request::set_query() and Request::body_json() require their parameters to be Sized. A few of the requests I make pass in &[&str], which gets encoded as JSON. Serde handles this fine, and both Surf and reqwest call serde_json::to_vec().

Is it possible to modify body_form(), body_json(), and set_query() to take a &impl Serialize + ?Sized parameter?
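
A sketch of what the relaxed bound could look like (this is the proposal, not surf's current signature); serde_json::to_vec already accepts ?Sized values, so the change is mostly in the wrapper's bounds:

use serde::Serialize;

// Accepting `?Sized` lets callers pass unsized values such as `&[&str]` or `str`.
fn body_json<T>(json: &T) -> serde_json::Result<Vec<u8>>
where
    T: Serialize + ?Sized,
{
    serde_json::to_vec(json)
}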
