req's Introduction

Req


{-# LANGUAGE OverloadedStrings #-}

module Main (main) where

import Control.Monad.IO.Class
import Data.Aeson
import Network.HTTP.Req

main :: IO ()
-- You can either make your monad an instance of 'MonadHttp', or use
-- 'runReq' in any IO-enabled monad without defining new instances.
main = runReq defaultHttpConfig $ do
  let payload =
        object
          [ "foo" .= (10 :: Int),
            "bar" .= (20 :: Int)
          ]
  -- One function—full power and flexibility, automatic retrying on timeouts
  -- and such, automatic connection sharing.
  r <-
    req
      POST -- method
      (https "httpbin.org" /: "post") -- safe by construction URL
      (ReqBodyJson payload) -- use built-in options or add your own
      jsonResponse -- specify how to interpret response
      mempty -- query params, headers, explicit port number, etc.
  liftIO $ print (responseBody r :: Value)

Req is an HTTP client library that attempts to be easy-to-use, type-safe, and expandable.

“Easy-to-use” means that the library is designed to be beginner-friendly so it's simple to add to your monad stack, intuitive to work with, well-documented, and does not get in your way. Doing HTTP requests is a common task and a Haskell library for this should be approachable and clear to beginners, thus certain compromises were made. For example, one cannot currently modify ManagerSettings of the default manager because the library always uses the same implicit global manager for simplicity and maximal connection sharing. There is a way to use your own manager with different settings, but it requires more typing.

“Type-safe” means that the library tries to eliminate certain classes of errors. For example, we have correct-by-construction URLs; it is guaranteed that the user does not send the request body when using methods like GET or OPTIONS, and the amount of implicit assumptions is minimized by making the user specify their intentions in an explicit form. For example, it's not possible to avoid specifying the body or the method of a request. Authentication methods that assume HTTPS force the user to use HTTPS at the type level.

“Expandable” refers to the ability to create new components without having to resort to hacking. For example, it's possible to define your own HTTP methods, create new ways to construct the body of a request, create new authorization options, perform a request in a different way, and create your own methods to parse a response.
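
The HttpMethod class is the extension point for the first of these. A minimal sketch of what defining an extra method could look like (PURGE is an illustrative example, not something the library ships):

{-# LANGUAGE DataKinds #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE TypeFamilies #-}

import Network.HTTP.Req

-- A custom HTTP method: it carries no body and is sent as "PURGE".
data PURGE = PURGE

instance HttpMethod PURGE where
  type AllowsBody PURGE = 'NoBody
  httpMethodName _ = "PURGE"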

The library is built on top of mature packages such as http-client and http-client-tls to guarantee you the best experience.

It is important to note that since we leverage well-known libraries that the whole Haskell ecosystem uses, there is little extra risk in using Req: the machinery for performing requests is the same as in http-conduit and Wreq; only the API is different.

Related packages

The following packages are designed to be used with Req:

  • req-conduit—support for streaming request and response bodies in constant memory.

If you happen to have written a package that adds new features to Req, please submit a PR to include it in this list.

Contribution

Issues, bugs, and questions may be reported in the GitHub issue tracker for this project.

Pull requests are also welcome.

License

Copyright © 2016–present Mark Karpov

Distributed under the BSD 3-clause license.

req's Issues

Newlines in header values silently break everything

When I have a header option like

header "Authorization" ("Bearer " <> mytoken)

and mytoken ends with a newline (obtained from an external process and never stripped), then the newline is inserted in the middle of the headers, making some headers part of the body and sending you on a debugging trip.

For comparison, the Python requests library validates header values and rejects inappropriate ones before actually sending a request.
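
A defensive sketch (the sanitizeToken helper is hypothetical, not part of req): strip trailing whitespace from externally obtained tokens before building the header.

{-# LANGUAGE OverloadedStrings #-}

import qualified Data.ByteString.Char8 as B
import Data.Char (isSpace)
import Network.HTTP.Req

-- Drop trailing whitespace (including a stray newline) from a token that
-- was read from an external process.
sanitizeToken :: B.ByteString -> B.ByteString
sanitizeToken = fst . B.spanEnd isSpace

authHeader :: B.ByteString -> Option scheme
authHeader mytoken = header "Authorization" ("Bearer " <> sanitizeToken mytoken)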

req test suite nondeterministic failures (httpbin problems?)

As seen on the Stackage build server:

  httpbin-tests/Network/HTTP/ReqSpec.hs:293:5: 
  1) Network.HTTP.Req, streaming random bytes, works
       uncaught exception: HttpException
       VanillaHttpException (HttpExceptionRequest Request {
         host                 = "httpbin.org"
         port                 = 443
         secure               = True
         requestHeaders       = []
         path                 = "/stream-bytes/0"
         queryString          = ""
         method               = "GET"
         proxy                = Nothing
         rawBody              = False
         redirectCount        = 10
         responseTimeout      = ResponseTimeoutDefault
         requestVersion       = HTTP/1.1
       }
        (StatusCodeException (Response {responseStatus = Status {statusCode = 503, statusMessage = "Service Temporarily Unavailable"}, responseVersion = HTTP/1.1, responseHeaders = [("Date","Tue, 10 Jul 2018 02:44:52 GMT"),("Content-Type","text/plain"),("Content-Length","40"),("Connection","keep-alive"),("ETag","\"5b42e707-28\""),("Vary","Accept"),("x-now-trace","sfo1"),("server","now"),("now","1"),("cache-control","s-maxage=0")], responseBody = (), responseCookieJar = CJ {expose = []}, responseClose' = ResponseClose}) "The deployment is currently unavailable\n"))
       (after 97 tests and 1 shrink)
         Small {getSmall = 0}

  To rerun use: --match "/Network.HTTP.Req/streaming random bytes/works/"

Randomized with seed 1455496155

Finished in 3.7513 seconds

cf. snoyberg/http-client#360 for a similar issue

Make `parseUrlHttp[s]` take in a `Text` value

ByteString semantically represents a sequence of 8-bit octets, not a series of ASCII characters, while URLs are semantically sequences of characters. You can see this distinction in how Char8 can represent more than just ASCII, in how often ByteString is used to store non-ASCII data, and in the non-ASCII encodings typically used when writing to stdout.

Currently, if we have a Text value that we want to convert into a URL, we are in the awkward position of deciding how to convert it to a ByteString. We can T.unpack it, and then any non-ASCII text essentially becomes garbage. We can T.encodeUtf8 it, which is probably the most sensible thing to do, as it appears that parseUrlHttps will decode that UTF-8 back into the appropriate Text value. We could also mistakenly use some other non-ASCII-compatible encoding like UTF-16 and break everything.

It seems to make the most sense to just have it be a Text value from the start, particularly given the fact that parseUrlHttp[s] already seems to do proper UTF-8 decoding.
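
A sketch of the workaround available today, assuming the ByteString-based signature quoted above (the wrapper name is illustrative):

{-# LANGUAGE DataKinds #-}

import qualified Data.Text as T
import qualified Data.Text.Encoding as T
import Network.HTTP.Req

-- Encode the Text as UTF-8 before handing it to parseUrlHttps, which
-- decodes the bytes back into Text pieces internally.
parseHttpsFromText :: T.Text -> Maybe (Url 'Https, Option scheme)
parseHttpsFromText = parseUrlHttps . T.encodeUtf8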

Examples encourage use of orphan instances

The first code example in the documentation suggests doing this as part of the boilerplate:

instance MonadHttp IO where
    handleHttpException = throwIO

This results in an orphan instance compiler warning:

Main.hs:15:1: warning: [-Worphans]
    Orphan instance: instance MonadHttp IO
    To avoid this
        move the instance declaration to the module of the class or of the type, or
        wrap the type with a newtype and declare the instance on the new type.
   |
15 | instance MonadHttp IO where
   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^...

It doesn't sit well that the first sample results in a scary compiler warning.
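
The warning itself points at one alternative: wrap IO in a newtype and give that the MonadHttp instance. A minimal sketch (the App name is illustrative):

{-# LANGUAGE GeneralizedNewtypeDeriving #-}

import Control.Exception (throwIO)
import Control.Monad.IO.Class (MonadIO, liftIO)
import Network.HTTP.Req

-- A newtype over IO with its own MonadHttp instance avoids the orphan.
newtype App a = App {runApp :: IO a}
  deriving (Functor, Applicative, Monad, MonadIO)

instance MonadHttp App where
  handleHttpException = liftIO . throwIO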

Update

I did some more tinkering and it looks like runReq can be used to avoid having to define an orphan MonadHttp instance for IO. Unfortunately, this introduces a further package dependency, namely data-default, since there seems to be no other way to obtain the default value of HttpConfig. This leads to the following, warning-free initial example:

{-# LANGUAGE OverloadedStrings #-}

module Main (main) where

import Control.Monad.IO.Class
import Data.Default
import Network.HTTP.Req
import qualified Data.ByteString.Char8 as B

main :: IO ()
main = runReq def $ do
    let n :: Int
        n = 5
    bs <- req GET (https "httpbin.org" /: "bytes" /~ n) NoReqBody bsResponse mempty
    liftIO $ B.putStrLn (responseBody bs)

My suggestion would be to update the first examples to demonstrate this usage of the API. I would also consider exporting a new defaultHttpConfig function so that users do not need to use Data.Default.def and add the data-default dependency if they would prefer not to. If you were open to this, I'd be happy to make the changes myself and submit a pull request.

Failing tests on LTS build

Failures:

  httpbin-tests/Network/HTTP/ReqSpec.hs:103: 
  1) Network.HTTP.Req, receiving request headers back, works
       expected: Object (fromList [("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Foo",String "bar"),("Baz",String "quux"),("Host",String "httpbin.org")]))])
        but got: Object (fromList [("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Connection",String "close"),("Foo",String "bar"),("Baz",String "quux"),("Host",String "httpbin.org")]))])

  httpbin-tests/Network/HTTP/ReqSpec.hs:115: 
  2) Network.HTTP.Req, receiving GET data back, works
       expected: Object (fromList [("args",Object (fromList [])),("url",String "https://httpbin.org/get"),("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Host",String "httpbin.org")]))])
        but got: Object (fromList [("args",Object (fromList [])),("url",String "https://httpbin.org/get"),("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org")]))])

  httpbin-tests/Network/HTTP/ReqSpec.hs:130: 
  3) Network.HTTP.Req, receiving POST JSON data back, works
       expected: Object (fromList [("args",Object (fromList [])),("json",String "foo"),("data",String "\"foo\""),("url",String "https://httpbin.org/post"),("headers",Object (fromList [("Content-Type",String "application/json; charset=utf-8"),("Accept-Encoding",String "gzip"),("Host",String "httpbin.org"),("Content-Length",String "5")])),("files",Object (fromList [])),("form",Object (fromList []))])
        but got: Object (fromList [("args",Object (fromList [])),("json",String "foo"),("data",String "\"foo\""),("url",String "https://httpbin.org/post"),("headers",Object (fromList [("Content-Type",String "application/json; charset=utf-8"),("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Content-Length",String "5")])),("files",Object (fromList [])),("form",Object (fromList []))])

  httpbin-tests/Network/HTTP/ReqSpec.hs:152: 
  4) Network.HTTP.Req, receiving PATCHed file back, works
       expected: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String "User-agent: *\nDisallow: /deny\n"),("url",String "https://httpbin.org/patch"),("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Host",String "httpbin.org"),("Content-Length",String "30")])),("files",Object (fromList [])),("form",Object (fromList []))])
        but got: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String "User-agent: *\nDisallow: /deny\n"),("url",String "https://httpbin.org/patch"),("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Content-Length",String "30")])),("files",Object (fromList [])),("form",Object (fromList []))])

  httpbin-tests/Network/HTTP/ReqSpec.hs:173: 
  5) Network.HTTP.Req, receiving PUT form URL-encoded data back, works
       expected: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String ""),("url",String "https://httpbin.org/put"),("headers",Object (fromList [("Content-Type",String "application/x-www-form-urlencoded"),("Accept-Encoding",String "gzip"),("Host",String "httpbin.org"),("Content-Length",String "18")])),("files",Object (fromList [])),("form",Object (fromList [("foo",String "bar"),("baz",String "5"),("quux",String "")]))])
        but got: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String ""),("url",String "https://httpbin.org/put"),("headers",Object (fromList [("Content-Type",String "application/x-www-form-urlencoded"),("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Content-Length",String "18")])),("files",Object (fromList [])),("form",Object (fromList [("foo",String "bar"),("baz",String "5"),("quux",String "")]))])

Randomized with seed 1702600171

Finished in 2.5767 seconds
71 examples, 5 failures

missing defaultHttpConfig

Hi,
I am trying the example from the README and it fails with:

Variable not in scope: defaultHttpConfig :: HttpConfig
   |
12 | main = runReq defaultHttpConfig $ do
   |               ^^^^^^^^^^^^^^^^^
Failed, no modules loaded.

It is a Stack-based project and I have these in the dependencies:
library:
  source-dirs: src
  dependencies:
    - mtl
    - validation
    - optparse-applicative
    - req
    - aeson
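
A likely fallback if the resolver pins an older req release (those exposed the default configuration through a Default instance rather than defaultHttpConfig, so the data-default package is needed):

{-# LANGUAGE OverloadedStrings #-}

import Control.Monad.IO.Class (liftIO)
import Data.Aeson (Value)
import Data.Default (def)
import Network.HTTP.Req

-- Older req versions: obtain the default HttpConfig via def.
main :: IO ()
main = runReq def $ do
  r <- req GET (https "httpbin.org" /: "get") NoReqBody jsonResponse mempty
  liftIO $ print (responseBody r :: Value)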

Skip url-encoding

How can I skip URL-encoding? For example, I'd like to keep the question mark in "search?show_type".

PS. Thanks for the great Hachikell (as they say in Moscow) library!
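
Note that "?" normally separates the path from the query string rather than being part of a path segment; a sketch of expressing the same URL with a valueless query parameter instead (host and path are illustrative):

{-# LANGUAGE OverloadedStrings #-}

import Control.Monad.IO.Class (liftIO)
import qualified Data.ByteString.Char8 as B
import Network.HTTP.Req

-- "search?show_type" expressed as a path segment plus a query flag.
main :: IO ()
main = runReq defaultHttpConfig $ do
  r <- req GET (https "example.com" /: "search") NoReqBody bsResponse (queryFlag "show_type")
  liftIO $ B.putStrLn (responseBody r)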

Help request

Would you mind helping me convert this curl call to a req one:

→ curl -X GET http://puppetdb.prd.srv.cirb.lan:8080/v3/facts --data-urlencode 'query=["=", "certname", "svappcavl841.prd.srv.cirb.lan"]'
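
A rough sketch, assuming the query is meant to end up as a URL-encoded query parameter named query (the exact PuppetDB semantics are not verified here):

{-# LANGUAGE OverloadedStrings #-}

import Control.Monad.IO.Class (liftIO)
import Data.Aeson (Value)
import Data.Text (Text)
import Network.HTTP.Req

-- GET over plain HTTP on port 8080, with the "query" value URL-encoded by req.
main :: IO ()
main = runReq defaultHttpConfig $ do
  let query = "[\"=\", \"certname\", \"svappcavl841.prd.srv.cirb.lan\"]" :: Text
  r <-
    req
      GET
      (http "puppetdb.prd.srv.cirb.lan" /: "v3" /: "facts")
      NoReqBody
      jsonResponse
      (port 8080 <> ("query" =: query))
  liftIO $ print (responseBody r :: Value)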

req test suite failures

As seen on the stackage build server:

req-0.2.0
dist/build/httpbin-tests/httpbin-tests
...

Failures:

  httpbin-tests/Network/HTTP/ReqSpec.hs:104:
  1) Network.HTTP.Req, receiving request headers back, works
       expected: Object (fromList [("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Foo",String "bar"),("Baz",String "quux"),("Host",String "httpbin.org")]))])
        but got: Object (fromList [("headers",Object (fromList [("X-Request-Id",String "ea264881-cd8e-4b1a-b876-186a47bceb9c"),("Connect-Time",String "0"),("Accept-Encoding",String "gzip"),("Connection",String "close"),("Foo",String "bar"),("Baz",String "quux"),("Host",String "httpbin.org"),("Total-Route-Time",String "0"),("Via",String "1.1 vegur")]))])

  httpbin-tests/Network/HTTP/ReqSpec.hs:116:
  2) Network.HTTP.Req, receiving GET data back, works
       expected: Object (fromList [("args",Object (fromList [])),("url",String "https://httpbin.org/get"),("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Host",String "httpbin.org")]))])
        but got: Object (fromList [("args",Object (fromList [])),("url",String "https://httpbin.org/get"),("headers",Object (fromList [("X-Request-Id",String "b745faaa-9340-40c8-9335-cde7b6f81522"),("Connect-Time",String "0"),("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Total-Route-Time",String "0"),("Via",String "1.1 vegur")]))])

  httpbin-tests/Network/HTTP/ReqSpec.hs:131:
  3) Network.HTTP.Req, receiving POST JSON data back, works
       expected: Object (fromList [("args",Object (fromList [])),("json",String "foo"),("data",String "\"foo\""),("url",String "https://httpbin.org/post"),("headers",Object (fromList [("Content-Type",String "application/json; charset=utf-8"),("Accept-Encoding",String "gzip"),("Host",String "httpbin.org"),("Content-Length",String "5")])),("files",Object (fromList [])),("form",Object (fromList []))])
        but got: Object (fromList [("args",Object (fromList [])),("json",String "foo"),("data",String "\"foo\""),("url",String "https://httpbin.org/post"),("headers",Object (fromList [("Content-Type",String "application/json; charset=utf-8"),("X-Request-Id",String "a2dd54c7-974b-4fe9-9d2a-a2118585c6aa"),("Connect-Time",String "0"),("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Content-Length",String "5"),("Total-Route-Time",String "0"),("Via",String "1.1 vegur")])),("files",Object (fromList [])),("form",Object (fromList []))])

  httpbin-tests/Network/HTTP/ReqSpec.hs:154:
  4) Network.HTTP.Req, receiving POST data back (multipart form data), works
       expected: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String ""),("url",String "https://httpbin.org/post"),("headers",Object (fromList [("Content-Type",String "multipart/form-data; boundary=----WebKitFormBoundaryJIJzCBnSJaoTXT4L"),("Accept-Encoding",String "gzip"),("Host",String "httpbin.org"),("Content-Length",String "242")])),("files",Object (fromList [])),("form",Object (fromList [("foo",String "foo data!"),("bar",String "bar data!")]))])
        but got: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String ""),("url",String "https://httpbin.org/post"),("headers",Object (fromList [("Content-Type",String "multipart/form-data; boundary=----WebKitFormBoundaryJIJzCBnSJaoTXT4L"),("X-Request-Id",String "d4ee58e9-ebc5-4e0f-ad61-0c884431e555"),("Connect-Time",String "0"),("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Content-Length",String "242"),("Total-Route-Time",String "0"),("Via",String "1.1 vegur")])),("files",Object (fromList [])),("form",Object (fromList [("foo",String "foo data!"),("bar",String "bar data!")]))])

  httpbin-tests/Network/HTTP/ReqSpec.hs:180:
  5) Network.HTTP.Req, receiving PATCHed file back, works
       expected: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String "User-agent: *\nDisallow: /deny\n"),("url",String "https://httpbin.org/patch"),("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Host",String "httpbin.org"),("Content-Length",String "30")])),("files",Object (fromList [])),("form",Object (fromList []))])
        but got: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String "User-agent: *\nDisallow: /deny\n"),("url",String "https://httpbin.org/patch"),("headers",Object (fromList [("X-Request-Id",String "81ddef2a-ff74-4076-b386-ab93f01e3adf"),("Connect-Time",String "0"),("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Content-Length",String "30"),("Total-Route-Time",String "0"),("Via",String "1.1 vegur")])),("files",Object (fromList [])),("form",Object (fromList []))])

  httpbin-tests/Network/HTTP/ReqSpec.hs:201:
  6) Network.HTTP.Req, receiving PUT form URL-encoded data back, works
       expected: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String ""),("url",String "https://httpbin.org/put"),("headers",Object (fromList [("Content-Type",String "application/x-www-form-urlencoded"),("Accept-Encoding",String "gzip"),("Host",String "httpbin.org"),("Content-Length",String "18")])),("files",Object (fromList [])),("form",Object (fromList [("foo",String "bar"),("baz",String "5"),("quux",String "")]))])
        but got: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String ""),("url",String "https://httpbin.org/put"),("headers",Object (fromList [("Content-Type",String "application/x-www-form-urlencoded"),("X-Request-Id",String "ca6878af-3f04-4b49-a2f9-2ed36f412bc3"),("Connect-Time",String "0"),("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Content-Length",String "18"),("Total-Route-Time",String "0"),("Via",String "1.1 vegur")])),("files",Object (fromList [])),("form",Object (fromList [("foo",String "bar"),("baz",String "5"),("quux",String "")]))])

Randomized with seed 678382428

Finished in 2.1131 seconds
72 examples, 6 failures

How to pass GET, POST as a parameter to my function?

I have these functions, which are identical except that one sends a GET request and the other a POST:

send1 :: IO BS.ByteString
send1 = do
    runReq defaultHttpConfig $ do
        result <- req GET (https "some_domain" /: "some_path") NoReqBody bsResponse getHeaders
        pure (responseBody result :: BS.ByteString)
    
    -- [some stuff...]



send2 :: IO BS.ByteString
send2 = do
    runReq defaultHttpConfig $ do
        result <- req POST (https "some_domain" /: "some_path") NoReqBody bsResponse getHeaders
        pure (responseBody result :: BS.ByteString)


    -- [some stuff]

I want to pass GET and POST as a parameter and thus merge the functions into a single one. I haven't found a way.

Later on I might want to add PUT and DELETE too; for now I need only GET and POST.

GET and POST are both distinct data types in the library.

    data GET = GET
    data POST = POST

The HttpMethod class won't help either - it won't compile:

    send1 :: (HttpMethod meth) => meth -> IO BS.ByteString
    send1 meth = do
        runReq defaultHttpConfig $ do
            result <- req meth (https "some_domain" /: "some_path") NoReqBody bsResponse getHeaders
            pure (responseBody result :: BS.ByteString)

I don't want to create yet another data type as a workaround, say MyHttpMethod with two constructors GET and POST. I don't want to introduce new types.

Then how else?
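
One sketch of how this is usually resolved: keep the method polymorphic but carry along the body-compatibility constraint that req itself imposes (placeholders like some_domain are kept from the question; the getHeaders helper is replaced by mempty):

{-# LANGUAGE ConstraintKinds #-}
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE OverloadedStrings #-}

import qualified Data.ByteString as BS
import Network.HTTP.Req

-- Accepts any method that allows an empty body (GET, POST, PUT, DELETE, ...).
send ::
  (HttpMethod method, HttpBodyAllowed (AllowsBody method) 'NoBody) =>
  method ->
  IO BS.ByteString
send meth =
  runReq defaultHttpConfig $ do
    result <- req meth (https "some_domain" /: "some_path") NoReqBody bsResponse mempty
    pure (responseBody result)

sendGet, sendPost :: IO BS.ByteString
sendGet = send GET
sendPost = send POST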

"Request" local files

Hi there. First of all, thanks for this great work.

I'm currently developing an application which needs to request some user-provided URLs. My issue is related to testing it.

Testing has a lot to do with trust, and I trust that req works fine because it has its own test suite and is being used by the community. For this reason, I don't see why my tests should be non-deterministic just because they need to reach external endpoints that could be unavailable at some point.

For my own confidence, it would be enough to be able to provide, in my test suite, the location of a local file whose content would act as the response body.

Of course, I see some inconveniences with this. Maybe it is pushing the abstraction out of its scope, because which method should that request have, and which headers should the response have?

Maybe another module, Network.HTTP.Req.Testing, could be added implementing the needed logic, but surely some changes in the core system would be necessary. Maybe adding another Scheme constructor called Local, or making the req function depend not on Url but on something implementing a new Url type class.

How do you see it? Am I missing a better approach?

Thanks

Parsing urls with unknown schema

This follows from a conversation in Haskell Cafe.

When parsing a URL, I think it is a common scenario not to know the scheme of the resulting URL in advance. Right now, there are two functions for that purpose:

parseUrlHttp :: ByteString -> Maybe (Url Http, Option scheme)
parseUrlHttps :: ByteString -> Maybe (Url Https, Option scheme)

Probably, as suggested by Tom Ellis in one of the messages, it would make sense to have something like:

parseUrl :: ByteString -> Maybe (Either (Url Http) (Url Https), Option scheme)

Of course it can be done manually, but I think providing it would make sense in order to avoid some boilerplate.
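
For reference, a sketch of the manual version under the signatures quoted above (trying HTTPS first, then HTTP):

{-# LANGUAGE DataKinds #-}

import Data.ByteString (ByteString)
import Network.HTTP.Req

-- The boilerplate the proposal would remove: dispatch on whichever parser
-- succeeds and tag the result with Left/Right.
parseUrl :: ByteString -> Maybe (Either (Url 'Http) (Url 'Https), Option scheme)
parseUrl bs =
  case parseUrlHttps bs of
    Just (url, opts) -> Just (Right url, opts)
    Nothing -> (\(url, opts) -> (Left url, opts)) <$> parseUrlHttp bs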

I'm new to Haskell, but maybe I could try a PR. But in case you find it useful and have the time, don't hesitate to go ahead and implement it.

Memory blowup on stack overflow.

This code quickly allocates a huge amount of memory (actually, all of it on my machine) when run with -K1K:

sample_url = "https://www.avito.ru/ekaterinburg/avtomobili/chevrolet_cruze_2012_1030458230"

main :: IO ()
main = for_ (parseUrlHttps sample_url) $ \(url, scheme) -> do
  body <- runReq def $ responseBody <$> req GET url NoReqBody bsResponse scheme
  BS8.putStrLn body

Here is a self-contained project that reproduces this behavior: req-blowup.

A very nice lib by the way.

How to do testing/mocking with req

This is more of a question than an issue. Do you know any good patterns for testing code that uses req, without actually connecting to the real server?

Sorry if this is the wrong place to ask and if my question is unclear. Thanks for a nice library!

Allow to construct custom Authentication functions?

As it stands, Option is opaque, and withRequest and asFinalizer are not part of the public API. As a consumer, I therefore can't just define a custom authentication function.

Because of that, building an authenticator that depends on the URI (including the query string) becomes rather bothersome.

Maybe this could be exposed as Req.Internal?

Prevent redirects

How can I prevent redirects?
httpConfigRedirectCount = 1 -- single redirect, OK.
httpConfigRedirectCount = 0 -- zero redirects and Exception.
Can I forbid redirects entirely and avoid the exception?

My goal is to get the "Location" header, which is present in the response to the first request. http-client reportedly keeps the whole history of redirects, but it's strange to make five consecutive requests with Req and then the last one with another tool.
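
A sketch of one likely approach: set the redirect count to zero and relax the response check so a 3xx status does not raise StatusCodeException, then read the Location header directly (host and path are illustrative):

{-# LANGUAGE OverloadedStrings #-}

import Data.ByteString (ByteString)
import Network.HTTP.Req

-- Config that follows no redirects and never throws based on status codes.
noRedirectConfig :: HttpConfig
noRedirectConfig =
  defaultHttpConfig
    { httpConfigRedirectCount = 0,
      httpConfigCheckResponse = \_ _ _ -> Nothing
    }

getLocation :: IO (Maybe ByteString)
getLocation = runReq noRedirectConfig $ do
  r <- req GET (https "example.com" /: "redirect-me") NoReqBody ignoreResponse mempty
  pure (responseHeader r "Location")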


Mark, thanks for your library, I managed to get it working. "Get it working" refers not to the quality of the library, which is beyond question, but to the limits of my Haskell skills.

Working with cookies (question)

How can I add custom names to cookies?
L.cookie_name and all the other fields of data Cookie from https://hackage.haskell.org/package/http-client-0.5.1/docs/Network-HTTP-Client.html are OK, but my original cookie has

"storeId": "firefox-default"
"firstPartyDomain": ""

Truly speaking, everything works with http-client cookies alone, but, as we are in the Haskell domain, "it's not pure enough" (pun intended).


Mark, thanks a lot for your library; it's my second attempt to use it as a testing ground for my humble Haskell skills. But not everything can be blamed on them alone: it took maybe five hours of tinkering to create a working snippet for the simple task "log into a site using cookies from Firefox". In any other language (even F#) I could have done this from scratch, with zero previous knowledge, in maybe 20 minutes just by copy-pasting from Stack Overflow.

I might be searching in the wrong places, but there seems to be no single example of using cookies. Now I have one of my own; if you are still interested in this project, I can give it to you and you may add it to the docs (after honing my terrible childish Haskell style).
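
Since the report mentions the lack of a cookie example, here is a minimal sketch of passing a hand-built http-client cookie to req via the cookieJar option (all values are illustrative). Fields like "storeId" and "firstPartyDomain" are Firefox bookkeeping and have no counterpart in the HTTP Cookie header:

{-# LANGUAGE OverloadedStrings #-}

import Data.Time.Calendar (fromGregorian)
import Data.Time.Clock (UTCTime (..))
import qualified Network.HTTP.Client as L
import Network.HTTP.Req

-- An http-client Cookie built by hand.
sessionCookie :: L.Cookie
sessionCookie =
  L.Cookie
    { L.cookie_name = "session",
      L.cookie_value = "deadbeef",
      L.cookie_expiry_time = UTCTime (fromGregorian 2030 1 1) 0,
      L.cookie_domain = "example.com",
      L.cookie_path = "/",
      L.cookie_creation_time = UTCTime (fromGregorian 2020 1 1) 0,
      L.cookie_last_access_time = UTCTime (fromGregorian 2020 1 1) 0,
      L.cookie_persistent = True,
      L.cookie_host_only = False,
      L.cookie_secure_only = True,
      L.cookie_http_only = True
    }

-- Attach the cookie jar to a request via the Option argument.
withCookies :: Option scheme
withCookies = cookieJar (L.createCookieJar [sessionCookie])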

Failing httpbin-tests

Probably something has changed on httpbin.org: it looks like it doesn't send the Connection: close header anymore, so the tests relying on it start to fail:

Failures:

  httpbin-tests/Network/HTTP/ReqSpec.hs:67:7: 
  1) Network.HTTP.Req, receiving request headers back, works
       expected: Object (fromList [("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Connection",String "close"),("Foo",String "bar"),("Baz",String "quux"),("Host",String "httpbin.org")]))])
        but got: Object (fromList [("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Foo",String "bar"),("Baz",String "quux"),("Host",String "httpbin.org")]))])

  To rerun use: --match "/Network.HTTP.Req/receiving request headers back/works/"

  httpbin-tests/Network/HTTP/ReqSpec.hs:80:7: 
  2) Network.HTTP.Req, receiving GET data back, works
       expected: Object (fromList [("args",Object (fromList [])),("url",String "https://httpbin.org/get"),("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org")]))])
        but got: Object (fromList [("args",Object (fromList [])),("url",String "https://httpbin.org/get"),("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Host",String "httpbin.org")]))])

  To rerun use: --match "/Network.HTTP.Req/receiving GET data back/works/"

  httpbin-tests/Network/HTTP/ReqSpec.hs:96:7: 
  3) Network.HTTP.Req, receiving POST JSON data back, works
       expected: Object (fromList [("args",Object (fromList [])),("json",String "foo"),("data",String "\"foo\""),("url",String "https://httpbin.org/post"),("headers",Object (fromList [("Content-Type",String "application/json; charset=utf-8"),("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Content-Length",String "5")])),("files",Object (fromList [])),("form",Object (fromList []))])
        but got: Object (fromList [("args",Object (fromList [])),("json",String "foo"),("data",String "\"foo\""),("url",String "https://httpbin.org/post"),("headers",Object (fromList [("Content-Type",String "application/json; charset=utf-8"),("Accept-Encoding",String "gzip"),("Host",String "httpbin.org"),("Content-Length",String "5")])),("files",Object (fromList [])),("form",Object (fromList []))])

  To rerun use: --match "/Network.HTTP.Req/receiving POST JSON data back/works/"

  httpbin-tests/Network/HTTP/ReqSpec.hs:120:7: 
  4) Network.HTTP.Req, receiving POST data back (multipart form data), works
       expected: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String ""),("url",String "https://httpbin.org/post"),("headers",Object (fromList [("Content-Type",String "multipart/form-data; boundary=----WebKitFormBoundaryqF5ZRQWq6uj4q2w0"),("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Content-Length",String "242")])),("files",Object (fromList [])),("form",Object (fromList [("foo",String "foo data!"),("bar",String "bar data!")]))])
        but got: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String ""),("url",String "https://httpbin.org/post"),("headers",Object (fromList [("Content-Type",String "multipart/form-data; boundary=----WebKitFormBoundaryqF5ZRQWq6uj4q2w0"),("Accept-Encoding",String "gzip"),("Host",String "httpbin.org"),("Content-Length",String "242")])),("files",Object (fromList [])),("form",Object (fromList [("foo",String "foo data!"),("bar",String "bar data!")]))])

  To rerun use: --match "/Network.HTTP.Req/receiving POST data back (multipart form data)/works/"

  httpbin-tests/Network/HTTP/ReqSpec.hs:147:7: 
  5) Network.HTTP.Req, receiving PATCHed file back, works
       expected: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String "User-agent: *\nDisallow: /deny\n"),("url",String "https://httpbin.org/patch"),("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Content-Length",String "30")])),("files",Object (fromList [])),("form",Object (fromList []))])
        but got: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String "User-agent: *\nDisallow: /deny\n"),("url",String "https://httpbin.org/patch"),("headers",Object (fromList [("Accept-Encoding",String "gzip"),("Host",String "httpbin.org"),("Content-Length",String "30")])),("files",Object (fromList [])),("form",Object (fromList []))])

  To rerun use: --match "/Network.HTTP.Req/receiving PATCHed file back/works/"

  httpbin-tests/Network/HTTP/ReqSpec.hs:169:7: 
  6) Network.HTTP.Req, receiving PUT form URL-encoded data back, works
       expected: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String ""),("url",String "https://httpbin.org/put"),("headers",Object (fromList [("Content-Type",String "application/x-www-form-urlencoded"),("Accept-Encoding",String "gzip"),("Connection",String "close"),("Host",String "httpbin.org"),("Content-Length",String "18")])),("files",Object (fromList [])),("form",Object (fromList [("foo",String "bar"),("baz",String "5"),("quux",String "")]))])
        but got: Object (fromList [("args",Object (fromList [])),("json",Null),("data",String ""),("url",String "https://httpbin.org/put"),("headers",Object (fromList [("Content-Type",String "application/x-www-form-urlencoded"),("Accept-Encoding",String "gzip"),("Host",String "httpbin.org"),("Content-Length",String "18")])),("files",Object (fromList [])),("form",Object (fromList [("foo",String "bar"),("baz",String "5"),("quux",String "")]))])

  To rerun use: --match "/Network.HTTP.Req/receiving PUT form URL-encoded data back/works/"

Randomized with seed 1847904547

Question about parsing Urls dynamically.

I'm writing a 404-checker for funsies and I'm currently wondering what the best way to go about this is. The program is given a list of websites and checks their sitemap, grabs every page of the site, and then gets the response code of every link on every page.

Now, this is hilariously inefficient, so the first thing to do is to put every single link on the website into a Set and filter out all anchor tags that aren't relative links or http/https links. With this, I have a problem: requests will be of type Url 'Https as well as Url 'Http.

Essentially, I can't figure out how to have type-safe urls without almost writing the entire program twice. Is there a way to do this? Or should I think about re-designing the program somehow (although I'm not sure how I could, save for just using raw bytestring everywhere somehow)

(PS: I like the correct-by-construction promise, but I've run into lots of small paper cuts with it, and I've actually gotten farther by writing large pattern-matching functions and manually building URLs than by using the parseUrl functions in the library. I'm not quite sure what to do about that to make the Urls easier to use.)
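
One sketch of avoiding the duplication: normalize both parsers' results into an Either right after parsing and write the request code once against a scheme-polymorphic helper. Note that with defaultHttpConfig non-2xx statuses raise an exception, so a real 404-checker would also relax httpConfigCheckResponse:

{-# LANGUAGE DataKinds #-}

import Network.HTTP.Req

type AnyUrl = Either (Url 'Http) (Url 'Https)

-- Fetch a URL of either scheme and return its status code.
statusOf :: AnyUrl -> IO Int
statusOf u = runReq defaultHttpConfig $ do
  r <- either fetch fetch u
  pure (responseStatusCode r)
  where
    fetch :: Url scheme -> Req IgnoreResponse
    fetch url = req GET url NoReqBody ignoreResponse mempty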

req test suite failure re: httpbin.org/status/102 (InvalidStatusLine "0")

Failures:

  httpbin-tests/Network/HTTP/ReqSpec.hs:370:
  1) Network.HTTP.Req, receiving status code 102, works
       uncaught exception: HttpException (VanillaHttpException (HttpExceptionRequest Request {
         host                 = "httpbin.org"
         port                 = 443
         secure               = True
         requestHeaders       = []
         path                 = "/status/102"
         queryString          = ""
         method               = "GET"
         proxy                = Nothing
         rawBody              = False
         redirectCount        = 10
         responseTimeout      = ResponseTimeoutDefault
         requestVersion       = HTTP/1.1
       }
        (InvalidStatusLine "0")))

Randomized with seed 1813816718

I was able to reproduce this locally like so:

stack unpack req-1.0.0 && cd req-1.0.0
echo 'resolver: lts-11.7' > stack.yaml
stack build --haddock --test --bench --no-run-benchmarks

I'm guessing this is just httpbin changing things, so I will simply mark the test suite as an expected failure in stackage until this is addressed.

Add `Default HttpConfig` instance

Just upgraded req and it broke my codebase in a bunch of places because def no longer typechecks.

I understand also exporting defaultHttpConfig for people who don't want to specify data-default-class in their cabal file.

However I don't see a reason for not continuing to support def for those (like myself) who prefer it. It's an incredibly light dependency.

Monadic Retry Policies

I'm finding myself in a situation where Monadic retries (of the RetryPolicyM sort) would be potentially useful, but I don't have access to them within Req's httpConfigRetryPolicy or httpConfigRetryJudge.

My specific use case would be to either perform some kind of logging action and/or update a counter (e.g. TVar shared with the rest of my program) on each failed request.

Do you feel like the design of req is amenable to this sort of thing? The only way I can think of doing this would be to do what RetryPolicyM does and parameterize it over some m that represents the context you're operating in...

Concurrency and Latency

Let's get this out the door first. This is a very nice library to work with!

The documentation is a bit thin on parallelism/concurrency though. I would expect that

runReq def $ (,) <$> (req GET ...) <*> (req GET ...)

implicitly runs concurrently, is this correct? Could the documentation be improved in that regard?

On latency, this might be a bit far-fetched, but is it possible to pull the request latency from req somehow? I know I can set my own alternative manager and e.g. set the timeout to 1s instead of 30s, but that's only the upper bound. I'd like to get a handle on the min/max/avg/median request latency for an application that does a lot of requests.
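
One way to make the concurrency explicit, regardless of how the Req Applicative behaves, is to run two runReq actions with the async package (endpoints are illustrative):

{-# LANGUAGE OverloadedStrings #-}

import Control.Concurrent.Async (concurrently)
import Network.HTTP.Req

-- Two independent requests executed concurrently via async.
fetchBoth :: IO (BsResponse, BsResponse)
fetchBoth =
  concurrently
    (runReq defaultHttpConfig (req GET (https "httpbin.org" /: "get") NoReqBody bsResponse mempty))
    (runReq defaultHttpConfig (req GET (https "httpbin.org" /: "uuid") NoReqBody bsResponse mempty))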

parseUrlHttp port not working

{-# LANGUAGE OverloadedStrings #-}

import Control.Monad.IO.Class (liftIO)
import Data.Aeson (Value)
import Data.Maybe (fromJust)
import Network.HTTP.Req

main :: IO ()
main = runReq defaultHttpConfig $ do
  let (url, options) = fromJust (parseUrlHttp "http://127.0.0.1:7001/api")
  r <- req GET url NoReqBody jsonResponse options
  liftIO $ print (responseBody r :: Value)

output

*Req Data.ByteString> :main []
*** Exception: VanillaHttpException (HttpExceptionRequest Request {
  host                 = "127.0.0.1:7001"
  port                 = 80
  secure               = False
  requestHeaders       = []
  path                 = "/api"
  queryString          = ""
  method               = "GET"
  proxy                = Nothing
  rawBody              = False
  redirectCount        = 10
  responseTimeout      = ResponseTimeoutDefault
  requestVersion       = HTTP/1.1
}
 (ConnectionFailure getAddrInfo: does not exist (nodename nor servname provided, or not known)))

SOCKS5 proxy support

Use case

Some systems, like the Tor network, expose their client functionality through a local SOCKS5 proxy.

Motivation

One of the cool features of Tor is that it provides a mechanism to make servers publicly accessible through a random-looking address whose location is hidden. There is no need for a domain, and you can host the website on any machine running the Tor binary.

These servers, called onion services, have two big advantages over normal web servers:

  • The identity of the user and the location of the server are hidden.
  • They are cheap to deploy and run, as there is no need to register for a domain.

How to not url encode query parameters?

I am using an API where the following characters (and perhaps more) need to appear unencoded as values of query parameters: "|" and ":". It can look something like http://www.domain.com/v2/results.asp?cmd=activate&target=firstname%20lastname|123|456&DateTime=2019-02-01%2015:00. As you can see, some things need to be URL-encoded, but if | and : are URL-encoded, I get an empty response from the server.

When I try to do a GET request using Req, it automatically URL-encodes | and :. I have tried the normal way (building Url and Option separately in a req request) and also parsing a full URL using parseUrlHttp and then using the req function, but both URL-encode those characters, so my request fails.

I am not sure what characters are allowed unencoded as query parameters and googling it made my head spin.

Is there any way to make a request where the query parameters are not URL-encoded, i.e. some raw/unsafe way where I promise that it will work out? That might not be the right thing to do, but I would just like a way to move forward.

EDIT: This is with req version 1.2.1 by the way, installed via Nix. For some reason it didn't get 2.0, which is listed as the latest on Hackage.

port not part of Url

It would be nice if the port were part of the Url; the same goes for query parameters. Follow the RFC.

Maybe something like:

https "something.com" /: post

https "something.com" ::: 8080 /: post /? ("param1", "value1")

Rendering a `Url`

This would be useful for logging purposes. I currently use show url, but that's insufficient. Url is an opaque type (constructors not exposed), so would you be open to a renderUrl function? Or is there an alternative approach to logging URLs when using req?

GADT for Url to ensure correctness

Currently Url could have a different Scheme at the type level vs the value level. This appears to be hidden from the user due to the un-exported constructor, but I could still see it being beneficial to use a GADT to fully ensure the correctness of this.

Returning http or https based on string

Thanks so much for this incredible library!

I'm trying to write a function that accepts a string and returns either http or https and I'm getting stuck on getting it to compile. There must be something about the internals of req I'm not getting. Any help would be appreciated!

schm :: String -> (Text -> Url scheme)
schm "http" = http
schm "https" = https
error:
    * Couldn't match type `scheme' with 'Http
      `scheme' is a rigid type variable bound by
        the type signature for:
          schm :: forall (scheme :: Scheme). String -> Text -> Url scheme   
        at src\Sut.hs:15:1-38
      Expected type: Text -> Url scheme
        Actual type: Text -> Url 'Http
    * In the expression: http
      In an equation for `schm': schm "http" = http
    * Relevant bindings include
        schm :: String -> Text -> Url scheme (bound at src\Sut.hs:16:1)     
   |
16 | schm "http" = http
   |               ^^^^
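
The type variable scheme in that signature is chosen by the caller, so the function cannot pick it based on a runtime string. A common sketch is to return a concrete sum type instead and pattern match at the use site:

{-# LANGUAGE DataKinds #-}

import Data.Text (Text)
import Network.HTTP.Req

-- Return one of the two concrete Url types and let callers dispatch.
schm :: String -> Text -> Either (Url 'Http) (Url 'Https)
schm "http" host = Left (http host)
schm _ host = Right (https host)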

Integration with retry

The retry package seems to be nice: https://hackage.haskell.org/package/retry.

It may make sense to learn more about it and integrate Req with it, so we could provide automatic retrying where it makes sense (we can put config into getHttpConfig). Certainly dealing with this manually is not an interesting task for the end user.

Better ergonomics around arbitrary URLs

The ergonomics of this library when directly making requests to URLs that are fairly statically specified are very nice.

However when parsing / storing / rendering URLs the ergonomics decrease significantly.

In my specific case I am parsing a variety of URLs from third parties, and then I need to send a request based on that URL, as well as store and render it.

There isn't really a nice way to do this, as once I parse the URL I lose the ability to render it or store it, so I have to leave it in an unparsed format like Text, with no static guarantees that parsing it later will succeed.

In my eyes parseURL should really only return a URL, and that URL should include things like query parameters and fragments, which are generally expected to be part of a URL.

Option makes sense for things like headers, but in my eyes it really shouldn't be stealing away things that users expect to see in a rendered URL.

This shouldn't have any negative effect on the ergonomics of directly sending requests, as you can have nice combinators like /? or /# or similar.

Perhaps the URI type from modern-uri could be used directly instead?

Include response body in JsonHttpException

Currently JsonHttpException only contains the parse error, making it somewhat difficult to get at the response that failed to parse. In my experience I almost always want to see that response when I get a parse error, because usually my parser is wrong and the response will help me fix the parser. Are you open to adding the response body as another field to JsonHttpException?

I can see an argument for making it configurable, e.g. to avoid leaking full response bodies into logs, but it's not immediately obvious to me how to do that cleanly.

Better loosen HTTPS restrictions on basic auth maybe?

Hi. Not sure about OAuth 2, but basic auth does not require HTTPS (although using it over plain HTTP is bad practice).
IMHO enforcing that should be done on the server side; for a client library this is a loss of functionality.

In my case we have an internal HTTP + basic auth server. It then becomes impossible to make requests except by building the header through Option, which is awkward.

Maybe it is better to add a compile-time warning? Or do nothing at all.
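
For reference, a sketch of the Option-based workaround mentioned above, building the Authorization header by hand with base64-bytestring:

{-# LANGUAGE OverloadedStrings #-}

import qualified Data.ByteString as B
import qualified Data.ByteString.Base64 as B64
import Network.HTTP.Req

-- Manual basic auth usable over plain HTTP (credentials travel unencrypted).
basicAuthHeader :: B.ByteString -> B.ByteString -> Option scheme
basicAuthHeader user pass =
  header "Authorization" ("Basic " <> B64.encode (user <> ":" <> pass))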

An easy way to avoid throwing on non-200 statuses

With vanilla Network.HTTP.Client one can just setRequestIgnoreStatus. It looks like with req I'd need to define a MonadHttp instance with handleHttpException, but that looks like too much hassle if I just want to run requests in IO and handle all responses regardless of their status codes.

Am I missing something in the config?
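
A sketch of the config-based route: override httpConfigCheckResponse so non-2xx statuses no longer raise StatusCodeException, and keep running requests in IO with runReq:

{-# LANGUAGE OverloadedStrings #-}

import Network.HTTP.Req

-- Never throw based on the status code; inspect it on the response instead.
lenientConfig :: HttpConfig
lenientConfig = defaultHttpConfig {httpConfigCheckResponse = \_ _ _ -> Nothing}

fetchStatus :: IO Int
fetchStatus = runReq lenientConfig $ do
  r <- req GET (https "httpbin.org" /: "status" /: "404") NoReqBody ignoreResponse mempty
  pure (responseStatusCode r)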

AWS Signatures

I read your section on supporting AWS and was curious whether you have thought about implementing UNSIGNED-PAYLOAD for AWS signature generation?

I'm not certain about the extent of its support across AWS offerings, but for large uploads to S3 it is common to use UNSIGNED-PAYLOAD for signature generation.

Popular AWS libraries use this, for example boto3 (the official Python AWS SDK) and jets3t (a third-party S3 library).

installation advice on --force-reinstalls

I'm trying to install req with cabal install req and I'm getting this message:

cabal: The following packages are likely to be broken by the reinstalls:
HTTP-4000.3.9
Use --force-reinstalls if you want to install anyway.

Should I force-install, or are there any other workarounds? Many thanks!
