
light-my-request's Introduction

Light my Request


Injects a fake HTTP request/response into a Node HTTP server for simulating server logic, writing tests, or debugging. It does not use a socket connection, so it can be run against an inactive server (a server not in listen mode).

Example

const http = require('node:http')
const inject = require('light-my-request')

const dispatch = function (req, res) {
  const reply = 'Hello World'
  res.writeHead(200, { 'Content-Type': 'text/plain', 'Content-Length': reply.length })
  res.end(reply)
}

const server = http.createServer(dispatch)

inject(dispatch, { method: 'get', url: '/' }, (err, res) => {
  console.log(res.payload)
})

Note how server.listen is never called.

Async await and promises are supported as well!

// promises
inject(dispatch, { method: 'get', url: '/' })
  .then(res => console.log(res.payload))
  .catch(console.log)

// async-await
try {
  const res = await inject(dispatch, { method: 'get', url: '/' })
  console.log(res.payload)
} catch (err) {
  console.log(err)
}

You can also use chaining methods if you do not pass a callback function. See the Method chaining section below for details.

// chaining methods
inject(dispatch)
  .get('/')                   // set the request method to GET, and request URL to '/'
  .headers({ foo: 'bar' })    // set the request headers
  .query({ foo: 'bar' })      // set the query parameters
  .end((err, res) => {
    console.log(res.payload)
  })

inject(dispatch)
  .post('/')                  // set the request method to POST, and request URL to '/'
  .payload('request payload') // set the request payload
  .body('request body')       // alias for payload
  .end((err, res) => {
    console.log(res.payload)
  })

// async-await is also supported
try {
  const chain = inject(dispatch).get('/')
  const res = await chain.end()
  console.log(res.payload)
} catch (err) {
  console.log(err)
}

File uploads (multipart/form-data) or form submissions (x-www-form-urlencoded) can be achieved by using the form-auto-content package, as shown below:

const formAutoContent = require('form-auto-content')
const fs = require('node:fs')

try {
  const form = formAutoContent({
    myField: 'hello',
    myFile: fs.createReadStream(`./path/to/file`)
  })

  const res = await inject(dispatch, {
    method: 'post',
    url: '/upload',
    ...form
  })
  console.log(res.payload)
} catch (err) {
  console.log(err)
}

This module ships with a handwritten TypeScript declaration file for TS support. The declaration exports a single namespace LightMyRequest. You can import it in one of two ways:

import * as LightMyRequest from 'light-my-request'

const dispatch: LightMyRequest.DispatchFunc = function (req, res) {
  const reply = 'Hello World'
  res.writeHead(200, { 'Content-Type': 'text/plain', 'Content-Length': reply.length })
  res.end(reply)
}

LightMyRequest.inject(dispatch, { method: 'get', url: '/' }, (err, res) => {
  console.log(res.payload)
})

// or
import { inject, DispatchFunc } from 'light-my-request'

const dispatch: DispatchFunc = function (req, res) {
  const reply = 'Hello World'
  res.writeHead(200, { 'Content-Type': 'text/plain', 'Content-Length': reply.length })
  res.end(reply)
}

inject(dispatch, { method: 'get', url: '/' }, (err, res) => {
  console.log(res.payload)
})

The declaration file exports types for the following parts of the API:

  • inject - standard light-my-request inject method
  • DispatchFunc - the fake HTTP dispatch function
  • InjectPayload - a union type for valid payload types
  • isInjection - standard light-my-request isInjection method
  • InjectOptions - options object for inject method
  • Request - custom light-my-request request object interface. Extends Node.js stream.Readable type by default. This behavior can be changed by setting the Request option in the inject method's options
  • Response - custom light-my-request response object interface. Extends Node.js http.ServerResponse type

API

inject(dispatchFunc[, options, callback])

Injects a fake request into an HTTP server.

  • dispatchFunc - listener function. The same as you would pass to Http.createServer when making a node HTTP server. Has the signature function (req, res) where:
    • req - a simulated request object. Inherits from Stream.Readable by default. Optionally inherits from another class, set in options.Request
    • res - a simulated response object. Inherits from node's Http.ServerResponse.
  • options - request options object where:
    • url | path - a string specifying the request URL.
    • method - a string specifying the HTTP request method, defaulting to 'GET'.
    • authority - a string specifying the HTTP HOST header value to be used if no header is provided, and the url does not include an authority component. Defaults to 'localhost'.
    • headers - an optional object containing request headers.
    • cookies - an optional object containing key-value pairs that will be encoded and added to the cookie header. If the header is already set, the data will be appended.
    • remoteAddress - an optional string specifying the client remote address. Defaults to '127.0.0.1'.
    • payload - an optional request payload. Can be a string, Buffer, Stream, or object. If the payload is a string, Buffer, or Stream, it is used as-is as the request payload. Otherwise it is serialized with JSON.stringify, forcing the request to have a Content-Type of application/json (see the combined example after this list).
    • query - an optional object or string containing query parameters.
    • body - alias for payload.
    • simulate - an object containing flags to simulate various conditions:
      • end - indicates whether the request will fire an end event. Defaults to undefined, meaning an end event will fire.
      • split - indicates whether the request payload will be split into chunks. Defaults to undefined, meaning payload will not be chunked.
      • error - whether the request will emit an error event. Defaults to undefined, meaning no error event will be emitted. If set to true, the emitted error will have a message of 'Simulated'.
      • close - whether the request will emit a close event. Defaults to undefined, meaning no close event will be emitted.
    • validate - Optional flag to validate this options object. Defaults to true.
    • server - Optional http server. It is used for binding the dispatchFunc.
    • autoStart - Automatically start the request as soon as the method is called. It is only valid when not passing a callback. Defaults to true.
    • signal - An AbortSignal that may be used to abort an ongoing request. Requires Node v16+.
    • Request - Optional type from which the request object should inherit instead of stream.Readable
  • callback - the callback function using the signature function (err, res) where:
    • err - error object
    • res - a response object where:
      • raw - an object containing the raw request and response objects where:
        • req - the simulated request object.
        • res - the simulated response object.
      • headers - an object containing the response headers.
      • statusCode - the HTTP status code.
      • statusMessage - the HTTP status message.
      • payload - the payload as a UTF-8 encoded string.
      • body - alias for payload.
      • rawPayload - the raw payload as a Buffer.
      • trailers - an object containing the response trailers.
      • json - a function that parses a json response payload and returns an object.
      • stream - a function that provides a Readable stream of the response payload.
      • cookies - a getter that parses the set-cookie response header and returns an array with all the cookies and their metadata.
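
A minimal sketch combining several of the options and response properties described above (the /echo route and its echoing behaviour are made up for illustration, not part of the library):

const inject = require('light-my-request')

// A toy dispatcher that echoes the request body back as JSON
const dispatch = function (req, res) {
  let body = ''
  req.on('data', chunk => { body += chunk })
  req.on('end', () => {
    res.writeHead(200, { 'Content-Type': 'application/json' })
    res.end(body)
  })
}

inject(dispatch, {
  method: 'post',
  url: '/echo?debug=true',
  headers: { 'x-request-id': 'abc123' },
  cookies: { session: 'xyz' },
  payload: { hello: 'world' }  // object payload is serialized with JSON.stringify
}, (err, res) => {
  if (err) throw err
  console.log(res.statusCode)               // 200
  console.log(res.headers['content-type'])  // 'application/json'
  console.log(res.json())                   // { hello: 'world' }
})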

Notes:

  • You can also pass a string in place of the options object as a shorthand for {url: string, method: 'GET'}.
  • Beware when using the Request option: it can make light-my-request slower. Sample benchmark results, run on an i5-8600K CPU with Request set to http.IncomingMessage:
Request x 155,018 ops/sec ±0.47% (94 runs sampled)
Custom Request x 30,373 ops/sec ±0.64% (90 runs sampled)
Request With Cookies x 125,696 ops/sec ±0.29% (96 runs sampled)
Request With Cookies n payload x 114,391 ops/sec ±0.33% (97 runs sampled)
ParseUrl x 255,790 ops/sec ±0.23% (99 runs sampled)
ParseUrl and query x 194,479 ops/sec ±0.16% (99 runs sampled)

inject.isInjection(obj)

Checks if the given object obj is a light-my-request Request object.
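
A small usage sketch (the logging here is purely illustrative):

const inject = require('light-my-request')

const dispatch = function (req, res) {
  // During tests the incoming request is a simulated one, so this logs true
  console.log(inject.isInjection(req))
  res.end('ok')
}

inject(dispatch, { method: 'get', url: '/' }, (err, res) => {
  console.log(res.payload) // 'ok'
})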

Method chaining

The following methods can be used for chaining:

  • delete, get, head, options, patch, post, put, trace. They will set the HTTP request method and the request URL.
  • body, headers, payload, query, cookies. They can be used to set the request options object.

Finally, you need to call end. It has the signature function (callback). If you invoke end without a callback function, the method returns a promise, so you can:

const chain = inject(dispatch).get('/')

try {
  const res = await chain.end()
  console.log(res.payload)
} catch (err) {
  // handle error
}

// or
chain.end()
  .then(res => {
    console.log(res.payload)
  })
  .catch(err => {
    // handle error
  })

By the way, you can also use promises without calling end!

inject(dispatch)
  .get('/')
  .then(res => {
    console.log(res.payload)
  })
  .catch(err => {
    // handle error
  })

Note: The application will not respond multiple times. If you try to invoke any method after the application has responded, an error will be thrown.

Acknowledgements

This project was forked from hapi/shot because we wanted to support Node ≥ v4 and not only Node ≥ v8. All credit for work before commit 00a2a82 goes to the hapi/shot project contributors. Since commit db8bced the project has been maintained by the Fastify team.

License

Licensed under BSD-3-Clause.

light-my-request's Issues

Allow all application/*+json media types to be parsed with response.json()

🚀 Feature Proposal

Currently, light-my-request explicitly requires the response's content type to start with exactly "application/json".

if (res.headers['content-type'].indexOf('application/json') < 0) {

However, there are several adjacent subtypes for which JSON.parse would also work; in theory, anything of the form:

application/[tree "."]subtype+json [";" parameter]
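
A sketch of what a more permissive check could look like (a suggestion, not the library's actual code):

// Accept application/json plus any application/*+json structured-syntax suffix
function isJsonContentType (contentType) {
  return /^application\/(?:[\w.-]+\+)?json\b/i.test(contentType)
}

console.log(isJsonContentType('application/json; charset=utf-8')) // true
console.log(isJsonContentType('application/geo+json'))            // true
console.log(isJsonContentType('text/plain'))                      // false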

Motivation

When I set the response's content type to "application/geo+json", my response.json() calls started throwing errors.

Example

According to IANA, this applies to roughly 100 more media types.

Add a custom not found handler

Imagine that you are using fastify.inject to communicate with other endpoints of your app, and sometime in the future you split into a microservices architecture; your code will break because the specific endpoint is no longer available in your app.
Would it be cool to have a custom 404 handler that deals with that situation?

Eg:

const http = require('http')
const inject = require('light-my-request')

const dispatch = function (req, res) {
  const reply = 'Hello World'
  res.writeHead(200, { 'Content-Type': 'text/plain', 'Content-Length': reply.length })
  res.end(reply)
}

const notFound = function (req, res) {
  asyncCall(req, (err, body) => {
    res.end(body)
  })
}

const server = http.createServer(dispatch)

inject(dispatch, { method: 'get', url: '/', notFound }, (err, res) => {
  console.log(res.payload)
})

Then in Fastify we can think about how to encapsulate the inject's 404 handling as well.
Thoughts?

cc @fastify/fastify

TODO

  • explain why we forked the original project
  • MIT license
  • fix Node v8
  • Promises support

cc @mcollina

Support custom Request types

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Support for the request object (instance of the Request (lib/request.js) class) to inherit from a custom class besides stream.Readable.

Motivation

Starting from v12.0.9, next.js validates that the request object is an http.IncomingMessage instance. If it's not, no error is thrown, but the request will fail because an internal type conversion is not made.

This is an issue for using light-my-request to test API requests that use next.js. Specifically, this is currently blocking bumping next.js' version in fastify-nextjs (fastify/fastify-nextjs#514) because tests don't pass with the current structure of light-my-request.

Example

Supporting a new options flag, namely customRequestType, and adding the following snippet at lib/request.js:124 (the Request function definition) should do the trick:

  if (options.customRequestType) {
    util.inherits(this.constructor, options.customRequestType)
  }

Multi-value request headers are serialized to comma-separated string

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

4.23.2

Plugin version

No response

Node.js version

20.x

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

13.0

Description

The .inject() method allows injecting request headers whose values can be a string or an array of strings. When an array is used, it gets serialized to a comma-separated string.

Steps to Reproduce

await fastifyInstance.inject({ method: 'get', url: '/', headers: { foo: 'bar', baz: ['first', 'second'] } });

gives me the following request headers in the route handler:

{
  baz: 'first,second',
  foo: 'bar',
  host: 'localhost:80',
  'user-agent': 'lightMyRequest'
}

Expected Behavior

I would expect to get an array of strings in the route handler as well.

Omitting .end() is not possible when using TypeScript because of missing types

🐛 Bug Report

The docs state that we can also omit .end() when using the chainable API, in which case it returns a promise. When using TypeScript this is not possible, because the Chain interface does not seem to return a promise-compatible type. Using .then() without .end() in TypeScript gives the error TS2339: Property 'then' does not exist on type 'Chain'.

To Reproduce

Use with TS:

fastify.inject().get('/test').then(() => {});

TS2339: Property 'then' does not exist on type 'Chain'.

Expected behavior

Omitting .end() when not needed should also work in TypeScript without type errors.

Your Environment

  • node version: 14
  • fastify version: >=3.5.1
  • os: WSL2
  • light-my-request version: 4.2.1

Unexpected end of JSON input

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

4.23.2

Plugin version

No response

Node.js version

20.8

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

Latest

Description

Whenever the Fastify endpoint returns 'invalid' JSON, e.g.:
.response.send(null)

The Fastify endpoint still returns without throwing errors, but light-my-request throws an error when handling this kind of response.

Failed: Unexpected end of JSON input
SyntaxError: Unexpected end of JSON input
    at JSON.parse (<anonymous>)
    at Object.parseJsonPayload [as json] (/node_modules/light-my-request/lib/response.js:143:17)

Steps to Reproduce

  1. Make an API endpoint in Fastify that returns null as the response:

fastify.get('/test', async (request, reply) => {
  reply.type('application/json').code(200)
  reply.send(null);
  return reply;
})

  2. Make a request using light-my-request:

await fastify.inject({
  'url': '/test',
  'method': 'GET'
});

  3. Crash

Expected Behavior

Expected the response to be empty, like the browser response from Fastify in this case.

An in-range update of @types/node is breaking the build 🚨

The devDependency @types/node was updated from 12.12.5 to 12.12.6.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

@types/node is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

About switching to MIT or more developer friendly LICENSE

Hi,

Thanks for building fastify - fast and low overhead framework for Node.js!
As it's approaching release 1.x.x, we're evaluating it to use in our next project.

While doing license reviews of fastify dependencies, we realized that light-my-request uses BSD-3-Clause. It was changed to the MIT License in this commit, but then reverted back to the previous one.

Is there a reason for not using MIT or more developer friendly License for light-my-request?
I really want to use fastify 1.x.x in production and do some performance analysis while comparing with expressjs, but this license is blocking me from doing that.

CC: @delvedor

Regards,
Trivikram

Implement an Axios Adapter

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

An interesting use case would be the ability to plug in light-my-request as an Axios adapter, so you can use an existing HTTP REST client instead of hand-crafting requests.

Motivation

We have a generated Axios HTTP client and a Fastify Server which wraps up light-my-request under it with an inject method. It would be nice to be able to use the Axios client and point it to the Fastify server's light-my-request so we don't have to hand-craft requests for testing.

Example

const client = new Client({ baseOptions: { adapter: new LightMyRequestAdapter(server) }}); // Where client accepts Axios instance options under baseOptions
client.helloWorld();
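
A rough sketch of what such an adapter could look like, assuming server is a Fastify instance exposing server.inject() (the function name and field mapping below are illustrative, not an existing API):

function createLightMyRequestAdapter (server) {
  // An axios adapter receives the request config and resolves
  // with an axios-shaped response object
  return async function lightMyRequestAdapter (config) {
    const res = await server.inject({
      method: (config.method || 'get').toUpperCase(),
      url: config.url,
      headers: config.headers,
      payload: config.data
    })
    return {
      data: res.payload,
      status: res.statusCode,
      statusText: res.statusMessage,
      headers: res.headers,
      config,
      request: res.raw.req
    }
  }
}

// Usage sketch: const client = axios.create({ adapter: createLightMyRequestAdapter(server) })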

use cookie logic from @fastify/cookie?

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

No response

Extend request object with custom properties

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Add an easy way to extend Request object.

Motivation

Allow mocking features.
For example, @fastify/passport adds a user property to FastifyRequest when a user is logged in. I want to test my private endpoint without calling an endpoint to log the user in, by directly adding the corresponding user property to the request object.

Example

Add the ability to write something like this:

const res = await app.inject({
  method: "GET",
  url: "/api/my-private-endpoint",
  extend: {
    // Add user property to request object
    user: {
      id: "user_id",
      ...
    },
    ...
  }
});

// My endpoint handler
export default async function myPrivateEndpoint(
  req: FastifyRequest,
  res: FastifyReply
) {
  const { id } = req.params;
  const loggedUser: PassportUser = req.user;  // Use user property here
}

Response.json() return error without header

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

=3.0.0

Plugin version

5.x.x

Node.js version

16.14.0

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

12.3.1

Description

Response.json() throws an error when the payload does not exist. This is common when we have an empty response, for example with HTTP status 201 or 204.

Steps to Reproduce

Add this test case and the error will be printed.

test('Response.json() should return undefined if the payload does not exist', (t) => {
  t.plan(2)

  const dispatch = function (req, res) {
    res.writeHead(204)
    res.end()
  }

  inject(dispatch, { method: 'GET', path: 'http://example.com:8080/hello' }, (err, res) => {
    t.error(err)
    const { json } = res
    t.equal(json(), undefined)
  })
})

Expected Behavior

I'd expect this to return undefined, not an error.

Inject a payload that should not be an object

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

I created an integration test for my controller that basically does this:

return await app
  .inject({
    method: 'PUT',
    url: `/accounts/${username}/password`,
    payload: {
      password: "newpassword"
    }
  })

The controller is:

@Put(':username/password')
async updatePassword (@Param('username') username: string, @Body() password: string):

And I have correctly tested using curl by basically doing this:

curl --location --request PUT 'http://localhost:5000/v1/accounts/root/password' \
--header 'Content-Type: application/json' \
--data-raw '"[email protected]"'

However, I can't get my integration test to return anything other than a 502 Bad Request. Could someone explain why this happens?
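
For comparison with the curl call above: per the payload documentation in the README, a string payload is passed through as-is, so a raw JSON string body can be injected roughly like this (a sketch, not a confirmed fix for the 502):

return await app.inject({
  method: 'PUT',
  url: `/accounts/${username}/password`,
  headers: { 'content-type': 'application/json' },
  payload: '"newpassword"' // a JSON string literal, mirroring curl's --data-raw
})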

response body not being parsed automatically (and should be!)

🚀 Feature Proposal

I'm not sure if this is a Feature Proposal or a bug report. But unlike normal Fastify (which parses request bodies automatically), response bodies don't seem to be parsed automatically when using light-my-request (via app.inject()).

let response = await app.inject({
  method: "POST",
  url: `/someurl`,
  payload,
});
let body = JSON.parse(response.payload).data;

Motivation

Every single time app.inject() is used we have to decode the response. Decoding JSON when the Content-Type is JSON is a reasonable default used by most popular clients like superagent and axios, and also consistent with Fastify itself (Fastify routes decode request bodies).

Example

let response = await app.inject({
  method: "POST",
  url: `/someurl`,
  payload,
});
console.log(response.body)

Thanks for reading! 🙂

.inject simulate options doesn't work

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.27.4

Plugin version

No response

Node.js version

14.18.1

Operating system

Linux

Operating system version (i.e. 20.04, 11.3, 10)

10

Description

The options to simulate some events in the request don't work as they should.

e.g:

inject({
    url: "/hello",
    simulate: {
      close: false,
      split: false,
      end: true,
      error: true
    }
})

The request happens normally but it should return some kind of error.

Steps to Reproduce

Run tests in another terminal with

yarn test

https://codesandbox.io/s/reverent-architecture-iec1qu?file=/package.json

Expected Behavior

From what I understand from the Fastify documentation it should probably return an object of the type

{
	statusCode: number,
	error: string,
	message: 'Simulated'
}

I'm not sure exactly what it should return, because it's my first time trying to use this simulate option.

Typing issue with DispatchFunc in TypeScript

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

N/A (light-my-request 5.0.0)

Plugin version

No response

Node.js version

16.15.1

Operating system

Windows

Operating system version (i.e. 20.04, 11.3, 10)

10.0.19044.0

Description

I'm getting the following TypeScript error trying to pass a dispatch func to light-my-request:

src/index.ts:12:33 - error TS2345: Argument of type '(req: IncomingMessage, res: ServerResponse) => void' is not assignable to parameter of type 'DispatchFunc'.
  Types of parameters 'req' and 'req' are incompatible.
    Type 'Request' is missing the following properties from type 'IncomingMessage': aborted, httpVersionMajor, httpVersionMinor, complete, and 4 more.

12   const response = await inject(dispatch, {
                                   ~~~~~~~~


Found 1 error in src/index.ts:12

A sample repository is linked in Steps to Reproduce.

Steps to Reproduce

  1. Clone https://github.com/segevfiner/light-my-request-typing-issue
  2. Run npm install && npm run build

Expected Behavior

I expect to be able to pass dispatch functions that use the plain types from the http module.

inject cookie

🚀 Feature Proposal

Add the feature to send a request with cookies and read the cookies in the response easily

Motivation

Simplify the cookie usage

Example

inject(dispatch, { method: 'get', url: '/', cookies: {
 foo: 'bar'
} }, (err, res) => {
  console.log(res.payload)
  console.log(res.cookies()) // returned by the server (same usage of .json() feature)
})

Remove throw utility

The throw utility defined in the following snippet should be removed since it’s going to add one more level in the stack trace (and there is no need for it).

light-my-request/index.js

Lines 180 to 182 in 56d2e1a

function throwIfAlreadyInvoked () {
  throw new Error('The dispatch function has already been invoked')
}

It's ok to use a constant at the top of the file.
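
A small sketch of the suggested change (names are hypothetical): create the error once at module scope and throw it directly, so no helper frame is added.

// Hypothetical sketch of the constant-based approach
const errAlreadyInvoked = new Error('The dispatch function has already been invoked')

function guardDispatch (alreadyInvoked) {
  if (alreadyInvoked) throw errAlreadyInvoked
}

guardDispatch(false) // ok
guardDispatch(true)  // throws the hoisted error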

Add typescript type definition

While refactoring Fastify types, I realized we are defining types for this lib in the fastify/fastify.d.ts file. Should we move these to their own light-my-request.d.ts file and ship them from this module? (And then let the Fastify types consume them for the fastify.inject method.)

Dispatch function Request.body is undefined

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

4.12.0

Plugin version

5.9.1

Node.js version

18.14.2

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

Version 13.2.1 (22D68)

Description

I have middleware that I'm testing that validates the request.body via AJV. It's checking query string and body parameters against a known schema. I'm trying to set up my routes like this example. Its jest tests use your plugin behind the scenes like this example. For whatever reason, the dispatch or plugin function has no handle on request.body when the body object is passed in as the payload. With request.body being undefined, I'm unable to confirm my middleware is validating the body parameters.

I cloned your repo and simply put in a console.log where I thought it would be in this file. See the console.log that I added.

test('supports body option in Request and property in Response', (t) => {
  t.plan(3)
  const dispatch = function (req, res) {
    console.log('req.body', req.body) // it's undefined making it impossible to test middleware functions
    res.writeHead(200, { 'content-type': req.headers['content-type'] })
    req.pipe(res)
  }

  inject(dispatch, { method: 'POST', url: '/test', body: { a: 1 } }, (err, res) => {
    t.error(err)
    t.equal(res.headers['content-type'], 'application/json')
    t.equal(res.body, '{"a":1}')
  })
})

I expected req.body to be populated with the injected body: { a: 1 } but it's undefined.

Steps to Reproduce

Add the following line:

console.log('req.body', req.body) // it's undefined making it impossible to test middleware functions

at line 595 of https://github.com/fastify/light-my-request/blob/master/test/index.test.js#L595

Expected Behavior

The request body (req.body) in the code examples to be populated with the injected body object { a: 1 } but it's undefined.

Remove ajv from dependencies

Thanks to ajv 8, it is possible to avoid the runtime compile call by building the schema validator with the standalone feature.

const optsValidator = ajv.compile(schema)

In this way, it is possible to move the ajv dependency to the devDependencies, providing faster initialization of this module and a smaller package.

Doing so could be a semver-minor feature, since the compiled code is Node.js 10 compatible.
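
A sketch of how the validator could be precompiled with ajv 8's standalone mode at build time (the file names and placeholder schema are illustrative):

// build-validator.js: run once at build time so ajv can stay a devDependency
const fs = require('fs')
const Ajv = require('ajv')
const standaloneCode = require('ajv/dist/standalone').default

// Placeholder schema standing in for the module's real options schema
const schema = { type: 'object', properties: { url: { type: 'string' } } }

const ajv = new Ajv({ code: { source: true } })
const validate = ajv.compile(schema)
fs.writeFileSync('validate-options.js', standaloneCode(ajv, validate))

// At runtime: const optsValidator = require('./validate-options')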

Support server-sent events

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

I would like this module to support server-sent events (SSE).

(If helpful, I can submit a pull request.)

Motivation

Currently, the best way to test SSE doesn't use this package at all. It seems the state-of-the-art is to start the server and use an EventSource, possibly with the eventsource package. This works but requires the server to be active, which this module tries to avoid.

It would be nice if this package had an answer for testing SSE.

This was briefly discussed in #105.

Example

There are several possible ways this API could be implemented. My favorite idea is a re-implementation of the W3C EventSource; something like this:

const source = new inject.EventSource(dispatch, { path: '/my-events' })

source.onmessage = (event) => {
  console.log(event)

  if (event.data === 'please close') {
    source.close()
  }
}

But there are many possible options, which we can discuss (if we even want to implement this feature at all!).

How can we use path parameters in the inject options

Currently the inject options have a facility to add query params but not path parameters.
We can probably use a ready-made URL with static values for the path params, but I just wanted to know if we can have an option to provide dynamic values for the params.
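
For reference, the workaround mentioned above is simply to interpolate the path parameter values into the URL string yourself (the values here are made up):

const username = 'alice' // made-up path parameter value

const res = await app.inject({
  method: 'GET',
  url: `/accounts/${username}/profile`
})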

My use case is writing a test case for the onSend hook
https://www.fastify.io/docs/latest/Hooks/#onsend
to decorate a Prometheus metric, where I am trying to customize a metric by grouping all URLs with path params in them into the same Histogram route (bucket).
https://prometheus.io/docs/concepts/metric_types/ > Histogram
Node client:
https://github.com/siimon/prom-client#histogram

Please let me know if you have any suggestions, or ignore this if you feel it's not relevant to the scope of this plugin.

Thank you :)

Support cancellation via `.destroy()` and `AbortSignal`, and setting a timeout

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

It would be nice to support a way to cancel the request via a .destroy() method and/or an AbortSignal like the plain Node.js http.request, and also setting a timeout with a similar interface to http.request or something more suitable for light-my-request.

Motivation

https://github.com/segevfiner/axios-light-my-request-adapter

Example

const controller = new AbortController();
const req = inject(/* ... */, {signal: controller.signal, timeout: 1000});
controller.abort();
const res = await req;

An in-range update of ajv is breaking the build 🚨

Version 5.4.0 of ajv was just published.

Branch Build failing 🚨
Dependency ajv
Current Version 5.3.0
Type dependency

This version is covered by your current version range and after updating it in your project the build failed.

ajv is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • continuous-integration/travis-ci/push The Travis CI build could not complete due to an error Details

Release Notes v5.4.0

Option logger to disable logging or to specify a custom logger (#618, @meirotstein).

Commits

The new version differs by 20 commits.

  • f336cda 5.4.0
  • 00be319 Merge branch 'meirotstein-master'
  • 89a80ca check that console has log, warn and error methods as well
  • 5ba22a3 remove Logger class, resolve logger object on Ajv ctor instead
  • e0c7eac create logger instance per Ajv instance
  • 4cdfcaa Merge branch 'master' into master
  • 4fe1c21 update readme with logger option
  • ceb552a logger option tests
  • b0e28ee logger component tests
  • 91374ac add logger option
  • cdd93a6 Merge pull request #621 from stuartpb/patch-1
  • 0196611 Update draft-06 meta-schema
  • 0cafcf6 docs: version 6 beta
  • c73ff44 Merge pull request #616 from kpping/patch-1
  • bddda60 make available types more noticeable

There are 20 commits in total.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Fix deprecation warning, _headers is deprecated in Node 12

(node:34152) [DEP0066] DeprecationWarning: OutgoingMessage.prototype._headers is deprecated
    at new Response (/Users/matteo/Repositories/fastify-static/node_modules/light-my-request/lib/response.js:16:17)
    at /Users/matteo/Repositories/fastify-static/node_modules/light-my-request/index.js:86:19
    at new Promise (<anonymous>)
    at inject (/Users/matteo/Repositories/fastify-static/node_modules/light-my-request/index.js:84:12)
    at /Users/matteo/Repositories/fastify-static/node_modules/fastify/fastify.js:292:21
    at processTicksAndRejections (internal/process/task_queues.js:89:5)

The user-agent and content-type headers should not be filled in automatically when a flag is set in the options

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Many tools, such as curl, do not automatically fill in these parameters, so during testing of services it is necessary to disable the automatic filling of these headers in order to test the response of the service to their absence.

curl -X POST http://localhost:3000/route --body "{}"

I suggest adding a parameter to the settings that would regulate this behavior.
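
A sketch of what the proposed option could look like (the flag name is hypothetical and not implemented):

inject(dispatch, {
  method: 'POST',
  url: '/route',
  payload: '{}',
  autoHeaders: false // hypothetical flag: skip the default user-agent / content-type headers
}, (err, res) => {
  console.log(res.payload)
})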

An option to make the callback function called at the beginning of stream response

🚀 Feature Proposal

I noticed that it seems impossible to test stream responses (e.g. Server-Sent Events) with light-my-request.

If there were an option that invokes the callback at the beginning of the response, it would be possible to test the stream data through res.raw.

Motivation

To test stream responses.

Example

Maybe

inject(dispatch, { method: 'get', url: '/', stream: true }, (err, res) => {
  console.log(res.raw)
})

Testing a streamed response using `light-my-request`?

💬 Question here

First off, finally got my company using Fastify, success!

Using light-my-request, is it possible to see a streamed response's individual chunks?

Unfortunately, I can't provide a full example because it would expose too much internal info/workings, but essentially I have a GET endpoint that returns a JSON-SEQ stream, with each chunk being a JSON Text Sequence. Below is the test:

    it('Returns a stream of JSON Text Sequence results', async () => {
      const response = await server.inject({
        method: 'GET',
        url: '/',
        query: {
          boop: 'test',
        },
        headers: {
          accept: '*/*',
        },
      })

      console.log(response.raw.res._lightMyRequest.payloadChunks)

      expect(response.headers).toMatchObject({
        'content-type': 'application/json-seq; charset=utf-8',
      })
      expect(response.statusCode).toBe(200)
      // expect(response.body).toBe('\u001e{"test": "I\'m a test"}\n')
    })

I can see the individual chunks are stored in an array of buffers inside response.raw.res._lightMyRequest.payloadChunks but obviously _lightMyRequest is intended to be private. Is there a nicer way of getting at them?
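
For what it's worth, the API section above documents a res.stream() helper; a sketch of reading the payload through it (whether it preserves the original chunk boundaries is not something this sketch asserts):

const chunks = []
for await (const chunk of response.stream()) {
  chunks.push(chunk)
}
console.log(Buffer.concat(chunks).toString())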

Your Environment

  • node version: 18.*, 20.*
  • fastify version: 4.26.1
  • light-my-request version: 5.12.0
  • os: Windows, Linux

query is typed as string | { [k: string]: string | string[] } but passing a string doesn't work

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

N/A (light-my-request 5.0.0)

Plugin version

No response

Node.js version

16.16.0

Operating system

Windows

Operating system version (i.e. 20.04, 11.3, 10)

Windows 10.0.19044.0 x64

Description

query is typed as string | { [k: string]: string | string[] } but passing a string doesn't work; it gets treated as an object and produces a broken query string.

Steps to Reproduce

  1. Pass a string to url.query, path.query or query.

Expected Behavior

A string gets treated as a verbatim query string.

Missing socket

Hi,
I'm writing some tests for a Fastify plugin that uses a koa handler to manage routes.

The test is failing due to this check in the koa lib:

if (this.socket.encrypted) return 'https';

https://github.com/koajs/koa/blob/master/lib/request.js#L402

Considered that:

  • the final solution would be to implement a native plugin for Fastify
  • light my request says "Does not use a socket connection.."
  • I can use listen instead of inject to write my tests

Do you think adding a this.socket = {} to light-my-request is a bad idea?
How would you cover this use case?
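
A sketch of a user-level workaround in the meantime, assuming koaApp is the koa application being wrapped (names are illustrative):

const inject = require('light-my-request')

const koaHandler = koaApp.callback() // standard koa (req, res) handler

const dispatch = function (req, res) {
  // koa's request.protocol getter reads this.socket.encrypted
  req.socket = req.socket || {}
  koaHandler(req, res)
}

inject(dispatch, { method: 'get', url: '/' }, (err, res) => {
  console.log(res.payload)
})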

Error: res.getHeader is not a function (incorrect TypeScript types)

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.27.0

Plugin version

No response

Node.js version

16.13.2

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

12.1

Description

The TypeScript types for light-my-request indicate there should be a getHeader() method on the LightMyRequest.Response object, because the types indicate it inherits from http.ServerResponse which has this method. However, it appears that it does not include http.ServerResponse in the inheritance chain.

Thanks for taking a look at this!

Steps to Reproduce

The following code is valid according to the TypeScript types:

import fastify from 'fastify'

async function main() {
  const server = fastify()
  const res = await server.inject().get('/')
  res.getHeader('host')
}

main()

However, running the generated JavaScript produces the following error:

test.js:8
    res.getHeader('host');
        ^

TypeError: res.getHeader is not a function
    at main (test.js:8:9)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)

Expected Behavior

Either TypeScript should produce an error to indicate the code is incorrect, or it should run without error.

Headers with value set to null are wrongly emitted as string in the response

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

4.18.0

Plugin version

5.10.0

Node.js version

16.x and above

Operating system

Linux

Operating system version (i.e. 20.04, 11.3, 10)

20.04

Description

Our test suite, based on the test application provided by fastify, is suddenly failing even though we did not change the fastify version.

This is likely because of this change:

23f4fa0

Before, setting a response header value to null in the route handler was successfully propagated as null in the response returned by the fastify test application.

Now, setting a response header value to null in the route handler leads to the string "null" being propagated in the response returned by the fastify test application, making our test suite fail.

This is a breaking change: this library has always supported setting a header to null, and it no longer does.

Steps to Reproduce

Just set a header value to null using the header or headers function of fastify response and check the value of the header in the response returned by the test application. It will be the string "null" instead of the null value.

Expected Behavior

No response

An in-range update of @types/node is breaking the build 🚨

The devDependency @types/node was updated from 12.12.9 to 12.12.10.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

@types/node is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

It cannot be used with pipeline()

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

unknown

Plugin version

unknown

Node.js version

v16.16.0

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

unknown

Description

Under Node.js 16 (v16.16.0), inject() cannot be used with pipeline() at the same time.

Under Node.js 14 (v14.20.0), it works.

Steps to Reproduce

  1. Code.
const { pipeline } = require("stream");

const { inject } = require("light-my-request");

(async () => {
  const resp = await inject((req, res) => {
    const data = `
  <!DOCTYPE html>
  <html>

    <head>
      <meta charset="UTF-8">
      <title>Document</title>
    </head>

    <body>
      <p>basic:ejs</p>
    </body>

  </html>
  `;

    res.statusCode = 200;

    pipeline(data, res, () => res.end());
  })
    .get("/")
    .end();

  console.log(resp.body);
})();
  2. Run it.
  <!DOCTYPE html>
  <html>

    <head>
      <meta charset="UTF-8">
      <title>Document</title>
    </head>

    <body>
      <p>basic:ejs</p>
    </body>

  </html>
  
node:internal/streams/end-of-stream:206
    if (stream.req) stream.req.removeListener('finish', onfinish);
                               ^

TypeError: stream.req.removeListener is not a function
    at cleanup (node:internal/streams/end-of-stream:206:32)
    at pump (node:internal/streams/pipeline:166:5)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)

Expected Behavior

No response

Type definition for Response.json() is inaccurate

🐛 Bug Report

The type definition for the Response interface has a .json() function, which returns object. object in TS is the same as {}, which is not what the json() response value is.

Personally I'd suggest using something like the Json example here: https://www.typescriptlang.org/docs/handbook/release-notes/typescript-3-7.html#more-recursive-type-aliases which is

type Json =
    | string
    | number
    | boolean
    | null
    | { [property: string]: Json }
    | Json[];

Another option would be to copy Node's own JSON.parse() return which is any, or even to return unknown.

To Reproduce

I personally stumbled across this when using fastify.inject and then attempted to destructure a key from the .json() response.

const createResult = await server.inject({ method: 'POST', url: '/create' });
const { id } = createResult.json();

This will error, as it "correctly" tells me that id is not an attribute on {}.

Expected behavior

I'd expect this to be more correct, or at least more loosely typed. As it stands, it's strict and wrong, I believe.

Your Environment

  • node version: 13
  • fastify version: >=2.0.0
  • os: Linux

Cookie support?

I'm interested in this so I can simulate requests that require a session cookie. Thoughts?

An in-range update of tap is breaking the build 🚨

The devDependency tap was updated from 12.5.1 to 12.5.2.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

tap is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Commits

The new version differs by 10 commits.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Version 10 of node.js has been released

Version 10 of Node.js (code name Dubnium) has been released! 🎊

To see what happens to your code in Node.js 10, Greenkeeper has created a branch with the following changes:

  • Added the new Node.js version to your .travis.yml

If you’re interested in upgrading this repo to Node.js 10, you can open a PR with these changes. Please note that this issue is just intended as a friendly reminder and the PR as a possible starting point for getting your code running on Node.js 10.

More information on this issue

Greenkeeper has checked the engines key in any package.json file, the .nvmrc file, and the .travis.yml file, if present.

  • engines was only updated if it defined a single version, not a range.
  • .nvmrc was updated to Node.js 10
  • .travis.yml was only changed if there was a root-level node_js that didn’t already include Node.js 10, such as node or lts/*. In this case, the new version was appended to the list. We didn’t touch job or matrix configurations because these tend to be quite specific and complex, and it’s difficult to infer what the intentions were.

For many simpler .travis.yml configurations, this PR should suffice as-is, but depending on what you’re doing it may require additional work or may not be applicable at all. We’re also aware that you may have good reasons to not update to Node.js 10, which is why this was sent as an issue and not a pull request. Feel free to delete it without comment, I’m a humble robot and won’t feel rejected 🤖


FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

inject hangs when used with Fastify async handler fulfilling undefined

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

4.x but happening in previous versions as well

Plugin version

No response

Node.js version

16

Operating system

Linux

Operating system version (i.e. 20.04, 11.3, 10)

all

Description

This is an issue related to how this library behaves when used with Fastify. Fastify supports async route handlers and it supports those handlers fulfilling the returned promise with undefined (e.g. not returning anything from the handler).

Example:

fastify.get('/', async () => {
  // implicitly returns a promise fulfilled with undefined
})

When .inject is used on such routes, it never returns/resolves, and the reason is that Fastify has special handling for these cases, which is captured [here](https://github.com/fastify/fastify/blob/d3c3ad78060dd356ef174a45c9585fb36643db7e/lib/wrapThenable.js#L20-L21).

You can see that it handles the behavior of these route handlers so that either they do not resolve with undefined, OR a second condition holds, which requires that:

reply.sent === false && reply.raw.headersSent === false && reply.request.raw.aborted === false

The first 2 conditions are satisfied by this library, whereas the third isn't. The fix is fairly straightforward: add an aborted = false property to the Request object exposed by this library. Nonetheless, before doing this fix I thought it would make sense to discuss whether this change should be done here, as it's so specific to how Fastify uses this library.
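
A sketch of the one-line fix described above, placed in the Request constructor (illustrative, not the actual patch):

function Request (options) {
  // ...existing initialization elided...

  // Mirror http.IncomingMessage so Fastify's wrapThenable check
  // (reply.request.raw.aborted === false) passes for injected requests
  this.aborted = false
}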

Steps to Reproduce

See fastify/fastify#4207

Expected Behavior

No response

100% code coverage

🚀 Feature Proposal

This module could easily reach 100% code coverage. There are just 2 lines missing:

----------------------|----------|----------|----------|----------|-------------------|
File                  |  % Stmts | % Branch |  % Funcs |  % Lines | Uncovered Line #s |
----------------------|----------|----------|----------|----------|-------------------|
All files             |    99.15 |    97.48 |    98.04 |     99.1 |                   |
 light-my-request     |      100 |      100 |      100 |      100 |                   |
  index.js            |      100 |      100 |      100 |      100 |                   |
 light-my-request/lib |    98.72 |     96.2 |    97.06 |    98.66 |                   |
  parseURL.js         |      100 |      100 |      100 |      100 |                   |
  request.js          |    98.55 |    98.36 |      100 |    98.51 |               121 |
  response.js         |    98.65 |    85.71 |    95.24 |    98.55 |               130 |
----------------------|----------|----------|----------|----------|-------------------|

Follow redirect

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

When a response has a 301/302 status code, inject should follow the redirects.

Motivation

Ref fastify/help#664

Example

const response = await app.inject({
      url: '/asd',
      method: 'GET',
      followRedirect: true // default max 1 redirect
      // OR
      followRedirect: 10 // follow max 10 redirect
    });

handle errors emitted through connection.destroy()

Unfortunately the current interface does not allow us to surface those errors.

I propose that we change:

  1. from inject(data, callback(res)) to inject(data, callback(err, res))
  2. from inject(data).then(res) to inject(data).catch().then().

Provide RESTful and chainable apis to make things easier

🚀 Feature Proposal

Provide direct APIs for every HTTP method, plus others like header and body.

Motivation

Currently we use properties in the options object passed to inject to tell light-my-request what the request URL is, what the method is, and some other things.

I think it would make coding simpler if we provided standalone APIs for such things, and made them chainable. As a result, the options object can be omitted, and the callback can be moved to the last method of the chain.

Example

const inject = require('light-my-request')

const dispatch = (req, res) => {
  // some code
}

inject(dispatch)
  .get('/test')
  .header({ foo: 'bar' })
  .end((err, res) => {
    // some debug code
  })

inject(dispatch)
  .post('/test')
  .body({ foo: 'bar' })
  .end((err, res) => {
    // some debug code
  })

// you can follow the old way as well
inject(dispatch, { method: 'get', url: '/test' }, (err, res) => {
    // some debug code
  })
