webpro / dyson
Node server for dynamic, fake JSON.
Is there a way to generate endpoints from a Swagger definition file?
Hi.
The documented way to handle status codes in the endpoint config doesn't work. A few points:
res.send(XXX).body(YYY)
throws a "is not a function" error.
res.status(XXX).send(YYY) or res.status(XXX).end(YYY) works, but raises an error in Node: the famous Error: Can't set headers after they are sent.
As the default render function is just a facade for res.send(), we can use it to send the response with the expected status. Example:
render: function(req, res) {
  if (req.params.id === "999") {
    return res.status(404).end('Feature not found');
  }
  res.send(200, res.body);
}
I can send a PR to fix this issue too :)
Is it possible to persist changes to data so I can mimic an update feature and return the updated data in another request?
Parsing the config fails in loader.js:23 if a property with a null value is found in the config object: TypeError: Cannot use 'in' operator to search for 'path' in null. It seems a check that obj[key] is not null is needed.
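A minimal sketch of the missing guard (this is not dyson's actual loader code; the property name 'path' follows from the error above):

```javascript
// Sketch of the guard: `'path' in value` throws a TypeError when `value`
// is null, because `in` requires an object, so check for null first.
// Note that `typeof null` is 'object', which is why the explicit null
// check is what prevents the "Cannot use 'in' operator" TypeError.
function isEndpointConfig(value) {
  return value !== null && typeof value === 'object' && 'path' in value;
}
```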
I seem to have hit a maximum number of endpoints supported by dyson, though I don't see it documented anywhere.
Is there a way to override this limit? I can rename the 26th file so it sorts before another alphabetically, and then it works, but then the file that has become the 26th is no longer available.
I tried using the PATCH method with dyson. I created a new folder named patch and copied the dummy from put, but when calling it I get:
Method PATCH is not allowed by Access-Control-Allow-Methods.
(in my browser console; dyson doesn't show any error)
Is there a simple way to add PATCH (and maybe other useful methods, such as HEAD) to dyson?
Thanks!
This request is based on the discussion that took place in Issue #45.
I'd like dyson to have a configuration option that would allow part of the requests to the same resource to be proxied randomly, while the rest generate a mocked (locally defined) response.
For example, if the same resource URL GET /api/Clients/ is requested 3 times, it gets proxied to the remote API twice, and one time dyson generates a mocked response from the template or callback: functionName. I'd like to proxy most of the requests to the remote API, and generate ugly errors once in a while.
Possible reasons for such configuration could include:
In my opinion, such a setting should be available at the individual resource config level, and the probability could be an integer, say between 1 and 10, to avoid playing with decimals and make the setting clearer to users. Perhaps simply allow the proxy value to be true (proxy all requests), false (proxy no requests; all responses generated locally) or a number representing the likeliness that a request will be proxied.
Express does, as long as you have the .crt and the .key files
I'll probably do a PR soon to accept those in the options.
Hi again! I'm using your module a lot, and now I'm having problems mocking a server status 500 error for my tests.
I built this mock: GET /hello
{
  path: "/hello",
  cache: false,
  collection: false,
  render: function(req, res) {
    res.status(500).json({});
  }
}
But even with the server returning status 500, I can't get the request error when I run this code...
var request = require("request");
app.get("/another-hello", function(req, res) {
  request.get("/hello", function(err, resp, body) {
    if (err) console.log(err); // I can't get this error....
    //...
  });
});
Do you know another way to mock a server error?
api.127.0.0.0.1:portnumber/users
and not just /users
Thanks!
I'm having issues trying to proxy application/json content to a remote API.
module.exports = {
  path: '/api/Upload/Chunked/Files',
  proxy: true,
  cache: false
};
POST http://localhost:3000/api/Upload/Chunked/Files
Authorization: Bearer {...bearer...}
Content-Type: application/json
{
"JobID": 24269,
"FolderID": 0,
"Filename": "test.png",
"FileSize": 50889,
"NumChunks": 1
}
500 INTERNAL SERVER ERROR: https://{...domain...}:443/api/Upload/Chunked/Files
and in the calling application (tried with DHC by Restlet Chrome plugin and HTTPRequester Firefox plugin) I see the following response:
500 Internal Server Error
X-Powered-By: Express
Vary: Origin
access-control-allow-credentials: true
Date: Thu, 05 Nov 2015 23:22:44 GMT
Connection: keep-alive
Transfer-Encoding: chunked
The interesting thing is that content sent as "application/json" is received at the remote API as an empty binary (0-length binary content). When the same content is sent as "text/json", everything works as expected (body content filled with data, no 500 errors).
I'm using the latest versions of Node.js and dyson (installed today) on Mac. I'm not having issues proxying other requests (GET, PUT, etc.).
Thanks in advance for your suggestions and comments.
Hi, is there a way to run dyson as a mock API with mocha + supertest? Can I require("dyson") inside my test code?
I have an API which consumes other APIs, and I'd like to build mocks for these APIs using dyson for test purposes.
I was wondering if there was a way (or plans?) to allow the template to be raw data, as opposed to an object.
Because of poor server-side code, I need to support an endpoint that returns a raw boolean, not wrapped in a JSON object. (Basically, I want to specify template: true.) Can dyson do this?
Hi, I would like to know if there is any way to specify the HTTP method in the endpoint configuration instead of assigning a specific method to a folder, because I would like to group code by entity rather than by method (e.g. orders.js implementing an "orders" service with GET/POST/PUT, clients.js being the "clients" service with the same methods...).
Don't want to bother sending this as an issue so feel free to close this mercilessly if this is not the appropriate place to ask this question.
Hello.
First of all, congrats for this awesome tool.
I was trying to use Chance or Faker instead of dyson-generators, but no luck.
With Chance, I could make it work, but it generates the same data for each child when it's a collection.
With Faker, it throws an error when I start dyson.
Can you share example code for each one, please?
Thanks.
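For what it's worth, the "same data for each child" symptom usually means the template value was computed once at config-load time instead of per item. A self-contained illustration of the difference (this is not dyson's source, just the mechanism):

```javascript
// Toy renderer: template values that are functions are called once per
// item, while plain values were computed once when the config loaded.
function renderCollection(template, size) {
  var items = [];
  for (var i = 0; i < size; i++) {
    var item = {};
    Object.keys(template).forEach(function(key) {
      var value = template[key];
      item[key] = typeof value === 'function' ? value() : value;
    });
    items.push(item);
  }
  return items;
}

var counter = 0;
var template = {
  fixed: ++counter,                        // evaluated once: identical in every item
  fresh: function() { return ++counter; }  // evaluated per item: differs
};
var result = renderCollection(template, 3);
// result[0].fixed === result[1].fixed, but result[0].fresh !== result[1].fresh
```

So with Chance, wrapping the call in a function (e.g. name: function() { return chance.name(); }) should give a different value per item, assuming dyson calls template functions when rendering each item.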
While you can simulate server-side errors:
res.status(400).send({"error": "invalid_grant"});
res.status(500).send("Internal server error");
...you can't simulate "connection interrupted/dropped", "connection timeout", "slow network" and similar server/network failure scenarios. It would be an awesome addition to Dyson. Useful especially in situations where you are mocking or testing applications that are not executed in Web browsers (where some of these errors can be simulated using e.g. Chrome developer tools).
Maybe integrating with or implementing similar ideas as in Toxy project? ;)
It would be nice if collection could be a function, allowing the same file to render both a collection and an individual member.
Hey, thanks for the quick fix in the other bug. I'm trying to make dyson work for the following use case: I want to share the same generated value between keys. For example, I have a src property generator that depends on a name being fed to it, ideally the same name that was used for the name property. I also need different selected names per item in the collection.
var games = function () {
  var name = function () {
    return _.random(gameNames.length - 1);
  };
  return {
    path: '/games',
    cache: false,
    collection: true,
    template: {
      name: name(),
      src: sg.src(name())
    }
  };
}();
Any thoughts on whether or not this is something I should be trying to achieve with dyson?
Hi there!
I can't get the delay parameter to work; it looks like a bug to me.
I tried with the following:
var myDataCollection = {
  path: '/services/collection',
  collection: true,
  cache: false,
  template: myData.template,
  delay: 3000,
  container: {
    data: {
      items: function(params, query, data) {
        return data;
      }
    }
  }
};
module.exports = [myData, myDataCollection];
Thanks!
I've attempted to set up a proxy exactly as in the examples, i.e.:
{
  "proxy": true,
  "proxyHost": "http://this.is.my.site.com",
  "proxyPort": "3001"
}
I put this in a dyson.js file located in my base directory (I ran dyson init REST and put the dyson.js file in npm/REST/). It still initializes the server to listen on port 3000, and ignores the proxy information.
Any help? I'm on version 0.2.1.
bodyParser has a very conservative limit of 100kb.
Is there any way to configure this value from dyson?
Hi,
in my current scenario I'd like to use a Connect app as a static file server (via grunt-contrib-connect) and create a fake API endpoint at /api.
I was wondering if it's possible to use dyson as Connect middleware. This way I won't have to use plugins like grunt-concurrent, keeping the stack more lightweight.
Hello,
the README file says to do the following to init and run a new dyson project:
npm install dyson -g
dyson init [dir]
dyson [dir] 3000
When you try to run the project with
dyson [dir] 3000
you will get the error
Error: Cannot find module 'dyson'
The problem here is that dyson is installed with -g (globally), and globally installed modules cannot be required like on line 1 of /dir/get/dummy.js:
var g = require('dyson').generators;
The solution is for users to add dyson to devDependencies (in package.json), or just to install dyson without the -g flag.
Best,
Milos
Hi, I'm trying to figure out how to use collections in POST. Is it possible for you to write an example?
This is such a great tool and one I'm often finding new uses for in my testing.
I hadn't checked for updates in a while since dyson has been working great without any major issues, but this evening I decided to update it with a routine:
$ npm update -g dyson
But now I'm running into a build error when the mmmagic dependency is "updated/installed" to [email protected]. The exact error, followed by other specifics, is below.
I've even tried a complete removal and fresh installation of dyson, but as with the update, it fails at the [email protected] dependency.
Anyone else running into this issue?
gyp ERR! System Windows_NT 6.1.7601
gyp ERR! command "node" "C:\\Program Files\\nodejs\\node_modules\\npm\\node_modules\\node-gyp\\bin\\node-gyp.js" "rebuild"
gyp ERR! cwd C:\Users\swade\AppData\Roaming\npm\node_modules\dyson\node_modules\dyson-image\node_modules\mmmagic
gyp ERR! node -v v0.8.19
gyp ERR! node-gyp -v v0.8.4
gyp ERR! not ok
npm ERR! [email protected] install: `node-gyp rebuild`
npm ERR! `cmd "/c" "node-gyp rebuild"` failed with 1
npm ERR!
npm ERR! Failed at the [email protected] install script.
npm ERR! This is most likely a problem with the mmmagic package,
npm ERR! npm -v 1.2.10
Hello, I'm using dyson 0.7.0 and I am not able to use the container object to wrap my response.
For example, the endpoint below returns an empty result when accessing http://localhost:3000/jobs
Can you please check? Thanks
module.exports = {
  path: '/jobs/:id?',
  collection: true,
  template: {
    id: g.id,
    title: 'Freelancer belly-dancer in Abu Dhabi with a very professional skills',
    date: '21/07/13',
    salary_range: '20,000 - 29,000 AED/month',
    location: 'Abu Dhabi, a place far far away from Dubai and Sharjah',
    employment_type: 'Full Time',
    benefits: 'commissions',
    min_work_experience: '0-1 Years',
    min_education_level: 'High-School / Secondary School',
    company_size: '11-50 Employees',
    career_level: 'Junior',
    description: 'Please read the details bellow carefully: bla bla bla bla bla bla',
    owner: '1',
    status: 'OK'
  },
  container: {
    jobs: function(params, query, data) {
      return data;
    }
  }
};
@fizerkhan, in #2 you mentioned:
The structure of the project is not quite easy to understand. It has to be changed like 'express' library.
What exactly do you think is unclear? I would be glad to explain things, or to see how I can implement your suggestions as improvements.
Here is some explanation anyway (it might help): Express is the foundation of dyson. The callback function in each service template is actually middleware for Express (which I think is how Express should be used). This middleware from dyson for GET requests has a default implementation, generateResponse() (in /lib/get/defaults.js). It is registered with Express in registerServices() (in /lib/dyson.js).
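A toy version of that flow (not dyson's actual source; the function names mirror the ones mentioned above, and the stand-in app object just records routes so the wiring is visible without dependencies):

```javascript
// generateResponse() analogue: resolve template values, calling functions
// with the request params, and return the resulting data object.
function generateResponse(config, req) {
  var data = {};
  Object.keys(config.template).forEach(function(key) {
    var value = config.template[key];
    data[key] = typeof value === 'function' ? value(req.params) : value;
  });
  return data;
}

// registerServices() analogue: each config's path gets Express-style
// middleware that builds and sends the response.
function registerServices(app, configs) {
  configs.forEach(function(config) {
    app.get(config.path, function(req, res) {
      res.json(generateResponse(config, req));
    });
  });
}

// Stand-in for the Express app, recording registered routes:
var routes = {};
var app = { get: function(path, handler) { routes[path] = handler; } };
registerServices(app, [
  { path: '/user/:id', template: { id: function(params) { return params.id; } } }
]);
```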
http://dyson.jit.su/user?param=,
crashes the server; I get a 502 with any other requests afterwards...
... well, for some reason it's back up and doesn't crash anymore after trying a few minutes later. I'm confused. The server still hangs on that URL, it just no longer stops responding to valid URLs. Can anyone confirm? It still happens for me locally.
Resolving response for /api/server?limit=15&offset=0&search= (cached)
Resolving response for /api/server?limit=15&offset=0&search= (cached)
404 NOT FOUND: /api/
Resolving response for /api/server?limit=15&offset=
undefined:0
^
SyntaxError: Unexpected end of input
at Object.parse (native)
at IncomingMessage.<anonymous> (C:\Users\Alexander\AppData\Roaming\npm\node_modules\dyson\lib\multiRequest.js:41:32)
at IncomingMessage.EventEmitter.emit (events.js:117:20)
at _stream_readable.js:920:16
at process._tickCallback (node.js:415:13)
Hi.
On startup there is a message:
static-icon deprecated; switch to module serve-favicon
Also, if I run the example from the readme, res.send(401, 'Forbidden');, I get a message that this is deprecated and that res.send(401).body('Forbidden'); should be used instead.
So I need to format JSON to work with my Ember app; it should look something like this:
{
"jobs": [
{
"id": 1,
"title": "Freelancer belly-dancer in Abu Dhabi with a very professional skills",
"date": "21/07/13",
"salary_range": "20,000 - 29,000 AED/month",
"location": "Abu Dhabi, a place far far away from Dubai and Sharjah",
"employment_type": "Full Time",
"benefits": "commissions",
"min_work_experience": "0-1 Years",
"min_education_level": "High-School / Secondary School",
"company_size": "11-50 Employees",
"career_level": "Junior",
"description": "Please read the details bellow carefully: bla bla bla bla bla bla",
"owner": "1",
"status": "OK"
},
{
"id": 2,
"title": "FANTASTIC Sales Lady required for a fashion accessories store",
"date": "21/07/13",
"salary_range": "20,000 - 29,000 AED/month",
"location": "Abu Dhabi, a place far far away from Dubai and Sharjah",
"employment_type": "Full Time",
"benefits": "commissions",
"min_work_experience": "0-1 Years",
"min_education_level": "High-School / Secondary School",
"company_size": "11-50 Employees",
"career_level": "Junior",
"description": "Please read the details bellow carefully: bla bla bla bla bla bla",
"owner": "1",
"status": "OK"
}
]
}
and I'm trying to achieve this with the following configuration:
module.exports = {
  path: '/jobs/:id?',
  template: {
    jobs: [{
      id: function(params) {
        return params.id || 1;
      },
      title: 'Freelancer belly-dancer in Abu Dhabi with a very professional skills',
      date: '21/07/13',
      salary_range: '20,000 - 29,000 AED/month',
      location: 'Abu Dhabi, a place far far away from Dubai and Sharjah',
      employment_type: 'Full Time',
      benefits: 'commissions',
      min_work_experience: '0-1 Years',
      min_education_level: 'High-School / Secondary School',
      company_size: '11-50 Employees',
      career_level: 'Junior',
      description: 'Please read the details bellow carefully: bla bla bla bla bla bla',
      owner: '1',
      status: 'OK'
    }]
  }
};
but it doesn't generate the id for some reason. Am I doing something wrong? I tried using container: as well, but didn't have any success.
Do you have any recommendations for integrating a dyson server with Connect?
I'm hoping to combine these...
http.createServer(app).listen(port);
dyson.bootstrap({
  configDir: "services",
  port: 3000
});
To get dyson to work with the default Ember RESTAdapter, I need dyson to return rooted JSON responses. I've read the docs but may have missed something... is there a way to respond with this:
# GET /users/8
{
"user": {
"id": 8,
"first": "David",
"last": "Tang",
"pets": [ 1, 2, 3 ],
"company": 7
}
}
# GET /users
{
"users": [
{
"id": 8,
"first": "David",
"last": "Tang",
"pets": [ 1, 2, 3 ],
"company": 7
},
{
"id": 9,
"first": "Jane",
"last": "Doe",
"pets": [ 4 ],
"company": 7
}
]
}
instead of this:
# GET /users/8
{ "id": 8, "first": "David", "last": "Tang" }
# GET /users
[
{ "id": 8, "first": "David", "last": "Tang" },
{ "id": 9, "first": "Jane", "last": "Doe" }
]
Is there a way to run dyson during a grunt build? I used concurrent during our grunt serve and it always gets hung up on the dyson [dir] command (I'm using grunt-shell to run the command).
Any hints?
On line #78 in lib/response.js, if _.isObject is satisfied after the initial check for a function, setValues calls itself again. This does not achieve the desired result if you pass an array as your value (and wanting the value to be an array of items is very plausible).
obj[key] = _.isFunction(value) ? value.apply(scope || obj, params) : _.isObject(value) && !when.isPromise(value) && !_.isArray(value) ? setValues(value, params, obj) : value;
My proposed fix adds one more check to see whether the value is an array.
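A reduced, synchronous version of that logic (no promise handling, not the actual lib/response.js) shows the effect of the extra guard:

```javascript
// With the _.isArray-style guard, arrays fall through to the plain-value
// branch instead of being recursed into as objects.
function setValues(obj, params, scope) {
  Object.keys(obj).forEach(function(key) {
    var value = obj[key];
    obj[key] = typeof value === 'function'
      ? value.apply(scope || obj, params)
      : (value !== null && typeof value === 'object' && !Array.isArray(value))
        ? setValues(value, params, obj)
        : value;
  });
  return obj;
}
```

Here a value like [1, 2] is kept intact, while nested plain objects are still resolved recursively.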
Please add an option for turning off multiRequest for developers who want to use commas as commas. Alternatively, please allow developers to change the delimiter from a comma to some other value.
Use cases:
A developer wants to allow real commas in the URL, for example:
/myurl/search?q=user,data,has,commas
Thanks
I made a quick writeup about dyson and JSONP. I can extend the documentation with a section based on this example. Is that OK with you?
While you can delay a response in a resource config file:
setTimeout(function() {
  res.status(400).send({"error": "I am a delayed response"});
  next();
}, 10000);
...there's no simple way to delay a proxied response. It would be awesome to have such an option in dyson.
A delay could be treated as a random value with limits, for example:
module.exports = {
  path: '/api/clients',
  proxy: true,
  proxyResponseDelay: [0, 120] // Delay between 0 and 120 seconds
};
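Picking a concrete delay inside the proposed range would be trivial; for example (proxyResponseDelay itself is hypothetical, this only shows the [min, max] interpretation):

```javascript
// Pick a uniformly random delay (in seconds) within the configured limits.
function pickDelay(range) {
  var min = range[0];
  var max = range[1];
  return min + Math.random() * (max - min);
}

var delay = pickDelay([0, 120]); // somewhere between 0 and 120
```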
Or am I stuck with localhost:3000?
My example use case is the following: I use dyson to mock the API server before it exists and progressively proxy requests to the real server as it becomes implemented. Eventually, I still need dyson in order to support e2e testing by simulating edge cases that cannot be triggered on the server under normal operation (most notably unexpected server errors).
For that, I need to be able to decide for each request whether dyson should respond or proxy. I think a flexible solution would be an option in each config that decides whether the config should be used for a given request. If not, another config matching the route could be used or, depending on environment config, the request could be proxied to another server.
Here's an example that illustrates what, ideally, I'd like to be able to achieve:
[
  // Special case
  {
    path: '/login',
    // decision function
    // - use the config if it returns true, else skip
    // - accepting a promise as return value could unlock more applications
    match: function(req) {
      return req.params.username === 'please-crash';
    },
    template: {
      'type': 'http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html',
      'detail': 'Status failed validation',
      'status': 500,
      'title': 'Internal Server Error'
    },
    // would also be nice
    status: 500,
    headers: {
      'Content-Type': 'application/problem+json'
    }
  },
  // We can have multiple (exclusive) special cases
  {
    path: '/login',
    match: function(req) {
      return req.params.username === 'network-error';
    }
    // ...
  },
  // Default response
  // - used as a stub for development
  // - eventually proxied to real API
  {
    path: '/login',
    proxy: true,
    template: {
      token: '...'
      // ...
    }
  }
]
The concept of a decision/match function is easy to grasp, and one single option could thus enable many use cases. Those already identified include my need to decide whether or not to proxy based on the request, and the one expressed by @pjacekm in #48.
This option would also be a step toward a more flexible organization of definitions in dyson, allowing them to be grouped by concern (or scenario) rather than by endpoint topology. To make that possible, though, there would need to be a notion of priority between definitions. Since introducing an explicit priority option would be cumbersome to manage (remember z-index...), I think simply stating that definitions with a match function are registered before definitions without one would be a good trade-off.
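The proposed selection rule could be sketched like this (none of it exists in dyson; it just restates the idea in code):

```javascript
// Configs with a match() function are tried first, in order; the first one
// whose match(req) returns true wins. A config without match() is the default.
function selectConfig(configs, req) {
  var special = configs.filter(function(c) { return typeof c.match === 'function'; });
  var fallback = configs.filter(function(c) { return typeof c.match !== 'function'; });
  for (var i = 0; i < special.length; i++) {
    if (special[i].match(req)) {
      return special[i];
    }
  }
  return fallback[0] || null;
}
```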
First off... great project. I'd like to start using dyson for faking JSON data in my prototypes, but I am missing something in what should be a simple workflow.
It may very well be user error, but I've installed globally and can run the demo just fine.
I can init directories in my current projects, and even fire up dyson, to receive:
Dyson listening at port 3000
When I hit localhost:3000/dummy or whatever config objects I've placed in the directory, I get:
Cannot GET /dummy
with no exceptions thrown. Quiet console.
I'm going to dig in and figure this out, but thought I'd log this for others and/or in case you know off the top of your head what I might be missing.
Hi, I am trying to mock an OPTIONS call; however, the fake server does not register it during initialization. I tried:
module.exports = {
  'path': '/user',
  'method': 'OPTIONS',
  'proxy': false,
  'cache': false,
  'template': userOptions
};
However, if I change to GET or any other method it works. So, how can I register a handler for an endpoint OPTIONS?
Thanks in advance!
My web application assumes that the API services run on a different server/port than the application itself, and uses JSONP to bypass the same-origin rule. Is there a way to make dyson wrap JSON responses in a callback function provided by the client?
It doesn't seem that the function that determines the size of a collection gets passed any query parameters. It'd be nice if a request to /api/users?count=20 could make the count query parameter accessible from the size function.
module.exports = {
  path: '/api/users',
  collection: true,
  size: function(params, query) {
    console.log(this, arguments);
    console.log(query);
    if (query != null && query.count != null) {
      return query.count;
    } else {
      return 50;
    }
  },
  template: {
    id: g.id,
    name: g.name
  }
};
Maybe you can - and I'm just missing something.
Thanks!
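One detail worth noting if this lands: query string values arrive as strings, so a count like the one above would need coercing before being used as a size. For example:

```javascript
// Coerce ?count=20 (a string) to a number, falling back when absent/invalid.
function sizeFromQuery(query, fallback) {
  var n = parseInt(query && query.count, 10);
  return isNaN(n) ? fallback : n;
}
```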
(This is a follow-up of a discussion in #45)
In certain situations, proxied requests can fail with 404 (when the remote API is behind a reverse proxy) or 500 (when the connection to the remote API is made over HTTPS) errors. The common factor in those errors is the Host header.
The Host header is added by the party making the request, and is mandatory according to the HTTP specs. The problem is that dyson currently just sends all the original request headers (in which Host points to the dyson server) indiscriminately to the target endpoint. This causes two issues. On dyson's side, Node's https module chokes when it verifies the certificate identity because (I think) it relies on the Host header to determine who we think we are talking to. On the other end, the Host header is needed by the reverse proxy (e.g. nginx) to determine the actual target server. When the Host header contains a forwarded host name, the reverse proxy can't find the destination host based on the request's Host header, and returns a 404 error.
A current workaround is to replace this line with the following:
headers: require('lodash').omit(req.headers, ['host'])
...which is not an ideal solution.
The issue can be treated in two ways:
module.exports = {
  path: '/api/clients',
  proxy: true,
  proxyHeaders: {
    'host': 'www.myHost.com',
    'X-My-Always-Added-Header': 'value'
  }
};
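A dependency-free equivalent of the lodash workaround above, extended with the proxyHeaders idea (the option itself is hypothetical):

```javascript
// Forward all original request headers except Host, then apply any
// configured overrides/additions on top.
function buildProxyHeaders(reqHeaders, proxyHeaders) {
  var headers = {};
  Object.keys(reqHeaders).forEach(function(key) {
    if (key.toLowerCase() !== 'host') {
      headers[key] = reqHeaders[key];
    }
  });
  Object.keys(proxyHeaders || {}).forEach(function(key) {
    headers[key] = proxyHeaders[key];
  });
  return headers;
}
```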
@fizerkhan, in #2 you mentioned:
More importantly, the generator code is not enough for fake data. You can use FakerJS https://github.com/marak/Faker.js/
Thanks for the suggestion! I might add it to the project as a dependency. You can actually use it directly yourself, by adding it as a direct (dev)dependency to your project:
var Faker = require('./Faker');

module.exports = {
  path: '/user/:id',
  template: {
    id: function(params) {
      return params.id;
    },
    name: function() {
      return Faker.Name.findName();
    }
  }
};
Hey, I noticed there hasn't been much recent development. Are you wanting help adding new features, or are you just trying to keep it light? I'm mostly just curious what you're thinking.
Currently template functions receive only params, query, body, and cookies. It would be great to also expose the request headers to them. I have an example use case where I use dyson to back my client e2e tests, and I need to select a response based on the Authorization header.
In my fork, I've just added a fifth parameter for the headers, and it works. Unfortunately, that makes a lot of (potentially unused) arguments, which is not so great... In my opinion, it would be cleaner to expose an object wrapping these 5 arguments, or even the request object itself... even though I see how that could be considered hazardous. Unfortunately, those 2 solutions would be breaking changes for existing code bases. Maybe as an option? From what I've found, all calls are made in the same file, so that would probably be doable.
Let me know what you think of the idea; I can work on a PR if needed.
At least in 0.10.0, overriding status, as in the example code and demos, no longer sends along the generated output from the template. I've tried a few things (like res.send(200, res.body)) but I'm not receiving anything. The only thing that seems to work is overriding render instead.
If there's a new preferred way to allow data to be passed along, can the documentation be updated?
Edit: This update killed it: 03d4829
Calling return res.end(); on line 19 of lib/response.js basically keeps any default behavior from running past that point. Was there a reason for this? Is it a typo?
How can I make dyson accessible outside localhost?
If the endpoint is not configured dyson should proxy the request to the real API.
This allows you to develop with both real and fake data.