
node-csv-parse's Introduction

The CSV parser for Node.js has been merged into the Node.js CSV monorepo!

You can find the latest version of the source code inside the Node.js CSV repository, where it will continue to be developed.

Please report any bugs to the Node.js CSV issue tracker directly.

The approach used to merge this repository into the monorepo is described in this article.

An archive of the source code before the merge is available here.

node-csv-parse's People

Contributors

aghost-7, ajaz-ur-rehman, ajitkaller, bells17, dhull, drl-max, ginden, hakatashi, khorwood, kibertoad, markstos, mcauser, odykyi, pdehaan, raghavendhrak, rauno56, rmm5t, rschmukler, ryota-ka, sergkudinov, thealien, vamzz1, varunoberoi, vityas-off, wdavidw, willfarrell, winton, x3cion, yeiniel, yonilerner


node-csv-parse's Issues

How to get Parsing counters

Hi,

I am using the package to read CSV files. I want to know how many lines were read and, if any line could not be parsed, its row number.
Below is my code:
var fs = require('fs');
var csv = require('csv');

var filename = __dirname + '/../../Excel/Order1.csv';

var inputCSVFileStream = fs.createReadStream(filename);

var outputArr = [];

var csvParseOptions = {
  auto_parse: true, // ensures that numeric values remain numeric
  columns: null, // setting this to true will create an array of objects, otherwise it will be an array of arrays rows[(columns[])].
  delimiter: ',',
  /*quote: '',
  relax: true,
  rowDelimiter: 'auto', // this is an issue, I had to set the \n here as 'auto' wasn't working, nor was 'windows'.  Maybe look at auto-detecting line endings?
  skip_empty_lines: false*/
};

var csvParser = csv.parse(csvParseOptions);

csvParser.on('error', function(error) {
  throw error;
});

csvParser.on('readable', function() {
  var record;

  while (record = csvParser.read()) {
    if (!csvParser.lines) {
      // placeholder:  This is the column header, add additional column headers here
    } else {
      // placeholder:  This is a row of data, add the appropriate field data here aligning with new column headers
    }

    outputArr.push(record);
  }

});

csvParser.on('finish', function() {});

csvParser.on('end', function() {
  convertToOrderObject(outputArr);
  zipcodeBasedsorting();
});

inputCSVFileStream.pipe(csvParser);

Basically I want the parser to ignore any line it couldn't parse and keep going. At the end of the process I also want counters such as the number of rows read and the number of rows that could not be parsed.
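A minimal sketch of one way to get such counters, assuming a csv-parse version that supports the skip_lines_with_error option and emits a 'skip' event for rejected lines (both are assumptions about the installed version): count parsed records yourself and count skipped lines in the 'skip' handler.

var fs = require('fs');
var parse = require('csv-parse');

var parsed = 0;
var skipped = 0;

var parser = parse({ delimiter: ',', skip_lines_with_error: true });

parser.on('readable', function () {
  var record;
  while ((record = parser.read())) {
    parsed++;
  }
});

parser.on('skip', function (err) {
  // assumption: recent csv-parse emits 'skip' for each line it could not parse
  skipped++;
});

parser.on('end', function () {
  console.log('parsed:', parsed, 'skipped:', skipped);
});

fs.createReadStream(__dirname + '/../../Excel/Order1.csv').pipe(parser);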

Error on delimiter within comment line

The following example shows how a CSV file with a double-quote within a comment line is not parsed correctly. Removing the comment lines solves the error.

var fs = require("fs");
var path = require("path");
var parse = require("csv-parse");
var util = require("util");

// write a file
var testFile = path.join(__dirname, "testdata.csv");
var csv =  "Region,Site,Rack,Name,IP\n" +
"\"Alaska\",\"Site1\",\"Rack1\",\"RTU-1\",\"192.168.1.3\"\n" +
"\"Alaska\",\"Site1\",\"Rack1\",\"RTU-2\",\"192.168.1.4\"\n" +
"\"Alaska\",\"Site1\",\"Rack1\",\"RTU-3\",\"192.168.1.5\"\n" +
"\"Alaska\",\"Site1\",\"Rack1\",\"RTU-4\",\"192.168.1.6\"\n" +
"\"Alaska\",\"Site1\",\"Rack1\",\"RTU-5\",\"192.168.1.7\"\n" +
"# Comment lines start with a hash sign.\n" +
"# Use a comma to separate fields.\n" +
"# Always use double-quotes to delimit fields.\n" +
"# Empty lines are ignored.\n" +
"# If you want to use a double-quote within a field, use two. E.g. \"Some \"\"special\"\" site\"  (==> Some \"special\" site)\n";
fs.writeFileSync(testFile, csv, "utf8");

// read it again
var input = fs.createReadStream(testFile, "utf8");
var parser = parse({
    auto_parse: false, // don't convert to native datatypes
    columns: null, // return arrays not objects
    comment: "#",
    delimiter: ",",
    escape: "\"",
    quote: "\"",
    rowDelimiter: "\n",
    skip_empty_lines: true,
    trim: true
});
var output = input.pipe(parser);
output.on('error', function(error){
    throw error;
});

output.on('readable', function(){
});

Error in samples/pipe.js

In running through the samples I noticed this error.

var transform = require('../../stream-transform');

stream-transform was not found until I ran npm install stream-transform and then changed the line to this:

var transform = require('stream-transform');

EDIT: this is after I cloned the repo and was working out of the project, not using it in another project via npm yet.

Parsing Non-english data issue

I am trying to upload a CSV file that contains non-English data. Once the file is placed in uploadDir, I read it line by line and save the rows to MongoDB.

But when I read the file, the data ends up as ????????? in the database. I also tried writing the data to another file with the code below:

var fs = require('fs');
var csv = require('csv'); // legacy node-csv API

var path = './mycsv/' + csvFile.path + csvFile.basename;
csv()
    .from.options({
        delimiter: '\t',
        columns: false
    })
    .from.path(path)
    .on('record', function(row, index) {
        if (index != 0) {
            console.log(row[1]);
            var wstream = fs.createWriteStream('./mycsv/test2.csv');
            wstream.write(row[1] + '\n');
            wstream.end();
        }
        //console.log(row);
    })
    .on('end', function() {
        console.log('done parsing');
        cb();
    })
    .on('error', function(err) {
        console.error(err);
    });

but it still prints garbled text such as শà§à¦­ নববরà.

Any help?
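A minimal sketch of the usual fix, assuming the CSV file is UTF-8 encoded: open the read and write streams with an explicit encoding so multi-byte characters are not mangled ('./mycsv/input.csv' and './mycsv/test2.csv' are placeholder paths). For files in other encodings, a decoding step (for example with the iconv-lite package) is needed before parsing.

var fs = require('fs');
var parse = require('csv-parse');

var parser = parse({ delimiter: '\t', columns: false });
var out = fs.createWriteStream('./mycsv/test2.csv', { encoding: 'utf8' });

parser.on('readable', function () {
  var row;
  while ((row = parser.read())) {
    out.write(row[1] + '\n'); // should now keep the original non-English text
  }
});
parser.on('end', function () { out.end(); });

fs.createReadStream('./mycsv/input.csv', { encoding: 'utf8' }).pipe(parser);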

preserve empty rows

csv.parse('a\nb\n\nc', function(err, rows) {
    console.log(rows);
});

results in [ [ 'a' ], [ 'b' ], [ 'c' ] ]

whereas I expected [ [ 'a' ], [ 'b' ], [''], [ 'c' ] ]

Dealing with both single- and double-quotes in data

Currently using this library on a large dataset where some fields contain various combinations of single and double quotes. Some examples:

  • 27-2005, 2007 ADM CODE FIRE EGRESS DEFECTIVE. REMOVE OBSTRUCTING BARS OR UNLAWFUL GATES FROM WINDOW TO FIRE ESCAPE OR PROVIDE APPROVED TYPE GATE .. IN THE 1st ROOM FROM NORTH AT WEST LOCATED AT APT B32, 3rd STORY, 1st APARTMENT FROM WEST AT NORTH , SECTION "BLDG. "B"", 2nd FROM SOUTH AT WEST

  • 27-2005 ADM CODE PROPERLY SECURE THE LOOSE WASH/BASIN'S BASE IN THE BATHROOM LOCATED AT APT 6E, 6th STORY, 1st APARTMENT FROM WEST AT NORTH , SECTION AT EAST

The parser seems to fail on values like these with "Invalid opening quote at line 1".

I've tried modifying the quote and escape options, but nothing has made the parser work on all of them. Am I just doing something wrong?
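A minimal sketch, not a guaranteed fix for every row: the relax option (named relax_quotes in recent releases) tells the parser to keep quotes found inside an unquoted field as literal characters instead of raising "Invalid opening quote". If no field in the dataset is ever quote-wrapped, disabling quoting entirely with quote: false is another option.

var parse = require('csv-parse');

var line = '1,SECTION "BLDG. "B"" 2nd FROM SOUTH AT WEST\n';

parse(line, { delimiter: ',', relax: true }, function (err, rows) {
  if (err) { return console.error(err.message); }
  console.log(rows); // the inner quotes survive as part of the field
});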

Problem with invisible "\r" symbols

I am parsing a simple multi-line CSV file that I exported from a Google Spreadsheet, and I get the following error:

events.js:85
      throw er; // Unhandled 'error' event
            ^
Error: Invalid closing quote at line 96; found "\r" instead of delimiter ","
    at Error (native)
    at Parser.__write (/Users/edgarkuskov/Documents/FoodoWorkspace/foodo-api-server/node_modules/csv-parse/lib/index.js:303:151)
    at Parser._transform (/Users/edgarkuskov/Documents/FoodoWorkspace/foodo-api-server/node_modules/csv-parse/lib/index.js:157:10)
    at Transform._read (_stream_transform.js:179:10)
    at Transform._write (_stream_transform.js:167:12)
    at doWrite (_stream_writable.js:301:12)
    at writeOrBuffer (_stream_writable.js:288:5)
    at Writable.write (_stream_writable.js:217:11)
    at ReadStream.ondata (_stream_readable.js:540:20)
    at ReadStream.emit (events.js:107:17)

Does anyone have an idea how to fix it?
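A minimal sketch of the usual cause and fix, assuming the export uses Windows-style CRLF line endings: tell the parser explicitly that rows end with "\r\n" (the option is rowDelimiter in the version used here, record_delimiter in recent releases), so the stray "\r" is no longer read as data after a closing quote. 'export.csv' below is a placeholder path.

var fs = require('fs');
var parse = require('csv-parse');

var parser = parse({ delimiter: ',', rowDelimiter: '\r\n' });

parser.on('error', function (err) { console.error(err.message); });
parser.on('readable', function () {
  var record;
  while ((record = parser.read())) { console.log(record); }
});

fs.createReadStream('export.csv').pipe(parser);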

Adding promise example

Add a sample of using the library with promise libraries, like the one shown below. The following example reads and parses the entire file in one operation and returns a promise that resolves with the data.

var Promise = require('promise');
var csv = require('csv-parse');
var fs = require('fs');

function parseCSV(file) {
    return new Promise(function (resolve, reject) {
        var parser = csv({delimiter: ','},
            function (err, data) {
                if (err) {
                    reject(err);
                } else {
                    resolve(data);
                }
                parser.end();
            });
        fs.createReadStream(file).pipe(parser);
    });
}

// usage example:

parseCSV('myFile.csv').then(function(data){
    console.log(data); // successful data read;
},function(reason){
    console.error(reason); // error;
})

Double quote within field surrounded by double quotes

First off, great work! The parser works great on another CSV of mine that mixes double quotes in intermittently.

The parser chokes on this line:
"FOOBAR "MICK" SELF","FOOBAR COMPANY"

It spits out this error:
[Error: Invalid closing quote at line 654; found "M" instead of delimiter ","]

All fields in this CSV are surrounded by double quotes.

Here's my code (using Meteor.js and coffeescript; sorry):

future = new Future()
  parse = Meteor.require('csv-parse')
  parse csv, {delimiter: ','}, (error, result) ->
    if error
      console.log "[csv_to_array] error: ", error
    else
      console.log "[csv_to_array] result: ", result
      future["return"](result)
  return future.wait()

Is there a way to get around this? I believe the parser deals with double quotes by default, but apparently not in this form.
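Input like "FOOBAR "MICK" SELF","FOOBAR COMPANY" is not valid CSV (inner quotes should be doubled), so the parser's error is expected. A minimal sketch of one workaround, assuming every field in the file is wrapped in double quotes and no field contains the character sequence ","; it rewrites each line so inner quotes are doubled before handing it to csv-parse (escapeInnerQuotes is a hypothetical helper, shown in plain JavaScript rather than CoffeeScript):

var parse = require('csv-parse');

function escapeInnerQuotes(line) {
  // Strip the outer quotes, double any quotes left inside each field,
  // then re-wrap and re-join the fields.
  return line
    .replace(/^"|"$/g, '')
    .split('","')
    .map(function (field) { return '"' + field.replace(/"/g, '""') + '"'; })
    .join(',');
}

var fixed = escapeInnerQuotes('"FOOBAR "MICK" SELF","FOOBAR COMPANY"');
parse(fixed, { delimiter: ',' }, function (err, rows) {
  console.log(err || rows); // [ [ 'FOOBAR "MICK" SELF', 'FOOBAR COMPANY' ] ]
});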

retrieving by batch of records

It would be nice to add this option for retrieving an array of records at a time, not just a single record.

var parser = parse({delimiter: ':', batch: 1000}) // retrieving by 1000 records
var transformer = transform(function(records, callback){
// array of records, the length of the array is specified by batch option
// the use case is batch inserts into MongoDB for performance (e.g. db.collection.insertMany())
});
input.pipe(parser).pipe(transformer).pipe(process.stdout);
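The batch option shown above is not part of csv-parse; a minimal sketch of the same idea with what already exists, using a small object-mode Transform (batcher is a hypothetical helper) that buffers records and emits them in arrays of the requested size:

var stream = require('stream');

function batcher(size) {
  var buffer = [];
  return new stream.Transform({
    objectMode: true,
    transform: function (record, encoding, callback) {
      buffer.push(record);
      if (buffer.length >= size) {
        this.push(buffer); // emit a full batch downstream
        buffer = [];
      }
      callback();
    },
    flush: function (callback) {
      if (buffer.length) { this.push(buffer); } // emit the final partial batch
      callback();
    }
  });
}

// usage, with db.collection.insertMany() as in the use case above:
// input.pipe(parse({ delimiter: ':' })).pipe(batcher(1000))
//   .on('data', function (records) { db.collection.insertMany(records); });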

Fails to parse newline correctly

Hi, I'm on version 1.0.0 and it doesn't parse my newlines correctly (I think).

Assuming I have a .csv file that contains the following (from a readable fs stream):

Name,Gender,"preferred language",startDate,endDate
Barbara McFinley,Female,JavaScript,2015-13-01,,

my output file would be empty (assuming your standard streaming example)

  • If I insert a newline at the beginning of the file (first line empty) it works correctly
  • If I insert two spaces (really, one doesn't work) on a line before the last empty line, it works correctly

(this is true for LF and CRLF options)

I'm using the atom editor, so it adds a newline at the end of the file, which I think should be enough, shouldn't it?

Correct output:

Name,Gender,preferred language,startDate,endDate
Barbara McFinley,Female,JavaScript,2015-13-01,,

Gist with some code and example file:
https://gist.github.com/JonathanMH/27b2e293cec1647f16c7

Tab delimiter cell with double quotes

Working with the tab delimiter '\t' and running across an issue on cells that contain double quotes.

Example of data

product_sku product_name      brand
3355574001  Plush Dodgers "Puig"    Rally Men

In this example, the Plush Dodgers "Puig" value does not begin with double quotes, nor is the tab character escaped. The parser throws Error: Invalid opening quote at line 9. Is this a bug, or is it intended behaviour since the entire cell is not quoted?

Now it does work perfectly fine if the data looks like

product_sku product_name      brand
3355574001  "Plush Dodgers ""Puig"""    Rally Men

I kind of feel that it may not be a bug so much as improper CSV data. Any thoughts?

Line numbers

After CSV parsing I validate the content of the rows. I would like to give my users a line number with any errors that I find. But unfortunately I cannot count the line numbers myself as empty lines and comment lines are not output by csv-parse.

Would it be an idea to add a line number property on the array that is received in the on("data") event?

pipe mode - handling rows that span across data bursts

Thank you for sharing this package. I am new to this and have a question.

I need to parse a large http response stream (> 1GB) into csv. Looks like the pipe mechanism is a good fit.

I am not sure how Node streams and pipes work, but chances are each burst of input data will not end precisely on an end-of-line (that is, a row may arrive partially in one read block, with the rest in the next).

The question I have: is this package capable of handling such scenarios in the pipes mode (where a row may span across two different bursts of input read)?

Support CSV Dialect Description Format

First off: great module - I and my colleagues use it all the time.

This is a suggestion to adopt the CSV Dialect Description Format: http://dataprotocols.org/csv-dialect/

Implementation would mean using (or aliasing to) the same names for configuration options as CSV DDF, and/or supporting an option that accepts a dialect description object as defined by CSV DDF.

How to pipe transformed and parsed csv to JSON object

Hi, I am trying to parse a csv file like this:

1/03/15 0:15,77.456
1/03/15 0:30,75.76
1/03/15 0:45,78.16

with the option {columns: ['dateTime', 'num']}, and transform the first column into a JavaScript date object using moment.js. So the code looks like this:

transformer = transform(function (record, cb) {
 record.dateTime = moment(record.dateTime, 'DD/MM/YY h:mm').format();
 cb(null, record);
});

And piping the whole thing together using: readStream.pipe(parser).pipe(transformer).pipe(writeStream)
But I'm getting this error:

events.js:85
      throw er; // Unhandled 'error' event
            ^
TypeError: Invalid non-string/buffer chunk

Basically what I want is to

  1. Convert the first column into a correct javascript date object
  2. Invoke parser to create an object
  3. Save the data to the database (MySQL).
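A minimal sketch of why the chain fails and one way around it: the transformer emits plain objects, but a file write stream only accepts strings or buffers, hence "Invalid non-string/buffer chunk". Returning a string from the transform callback (here one JSON document per line; the file names are placeholders) lets the pipe reach the write stream; saving to MySQL would instead be done inside the transform callback.

var fs = require('fs');
var moment = require('moment');
var parse = require('csv-parse');
var transform = require('stream-transform');

var parser = parse({ delimiter: ',', columns: ['dateTime', 'num'] });

var transformer = transform(function (record, cb) {
  record.dateTime = moment(record.dateTime, 'DD/MM/YY H:mm').format();
  cb(null, JSON.stringify(record) + '\n'); // a string chunk, so writeStream accepts it
});

fs.createReadStream('input.csv')
  .pipe(parser)
  .pipe(transformer)
  .pipe(fs.createWriteStream('output.jsonl'));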

how to parse a buffer?

Trying to parse a Buffer into csv with auto columns detection

csv.parse(req.file.buffer, { columns: true }, function (err, data) {
 // outputs `Error: Invalid data argument`
})

Is there a way to parse Buffers without transforming them to strings first?
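A minimal sketch of two workarounds, assuming the buffer holds UTF-8 text: convert it to a string before the callback API, or write the buffer into the parser as a stream chunk (the parser is a Transform stream, and Transform streams accept Buffer chunks).

var parse = require('csv-parse');

// 1. Convert to a string first
parse(req.file.buffer.toString('utf8'), { columns: true }, function (err, data) {
  console.log(err || data);
});

// 2. Or feed the buffer to the parser as a stream
var parser = parse({ columns: true });
var records = [];
parser.on('readable', function () {
  var record;
  while ((record = parser.read())) { records.push(record); }
});
parser.on('end', function () { console.log(records); });
parser.write(req.file.buffer);
parser.end();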

filtering csv data fields

It would be nice to add a feature for filtering CSV data fields while retrieving entries.

e.g.

first_name, middle_name, last_name, city, country

something like this

var parser = parse({delimiter: ':', filter: {fields: [first_name, last_name, country]}})
var transformer = transform(function(records, callback){
  // only specified fields
  // the record object will be
 // {first_name: '', last_name: '', country: ''}
});
input.pipe(parser).pipe(transformer).pipe(process.stdout);
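The filter option shown above is not part of csv-parse; a minimal sketch of the same effect with the existing columns option plus a transform that picks the wanted keys (the records are stringified so they can be piped to process.stdout):

var parse = require('csv-parse');
var transform = require('stream-transform');

var wanted = ['first_name', 'last_name', 'country'];

var parser = parse({ delimiter: ',', columns: true });
var picker = transform(function (record, callback) {
  var out = {};
  wanted.forEach(function (key) { out[key] = record[key]; });
  callback(null, JSON.stringify(out) + '\n');
});

// input.pipe(parser).pipe(picker).pipe(process.stdout);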

wish: document handling of character encodings

The README makes no mention of the character encoding of the file being parsed.

The supported character encodings should be documented, as well as the default encoding expected if no encoding is detected or provided.

Single-word column names produce invalid JSON key names (no quotes)

It seems that when using the columns option, column names with single words (no embedded spaces) produce JSON without quotes (columns with embedded spaces work just fine). I played around with the settings and experimented with passing a function in as a columns argument to add the quote, but that does not seem to work... Here are some captures of what I am seeing:

(screenshots of the parsed output, showing single-word column names rendered without quotes)

Column headers with embedded spaces look fine:

(screenshots of the parsed output, showing quoted keys for column names with embedded spaces)

Please add some samples

Hello,
Can someone add some samples for csv-parse?
I am looking for how to read a CSV file from a path.
I tried to read a file as in the previous node-csv version, but it's not working.

Thanks,
Akshay
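A minimal sketch of reading a CSV file from a path with the current API, by piping a file read stream into the parser ('data.csv' is a placeholder path):

var fs = require('fs');
var parse = require('csv-parse');

var records = [];
var parser = parse({ delimiter: ',', columns: true });

parser.on('readable', function () {
  var record;
  while ((record = parser.read())) { records.push(record); }
});
parser.on('error', function (err) { console.error(err.message); });
parser.on('end', function () { console.log(records); });

fs.createReadStream('data.csv').pipe(parser);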

Breaking change between 0.0.9 and 0.1.1

This is actually a breaking change and so should have been a major version bump according to the semver spec, specifically item 8 in the 'Semantic Versioning Specification' list. We noticed because one of our projects broke after relying on this assumption.

Subclassed errors

Just wondering whether any consideration has been given to subclassing the syntax errors thrown from this module?

I'm using NodeCSV in Bluebird promise chains and would like to be able to catch errors by type.

For example:

Promise = require 'bluebird'
fs = Promise.promisifyAll require 'fs'
csv = Promise.promisifyAll require 'csv'
other = require './other'

fs.readFileAsync path
.then (data) -> csv.parseAsync data
.then (parsed) -> other(parsed)
.catch csv.CsvSyntaxError, (error) ->
  console.error 'CSV has a syntax error', error
.catch (error) ->
  console.error 'Unexpected error', error

At the moment the best way to differentiate errors is to do a string search for 'Invalid (opening|closing) quote' in error.message, which is quite brittle.

delimiter with more than 1 character

Hi,

Would it be possible to support delimiter with more than 1 character? node-csv-parse currently supports only 1 character.

I need to parse a CSV file with !# as the delimiter (2 characters) and double quote as the quote character.
Is there any quick solution to do it with node-csv-parse? I already use this lib intensively elsewhere in my current app :)

Thanks in advance,

Mathieu
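A minimal sketch of a workaround, assuming the sequence !# never appears inside a quoted field: rewrite it to a single otherwise-unused character (here "\u0001") on the way in, then parse with that character as the delimiter. Note that a chunk boundary could in principle split the two characters of !#, so this is fine for a quick script but not a robust solution.

var fs = require('fs');
var stream = require('stream');
var parse = require('csv-parse');

var rewrite = new stream.Transform({
  transform: function (chunk, encoding, callback) {
    callback(null, chunk.toString('utf8').split('!#').join('\u0001'));
  }
});

var parser = parse({ delimiter: '\u0001', quote: '"' });
parser.on('readable', function () {
  var record;
  while ((record = parser.read())) { console.log(record); }
});

fs.createReadStream('data.csv').pipe(rewrite).pipe(parser); // 'data.csv' is a placeholder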

delimiter inside quoted fields

The parser fails if there are delimiter characters inside quoted strings, e.g.

"foo";"bar"
"lol;cat";"barf"

Is there a combination of options to fix this, or is it a bug?

This is the parser config:

{
    rowDelimiter: "\n",
    delimiter: ";",
    quote: '"',
    escape: '"',
    auto_parse: true,
    columns: true,
    skip_empty_lines: true,
    trim: true,
    relax: true
}

Quoted field not terminated at line 1

I'm having an issue with an Excel sheet I export as CSV. One of the columns contains snippets of code with newline characters. Excel exports it without complaint, but when I attempt to parse it using this module, I see the error:

Quoted field not terminated at line 1

I looked at the supported options for parse but it doesn't seem to have anything about ignoring newlines while still inside a record. Is parsing this kind of file using node-csv within the realm of possibility? If I delete all the newline characters, the csv parsing works flawlessly.

Whitespace at the end of a row

If the last value of a row is quoted and followed by whitespace, the parser throws an Invalid closing quote error:

> parse = require('csv-parse')
> parse('a,b\n"1","2" \n"3","4"', function(err) { console.error(err.stack); })
Error: Invalid closing quote at line 2; found " " instead of delimiter ","
    at Parser.__write (/home/ubuntu/workspace/node_modules/csv-parse/lib/index.js:249:19)
    at Parser._transform (/home/ubuntu/workspace/node_modules/csv-parse/lib/index.js:138:10)
    at Transform._read (_stream_transform.js:179:10)
    at Transform._write (_stream_transform.js:167:12)
    at doWrite (_stream_writable.js:226:10)
    at writeOrBuffer (_stream_writable.js:216:5)
    at Writable.write (_stream_writable.js:183:11)
    at repl:1:14
    at REPLServer.self.eval (repl.js:110:21)
    at Interface.<anonymous> (repl.js:239:12)
    at Interface.emit (events.js:95:17)
    at Interface._onLine (readline.js:203:10)

The trim option doesn't do anything about it. relax seems to handle this somehow, but the value is then treated as unquoted (the surrounding quotes are kept in the output):

> parse('a,b\n"1","2" \n"3","4"', {relax: true}, console.log)
null [ [ 'a', 'b' ], [ '1', '"2" ' ], [ '3', '4' ] ]

Simple example for reading the data from a source

Hi,

I want to read the content of a CSV file, like the first example in the legacy documentation:

csv().from.path('/tmp/data.csv').on('data', console.log);

I think a similar example with the new API would be very useful. I don't need all these pipes and streams, I just want to read a CSV file :-)

Pausing the parser?

I love your parser and I usually parse around 500K lines of CSV. While this works well, I often notice that I run out of memory during parsing. The original code stored all records in an array. I was wondering if I could pause after, say, 500 entries in the array and sleep for a while before proceeding? That would give the garbage collector a chance to clean up older objects before continuing. Is this possible?

function parse_csv_internal_csv_stream(parseFile, onComplete) {

  log("[parse_csv_internal_csv_stream]:Parsing CSV internal csv starts", 7, conductor_log_modules.csv_functions);
  var finalData = [];
  var readStream = fs.createReadStream(parseFile);

  var parser = csv.parse({columns: true, relax: true});

  readStream.on('open', function () {
    // Pipe the read stream into the parser
    readStream.pipe(parser);
  });

  readStream.on('error', function (err) {
    log(err.message, 3, conductor_log_modules.csv_functions);
    onComplete(err, null);
  });

  parser.on('readable', function () {
    var record;
    while ((record = parser.read())) {
      finalData.push(record);
    }
  });

  // Catch any error
  parser.on('error', function (err) {
    log(err.message, 3, conductor_log_modules.csv_functions);
    onComplete(err, null);
  });

  parser.on('finish', function () {
    log("[parse_csv_internal_csv_stream]:Parsing CSV internal csv ends", 7, conductor_log_modules.csv_functions);
    onComplete(null, finalData);
    parser.end();
  });

}

Options ignored

Hi,
Trying to use csv-parse, and the parser options seem to be ignored somehow; the default options are applied instead. Here is the snippet:

function(next){
    var data=[];
    var stream = fs.createReadStream(__dirname+'/../import_data/EquipmentSelector.csv');
    var parser = parse({delimiter:';',columns : true,comment : '#'})
    .on('readable', function(){
        while(record = parser.read()){
            data.push(record);
        }
    })
// Catch any error
    .on('error', function(err){
        console.log(err.message);
    })
// When we are done, test that the parsed output matched what expected
    .on('finish', function(){
        //console.log('Done',data);
        next();
    });
    stream.pipe(parser);
};

Comments are also ignored.

Thanks for any thoughts about that.

cell ends up truncated in the middle... possible streaming issue?

I'm using the streams interface. On line 703 of the input CSV, this comes across the wire as one of the fields:

 12345 Bob Wilson Driv

But the actual field value was:

12345 Bob Wilson Drive, San Diego, California, 92134

I wonder if this is not so much a parsing issue but a streaming issue, perhaps with the CSV parser running so much faster than the later operations that there's some sort of buffer overflow and truncation, as described here: http://stackoverflow.com/questions/11316382/writing-into-a-stream-ends-up-overflowing

Perhaps I just need to throw a pause() somewhere in my stream pipeline.

If the issue turns out to be caused by third-party code, it could still be nice to provide some guidance in the docs about how to avoid this kind of issue with streaming. I'm going to investigate this some now and see if I can get further to the root cause. (I'm using 0.0.6).

Read ints with leading zeroes as string

It looks like reading ints with leading zeroes as strings doesn't work.

if (@options.auto_parse and @regexp_int.test(@field))
    @line.push parseInt(@field)
else if (@options.auto_parse and @regexp_float.test(@field))
    @line.push parseFloat(@field)

Since the dot in the tested text is optional, it does also match strings like 01.

/^(\-|\+)?([0-9]+(\.[0-9]+)?([eE][0-9]+)?|Infinity)$/

I think the dot should not be optional, since we've already tested for ints. Or, if it stays optional, a leading 0 should only be allowed when it's followed by a dot.

I'm not sure why this test works:
https://github.com/wdavidw/node-csv-parse/blob/master/test/options.auto_parse.coffee#L53

What do you think?

Thanks!

rowDelimiter 'auto' option not working as expected

I just started using node-csv, and am using the parser with the following options:

var csvParser = csv.parse({
    auto_parse: true, // ensures that numeric values remain numeric
    columns: null,
    delimiter: ',',
    quote: '',
    relax: true,
    rowDelimiter: 'auto', // this is an issue, I had to set \n here as 'auto' wasn't working, nor was 'windows'.
    skip_empty_lines: false
});

According to the documentation: rowDelimiter (chars|constant):
String used to delimit record rows or a special value; special constants are 'auto', 'unix', 'mac', 'windows', 'unicode'; defaults to 'auto' (discovered in source or 'unix' if no source is specified).

I tried using 'auto' on a data file that uses \n line endings, but it wasn't parsing correctly. By setting the rowDelimiter parse option to 'auto', is it supposed to perform some auto-detection of line endings?

What about synchronous syntax?

I'm loading the CSV data at server startup and don't want any asynchronous behaviour in that case. Is it possible to process all of the data synchronously?
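A minimal sketch, assuming a csv-parse version that ships the synchronous entry point (require('csv-parse/lib/sync') at the time, exposed as csv-parse/sync in recent releases): read the file and parse it in one blocking call at startup ('config.csv' is a placeholder path).

var fs = require('fs');
var parseSync = require('csv-parse/lib/sync');

var input = fs.readFileSync(__dirname + '/config.csv', 'utf8');
var records = parseSync(input, { delimiter: ',', columns: true });
console.log(records);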

does not parse at all

csv.parse(yargs.f, {delimiter: ";", quote: '"'}, function (err, output) {
  console.log(output);
});

outputs: [ [ './csv/test.csv' ] ]

How come it does not output the file's content?
I've been trying to make it work for an hour.
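The first argument to parse() is the CSV text itself, so passing the path string yields exactly [ [ './csv/test.csv' ] ]: the parser parsed the file name as a one-cell CSV. A minimal sketch of the fix: read the file first, then parse its contents.

var fs = require('fs');
var csv = require('csv');

fs.readFile(yargs.f, 'utf8', function (err, contents) {
  if (err) { return console.error(err); }
  csv.parse(contents, { delimiter: ';', quote: '"' }, function (err, output) {
    console.log(err || output);
  });
});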

Is `\"` not a valid double quote escape?

In the double quotes test, \"\"\"ok\"\"\" is read as "ok", a double quoted cell. When I pass it a csv file containing an escaped double quote, the parser interprets the escaped quote as a real quote.

test.csv:

57601,"\"ok\""

error:

[Error: Invalid closing quote at line 1; found "o" instead of delimiter ","]

Is there something I am not understanding about CSV files? Why does the test expect an escaped double quote to look like \"\"\" instead of \"?
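Backslash-escaped quotes are not standard CSV (RFC 4180 doubles the quote instead), which is why the test data uses "". A minimal sketch for files that really do use backslash escaping: set the escape option to a backslash.

var parse = require('csv-parse');

// the string below contains the raw text: 57601,"\"ok\""
parse('57601,"\\"ok\\""', { escape: '\\' }, function (err, rows) {
  console.log(err || rows); // [ [ '57601', '"ok"' ] ]
});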

Using Pipe Function gives "no method 'join'" error

Can't seem to get the pipe function working properly as I am getting a "TypeError: Object # has no method 'join'" error.

var output = [];
var parser = parse({auto_parse: true, columns: true});
var input = fs.createReadStream('./uploads/' + req.body.file);
var transformer = transform(function (record, callback) {
    setTimeout(function () {
        callback(null, record.join(' ') + '\n');
    }, 500);
}, {parallel: 10});

transformer.on('error', function (err) {
    res.send(500,err);
    console.log(output);
});
transformer.on('finish', function () {
    console.log('finish');
    console.log(output);
});

input.pipe(parser).pipe(transformer).pipe(process.stdout);
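A likely cause, though not confirmed from the snippet alone: with columns: true each parsed record is a plain object, and objects have no .join method, hence "Object # has no method 'join'". A minimal sketch of a transform that works with object records by joining their values instead:

var transform = require('stream-transform');

var transformer = transform(function (record, callback) {
  setTimeout(function () {
    var values = Object.keys(record).map(function (key) { return record[key]; });
    callback(null, values.join(' ') + '\n');
  }, 500);
}, { parallel: 10 });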

csv-parse changes passed options argument

Hi,
I think that mutating the options object passed into the library is bad practice. Because of this I had a bug that only manifested itself on the second run of a function that used parse (my options object was a module-level variable).

I also think that a frozen, immutable default options object is a common scenario, and doing something like this fails:
Example:

var options = {
    delimiter: ';',
    columns  : true
};

var opt = Object.freeze(options);

module.exports = function(options){
   if(!options) options = opt;
   parse(options, ...);
}

// Unhandled rejection TypeError: Can't add property objectMode, object is not extensible
