json-2-csv

Convert JSON to CSV or CSV to JSON

This node module will convert an array of JSON documents to a CSV string. Column headings will be automatically generated based on the keys of the JSON documents. Nested document keys will be joined with a '.'.

It is also capable of converting CSV of the same form back into the original array of JSON documents. The column headings will be used as the JSON document keys. All lines must have exactly the same number of CSV values.
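As an illustration of the key flattening described above, here is a minimal plain-JavaScript sketch (not the library's implementation; the helper name flattenKeys is hypothetical — use json2csv for real conversions):

```javascript
// Minimal sketch of the key-flattening behavior described above.
// NOT the library's implementation; shown only to illustrate the
// dot-joined keys that json2csv generates for nested documents.
function flattenKeys(doc, prefix = '') {
  return Object.entries(doc).reduce((acc, [key, value]) => {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      Object.assign(acc, flattenKeys(value, path));
    } else {
      acc[path] = value;
    }
    return acc;
  }, {});
}

console.log(flattenKeys({ make: 'Nissan', specifications: { mileage: 7106 } }));
// { make: 'Nissan', 'specifications.mileage': 7106 }
```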

Installation

$ npm install json-2-csv

CLI:

$ npm install @mrodrig/json-2-csv-cli

Upgrading?

Upgrading to v5 from v4? Check out the upgrade guide.

Usage

const converter = require('json-2-csv');
const csv = await converter.json2csv(data, options);

or

import { json2csv } from 'json-2-csv';

API

json2csv(array, options) => string

Returns the CSV string or rejects with an Error if there was an issue.

  • array - An array of JSON documents to be converted to CSV.
  • options - (Optional) A JSON document specifying any of the following key value pairs:
    • arrayIndexesAsKeys - Boolean - Should array indexes be included in the generated keys?
      • Default: false
      • Note: This provides a more accurate representation of the JSON in the returned CSV, but may be less human readable. See #207 for more details.
    • checkSchemaDifferences - Boolean - Should all documents have the same schema?
      • Default: false
      • Note: An error will be thrown if some documents have differing schemas when this is set to true.
    • delimiter - Document - Specifies the different types of delimiters
      • field - String - Field Delimiter.
        • Default: ,
      • wrap - String - Wrap values in the delimiter of choice (e.g. wrap values in quotes).
        • Default: "
      • eol - String - End of Line Delimiter.
        • Default: \n
    • emptyFieldValue - Any - Value that, if specified, will be substituted in for field values that are undefined, null, or an empty string.
      • Default: none
    • escapeHeaderNestedDots - Boolean - Should nested dots in header keys be escaped?
      • Default: true
      • Example:
      [
        {
          "a.a": "1"
        }
      ]
      • true will generate the following CSV:
      a\.a
      1
      
      • false will generate the following CSV:
      a.a
      1
      
      • Note: This may result in CSV output that does not map back exactly to the original JSON.
    • excelBOM - Boolean - Should a unicode character (BOM) be prepended so that Excel can open a UTF-8 encoded file containing non-ASCII characters?
      • Default: false
    • excludeKeys - Array - Specify the string keys or RegExp patterns that should be excluded from the output. Provided string keys will also be used as a RegExp to help exclude keys under a specified prefix, such as all keys of Objects in an Array when expandArrayObjects is true (e.g., providing 'baz' will exclude 'baz.a' too).
      • Default: []
      • Note: When used with unwindArrays, arrays present at excluded key paths will not be unwound.
    • expandNestedObjects - Boolean - Should nested objects be deep-converted to CSV?
      • Default: true
      • Example:
      [
        {
          "make": "Nissan",
          "model": "Murano",
          "year": 2013,
          "specifications": {
            "mileage": 7106,
            "trim": "S AWD"
          }
        }
      ]
      • true uses the following keys:
        • ['make', 'model', 'year', 'specifications.mileage', 'specifications.trim']
      • false uses the following keys:
        • ['make', 'model', 'year', 'specifications']
      • Note: This may result in CSV output that does not map back exactly to the original JSON.
    • expandArrayObjects - Boolean - Should objects in array values be deep-converted to CSV?
      • Default: false
      • Example:
      [
        {
          "specifications": [
            { "features": [...] },
            { "mileage": "5000" }
          ]
        }
      ]
      • true uses the following keys:
        • ['specifications.features', 'specifications.mileage']
      • false uses the following keys:
        • ['specifications']
      • Note: This may result in CSV output that does not map back exactly to the original JSON. See #102 for more information.
    • keys - Array - Specify the keys that should be converted.
      • Default: These will be auto-detected from your data by default.
      • Keys can either be specified as a String representing the key path that should be converted, or as an Object of the following format:
      {
        "field": "string", // required
        "title": "string", // optional
        "wildcardMatch": false, // optional - default: false
      }
      • When specifying keys as an Object, the field property specifies the key path, while title specifies a more human readable field heading. Additionally, the wildcardMatch option allows you to optionally specify that all auto-detected fields with the specified field prefix should be included in the CSV. The list specified can contain a combination of Objects and Strings.
      • Examples:
        • [ 'key1', 'key2', ... ]
        • [ 'key1', { field: 'key2', wildcardMatch: true }]
        • [ { field: 'key1', title: 'Key 1' }, { field: 'key2' }, 'key3', ... ]
      • Key Paths - If you are converting a nested object (i.e. {info: {name: 'Mike'}}), then set this to ['info.name']
    • parseValue - Function - Specify how values should be converted into CSV format. This function is provided a single field value at a time and must return a String. The built-in parsing method is provided as the second argument for cases where default parsing is preferred.
      • Default: A built-in method is used to parse out a variety of different value types to well-known formats.
      • Note: Using this option may override other options, including useDateIso8601Format and useLocaleFormat.
    • prependHeader - Boolean - Should the auto-generated header be prepended as the first line in the CSV?
      • Default: true
    • sortHeader - Boolean or Function - Should the header keys be sorted alphabetically? Alternatively, pass a custom comparison function to control the sort order.
      • Default: false
    • trimFieldValues - Boolean - Should the field values be trimmed?
      • Default: false
    • trimHeaderFields - Boolean - Should the header fields be trimmed?
      • Default: false
    • unwindArrays - Boolean - Should array values be "unwound" such that there is one line per value in the array?
      • Default: false
      • Example:
      [
          {
              "_id": {"$oid": "5cf7ca3616c91100018844af"},
              "data": {"category": "Computers", "options": [{"name": "MacBook Pro 15"}, {"name": "MacBook Air 13"}]}
          },
          {
              "_id": {"$oid": "5cf7ca3616c91100018844bf"},
              "data": {"category": "Cars", "options": [{"name": "Supercharger"}, {"name": "Turbocharger"}]}
          }
      ]
      • true will unwind the JSON to four objects, and therefore four lines of CSV values:
      _id.$oid,data.category,data.options.name
      5cf7ca3616c91100018844af,Computers,MacBook Pro 15
      5cf7ca3616c91100018844af,Computers,MacBook Air 13
      5cf7ca3616c91100018844bf,Cars,Supercharger
      5cf7ca3616c91100018844bf,Cars,Turbocharger
      
      • false will not unwind the arrays and will convert each array value as-is (when this option is used without expandArrayObjects):
      _id.$oid,data.category,data.options
      5cf7ca3616c91100018844af,Computers,"[{""name"":""MacBook Pro 15""},{""name"":""MacBook Air 13""}]"
      5cf7ca3616c91100018844bf,Cars,"[{""name"":""Supercharger""},{""name"":""Turbocharger""}]"
      
      • Note: This may result in CSV output that does not map back exactly to the original JSON.
    • useDateIso8601Format - Boolean - Should date values be converted to an ISO8601 date string?
      • Default: false
      • Note: If selected, values will be converted using toISOString() rather than toString() or toLocaleString() depending on the other options provided.
    • useLocaleFormat - Boolean - Should values be converted to a locale specific string?
      • Default: false
      • Note: If selected, values will be converted using toLocaleString() rather than toString()
    • wrapBooleans - Boolean - Should boolean values be wrapped in wrap delimiters to prevent Excel from converting them to Excel's TRUE/FALSE Boolean values?
      • Default: false
    • preventCsvInjection - Boolean - Should CSV injection be prevented by left-trimming these characters: Equals (=), Plus (+), Minus (-), At (@), Tab (0x09), Carriage return (0x0D)?
      • Default: false
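To make two of these options concrete, here is a minimal plain-JavaScript sketch of the emptyFieldValue substitution and preventCsvInjection trimming documented above (an illustration only, not the library's code; the helper name applyFieldOptions is hypothetical):

```javascript
// Illustration only -- not json-2-csv's implementation.
// Mirrors the documented behavior of two options:
//   emptyFieldValue: substituted for undefined, null, or '' field values
//   preventCsvInjection: left-trim =, +, -, @, tab (0x09), and CR (0x0D)
function applyFieldOptions(value, options = {}) {
  if (value === undefined || value === null || value === '') {
    if ('emptyFieldValue' in options) return options.emptyFieldValue;
  }
  let str = String(value);
  if (options.preventCsvInjection) {
    str = str.replace(/^[=+\-@\t\r]+/, '');
  }
  return str;
}

console.log(applyFieldOptions(null, { emptyFieldValue: 'N/A' }));          // 'N/A'
console.log(applyFieldOptions('=SUM(A1)', { preventCsvInjection: true })); // 'SUM(A1)'
```

Note that a value of 0 is not considered empty, so it is never replaced by emptyFieldValue.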

csv2json(csv, options) => object[]

Returns the JSON object array (object[]) or rejects with an Error if there was an issue.

  • csv - A string of CSV
  • options - (Optional) A JSON document specifying any of the following key value pairs:
    • delimiter - Document - Specifies the different types of delimiters
      • field - String - Field Delimiter.
        • Default: ,
      • wrap - String - The character that field values are wrapped in.
        • Default: "
      • eol - String - End of Line Delimiter.
        • Default: \n
    • excelBOM - Boolean - Does the CSV contain a unicode character prepended in order to allow Excel to open a UTF-8 encoded file with non-ASCII characters present?
      • Default: false
    • headerFields - Array - Specify the field names (as strings) in place of a header line in the CSV itself.
      • Default: Parses the header fields directly from the CSV string
      • If you want to generate a nested object (i.e. {info: {name: 'Mike'}}), then use . characters in the string to denote a nested field, like ['info.name']
      • If your CSV has a header line included, then don't specify the option to utilize the default values that will be parsed from the CSV.
    • keys - Array - Specify the keys (as strings) that should be converted.
      • Default: null
      • If you have a nested object (i.e. {info: {name: 'Mike'}}), then set this to ['info.name']
      • If you want all keys to be converted, then specify null or don't specify the option to utilize the default.
    • parseValue - Function - Specify how String representations of field values should be parsed when converting back to JSON. This function is provided a single String and can return any value.
      • Default: JSON.parse - An attempt is made to convert the String back to its original value using JSON.parse.
    • trimHeaderFields - Boolean - Should the header fields be trimmed?
      • Default: false
    • trimFieldValues - Boolean - Should the field values be trimmed?
      • Default: false
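The documented default parseValue behavior can be pictured as a JSON.parse attempt with a fallback to the raw string. A sketch of that default (not the library's exact code; the helper name defaultParseValue is hypothetical):

```javascript
// Sketch of csv2json's documented default value parsing:
// try JSON.parse, and fall back to the raw string on failure.
function defaultParseValue(field) {
  try {
    return JSON.parse(field);
  } catch (err) {
    return field; // not valid JSON -- keep the original string
  }
}

console.log(defaultParseValue('42'));   // 42 (number)
console.log(defaultParseValue('true')); // true (boolean)
console.log(defaultParseValue('Mike')); // 'Mike' (left as a string)
```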

CLI

Note: As of 3.5.8, the command line interface functionality has been pulled out to a separate package. Please be sure to install the @mrodrig/json-2-csv-cli NPM package if you wish to use the CLI functionality shown below:

$ npm install @mrodrig/json-2-csv-cli

json2csv

Usage: json2csv <jsonFile> [options]

Options:
  -V, --version                    output the version number
  -o, --output [output]            Path of output file. If not provided, then stdout will be used
  -f, --field <delimiter>          Optional field delimiter
  -w, --wrap <delimiter>           Optional wrap delimiter
  -e, --eol <delimiter>            Optional end of line delimiter
  -b, --excel-bom                  Excel Byte Order Mark character prepended to CSV
  -W, --without-header             Withhold the prepended header
  -s, --sort-header                Sort the header fields
  -H, --trim-header                Trim header fields
  -F, --trim-fields                Trim field values
  -S, --check-schema               Check for schema differences
  -E, --empty-field-value <value>  Empty field value
  -A, --expand-array-objects       Expand array objects
  -k, --keys [keys]                Keys of documents to convert to CSV
  -h, --help                       output usage information

csv2json

Usage: csv2json <csvFile> [options]

Options:
  -V, --version            output the version number
  -c, --csv <csv>          Path of csv file to be converted
  -o, --output [output]    Path of output file. If not provided, then stdout will be used
  -f, --field <delimiter>  Optional field delimiter
  -w, --wrap <delimiter>   Optional wrap delimiter
  -e, --eol <delimiter>    Optional end of line delimiter
  -b, --excel-bom          Excel Byte Order Mark character prepended to CSV
  -H, --trim-header        Trim header fields
  -F, --trim-fields        Trim field values
  -k, --keys [keys]        Keys of documents to convert to CSV
  -h, --help               output usage information

Tests

$ npm test

To see test coverage, please run:

$ npm run coverage

Features

  • Header Generation (per document keys)
  • Allows for conversion of specific keys in both json2csv and csv2json via the options.keys parameter (as of 1.1.2)
  • Document schema verification functionality (field order is irrelevant) (as of 1.1.0)
  • Supports sub-documents natively
  • Supports arrays as document values for both json2csv and csv2json
  • Custom ordering of columns (see F.A.Q. for more information)
  • Ability to re-generate the JSON documents that were used to generate the CSV (including nested documents)
  • Allows for custom field delimiters, end of line delimiters, etc.
  • Wrapped value support for json2csv and csv2json (as of 1.3.0)
  • Support for multiple different schemas (as of 1.4.0)
  • RFC 4180 Compliance (as of 3.0.0)
  • CLI functionality (as of 3.0.0)
    • csv2json test.csv -o output.json
    • and
    • json2csv test.json -o output.csv -W -k arrayOfStrings
  • Empty field value option (as of 3.1.0)
  • TypeScript typings included (as of 3.4.0) - thanks to @GabrielCastro!
  • Synchronous use case support (as of 5.0.0) - thanks to @Nokel81

Contributors

adivated, brendalf, chris-james, dependabot[bot], dillten, eric-thelin, gabrielcastro, hdwatts, johnnyoshika, joostdebruijn, mebibou, mrodrig, nokel81, null93, peacechen, pustovitdmytro, rsgok, simonh1000

Issues

"length" as a key is producing parsing errors

I was creating a json based on the joi language file: https://github.com/hapijs/joi/blob/master/lib/language.js

{
  "validation": {
    "joi": {
      "any": {
        "unknown": "is not allowed",
        "invalid": "contains an invalid value",
        "empty": "is not allowed to be empty",
        "required": "is required",
        "allowOnly": "must be one of {{valids}}",
        "default": "threw an error when running default method"
      },
      "alternatives": {
        "base": "not matching any of the allowed alternatives"
      },
      "array": {
        "length": "must contain {{limit}} items"
      },
      "boolean": {
        "base": "must be a boolean"
      },
      "binary": {
        "base": "must be a buffer or a string",
        "min": "must be at least {{limit}} bytes",
        "max": "must be less than or equal to {{limit}} bytes",
        "length": "must be {{limit}} bytes"
      }
    }
  }
}

When converting it to CSV, some headers were missing; it turns out that having "length" as a key breaks the conversion:

validation.joi.any.unknown,validation.joi.any.invalid,validation.joi.any.empty,validation.joi.any.required,validation.joi.any.allowOnly,validation.joi.any.default,validation.joi.alternatives.base,validation.joi.array,validation.joi.boolean.base,validation.joi.binary
is not allowed,contains an invalid value,is not allowed to be empty,is required,must be one of {{valids}},threw an error when running default method,not matching any of the allowed alternatives,must contain {{limit}} items,must be a boolean,must be a buffer or a string,must be at least {{limit}} bytes,must be less than or equal to {{limit}} bytes,must be {{limit}} bytes

Capital case?

Why are the keys in the options object written in capital letters? There's no need to shout (and they're tedious to type!)

potential bug: empty value in input where 0 expected

I am not sure if this is expected behavior or a bug, so I am submitting it for further consideration. Any guidance is much appreciated!

Setup

Running node 0.10.26, json-2-csv 0.1.4:

var converter = require('json-2-csv');

var data = [
  {
    label: 'first',
    value: 10
  },
  {
    label: 'second',
    value: 0
  },
  {
    label: 'third',
    value: 20
  }
];

converter.json2csv(data, function (err, csv) {
  console.log(csv);
});

Observed

label,value
first,10
second,
third,20

Expected

label,value
first,10
second,0
third,20

Note that the expected output contains a 0 on the line associated with the label second as provided in the input data.

Cause?

This appears to be caused by a value || '' check and 0 evaluates to falsy.
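The suspected cause can be sketched as follows; the fix is to test explicitly for null/undefined (for example with the nullish coalescing operator) instead of relying on truthiness. The helper names below are hypothetical, used only to illustrate the bug:

```javascript
// Hypothetical illustration of the falsy-zero bug described above.
const buggy = (value) => value || ''; // 0 and false collapse to ''
const fixed = (value) => value ?? ''; // only null/undefined collapse

console.log(buggy(0)); // '' -- the reported bug
console.log(fixed(0)); // 0  -- expected behavior
```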

CSV adds unnecessary header column when there is a schema difference

When we convert JSON to CSV and there is a schema difference in the JSON, the module adds unnecessary headers and the row containing the full set of values is copied twice.

For example, I used the JSON below and tried to convert it to CSV:

[
  {
    Make: 'Nissan',
    Model: 'Murano',
    Year: '2013',
    Specifications: null
  },
  {
    Make: 'BMW',
    Model: 'X5',
    Year: '2014',
    Specifications: {
      Mileage: '3287',
      Trim: 'M'
    }
  }
]

It outputs the CSV below:
Make,Model,Year,Specifications,Specifications.Mileage,Specifications.Trim
Nissan,Murano,2013,,,
BMW,X5,2014,3287,M,3287,M

Module not found: Error: Cannot resolve 'file' or 'directory' ./constants

Hi. I installed json-2-csv through npm and I'm getting this error when I try to build my project using webpack:

ERROR in ./~/json-2-csv/lib/converter.js
Module not found: Error: Cannot resolve 'file' or 'directory' ./constants in node_modules/json-2-csv/lib
 @ ./~/json-2-csv/lib/converter.js 5:16-38

Any help would be appreciated. Thanks.

Comma in data not handled

Hi, great utility... I hope I'm not missing something obvious, but, it doesn't seem to handle commas in the CSV data during import even when there are double quotes on fields.

E.g. Take this CSV data where the data with a comma has double quotes.
header1,header2,header3
row1 column1,"row1 column2, with comma", row1 column3
row2 column1,row2 column2, row2 column3

This turns into the following JSON using csv-2-json and we lose row1 column3:
[{
  "header1": "row1 column1",
  "header2": "row1 column2",
  "header3": "with comma"
}, {
  "header1": "row2 column1",
  "header2": "row2 column2",
  "header3": "row2 column3"
}]

Worse still, if this is in the middle somewhere, the data pushes all other fields down.

http://en.wikipedia.org/wiki/Comma-separated_values

Again, great utility, but would be pretty much perfect with this modification.
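For reference, a minimal RFC 4180-aware field splitter that keeps commas inside quoted fields intact might look like this (an illustrative sketch, not the library's parser; the function name splitCsvLine is hypothetical):

```javascript
// Minimal RFC 4180-style line splitter: commas inside double-quoted
// fields are part of the value; "" inside quotes is an escaped quote.
// Illustration only -- not json-2-csv's implementation.
function splitCsvLine(line) {
  const fields = [];
  let current = '';
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const char = line[i];
    if (char === '"') {
      if (inQuotes && line[i + 1] === '"') {
        current += '"'; // escaped quote inside a quoted field
        i++;
      } else {
        inQuotes = !inQuotes;
      }
    } else if (char === ',' && !inQuotes) {
      fields.push(current);
      current = '';
    } else {
      current += char;
    }
  }
  fields.push(current);
  return fields;
}

console.log(splitCsvLine('row1 column1,"row1 column2, with comma",row1 column3'));
// [ 'row1 column1', 'row1 column2, with comma', 'row1 column3' ]
```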

unforeseen behaviour when parsing several times

Hello,

I'm observing a strange behaviour when parsing repeatedly:
I have the class method "amsbam" that is triggered when a new file appears in a directory:

Import.amsbam = async (filepath) => {

	try {
		const data = await readFile(filepath)
		console.log(filepath, data.length)

	} catch(err) {
		console.log('ERROR!', err)
	}
}

readFile is the following promise:

const csv = require('csvtojson')
const converter = { delimiter: '|', checkColumn: true }
const readFile = (filepath) => {
	const promise = new Promise(function(resolve, reject) {
		csv(converter)
			.fromFile(filepath)
			.on('error', function(err) {
				reject(err)
			})
			.on('end_parsed', (obj) => {
				resolve(obj)
			})
	})
	return promise
}

When amsbam is called repeatedly, readFile sometimes simply returns "[]", as you can see from my console:

resources/importdata/bam_od_2018-03-13.lst 36
resources/importdata/bam_od_2018-03-17.lst 36
resources/importdata/bam_od_2018-03-14.lst 36
resources/importdata/bam_oa_2018-03-14.lst 35
resources/importdata/bam_oa_2018-03-15.lst 35
resources/importdata/bam_oa_2018-03-16.lst 35
resources/importdata/bam_od_2018-03-13.lst 0
resources/importdata/bam_od_2018-03-14.lst 0
resources/importdata/bam_od_2018-03-17.lst 0
resources/importdata/bam_od_2018-03-18.lst 0
resources/importdata/bam_oa_2018-03-16.lst 0
resources/importdata/bam_oa_2018-03-14.lst 35
resources/importdata/bam_oa_2018-03-15.lst 35

This unforeseen behaviour appears from time to time without any pattern. Do you have a clue what the problem might be?
Thanks in advance!

improper import when data contains comma

I am converting this CSV:

@id,@type,rdf:type,owl:sameAs,rdfs:label,dbo:name,dbo:state,dbo:country,sioc:has_container
"foafiaf:Moline,_Illinois",dbo:Place,schema:Place,"dbr:Moline,_Illinois","Moline, Illinois","Moline, Illinois",http://dbpedia.org/resource/Illinois,http://dbpedia.org/resource/United_States,foafiaf:Illinois
"foafiaf:Washington,_D.C.",dbo:Place,schema:Place,"dbr:Washington,_D.C.","Washington, D.C.","Washington, D.C.","http://dbpedia.org/resource/Washington,_D.C.",http://dbpedia.org/resource/United_States,http://dbpedia.org/resource/United_States

I am using these options:

var options = {
    delimiter : {
        wrap  : '"', // Double Quote (") character
        field : ',', // Comma field delimiter
        eol   : '\n' // Newline delimiter
    },
    trimHeaderFields : true,
    trimFieldValues  :  true
};

the resulting JSON is:

[
  {
    '@id': '"foafiaf:Moline',
    '@type': '_Illinois"',
    'rdf:type': 'dbo:Place',
    'owl:sameAs': 'schema:Place',
    'rdfs:label': '"dbr:Moline',
    'dbo:name': '_Illinois"',
    'dbo:state': '"Moline',
    'dbo:country': ' Illinois"',
    'sioc:has_container': '"Moline'
  },
  {
    '@id': '"foafiaf:Washington',
    '@type': '_D.C."',
    'rdf:type': 'dbo:Place',
    'owl:sameAs': 'schema:Place',
    'rdfs:label': '"dbr:Washington',
    'dbo:name': '_D.C."',
    'dbo:state': '"Washington',
    'dbo:country': ' D.C."',
    'sioc:has_container': '"Washington'
  }
]

As you can see, even though the field value is wrapped in double quotes, the conversion splits the first field into two parts, ignoring the wrap characters.

csv2json: EOL characters in fields are unsupported

Issue - csv2json does not support EOL characters in field values (even inside field wrap delimiters). Instead of interpreting the EOL character as part of the field value, the module splits by EOL, including any that appear inside quotation marks.

Versions affected: All, up to and including 2.4.0.

This will be addressed after the initial 3.0.0 release and will likely be addressed in v.???.

This will likely require adjusting the flow of csv2json to use splitLine() to handle the parsing of the csv, which will also need a branch for the EOL case.

Nested json object data is not supported

Hi,

I have same data like below

const data = [
     {name: 'ABC', title: 'title 1', description: {type: 'abc', count: 3, message: 'message 1'}},
     {name: 'PQR', title: 'title 1', description: {type: 'pqr', count: 3, message: 'message 1'}},
     {name: 'UVW', title: 'title 1', description: {type: 'uvw', count: 3, message: 'message 1'}},
]

The downloaded CSV file shows all three records in one row.

Is there any way to show the data in the CSV as three rows instead of a single row?

Don't throw Rejection Error

Why does it not simply use the superset containing all fields of all objects as the table schema, instead of throwing the unhelpful error Unhandled rejection Error: Not all documents have the same schema? Or at least provide a flag to do so.

How to generate a .csv file that uses UTF-8 character encoding

The issue is that when I generate a CSV file and open it in Microsoft Excel, it shows blank brackets instead of emojis. For example:

Actual String:

  1. 😸 To: You, From: Us ... Happy Thursday ...
  2. 💃 You'll be dancing in the street with these hot items!

In the strings above, the emojis are not displayed; square brackets appear instead. For reference see the attached image.

screenshot from 2017-10-10 18-24-15

Please provide a solution; I am stuck on this.
Thanks

allow keys to be json paths

Hi there, great lib. Was wondering if you'd ever consider allowing options.keys to be a json path string like with lodash.get? This would allow you to have keys that are deep, even nested in arrays.

An example would be having an array of objects. At the moment there is no way to specify that you're interested in the first object's mileage.

Need Ability to Suppress Header

Hi there,

Excellent utility that I use with Bluebird Promises. I moved from "json2csv" because you support promises. That said, that library had the option to not include (or suppress) the header row (first line) in the CSV output. I've read through the documentation and code and do not see this option...

For now, I just remove the first line and use the library as-is without making any modifications.

Is this something you can provide support for? If not, should I fork, implement and pull request?

Error Schema are not the same

Hi,

I've got an exception from the module related to the order of elements.

At line 36 of json-2-csv.js I added a console.log(). This is my log:
[ '"_id";"address";"name";"Tag"',
'"_id";"name";"address";"Tag"' ]

The schema is the same but not the order :-(

How can I fix it without updating my mongo database?

Thanks for your help

Herve

Escape nested quote with an additional quote

According to RFC 4180 the proper way to escape double-quotes in CSV data is to prepend an additional double-quote character:

If double-quotes are used to enclose fields, then a double-quote appearing inside a field must be escaped by preceding it with another double quote.

Currently we escape double-quotes using a backslash. Opening a file with such content in Microsoft Excel and/or Apple Numbers breaks the column layout, i.e. the value containing the double quote is interpreted as multiple columns instead of one. On the contrary, escaping with an extra double-quote preserves the expected column layout in both Microsoft Excel and Apple Numbers.
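The RFC 4180 escaping rule quoted above can be sketched as follows (an illustration, not the library's code; the function name wrapField is hypothetical):

```javascript
// RFC 4180: escape a double quote inside a field by doubling it,
// then wrap the whole field in double quotes.
function wrapField(value) {
  return '"' + String(value).replace(/"/g, '""') + '"';
}

console.log(wrapField('say "hi"')); // "say ""hi"""
```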

Promisify it

It would be nice to make the library return a Promise.

This is not in contrast with the possibility of having a traditional callback.

Last version (converting values with toLocaleString) issue with Numbers

Hello.
The last update you made, converting values with .toLocaleString, will break some existing code...

Although it could be interesting to convert values based on locale, maybe you should not have released it as a simple patch (2.0.15)...

For example, if you try to convert the following JSON into CSV :

{
  key:"key",
  value:1000
}

Before 2.0.15, result was:

KEY,VALUE
key,1000

Since 2.0.15, we will obtain:

KEY,VALUE
key,1,000
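The difference reported above comes from Number.prototype.toLocaleString inserting locale-specific grouping separators, which toString does not do:

```javascript
// toString() vs toLocaleString() for the value from the report above.
const value = 1000;

console.log(value.toString());              // '1000'
console.log(value.toLocaleString('en-US')); // '1,000' -- grouping separator added
```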

quote escaping and comma in value problems

var j2c = require('json-2-csv')

j2c.json2csv({header: "what's up, \"doc\""}, console.log)
> 'header\nwhat\'s up, "doc"\n'

2 problems with the above:

  1. the ' was escaped correctly, but " was not.
  2. minor problem: because there's a field delimiter (",") in the value, the whole value should have been wrapped with quotes
    just to clarify, I expected: 'header\n"what\'s up, \"doc\""\n'

I consider (2) minor because the lib allows us to force a wrapper:

j2c.json2csv({header: "what's up, \"doc\""}, console.log, {delimiter:{wrap:'"'}})
> '"header"\n"what\'s up, "doc""\n'

but still the first (1) escaping problem can be an annoyance :)

otherwise, great simple library, thanks for publishing it!

Delimiter wrap bug in converter

Line 47

copyOption(opts, 'delimiterwrap', 'DELIMITER.WRAP');

should be

copyOption(opts, 'delimiter.wrap', 'DELIMITER.WRAP');

Documentation is wrong

Docs say:
json2csv uses trimHeaderFields
csv2json uses trimHeaderValues
Actual code uses trimHeaderFields

Also, the docs say there should be a trimFieldValues option but there's no reference to it in the code

What is the CSV format for nested arrays

I am interested to add functionality to support nested arrays when converting to CSV. I can get the headers with the dot-notation but I'm not sure how to structure the data itself - if I just flatten everything I'm not sure that the data will be re-encodable faithfully. Do you have any specifications for this as I can't find anything on the web easily.

Refactor tests

Change test file setup so that one file is for json2csv tests, another is for csv2json tests. Each specific test for commas, semicolons, quote wrapping, etc. can be kept in a separate describe block with its own beforeEach(..) to simplify the setup.
Currently I believe some tests may be duplicated across files (i.e. the json2csv "Options not specified" tests in testComma.js and testSemi.js) - remove duplicates when refactoring.

This will make it easier for future contributions to the code base by simplifying the layout and setting it up in a manner that makes more sense.

Example JSON missing a comma

Hi - your example JSON to CSV code is missing a comma in the JSON after Year : '2013'

It took me a while to figure out why the package wasn't working. Put in a comma and voila!

Thought I'd share in case you have time to correct it. Might save some people some time.

Thanks for the package, it's fantastic.

Options for delimiter-change (json2csv) doesn't work

Nothing changes when passing an options object to change the delimiter.
I tried the example code and changed the field and array delimiters, but the output didn't change.

Am I doing something wrong or is there a bug?

make module promisifiable

Currently the module is not promisifiable - users should be able to promisify the json2csv and csv2json functions to allow for promise chaining.

Use of 'toLocaleString' when converting values

Hello:

This is the best library we have found for converting json to csv files but I think it would be very interesting if values were converted based on locale. For example, if we have a number as 3.14, instead of using value.toString(), we should use value.toLocaleString(). This number will become 3.14 for US format but 3,14 for European format.

Locale could be a new option when using the library, and the default value would be 'en-US'.

What do you think about this?

Regards
