flatfilers / csvjson-app
Online conversion and formatting tools for JSON, CSV and SQL.
Home Page: https://www.csvjson.com
License: MIT License
In addition to copy/paste and upload, it would be nice to be able to input data from a URL, especially for JSON Beautifier.
This is valid JSON:
{ "values": [ [ 1, 2, [ 3 ], 4 ] ] }
However, it becomes:
{ "values": [ [ 1, 2, [1, 2, 3], 4 ] ] }
Notice how the inner list is expanded from one item to three.
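For comparison, this is a minimal repro sketch of the expected behavior (my own illustration, not the app's code): the standard JSON.parse keeps the nested single-element array intact.

```javascript
// Parsing the embedded JSON value with the standard JSON.parse keeps the
// nested single-element array [3] as-is.
const field = "[1, 2, [3], 4]"; // the array value as it appears in the input
const parsed = JSON.parse(field);
console.log(JSON.stringify(parsed)); // [1,2,[3],4]
```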
Currently, this use case fails:
https://www.csvjson.com/csv2json/8176be8c26c4936b3a3eace16c7a1d6b
It's because of this CSV line, which includes a line break:
"array mixed objects","[1,null,\"ball\"]","[2,{\"a\":10,\"b\":20},\"cube\"]",
,"a\"b","alert(\"Hi!\")","string with quotes"
,"bell is \u0007","multi
line
text","string with bell&newlines"
According to RFC 4180 section 2.6, this is valid CSV.
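A minimal state-machine parser sketch (my own illustration, not the app's implementation) that accepts quoted fields containing commas, doubled quotes, and embedded line breaks per that rule:

```javascript
// Minimal RFC 4180-style CSV parser sketch: a quoted field may contain
// commas, doubled quotes ("") and line breaks.
function parseCsv(text) {
  const rows = [[]];
  let field = "", inQuotes = false;
  for (let i = 0; i < text.length; i++) {
    const c = text[i];
    if (inQuotes) {
      if (c === '"' && text[i + 1] === '"') { field += '"'; i++; } // escaped quote
      else if (c === '"') inQuotes = false;                        // closing quote
      else field += c;                                             // incl. newlines
    } else if (c === '"') inQuotes = true;
    else if (c === ',') { rows[rows.length - 1].push(field); field = ""; }
    else if (c === '\n') { rows[rows.length - 1].push(field); field = ""; rows.push([]); }
    else if (c !== '\r') field += c;
  }
  rows[rows.length - 1].push(field);
  return rows;
}

console.log(parseCsv('a,"multi\nline",c')); // [ [ 'a', 'multi\nline', 'c' ] ]
```

Note that the app's CSVJSON flavor uses backslash escapes (\"), which this RFC-style sketch does not cover.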
The column pId in the example is lost when converting to JSON:
The create table statement looks like:
pId varchar(255) COLLATE utf8_bin NOT NULL,
...
KEY id_cross_refs_provider_id_p_id_object_type (providerId, pId, objectType),
...
Removing the line KEY ... fixes the problem.
http://www.csvjson.com/sql2json/2cd0ce8b18c8980322cdeda78a77b71e
For example, if I have two values of a string field, one "abc" and the other "" (blank), the second value in the resulting JSON array is assigned the integer 0, but it should be assigned an empty or null string.
Some systems won't accept JSON as an input. I've tried several JSON-to-CSV tools and none of them handle nested JSON arrays in a meaningful manner, creating rows only for the first level instead of working bottom-up. Consider:
An array with an arbitrary number of elements that in turn can have an arbitrary number of arrays, recursively.
{
"trees": [
{
"id": 1,
"name": "tree1",
"branches": [
{
"id": 1,
"name": "branch1",
"leaves": [
{
"id": 1,
"name": "leave1"
},
{
"id": 2,
"name": "leave2"
}
]
},
{
"id": 2,
"name": "branch2",
"leaves": [
{
"id": 1,
"name": "leave1"
},
{
"id": 2,
"name": "leave2"
}
]
}
]
}
]
}
Each leaf becomes a CSV row with all of its ancestors' fields. The example above would result in four rows with the following fields:
It may be necessary to limit the allowed tree depth to preserve resources.
Hello, I'm not able to continue with the installation. I changed the index as was suggested:
"if (strpos($_SERVER['SERVER_NAME'], "localhost") !== FALSE) {
define('ENVIRONMENT', 'production');
} else {
define('ENVIRONMENT', 'development');
}"
When accessing CSV2JSON everything is out of order, and I am not able to convert CSV to JSON. :(
Sorry for my bad English.
An image below: http://imgur.com/41yYocf
It would be nice to have an option to force all values in the array to strings.
http://www.csvjson.com/sql2json/4ab2f3433f4f6a320988ee11b8f9428e
Same problem.
sql2json does not support INSERT INTO ... VALUES(a,b), (c,d) syntax
#11 by galz was closed on Aug 30 2016
Could you tell me how to add commas without the app separating them as fields? Also, in the link you posted to solve issue #11, the save state is not working.
Thank you very much for your application.
Hello,
A very neat implementation. Are there any plans to implement SELECT statements?
How can I obtain this JSON format? Thanks in advance.
From
item,Chios,Kos,Lesbos,Samos,Rhodes,Crete,Idomeni,Athens,Diavata,locations
Food services,3515,4641,24550,5060,0,0,67989,39131,0,144886
Water bottles,3624,7709,28753,6194,0,0,76722,95277,28772,247051
"Sleeping bags, mats, and blankets",143,1871,9010,1702,0,0,12653,10465,6331,42175
Hygiene items,1847,4678,25399,3732,0,0,36257,27147,32215,131275
Clothing,0,0,5136,0,0,0,14098,0,250,19484
"Other NFIs (survival kits, backpacks, misc.)",3316,11106,62337,5850,200,242,103978,21703,13302,222034
to
var datasetLineChart = [
  { group: "All", category: 2008, measure: 289309 },
  { group: "All", category: 2009, measure: 234998 },
  { group: "All", category: 2010, measure: 310900 },
  { group: "All", category: 2011, measure: 223900 },
  { group: "All", category: 2012, measure: 234500 },
  { group: "Sam", category: 2008, measure: 81006.52 },
  { group: "Sam", category: 2009, measure: 70499.4 },
  { group: "Sam", category: 2010, measure: 96379 },
  { group: "Sam", category: 2011, measure: 64931 },
  { group: "Sam", category: 2012, measure: 70350 },
];
Please add an option to minify the json output.
The new "parse JSON" option does not work on arrays with multiple values or objects with multiple attributes. Here is the test case:
https://www.csvjson.com/csv2json/f0b81b68dd4b73186930ede1d0310c4b
That's because CSV to JSON doesn't handle quoted values containing commas. Consider replacing the CSVtoArray function taken from:
https://stackoverflow.com/questions/1293147/javascript-code-to-parse-csv-data/1293163#1293163
with the one from:
https://stackoverflow.com/questions/8493195/how-can-i-parse-a-csv-string-with-javascript-which-contains-comma-in-data
Feature Request!
I'd like to have options for what to do with null fields. The tool currently turns row 1 of the "foo" column into "foo": "". I'd like to be able to select from the following options for null fields:
Create an empty string: "foo": ""
Send null: "foo": null
Remove null fields from the output entirely
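The three requested behaviors can be sketched in a few lines (option names are hypothetical, not the app's API):

```javascript
// Sketch of the requested option: decide what an empty CSV cell becomes
// in the output object.
function setField(obj, key, value, nullMode) {
  if (value === "") {
    if (nullMode === "null") obj[key] = null; // "foo": null
    else if (nullMode === "omit") return obj; // drop the key entirely
    else obj[key] = "";                       // default: "foo": ""
  } else {
    obj[key] = value;
  }
  return obj;
}

console.log(setField({}, "foo", "", "null")); // { foo: null }
console.log(setField({}, "foo", "", "omit")); // {}
```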
There seems to be a bug in the SQL to JSON example on the site:
"name": "Africa');"
The '); part should not be in there.
Can we get the site to be hosted with HTTPS? I'd be hesitant to upload sensitive or private data if it's going to be transmitted as plain text.
my test case is https://www.csvjson.com/json_beautifier/cdd7305b4f9de717b1a122e9125472be
I just entered the number 133844120200010001, but I got 133844120200010000 back.
I think it might be a Long type issue.
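This is expected JavaScript behavior rather than a bug specific to the app: numbers are IEEE-754 doubles, so integers above Number.MAX_SAFE_INTEGER (2^53 - 1) cannot be represented exactly; BigInt can hold them:

```javascript
// JavaScript numbers are IEEE-754 doubles: integers above
// Number.MAX_SAFE_INTEGER (2^53 - 1 = 9007199254740991) lose precision.
console.log(Number("133844120200010001")); // 133844120200010000 (last digit lost)
console.log(BigInt("133844120200010001")); // 133844120200010001n (exact)
```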
This came up on StackOverflow.
The following syntax for sql2json works well and shows both inserted rows:
CREATE TABLE `geo_tags` (
`gt_id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`gt_page_id` int(10) unsigned NOT NULL,
PRIMARY KEY (`gt_id`)
) ENGINE=InnoDB AUTO_INCREMENT=4507036 DEFAULT CHARSET=binary ROW_FORMAT=DYNAMIC;
INSERT INTO `geo_tags` VALUES (3,487781);
INSERT INTO `geo_tags` VALUES (4,487781);
But using a shorter syntax for the INSERT, which mysqldump usually emits, only the first value shows up:
CREATE TABLE `geo_tags` (
`gt_id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`gt_page_id` int(10) unsigned NOT NULL,
PRIMARY KEY (`gt_id`)
) ENGINE=InnoDB AUTO_INCREMENT=4507036 DEFAULT CHARSET=binary ROW_FORMAT=DYNAMIC;
INSERT INTO `geo_tags` VALUES (3,487781), (4,487781);
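The extended form could be handled by splitting the VALUES list into one tuple per row; here is a hedged sketch (my own illustration, not the app's parser) that respects quoted strings so commas inside values survive:

```javascript
// Split a mysqldump-style extended INSERT's VALUES list into one tuple
// per row, skipping commas and parentheses inside quoted strings.
function splitValueTuples(valuesPart) {
  const tuples = [];
  let depth = 0, start = -1, inString = false;
  for (let i = 0; i < valuesPart.length; i++) {
    const c = valuesPart[i];
    if (inString) {
      if (c === "\\") i++;            // skip escaped character
      else if (c === "'") inString = false;
    } else if (c === "'") inString = true;
    else if (c === "(") { if (depth++ === 0) start = i + 1; }
    else if (c === ")") { if (--depth === 0) tuples.push(valuesPart.slice(start, i)); }
  }
  return tuples;
}

console.log(splitValueTuples("(3,487781), (4,487781)")); // [ '3,487781', '4,487781' ]
```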
I used this tool for months. For the past couple of days it has not been working like it is supposed to.
Not sure what's changed lately.
See this example:
https://www.csvjson.com/csv2json/b324f54df1bb29e214f5eee8313341d7
Expected an output like this:
[
{
"id": "Project",
"fieldId": "U001",
"dataType": "string",
"label": "Project",
"length": 15,
"controlType": 5,
"decimals": 0,
"decimalsFieldId": "",
"fieldGroup": "01. Algemeen",
"subGroup": "",
"visible": 1,
"disabled": 1,
"suffix": "",
"defaultValue": "",
"RunFunctionOnBlur": "",
"updateConnectorFieldId": "PrId",
"typeVerslag": "1,2,3",
"options": ""
},
got this:
[
{
"id\tfieldId\tdataType\tlabel\tlength\tcontrolType\tdecimals\tdecimalsFieldId\tfieldGroup\tsubGroup\tvisible\tdisabled\tsuffix\tdefaultValue\tupdateConnectorFieldId\ttypeVerslag\toptions": "Project\tU001\tstring\tProject\t15\t5\t0\t\t01. Algemeen\t\t1\t1\t\t\tPrId\t1"
},
The "Poor light!\nSearching for document..." is converted into "Poor light!\\nSearching for document...".
And this double "\" slashes make some inconvenience.
https://www.csvjson.com/csv2json/e4d76c8a76516b743b62f6fda8e232d9
Check out how they do it here:
https://www.npmjs.com/package/csvtojson#from-string
where it says "empowered json parser"
It would be nice to create a structure like this, with nested arrays, using csv elements such as metadata.label[0]...
[
{
"file": “sample_doc.docx",
"metadata": [
{
"label": “lob",
"values": [
“APV"
]
},
{
"label": “state",
"values": [
“NY”,”CA"
]
},
{
"label": “company",
"values": [
“GE”,”GG”,”GI”,”GC”,”CCM-GI”,”CCM-GC"
]
}
]
},
...
]
Since we switched away from pegjs for parsing CSV, it seems the backslash-escaped double quote \" is not well supported. We gained support for doubled double quotes "" (for nested double quotes) and for nested line breaks; however, we lost \" support. A bug was filed against that library. Wait and see, or find a better one.
In the JSON beautifier app it would be nice if we can set whether to use single or double quote around the properties.
Let me know if this is something you would like; I don't mind sending a PR.
If you prettify your app.min.js file, around line #387:
change
bindDownload() {
to
bindDownload: function() {
It will run as expected. This is the only ES6 issue I found which prevents it from running in IE11.
It's also the only method that isn't declared this way.
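For context, the difference is between the ES2015 shorthand method syntax and the ES5 function-expression property, which IE11's parser understands; a minimal illustration (names are from the issue, the bodies are mine):

```javascript
// ES6 shorthand methods are a syntax error for IE11's parser; the
// ES5-compatible form spells out `function`.
const es6Style = {
  bindDownload() { return "es6 shorthand"; }                       // ES2015+ only
};
const es5Style = {
  bindDownload: function () { return "es5 function expression"; }  // runs in IE11
};
console.log(es5Style.bindDownload()); // es5 function expression
```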
http://www.csvjson.com/csv2json/0d006b8ff1894ec02d15c4ccefe36abf
CSV columns containing the keywords false and true are converted to strings. I would like an option to keep them as booleans. Thank you.
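The requested behavior could be an opt-in coercion; a hedged sketch (the flag name is hypothetical, not the app's API):

```javascript
// Only coerce the strings "true"/"false" to booleans when the option is on.
function coerce(value, { keepBooleans = false } = {}) {
  if (keepBooleans && (value === "true" || value === "false")) {
    return value === "true";
  }
  return value;
}

console.log(coerce("true", { keepBooleans: true })); // true (boolean)
console.log(coerce("true"));                         // "true" (string)
```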
According to RFC 4180 section 2.7, a nested double quote should be doubled. This test case currently fails:
https://www.csvjson.com/csv2json/0571e7604c0b937c40c90b9cec1b775c
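A one-line sketch of that unescaping rule (illustration only, not the app's code):

```javascript
// Per RFC 4180, a double quote inside a quoted field is escaped by doubling
// it; strip the outer quotes and collapse "" back to a single quote.
function decodeQuotedField(raw) {
  return raw.slice(1, -1).replace(/""/g, '"');
}

console.log(decodeQuotedField('"a""b"')); // a"b
```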
DROP TABLE IF EXISTS shops;
CREATE TABLE shops (
  id int(11) unsigned NOT NULL AUTO_INCREMENT,
  name varchar(50) NOT NULL,
  city varchar(50) DEFAULT NULL,
  address varchar(50) DEFAULT NULL,
  postalcode varchar(50) DEFAULT NULL,
  province varchar(100) DEFAULT NULL,
  province2 varchar(2) DEFAULT NULL,
  region varchar(100) DEFAULT NULL,
  nation_id smallint(5) unsigned NOT NULL,
  logitude varchar(100) DEFAULT NULL,
  latitude varchar(100) DEFAULT NULL,
  idremote int(11) DEFAULT NULL,
  urlremote varchar(255) DEFAULT NULL,
  jsondataremote varchar(255) DEFAULT NULL,
  typemarket varchar(255) DEFAULT '' COMMENT 'catena di appartenenza',
  groupmarket varchar(255) DEFAULT NULL,
  logo varchar(255) DEFAULT NULL,
  phone varchar(255) DEFAULT NULL,
  PRIMARY KEY (id, nation_id),
  UNIQUE KEY idremote_groupmarket (idremote, groupmarket),
  KEY shop_nation (nation_id),
  KEY id (id)
) ENGINE=InnoDB AUTO_INCREMENT=2065 DEFAULT CHARSET=utf8;
-- Records of shops
INSERT INTO shops
VALUES ('1', 'Carrefour Market', 'BUSTO ARSIZIO', 'Via Duca D\'Aosta, 19', '21052', null, 'VA', 'Lombardia', '109', '8.854601', '45.616616', '1913817', '/punti-vendita/supermercato-carrefour-market-busto-arsizio-duca-daosta-19', null, 'Carrefour Market', 'carrefour', null, null);
try this!
TypeError: values[k] is undefined
The CSVJSON format is a variant of the common CSV format (as described in RFC 4180) that removes the ambiguity that afflicts the common CSV format (delimiters, escaping rules, encoding, etc.) by relying on the rules of the well known JSON format.
The CSVJSON format is described at http://csvjson.org
An example of JSON to CSVJSON conversion is:
[
{
"type": "number",
"value1": 1,
"value2": 2
},
{
"type": "boolean",
"value1": false,
"value2": true
},
{
"type": "null",
"value1": null,
"value2": "non null"
},
{
"type": "array of numbers",
"value1": [1],
"value2": [1,2]
},
{
"type": "simple object",
"value1": {"a": 1},
"value2": {"a":1,
"b":2}
},
{
"type": "array mixed objects",
"value1": [1,null,"ball"],
"value2": [2,{"a": 10, "b": 20},"cube"]
},
{
"string": "string with quotes",
"value1": "a"b",
"value2": "alert("Hi!")"
},
{
"string": "string with bell&newlines",
"value1": "bell is \u0007",
"value2": "multi\nline\ntext"
}
]
and the resulting CSVJSON encoded CSV file is:
"index","value1","value2"
"number",1,2
"boolean,false,true
"null",null,"non null"
"array of numbers",[1],[1,2]
"simple object",{"a": 1},{"a":1, "b":2}
"array mixed objects",[1,null,"ball"],[2,{"a": 10, "b": 20},"cube"]
"string with quotes","a"b","alert("Hi!")"
"string with bell&newlines","bell is \u0007","multi\nline\ntext"
The conversion is, of course, reversible.
Note that I chose to maintain the whitespace inside the JSON values; only newlines outside of strings need to be eliminated when converting a JSON value to CSV, because CSV lines must not be broken.
The conversion steps are rather simple (can be optimized):
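The steps themselves are not listed here, but as an illustration of the encoding direction, a minimal row encoder could look like this (my sketch under the proposal's "rely on JSON rules" idea, not the proposal's code):

```javascript
// Encode one CSVJSON row by serializing each cell value with JSON.stringify;
// compact stringify output never contains raw newlines, so the CSV line
// stays unbroken.
function toCsvjsonRow(values) {
  return values.map(v => JSON.stringify(v)).join(",");
}

console.log(toCsvjsonRow(["null", null, "non null"]));
// "null",null,"non null"
```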
What's the max file size per file upload?
I have this case:
http://www.csvjson.com/sql2json/ce29e8deafbacbe33642e38f16cd85b0
This is what I have in my Create table statement:
...
status enum('OPEN','HAS_DEP','CLOSED'),
perfectMatch int(1) NOT NULL,
collisionContentId int(11) NOT NULL,
...
And this is the JSON output:
...
"status": "OPEN",
"{{1}}": "1",
"{{2}}')": "2",
...
Apparently if I change status column type from enum to varchar it works fine.
Embedded commas in a quote-delimited field do not work.
Escape it with the ' symbol for MySQL, as the ' (quotation mark) symbol separates the entries in MySQL.
When converting output from SQL I have a TSV with NULL which converts to
ID NullableField
1234 NULL
2345 Not Null Value
[{
"ID": 1234,
"NullableField": "NULL"
}, {
"ID": 2345,
"NullableField": "Not Null Value"
}]
It would be nice to have an option to convert NULL into either undefined or null, or leave it alone.
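The requested mapping is small enough to sketch directly (the mode names are hypothetical, not the app's API):

```javascript
// Map the literal string "NULL" coming out of a SQL/TSV dump to a real
// null or undefined, or leave it as the string "NULL".
function mapSqlNull(value, mode) {
  if (value !== "NULL") return value;
  if (mode === "null") return null;
  if (mode === "undefined") return undefined;
  return value; // leave it alone
}

console.log(mapSqlNull("NULL", "null"));           // null
console.log(mapSqlNull("Not Null Value", "null")); // Not Null Value
```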