
neo4j-etl's Introduction

Neo4j ETL

Neo4j ETL allows importing data from relational databases into Neo4j.

Features

  • Neo4j-ETL UI in Neo4j Desktop

  • Manage multiple RDBMS connections

  • Automatically extract database metadata from the relational database

  • Derive a graph model

  • Visually edit labels, relationship types, property names and types

  • Visualize the current model as a graph

  • Persist the mapping as JSON

  • Retrieve the relevant CSV data from relational databases

  • Run the import via neo4j-import, bolt-connector, cypher-shell, or neo4j-shell

  • Bundles MySQL and PostgreSQL drivers; allows custom JDBC drivers with Neo4j Enterprise

License

This tool is licensed under the NEO4J PRE-RELEASE LICENSE AGREEMENT.

Issues & Feedback & Contributions

Download & Run Command Line Tool

Download & unzip the latest neo4j-etl.zip.

Examples of command usage:

Minimal command line
./bin/neo4j-etl export \
 --rdbms:url <url> --rdbms:user <user> --rdbms:password <password> \
 --destination $NEO4J_HOME/data/databases/graph.db/ --import-tool $NEO4J_HOME/bin \
 --csv-directory $NEO4J_HOME/import
Full set of command line options
./bin/neo4j-etl export \
 --rdbms:url <url> --rdbms:user <user> --rdbms:password <password> --rdbms:schema <schema> \
 --using { bulk:neo4j-import | cypher:neo4j-shell | cypher:shell | cypher:direct | cypher:batch | cypher:fromSQL } \
 --neo4j:url <neo4j url> --neo4j:user <neo4j user> --neo4j:password <neo4j password> \
 --destination $NEO4J_HOME/data/databases/graph.db/ --import-tool $NEO4J_HOME/bin \
 --csv-directory $NEO4J_HOME/import --options-file import-tool-options.json --force --debug
Additional command line options for the cypher:batch and cypher:fromSQL import modes:
 --unwind-batch-size <value> (batch size used for the UNWIND data) \
 --tx-batch-size <value> (transaction batch size used for the UNWIND commit)
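The --options-file passes extra options through to the underlying import tool. A minimal sketch of creating one; "multiline-fields" is the only key used elsewhere in this document, and any other keys are assumed to mirror neo4j-import's own flags:

```shell
# Create a minimal import-tool-options.json; "multiline-fields" lets
# CSV fields contain embedded newlines.
echo '{"multiline-fields":"true"}' > import-tool-options.json
cat import-tool-options.json
```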

For detailed usage, see also the tool documentation.

Neo4j-Desktop

Use the Application URL https://r.neo4j.com/neo4j-etl-app in the "Graph Apps" tab of Neo4j Desktop.

neo4j etl install graph app icon

Then the next time you start Neo4j Desktop you’ll see Neo4j ETL as a UI to be used interactively.

The workflow: Configure Driver → Load Mapping → Edit Mapping → Import Data.

We put detailed usage instructions for the Neo4j ETL Tool in the Neo4j Developer Pages.

JDBC Drivers

The drivers for MySQL and PostgreSQL are bundled with the Neo4j-ETL tool.

To use other JDBC drivers, use the JDBC URLs below. Provide the JDBC driver jar-file to the command-line tool or the Neo4j-ETL application, and use the JDBC URL with the --rdbms:url parameter or in the JDBC-URL input field.

Database and JDBC-URL:

Oracle: jdbc:oracle:thin:<user>/<pass>@<host>:<port>/<service_name>

MS SQLServer: jdbc:sqlserver://;servername=<servername>;databaseName=<database>;user=<user>;password=<pass>

IBM DB2: jdbc:db2://<host>:<port/5021>/<database>:user=<user>;password=<pass>;

Derby: jdbc:derby:derbyDB (driver included since JDK 6)

Cassandra: jdbc:cassandra://<host>:<port/9042>/<database>

SAP HANA: jdbc:sap://<host>:<port/39015>/?user=<user>&password=<pass>

MySQL: jdbc:mysql://<hostname>:<port/3306>/<database>?user=<user>&password=<pass>

PostgreSQL: jdbc:postgresql://<hostname>/<database>?user=<user>&password=<pass>
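For a driver that is not bundled, one way to supply the jar is to put both jars on the Java classpath and invoke the CLI class directly, mirroring how the Desktop app launches the tool. A sketch with placeholder paths; the echo only prints the command, drop it to actually run:

```shell
# Placeholder locations; adjust to your installation and driver jar.
ETL_JAR="./lib/neo4j-etl.jar"
DRIVER_JAR="./lib/ojdbc8.jar"   # hypothetical Oracle JDBC driver jar

# Unix classpath separator is ':' (on Windows it is ';').
echo java -cp "$ETL_JAR:$DRIVER_JAR" org.neo4j.etl.NeoIntegrationCli export \
  --rdbms:url "jdbc:oracle:thin:user/pass@host:1521/service"
```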

neo4j-etl's People

Contributors

adam-cowley, jexp, jmhreif, mroiter-larus


neo4j-etl's Issues

ETL Tool (v1.3.2) has blank screen with latest Neo4j Desktop (v1.1.14)

The latest ETL tool (v1.3.2), which fixed the blank screen on selection of a non-built-in driver
(issue #35), was verified by me against the previous version of Neo4j Desktop.

On upgrading to the latest Desktop (v1.1.14), launching the ETL tool shows an activity screen (spinning icon, grey screen), and on completion (developer tools do not indicate any error) there is a blank screen.
etl-tool-deve-tools-2nd-feb

neo4j-etl error running on mac

I am trying to run this etl command on mac (works fine on linux):

./bin/neo4j-etl generate-metadata-mapping --rdbms:url jdbc:mysql://xx.xx.xx.xx:3306/test --rdbms:user test --rdbms:password tet --rdbms:schema test --output-mapping-file /tmp/CST/mapping.json

and I am getting this error:
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/neo4j/etl/NeoIntegrationCli : Unsupported major.minor version 52.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
It seems 'java' command is not available.
Please check your JAVA_HOME environment variable.
Also check if you have a valid Java 8 environment

running environment:

echo $JAVA_HOME
/Library/Java/Home

java -version
java version "1.6.0_65"
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-468-11M4833)
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-468, mixed mode)

Neo4j ETL tool not escaping the characters in CSV

I am trying to export data from mysql to neo4j using neo4j-etl tool.

Neo4j Version : 3.3.4
Neo4j-etl Version : 1.2.0
Operating System : Mac OS X

I am running the below command:

./neo4j-etl export  --rdbms:url jdbc:mysql://localhost:3306/openmrs --rdbms:user root --rdbms:password asdf --rdbms:schema openmrs --neo4j:user neo4j --neo4j:password qwas --import-tool /usr/local/Cellar/neo4j/3.3.4/libexec/bin  --csv-directory /usr/local/Cellar/neo4j/3.3.4/libexec/import --mapping-file mappingNew.json --options-file options.json --using cypher:direct --quote '",' --debug --force

I am getting the following error while importing. This is due to escape characters in the CSV file:

org.neo4j.driver.v1.exceptions.DatabaseException: At /usr/local/Cellar/neo4j/3.3.4/libexec/import/csv-008/openmrs/NODE_openmrs.raxacoremessage_8e1dbe64-8a43-4c39-9513-1cd0aa175ab2.csv:44188 -  there's a field starting with a quote and whereas it ends that quote there seems to be characters in that field after that ending quote. That isn't supported. This is what I read: 'ghjg
","dd'
	at org.neo4j.driver.internal.net.SocketResponseHandler.handleFailureMessage(SocketResponseHandler.java:83)
	at org.neo4j.driver.internal.messaging.PackStreamMessageFormatV1$Reader.unpackFailureMessage(PackStreamMessageFormatV1.java:470)
	at org.neo4j.driver.internal.messaging.PackStreamMessageFormatV1$Reader.read(PackStreamMessageFormatV1.java:431)
	at org.neo4j.driver.internal.net.SocketClient.receiveOne(SocketClient.java:196)
	at org.neo4j.driver.internal.net.SocketConnection.receiveOne(SocketConnection.java:217)
	at org.neo4j.driver.internal.net.ConcurrencyGuardingConnection.receiveOne(ConcurrencyGuardingConnection.java:165)
	at org.neo4j.driver.internal.net.pooling.PooledSocketConnection.receiveOne(PooledSocketConnection.java:183)
	at org.neo4j.driver.internal.InternalStatementResult.receiveOne(InternalStatementResult.java:335)
	at org.neo4j.driver.internal.InternalStatementResult.consume(InternalStatementResult.java:291)
	at org.neo4j.etl.util.CypherBoltRunner.execute(CypherBoltRunner.java:64)
	at org.neo4j.etl.commands.rdbms.importer.BoltDriverImportFromRdbms.doLoadCsv(BoltDriverImportFromRdbms.java:29)
	at org.neo4j.etl.commands.rdbms.importer.AbstractLoadCsvImportFromRdbms.extractAndLoad(AbstractLoadCsvImportFromRdbms.java:58)
	at org.neo4j.etl.commands.rdbms.importer.BoltDriverImportFromRdbms.extractAndLoad(BoltDriverImportFromRdbms.java:16)
	at org.neo4j.etl.cli.rdbms.ImportFromRdbmsCli.run(ImportFromRdbmsCli.java:150)
	at org.neo4j.etl.util.CliRunner.run(CliRunner.java:42)
	at org.neo4j.etl.util.CliRunner.run(CliRunner.java:35)
	at org.neo4j.etl.NeoIntegrationCli.main(NeoIntegrationCli.java:43)

I have created a dummy database neocheck for replicating this issue

Here is the csv file

"1","1","p/a soft\","Ob"
"2","2","p/a soft\","Ob"
"3","3","dsfsad
dfsafsd
dsafds
dsafasd
dsfads
","Ob"
"4","4","dsaf
dfsadf
/dffas
dfsa","Ob"

When I am running this command

./neo4j-etl export  --rdbms:url jdbc:mysql://localhost:3306/neocheck --rdbms:user root --rdbms:password qwas --rdbms:schema neocheck --using bulk:neo4j-import  --neo4j:user neo4j --neo4j:password qwas --import-tool $NEO4J_HOME/bin --destination $NEO4J_HOME/data/databases/graph.db/ --csv-directory $NEO4J_HOME/import  --options-file options.json --debug --force

It is giving me the following error

Error in input data
Caused by:ERROR in input
  data source: BufferedCharSeeker[source:/Users/zakirsaifi/Desktop/Office/Task/ETL/neo4j-community-3.3.5/import/csv-001/neocheck/NODE_neocheck.OBS_6427d0d8-cdc5-47b9-9d39-c271c2646ed2.csv, position:4194376, line:0]
  in field: valueText:string:3
  for header: [:ID(neocheck.obs), id:long, valueText:string, :LABEL]
  raw field value: 1
  original error: At /Users/zakirsaifi/Desktop/Office/Task/ETL/neo4j-community-3.3.5/import/csv-001/neocheck/NODE_neocheck.OBS_6427d0d8-cdc5-47b9-9d39-c271c2646ed2.csv:0 -  there's a field starting with a quote and whereas it ends that quote there seems to be characters in that field after that ending quote. That isn't supported. This is what I read: 'p/a soft","OO'

WARNING Import failed. The store files in /Users/zakirsaifi/Desktop/Office/Task/ETL/neo4j-community-3.3.5/data/databases/graph.db are left as they are, although they are likely in an unusable state. Starting a database on these store files will likely fail or observe inconsistent records so start at your own risk or delete the store manually', DurationMillis: 1731 }]
	at org.neo4j.etl.process.ProcessHandle.await(ProcessHandle.java:84)
	at org.neo4j.etl.neo4j.importcsv.ImportFromCsvCommand.execute(ImportFromCsvCommand.java:29)
	at org.neo4j.etl.commands.rdbms.importer.Neo4jImportImportFromRdbms.doImport(Neo4jImportImportFromRdbms.java:63)
	at org.neo4j.etl.commands.rdbms.importer.Neo4jImportImportFromRdbms.extractAndLoad(Neo4jImportImportFromRdbms.java:41)
	at org.neo4j.etl.cli.rdbms.ImportFromRdbmsCli.run(ImportFromRdbmsCli.java:150)
	at org.neo4j.etl.util.CliRunner.run(CliRunner.java:42)
	at org.neo4j.etl.util.CliRunner.run(CliRunner.java:35)
	at org.neo4j.etl.NeoIntegrationCli.main(NeoIntegrationCli.java:43)
It seems 'java' command is not available.
Please check your JAVA_HOME environment variable.

This is the SQL file for my dummy database:

https://drive.google.com/file/d/1y_KrA0QpZOZ0WMKxTWcFyy_Z2B1Mqz-p/view?usp=sharing

With neo4j-etl-cli-1.2.0-RC1 I was able to import the dummy database in offline bulk mode (using bulk:neo4j-import of etl), but not in any other mode. If I could import it with the earlier version, why does the new version not escape the newline and other characters?

I have discussed the same in issue #20.
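One plausible mismatch behind the errors above: MySQL-style exports escape quotes with a backslash (\"), while neo4j-import expects RFC 4180-style doubled quotes (""). A hypothetical pre-processing step, not part of the ETL tool, that rewrites one into the other before import:

```shell
# Sample row with backslash-escaped quotes (MySQL style):
printf '%s\n' '"1","he said \"hi\"","Ob"' > in.csv

# Rewrite \" as "" (RFC 4180 doubling), the form neo4j-import understands:
sed 's/\\"/""/g' in.csv > out.csv
cat out.csv   # "1","he said ""hi""","Ob"
```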

add more mapping capabilities

In the wild, almost every RDBMS stores denormalized data somewhere. To digest this into a graph, a simple "1 row == 1 node" approach is too restrictive.

Consider e.g. a table person having multiple email addresses email1, email2, email3 ...

In graph I want to map this to (p:Person)-[:HAS_EMAIL]->(:Email{name:email1}), (p)-[:HAS_EMAIL]->(:Email{name:email2}) , (p)-[:HAS_EMAIL]->(:Email{name:email3})
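Until the tool supports such mappings, one workaround sketch is to pre-split the email columns into their own CSV, which can then be imported as :Email nodes with :HAS_EMAIL relationships (the person.csv layout below is hypothetical):

```shell
# Hypothetical person.csv: id,name,email1,email2,email3
printf '%s\n' '1,Alice,a@x.com,b@x.com,' > person.csv

# Emit one (person_id,email) row per non-empty email column (fields 3-5).
awk -F, '{ for (i = 3; i <= 5; i++) if ($i != "") print $1 "," $i }' \
  person.csv > person_email.csv
cat person_email.csv
```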

Error when trying to generate mapping from PostgreSQL

Hey,
I am trying to use Neo4j ETL to import some data from PostgreSQL. ETL is installed as part of Neo4j Desktop and it successfully connects to PostgreSQL. But when it attempts to generate the mapping, it produces an error and the following logs:

COMMAND: java -cp "/home/{USERNAME}/.config/Neo4j Desktop/Application/graphApps/neo4j-etl-ui/dist/neo4j-etl.jar" org.neo4j.etl.NeoIntegrationCli generate-metadata-mapping --rdbms:url "jdbc:postgresql://localhost:5433/{DATABASE NAME}?ssl=false" --rdbms:password "{PASSWORD}" --rdbms:user "{USERNAME}" --output-mapping-file "undefined/import/postgresql_{DATABASE NAME}_mapping.json"
- Skipping reading import options from file because file [] doesn't exist.
- Command failed due to error (FileNotFoundException: undefined/import/postgresql_{DATABASE NAME}_mapping.json (No such file or directory)). Rerun with --debug flag for detailed diagnostic information.

What is the issue, and what is the solution?
Thank you

missing output from import step

The user doesn't know what actually happened.

The import step should also capture stdout/stderr (from cypher-shell / neo4j-shell / direct-import / import-tool),

or alternatively write it to a file and report those outputs to the user.

We should also check that data is actually added as required; since we know which labels/rel-types to expect, we can ask the running database before/after and show these counts plus a diff.

For an empty database / direct import we can show the output of the import tool,
and then, after we create the indexes, show the counts in the running DB.
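The suggested capture can be sketched with tee, which saves output to a log while still streaming it to the user; the echo below is a placeholder for whichever backend actually runs (neo4j-import, cypher-shell, neo4j-shell, or direct import):

```shell
# Placeholder import command; 2>&1 folds stderr into the captured stream.
echo "Imported: 19 nodes 18 relationships 40 properties" 2>&1 | tee import.log

# The saved log can then be parsed and reported back to the user:
grep -c "Imported:" import.log
```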

Renaming nodes or relationships does not remove last character

This is of a more cosmetic nature, but still a little bit of bad UX.
When renaming a node label or relationship type and the term gets completely removed (pressing backspace), the last character survives in the graph preview.
If the term gets selected and I remove it all at once, let's say by using DEL, the old term survives completely in the preview.
I just assume that has something to do with model binding and events 😄
etl_renaming

Nothing happens when clicking "Test and Save Connection"

Using Neo4j Desktop 1.1.16 on Mac OS X Mojave, with ETL Tool 1.3.6. Allow background process is checked in app permissions. Attempted to create JDBC Connection to local mysql instance. Clicked on "Test and Save Connection" and got no response. (Same problem even if background process is NOT checked in app permissions.) Looked at console under developer tools and got this:

Desktop.0c0dadb70a857df9f5cf.bundle.js:141 [10:36:02:0451] Desktop API call failed at graph app 'neo4j-etl-ui'. Error: Invalid request. Requested permission 'backgroundProcess' is not declared in manifest.json file of 'Neo4j ETL Tool' graph-app. Please update the graph app or check app permissions in Settings -> Graph Applications.
at d.getDeclaredPermission (/Applications/Neo4j Desktop.app/Contents/Resources/app.asar/dist/main.prod.js:1:92493)
at d.requestPermission (/Applications/Neo4j Desktop.app/Contents/Resources/app.asar/dist/main.prod.js:1:91505)

Appreciate any help!

Every field from MYSQL mapped to strings

When I tried to import with --using cypher:direct, every field from the MySQL database got imported as a string in the Neo4j database. But the import is fine, with correct data types, if done without --using cypher:direct.
Is this a bug?

Azure MSSQL does not migrate relationships

It was a beautiful day in Azure, when suddenly...

> 'You need to have foreign-key constraints set up in your relational database, otherwise the database-metadata doesn't identify which tables are connected with each other.'

Proof of the foreign-key constraints being correctly set up was attached [screenshots].

SO posts were made, to no avail! So here we are, making a last stand against this bug!

We know that AWS works; we just happen to be working with Azure at the moment, and with MSSQL specifically, which is already causing a much more reasonable bug (#33).

Bugs like #33 are much easier to reason about; the exception is right there, ready to be read. Bugs like this one, however, are much more difficult to catch, and much easier to dismiss.

For this reason we want to help by providing as much documentation as needed.

Import data from MySQL via the command-line tool stops at 'Closing connection pool towards localhost:7687'

OS: Windows 7
When I use the command

neo4j-etl.cmd export --rdbms:url jdbc:mysql://localhost:3306/dbtest?serverTimezone=UTC --force --debug --neo4j:user neo4j --neo4j:password admin --rdbms:user root --rdbms:password root --destination D:/neo4j-community-3.5.4/data/databases/graph1.db/ --import-tool D:/neo4j-community-3.5.4/bin --csv-directory D:/neo4j-community-3.5.4/import
the program stops at
INFO: Closing connection pool towards localhost:7687  FINE: Executing command 'D:\neo4j-community-3.5.4\bin\neo4j-import.bat --f D:\neo4j-community-3.5.4\import\csv-038\neo4j-admin-import-params'
At the same time, the new DB's debug log shows
2019-04-24 02:15:45.091+0000 WARN [o.n.k.i.s.MetaDataStore] Missing counts store, rebuilding it. 2019-04-24 02:15:45.110+0000 WARN [o.n.k.i.s.MetaDataStore] Counts store rebuild completed. 2019-04-24 02:15:45.137+0000 INFO [o.n.u.i.b.ImportLogic] Import starting
WebUI: [screenshots attached]

But when I press Ctrl+C in the command-line tool, the debug.log shows
2019-04-24 02:15:45.091+0000 WARN [o.n.k.i.s.MetaDataStore] Missing counts store, rebuilding it. 2019-04-24 02:15:45.110+0000 WARN [o.n.k.i.s.MetaDataStore] Counts store rebuild completed. 2019-04-24 02:15:45.137+0000 INFO [o.n.u.i.b.ImportLogic] Import starting 2019-04-24 02:59:32.928+0000 INFO [o.n.k.i.s.c.CountsTracker] Rotated counts store at transaction 1 to [D:\neo4j-community-3.5.4\data\databases\graph1.db\neostore.counts.db.b], from [D:\neo4j-community-3.5.4\data\databases\graph1.db\neostore.counts.db.a]. 2019-04-24 02:59:32.975+0000 INFO [o.n.u.i.b.ImportLogic] Import completed successfully, took 43m 47s 808ms. Imported: 19 nodes 18 relationships 40 properties
and the WebUI shows the completed import [screenshot].
And it's done.

CSV files for relationships use backticks to quote start/end node IDs

e.g. from northwind I got:

`35`,`1`,`INVENTORY_TRANSACTION_TYPES`
`36`,`1`,`INVENTORY_TRANSACTION_TYPES`
`37`,`1`,`INVENTORY_TRANSACTION_TYPES`
`38`,`1`,`INVENTORY_TRANSACTION_TYPES`
`39`,`1`,`INVENTORY_TRANSACTION_TYPES`

In import mode cypher-shell, a toInt(row[0]) obviously fails.
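A hypothetical workaround until the quoting is fixed: swap the backticks for the quote character the import actually expects before loading:

```shell
printf '%s\n' '`35`,`1`,`INVENTORY_TRANSACTION_TYPES`' > rels.csv

# Replace backtick quoting with standard double quotes, so LOAD CSV strips
# the quotes and toInt(row[0]) can parse the id:
sed 's/`/"/g' rels.csv > rels_fixed.csv
cat rels_fixed.csv   # "35","1","INVENTORY_TRANSACTION_TYPES"
```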

composite indexes / FK in Postgres

From a user:

Yes, I'm finding that out. In particular it seems not to handle composite primary-keys and indices [ie: keys created from multiple 'columns' in same table] which is prevalent in RDBMS relationship modeling. I've used Apple's CoreData & Enterprise Objects Framework [EOF] extensively, which utilizes Object Graphs (similar to Neo4J ??) and also utilizes rdbmses [PostgreSQL, etc] as the 'back end' persistent storage containers. I'll be exploring a way to take my Object Graphs and map them 'directly' into Neo4J... I hope that works.

  • Command failed due to error (PSQLException: The column index is out of range: 2, number of columns: 1.). Rerun with --debug flag for detailed diagnostic information.

I just noticed that I am required/expected to 're-model' my RDBMS data-model structure prior to importing with Neo4j-ETL. I thought the Neo4j-ETL migration would access the existing DBMS [JDBC connection] and create a 'raw' default/initial Neo4j-ETL graph-model representation, to be subsequently refined as appropriate. That was my misunderstanding, I hope. The PSQLException error happens because I have the natural/traditional 'composite property indexes and key references' needed for DBMS modeling, but not used in graph models.

Error message "Mapping error" can be more precise

When starting the mapping process, there are several possibilities that might lead to the error message "Mapping error". I just included one screenshot, for a not-yet-started MySQL server.

The configurations in which I was faced with this very generic message:

  • No SQL database defined at all
  • No SQL database selected
  • Connection error (Not available, wrong credentials...)
  • No Neo4j selected (after starting the tool there is no database selected by default)

I could imagine that there could be more cases that lead to the error message.

As a first improvement, I would suggest parsing for known errors in the logs, and either linking to them when clicking the error-message badge/banner/notification, and/or mapping them to something more end-user-readable and adding the cause to the notification.

Thinking one step further, highlighting the place where the error occurred (e.g. the SQL or Neo4j DB box that is causing it) might also be a good visual help.

etl_mapping_error

Error exporting from RDBMS to CSV

My steps:

  • Generating metadata mapping...
    ./bin/neo4j-etl postgresql generate-metadata-mapping --host myhost --port 5432 --user myuser --password 'mypassword' --database mydb > /tmp/ilikeit/mapping.json

  • Creating options.json...
    echo '{"multiline-fields":"true"}' > /tmp/ilikeit/options.json

  • Exporting from RDBMS to CSV...
    ./bin/neo4j-etl postgresql export --host myhost --port 5432 --user myuser --password 'mypassword' --database mydb --destination $NEO4J_HOME/data/databases/graph.db/ --import-tool $NEO4J_HOME/bin --csv-directory /tmp/ilikeit --mapping-file /tmp/ilikeit/mapping.json --options-file /tmp/ilikeit/options.json --debug --force --quote '"'

The output is:

errorneo4jetl

I see that in mapping.json a prefix "null." is added to the original table name.
In fact, in the attached output I see "ERROR: relation "null.ilk_user_accounts" does not exist", but my PostgreSQL table name is ilk_user_accounts, not null.ilk_user_accounts.

Can you help me?

oracle disabled foreign-keys

If someone defines a constraint but does not explicitly enable or disable it, Oracle enables it by default.
Any SQL INSERT, UPDATE or DELETE command applied to a table with constraints enabled has the possibility of failing. This is not true when a constraint is disabled, so Neo4j-etl should, in my opinion, be able to represent this situation while importing. While a table with an enabled foreign key is treated as a join and imported as a node with a relationship, neo4j-etl should not import a table with a disabled foreign key, and should inform the end-user of the event.

Import data from MSSQL throws a CommandFailedException

Hello,

I used the GUI Neo4j ETL Tool and installed it as the documentation said.
Everything went well (connection, mapping, etc.), but when I try to import the data in the import step of the Neo4j ETL Tool, a CommandFailedException is thrown. It drills down to the following:

COMMAND: java -cp "C:\Users\jjonkman.Neo4jDesktop\graphApps\neo4j-etl-ui/dist/neo4j-etl.jar;C:\Program Files\Microsoft JDBC Driver 6.0 for SQL Server\sqljdbc_6.0\enu\jre8\sqljdbc42.jar" org.neo4j.etl.NeoIntegrationCli export --mapping-file "C:\Users\jjonkman.Neo4jDesktop\neo4jDatabases\database-77d8ccaf-6573-4998-b813-cb68ab84343f\installation-3.5.0/import/mssql_netCore.Project_mapping.json" --destination "C:\Users\jjonkman.Neo4jDesktop\neo4jDatabases\database-77d8ccaf-6573-4998-b813-cb68ab84343f\installation-3.5.0/data/databases/graph.db/" --rdbms:password "testtest" --import-tool "C:\Users\jjonkman.Neo4jDesktop\neo4jDatabases\database-77d8ccaf-6573-4998-b813-cb68ab84343f\installation-3.5.0/bin" --rdbms:user "neo4j" --rdbms:url "jdbc:sqlserver://127.0.0.1:53407;databaseName=netCore.Project" --csv-directory "C:\Users\jjonkman.Neo4jDesktop\neo4jDatabases\database-77d8ccaf-6573-4998-b813-cb68ab84343f\installation-3.5.0/import" --options-file "C:\Users\jjonkman.Neo4jDesktop\neo4jDatabases\database-77d8ccaf-6573-4998-b813-cb68ab84343f\installation-3.5.0/import/import-tool-options.json" --using "bulk:neo4j-import" --force

  • Reading options from file C:\Users\jjonkman.Neo4jDesktop\neo4jDatabases\database-77d8ccaf-6573-4998-b813-cb68ab84343f\installation-3.5.0\import\import-tool-options.json.
  • Reading metadata mapping from file: C:\Users\jjonkman.Neo4jDesktop\neo4jDatabases\database-77d8ccaf-6573-4998-b813-cb68ab84343f\installation-3.5.0/import/mssql_netCore.Project_mapping.json
  • Running ETL on Neo4j 3.5.0 - ENTERPRISE
  • Exporting from RDBMS to CSV...
  • CSV directory: C:\Users\jjonkman.Neo4jDesktop\neo4jDatabases\database-77d8ccaf-6573-4998-b813-cb68ab84343f\installation-3.5.0\import\csv-003
  • Writing CSV headers for node NODE_msdb.dbo.sysoriginatingservers_1eeeaa17-b467-4779-999a-5746c4671ee3
  • Writing CSV headers for node NODE_msdb.dbo.sysutilitymismoobjectstocollectinternal_5e3fd1c1-ae66-4142-ba58-d059c09a10da
  • Writing CSV headers for node NODE_msdb.dbo.syscollectorconfigstoreinternal_337afb54-a8aa-4cfb-b1e5-af3ea6e5c776
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpmanagedinstancesinternal_544e24da-0ed8-4114-9bf8-6a53eeb10ad2
  • Writing CSV headers for node NODE_msdb.dbo.MSdbmsmap_ccf57588-9180-4ade-841e-4acc3d5aee89
  • Writing CSV headers for node NODE_msdb.dbo.syspolicypoliciesinternal_0c73b4db-642a-4036-874f-12dbd314da6c
  • Writing CSV headers for node NODE_msdb.dbo.logshippingmonitorerrordetail_61272eaf-9b29-4315-b173-0ac288fd363d
  • Writing CSV headers for node NODE_msdb.dbo.syscollectorexecutionstatsinternal_b5c585af-5975-4e5a-8dae-f5b01e2d9ec1
  • Writing CSV headers for node NODE_msdb.dbo.sysutilitymicpustageinternal_b9c1fb12-8772-4c76-93b3-5735a5aaa557
  • Writing CSV headers for node NODE_msdb.dbo.autoadminmasterswitch_69334a01-a7ab-44fa-b2bf-a4559e7c9198
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpdacfilespacehealthinternal_5129e24f-0b62-4975-9498-b64b1c34493e
  • Writing CSV headers for node NODE_msdb.dbo.sysssislog_4f6e3321-3fd9-4184-b63b-ce72da97ee87
  • Writing CSV headers for node NODE_msdb.dbo.logshippingprimarysecondaries_d4b7286a-088d-43c7-9003-c61005b861c9
  • Writing CSV headers for node NODE_msdb.dbo.sysmailsendretries_a315a52f-c041-444c-82b9-ad9ac670d2c4
  • Writing CSV headers for node NODE_msdb.dbo.autoadminsystemflags_6e49bf18-bd2e-4379-863d-f29d7a43a5b8
  • Writing CSV headers for node NODE_netCore.Project.sys.tracexeactionmap_ef00fb7e-6add-4fff-99ea-1710ecab4ed2
  • Writing CSV headers for node NODE_msdb.dbo.sysmaintplanlog_79a7c32f-b600-42fb-af7c-8cc78eceeaba
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpprocessingstateinternal_8e5be7fd-8bc8-40f4-b0c1-4ee4634f1266
  • Writing CSV headers for node NODE_msdb.dbo.syspolicyexecutioninternal_8457fb93-b5cd-49e5-9c65-b65f39aa67ce
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucplogfilesstub_5946f79a-1c52-4dae-ad72-bf41c90a1bb4
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpsupportedobjecttypesinternal_f6a4973e-9642-4fc7-90bd-62dab2673753
  • Writing CSV headers for node NODE_msdb.dbo.syscachedcredentials_240b680c-8b7a-4165-9eea-6ea88c424bb2
  • Writing CSV headers for node NODE_msdb.dbo.sysproxies_7088f96d-161e-4fac-9729-b6c31ec38e36
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpsnapshotpartitionsinternal_3b79a79a-1648-4642-a337-d990dc76d5d7
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpdachealthinternal_0223cc33-6322-497f-8403-b6c2dc665921
  • Writing CSV headers for node NODE_msdb.dbo.sysdownloadlist_44aeb25b-115b-41f8-a02d-640c2357bf69
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpdacsstub_d21ccdb4-fc9d-45bc-b70c-90f0560412c0
  • Writing CSV headers for node NODE_msdb.dbo.syspolicytargetsetsinternal_52ba5cb9-923e-47a9-93f5-00c73a264abf
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpvolumesstub_aa98b8d9-371d-496e-8fe0-f3f071f9d194
    null
  • Writing CSV headers for node NODE_msdb.dbo.sysmailattachments_f7638f1e-8cf2-45b6-bde2-62278629a947
  • Writing CSV headers for node NODE_msdb.dbo.sysutilitymisessionstatisticsinternal_acd192ff-6f76-4ff1-9e27-aa9b51eb52d8
  • Writing CSV headers for node NODE_msdb.dbo.sysmanagementsharedservergroupsinternal_c538ef04-1d26-4eb6-9bc7-7429b9af0c7e
  • Writing CSV headers for node NODE_msdb.dbo.backupmediaset_8f852abe-8f4d-416e-a6ab-87c546ebc58b
  • Writing CSV headers for node NODE_msdb.dbo.sysjobs_7ba092c1-132f-4f67-992c-74b297943a9e
  • Writing CSV headers for node NODE_msdb.dbo.sysmailservertype_0d921514-f07c-4744-a491-942ce96c5fde
  • Writing CSV headers for node NODE_msdb.dbo.sysmailprofileaccount_27ece2a1-af13-4b55-ae9a-50141d0e9712
  • Writing CSV headers for node NODE_netCore.Project.dbo.Course_9d528757-b893-4ba2-85b3-34291612e713
  • Writing CSV headers for node NODE_msdb.dbo.systargetservers_5ab51f54-ddae-42fb-aebc-5bf45ace08e7
  • Writing CSV headers for node NODE_msdb.sys.tracexeactionmap_183b940e-8bf5-4b5f-b6bc-1ef37d1f74bc
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpcomputersstub_81202085-c7b7-400b-ba30-566e57d702c4
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpfilegroupsstub_4e5d09fd-6aba-4078-a6c8-580e257d323f
  • Writing CSV headers for node NODE_msdb.dbo.syscollectorcollectionitemsinternal_9c03c003-dc92-4114-8729-5274960ffc2f
  • Writing CSV headers for node NODE_msdb.dbo.sysutilitymidacexecutionstatisticsinternal_19175d46-7b48-4946-a7a6-c08a72dc6119
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucppolicyviolationsinternal_45e9277c-c428-443a-90bb-b4f7387fc6ee
  • Writing CSV headers for node NODE_master.dbo.MSreplicationoptions_ac1f99af-5ba3-4173-8098-6e0e2365c88b
  • Writing CSV headers for node NODE_msdb.dbo.sysschedules_8b9d4521-4171-4f7d-83cd-6b8d3aab0d9a
  • Writing CSV headers for node NODE_msdb.dbo.syscollectorcollectionsetsinternal_6b3a8bb7-e963-451f-a35d-42a99fd02894
  • Writing CSV headers for node NODE_msdb.dbo.smartbackupfiles_257645de-d660-4c5e-9ec4-a9971e6218fe
  • Writing CSV headers for node NODE_msdb.dbo.sysdbmaintplanjobs_865dd7de-f574-4504-95f5-77fe1c8c463c
  • Writing CSV headers for node NODE_msdb.dbo.syspolicyconditionsinternal_16b6b69c-716f-470c-98b5-e87202f94487
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpmifilespacehealthinternal_769a2f74-868d-4cae-beba-64f75f034fe8
  • Writing CSV headers for node NODE_msdb.sys.tracexeeventmap_95e047d5-9fbe-441f-a326-d07c4a13b177
  • Writing CSV headers for node NODE_msdb.dbo.MSdbmsdatatype_70d28bae-50e7-4cf8-b4bb-22ddc485228f
  • Writing CSV headers for node NODE_msdb.dbo.sysjobsteps_7a6df0fd-23d2-4655-837b-e014b1c42c92
  • Writing CSV headers for node NODE_msdb.dbo.syscollectorcollectortypesinternal_e730e68f-df4f-44ef-877a-2c67bc6c3960
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpspaceutilizationstub_a4b8a153-f1cf-4254-9009-68deac4a433b
  • Writing CSV headers for node NODE_master.sys.tracexeactionmap_684d87bc-3b30-44b0-8115-54aa6e13b64c
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpcomputercpuhealthinternal_5b19b886-917d-4121-a76f-367d6a2c386d
  • Writing CSV headers for node NODE_msdb.dbo.syspolicypolicycategoriesinternal_8292012d-f026-4e36-9694-e8a5ffe5ae8e
  • Writing CSV headers for node NODE_master.sys.tracexeeventmap_034026ce-d19c-4616-b9c4-4e34566c1eae
  • Writing CSV headers for node NODE_msdb.dbo.sysmaillog_4fcc4d68-1e1d-4f0a-b040-4f7a770976a1
  • Writing CSV headers for node NODE_master.dbo.sptfallbackdev_6ee72686-7a76-439f-b5a8-4ef531df025e
  • Writing CSV headers for node NODE_msdb.dbo.sysutilitymivolumesstageinternal_1612f37e-9eca-4d8a-9fd3-005bb672d037
  • Writing CSV headers for node NODE_msdb.dbo.systargetservergroups_b71ac567-a81e-4915-818a-f1ab1001d550
  • Writing CSV headers for node NODE_msdb.dbo.logshippingmonitorprimary_166710cb-1805-427c-9080-432bdbebe399
  • Writing CSV headers for node NODE_msdb.dbo.syspolicypolicyexecutionhistoryinternal_d527e9d2-53b9-41a2-a3c6-f635c7c8effb
  • Writing CSV headers for node NODE_msdb.dbo.backupmediafamily_1828bc79-33c4-428a-9a60-b1907cd27e24
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpaggregatedmihealthinternal_0f459624-f346-40a6-a515-3a44dfefc345
  • Writing CSV headers for node NODE_msdb.dbo.syspolicyconfigurationinternal_74943f2d-e7be-4b6d-8035-8d2af7268f7d
  • Writing CSV headers for node NODE_msdb.dbo.logmarkhistory_6f562fda-4bcd-42da-9ac2-a2680b591dfb
  • Writing CSV headers for node NODE_msdb.dbo.sysdbmaintplanhistory_133ddeb5-b55d-4513-96db-2f1627fca664
  • Writing CSV headers for node NODE_msdb.dbo.logshippingmonitorsecondary_205a2e9b-ba15-4247-9927-4c8e573be9f2
  • Writing CSV headers for node NODE_msdb.dbo.syscollectorexecutionloginternal_5150e7ea-e713-41a3-a420-7d7d48c61d70
  • Writing CSV headers for node NODE_msdb.dbo.sysmailaccount_c2343d2a-9f54-4d8b-9ce1-74188dd09a5f
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpconfigurationinternal_0a8e846b-1693-4522-8d23-b575a4dce984
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpaggregateddachealthinternal_1680f03d-0fff-440b-afc1-67a07bd7b9a6
  • Writing CSV headers for node NODE_msdb.dbo.restorefile_cbcfaaa0-a01f-4cae-9cce-c79a4f44011d
  • Writing CSV headers for node NODE_msdb.dbo.syspolicyfacetevents_69ef90ff-1a86-42bd-b903-175245dcc90d
  • Writing CSV headers for node NODE_msdb.dbo.syspolicyobjectsetsinternal_a5f16f1a-ed36-48eb-add2-8c77a3872c66
  • Writing CSV headers for node NODE_msdb.dbo.logshippingmonitorhistorydetail_71d62a4e-cac1-4a59-ab7f-01d7b288d34e
  • Writing CSV headers for node NODE_msdb.dbo.syspolicymanagementfacets_210771e0-95e1-4baa-a8e9-99b13aea8f49
  • Writing CSV headers for node NODE_netCore.Project.dbo.Student_c457034c-6a3c-43c5-ae32-a10d4073ee6b
  • Writing CSV headers for node NODE_msdb.dbo.sysdbmaintplans_40b4d238-5f72-4643-9ccf-3bb0c654fb3c
  • Writing CSV headers for node NODE_msdb.dbo.sysproxylogin_bdc3aec7-a501-42ee-8546-c14c18bf83d7
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpcpuutilizationstub_48717dea-40c3-4594-8282-8ae981ed6694
  • Writing CSV headers for node NODE_master.dbo.sptfallbackusg_bb70aa03-e5e1-40cb-b4fb-c7d05a49b9ac
  • Writing CSV headers for node NODE_msdb.dbo.sysmaintplansubplans_b2c28267-a6bc-412e-b0ed-61c66e728052
  • Writing CSV headers for node NODE_msdb.dbo.sysdacinstancesinternal_8b860224-6fef-4c1f-97c1-5d27c88c0674
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucphealthpoliciesinternal_e78d4734-2ce4-4963-8902-f08f90f7f256
  • Writing CSV headers for node NODE_msdb.dbo.sysmaintplanlogdetail_b24717fd-5089-4cbf-aa59-ef0152c55844
    null
  • Writing CSV headers for node NODE_master.dbo.sptmonitor_3f1508cd-b4f9-463e-bbab-220eb750ce03
  • Writing CSV headers for node NODE_msdb.dbo.sysmailprincipalprofile_a183b54c-d9e1-4681-a0ce-9a8daf8f061d
  • Writing CSV headers for node NODE_msdb.dbo.sysoperators_3c0b6905-3245-4fd7-b55e-74b87f7aa75e
  • Writing CSV headers for node NODE_msdb.dbo.logshippingprimarydatabases_97ec72e6-67c5-44f9-9c83-367d3faa7ee3
  • Writing CSV headers for node NODE_msdb.dbo.logshippingsecondary_7c758d60-4d08-47f6-8629-66138559ca2f
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpmivolumespacehealthinternal_e682dae7-db3e-4fa0-b5e0-63f1c0c600fb
  • Writing CSV headers for node NODE_msdb.dbo.logshippingprimaries_edced672-8d57-4b10-8c50-0ee8eb83dc33
  • Writing CSV headers for node NODE_netCore.Project.sys.tracexeeventmap_4133a597-54da-4ff4-bd5b-fd5d8d227f0e
  • Writing CSV headers for node NODE_msdb.dbo.sysssispackages_8d3c4353-f41d-498e-b165-4d50aef4d7f6
  • Writing CSV headers for node NODE_msdb.dbo.restorefilegroup_fe6cee41-b60d-451d-a61c-acc14df895f2
  • Writing CSV headers for node NODE_msdb.dbo.sysjobstepslogs_01134257-d1e1-42aa-968c-51754d2bf5cb
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucppolicytargetconditionsinternal_9c1ed379-ab12-4bdf-89f7-785a9845a26b
  • Writing CSV headers for node NODE_msdb.dbo.logshippingsecondaries_901b2882-ca79-41d7-961d-15f8002d03f8
  • Writing CSV headers for node NODE_msdb.dbo.MSdbms_1c440cac-017d-45d4-8a65-133aedb9d0b1
  • Writing CSV headers for node NODE_msdb.dbo.sysdachistoryinternal_a0b188bf-ea6f-40d4-84b7-3ff33124feb7
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpmidatabasehealthinternal_1377bd4d-00c5-48eb-a182-ff07bc11ae02
  • Writing CSV headers for node NODE_msdb.dbo.syscollectortsqlquerycollector_e91e5e86-6f09-46bf-bdc9-d0c39608c67d
  • Writing CSV headers for node NODE_msdb.dbo.sysmailprofile_92ca024f-d5a0-4b28-8875-648319db4c80
  • Writing CSV headers for node NODE_msdb.dbo.sysjobservers_3d69725c-758a-4249-9eab-96a8d54d49d9
  • Writing CSV headers for node NODE_msdb.dbo.sysmanagementsharedregisteredserversinternal_df686d98-46d6-47e9-8f5b-3c3d0e540a5c
  • Writing CSV headers for node NODE_msdb.dbo.backupset_55f7edb0-1484-4ec1-9628-2ed77d66d85f
  • Writing CSV headers for node NODE_msdb.dbo.sysproxysubsystem_a4eed6f0-f3fe-4809-94ae-9ce5f9f88d5f
  • Writing CSV headers for node NODE_model.sys.tracexeeventmap_ee99a955-dbae-440a-ab7a-f7932839c637
  • Writing CSV headers for node NODE_msdb.dbo.sysmailattachmentstransfer_f1e2fb97-b5e8-4b65-bc89-5d225b2a952b
  • Writing CSV headers for node NODE_msdb.dbo.logshippingmonitoralert_a807b8c3-beff-474b-8a3a-554159f2bd97
  • Writing CSV headers for node NODE_msdb.dbo.backupfile_da96a6c3-f553-4728-a958-208f537ed6bc
  • Writing CSV headers for node NODE_msdb.dbo.MSdbmsdatatypemapping_2854a8c3-ac7f-44cb-a7be-7de05d4f82c0
  • Writing CSV headers for node NODE_msdb.dbo.suspectpages_e3d4d824-ed9d-4b40-9676-79d5e7495ab9
  • Writing CSV headers for node NODE_msdb.dbo.sysmailconfiguration_10dc2c8b-505c-4723-914a-4f08bfc46377
  • Writing CSV headers for node NODE_msdb.dbo.restorehistory_38cdcefb-53f4-45d3-ad0f-832e09b4153f
  • Writing CSV headers for node NODE_msdb.dbo.sysmailmailitems_69856dbb-da13-4e30-96a5-5035c12d040d
  • Writing CSV headers for node NODE_msdb.dbo.syspolicysystemhealthstateinternal_561218e3-c0a2-43a7-8c93-9b87b5a5d801
  • Writing CSV headers for node NODE_msdb.dbo.systargetservergroupmembers_e6d29850-816f-4904-9d1e-6ae385211ced
  • Writing CSV headers for node NODE_netCore.Project.dbo.sysdiagrams_32b71735-c8b4-41e8-96c5-c8be342e4e29
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucppolicycheckconditionsinternal_b5811036-5402-48b5-b386-328692a6f3c2
  • Writing CSV headers for node NODE_msdb.dbo.autoadmintaskagents_d3cd843c-e908-432a-be5e-b9c914d68ab6
  • Writing CSV headers for node NODE_msdb.dbo.autoadminmanageddatabases_cc33073e-255b-497c-94ad-b9e1d8489191
  • Writing CSV headers for node NODE_msdb.dbo.syscollectorblobsinternal_0b29f70f-4f18-4bb4-9d14-ff3dbe573d5d
  • Writing CSV headers for node NODE_msdb.dbo.sysjobhistory_a1eefe58-ecb1-4e0a-b7ee-a387e2be9159
  • Writing CSV headers for node NODE_msdb.dbo.sysutilitymismopropertiestocollectinternal_039bd0f0-4eb1-4ce1-bead-7377781cb7be
  • Writing CSV headers for node NODE_msdb.dbo.dmhadrautomaticseedinghistory_0aae5c99-be73-433b-82e8-be4208436059
  • Writing CSV headers for node NODE_tempdb.sys.tracexeeventmap_e37766b9-c64f-48d0-8bf2-a6a63bfaad2d
  • Writing CSV headers for node NODE_msdb.dbo.sysnotifications_e80dc19f-1ddd-4ea3-9f93-98d6a54357de
  • Writing CSV headers for node NODE_master.dbo.sptfallbackdb_7adb6afc-7ccb-4b95-a82b-92a4fd816a02
  • Writing CSV headers for node NODE_msdb.dbo.autoadmintaskagentmetadata_cbe9b1ae-5e51-48ac-b3ad-ef9a701b8a71
  • Writing CSV headers for node NODE_msdb.dbo.sysdbmaintplandatabases_927f8624-7a55-404b-a70f-513066f5e663
  • Writing CSV headers for node NODE_msdb.dbo.syssessions_91264b83-d27d-4624-bfde-19fde272929c
  • Writing CSV headers for node NODE_msdb.dbo.externallibrariesinstalled_207a2e64-500f-4754-974a-49ad2ca67e53
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpdatabasesstub_8fb23fbc-ccf8-405b-beac-c4682f124009
  • Writing CSV headers for node NODE_msdb.dbo.syspolicypolicyexecutionhistorydetailsinternal_7514aa9e-df10-4add-9b3a-081a9fc1d58e
  • Writing CSV headers for node NODE_tempdb.sys.tracexeactionmap_733831e7-3f04-4e69-8303-0a8ef7c2a7db
  • Writing CSV headers for node NODE_msdb.dbo.syssubsystems_e64896f3-10b0-4c38-85ab-492330326868
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpsmoserversstub_243944b1-6eec-4b28-979c-7271b3f34600
  • Writing CSV headers for node NODE_msdb.dbo.msdbversion_7c4a2faa-c0db-49eb-ad8b-626f89761592
  • Writing CSV headers for node NODE_msdb.dbo.sysmailquerytransfer_5c72b513-7232-40f4-a67a-898cdd28e0d4
  • Writing CSV headers for node NODE_msdb.dbo.backupfilegroup_50acb502-97bc-494a-936a-f3144793ba78
  • Writing CSV headers for node NODE_msdb.dbo.syspolicypolicycategorysubscriptionsinternal_01305b35-6518-4df6-b3f2-e59a435db883
  • Writing CSV headers for node NODE_model.sys.tracexeactionmap_24ff3bab-bdb3-4f31-96a0-244e44b97386
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpfilegroupswithpolicyviolationsinternal_e740265f-adc5-4d51-9f2c-97c723b32fdd
  • Writing CSV headers for node NODE_msdb.dbo.sqlagentinfo_313b7a48-582a-4910-8ce3-75965a1dd86b
  • Writing CSV headers for node NODE_msdb.dbo.sysutilitymismostageinternal_4750b35e-ce7d-441d-99b0-229d512f05c7
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpmihealthinternal_2b073e8c-67c9-48fe-a838-905486aa6bcf
  • Writing CSV headers for node NODE_msdb.dbo.sysssispackagefolders_25edced3-90ca-4b42-8c07-8a159dddd2ac
  • Writing CSV headers for node NODE_msdb.dbo.systaskids_11eac400-eced-4ad3-9495-76f123aa9791
  • Writing CSV headers for node NODE_msdb.dbo.sysalerts_c3c2f587-eb49-4bbd-bab9-cecec00c72bc
  • Writing CSV headers for node NODE_msdb.dbo.syscategories_90cd663a-1d30-4562-81e8-0ce8f5e3a354
  • Writing CSV headers for node NODE_msdb.dbo.logshippingsecondarydatabases_9cb1604c-e7c1-44be-a839-9b8e08133004
  • Writing CSV headers for node NODE_msdb.dbo.sysutilityucpdatafilesstub_094afc0c-bc00-490f-89f2-a429c1237344
  • Writing CSV headers for node NODE_msdb.dbo.sysutilitymiconfigurationinternal_ad0604e5-b4ba-40a8-a9a7-43adbdfd2dd0
  • Writing CSV headers for relationship REL_sysutilitymismoobjectstocollectinternal_51e53bff-fc20-4657-8303-27bbcab2b626
  • Writing CSV headers for relationship REL_backupmediaset_c7268356-d85c-40c0-b8c0-4289d3d028fe
  • Writing CSV headers for relationship REL_syspolicyconditionsinternal_9d36e21a-9c81-40f8-87cb-030b5a589a48
  • Writing CSV headers for relationship REL_sysdbmaintplans_0de8563a-d7da-4323-ab2d-f7ca8c7c0bac
  • Writing CSV headers for relationship REL_syspolicypoliciesinternal_ba5ac60d-b719-4441-82cd-028093972d5d
  • Writing CSV headers for relationship REL_sysjobs_a7738dc3-7d57-4fa1-834a-e83e1d09412f
  • Writing CSV headers for relationship REL_syspolicypoliciesinternal_0011754f-8498-4698-afba-25c350e0b97a
  • Writing CSV headers for relationship REL_sysmaintplanlog_92dbd414-06f5-45b0-9d46-3a86c1269171
  • Writing CSV headers for relationship REL_logshippingprimaries_b9936cf0-65f2-4e6a-9ac5-49e212173928
  • Writing CSV headers for relationship REL_sysdbmaintplans_6e06f593-e0a5-4017-b202-1a4a987fc8f9
  • Writing CSV headers for relationship REL_backupset_d3131322-ccfc-4b7c-b048-bc702c5b1ca1
  • Writing CSV headers for relationship REL_sysproxies_fb7798f1-5719-45de-8b46-7a29de1ff304
  • Writing CSV headers for relationship REL_sysjobs_df9b3caa-7c2b-4540-84ba-7ff5189c22df
  • Writing CSV headers for relationship REL_backupset_de0eb376-4d0d-460c-879c-3caa39104969
  • Writing CSV headers for relationship REL_syspolicypolicyexecutionhistoryinternal_aae207f3-8e84-4d7e-96b1-30d1eb14156b
  • Writing CSV headers for relationship REL_MSdbmsdatatype_e05f7329-fd73-4f9a-8167-d9911a0e8b5c
  • Writing CSV headers for relationship REL_syspolicymanagementfacets_cdfeeb5c-4914-47ec-9891-ea2a1f3da37a
  • Writing CSV headers for relationship REL_syspolicyconditionsinternal_0154148f-20e2-4b1d-bfc7-4f69901a4e0e
  • Writing CSV headers for relationship REL_syscollectorcollectortypesinternal_7f3eff7e-b396-4823-80d1-e1ab98a5ebea
  • Writing CSV headers for relationship REL_syspolicyobjectsetsinternal_a296cca1-280f-4ab3-af22-49d1d4e0779c
  • Writing CSV headers for relationship REL_backupmediaset_fdee91df-451b-453a-9db7-b9711ecd9f2b
  • Writing CSV headers for relationship REL_sysssispackages_8bea6cbb-6412-424c-941d-d0ce7d398f40
  • Writing CSV headers for relationship REL_syspolicymanagementfacets_fbd07748-a805-447d-8d8f-fa46b3f4aae4
  • Writing CSV headers for relationship REL_MSdbms_903e3a10-63c4-4644-944c-7cea7b38c71e
  • Writing CSV headers for relationship REL_syscollectorcollectionsetsinternal_18fb3bdd-ebe2-4e93-8482-cda12ed00757
  • Writing CSV headers for relationship REL_MSdbmsdatatypemapping_90736930-81ee-4500-9e0b-f80c6c7972b0
  • Writing CSV headers for relationship REL_sysmailaccount_cfd07ad0-89b9-42aa-9e9c-275ec1ea5fa9
  • Writing CSV headers for relationship REL_sysschedules_a8282c6c-c692-45c1-9bf4-5afa5371b333
  • Writing CSV headers for relationship REL_sysjobs_c7da18fe-a1a5-45f3-a4fc-e6ab6134ccc5
  • Writing CSV headers for relationship REL_restorehistory_0ac2ffb2-bb78-45c6-86cc-36ff90f6088a
  • Writing CSV headers for relationship REL_MSdbms_17834637-45d0-48f8-98ad-c8376cc238a5
  • Writing CSV headers for relationship REL_sysmailmailitems_eac4c376-c747-4d1a-8fe7-43678ec76b7c
  • Writing CSV headers for relationship REL_syscollectorexecutionloginternal_9fd57a4f-4c9c-4e4e-b011-bf7d2e847bea
  • Writing CSV headers for relationship REL_syspolicypolicycategoriesinternal_154e330f-02dc-4244-8af7-6c9c871a94f5
  • Writing CSV headers for relationship REL_syspolicypolicycategoriesinternal_6d55b306-614e-4bc5-b796-38753c6439c7
  • Writing CSV headers for relationship REL_syspolicyobjectsetsinternal_662dde96-43f0-4087-8f23-ed254cc3a315
  • Writing CSV headers for relationship REL_sysmanagementsharedservergroupsinternal_72d7c15b-a229-4f63-b417-72c61225d542
  • Writing CSV headers for relationship REL_sysmaintplansubplans_030ac924-ae43-41ff-aa04-b1ae895b1584
  • Writing CSV headers for relationship REL_syscollectorcollectionsetsinternal_4bd6a478-810f-46b7-83c2-5e6e154f5b7b
  • Writing CSV headers for relationship REL_sysjobsteps_3759c46f-5dbd-4954-bfe4-c07ca6edf4c7
  • Writing CSV headers for relationship REL_MSdbmsdatatype_58a42a3a-639a-43d6-95b3-57ffa39e594d
  • Writing CSV headers for relationship REL_MSdbms_14a3b4f9-f492-4d93-a6fb-5b03218449cd
  • Writing CSV headers for relationship REL_MSdbmsmap_d9561836-187c-47f2-abe8-eae27e038bdf
  • Writing CSV headers for relationship REL_restorehistory_95a0e8b2-6582-4261-967b-fb36cd692646
  • Writing CSV headers for relationship REL_sysjobs_d2354df0-c9e7-4041-8802-358b41adbb3a
  • Writing CSV headers for relationship REL_sysjobs_1ad7f304-ceed-495c-9bb9-156801e45f49
  • Writing CSV headers for relationship REL_syscollectorcollectionitemsinternal_2f1c4d45-a538-41a4-a32e-2a81d41e2141
  • Writing CSV headers for relationship REL_syspolicymanagementfacets_35949b66-34c9-4b17-a181-5f18c8974080
  • Writing CSV headers for relationship REL_sysssispackages_9b31cea6-9a2d-4388-9518-3ec413423f5e
  • Writing CSV headers for relationship REL_sysmailmailitems_4bbfbda2-6f81-4b01-ab9c-16c2afb0da61
  • Writing CSV headers for relationship REL_backupset_abe50bcb-9a88-4c0d-af5d-af65cf781114
  • Writing CSV headers for relationship REL_sysmailprofile_2c723567-38a0-47d1-ac66-e1bc14fbc984
  • Writing CSV headers for relationship REL_SYSPOLICYTARGETSETLEVELSINTERNAL_448f2584-3548-4fa4-acf1-0062860dcb22
  • Writing CSV headers for relationship REL_SYSMAILSERVER_a84b233b-4d62-4a9c-89d1-250a5e142277
  • Writing CSV headers for relationship REL_ENROLLMENT_4d226490-9a93-4142-9bde-2c3bdfc944c2
  • Writing CSV headers for relationship REL_SYSJOBACTIVITY_d468b627-379d-430e-a075-269d8cbb5ce4
  • Writing CSV headers for relationship REL_SYSJOBSCHEDULES_5820e24a-7c8e-4fc4-8b27-dcac1b7983e2
    null
    null
    null
    null
    Export time: 0.969 (s)
  • Creating Neo4j store from CSV...
  • Direct driver instance 2109798150 created for server address localhost:7687
  • Closing connection pool towards localhost:7687
  • Command failed due to error (CommandFailedException: Command failed [Command: 'C:\Users\jjonkman.Neo4jDesktop\neo4jDatabases\database-77d8ccaf-6573-4998-b813-cb68ab84343f\installation-3.5.0\bin\neo4j-import.bat --f C:\Users\jjonkman.Neo4jDesktop\neo4jDatabases\database-77d8ccaf-6573-4998-b813-cb68ab84343f\installation-3.5.0\import\csv-003\neo4j-admin-import-params', CommandResult { ExitValue: 1, Stdout: '', Stderr: 'WARNING: neo4j-import is deprecated and support for it will be removed in a future
    version of Neo4j; please use neo4j-admin import instead.

Input error: Missing argument 'into'
Caused by:Missing argument 'into'
java.lang.IllegalArgumentException: Missing argument 'into'
at org.neo4j.kernel.impl.util.Converters.lambda$mandatory$0(Converters.java:39)
at org.neo4j.helpers.Args.interpretOption(Args.java:542)
at org.neo4j.tooling.ImportTool.main(ImportTool.java:393)
at org.neo4j.tooling.ImportTool.main(ImportTool.java:349)
', DurationMillis: 2216 }]). Rerun with --debug flag for detailed diagnostic information.

Explain what Bulk and Online import mean

The ETL tool does its best to explain every step needed for a successful import, but IMHO it falls short at the point where the import mode is chosen. A short explanation (tooltip, help bubble, or fixed content) of what will happen when the user continues with Import Data would be helpful.

Import from PostgreSQL fails on boolean data type

If one of the tables has a boolean column, the import using Cypher Shell fails when the data is written to the CSV. The error is:

  • Command failed due to error (RuntimeException: org.postgresql.util.PSQLException: Bad value for type byte : t). Rerun with --debug flag for detailed diagnostic information.

The complete error from the log is attached here:
etl_err_psql_boolean.txt
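The likely root cause (a sketch under assumptions, not the tool's actual code): PostgreSQL's text protocol sends boolean values as the literals 't' and 'f', so if the exporter maps the column to a numeric byte type, parsing fails exactly as reported. A minimal Python illustration:

```python
# Hypothetical illustration of the failure mode (these functions are made up,
# this is not neo4j-etl's code): PostgreSQL sends booleans as 't'/'f'.

def parse_as_byte(value: str) -> int:
    # Mimics a numeric byte parser: 't' raises, like "Bad value for type byte : t".
    return int(value)

def parse_as_bool(value: str) -> bool:
    # A boolean-aware mapping handles the PostgreSQL literals cleanly.
    return {"t": True, "f": False}[value]
```

Mapping the column to a boolean (or casting it to an integer in the source query) would avoid the parser mismatch.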

How do you specify the "--output-mapping-file" path?

(screenshot attached)

COMMAND: java -cp "C:\Users\FAS01.Neo4jDesktop\graphApps\neo4j-etl-ui/dist/neo4j-etl.jar" org.neo4j.etl.NeoIntegrationCli generate-metadata-mapping --rdbms:url "jdbc:mysql://192.168.249.241:3306/btc?autoReconnect=true&useSSL=false&useCursorFetch=true" --rdbms:password "neo4j" --rdbms:user "root" --schema "btc" --output-mapping-file "undefined/import/mysql_btc_btc_mapping.json"

  • Skipping reading import options from file because file [] doesn't exist.
  • Command failed due to error (FileNotFoundException: undefined\import\mysql_btc_btc_mapping.json (The system cannot find the path specified)). Rerun with --debug flag for detailed diagnostic information.

The above error occurs when the "Start Mapping" button is pressed. How do you specify the "--output-mapping-file" path?
Client OS: Windows 7
Neo4j server OS: Ubuntu 14.04
MySQL server OS: Ubuntu 14.04

Error when importing data from Oracle 12c in a Docker container

I am using the neo4j-etl tool to import data from Oracle 12c into Neo4j. Oracle is set up in Docker. However, when I run neo4j-etl I get the following error.

java.lang.IllegalArgumentException: No enum constant org.neo4j.etl.sql.metadata.SqlDataType.BINARY_DOUBLE
    at java.lang.Enum.valueOf(Enum.java:238)
    at org.neo4j.etl.sql.metadata.SqlDataType.valueOf(SqlDataType.java:7)
    at org.neo4j.etl.sql.metadata.SqlDataType.parse(SqlDataType.java:69)
    at org.neo4j.etl.sql.metadata.TableInfoAssembler.lambda$createColumnsMap$3(TableInfoAssembler.java:66)
    at org.neo4j.etl.sql.metadata.TableInfoAssembler$$Lambda$10/1403704789.apply(Unknown Source)
    at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
    at java.util.HashMap$EntrySpliterator.forEachRemaining(HashMap.java:1683)
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:512)
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:502)
    at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
    at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
    at org.neo4j.etl.sql.metadata.TableInfoAssembler.createColumnsMap(TableInfoAssembler.java:70)
    at org.neo4j.etl.sql.metadata.TableInfoAssembler.createTableInfo(TableInfoAssembler.java:43)
    at org.neo4j.etl.commands.DatabaseInspector.buildSchema(DatabaseInspector.java:51)
    at org.neo4j.etl.commands.DatabaseInspector.buildSchemaExport(DatabaseInspector.java:39)
    at org.neo4j.etl.commands.rdbms.GenerateMetadataMapping.call(GenerateMetadataMapping.java:92)
    at org.neo4j.etl.commands.rdbms.GenerateMetadataMapping.call(GenerateMetadataMapping.java:28)
    at org.neo4j.etl.cli.rdbms.ExportFromRdbmsCli.createMetadataMappings(ExportFromRdbmsCli.java:260)
    at org.neo4j.etl.cli.rdbms.ExportFromRdbmsCli.run(ExportFromRdbmsCli.java:219)
    at org.neo4j.etl.util.CliRunner.run(CliRunner.java:42)
    at org.neo4j.etl.util.CliRunner.run(CliRunner.java:35)
    at org.neo4j.etl.NeoIntegrationCli.main(NeoIntegrationCli.java:44)

Here is my command

Rohans-MacBook-Pro:neo4j-etl-cli-1.1.0 rohankharwar$ ./bin/neo4j-etl oracle export --url jdbc:oracle:thin:H000/password@localhost:1521/xe --schema H000 --user H000 --password password --import-tool /Users/rohankharwar/Documents/neo4j-enterprise-3.2.2/bin  --options-file /Users/rohankharwar/tmp/nokia/options.json --csv-directory /Users/rohankharwar/tmp/nokia --destination /Users/rohankharwar/Documents/neo4j-enterprise-3.2.2/data/databases/nokiacomptel.db/ --quote '"' --force
- Skipping reading import options from file because file [/Users/rohankharwar/tmp/nokia/options.json] doesn't exist.
Creating RDBMS to CSV mappings...
Command failed due to error (IllegalArgumentException: No enum constant org.neo4j.etl.sql.metadata.SqlDataType.BINARY_DOUBLE). Rerun with --debug flag for detailed diagnostic information.

Mapping shows the previous mapping, not the one just generated

To reproduce, create two connections:

1. Map the first connection, click Next, and import.
2. Close the ETL tool.
3. Reopen the ETL tool.
4. Select the other connection.
5. Click Start Mapping.

The previous mapping is shown, and clicking Next shows the tables, columns, and node diagram from the first mapping.

space character causes mapping exception

During import, the following exception is thrown if there's a space character in the property name, e.g. if the MSSQL table or column name contains a space:

- Creating Neo4j store from CSV...

- Command failed due to error (CommandFailedException: Command failed [Command: 'C:\Users\user\.Neo4jDesktop\neo4jDatabases\database\installation-3.4.0\bin\neo4j-import.bat --into C:\Users\user\.Neo4jDesktop\neo4jDatabases\database\installation-3.4.0\data\databases\graph.db --nodes C:\Users\user\.Neo4jDesktop\neo4jDatabases\database\installation-3.4.0\installation-3.4.0\import\csv-009\Weather.dbo\NODE_Weather.dbo.WeatherShort_1fe6c2de-67d6-408c-b8d6-acce734d87a9_headers.csv,C:\Users\user\.Neo4jDesktop\neo4jDatabases\database\installation-3.4.0\installation-3.4.0\import\csv-009\Weather.dbo\NODE_Weather.dbo.WeatherShort_1fe6c2de-67d6-408c-b8d6-acce734d87a9.csv --delimiter , --array-delimiter ; --id-type STRING --multiline-fields false', CommandResult { ExitValue: 1, Stdout: '', Stderr: 'At line:1 char:835

+ ... d6-408c-b8d6-acce734d87a9.csv --delimiter , --array-delimiter ; --id- ...

+ ~

Missing closing ')' in expression.

At line:1 char:840
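The Stderr above is PowerShell complaining about an unquoted command line, which suggests the generated neo4j-import invocation does not quote file paths; a space carried over from a table or column name then splits one argument into several. A small Python sketch (with hypothetical paths) of the effect:

```python
import shlex

# Hypothetical paths illustrating the problem: an unquoted path containing a
# space becomes two tokens, while a quoted path stays one argument.
unquoted = r"--nodes C:\import\NODE_Weather.dbo.Weather Short_headers.csv"
quoted = '--nodes "C:\\import\\NODE_Weather.dbo.Weather Short_headers.csv"'

def arg_count(cmdline: str) -> int:
    # posix=False keeps Windows-style backslashes intact.
    return len(shlex.split(cmdline, posix=False))
```

Quoting every path in the generated command (or avoiding spaces in property names) would sidestep the parse error.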

Neo4j ETL UI cannot save mapping

Hi Feedback,

I have run into an issue with the ETL UI wizard. I can connect and map a relational database to a graph, but when I get to the “Explore and change your metadata” page I cannot save using the “Save Mapping” button, and the "Next" button does not become active.

Example screen capture attached; I have tried this on two different databases.

The NeoIntegrationCli process has errors and does not exit

I was testing with an Oracle HR example database (SDP_VM), and there were errors in the NeoIntegrationCli process that were not shown in the etl-gui. In fact, the etl-gui remained in its initial state. Only after I killed the process did I see the output of the NeoIntegrationCli in the etl-gui.
This was with ETLApp 0.0.13 and Neo4jDesktop 1.0.13 (1.0.13.477)

  • The NeoIntegrationCli process did not exit on error.

  • So no output of the NeoIntegrationCli is shown in the etl-gui while it is processing (or the error occurs too fast and freezes the process).

Error: Could not find or load main class org.neo4j.etl.NeoIntegrationCli

I have installed neo4j-etl-cli v1.3.1, but am unable to get it to execute. Other Java-related things (like the neo4j console) seem to work fine.

$ pwd
/usr/local/Cellar/neo4j/3.5.3/libexec

$ uname -a
Darwin TWC-T86VGH03Q 17.7.0 Darwin Kernel Version 17.7.0: Thu Dec 20 21:47:19 PST 2018; root:xnu-4570.71.22~1/RELEASE_X86_64 x86_64 i386 MacBookPro13,3 Darwin

$ echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk1.8.0_212.jdk/Contents/Home

$ $NEO4J_HOME/bin/neo4j-etl
Error: Could not find or load main class org.neo4j.etl.NeoIntegrationCli
It seems 'java' command is not available.
Please check your JAVA_HOME environment variable.
Also check if you have a valid Java 8 environment

I'm guessing this is my environment, but I haven't been able to figure it out. I've uninstalled and reinstalled Java, but the same symptom persists.

Possible to avoid the LoadCSV step?

Hi @jexp / maintainers,
I am trying to online-import a database that contains JSON values in its columns, and I notice that when they get loaded into CSV they break the CSV format.

Can we modify this to use https://neo4j-contrib.github.io/neo4j-apoc-procedures/#load-jdbc to overcome this?

I was able to use this strategy successfully with hand-rolled Cypher; it would be great if the ETL tool could adopt it.

Happy to help raise a PR with some guidance as well.

Ignoring unknown data types

Hello maintainers,

I noticed that while importing a PostgreSQL database with some columns of type oid and tsvector, the mapping step was ignoring them, probably because there is no mapping for these data types.

I would suggest having default mappings for unknown data types, or a choice to ignore specific data types.
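The suggestion above can be sketched as a lookup with a fallback instead of a hard failure; the type names and defaults here are assumptions for illustration, not neo4j-etl's actual mapping table:

```python
from typing import Optional

# Assumed, simplified type table for illustration only.
KNOWN_TYPES = {
    "INTEGER": "long",
    "VARCHAR": "string",
    "BOOLEAN": "boolean",
}

def map_sql_type(sql_type: str, default: Optional[str] = "string") -> Optional[str]:
    # Unknown types (e.g. oid, tsvector) fall back to a default mapping;
    # passing default=None lets the caller ignore the column instead.
    return KNOWN_TYPES.get(sql_type.upper(), default)
```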

Error running etl on Windows 10

When I try to run neo4j-etl help in the Windows Command Prompt or PowerShell, I only get this error:

The system cannot find the file specified.
~0,-1
Error: Could not find or load main class org.neo4j.etl.NeoIntegrationCli

I get this error on 2 different PCs, both with Windows 10 Pro (both up to date). I have environment variables for NEO4J_HOME and JAVA_HOME.
neo4j-etl-cli-1.2.0-BETA01 was unzipped.

On my PC at home I downloaded Community Edition 3.3.2, downloaded and unzipped neo4j-etl-cli-1.2.0-BETA01-release.zip from GitHub, added NEO4J_HOME to the environment variables, rebooted, and tried to execute neo4j-etl in the Windows Command Prompt from the bin directory of neo4j-etl. Is there anything else to do?

Here are the screenshots from my PC at work: (screenshots attached)

Add the possibility to remove table mappings

In many cases you do not want to load the complete database, especially with very big databases, so there must be a way to remove tables from the table mapping. Normally you know your RDBMS, so it may also be useful to apply a filter before retrieving the metadata from the RDBMS; that would save time.
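A pre-crawl filter like the one suggested could look roughly like this (a sketch with hypothetical glob patterns, not an existing neo4j-etl option):

```python
import fnmatch

def filter_tables(tables, include=("*",), exclude=()):
    # Keep tables matching any include pattern, then drop those matching
    # any exclude pattern, before metadata is retrieved for them.
    kept = [t for t in tables if any(fnmatch.fnmatch(t, p) for p in include)]
    return [t for t in kept if not any(fnmatch.fnmatch(t, p) for p in exclude)]
```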

Example code for using neo4j-etl with Microsoft SQL Server on local Docker

Hello,

I've had a discussion on the Neo4j #help-import Slack channel about using neo4j-etl to export data from a local Microsoft SQL Server running in Docker. With help from the great people there I got it working. Because documentation on this is lacking, I wrote up what I did in case it is useful for others.

My settings:

  • OSX
  • Docker installed
  • neo4j-etl version neo4j-etl-cli-1.2.0-BETA01
  • Java version: 1.8.0_131
  • Java $JAVA_HOME /Library/Java/JavaVirtualMachines/jdk1.8.0_131.jdk/Contents/Home
  • Neo4j enterprise

Setup:

1. Setup Docker for Microsoft SQL Server

create file: docker-compose.yml

version: "3"

services:
  mssql:
    image: microsoft/mssql-server-linux
    container_name: Test_Mssql_Server
    environment:
      ACCEPT_EULA: "Y"        # quoted so YAML does not coerce Y to a boolean
      SA_PASSWORD: "Password"
    ports:
      - 1433:1433
    volumes:
      - mssqldata:/var/opt/mssql

volumes:
  mssqldata:

Start docker with $ docker-compose up

2. Import the Northwind demo SQL data into the MSSQL DB (or use your own)

  • Download Northwind.Ms.SQL.2005.sql from here
  • Import the SQL into MSSQL (I used Aqua Data Studio)

3. Download neo4j-etl

4. Install Microsoft SQL driver

  • Download here
  • Add driver .jar file (mssql-jdbc-6.2.2.jre8.jar) to /lib folder

5. Start Neo4j server

Example code

Generate-metadata-mapping

./bin/neo4j-etl generate-metadata-mapping \
 --rdbms:url "jdbc:sqlserver://localhost:1433;databaseName=Northwind" \
 --rdbms:user SA --rdbms:password Password \
 --rdbms:schema Northwind.dbo > ./mapping.json

Import to Neo4j database

Only tested with --using cypher:direct

./bin/neo4j-etl export \
 --rdbms:url "jdbc:sqlserver://localhost:1433;databaseName=Northwind" \
 --rdbms:user SA --rdbms:password Password --rdbms:schema Northwind.dbo \
 --using cypher:direct \
 --neo4j:url bolt://localhost:7687 --neo4j:user <neo4j-user> --neo4j:password <neo4j-pass> \
 --import-tool <path-to-neo4j-home>/bin \
 --csv-directory <path-to-neo4j-home>/import

ETL Mapping fails when using Windows Authentication to MS SQL Server

ETL 1.3.2
Neo4J desktop 1.1.13

I am connecting ETL 1.3.2 to a Microsoft SQL Server using Windows Authentication.

I set things up according to http://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=11774,
with the appropriate DLL, and added its directory to java.library.path (i.e. the Windows PATH on the desktop).

I left the username and password blank and appended "integratedSecurity=true",

i.e.
jdbc:sqlserver://10.0.0.102:1433;databaseName=northwind;integratedSecurity=true

Connect and save work, no worries.

When I attempt to map, the log tells me the ETL tool is complaining about a missing username and password, so it seems to spawn etl-cli.

It looks like etl-cli doesn't support Windows Authentication, which seems logical given there are no examples of it.
Is this on the roadmap?

Error on Oracle connection

When trying to connect to Oracle (using ojdbc6, and also trying ojdbc5), it shows the error message below:

Exception in thread "main" java.lang.AbstractMethodError: oracle.jdbc.driver.T4CConnection.getSchema()Ljava/lang/String; at

From reading around on the internet, it seems there might be some compatibility issues.

We also repeated the same steps with different Oracle databases, with the same result.

We then repeated the steps with ojdbc6dms, which shows a different error message:

Exception in thread "main" java.lang.NoClassDefFoundError: oracle/dms/console/DMSConsole at oracle.jdbc.driver.DMSFactory.

Any advice?

List schemas found

When running ETL and getting the following output, it would be useful to list the schemas found, so it is easier to debug why nothing matches:

    Skipping reading import options from file because file [] doesn't exist.
    Creating RDBMS to CSV mappings...
    Using database plugin for
    Crawling schemas
    Retrieving all schemas
    Retrieving all catalogs
    Processed 14 rows for
    Retrieved 0 schemas
    Total time taken for - 00:00:00.177 hours
    99.4% - 00:00:00.176 -
    0.6% - 00:00:00.001 -
    Command failed due to error (SQLException: No matching schemas found). Rerun with --debug flag for detailed diagnostic information.
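In the meantime, a debugging sketch using only the documented options (`<url>`, `<user>`, `<password>`, and `<schema>` are placeholders): passing the schema explicitly with `--rdbms:schema` and re-running with `--debug` helps narrow down whether this is a schema name or case mismatch.

```shell
./bin/neo4j-etl export \
 --rdbms:url <url> --rdbms:user <user> --rdbms:password <password> \
 --rdbms:schema <schema> \
 --destination $NEO4J_HOME/data/databases/graph.db/ --import-tool $NEO4J_HOME/bin \
 --csv-directory $NEO4J_HOME/import --debug
```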

Escaping characters in CSVs generated by neo4j etl tool

I am trying to export data from MySQL to Neo4j using the neo4j-etl tool with --using cypher:direct, but the neo4j-etl export command gets stuck after generating the CSV files. It remains stuck at
INFO: Creating Neo4j store from CSV...
without showing any error or stopping. Investigating the Cypher file generated by neo4j-etl, I think the problem is that the CSV files contain characters like newlines and quotes in the values. Do you think that could be the cause, and is there a way to escape those characters when the CSV files are created?
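For what it's worth, embedded newlines and quotes do break naive CSV generation. A minimal sketch of the RFC 4180-style quoting that CSV consumers such as neo4j-import conventionally expect — wrap the field in double quotes and double any embedded quote (`quoteCsvField` is a hypothetical helper for illustration, not part of the neo4j-etl API):

```java
public class CsvQuote {
    // Quote a field RFC 4180-style: wrap in double quotes and double any
    // embedded quote, but only when the value actually needs quoting
    // (contains a delimiter, a quote, or a line break).
    static String quoteCsvField(String value) {
        if (value.contains(",") || value.contains("\"")
                || value.contains("\n") || value.contains("\r")) {
            return "\"" + value.replace("\"", "\"\"") + "\"";
        }
        return value;
    }

    public static void main(String[] args) {
        System.out.println(quoteCsvField("plain"));        // unchanged
        System.out.println(quoteCsvField("line1\nline2")); // quoted, newline kept
        System.out.println(quoteCsvField("say \"hi\""));   // quoted, quotes doubled
    }
}
```

If the generated CSVs don't quote fields this way, post-processing them before the import step is one way to test whether this is indeed the cause of the hang.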
