
csv-import's People

Contributors

akshayawate, alotropico, ciminelli, civan, d-lite, gammons, handotdev, mpatin, sairanjan, samurai2y, shriansh2002, srinivasaraobodepudi, yobulkmaster

csv-import's Issues

Unable to start container tableflow-backend-1

I get an error when starting Docker Compose: the tableflow-backend-1 container (image registry-1.docker.io/tableflow/admin-server) fails to start.

Logs:

2023-12-15T02:31:52.628Z        FATAL   service/services.go:59  Error initializing Scylla       {"error": "gocql: unable to create session: unable to discover protocol version: read tcp 172.30.0.4:39981->172.17.0.1:9042: read: connection reset by peer"}
tableflow/go/internal/service.InitServices
        /app/go/internal/service/services.go:59
main.main
        /app/go/cmd/main.go:17
runtime.main
        /usr/local/go/src/runtime/proc.go:250
2023-12-15T02:32:00.339Z        FATAL   service/services.go:59  Error initializing Scylla       {"error": "gocql: unable to create session: unable to discover protocol version: read tcp 172.30.0.4:35519->172.17.0.1:9042: read: connection reset by peer"}
tableflow/go/internal/service.InitServices

and port 3001 shows:

[screenshot attached]

How to show loading indicator during POST processing?

Hi there, thanks a lot for the incredible work on this great tool! I see in #134 that there used to be some work on a loading indicator, but the branch got deleted and was never merged into main. Is that functionality already in main and I am missing it? If so, how do we show a loading indicator while POSTing results from the tool to a backend?

Thanks a lot for your answers :)
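
For anyone needing a stopgap, here is a minimal sketch (assuming the component props shown in other issues here; sendToBackend and the /api/import endpoint are hypothetical) that drives a spinner from component state while the POST runs:

    import { useState } from "react";
    import { CSVImporter } from "csv-import-react";

    // Hypothetical backend call; replace with your own endpoint.
    async function sendToBackend(data: unknown): Promise<void> {
      await fetch("/api/import", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(data),
      });
    }

    export function ImporterWithSpinner() {
      const [isOpen, setIsOpen] = useState(false);
      const [isPosting, setIsPosting] = useState(false);

      return (
        <>
          {/* Swap this for your actual spinner component */}
          {isPosting && <p>Uploading…</p>}
          <CSVImporter
            modalIsOpen={isOpen}
            modalOnCloseTriggered={() => setIsOpen(false)}
            waitOnComplete={true}
            onComplete={async (data) => {
              setIsPosting(true);
              try {
                await sendToBackend(data);
              } finally {
                setIsPosting(false);
                setIsOpen(false);
              }
            }}
            template={{ columns: [{ name: "First Name", key: "first_name" }] }}
          />
          <button onClick={() => setIsOpen(true)}>Import CSV</button>
        </>
      );
    }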

Feature Request: Traefik Integration for Enhanced SSL Management in Docker with Let's Encrypt

Hello tableflowhq team,

As an avid user of tableflow within a Docker environment, I've run into some limitations due to the lack of integrated SSL support. I'm excited to propose the integration of Traefik for its seamless Docker compatibility and robust SSL management via Let's Encrypt.

Why Traefik?

  1. Automatic Service Discovery: Traefik effortlessly discovers and routes to services in Docker containers. Imagine updating services without manual reloads!

    # Example: Traefik auto-discovers new Docker services
    traefik:
      image: traefik:v2.3
      command:
        - "--providers.docker=true"
        - "--providers.docker.exposedbydefault=false"
  2. Native Let's Encrypt Support: SSL certificate management is a breeze with Traefik, keeping our applications consistently secure.

    # Example: Easy SSL with Let's Encrypt and Traefik
    command:
      - "--certificatesresolvers.myresolver.acme.tlschallenge=true"
      - "--certificatesresolvers.myresolver.acme.email=your-email@example.com"
  3. Real-Time Dynamic Configuration: Traefik updates its configuration in real time as Docker containers start or stop, perfectly aligning with Docker's dynamic nature.

    # Example: Traefik's real-time configuration updates
    command:
      - "--providers.docker.watch=true"
  4. Load Balancing and High Availability: Traefik is not just a reverse proxy; it's also a powerful load balancer, enhancing the availability and performance of our apps.

    # Example: Load balancing setup in Traefik
    labels:
      - "traefik.http.services.my-service.loadbalancer.server.port=80"

Request

I kindly ask for your consideration to integrate Traefik into tableflow. This integration would not only simplify SSL management but also bring operational benefits like automated service discovery, effective load balancing, and smooth configuration in our Docker-based environment.

I firmly believe that integrating Traefik would be a significant enhancement to tableflow, benefiting our entire community.

Eagerly awaiting your thoughts and feedback on this proposal.

Best regards

Custom request headers

Many APIs that I would like to use as an action target require header-based authentication. For example, it would be great to directly push events generated by inquery to Segment (HTTP Tracking API) to pipe the events to marketing tools and our data warehouse.

More generally: being able to set custom request headers (key/value, similar to Postman) would be a great add-on to inquery, reducing the need to proxy requests when integrating with more targets.
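
A hypothetical shape for such a config (the WebhookAction name and headers field are illustrative, not an existing API):

    // Illustrative only: a key/value header map per action target,
    // similar to headers in Postman.
    interface WebhookAction {
      url: string;
      method: "POST" | "PUT";
      headers?: Record<string, string>;
    }

    // Example target: Segment's HTTP Tracking API, which uses Basic auth
    // with the write key as the username and an empty password.
    const segmentTarget: WebhookAction = {
      url: "https://api.segment.io/v1/track",
      method: "POST",
      headers: {
        Authorization: "Basic " + btoa("SEGMENT_WRITE_KEY:"),
        "Content-Type": "application/json",
      },
    };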

Large files: option to return only the column mapping instead of the full content

Hi there.

We've already built an ugly but efficient piece of code that reads a CSV file (the first x lines) and lets the user specify the data type of each column. We then send the untouched file, along with the mapping, to the backend, which does the import.

With huge files (millions of lines), reading the entire file in the browser and returning the full result would overload it.
I propose an option to return only the mapping instead of the full result set.

I can work on it and propose an implementation, as I plan to replace our little piece of code with your library. (We already use it in parts of our apps where the uploaded files are not that big.)
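
A minimal sketch of the browser side of this idea: read only the first chunk of the file for header and type detection, then send the untouched File plus the mapping to the backend (uploadWithMapping and /api/import are hypothetical):

    // Read just the first ~64 KB of a potentially multi-gigabyte file,
    // enough to extract headers and sample rows for column mapping.
    async function sampleCsv(file: File, maxLines = 50): Promise<string[]> {
      const text = await file.slice(0, 64 * 1024).text();
      const lines = text.split(/\r?\n/);
      lines.pop(); // the last line may be cut mid-row by the byte slice
      return lines.slice(0, maxLines);
    }

    // Hypothetical upload: the untouched file travels with the mapping.
    async function uploadWithMapping(file: File, mapping: Record<string, string>) {
      const form = new FormData();
      form.append("file", file);
      form.append("mapping", JSON.stringify(mapping));
      await fetch("/api/import", { method: "POST", body: form });
    }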

Display connection error if front end cannot connect to API server

Whenever a component in React is loaded, call the api/health endpoint with a short (10-second) timeout to check whether the front end can connect to the API server.

If the connection is not successful, render an error page component with any details about the connection error.

We currently don't have an error page component but can make one quickly.
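
A sketch of the check itself, assuming a fetch-based client (the api/health path comes from this issue; the error page component does not exist yet):

    // Ping the health endpoint with a 10-second timeout.
    async function checkApiServer(baseUrl: string): Promise<boolean> {
      const controller = new AbortController();
      const timer = setTimeout(() => controller.abort(), 10_000);
      try {
        const res = await fetch(`${baseUrl}/api/health`, { signal: controller.signal });
        return res.ok;
      } catch {
        return false; // network error, CORS failure, or timeout
      } finally {
        clearTimeout(timer);
      }
    }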

Amazon S3 Upload

  • Ability to upload data to external storage such as AWS S3 buckets.

Can't open some XLSX files

This library works with some XLSX files, but not with others.

index.js:44810 Uncaught (in promise) TypeError: cell.trim is not a function
    at index.js:44810:110
    at Array.some (<anonymous>)
    at isNotBlankRow (index.js:44810:75)
    at Array.filter (<anonymous>)
    at index.js:44840:61
    at step (index.js:92:23)
    at Object.next (index.js:73:53)
    at index.js:66:71
    at new Promise (<anonymous>)
    at __awaiter (index.js:62:12)

Issue seems to be in the isNotBlankRow function.
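
A likely cause is that non-string cells (numbers, dates, or booleans produced by XLSX parsing) reach a .trim() call. A hedged sketch of a defensive version (the real isNotBlankRow lives in the library's bundled index.js; this reconstruction is illustrative):

    // Coerce each cell to a string before trimming, so numeric or
    // date cells coming out of XLSX parsing don't throw.
    function isNotBlankRow(row: unknown[]): boolean {
      return row.some((cell) => String(cell ?? "").trim() !== "");
    }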

Problematic file attached
problematic excel.xlsx

Problem with very large file size

Do you have a way to import files of several million records, with volumes of more than 10 GB?

CSV Separators

Hi!

I'm trying tableflowhq and it is not processing CSV files separated with ";"; it only works with a plain comma. Is this right?

Thanks!
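
Until the importer supports configurable delimiters, a simple client-side sniffer can pick between comma and semicolon before parsing (a minimal sketch, not part of the library):

    // Guess the delimiter by counting candidates in the header line.
    function sniffDelimiter(headerLine: string): "," | ";" {
      const commas = (headerLine.match(/,/g) ?? []).length;
      const semis = (headerLine.match(/;/g) ?? []).length;
      return semis > commas ? ";" : ",";
    }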

Error: Unsupported ZIP Compression method NaN

Using Next.js and trying to upload XLSX files, I get the error: Error: Unsupported ZIP Compression method NaN. I can use the XLSX library directly without issue. CSV files also work just fine. I tried using the component normally and through a dynamic load.

const DynamicImporter = dynamic(
  () => import("csv-import-react").then((mod) => mod.CSVImporter),
  { loading: () => <p>Loading...</p>, ssr: false }
);

<>
  <DynamicImporter
    modalIsOpen={isUploaderOpen}
    modalOnCloseTriggered={() => setIsUploaderOpen(false)}
    onComplete={(data) => console.log(data)}
    template={{
      columns: [
        {
          name: "First Name",
          key: "first_name",
          required: true,
          description: "The first name of the user",
          suggested_mappings: ["First", "Name", "firstname"],
        },
        {
          name: "Phone",
          data_type: "phone",
          suggested_mappings: ["Phone", "telephone", "tel"],
        },
      ],
    }}
    waitOnComplete={true}
    skipHeaderRowSelection={true}
  />
  <Button type="primary" onClick={() => setIsUploaderOpen(true)}>
    <Icon icon="upload" className={clsx("mr-2")} />
    Upload Contacts
  </Button>
</>

I am not sure how the XLSX library is used here, but it seems to treat the file as a ZIP archive. (XLSX files are ZIP containers, so the "Unsupported ZIP Compression method NaN" error suggests the file bytes are being mangled before they reach the parser.)

File import doesn't support non-ASCII content

When I attempt to import a UTF-8 encoded CSV file containing non-ASCII characters (such as Cyrillic), the characters are not decoded correctly. I'm uncertain whether the issue lies with the parser or the record list, but the result is garbled text.

I am also including the CSV file itself.
banki-makedonija.csv

[screenshot of the garbled output attached]
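
If the root cause is the file bytes being decoded with a non-UTF-8 default, explicitly decoding them as UTF-8 avoids the garbling (a sketch, assuming the parser can accept a pre-decoded string):

    // Decode the raw bytes as UTF-8 explicitly rather than relying on a
    // platform default, so Cyrillic and other non-ASCII text survives.
    async function readCsvAsUtf8(file: File): Promise<string> {
      const buf = await file.arrayBuffer();
      return new TextDecoder("utf-8", { fatal: false }).decode(buf);
    }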

May I ask, what is the reason for this?

Unable to connect. Check your connection and reach out to support if the problem persists.
[screenshot attached]

git clone https://github.com/tableflowhq/tableflow.git
cd tableflow
cp .env.example .env
docker-compose up -d

[screenshot attached]

RangeError: Maximum call stack size exceeded

Hi, I am trying this React library. I made it into a component, and the modal loads for a few seconds before disappearing. I noticed the URL gains a "?" while the modal loads.

[screen recording attached]

Upon checking the terminal it says:

RangeError: Maximum call stack size exceeded
    at RegExp.exec (<anonymous>)

What do you think I did incorrectly?

P.S. Don't mind the styling; I set customStyles and disabled darkMode.

Allow schema-free import

It would be useful to allow the user to also include columns that are not in the importer's template, especially if the use case is a generic data set or table upload.

This would have a positive side effect: the template of an importer can act as a data annotation tool rather than just a mapping. If there is a non-required property "address" in my template, the user can use it to "annotate" address columns so the underlying application can make sense of them.

An option to enable this behavior at the importer level would be great. I can put some work into it once I have more time.
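
One hypothetical shape for this (the allowUnmatchedColumns flag is illustrative, not an existing option):

    // Illustrative importer config: accept columns that are not declared
    // in the template, alongside the annotated ones.
    const template = {
      allowUnmatchedColumns: true, // hypothetical flag
      columns: [{ name: "Address", key: "address", required: false }],
    };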

NextJS ReferenceError

When building with Vercel, the following error occurs:

 ⨯ node_modules/csv-import-react/build/index.esm.js (44363:0) @ persist.name
 ⨯ ReferenceError: localStorage is not defined

File: csv-import-react/build/index.esm.js

Temp Fix:
theme: typeof window !== 'undefined' ? localStorage.getItem(STORAGE_KEY) : 'light',
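
As a consumer-side workaround, loading the component only on the client also avoids the module-level localStorage access during the server build (the same next/dynamic pattern used in another issue above):

    import dynamic from "next/dynamic";

    // Skip server-side rendering so localStorage is only touched in the browser.
    const CSVImporter = dynamic(
      () => import("csv-import-react").then((mod) => mod.CSVImporter),
      { ssr: false }
    );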

Add support for review screen and validations

Hello, congrats on this cool library!
I was trying to add csv-import-js to a Svelte app.
Everything works, but I don't understand whether there is a review screen: I have added many validations to the fields, but the importer doesn't show anything to let the user correct errors.
Thank you!

xls support?

Hi all,

This looks like a fantastic tool. I'm wondering if you have a public roadmap somewhere; a lot of my clients import XLSX files, and it would be great if Tableflow supported that. Thanks!

Custom fields

  • Ability for the user to add custom fields (columns), differentiated with a "user-defined" constraint.

Ports are hardcoded - self-hosting with different ports is not working

I followed the instructions to test with Docker, but even with the .env file set correctly, the ports were not working.

So far I found the port 3003 hardcoded here:
https://github.com/tableflowhq/tableflow/blob/83ba1a5843a59fa41793d67b26183d6b1530c613/admin-ui/src/api/api.ts#L42C48-L42C52

The docker-compose.base.yml also has port 3003 hardcoded. In this case I had to change the ports definition and the healthcheck to 3004, plus set the .env variable TABLEFLOW_API_SERVER_PORT=3004:

  backend:
    platform: linux/amd64
    ports:
      - "3004:3004"
    networks:
      - app_network
    env_file:
      - .env
    volumes:
      - ${PWD}/../.env:/.env
      - tmp:/tmp/tableflow-files
    stop_signal: SIGTERM
    stop_grace_period: 30s
    restart: on-failure
    healthcheck:
      test: >
        bash -c 'exec 3<>/dev/tcp/127.0.0.1/3004 && echo -e "GET /public/health HTTP/1.1\r\nhost: 127.0.0.1:3004\r\nConnection: close\r\n\r\n" >&3 && cat <&3 | grep "ok"'

Even then I can't make it work; it seems the env variables are not used consistently.

It looks so cool that I wanted to test it!

url fail

docker-compose up -d -> the web page fails to open.

[screenshot attached]

ReferenceError: exports is not defined in ES module scope

Has anyone run into this issue? I am attempting to use the React package in Next.js.

ReferenceError: exports is not defined in ES module scope
This file is being treated as an ES module because it has a '.js' file extension and '.../node_modules/csv-import-react/build/package.json' contains "type": "module". To treat it as a CommonJS script, rename it to use the '.cjs' file extension.

Add support for SSH tunnels

This should be an option to connect to the database instead of connecting to the Postgres host directly.

We'll need to:

  1. Add a new option to the "Connect Database" page
  2. Update the connect-database endpoint to support the SSH options (server host, connection port, login username, key or password authentication); a payload sketch follows below
  3. Support SSH key authentication or password authentication

More info can be found with how Airbyte supports this type of connection: https://docs.airbyte.com/integrations/sources/postgres/#connect-via-ssh-tunnel
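
A hypothetical shape for that payload (field names are illustrative, mirroring the options listed above):

    // Illustrative request body: the existing connection fields plus
    // optional SSH tunnel settings supporting key- or password-based auth.
    interface ConnectDatabaseRequest {
      host: string;
      port: number;
      database: string;
      user: string;
      password: string;
      sshTunnel?: {
        host: string;
        port: number;        // typically 22
        username: string;
        privateKey?: string; // key-based auth
        password?: string;   // or password-based auth
      };
    }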

Enhance README for external contributors

Hello.

I wanted to try to solve the XLSX issue I shared, but could not even launch the project.

I cloned the project and ran yarn install and yarn test, but that did not work.
Could you improve the documentation on how to launch the project, modify it, and test it?

Wildcard / Unmatched fields

Hi Tableflow team, thanks for your nice work!

I'm using csv-import to import a CSV with a lot of columns. For future use, we want to save all unmatched columns in our DB as well, so we need to receive them.

Is there a possibility to get those columns? If not, it would be nice if we could get them via onSubmit as a second param, or via a wildcard in the template, which might be less clean.
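
A hypothetical sketch of the second-param variant (this parameter does not exist today; the shape is illustrative):

    // Illustrative: unmatched columns delivered alongside the mapped data.
    type UnmatchedColumn = { header: string; values: string[] };

    function onSubmit(data: unknown, unmatched: UnmatchedColumn[]) {
      // Persist the unmatched columns for future use.
      console.log(unmatched.map((c) => c.header));
    }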

Initial setup: non-default port is sent as a string in JSON and therefore cannot be unmarshaled into the Go struct

Using a non-default Postgres port causes the frontend to send the JSON field port as a string, which leads to an unmarshal error.

Error: json: cannot unmarshal string into Go struct field ConnConfig.port of type uint16

Call to http://localhost:3003/api/v1/connection

{
    "host": "localhost",
    "port": "5442",
    "database": "postgres",
    "user": "postgres",
    "password": "postgres"
}

With the default value left in place, the correct JSON is submitted:

{
    "host": "",
    "port": 5432,
    "database": "postgres",
    "user": "postgres",
    "password": "postgres"
}
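
A minimal frontend-side fix would be to coerce the field before serializing (a sketch; the real form code lives in admin-ui):

    // Coerce the port input (a string from the form field) to a number so
    // the Go backend can unmarshal it into the uint16 ConnConfig.port field.
    function buildConnectionPayload(form: {
      host: string;
      port: string;
      database: string;
      user: string;
      password: string;
    }): string {
      return JSON.stringify({ ...form, port: Number(form.port) }); // "5442" -> 5442
    }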
