
minerva-ml / open-solution-ship-detection

61 stars · 11 watchers · 22 forks · 1.67 MB

Open solution to the Airbus Ship Detection Challenge

Home Page: https://www.kaggle.com/c/airbus-ship-detection

License: MIT License

Languages: Python 96.01%, Jupyter Notebook 3.99%

Topics: data-science machine-learning deep-learning deep-neural-networks unet unet-image-segmentation python python3 pytorch pytorch-implmention

open-solution-ship-detection's People

Contributors

dependabot[bot] · jakubczakon · kamil-kaczmarek · kant · varal7


open-solution-ship-detection's Issues

modify lovasz

def symmetric_lovasz(outputs, targets):
    return (lovasz_hinge(outputs, targets) + lovasz_hinge(-outputs, 1 - targets)) / 2
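A minimal sketch of the proposed symmetrization, using a plain per-pixel hinge loss as a stand-in for lovasz_hinge (the stand-in is an assumption for illustration; the real lovasz_hinge comes from the repository's loss code):

```python
def hinge_loss(outputs, targets):
    # Stand-in for lovasz_hinge: plain hinge on logits vs. {0, 1} targets.
    signs = [2.0 * t - 1.0 for t in targets]
    per_pixel = [max(0.0, 1.0 - o * s) for o, s in zip(outputs, signs)]
    return sum(per_pixel) / len(per_pixel)

def symmetric_lovasz(outputs, targets):
    # Average the loss on (outputs, targets) with the loss on the
    # flipped problem (-outputs, 1 - targets), as proposed above.
    flipped = hinge_loss([-o for o in outputs], [1 - t for t in targets])
    return (hinge_loss(outputs, targets) + flipped) / 2.0

outputs = [0.5, -1.2, 2.0, 0.1]   # logits
targets = [1, 0, 1, 0]            # binary masks
print(symmetric_lovasz(outputs, targets))  # -> 0.4
```

The averaged loss is by construction invariant under swapping foreground and background, which is the point of the suggested modification.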

Local vs Cloud execution

Hello, thank you for putting together this excellent baseline.

I have two questions:

  1. What are these two directories (the others are self-explanatory)?
  annotation_file:      /path/to/data
  masks_overlayed_dir:  /path/to/data
  2. When I execute neptune run --config configs/neptune.yaml main.py prepare_masks I get
Calculated experiment snapshot size: 57.45 GB   
Sending sources to server:   0%|                                   | 31.8M/57.4G [00:25<8:14:17, 1.94MB/s

which means it is sending all the files to the cloud. Why is this happening? Isn't run intended to run everything locally?
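The snapshot can apparently be kept small by excluding the data directories from upload; a sketch of the command (assuming the -x exclude flag, which appears in the training command in the next issue):

```shell
# Exclude the large data directory from the experiment snapshot
# so only the sources are sent to the server.
neptune run -x data --config configs/neptune.yaml main.py prepare_masks
```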

Segmentation fault encountered in worker

I've successfully prepared the masks and the metadata. However, when I run the training (as neptune run -x data --config configs/neptune.yaml main.py train --pipeline_name unet) I get the following error:

2018-08-27 14-54-31 ships-detection >>> training
/anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py:535: DtypeWarning: Columns (1) have mixed types. Specify dtype option on import or set low_memory=False.
  return callback(*args, **kwargs)
2018-08-27 14:54:31 steppy >>> initializing Step xy_train...
2018-08-27 14:54:31 steppy >>> initializing experiment directories under /Users/jonathan/devwork/open-solution-ship-detection/experiments
2018-08-27 14:54:31 steppy >>> done: initializing experiment directories
2018-08-27 14:54:31 steppy >>> Step xy_train initialized
2018-08-27 14:54:31 steppy >>> initializing Step xy_inference...
2018-08-27 14:54:31 steppy >>> initializing experiment directories under /Users/jonathan/devwork/open-solution-ship-detection/experiments
2018-08-27 14:54:31 steppy >>> done: initializing experiment directories
2018-08-27 14:54:31 steppy >>> Step xy_inference initialized
2018-08-27 14:54:31 steppy >>> initializing Step loader...
2018-08-27 14:54:31 steppy >>> initializing experiment directories under /Users/jonathan/devwork/open-solution-ship-detection/experiments
2018-08-27 14:54:31 steppy >>> done: initializing experiment directories
2018-08-27 14:54:31 steppy >>> Step loader initialized
/anaconda3/envs/neptune/lib/python3.5/site-packages/toolkit/pytorch_transformers/architectures/unet.py:22: UserWarning: Please make sure, that your input tensor's dimensions are divisible by (pool_stride ** repeat_blocks)
  warnings.warn("Please make sure, that your input tensor's dimensions are divisible by "
2018-08-27 14:54:32 steppy >>> initializing Step unet...
2018-08-27 14:54:32 steppy >>> initializing experiment directories under /Users/jonathan/devwork/open-solution-ship-detection/experiments
2018-08-27 14:54:32 steppy >>> done: initializing experiment directories
2018-08-27 14:54:32 steppy >>> Step unet initialized
2018-08-27 14:54:32 steppy >>> cleaning cache...
2018-08-27 14:54:32 steppy >>> cleaning cache done
2018-08-27 14:54:32 steppy >>> Step xy_train, adapting inputs...
2018-08-27 14:54:32 steppy >>> Step xy_train, transforming...
2018-08-27 14:54:32 steppy >>> Step xy_inference, adapting inputs...
2018-08-27 14:54:32 steppy >>> Step xy_inference, transforming...
2018-08-27 14:54:32 steppy >>> Step loader, adapting inputs...
2018-08-27 14:54:32 steppy >>> Step loader, transforming...
2018-08-27 14:54:32 steppy >>> Step unet, adapting inputs...
2018-08-27 14:54:32 steppy >>> Step unet, fitting and transforming...
2018-08-27 14:54:32 steppy >>> initializing model weights...
2018-08-27 14-54-32 ships-detection >>> starting training...
2018-08-27 14-54-32 ships-detection >>> initial lr: 0.0001
2018-08-27 14-54-32 ships-detection >>> epoch 0 ...
ERROR: Unexpected segmentation fault encountered in worker.
Traceback (most recent call last):
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/deepsense/neptune/job_wrapper.py", line 107, in <module>
    execute()
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/deepsense/neptune/job_wrapper.py", line 103, in execute
    execfile(job_filepath, job_globals)
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/past/builtins/misc.py", line 82, in execfile
    exec_(code, myglobals, mylocals)
  File "main.py", line 89, in <module>
    main()
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "main.py", line 27, in train
    pipeline_manager.train(pipeline_name, dev_mode)
  File "/Users/jonathan/devwork/open-solution-ship-detection/src/pipeline_manager.py", line 28, in train
    train(pipeline_name, dev_mode)
  File "/Users/jonathan/devwork/open-solution-ship-detection/src/pipeline_manager.py", line 77, in train
    pipeline.fit_transform(data)
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/steppy/base.py", line 323, in fit_transform
    step_output_data = self._cached_fit_transform(step_inputs)
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/steppy/base.py", line 443, in _cached_fit_transform
    step_output_data = self.transformer.fit_transform(**step_inputs)
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/steppy/base.py", line 605, in fit_transform
    self.fit(*args, **kwargs)
  File "/Users/jonathan/devwork/open-solution-ship-detection/src/models.py", line 68, in fit
    for batch_id, data in enumerate(batch_gen):
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 275, in __next__
    idx, batch = self._get_batch()
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 254, in _get_batch
    return self.data_queue.get()
  File "/anaconda3/envs/neptune/lib/python3.5/multiprocessing/queues.py", line 335, in get
    res = self._reader.recv_bytes()
  File "/anaconda3/envs/neptune/lib/python3.5/multiprocessing/connection.py", line 216, in recv_bytes
    buf = self._recv_bytes(maxlength)
  File "/anaconda3/envs/neptune/lib/python3.5/multiprocessing/connection.py", line 407, in _recv_bytes
    buf = self._recv(4)
  File "/anaconda3/envs/neptune/lib/python3.5/multiprocessing/connection.py", line 379, in _recv
    chunk = read(handle, remaining)
  File "/anaconda3/envs/neptune/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 175, in handler
    _error_if_any_worker_fails()
RuntimeError: DataLoader worker (pid 33362) is killed by signal: Unknown signal: 0.
ERROR: Unexpected segmentation fault encountered in worker.
ERROR: Unexpected segmentation fault encountered in worker.

Do you have any ideas on this? Thank you very much.
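A common way to diagnose this kind of crash (this is a general PyTorch debugging technique, not a fix confirmed by the maintainers) is to disable DataLoader multiprocessing so the failing dataset code raises an ordinary traceback instead of an opaque worker segfault. A minimal sketch with a hypothetical stand-in dataset:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-in dataset; in this repository the batches come
# from the `loader` step instead.
dataset = TensorDataset(torch.randn(8, 3), torch.randint(0, 2, (8,)))

# num_workers=0 loads batches in the main process, so a crash in the
# dataset code surfaces as a normal Python traceback rather than
# "Unexpected segmentation fault encountered in worker".
loader = DataLoader(dataset, batch_size=4, num_workers=0)
batches = list(loader)
```

If training then works with num_workers=0, the problem is in the worker processes (often shared memory limits or a native library such as OpenCV misbehaving under fork), not in the model itself.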

enabling dev_mode

Hi

Using python 3.5 (without neptune), the following works.
python main.py -- train --pipeline_name unet

However, I wish to debug using dev_mode and can't seem to pass the flag through.
python main.py -- train --pipeline_name unet --dev_mode

Error: No value provided for parameter 'dev_mode'

I think the click option is_flag for dev_mode means we shouldn't need to pass any value, so am I doing something wrong?
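For comparison, a minimal self-contained click command confirms that an is_flag option needs no value (the option names mirror those in main.py; the command body itself is hypothetical):

```python
import click
from click.testing import CliRunner

@click.command()
@click.option('--pipeline_name')
@click.option('--dev_mode', is_flag=True)
def train(pipeline_name, dev_mode):
    # With is_flag=True, the bare presence of --dev_mode sets it to True.
    click.echo('pipeline={} dev_mode={}'.format(pipeline_name, dev_mode))

result = CliRunner().invoke(train, ['--pipeline_name', 'unet', '--dev_mode'])
print(result.output)  # -> pipeline=unet dev_mode=True
```

Since bare click accepts the flag, the "No value provided" error presumably comes from whatever re-parses the arguments before they reach click, rather than from the option definition.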

Thanks
