glacier-upload's People

Contributors

azeemba, dependabot[bot], eplanet, pre-commit-ci[bot], tbumi, z38

glacier-upload's Issues

AttributeError: 'module' object has no attribute 'log2'

Hello, I get an error when I launch an upload:
# glacier_upload -v MYVAULT -f backup.tar
Traceback (most recent call last):
  File "/usr/local/bin/glacier_upload", line 10, in <module>
    sys.exit(upload())
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 664, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 644, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 837, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 464, in invoke
    return callback(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/glacier_upload/upload.py", line 54, in upload
    if not math.log2(part_size).is_integer():
AttributeError: 'module' object has no attribute 'log2'

Looks like a math module issue? I'm using Python 2.7.

Thanks in advance
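
Note: math.log2 was added in Python 3.3, so on Python 2.7 the check at upload.py line 54 fails before anything is uploaded; the quickest fix is probably to run the tool under Python 3. For reference, a sketch (not the project's code) of an equivalent power-of-two check that works on both interpreter lines:

def is_power_of_two(part_size):
    # same test as math.log2(part_size).is_integer(), but without math.log2,
    # which only exists on Python 3.3+
    return part_size > 0 and (part_size & (part_size - 1)) == 0

print(is_power_of_two(8 * 1024 * 1024))   # True  (8 MiB)
print(is_power_of_two(12 * 1024 * 1024))  # False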

Tried to upload a 2 TB file and got an OOM error

[17706654.622081] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/user.slice/user-1000.slice/session-9892.scope,task=glacier,pid=2908245,uid=1000
[17706654.622164] Out of memory: Killed process 2908245 (glacier) total-vm:25586796kB, anon-rss:22666568kB, file-rss:0kB, shmem-rss:0kB, UID:1000 pgtables:44748kB oom_score_adj:0

Please advise.

H/W: 32 GB RAM, Intel(R) Xeon(R) CPU E5440 @ 2.83GHz

Unable to install glacier_upload

I'm trying to install glacier_upload with "pip3 install glacier_upload".

I'm getting:

Collecting glacier_upload
  Downloading glacier_upload-1.0.tar.gz
Collecting click (from glacier_upload)
  Downloading click-6.7-py2.py3-none-any.whl (71kB)
Collecting boto3 (from glacier_upload)
  Downloading boto3-1.5.13-py2.py3-none-any.whl (128kB)
Requirement already satisfied: s3transfer<0.2.0,>=0.1.10 in /usr/local/lib/python3.5/dist-packages (from boto3->glacier_upload)
Collecting botocore<1.9.0,>=1.8.27 (from boto3->glacier_upload)
  Downloading botocore-1.8.27-py2.py3-none-any.whl (4.0MB)
Requirement already satisfied: jmespath<1.0.0,>=0.7.1 in /usr/local/lib/python3.5/dist-packages (from boto3->glacier_upload)
Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in /usr/local/lib/python3.5/dist-packages (from botocore<1.9.0,>=1.8.27->boto3->glacier_upload)
Requirement already satisfied: docutils>=0.10 in /usr/local/lib/python3.5/dist-packages (from botocore<1.9.0,>=1.8.27->boto3->glacier_upload)
Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.5/dist-packages (from python-dateutil<3.0.0,>=2.1->botocore<1.9.0,>=1.8.27->boto3->glacier_upload)
Building wheels for collected packages: glacier-upload
  Running setup.py bdist_wheel for glacier-upload: started
  Running setup.py bdist_wheel for glacier-upload: finished with status 'done'
  Stored in directory: /root/.cache/pip/wheels/bf/6f/37/37f8884e70bf52413af70ad2200f6d37e5dbb2479896327b3e
Successfully built glacier-upload
Installing collected packages: click, botocore, boto3, glacier-upload
  Found existing installation: botocore 1.8.26
    Uninstalling botocore-1.8.26:
      Successfully uninstalled botocore-1.8.26
Invalid script entry point: <ExportEntry delete_glacier_archive = glacier_upload.delete_archive.delete_archive:None []> for req: glacier_upload - A callable suffix is required. Cf https://packaging.python.org/en/latest/distributing.html#console-scripts for more information.
The command '/bin/sh -c pip3 install glacier_upload' returned a non-zero code: 1

This is run as part of the following Dockerfile:

FROM ubuntu:16.04

RUN apt-get update

RUN apt-get install -y wget unzip vim
RUN apt-get install -y python-pip python3-pip
RUN pip install --upgrade pip
RUN pip3 install --upgrade pip

RUN pip install awscli
RUN pip3 install awscli
RUN pip3 install glacier_upload
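
Note: the "Invalid script entry point ... A callable suffix is required" message means the packaged console_scripts spec names a module but no function after the colon, so pip cannot generate the script. A minimal setup.py sketch of a well-formed spec (the module and function names are inferred from the error text and are assumptions, not the project's actual layout):

from setuptools import setup

setup(
    name="glacier_upload",
    version="1.0",
    packages=["glacier_upload"],
    entry_points={
        "console_scripts": [
            # a valid console_scripts entry names module:function;
            # the failing release ended the spec at the module path
            "delete_glacier_archive = glacier_upload.delete_archive:delete_archive",
        ]
    },
)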

Exception occured: AttributeError("type object 'ConnectionClosedError' has no attribute 'write'",)

Using Ubuntu.

aws s3 sync works, but with this script I get the following, regardless of the number of threads:


Reading file...
Opened single file.
Initiating multipart upload...
File size is 29312856808 bytes. Will upload in 3495 parts.
Spawning threads...
Uploading part 1 of 3495... (0.00%)
Uploading part 2 of 3495... (0.03%)
Uploading part 3 of 3495... (0.06%)
Uploading part 4 of 3495... (0.09%)
Uploading part 5 of 3495... (0.11%)
Uploading part 6 of 3495... (0.14%)
Uploading part 7 of 3495... (0.17%)
Uploading part 8 of 3495... (0.20%)
Uploading part 9 of 3495... (0.23%)
Uploading part 10 of 3495... (0.26%)
Exception occured: AttributeError("type object 'ConnectionClosedError' has no attribute 'write'",)
Upload not aborted. Upload id: aLLwqQ9GdGfv5uVilm8d9CaL4acEiV9wyBy7TaKt2jWtPyE5rwN-xyaamEcntXKOORteVTUVm_GEkLU8zntgnc1hmhSV
Exiting.
Uploading part 11 of 3495... (0.29%)

botocore.exceptions.NoRegionError: You must specify a region

Hi! I'm doing something wrong and I can't figure out what exactly.
Calling glacier_upload --help gives me:

Traceback (most recent call last):
  File "/usr/local/bin/glacier_upload", line 11, in <module>
    load_entry_point('glacier-upload==1.2', 'console_scripts', 'glacier_upload')()
  File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 476, in load_entry_point
    return get_distribution(dist).load_entry_point(group, name)
  File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2700, in load_entry_point
    return ep.load()
  File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2318, in load
    return self.resolve()
  File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2324, in resolve
    module = __import__(self.module_name, fromlist=['__name__'], level=0)
  File "/usr/local/lib/python3.6/site-packages/glacier_upload/upload.py", line 33, in <module>
    glacier = boto3.client('glacier')
  File "/usr/local/lib/python3.6/site-packages/boto3/__init__.py", line 91, in client
    return _get_default_session().client(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/boto3/session.py", line 263, in client
    aws_session_token=aws_session_token, config=config)
  File "/usr/local/lib/python3.6/site-packages/botocore/session.py", line 839, in create_client
    client_config=config, api_version=api_version)
  File "/usr/local/lib/python3.6/site-packages/botocore/client.py", line 86, in create_client
    verify, credentials, scoped_config, client_config, endpoint_bridge)
  File "/usr/local/lib/python3.6/site-packages/botocore/client.py", line 328, in _get_client_args
    verify, credentials, scoped_config, client_config, endpoint_bridge)
  File "/usr/local/lib/python3.6/site-packages/botocore/args.py", line 47, in get_client_args
    endpoint_url, is_secure, scoped_config)
  File "/usr/local/lib/python3.6/site-packages/botocore/args.py", line 117, in compute_client_args
    service_name, region_name, endpoint_url, is_secure)
  File "/usr/local/lib/python3.6/site-packages/botocore/client.py", line 402, in resolve
    service_name, region_name)
  File "/usr/local/lib/python3.6/site-packages/botocore/regions.py", line 122, in construct_endpoint
    partition, service_name, region_name)
  File "/usr/local/lib/python3.6/site-packages/botocore/regions.py", line 135, in _endpoint_for_partition
    raise NoRegionError()
botocore.exceptions.NoRegionError: You must specify a region.

Changing the command to glacier_upload --help --region eu-central-1 results in the same error.

I'm using Amazon Linux 2 (RHEL 7), Python 3.6. I tried both pip install glacier_upload and running python3 setup.py install on the latest develop version. Could the botocore mentioned in the stack trace be the wrong version? I'm not very experienced with Python. ¯\_(ツ)_/¯
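
Note: the traceback shows the failure happening while upload.py is imported (boto3.client('glacier') at line 33), before click parses any options, which is why adding --region to the command changes nothing. A sketch of where boto3 actually looks for the region:

import boto3

# boto3 resolves the region from (in order): an explicit region_name argument,
# the AWS_DEFAULT_REGION environment variable, or the region key in
# ~/.aws/config. For the CLI tools, set one of the latter two, e.g.
#   export AWS_DEFAULT_REGION=eu-central-1
# or in ~/.aws/config:
#   [default]
#   region = eu-central-1
glacier = boto3.client("glacier", region_name="eu-central-1")  # explicit form, shown for comparison
print(glacier.meta.region_name)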

Fail at 10k+ parts?

When using the script to upload an archive in more than 10,000 parts, the script fails with a handled exception and can't resume.

After passing the upload ID:

Getting more parts...
Getting more parts...
Verifying uploaded parts  [####################################]  100%          
Spawning threads...
Uploading part 10001 of 56495... (17.70%)
Uploading part 10002 of 56495... (17.70%)
Uploading part 10003 of 56495... (17.70%)
Uploading part 10004 of 56495... (17.71%)
Exception occured: AttributeError("type object 'InvalidParameterValueException' has no attribute 'write'")
Upload not aborted. Upload id: teAUzrwAdbmYZ4YcprbwsGuFLhfvpNB9pCD_dr4IQFUIH0ppGc4MOVSW5asBhYhaBVkwhvObLHlJnvjKENOI9eFUZqkp
Exiting.
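
Note: Glacier multipart uploads are limited to 10,000 parts, with a part size that is a power of two between 1 MiB and 4 GiB, which would explain an InvalidParameterValueException right after part 10,000. The part size therefore has to grow with the archive; a sketch of the arithmetic (a hypothetical helper, not the project's resume logic):

import math

MAX_PARTS = 10000
MIN_PART_SIZE = 1 << 20   # 1 MiB
MAX_PART_SIZE = 4 << 30   # 4 GiB

def choose_part_size(archive_size):
    # smallest valid power-of-two part size that keeps the part count <= 10,000
    part_size = MIN_PART_SIZE
    while part_size < MAX_PART_SIZE and math.ceil(archive_size / part_size) > MAX_PARTS:
        part_size *= 2
    return part_size

# e.g. a 500 GiB archive needs at least 64 MiB parts to stay under 10,000 parts
print(choose_part_size(500 * (1 << 30)) // (1 << 20), "MiB")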

Removing click from commands

Hi,

I am not sure if this is the right place for this discussion, so apologies in advance. Would it be possible to expose the available functions without click? For example, I would like to use the API you have created to write a snippet like the following:

from glacier_upload.upload import upload

upload(...)

However, when I currently do this in a Python console, I get an error about the number of arguments. Reading online, I found an answer (https://stackoverflow.com/a/40094408) that may allow both usages, since the functions are straightforward.

An example similar to the linked answer would be:

# click decorators here
def upload_command(...):
    upload(...) # call the original upload here

def upload(...):
    # stays as it currently is but implements the functionality

Let me know your thoughts on this, as I would be interested in submitting a pull request for it. My further use case would be having upload() return the response metadata so I can store the archive ID myself, but that will be for another issue.

Great work BTW.
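
Note: a minimal sketch of the proposed split, using only the -v/--vault-name and -f/--file-name options seen elsewhere in this thread (the actual command likely takes more options; names here are illustrative):

import click

def upload(vault_name, file_name):
    """Plain function: importable and callable from other Python code without click."""
    ...  # the existing upload logic would live here, unchanged

@click.command()
@click.option("-v", "--vault-name", required=True)
@click.option("-f", "--file-name", required=True)
def upload_command(vault_name, file_name):
    """Thin CLI wrapper; only this function carries the click decorators."""
    upload(vault_name, file_name)

if __name__ == "__main__":
    upload_command()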

ImportError: No module named 'glacier_upload'

Installed from git in a virtual environment and got the following error:

Traceback (most recent call last):
  File "/opt/backup/glacier-env/bin/glacier_upload", line 11, in <module>
    load_entry_point('glacier-upload==1.1', 'console_scripts', 'glacier_upload')()
  File "/opt/backup/glacier-env/lib/python3.4/site-packages/pkg_resources/__init__.py", line 572, in load_entry_point
    return get_distribution(dist).load_entry_point(group, name)
  File "/opt/backup/glacier-env/lib/python3.4/site-packages/pkg_resources/__init__.py", line 2755, in load_entry_point
    return ep.load()
  File "/opt/backup/glacier-env/lib/python3.4/site-packages/pkg_resources/__init__.py", line 2408, in load
    return self.resolve()
  File "/opt/backup/glacier-env/lib/python3.4/site-packages/pkg_resources/__init__.py", line 2414, in resolve
    module = __import__(self.module_name, fromlist=['__name__'], level=0)
ImportError: No module named 'glacier_upload'

Steps to reproduce:

virtualenv-3 glacier-env
source glacier-env/bin/activate
pip install --upgrade pip
cd glacier-upload/
python setup.py install
glacier_upload --help

Unable to upload small files to glacier

I got the following error when uploading a file whose size is less than 4 MB:

botocore.errorfactory.InvalidParameterValueException: An error occurred (InvalidParameterValueException) when calling the UploadArchive operation: Invalid Content-Length: 0

This error happens at line 70 of upload.py, which is the body=file_to_upload, line.

The problem seems to be related to the file_size = file_to_upload.seek(0, 2) line in upload.py, because after I change it to file_size = 1, the problem no longer occurs.
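
Note: that diagnosis fits. seek(0, 2) both returns the size and leaves the file positioned at end-of-file, so the later read of body=file_to_upload sends zero bytes. A sketch of two ways to get the size without breaking the upload (hypothetical helpers, not the project's code):

import os

def archive_size(file_to_upload):
    size = file_to_upload.seek(0, 2)  # seek to the end to learn the size...
    file_to_upload.seek(0)            # ...then rewind before boto3 reads the body
    return size

def archive_size_without_seeking(file_to_upload):
    # alternative that never moves the file position at all
    return os.fstat(file_to_upload.fileno()).st_size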

"OverflowError: Python int too large to convert to C long"

I successfully uploaded a video file to Glacier using glacier_upload. The size of the
file was 7406281365 bytes. I initiated an archive retrieval with:

init_archive_retrieval -v MyVault -a ArchiveName -d description

which completed successfully and returned a job id.
After getting an SNS notification that the job completed successfully, I tried:

get_glacier_job_output --vault-name MyVault  --job-id JobId -f archive.tar.xz

I get:

Checking job status...
Job status: Succeeded
Retrieving job data...
Traceback (most recent call last):
File "C:\Users\palmerf\AppData\Local\Programs\Python\Python38\Scripts\get_glacier_job_output-script.py", line 11, in
load_entry_point('glacier-upload==1.2', 'console_scripts', 'get_glacier_job_output')()
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 829, in call
return self.main(*args, **kwargs)
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 782, in main
rv = self.invoke(ctx)
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 610, in invoke
return callback(*args, **kwargs)
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\site-packages\glacier_upload\get_job_output.py", line 41, in get_job_output
file.write(response['body'].read())
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\site-packages\botocore\response.py", line 78, in read
chunk = self._raw_stream.read(amt)
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\site-packages\urllib3\response.py", line 515, in read
data = self._fp.read() if not fp_closed else b""
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\http\client.py", line 467, in read
s = self._safe_read(self.length)
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\http\client.py", line 608, in _safe_read
data = self.fp.read(amt)
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\socket.py", line 669, in readinto
return self._sock.recv_into(b)
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\ssl.py", line 1241, in recv_into
return self.read(nbytes, buffer)
File "c:\users\palmerf\appdata\local\programs\python\python38\lib\ssl.py", line 1099, in read
return self._sslobj.read(len, buffer)
OverflowError: Python int too large to convert to C long

Clearly, the size of the file is greater than C's 2,147,483,647 long int definition. Is there a way around this?
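
Note: the traceback shows the whole 7.4 GB body being pulled with a single read(), which asks Windows' SSL layer for more bytes than fit in a C long. Reading the botocore StreamingBody in bounded chunks avoids that; a sketch (assuming the same response and file objects used in get_job_output.py):

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB; any bounded size works

def save_job_output(response, file):
    body = response["body"]  # botocore StreamingBody
    while True:
        chunk = body.read(CHUNK_SIZE)  # each read stays far below the C long limit
        if not chunk:
            break
        file.write(chunk)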

delete_glacier_archive options error

I installed glacier_upload 1.2 via pip3

When I issue the help option on delete_glacier_archive it shows:

delete_glacier_archive --help
Usage: delete_glacier_archive [OPTIONS]

Options:
  -v, --vault-name TEXT  The name of the vault  [required]
  -u, --upload-id TEXT   ID of the archive to delete  [required]
  --help                 Show this message and exit.

But when I try to delete an archive, the script fails:

delete_glacier_archive --vault-name $VAULT --upload-id $ARCHIVE_ID
Traceback (most recent call last):
  File "/usr/local/bin/delete_glacier_archive", line 8, in <module>
    sys.exit(delete_archive())
  File "/home/sebastian/.local/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/sebastian/.local/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/sebastian/.local/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/sebastian/.local/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
TypeError: delete_archive() got an unexpected keyword argument 'upload_id'

So I searched the source directory, and it shows an --archive-id option.
I tried this option, but it fails again:

delete_glacier_archive --vault-name $VAULT --archive-id $ARCHIVE_ID
Usage: delete_glacier_archive [OPTIONS]
Try 'delete_glacier_archive --help' for help.

Error: no such option: --archive-id

My Python knowledge is not good enough to fix it or to pinpoint the problem. 😞
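
Note: a sketch (not the project's code) of why the released 1.2 script fails: click derives the keyword argument from the long option name, so --upload-id arrives as upload_id, which delete_archive() does not accept. The option name and the callback parameter have to match:

import click

@click.command()
@click.option("-v", "--vault-name", required=True, help="The name of the vault")
@click.option("-a", "--archive-id", required=True, help="ID of the archive to delete")
def delete_archive(vault_name, archive_id):
    # the --archive-id option maps onto the archive_id parameter
    click.echo("would delete archive %s from vault %s" % (archive_id, vault_name))

if __name__ == "__main__":
    delete_archive()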

Reattempt failed upload part instead of crashing immediately

Looks like my upload failed due to a RequestTimeout. If I start the upload again, it does properly check which parts have already been uploaded, but that takes a lot of time.

I made some changes locally to make this work and wanted to check if there is interest before sending a PR.
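
Note: for reference, a minimal sketch of the retry idea (a hypothetical wrapper, not the local changes mentioned above):

import time

def call_with_retries(upload_part, max_attempts=5):
    # retry one part with exponential backoff instead of giving up on the
    # whole multipart upload after a single RequestTimeout
    for attempt in range(1, max_attempts + 1):
        try:
            return upload_part()
        except Exception as exc:  # in practice, catch botocore's ClientError
            if attempt == max_attempts:
                raise
            delay = 2 ** attempt
            print("part failed (%s); retrying in %d s" % (exc, delay))
            time.sleep(delay)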

FR: sync directory

Any plans to support syncing an entire directory to a vault? I'm looking to use a native Glacier client as a backup tool, but I need recursive/directory/sync support.

delete_glacier_archive shows TypeError: delete_archive() got an unexpected keyword argument 'upload_id'

Hi !

When I run delete_glacier_archive -v XXXX --upload-id YYYYYZZZZZ
I get the following error:

Traceback (most recent call last):
  File "delete_glacier_archive", line 11, in <module>
    load_entry_point('glacier-upload==1.2', 'console_scripts', 'delete_glacier_archive')()
  File "/usr/local/lib/python3.5/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.5/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.5/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
TypeError: delete_archive() got an unexpected keyword argument 'upload_id'

All other utilities of glacier_upload work fine.

Can you help us solve this problem?
Best,
Antonio

Function import name mismatch

Hi, when trying to abort a glacier multipart upload I get the following error:

$ abort_glacier_upload --help
Traceback (most recent call last):
  File "///.local/share/virtualenvs/Glacier-Backups-Tool-UR0ryj4F/bin/abort_glacier_upload", line 5, in <module>
    from glacier_upload.abort_upload import abort_upload
ModuleNotFoundError: No module named 'glacier_upload.abort_upload'

This has one of two easy fixes:
1 - rename the module file to match the import: "manual_abort_upload.py" -> "abort_upload.py"
or
2 - change the import to match the module file name: "from glacier_upload.abort_upload import abort_upload" -> "from glacier_upload.manual_abort_upload import abort_upload"

I hope this helps 😃
