
Bigstore-compress filter is broken with S3 backend #32


Description

@jansel

The bigstore-compress filter is broken with the S3 backend; I have not tested the other backends.

The first issue is the use of tempfile.TemporaryFile instead of tempfile.NamedTemporaryFile. On most operating systems a TemporaryFile has no name on the filesystem, so boto3's upload_file, which opens the file by path, cannot find it.
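
A quick way to see the difference (Python 2.7 here, to match the traceback; the snippet is purely illustrative):

import tempfile

f = tempfile.TemporaryFile()
print(f.name)    # '<fdopen>' -- not a path that exists on disk

g = tempfile.NamedTemporaryFile()
print(g.name)    # e.g. /tmp/tmpk3v9_x -- a real path another API can open

With the S3 backend this surfaces as: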

$ git bigstore push
pulling bigstore metadata...done
compressed!
Traceback (most recent call last):
  File "/home/jansel/testrepo/venv/bin/git-bigstore", line 87, in <module>
    args = parser.parse_args()
  File "/usr/lib/python2.7/argparse.py", line 1701, in parse_args
    args, argv = self.parse_known_args(args, namespace)
  File "/usr/lib/python2.7/argparse.py", line 1733, in parse_known_args
    namespace, args = self._parse_known_args(args, namespace)
  File "/usr/lib/python2.7/argparse.py", line 1942, in _parse_known_args
    stop_index = consume_positionals(start_index)
  File "/usr/lib/python2.7/argparse.py", line 1898, in consume_positionals
    take_action(action, args)
  File "/usr/lib/python2.7/argparse.py", line 1807, in take_action
    action(self, namespace, argument_values, option_string)
  File "/usr/lib/python2.7/argparse.py", line 1096, in __call__
    subnamespace, arg_strings = parser.parse_known_args(arg_strings, None)
  File "/usr/lib/python2.7/argparse.py", line 1733, in parse_known_args
    namespace, args = self._parse_known_args(args, namespace)
  File "/usr/lib/python2.7/argparse.py", line 1942, in _parse_known_args
    stop_index = consume_positionals(start_index)
  File "/usr/lib/python2.7/argparse.py", line 1898, in consume_positionals
    take_action(action, args)
  File "/usr/lib/python2.7/argparse.py", line 1807, in take_action
    action(self, namespace, argument_values, option_string)
  File "/home/jansel/testrepo/venv/bin/git-bigstore", line 33, in __call__
    push()
  File "/home/jansel/testrepo/venv/local/lib/python2.7/site-packages/bigstore/bigstore.py", line 246, in push
    backend.push(compressed_file, hexdigest, cb=ProgressPercentage(filename))
  File "/home/jansel/testrepo/venv/local/lib/python2.7/site-packages/bigstore/backends/s3.py", line 35, in push
    self.s3_client.upload_file(file.name, self.bucket, self.get_remote_file_name(hash), Callback=cb)
  File "/home/jansel/testrepo/venv/local/lib/python2.7/site-packages/boto3/s3/inject.py", line 106, in upload_file
    extra_args=ExtraArgs, callback=Callback)
  File "/home/jansel/testrepo/venv/local/lib/python2.7/site-packages/boto3/s3/transfer.py", line 275, in upload_file
    future.result()
  File "/home/jansel/testrepo/venv/local/lib/python2.7/site-packages/s3transfer/futures.py", line 73, in result
    return self._coordinator.result()
  File "/home/jansel/testrepo/venv/local/lib/python2.7/site-packages/s3transfer/futures.py", line 233, in result
    raise self._exception
OSError: [Errno 2] No such file or directory: '<fdopen>'
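
The fix is to create the compressed blob in a tempfile.NamedTemporaryFile so it has a real path, and to flush and rewind it before handing it to the backend, along these lines (a sketch, not the exact code from my fork):

import gzip
import shutil
import tempfile

def compress_to_tempfile(source_path):
    # NamedTemporaryFile has a real filesystem path, which boto3's
    # upload_file() needs; a plain TemporaryFile does not.
    compressed = tempfile.NamedTemporaryFile(suffix='.gz')
    with open(source_path, 'rb') as src, \
            gzip.GzipFile(fileobj=compressed, mode='wb') as gz:
        shutil.copyfileobj(src, gz)
    compressed.flush()   # make sure everything is on disk...
    compressed.seek(0)   # ...and readable from the beginning
    return compressed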

I applied that fix on my fork, but bigstore-compress still seems buggy even afterwards. With my partially fixed version I observed the following behavior:

  1. echo '*.bin filter=bigstore-compress' > .gitattributes
  2. dd if=/dev/zero of=test.bin bs=1M count=1
  3. git add . && git commit -m 'test' && git bigstore push runs without error
  4. git pull && git bigstore pull in another checkout produces a zero-byte test.bin instead of the expected 1 MB file
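
I have not debugged the pull side yet. To narrow down whether the upload or the download is at fault, it is worth checking the stored object's size directly; something like the following, with an illustrative bucket name and key:

import boto3

s3 = boto3.client('s3')
# Substitute your bucket and the key bigstore derives from the object's hash.
head = s3.head_object(Bucket='my-bigstore-bucket', Key='bigstore/<hash>')
print(head['ContentLength'])   # 0 here would mean push uploaded an empty object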
