S3Uploader


Multithreaded recursive directory uploader to S3 using fog.

It recursively traverses all contents of the directory given as the source parameter and uploads every file to the destination bucket. A destination folder inside the bucket, under which the uploaded file tree is placed, can also be specified.

By default it uploads files in parallel using 5 threads; this number is configurable.

Files are stored as private (non-public) unless otherwise specified.
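
The note above implies an option for switching to public storage. A minimal sketch of what the options hash could look like; the :public key is an assumption (it mirrors fog's file-creation flag) and should be checked against the gem's documentation before use:

```ruby
# :public is an assumed option name (fog convention); verify it before relying on it.
options = {
  :s3_key    => 'YOUR_KEY',
  :s3_secret => 'YOUR_SECRET_KEY',
  :public    => true  # upload with public-read access instead of the private default (assumed)
}
```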

A CLI binary is included.

Installation

Add this line to your application's Gemfile:

gem 's3_uploader'

And then execute:

$ bundle

Or install it yourself as:

$ gem install s3_uploader

Usage

    uploader = S3Uploader::Uploader.new({
      :s3_key => YOUR_KEY,
      :s3_secret => YOUR_SECRET_KEY,
      :destination_dir => 'test/',
      :region => 'eu-west-1',
      :threads => 10
    })

    uploader.upload('/tmp/test', 'mybucket')

or

    S3Uploader.upload('/tmp/test', 'mybucket',
      { :s3_key => YOUR_KEY,
        :s3_secret => YOUR_SECRET_KEY,
        :destination_dir => 'test/',
        :region => 'eu-west-1',
        :threads => 4,
        :metadata => { 'Cache-Control' => 'max-age=315576000' }
      })

The former static method upload_directory is still supported for backwards compatibility.

    S3Uploader.upload_directory('/tmp/test', 'mybucket', { :destination_dir => 'test/', :threads => 4 })

If no keys are provided, the S3_KEY and S3_SECRET environment variables are used. The default region is us-east-1.
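
This environment-variable fallback can be sketched in plain Ruby as follows; resolve_credentials is a hypothetical helper written for illustration, not the gem's actual code:

```ruby
# Hypothetical illustration of the credential fallback, not the gem's source:
# explicitly passed options win, otherwise the S3_KEY / S3_SECRET
# environment variables are consulted.
def resolve_credentials(options = {})
  key    = options[:s3_key]    || ENV['S3_KEY']
  secret = options[:s3_secret] || ENV['S3_SECRET']
  raise ArgumentError, 'missing S3 credentials' if key.nil? || secret.nil?
  { :s3_key => key, :s3_secret => secret }
end
```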

The available metadata headers are documented in Amazon's S3 API reference.

Or use it as a command-line binary:

    s3uploader -r eu-west-1 -k YOUR_KEY -s YOUR_SECRET_KEY -d test/ -t 4 /tmp/test mybucket

Again, the S3_KEY and S3_SECRET environment variables are used if no keys are passed as parameters.

    s3uploader -d test/ -t 4 /tmp/test mybucket

Compress files

If the :gzip option is used, files that are not already compressed are gzipped before upload. In this case a gzip working directory must be provided.

    S3Uploader.upload_directory('/tmp/test', 'mybucket',
      { :s3_key => YOUR_KEY,
        :s3_secret => YOUR_SECRET_KEY,
        :destination_dir => 'test/',
        :region => 'eu-west-1',
        :gzip => true,
        :gzip_working_dir => '/tmp/gzip_working_dir'
      })

Contributing

  1. Fork it
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Added some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create a new Pull Request

Contributors

License

Distributed under the MIT License. See LICENSE file for further details.