How To Upload Ten Million Files to Object Storage

I have 14 million files. I used s3cmd and Cyberduck to sync the folder, but it is too slow: I only managed to upload about ten thousand files in one night. I also tried uploading the files to my Linode server first and pushing them to object storage from there, but that is just as slow.

How can I upload these files? Is there a way to upload a single zip file and unzip it in object storage?


2 Replies

Hey there -

I've been doing some testing of my own to find the best solution for this. While I don't have nearly as many files to upload as you do, one approach I've found that might work for you is mounting your object storage bucket as a local filesystem.

To do this, you can use a utility like s3fs-fuse. It's a free, open-source FUSE-based filesystem that supports the major Linux distributions, and it presents your S3 bucket as if it were a local drive:

FUSE-based file system backed by Amazon S3
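As a rough sketch of what that setup looks like (the bucket name, mount point, and endpoint URL below are placeholders you'd replace with your own; this needs real credentials and network access to actually run):

```shell
# Store your access key and secret key for s3fs (placeholder values).
echo "ACCESS_KEY_ID:SECRET_ACCESS_KEY" > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

# Create a mount point and mount the bucket.
sudo mkdir -p /mnt/my-bucket
s3fs my-bucket /mnt/my-bucket \
    -o passwd_file=${HOME}/.passwd-s3fs \
    -o url=https://us-east-1.linodeobjects.com

# Anything copied into /mnt/my-bucket/ is now stored as objects in the bucket.
cp -r /path/to/local/files/ /mnt/my-bucket/
```

The endpoint URL shown is just an example of an S3-compatible endpoint; use whichever region/endpoint your bucket lives in.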

There's also a Stack Exchange post that goes into this as well, using FUSE as an example of how to zip and unzip files in your bucket while it is mounted as a local filesystem:

How to extract files from a zip archive in S3
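The idea from that post is that, once the bucket is mounted, ordinary tools can extract an archive straight into it. A minimal simulation of the workflow, with a plain local directory standing in for the mounted bucket (paths here are hypothetical):

```shell
# Set up a scratch area with one sample file.
mkdir -p /tmp/zipdemo && cd /tmp/zipdemo
echo "hello" > file1.txt

# Build a sample zip archive (using Python's stdlib zipfile CLI).
python3 -m zipfile -c archive.zip file1.txt

# "bucket-mount" stands in for the directory where the bucket is mounted.
mkdir -p bucket-mount

# Extract the archive directly into the (simulated) mounted bucket.
python3 -m zipfile -e archive.zip bucket-mount/

cat bucket-mount/file1.txt
```

With a real s3fs mount, each extracted file becomes an upload to the bucket, so extraction speed is still bounded by your network and the per-object request overhead.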

I hope this helps. Feel free to let us know about any success you have (or obstacles you run into) when trying this.

Regarding this topic, do you have a recommended guide for encrypting/decrypting files when writing them to, or reading them from, a bucket that is mounted into the server using this proposed method?

