Store huge number of small files

I need to store, read and write a huge number of small files, about a million files of 1kB each, one file at a time.
What would be the best way to store them: an XFS or ext4 filesystem, a tar archive, or do you have another suggestion?

4 Replies

Linode Object Storage is well suited to this use case. For storing, reading, and writing a large quantity of small files, such as approximately one million files of 1kB each, Object Storage is a strong choice.

I recommend reviewing the product documentation for Object Storage to make sure it fits your requirements.

I think it'll depend on your use case as to whether or not Object Storage is the best option. I'm using it for delivering map tiles (thousands of small image files) and it's been reasonably reliable, although the Frankfurt data centre has had a worrying number of outages and ongoing access issues. If you're regularly creating, reading, or updating your files, the latency introduced by non-local storage could be a problem for you. You'll also need to factor in the possibility of outages: check the historic Linode status reports to make sure the data centre you're thinking of using is reliable.
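If you do keep the files on a local ext4 or XFS filesystem, one common pitfall with a million files is putting them all in a single directory. A simple workaround is to shard files into nested subdirectories keyed by a hash of the filename. This is a minimal sketch, not Linode-specific; the filename and payload are made-up examples:

```python
import hashlib
import os
import tempfile

def shard_path(root: str, name: str, levels: int = 2) -> str:
    """Map a filename to a nested path like root/ab/cd/name, so no
    single directory ends up holding all one million entries."""
    digest = hashlib.md5(name.encode()).hexdigest()
    parts = [digest[i * 2:i * 2 + 2] for i in range(levels)]
    return os.path.join(root, *parts, name)

root = tempfile.mkdtemp()

# Write a 1kB file into the sharded layout.
path = shard_path(root, "file-000001.dat")
os.makedirs(os.path.dirname(path), exist_ok=True)
with open(path, "wb") as f:
    f.write(b"x" * 1024)

# Reading it back only needs the same name-to-path mapping.
with open(shard_path(root, "file-000001.dat"), "rb") as f:
    data = f.read()
print(len(data))  # 1024
```

With two levels of two hex characters each, the million files spread across 65,536 directories, roughly 15 files per directory, which keeps lookups fast on both ext4 and XFS.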

I understand how to create block storage and/or object storage, but I don't know what URL you would give someone to access a file (like a PDF) stored on one or the other. Can someone explain?
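For Object Storage, once a bucket or object is set to public-read, it's reachable over plain HTTPS at a predictable address on linodeobjects.com. The bucket name, cluster, and object key below are made-up examples; substitute your own:

```python
# Publicly readable objects in Linode Object Storage are served from
# linodeobjects.com; the URL embeds the bucket name and cluster region.
bucket = "my-bucket"      # made-up bucket name
cluster = "us-east-1"     # made-up cluster; use your bucket's actual region
key = "docs/report.pdf"   # made-up object key (the file's path in the bucket)

url = f"https://{bucket}.{cluster}.linodeobjects.com/{key}"
print(url)  # https://my-bucket.us-east-1.linodeobjects.com/docs/report.pdf
```

Block Storage is different: there is no URL, because the volume attaches to a Linode like a local disk, so you'd serve files from it through a web server running on that Linode.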
