massive short-term bandwidth needs

I need a Linode (largest-size) for just one month.

My client is a music company that has released an extremely popular album for download. Their existing server is swamped: it can't sustain the 60 Mb/s of traffic from people trying to download this album (hundreds of simultaneous downloads), and the performance of their main web site is suffering.

I would like to move this download onto a linode, freeing up the main server for normal work.

How much bandwidth can a Linode 4096 handle (I'm interested in the per-second rate)? 60 Mb/s is more than half of a Fast Ethernet connection. Can we saturate a 100 Mb/s link, or do we have a gigabit uplink?

The situation is urgent, and I hope to get this set up within the next few hours, if this is a workable solution.

thanks, matt.

13 Replies

You have a gigabit link which has a cap somewhere around 60 Mb/s. Linode support will gladly raise that if you open a ticket.

Open a ticket and you will probably have a reply within 5 minutes from support.

So you're not currently using Linode? By default, your node is restricted to 50 Mb/s, but you can create a support ticket and ask them to increase that, or so I've read others have done.

Really, if you're just serving a static file, create an account and run a LEMP StackScript to have it served via nginx:

http://www.linode.com/stackscripts/view/?StackScriptID=42

And you'll be up and running in about 20-30 minutes if you just want to point to an IP address.

btw, 4096 is not the largest plan :) It's just what's displayed on the main site.

Why aren't you looking at one of the big CDN providers for something like this?

Hi Matt -

60Mb/s wouldn't be a problem. A few quick points, however:

1) Linodes, by default, are capped at 50Mb/s. This can easily be raised with a support ticket and proper justification.

2) You may want to split this up between multiple Linodes.

3) 60Mb/s sustained for a month will total around 18TB. Make sure you understand the transfer fees involved with pushing that much content.
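As a quick sanity check on that figure:

```shell
# 60 Mbit/s sustained for 30 days: total megabytes transferred
total_mb=$(( 60 * 86400 * 30 / 8 ))   # Mbit/s * seconds / 8 bits-per-byte
echo "${total_mb} MB"                  # 19,440,000 MB, i.e. ~19.4 TB (about 17.7 TiB)
```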

I suggest you e-mail support@linode.com or open a support ticket if you already have an account.

Regards,

-Tom

The outbound connections of a Linode are capped at 50 Mb/s by default (to prevent you from blowing through your transfer quota in a couple of days and racking up a huge overage bill). However, in my experience Linode is generally happy to bump that up with even the slightest technical justification. Bumping it from 50 to 60 would seem like no problem at all.

EDIT: Wow… I was way too slow ;)

One other thought though… even a simple DNS round-robin would cut the transfer requirements of any one particular server dramatically.
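For reference, a round-robin is nothing more than multiple A records for the same name; a sketch of the relevant zone-file lines (hostname and IPs are placeholders):

```
downloads.example.com.   300   IN   A   203.0.113.10
downloads.example.com.   300   IN   A   203.0.113.20
```

Resolvers rotate through the answers, so requests land on both nodes roughly evenly.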

@waldo:

So you're not currently using Linode? By default, your node is restricted to 50 Mb/s, but you can create a support ticket and ask them to increase that, or so I've read others have done.

I use Linode for my own servers, and those of two other clients, but this is one I inherited; they have a dedicated server set up elsewhere. (Their disk space needs are massive, far beyond anything in the published Linode service plans - currently about 1.2TB of audio).

@waldo:

Really, if you're just serving a static file, create an account and run a LEMP StackScript to have it served via nginx:

http://www.linode.com/stackscripts/view/?StackScriptID=42

What I plan to do is roll out a simple Apache install with static files protected by HTTP basic auth (.htaccess), and then generate a unique username/password for each customer with the right to this file. The file isn't public - these customers each paid the purchase price of the album - so we need to be sure there are no unauthorised downloads, as giving it away for free would violate our contracts with the artists. Using the customer's real name or email for the htaccess username will discourage them from posting the link in any public places.
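A minimal sketch of that protection, assuming a hypothetical docroot and password-file path (and that the directory allows `AllowOverride AuthConfig`):

```
# /srv/album/.htaccess -- hypothetical paths, realm name is made up
AuthType Basic
AuthName "Album download"
AuthUserFile /etc/apache2/album.htpasswd
Require valid-user
```

Each customer then gets an entry via `htpasswd /etc/apache2/album.htpasswd <username>`, and since Apache logs the authenticated user (the `%u` field of the common log format), abuse shows up directly in the access logs.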

> Why aren't you looking at one of the big CDN providers for something like this?

Linode is what I'm familiar with! I'm confident I can build a secure delivery system, with usernames that will appear in the access logs so we can detect abuse, in an hour or so.

Thanks for the tip - I'll be phoning my client in a few minutes to advise that we go with this plan, with the bandwidth caps raised.

@tasaro:

Hi Matt -

60Mb/s wouldn't be a problem. A few quick points, however:

1) Linodes, by default, are capped at 50Mb/s. This can easily be raised with a support ticket and proper justification.

2) You may want to split this up between multiple Linodes.

3) 60Mb/s sustained for a month will total around 18TB. Make sure you understand the transfer fees involved with pushing that much content.

I suggest you e-mail support@linode.com or open a support ticket if you already have an account.

Regards,

-Tom

Thanks - I'm going to talk to the client in a few minutes and get permission to spend the money, then set up a 4096 and ask that it be uncapped. We expect that the traffic will die down after a week or so, and it's US-only, which means it will drop dramatically at night, so we might not even hit the monthly limit - but I've looked at your price for going over, and it's reasonable, so it's not a big deal if we do.

You mention multiple nodes - in terms of serving this content efficiently (it'll be flat files served up directly by Apache, no https, no CGI, PHP or any interpreted language), and effectively using the CPU, would it be better to have one big node or two small ones? I can easily have the same hostname point to both.

thanks, matt

Two smaller nodes have the same amount of memory as one larger node, but they have "twice" the CPU availability and disk i/o bandwidth ("twice" is a simplification, hence the quotes).

For the type of traffic you're talking about, I would throw several smaller nodes at it, just using a DNS round-robin to distribute the load across them.

If you're just throwing static files out there, an evented server like nginx would probably be a better idea. Implementing the authentication you mention above would be just as easy. http://wiki.nginx.org/HttpAuthBasicModule
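A minimal sketch of that nginx setup (location, paths, and realm name are placeholders):

```
# inside a server block -- hypothetical paths
location /album/ {
    root                 /srv/www;
    auth_basic           "Album download";
    auth_basic_user_file /etc/nginx/album.htpasswd;
}
```

The htpasswd file uses the same format as Apache's, so the per-customer credentials carry over unchanged.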

@JshWright:

Two smaller nodes have the same amount of memory as one larger node, but they have "twice" the CPU availability and disk i/o bandwidth ("twice" is a simplification, hence the quotes).

For the type of traffic you're talking about, I would throw several smaller nodes at it, just using a DNS round-robin to distribute the load across them.

If you're just throwing static files out there, an evented server like nginx would probably be a better idea. Implementing the authentication you mention above would be just as easy. http://wiki.nginx.org/HttpAuthBasicModule

Two nodes it'll be then - I can even locate them in different cities to spread the load. (Although with simple DNS round robin, not some GeoIP lookup, it'll be random as to who gets assigned to which).

Apache setup is as easy as "scp -r"-ing the existing Apache install tree from one of my existing nodes, so I'll probably go with that, at least at first - but if it appears to be hitting some CPU limitation I could look into nginx in a few days.

thanks, matt.

Apache will hit memory limitations long before it hits CPU limitations (though the fact that it won't have to carry a PHP interpreter around with it everywhere it goes will help).

I strongly suggest using mpm_worker instead of the standard prefork. Worker is threaded, so it will use less RAM, and since you aren't using PHP you don't need to worry about thread-safety problems.
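A starting point for worker tuning might look like this (the numbers are illustrative, not tuned for this workload):

```
# httpd.conf sketch -- illustrative values only
<IfModule mpm_worker_module>
    ServerLimit         16
    StartServers         2
    ThreadsPerChild     25
    MaxClients         400   # ServerLimit * ThreadsPerChild
</IfModule>
```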

@obs:

I strongly suggest using mpm_worker instead of the standard prefork. Worker is threaded, so it will use less RAM, and since you aren't using PHP you don't need to worry about thread-safety problems.

Thanks - I hadn't known about that, and will research that option. I've always done prefork.

I've got one node set up - but ran into a minor snag. Because it's the 29th of the month, I don't have access to the full 800GB quota until the 1st - the dashboard shows 77GB available, and overages will cost extra. We'll probably burn through that quota in the first few hours!
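That 77 GB figure looks like the 800 GB quota prorated over the days left in the month (an assumption on my part), e.g. 3 days remaining in a 31-day month:

```shell
# prorated quota: 800 GB * days_remaining / days_in_month
echo $(( 800 * 3 / 31 ))   # prints 77 (GB)
```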

(This would have been a brilliant plan, if only our site had exploded in popularity at the beginning of the month rather than at the end.)

I'll probably set up a second server within my own account - as my personal server is well under quota, I've got about 600GB I can donate.

And the cost isn't excessive, really… $0.10 per GB is cheap, considering that each and every transaction this server makes will be to serve a paying customer.

I suggest contacting support and seeing if they can work something out with you on this.

EDIT:
> And the cost isn't excessive, really… $0.10 per GB is cheap, considering that each and every transaction this server makes will be to serve a paying customer.

Yeah, but those charges add up… I run a brick-and-mortar business where we accept credit cards. While the transaction fees are small, spread across an entire month of revenue and thousands of charges they add up to a lot… It's a bill I'd rather not have to pay. Well, to be honest, I don't mind paying for the service, I just wish the fees were even smaller :)
