Dear fellows,
I am currently looking for a security solution for my company. I know I will not get many answers from the list, as security is pretty much the secret recipe of every network operator.
However, I'd better try a post here and see what feedback I can get, so let's get started.
I run a farm of 15 servers, all running RedHat Linux 5 x64. These servers are mainly webhosting oriented; they handle website files, databases and email. The network is multihomed, with a capacity of 3 x 100 Mbit. We currently don't have any kind of security, not even a firewall appliance (yes, I know, shame on me).
At this point, I am looking at a cost-effective solution. I have checked around for commercial solutions and have found Cisco and Juniper to be my options.
I must admit that I am not convinced at all by these brands and would feel pretty ashamed to have a Cisco ASA toy in my rack. As for Juniper, the boxes seem a bit overpriced for my single-featured IT department and would kill my poor yearly budget.
I occasionally see some dirty forged packets hitting the servers. They have never taken a server down, nor made one fill up its memory, but I reckon I could see some "dos" or even non-botnet-size "ddos" attacks. Another point is that the firewall must be transparent: some servers require public IPs (for dumb licensing reasons).
What would you advise? Is BSD/Linux with a multi-gig port a good option to consider? What firewall would you recommend? How do you clean up a ddos?
Looking forward to reading all answers.
Regards.
- Simon
Simon Leins wrote:
Dear fellows,
I am currently looking for a security solution for my company. I know I will not get many answers from the list, as security is pretty much the secret recipe of every network operator.
There is nothing secret about it:
- don't run untrusted code
- fix/remove vulnerable software ASAP
- firewall appropriately
- split management/client
- have good contacts upstream
- watch all the logs, but don't log too much
However, I'd better try a post here and see what feedback I can get, so let's get started.
I run a farm of 15 servers, all running RedHat Linux 5 x64. These servers are mainly webhosting oriented; they handle website files, databases and email. The network is multihomed, with a capacity of 3 x 100 Mbit. We currently don't have any kind of security, not even a firewall appliance (yes, I know, shame on me).
Why are you 'ashamed' that you don't have one? Since you are running some form of Linux, and Linux has this thing called 'iptables', there is not much of a difference apart from it being a software-based firewall, and it will most likely give you a lot more flexibility than an off-the-shelf product aka "appliance". The latter will probably just let you click something together that has no appropriate effect for your environment.
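As a rough sketch of what such an iptables policy might look like for one of these hosts (interface names, ports and the management prefix are assumptions; adapt to what you actually run), in iptables-restore format:

```
# Hypothetical /etc/sysconfig/iptables fragment -- load with iptables-restore.
# Default-deny inbound; allow loopback, established traffic and the hosting services.
*filter
:INPUT DROP [0:0]
:FORWARD DROP [0:0]
:OUTPUT ACCEPT [0:0]
-A INPUT -i lo -j ACCEPT
-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
# public services: HTTP, HTTPS, SMTP, POP3, IMAP
-A INPUT -p tcp --dport 80 -j ACCEPT
-A INPUT -p tcp --dport 443 -j ACCEPT
-A INPUT -p tcp --dport 25 -j ACCEPT
-A INPUT -p tcp --dport 110 -j ACCEPT
-A INPUT -p tcp --dport 143 -j ACCEPT
# SSH only from a management network (192.0.2.0/24 is a placeholder prefix)
-A INPUT -p tcp -s 192.0.2.0/24 --dport 22 -j ACCEPT
COMMIT
```

Everything not explicitly opened is dropped by the INPUT policy, which is the flexibility an appliance's point-and-click interface usually hides from you.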
It of course, like always, completely depends on your requirements.
The best thing to remember is that as long as nothing is listening, they can't get at you through that port; then again, there are also TCP-based exploits that only need to talk to the TCP stack itself... aka upgrade and fix vulnerabilities in time.
As you say 'webhosting', your biggest worry won't be that; it will be all the great php/perl/whatever scripts written by people who haven't figured out what security means, causing great things such as SQL injections or plain remote file inclusions. (Aka: enable PHP error logging in full to a file and see what junk you get. You might also want to consider running PHP with Suhosin.)
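For the PHP logging part, a hypothetical php.ini fragment (the log path is an assumption):

```
; log everything to a file instead of showing it to visitors
log_errors = On
error_log = /var/log/php_errors.log
display_errors = Off
error_reporting = E_ALL
```

Watching that file is often the quickest way to spot scripts being probed for inclusions and injections.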
Of course, you can always limit with iptables the outbound ports/addresses that your box talks to unless the connection was 'established'.
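A sketch of that outbound restriction (ports and policy choices here are assumptions, not a recommendation for every setup):

```
# Hypothetical OUTPUT fragment: let replies to established connections out,
# but only allow *new* outbound connections for DNS and SMTP, and log the rest.
# This catches compromised web scripts trying to phone home.
*filter
:OUTPUT DROP [0:0]
-A OUTPUT -o lo -j ACCEPT
-A OUTPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
-A OUTPUT -p udp --dport 53 -j ACCEPT
-A OUTPUT -p tcp --dport 53 -j ACCEPT
-A OUTPUT -p tcp --dport 25 -j ACCEPT
-A OUTPUT -m limit --limit 5/min -j LOG --log-prefix "outbound-blocked: "
COMMIT
```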
If you have CPU cycles free on your hosts, you could always run Snort or something like it, but there will be lots of alerts that are of no concern at all.
I occasionally see some dirty forged packets hitting the servers.
Nothing you can do about it as upstream needs to take care of spoofed packets. Nevertheless, iptables can take care of most of the junk.
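For the junk that does reach you, a hypothetical anti-spoofing fragment (eth0 as the public interface is an assumption) that drops obviously forged source addresses:

```
# Hypothetical iptables fragment: packets from private or loopback ranges
# should never arrive on the public interface -- drop them early.
*filter
-A INPUT -i eth0 -s 10.0.0.0/8 -j DROP
-A INPUT -i eth0 -s 172.16.0.0/12 -j DROP
-A INPUT -i eth0 -s 192.168.0.0/16 -j DROP
-A INPUT -i eth0 -s 127.0.0.0/8 -j DROP
COMMIT
```

Setting `net.ipv4.conf.all.rp_filter = 1` in sysctl adds a kernel-level reverse-path check on top of this.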
They have never taken a server down, nor made one fill up its memory, but I reckon I could see some "dos" or even non-botnet-size "ddos" attacks.
Nothing will help you there with your 300 Mbit, and most likely only 100 Mbit on the servers themselves. You'll be gone in seconds.
Another point is that the firewall must be transparent: some servers require public IPs (for dumb licensing reasons).
The proper solution is to separate your management and your client-facing networks. That way, even if the client-facing network is overloaded due to whatever is happening you can still get in through the backdoor. Of course, you should not publish the management details in any way....
What would you advise? Is BSD/Linux with a multi-gig port a good option to consider? What firewall would you recommend? How do you clean up a ddos?
You don't "clean" a ddos.... The best thing you can do is avoid attracting one and keep your systems clean.
Greets, Jeroen
Morning
I just about agree with Jeroen.
* on the Wed, Sep 16, 2009 at 10:38:21AM +0200, Jeroen Massar wrote:
As you say 'webhosting', your biggest worry won't be that; it will be all the great php/perl/whatever scripts written by people who haven't figured out what security means, causing great things such as SQL injections or plain remote file inclusions. (Aka: enable PHP error logging in full to a file and see what junk you get. You might also want to consider running PHP with Suhosin.)
That's not enough, by far. You might also consider using mod_security.
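A hypothetical minimal mod_security (2.x) setup along those lines; directives and the log path are illustrative, and the custom rule is only a toy:

```
# Hypothetical httpd config fragment: start in detection-only mode and
# read the audit log for a while before switching the engine to "On".
SecRuleEngine DetectionOnly
SecAuditEngine RelevantOnly
SecAuditLog /var/log/httpd/modsec_audit.log
# toy rule: flag simple remote-file-inclusion attempts in request arguments
SecRule ARGS "@contains http://" "phase:2,deny,status:403,id:100001,msg:'possible RFI'"
```

Running detection-only first is what avoids the phpMyAdmin-style false positives mentioned elsewhere in this thread.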
I occasionally see some dirty forged packets hitting the servers.
Nothing you can do about it as upstream needs to take care of spoofed packets.
You can do some rough ingress-filtering on your routers. And you definitely should do egress-filtering on them, so YOU can't become a source of spoofed packets.
Nevertheless, iptables can take care of most of the junk.
Yes, I'd recommend doing just that. Filter out any junk with iptables; block from the outside any ports you're not using for services (so a user on your machine running a daemon can't receive connections to it from the outside), and limit outgoing connections. I personally also like to rate-limit ICMP.
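ICMP rate-limiting might look like this (the rates are invented; tune them to your traffic):

```
# Hypothetical iptables fragment: answer pings, but not ping floods,
# and never block the ICMP needed for path-MTU discovery.
*filter
-A INPUT -p icmp --icmp-type echo-request -m limit --limit 4/second --limit-burst 20 -j ACCEPT
-A INPUT -p icmp --icmp-type echo-request -j DROP
-A INPUT -p icmp --icmp-type fragmentation-needed -j ACCEPT
COMMIT
```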
But don't be over-zealous, especially not where ICMP is concerned: http://portal.acm.org/citation.cfm?id=1050542
Cheers Seegras
Peter Keel wrote:
Morning
I just about agree with Jeroen.
- on the Wed, Sep 16, 2009 at 10:38:21AM +0200, Jeroen Massar wrote:
As you say 'webhosting', your biggest worry won't be that; it will be all the great php/perl/whatever scripts written by people who haven't figured out what security means, causing great things such as SQL injections or plain remote file inclusions. (Aka: enable PHP error logging in full to a file and see what junk you get. You might also want to consider running PHP with Suhosin.)
That's not enough, by far. You might also consider using mod_security.
Beware of the default mod_security filters: they once disabled phpMyAdmin on a cluster of ours (a long time ago). Still, it's generally good protection against XSS and SQL injections (and w00tw00t-like scans).
Nevertheless, iptables can take care of most of the junk.
Yes, I'd recommend doing just that. Filter out any junk with iptables; block from the outside any ports you're not using for services (so a user on your machine running a daemon can't receive connections to it from the outside), and limit outgoing connections. I personally also like to rate-limit ICMP.
Yeah, I prefer rate limiting with monitoring of when packets start being refused, so you can follow your bandwidth/pps progression. I also rate-limit UDP (after a UDP flood from a hacked server on the inside) and rate-limit TCP session establishment.
I'm doing this with OpenBSD's pf on our BGP routers. We don't have much traffic (~ 20 Mbps) so it's working like a charm.
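A hypothetical pf.conf fragment in that spirit ($ext_if and $web_srv are assumed macros, and the numbers are made up; tune them to your traffic):

```
# Hypothetical pf.conf fragment: cap per-source TCP connection counts and
# rates, and dump offenders into a table that is blocked outright.
table <abusers> persist
block in quick from <abusers>
pass in on $ext_if proto tcp to $web_srv port { 80 443 } \
    keep state (max-src-conn 100, max-src-conn-rate 15/5, \
    overload <abusers> flush global)
```

The overload table is what makes this cheap: once a source trips the limit, pf drops its packets without tracking further state.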
But none of this protects against DDoS. I don't know how to block those without heavy traffic analysis involving a lot of probabilities and other mathematical machinery.
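Even a crude statistical check can serve as a first pass before anything heavier. A toy sketch (thresholds and the sliding-window approach are invented for illustration, not a proven method) that flags sources whose per-interval packet count jumps far above their own recent average:

```python
from collections import defaultdict, deque

def make_detector(window=10, factor=5.0, floor=100.0):
    """Return a function that flags a source as suspicious when its
    packets-per-interval count exceeds both an absolute floor and
    `factor` times that source's recent average."""
    history = defaultdict(lambda: deque(maxlen=window))

    def observe(src, pps):
        h = history[src]
        baseline = sum(h) / len(h) if h else 0.0
        h.append(pps)
        # suspicious only if well above the floor and the source's own baseline
        return pps > floor and (baseline == 0 or pps > factor * baseline)

    return observe

observe = make_detector()
for _ in range(10):
    observe("203.0.113.7", 20)       # normal background traffic, never flagged
print(observe("203.0.113.7", 5000))  # sudden spike -> True
```

In practice you would feed this from interface counters or flow exports, and use the flags only to drive rate limits, not hard blocks, because of false positives.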
But don't be over-zealous, especially not where ICMP is concerned: http://portal.acm.org/citation.cfm?id=1050542
I recently saw a customer who couldn't stream any video correctly because his (previous) hoster was blocking some ICMP types. Rate limiting is far better in my opinion, but it must be maintained to make sure the rate stays adequate.
Julien Escario