We have to connect our network to the internet; how can we defend it?
Intuitively, one would expect that the defence of the company network from outsiders is at the forefront of corporate management’s mind. Information is the primary asset for companies operating in most industries, and management is brought up to be fearful of data loss, corruption, and unauthorised disclosure.
Further, security incidents on the internet regularly receive a high profile in the media. The replacement of material on the Department of Justice and CIA web sites recently caught the public eye, and this is not exceptional: internet security has been a rich source of journalistic material ever since the Internet Worm. Management should, by now, be familiar with the idea of people attacking networks from the outside, armed with a significant degree of networking knowledge and expertise and with access to an array of suitable tools.
Consequently, most people you speak to say that security is important to their company, that they are aware of Internet security issues, and, indeed, security is often given as the justification for some surprising and arcane network configurations.
However, in practice there seems to be a polarisation between stated intent and achievement. When companies' networks are examined from the internet, the range of hosts and protocols on offer often speaks for itself about the corporate commitment to security, and the well-intentioned statements about the level of attention given to security are shown to be complacent platitudes. Analysis of around one hundred sites requesting trials of the Netcraft Network Examination service has given us a good deal of empirical knowledge about the state of corporate internet gateways and the people who manage them.
Does anyone do any testing?
Very few people test the configuration of their internet gateways at all, never mind on a regular basis. Of the companies, primarily based in the UK, that requested a free trial of our own Network Examination Service, only three had previously run a visibility test of their own network from outside their own router, and none were doing so frequently and regularly.
This points to a gap in the education of corporate MIS management. Probably none of the companies concerned would release an untested program into a production environment, because they are familiar with the inherent risk and its likely consequences. Yet nearly all of them have connected networks to the internet without testing to see which hosts in their own network address space are visible from the internet, which services (protocols) those machines offer to the internet at large, or whether those services have well-known vulnerabilities.
Those companies who have tested their network configurations have typically done it as a one-off exercise, often immediately after connecting to the internet. This misses the point that security is temporal; a site which has no known vulnerabilities one day, may be trivially vulnerable the next, perhaps because someone has made a configuration change to a router or a piece of software, or because a feature becomes known in a piece of software not previously known to be insecure.
More than half the people running corporate internet gateways are complacent about security to the point of comic theatre. Classic quotes I’ve heard in conversations with prospects for our own Network Examination Service during the course of this year include:
- "We wouldn’t have connected to the internet if we weren’t secure." (a national newspaper)
- "There’s no point in testing our security from the internet at the moment because we haven’t defined our security policy yet."
- "We haven’t finished configuring our firewall yet, so it’s bound to be insecure."
- "If I wanted to check what protocols were visible to the internet, I’d look at our router configuration." (a computer manufacturer)
- "We had a firm of consultants check our internet gateway when we installed it."
- "When I saw the report I phoned our ISP, and they say they don’t configure routers to offer those protocols, so there must be something the matter with your software."
- "The router’s not our problem. Our ISP manages our router for us, so it would be their problem if there was something wrong with it."
The counterpoint to this lack of care and responsibility is the anecdotes that well-known internet security consultants throw into their presentations; one remark of Brent Chapman’s particularly springs to mind:
It’s not unusual for big name companies to be attacked within a couple of hours of connecting to the internet, and it’s not unheard of for those attacks to be successful.
What are the most common mistakes?
Insufficient protocol filtering
The most obvious network configuration error is insufficient protocol filtering. A lack of filtering invites probes of any machine on the network, and that means there are simply too many points of entry to defend them all. Gaining control of any machine on the local area network gives an attacker a bridgehead into the organisation; even if the machine is not in itself particularly valuable, it can be used as a platform to attack others, and it gives access to broadcast packets.
The most commonly encountered host configuration error to date has been the enabling of the UDP services echo, chargen, and time, especially on routers. Even otherwise well-protected networks often have this vulnerability, probably because it was the default configuration on Cisco routers until recently, and Cisco is a very popular choice of router; all the major UK ISPs use them. The impact is that someone who takes a dislike to a site can use up its bandwidth by sending a packet to the chargen port on its router with the source port and source address set to the echo port on someone else’s router. This is likely to cause the two routers to loop, sending packets to each other and using up a substantial portion of the site’s available bandwidth.
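On Cisco IOS, for example, the fix is a one-line configuration change; the commands below reflect IOS of that era, and are worth verifying against your own router’s documentation:

```
! Disable the UDP and TCP "small servers" (echo, chargen, discard, daytime),
! which serve no purpose on a production router and enable the loop attack
! described above. Recent IOS releases make this the default.
no service udp-small-servers
no service tcp-small-servers
```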
Relatively few sites devote time to actively tracking security information and patches from vendors. Consequently, it’s common to see SMTP banners from versions of sendmail long since known to be insecure. Sendmail, though, is by no means the totality of the problem. For example, the October 1996 Netcraft Web Server Survey found 9316 sites running NCSA/1.3, which has been known to be insecure since February 1995. And in the Network Examination trials we have found only one site running Microsoft-Internet-Information-Server/1.0 that was not vulnerable to trivial attacks that have been well known since March 1996.
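Checking what your own services announce to the world is straightforward to automate. A minimal sketch in Python, shown purely for illustration (it is not the method any particular testing service uses):

```python
import socket

def grab_banner(host, port, timeout=5.0):
    """Connect to a TCP service and return whatever greeting it sends.

    Many services (SMTP, FTP, some web servers) announce their software
    and version in the first line -- exactly what an attacker matching
    banners against lists of known vulnerabilities looks for.
    """
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.settimeout(timeout)
        return s.recv(512).decode("ascii", errors="replace").strip()
```

If the banner names a version with a known vulnerability, it should be treated as an open invitation, whether or not anyone has accepted it yet.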
So, how should we defend our network?
The first thing is to realise that there’s nothing easy about it. Security, like reputation, has to be continually earned, and can be lost in an instant.
Compile a list of exactly what services need to be offered to the internet at large from your network. Implement filtersets on your router to ensure that exactly these services are available and no more. If you don’t already have one, a simple router such as the Cisco 2501 can be bought for around £1200 and is the best value individual purchase you can make in defending your network.
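As a sketch of the kind of filterset this implies, assuming a Cisco IOS router, with 192.0.2.0/24 standing in as a placeholder for your own address space (the hosts and services below are illustrative, not a recommendation):

```
! Permit only the services we deliberately offer, plus replies to
! connections our own machines open; deny everything else inbound.
access-list 101 permit tcp any host 192.0.2.1 eq smtp
access-list 101 permit tcp any host 192.0.2.2 eq www
access-list 101 permit tcp any any established
access-list 101 deny   ip any any
!
interface Serial0
 ip access-group 101 in
```

The essential discipline is the final deny: anything not on the list of services you chose to offer does not get in.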
Test it. These tests should be run from outside your own router, so that by definition they see your network from the same perspective as any other internet user, and they should cover the entire network address space. Only the omniscient can tell what their network looks like from the outside without actually being there.
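Such a visibility test can be sketched in a few lines of Python; the addresses and ports here are placeholders, and a real test would also cover UDP and compare the results against your intended filterset:

```python
import socket

def visible_tcp_services(host, ports, timeout=2.0):
    """Return the subset of `ports` on `host` that accept a TCP connection.

    Run this from OUTSIDE your own router, or the results say nothing
    about what the internet at large can see.
    """
    open_ports = []
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)
        except OSError:
            pass  # refused, filtered, or timed out: not visible
    return open_ports

# Sweep the well-known ports across your whole allocated block, e.g.:
# for host in ("192.0.2.%d" % n for n in range(1, 255)):
#     print(host, visible_tcp_services(host, [21, 23, 25, 53, 79, 80, 110, 111]))
```

Anything that turns up open and is not on your list of deliberately offered services is a filtering failure, found before someone less friendly finds it.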
Test it again. Regular automated testing is necessary to accommodate change. Whereas a one-off consultant’s report is a very fine thing, it is out of date the next time someone makes a router configuration change or a vulnerability is found in a commonly used service. Frequent testing mitigates this risk.
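A scheduled job is the simplest way to make the testing regular rather than heroic. A minimal crontab sketch, where `scan-gateway` is a placeholder for whatever test harness or external service you use:

```
# Scan the whole allocated block nightly, from a host OUTSIDE your router,
# and keep the output so that changes from one run to the next stand out.
0 3 * * * /usr/local/bin/scan-gateway 192.0.2.0/24 >> /var/log/gateway-scans
```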
Religiously track security information for all the software you use to provide services to the internet. If you use an external supplier to do your testing, they will help with this.
It is commercially imperative for companies to integrate their own information systems with the internet and provide a rich level of service to the internet community. At present, people are insufficiently fearful of the bad things that might happen once their network is accessible to the world at large, and do not develop a mindset or an attention to detail commensurate with the risk this entails. External testing can provide the empirical information to confront complacency and lack of responsibility, and to reduce a company’s exposure to external vulnerabilities.