Abstract
Web servers are the building blocks of this generation's distributed systems and play a major role in enterprise application architectures. The flexibility and openness of the Web-server-centric computing model make this ubiquity possible, but they also expose corporate data and processes to security threats. This paper focuses on the security of public Web servers. Web servers face many different vulnerabilities, such as exploits that target software bugs in the underlying operating system, denial of service, unauthorized access to confidential information, man-in-the-middle attacks, subversion of the Web server for use as a bot in a larger botnet, and piracy. This paper lists three of the most common vulnerabilities that plague Web servers and enumerates the processes and tools for mitigating them. It also presents an architectural design that could be adopted to help prevent a Denial of Service (DoS) attack on a Web server. About 40% of federal websites are not secure because they have not adopted DNSSec; the paper looks at the reasons federal agencies have not adopted DNSSec for their websites, enumerates several DNS vulnerabilities, and discusses the mitigation and defense mechanisms needed to overcome them. By implementing these defense mechanisms, federal agencies can secure their websites.
Keywords: Web Servers, Denial of Service, DNSSec, Vulnerabilities
INTRODUCTION
Most organizations rely on a Web-centric business model, using websites to provide information about themselves or to sell their products or services. Government information systems are also becoming Web-centric so that technology helps agencies meet and exceed the expectations of citizens in a cost-effective manner. The Web has two principal components: applications that make information available (Web servers) and those used to access and display the information (Web browsers). This paper focuses on the security issues of Web servers.
WEB APPLICATION VULNERABILITIES AND MITIGATION
While Web servers are prone to a myriad of attacks and vulnerabilities, this paper will examine three of the most common Web application vulnerabilities and their mitigation strategies.
Remote code execution: This highly critical vulnerability allows the attacker to run system-level code on the server and thereby retrieve any information it holds. It arises from improper coding. Two such errors are common: exploitation of register_globals in PHP and XMLRPC for PHP vulnerabilities. In earlier versions of PHP, the register_globals setting, which makes superglobal variables available, was on by default, which led to insecure coding and wide exploitation. XMLRPC allows remote procedure calls over the Internet, and several implementations pass unsanitized data to the server, allowing the attacker to execute code on the system.
The mitigation for the exploitation of register_globals in PHP is to update PHP to the latest version, as later versions have this flag off by default. Similarly, data from XMLRPC or any other user input must be thoroughly sanitized before it is used, as sketched below.
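As an illustration, the following minimal Python sketch (the parameter name and whitelist rules are hypothetical assumptions, not taken from any specific application) shows one way to sanitize untrusted input by accepting only a conservative whitelist before the value reaches an XML-RPC handler, query, or system call:

import re

# Hypothetical whitelist: letters, digits, underscore, dot, and hyphen only.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_.\-]{1,32}$")

def sanitize_username(raw: str) -> str:
    # Accept only values matching the whitelist; reject everything else.
    if not USERNAME_RE.fullmatch(raw):
        raise ValueError("invalid username")
    return raw

print(sanitize_username("alice_01"))        # accepted
# sanitize_username("alice'; DROP TABLE")   # raises ValueError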
SQL injection: MS SQL allows execution of system-level commands through the MS SQL server via extended stored procedure calls, and its error messages reveal a great deal of information. This moderately to highly critical vulnerability enables the attacker to retrieve information from the database that would not normally be available, while the error messages disclose details such as the database name, table names, usernames, and password hashes.
Countermeasures include avoiding connections to the MS SQL server as a superuser or database owner and instead connecting as a customized user with the bare minimum privileges required for the task. If the PHP magic_quotes_gpc setting is available and enabled, it automatically escapes special characters in POST, GET, and COOKIE data in later versions of PHP. PHP versions after 4.3.0 provide two functions, addslashes (the old method) and mysql_real_escape_string (the newer method), to sanitize user data by prepending backslashes to \x00, \n, \r, \, ', " and \x1a. Using stored procedures with automatically parameterized parameters will also mitigate this vulnerability.
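As a rough sketch of the parameterized-query idea, the following Python example uses the built-in sqlite3 module (the database, table, and column names are hypothetical; the same pattern is available in MS SQL and MySQL drivers): the driver binds the user-supplied value as data, so it is never interpreted as SQL.

import sqlite3

conn = sqlite3.connect("app.db")   # hypothetical application database
cur = conn.cursor()

email = input("email: ")           # untrusted user input
# The "?" placeholder ensures the value is bound as data, not spliced into SQL.
cur.execute("SELECT id, name FROM customers WHERE email = ?", (email,))
for row in cur.fetchall():
    print(row)
conn.close()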
Cross-site scripting: This low to moderately critical vulnerability occurs when the victim follows a malicious URL disguised to look legitimate. It happens when the website contains an XSS bug, which may enable stored XSS attacks, DOM-based XSS attacks, or reflected XSS attacks. The server-side mitigation is context-sensitive output encoding, which prevents the inclusion of untrusted data in the HTML of the response, the cause of server XSS. By using safe JavaScript APIs, the use of unsafe data to update the DOM, the cause of client XSS, can be eliminated.
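The following minimal Python sketch (the function and parameter names are hypothetical) shows output encoding for an HTML body context: html.escape converts &, <, >, and quote characters into HTML entities, so injected markup is rendered as inert text rather than executed.

import html

def render_greeting(untrusted_name: str) -> str:
    # Encode the untrusted value for an HTML body context before including it.
    return "<p>Hello, {}</p>".format(html.escape(untrusted_name, quote=True))

print(render_greeting('<script>alert("xss")</script>'))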
ARCHITECTURAL DESIGN TO PROTECT WEB SERVERS FROM DoS
Figure 1: A possible architectural design for protecting Web servers from DoS attacks
Source:
SECURITY RISKS FACING U.S. GOVT WEBSITES
A regulation requires all government websites to employ an extra layer of authentication by using DNS security extensions to prevent the hijacking of Web traffic to bogus sites, but roughly 40% of sites were still not compliant by the given deadline of December 31, 2009. Various studies showed that only 57% to 59% of federal websites had adopted DNSSec, while the rest had not. DNS attacks such as cache poisoning, which is used to divert traffic from legitimate websites to bogus ones without the knowledge of either the user or the website operator, can be prevented with DNSSEC because it forces websites to verify domain names and their corresponding IP addresses using public-key infrastructure (PKI) and digital signatures. If all levels of the DNS, such as the root zone, the top-level domain (TLD), and individual websites, support the DNSSEC standards, man-in-the-middle attacks can be prevented. Since the root zone and the .gov TLD are already cryptographically signed, following the standards would improve end-to-end security. The reasons for the low adoption rate are:
Federal agencies are not paying attention to this
Government agencies feel that they are adequately protected, and federal CIOs are not aware that government sites can be hijacked
The issue is not on the radar screen of executives
There is a delay in establishing the chain of trust, which is done by adopting the DNSSec standards from the root level down to the TLD
The key rollover process takes three to four years, without which the domain name will not validate
There is a delay in promulgating the latest FISMA guidelines that make it mandatory for websites to use validating caching resolvers
MITIGATING DNS ATTACKS USING DNSSEC
The DNS was not built with security in mind and as a result has many vulnerabilities, including:
Forged answers, registry compromise, and bogus routes: since DNS traffic is not encrypted, responses can be manipulated or spoofed
DDoS attacks, which make computing resources unavailable to users
Amplification attacks, where attackers use publicly accessible open DNS servers to flood a target system with DNS response traffic
Interception and modification of response packets, and fake or manipulated name servers of a zone
Cache poisoning, where an attacker injects a malicious mapping between a domain and an IP address, redirecting the user to a malicious web or email server
Most attacks, however, are cache poisoning and DDoS attacks. The Domain Name System Security Extensions (DNSSEC) add several security features to DNS by providing DNS clients and resolvers with authentication of the origin of DNS data, authenticated denial of existence, and data integrity, all derived from signing the information that DNS zones contain. A DNS resolver can therefore establish the authenticated origin of the DNS information, verify it by contacting the authoritative nameserver, and evaluate the integrity of its data. DNSSEC authenticates the entity that signed the data, thereby providing non-repudiation, and it protects DNS resolvers from cache poisoning, which forges or manipulates DNS data. When a DNSSec-aware resolver looks up a domain name that is DNSSec enabled, the following process ensues. The stub resolver goes into recursive mode. The query first goes to the recursive name server, which forwards the query toward the trust anchor; the trust anchor is usually the root. Using this recursive mode, the DNSSec response to a resolution request can be validated by checking the entire chain from the root to the TLD; this is possible because key rollover helps set up a chain of trust from the root to the TLD. This protects nameservers from attacks that manipulate their data or substitute fake nameservers. As a result, a parent zone can verify the integrity of a child zone's data and notify the user about it. However, DNSSEC does not protect against DoS or DDoS attacks; to prevent those, IPSEC, TSIG (Transaction SIGnature), and SIG(0) can be used.
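As a rough illustration of this validation from a client's point of view, the following Python sketch uses the third-party dnspython package to send a query with the DNSSEC OK (DO) bit set to a validating recursive resolver and then checks the Authenticated Data (AD) flag in the response; the resolver address and domain name are placeholders, not taken from this paper.

import dns.flags
import dns.resolver

resolver = dns.resolver.Resolver(configure=False)
resolver.nameservers = ["8.8.8.8"]        # assumed to be a validating resolver
resolver.use_edns(0, dns.flags.DO, 1232)  # request DNSSEC records (DO bit)

answer = resolver.resolve("nist.gov", "A")
if answer.response.flags & dns.flags.AD:
    print("Resolver validated the DNSSEC chain of trust (AD flag set)")
else:
    print("Response not validated; treat the data as unauthenticated")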
The following points have to be kept in mind by federal agencies for DNSSec deployment:
DNSSec utilizes public-key cryptography, and the records of a zone are digitally signed. The DNS root zone is verified by a set of public keys, and a chain of trust is established. The domain administrator must upload the DNSKEY (KSK) hash to the parent zone. The overhead of implementing DNSSec includes generating keys, signing zones, publishing key hashes to the parent zone, and general maintenance of the DNS server. Configuration mistakes can make both the DNS and DNSSec services unavailable. Memory and CPU utilization are higher, sometimes by as much as seven times. Because packet sizes increase, additional bandwidth is needed, resulting in network overhead. To handle the larger packets, network appliances may have to fragment the DNS packets and reassemble them after processing, which can make these packets difficult for firewalls and proxies to handle.
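The key-hash upload mentioned above can be sketched as follows, again using dnspython with a placeholder domain: the administrator fetches the zone's DNSKEY RRset, selects the key-signing key (flag value 257), and derives the DS digest that is published in the parent zone.

import dns.dnssec
import dns.name
import dns.resolver

domain = dns.name.from_text("example.gov")      # placeholder zone
dnskeys = dns.resolver.resolve(domain, "DNSKEY")

for key in dnskeys:
    if key.flags == 257:                        # 257 marks a KSK; 256 marks a ZSK
        ds = dns.dnssec.make_ds(domain, key, "SHA256")
        print(ds.to_text())                     # digest uploaded to the parent zone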
Key management is critical in a DNSSec implementation, and key rollover is a complicated process with many steps that must be performed in the correct order without missing any; these steps include generating different types of keys. All intermediate components must be configured properly, adhering to the DNSSec standards, or validation fails. The most important components in a DNSSec deployment are the root and TLD zones, on whose security many zones rely to provide validation to their end users. DNSSec uses additional timestamps to protect against replay attacks and attempts to retrieve the DNSKEY; these have to be refreshed often by re-signing the zone and redistributing the signatures to the delegated servers, failing which a SERVFAIL error can occur.
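Because signatures carry inception and expiration timestamps, operators typically monitor how close a zone's signatures are to expiring; a minimal monitoring sketch with dnspython (placeholder domain and resolver address) might look like this:

import time

import dns.message
import dns.query
import dns.rdatatype

query = dns.message.make_query("example.gov", dns.rdatatype.SOA, want_dnssec=True)
response = dns.query.udp(query, "8.8.8.8", timeout=5)

for rrset in response.answer:
    if rrset.rdtype == dns.rdatatype.RRSIG:
        for sig in rrset:
            days_left = (sig.expiration - int(time.time())) // 86400
            print("RRSIG covering", dns.rdatatype.to_text(sig.type_covered),
                  "expires in", days_left, "days")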
Conclusion
Web servers are an integral part of business today. Due to their open and flexible nature, they are also prone to malicious attacks, which can be disruptive or involve unauthorized access. Some of the most common disruptive attacks are DNS cache poisoning and DDoS attacks, while attacks such as man-in-the-middle attacks lead to unauthorized access. This paper provides potential mitigation actions for some of the common attacks, focusing on DDoS mitigation and guidelines for DNSSec implementation for federal websites.
References
Kargl, F., Maier, J., & Weber, M. (2016, July 9). Protecting Web servers from distributed denial of service attacks. Retrieved from www10.org: http://www10.org/cdrom/papers/409/
Marsan, C. D. (2012, March 15). 40% of U.S. government web sites fail security test: DoD, CIA among agencies that haven't adopted extra DNS security measures. Retrieved from networkworld.com: http://www.networkworld.com/article/2186860/data-center/40--of-u-s--government-web-sites-fail-security-test.html
OWASP. (2016, April 6). Cross-site scripting (XSS). Retrieved from owasp.org: https://www.owasp.org/index.php/Cross-site_Scripting_(XSS)
Poulidis, A., & Rohani, H. (2014). Research project 2: DNSSec revisited. Amsterdam: University of Amsterdam.
Siddharth, S., & Doshi, P. (2010, November 10). Five common Web application vulnerabilities. Retrieved from symantec.com: http://www.symantec.com/connect/articles/five-common-web-application-vulnerabilities
Tracy, M., Jansen, W., Scarfone, K., & Winograd, T. (2007). Guidelines on securing public web servers: recommendations of National Institute of Standards and Technology. Gaithersburg, MD: Computer Security Division, NIST.