We have all heard about web services.
Are they the key building blocks for the next generation of killer apps? Microsoft is betting its next-generation architecture on them (in .NET), and many of the companies that have traditionally taken opposite sides on infrastructure (IBM, Sun, and Microsoft) are embracing them in one form or another.
On the surface, this would appear to smooth out many of the interoperability problems that have plagued both software developers and end users who want their applications to run well in heterogeneous environments. Seems like a good thing, right? Maybe. You hear about all the great things that web services will bring, but you don't hear much about their potential security problems. This article takes a look at some of those problems.
Web Services – Why Are They Different?
First, here’s some background. A web service is a piece of software that allows programs to make remote procedure calls to it via a standard XML-based protocol, such as the simple object access protocol (SOAP) or XML-RPC. Unlike typical web applications, a web service is meant to be accessed by programs rather than browsers. This won’t mean much from an end user’s perspective; they still just see the web pages that the web server provides. But from an application developer’s standpoint, it is a huge paradigm shift. Where previously a web developer would essentially have to pretend to be a browser and extract data from a page that was designed to be displayed by browsers, a web service returns data directly as the output parameters of procedure calls. Not only is this more straightforward for the developer, it eliminates the trouble of having to change the client each time the web page’s format changes. The image below shows how the traditional and new web services models may co-exist.
Figure 1: Traditional and web services models
We’ve all seen remote procedure calls before, so why are these any different? In previous incarnations of RPC, you had to subscribe to a proprietary format and infrastructure. This was true of Sun RPC, DCE, Microsoft RPC, DCOM, and CORBA. Web services, on the other hand, use XML and an infrastructure you already have: your web server.
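To make the contrast concrete, here is a minimal sketch of what actually travels over the wire in an XML-RPC exchange, using only the Python standard library. The method name "getStockQuote" and its parameter are invented for illustration; the point is that both the call and the response are plain, human-readable XML rather than a proprietary binary format.

```python
# Marshal an RPC call and response into XML-RPC's wire format.
# No server is needed to see the format itself.
import xmlrpc.client

# The client marshals a procedure call into an XML document...
request_xml = xmlrpc.client.dumps(("IBM",), methodname="getStockQuote")
print(request_xml)  # <methodCall> with the method name and parameters

# ...and a server reply is unmarshalled back into native values.
response_xml = xmlrpc.client.dumps((42.5,), methodresponse=True)
params, _ = xmlrpc.client.loads(response_xml)
print(params[0])  # 42.5
```

Because the payload is text-based XML carried over ordinary HTTP, it passes through the same port 80 that your web server already exposes, which is exactly what makes these calls both convenient and hard for a conventional firewall to distinguish from normal web traffic.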
The two other primary technologies that provide the foundation for web services are universal description, discovery and integration (UDDI) and web services description language (WSDL). UDDI provides an open framework for describing and discovering services offered by various businesses that have “registered” their services (much like the DNS registry today). WSDL is an XML grammar for specifying a public interface for accessing a web service.
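As a rough sketch of what WSDL buys a client, the fragment below is a hand-made, much-simplified WSDL-like document (the service and operation names are invented), parsed with Python's standard XML library to discover the operations the service publishes. A real client toolkit would do considerably more, but the principle is the same: the public interface is machine-readable.

```python
# Discover a service's published operations from a (simplified,
# hypothetical) WSDL document using only the standard library.
import xml.etree.ElementTree as ET

WSDL = """<definitions xmlns="http://schemas.xmlsoap.org/wsdl/">
  <portType name="QuoteService">
    <operation name="getQuote"/>
    <operation name="listSymbols"/>
  </portType>
</definitions>"""

ns = {"wsdl": "http://schemas.xmlsoap.org/wsdl/"}
root = ET.fromstring(WSDL)
operations = [op.get("name") for op in root.findall(".//wsdl:operation", ns)]
print(operations)  # ['getQuote', 'listSymbols']
```

Note that this discoverability cuts both ways: the same description that helps a partner build a client also tells an attacker precisely which operations are exposed.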
The combination of support from major vendors and the fact that the web services infrastructure is already in place makes a strong argument that web services will be successful.
What Does Success Mean?
The success of web services will mean that far more fine-grained services will be offered to a larger user base. We’re already seeing interfaces being offered from organizations like Google and Amazon. It is only natural that, with a simple way to build programmatic interfaces, more companies will be interacting with their partners via web services.
It also means new problems. It’s very likely that companies will use web services to export interfaces to existing applications that run on their internal networks and were designed strictly for internal use. You may ask, “Why would anyone do this?” The answer is opportunity. In every internal service that resides on your network, someone may see an opportunity for revenue in exporting it. The “Why?” has never been the real question; it is the “How?” that has been hard to accomplish up to this point.
This “shortcut to success” may carry a high price tag. Customers, partners, and hackers will be coming through your firewall to use your internal services. This will place an increasing burden on the security mechanisms at your application level because the majority of firewalls can do little or nothing to help. Note that in order to be effective, a firewall has to be an “application” based firewall that understands the web services protocols.
Application developers will need to fully understand the security vulnerabilities that web services introduce and mitigate them at the design level. Historically, application developers have not been effective at designing in compensating controls to deal with security deficiencies in a technology. Web services raises these stakes and places a substantial security responsibility on developers.
Web Services Security Mechanisms
Fortunately, the proponents of web services have recognized the need for application security and have made some progress. The WS-Security specification provides a language for describing integrity, authentication, and authorization information inside the remote procedure calls. Given this framework, it is possible to establish connection security with SSL and authenticate individual RPCs with a variety of authentication mechanisms such as Kerberos, public key, or simple web authentication.
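To give a feel for what "authentication information inside the remote procedure calls" looks like, the sketch below builds a SOAP envelope whose header carries a WS-Security UsernameToken. This is a deliberately simplified illustration: the credentials are placeholders, and a real deployment would use a SOAP toolkit and a password digest or digital signature rather than a plaintext password.

```python
# Sketch of a SOAP envelope carrying a WS-Security UsernameToken
# in its header, built with the standard library's ElementTree.
import xml.etree.ElementTree as ET

SOAP = "http://schemas.xmlsoap.org/soap/envelope/"
WSSE = ("http://docs.oasis-open.org/wss/2004/01/"
        "oasis-200401-wss-wssecurity-secext-1.0.xsd")

envelope = ET.Element(f"{{{SOAP}}}Envelope")
header = ET.SubElement(envelope, f"{{{SOAP}}}Header")
security = ET.SubElement(header, f"{{{WSSE}}}Security")
token = ET.SubElement(security, f"{{{WSSE}}}UsernameToken")
ET.SubElement(token, f"{{{WSSE}}}Username").text = "alice"   # placeholder
ET.SubElement(token, f"{{{WSSE}}}Password").text = "secret"  # placeholder
ET.SubElement(envelope, f"{{{SOAP}}}Body")  # the actual RPC goes here

print(ET.tostring(envelope, encoding="unicode"))
```

The key point is that the security information travels inside the message itself, in the SOAP header, so it survives intact no matter how many intermediaries the message passes through on its way to the service.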
Unfortunately, having a framework to define security and making real applications secure are two different things. If you are creating applications that are inside your organization, you can rely on your locally accepted authentication authority (e.g., Active Directory). However, if you are either using or exporting services on the Internet, the question becomes, “Who will authenticate whom?” The answer depends on what type of service you are building, who the consumers are, and, most importantly, whom the consumers and providers of the service are willing to trust.
When there are only two organizations dealing with one another, they often mutually authenticate using a public key mechanism and trust a certification authority like VeriSign. In some cases, a company offers the service to a set of customers. This model is usually supported by a username/password approach – the customer trusts the service provider. While these types of services follow a “traditional” model, the new generation of web services suggests that a new model will be necessary. Organizations like Microsoft (with Passport) and the Liberty Alliance are making progress toward offering one or more trusted third parties that allow users to authenticate once to the authentication authority and automatically be authenticated to each service they encounter. If web services are as successful as some believe, and as fine-grained as some predict, this type of model will be very attractive.
Problems to Consider
While web services provide a significant leap in the area of functionality and robustness, there are some very critical problems to consider. First, web services encourage low-level interactions with internal services. Are those services designed for external use? Or is the SOAP interface just an internal interface that is now exposed to the general Internet? Is the hardware running the web service hardened like other Internet exposed systems? Was the service written with malicious users in mind (i.e., does it check all input)?
One of the biggest problems is that web services are typically dispatched through a web server, and thus it may be impossible to recognize this powerful interface at the perimeter. If your environment depends on firewalls for protection, it may be completely exposed.
Web Services the Right Way
If you are a web services provider, you should:
- Design services to be externally consumed (i.e., check input thoroughly and assume malicious users).
- Only export the functionality that is necessary for the service (i.e., limit the information transferred and restrict the published interface to the operations that you want outsiders to use).
- Authenticate, encrypt and sign where possible and when practical.
- Isolate the systems providing the service.
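The first item above – assume malicious users – deserves a concrete illustration. The sketch below shows a hypothetical account-lookup operation that validates its one parameter against a whitelist pattern before doing anything else; the service, the ID format, and the rules are all invented for this example. The design choice worth noting is that it accepts only input matching the expected shape, rather than trying to filter out known-bad input.

```python
# Whitelist validation for an exposed RPC parameter: reject anything
# that does not match the expected shape before it reaches internal
# systems. Service and ID format are hypothetical.
import re

ACCOUNT_ID = re.compile(r"^[A-Z]{2}\d{6}$")  # e.g. "AB123456"

def lookup_account(account_id: str) -> dict:
    if not isinstance(account_id, str) or not ACCOUNT_ID.match(account_id):
        raise ValueError("invalid account id")
    # ...only now would the internal back end be queried...
    return {"account": account_id, "status": "ok"}

print(lookup_account("AB123456"))  # accepted
```

Remember that the web server and SOAP dispatcher will happily deliver any string an attacker sends; checks like this, applied to every parameter of every exported operation, are the application-level controls that a conventional firewall cannot provide.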
If you are a web services consumer, you should:
- Evaluate how critical the service is to your product/customers.
- Determine how sensitive the data being requested is.
- Evaluate the interaction model (i.e., are you purely a consumer, or is the relationship peer to peer?).
- Understand the information exchange with the service provider.
- Understand your requirements to store and segregate the data, as well as any privacy mechanisms and agreements.
- Understand how your choice of authentication affects others (i.e., is every consumer required to authenticate the same way? If not, how do weak mechanisms affect strong ones?).
- Understand what other services the provider offers and how its infrastructure relates to those services and to corporate resources. Is the organization capable of living up to its commitments (i.e., is it technically competent and adequately staffed, and does it understand both the design and administrative issues associated with running real business services)?
The Last Word
Web services offer powerful tools for building clean, portable, conveniently accessed services. However, they may introduce new exposures by encouraging access to fine-grained internal services that were not designed for external access. You can avoid some problems by recognizing that these services must be designed to be robust and secure. By properly utilizing the security mechanisms at your disposal (e.g., WS-Security), you will be well on the way to a more secure environment. Finally, you will also need to remember that consuming any business-critical service requires a deep understanding of the design, implementation, and administration of the service and its environment.
Dick Mackey is a principal and Phil Cox a consultant with SystemExperts Corporation (www.systemexperts.com), a provider of network security consulting services.