Taking a Realistic Approach to Remote Authentication

Just under a decade ago, I was asked to write a white paper on the future of telecommunications and networking for a major telecom company which shall remain nameless.

I spoke to several industry gurus who told me that the next 'big thing' in the networking industry was distributed network resources.
At the time, the World Wide Web as we know it today was just a sparkle in the eyes of Jim Clark and Marc Andreessen, the co-founders of Netscape, and it's probable that even Bill Gates had scarcely heard of the Internet. Despite this, the early Internet - offering such delights as time-lag text conferencing, Lynx browsing of users' text-only pages, and Telnet sessions to remote computer systems - was quite commonplace amongst the tech community.

Like the Internet today, the text-only Internet of a decade ago allowed people to share information with one another, as well as interact via email and server-based bulletin board systems. But to share information on the Internet, users today, just like those of a decade ago, have to publish their information on the Net itself, rather than allow access willy-nilly to their own local databases.

Now IBM, Sun and HP are getting behind a new shared information initiative called the computational grid, or Grid for short. The Grid has the same aim the Internet had in the 1980s - to be a pervasive and inexpensive medium through which users can gain access to a wide variety of data. Crucially, plans call for the Grid to also offer access to high-end computational capabilities (much as Telnet already does on the Internet, even though most users never harness its power) as well as shared access to users' local databases.

Meanwhile, back to my white paper. I rambled on about the idea of shared neighborhood servers acting as repositories for information, polling updates regularly from a central resource. The idea wasn't that revolutionary; it was the same concept that early viewdata systems such as the U.K.'s Prestel and France's Minitel used in the 1980s to spread the load across their networks. But the flow of information on viewdata systems of the 1980s was almost entirely one-way - email was still in its infancy in those days, you'll remember.

My white paper envisaged users with home PCs storing their data on local neighborhood servers and sharing elements of it with other users, giving PC users easy access to bus and rail timetables, as well as a variety of other local information. You and I know that much of this data is now available on the web, but it is only there because the companies supplying it have spent time and effort creating the web pages to display it.

Let's fast-forward to today. Grid computing, as IBM calls it, has the ability to reshape the way in which people view the Internet. An early example of the Grid at work is the SETI@home project (www.setiathome.ssl.berkeley.edu), started in the mid-1990s and still going strong. The Search for Extraterrestrial Intelligence (SETI) project centers on a screensaver that uses the idle time on a user's PC to crunch the vast quantities of data downloaded from the radio telescopes the project team uses in its hunt for extraterrestrial intelligence.
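To give a flavour of how such a scheme hangs together, here is a rough Python sketch of a volunteer-computing client in the SETI@home mould. The work server address, the work-unit format and the analysis step are all invented for illustration - the real client is considerably more sophisticated.

# A rough sketch of a volunteer-computing client in the SETI@home mould.
# The work server URL, the work-unit format and the analysis step are all
# invented here for illustration; the real client is far more sophisticated.
import json
import time
import urllib.request

WORK_SERVER = "https://example.org/workunits"  # hypothetical project server


def fetch_work_unit():
    """Download a chunk of telescope data to analyse locally."""
    with urllib.request.urlopen(WORK_SERVER) as resp:
        return json.load(resp)  # e.g. {"id": 42, "samples": [0.1, 0.3, ...]}


def analyse(samples):
    """Stand-in for the real signal processing: flag unusually strong samples."""
    threshold = 3.0 * (sum(samples) / len(samples))
    return [i for i, value in enumerate(samples) if value > threshold]


def report(unit_id, peaks):
    """Post the results back to the project server."""
    body = json.dumps({"id": unit_id, "peaks": peaks}).encode()
    req = urllib.request.Request(WORK_SERVER, data=body, method="POST")
    urllib.request.urlopen(req)


if __name__ == "__main__":
    # The real screensaver only runs this loop while the PC is otherwise idle.
    while True:
        unit = fetch_work_unit()
        report(unit["id"], analyse(unit["samples"]))
        time.sleep(60)  # be polite between work units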

This kind of shared, distributed computing is exactly what the Grid is - or rather, will be - about. IBM is now actively promoting Grid computing as the way forward for giving desktop PC users access to phenomenal processing power for minimal outlay.

But what about the security of the Grid?

It's all very well allowing, for example, Internet users to share file access between themselves, but as the industry has seen with the likes of Gnutella (www.gnutella.com), KaZaA (www.kazaa.com) and Napster (www.napster.com), there are enormous copyright and security issues to be solved before network-based file sharing can go mass market. And, as anyone who has used an always-on Internet connection will attest, there are plenty of users out there who seem to delight in unleashing automated tools that probe desktop PC firewalls for unauthorized access loopholes.

Imagine what would happen if the Internet, by default, allowed full access to files and resources stored on all users' machines. This is the security issue that faces IBM and the other IT majors who are busy talking about laying the foundations of the Grid.

According to David Heard, vice president of SecureLogix (www.securelogix.com), a company whose Enterprise Telephony Management (ETM) system offers organizations a telecommunications firewall, the security problems arising from the kind of convergent technologies the Grid espouses are immense. SecureLogix's ETM platform is currently capable of real-time analysis, blocking and audit logging of voice calls. By the spring of next year, it hopes to release ETM software that can do the same for voice-over Internet protocol (VoIP) calls.

The problem with IP transmissions, Heard argues, is that, without analyzing the data stream, you cannot be sure what type of communication is taking place, never mind what the stream actually contains. From a business point of view, the Grid poses a major security threat, as its benefits will undoubtedly prove highly attractive to the staff of most organizations, just as the Internet has.

The Globus Project (www.globus.org), a research and software development group, is one of the primary drivers in establishing the technical foundations of the Grid. Headed by Ian Foster, one of the Grid's luminaries, of Argonne National Laboratory and the University of Chicago, and Carl Kesselman of the University of Southern California's Information Sciences Institute, the project aims to make Grid computing a reality as far as IT infrastructures are concerned. The Globus Project has already spawned the Open Grid Services Architecture (OGSA), which, among many other things, anticipates the widespread use of public key infrastructure (PKI) technology, as well as the Kerberos network authentication protocol.

Kerberos, in case you haven't come across the technology (see https://web.mit.edu/kerberos/www/#what_is), provides strong authentication for client-server applications such as those that will run across the Internet and the Grid. Although a free implementation of Kerberos is available from the Massachusetts Institute of Technology (www.mit.edu), the technology is hardly what you might call user-friendly. Nor, for that matter, has PKI reached the stage where everyone can use it.
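For readers who want to see the basic shape of the protocol, here is a toy Python model of the Kerberos ticket exchange. It leans on the third-party cryptography package purely to stand in for Kerberos' symmetric encryption, and it skips realms, timestamps, nonces and everything else that makes the real thing robust - treat it as a sketch of the idea, not an implementation.

# Toy model of the Kerberos ticket exchange. The 'cryptography' package
# (pip install cryptography) stands in for Kerberos' symmetric encryption;
# all names and keys are invented for illustration.
import json
from cryptography.fernet import Fernet

# Long-term keys the KDC shares with the client and with the service.
client_key = Fernet.generate_key()
service_key = Fernet.generate_key()


def kdc_issue_ticket(principal):
    """AS/TGS step: give the client a session key, plus a ticket it cannot
    read because it is encrypted under the service's own long-term key."""
    session_key = Fernet.generate_key()
    ticket = Fernet(service_key).encrypt(
        json.dumps({"principal": principal, "key": session_key.decode()}).encode())
    reply = Fernet(client_key).encrypt(session_key)
    return reply, ticket


def service_accept(ticket, authenticator):
    """Service step: decrypt the ticket, recover the session key, and check
    that the authenticator was built with that same session key."""
    contents = json.loads(Fernet(service_key).decrypt(ticket))
    session_key = contents["key"].encode()
    name = Fernet(session_key).decrypt(authenticator).decode()
    return name == contents["principal"]


# Client side: recover the session key and prove knowledge of it.
reply, ticket = kdc_issue_ticket("steve@EXAMPLE.ORG")
session_key = Fernet(client_key).decrypt(reply)
authenticator = Fernet(session_key).encrypt(b"steve@EXAMPLE.ORG")
print(service_accept(ticket, authenticator))  # True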

Fortunately for users of the Internet, many of whom would jump at the chance to shift up to the much higher gear that will be the Grid, there is already a widely available technology that is capable of uniquely identifying a user - the mobile phone.

Although still in its infancy, the use of PKI technology on mobile phones, as seen on several Ericsson and Nokia GSM handsets, is already starting to take off. Both Ericsson and Nokia have staged demonstrations in which users of GSM mobiles buy small-value items, such as cans of soda from vending machines, charging the cost to their mobile account rather than dipping into their pocket or purse for change. The PKI technologies underlying such demonstrations are the foundation for authenticating users across remote applications. Many of the latest mobiles include PKI technology for mobile Internet applications, so it's a relatively easy task to use this technology to authenticate a user accessing the Grid.
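By way of illustration, the sketch below shows the challenge-response idea that underpins such handset-based authentication, again in Python using the third-party cryptography package. In a real deployment the private key would live in the phone's SIM or security module and never leave it; here it is generated in memory purely to show the flow.

# Minimal sketch of handset-based challenge-response authentication.
# The key pair, curve choice and message flow are illustrative assumptions,
# not a description of any particular vendor's implementation.
import os
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

# Enrolment: the handset holds the private key; the Grid service holds the public key.
handset_key = ec.generate_private_key(ec.SECP256R1())
registered_public_key = handset_key.public_key()

# 1. The service issues a random challenge to the user.
challenge = os.urandom(32)

# 2. The handset signs the challenge with its private key.
signature = handset_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

# 3. The service verifies the signature against the registered public key.
try:
    registered_public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
    print("User authenticated")
except InvalidSignature:
    print("Authentication failed")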

Of course, users of the Grid being prepared to authenticate themselves across the network with their mobile phones is one thing - one hopes that the actual transmission system they use will be equally secure.

Pass me that bottle of absinthe, will you?

Steve Gold is news editor for SC Magazine (www.infosecnews.com; www.scmagazine.com).
 
