Tim Mather's article "Why Biometrics Might Just Bite Back" [September 2005] is misleading, inaccurate and reflects thinking at least ten years out of date. First, let me correct a market misconception that a biometric is just a very long password. Biometrics are nothing like passwords. Passwords are semi-secrets known only to the user and the verifier; hence the security of passwords relies primarily on their confidentiality. Biometrics are not secret; individuals leave latent fingerprints everywhere -- their faces are photographed, their voices are recorded, etc. Thus the security of a biometric authentication system by definition cannot rely on confidentiality. Rather, such security must rely on the integrity and authenticity of the biometric data. Hence biometrics, unlike semi-secret passwords, are public information and do not need to be reset.
This is exactly the approach of the American National Standard X9.84 Biometric Information Management and Security. As the incumbent chair of the X9F4 Cryptographic Protocols and Application Security Working Group that developed this standard, I can attest that X9.84 and other national standards are being transformed into international biometric standards.
This brings me to my second point that there are numerous technical and security-related biometric standards under development by several standards bodies, including the American Standards Committee X9, INCITS M1, ISO TC68 and ISO/IEC JTC1 SC27 and SC37. There are standards for common file exchange, biometric application programming interface (BioAPI), and biometric security.
My third point is that biometrics are being widely deployed in the financial services industry and government and have been for the past three to five years. Interestingly, FIPS 201 and the NIST Special Publication 800-76 Biometric Data Specification for Personal Identity Verification ignore some of the existing biometric standards and create non-standard security solutions. I'm rather surprised that SC Magazine would publish such an inappropriate article that is primarily opinion and clearly not based on facts.
Jeff Stapleton, chair, X9F4 Cryptographic Protocols and Application Security Working Group; president and founder of the Information Assurance Consortium; chief cryptographer, architect, Innove LLC
Editor's note: To clarify, "From the CSO's desk," which appears monthly, is an opinion piece submitted by guest columnists.
I understand from Webroot's public relations pages that you gave Webroot Spy Sweeper glowing reviews in an article by Jeff Dodd in the October 2005 issue of your magazine, and that you have nominated it for an award. My experience with Spy Sweeper is not at all consistent with such high evaluations. The company appears to be having, at the very least, growing pains so serious that its tech support is not accessible to users.

I just tried it. I thought $30 was a good price, given the glowing things everyone is saying about it. I tried the trial version and liked it. Last weekend I paid for it, though I could not find clear instructions for doing so. The only option in my trial version took me to a page to order a download, and another copy was added each time I went to that page. Finally I went to that page and paid for one copy. In return I got both a page of download instructions and an email of download instructions. Both communications gave me, three times, a key to enter to tell the program I am registered. Neither set of instructions told me how to enter the key in the trial version I had ALREADY installed, nor did either say to uninstall the trial version and install it all over again.
I used the tech support form on the Webroot website to ask for assistance a week ago yesterday. I got no response.
I have decided not to use this product and to demand a refund. Since I cannot reach tech support, the product is useless to me. I do not know if tech support earned the glowing evaluation it got before this product became so popular, but now tech support is essentially nonexistent. I don't recommend it.
Dora Smith, Austin, TX
A view from the bottom
The article "Code writer's responsibility" [November] calmed me down – now I know we deserve what we have, and will keep deserving it, as long as we accept cybersecurity advisers who blame software developers, personally, for the security problem. A developer is supposed to write code in accordance with the specification. Period. It is the architect or designer who is responsible for embedding security into the architecture and design, as well as for creating development frameworks that strengthen code security. However, designers will consider security only if the business requires it, and pays for it.
Cybersecurity is a business issue. Even if the business assumes that security comes magically by itself, it still has to pay for it. There should also be a spectrum of tools to test security controls (some of them available already), and scheduled project time to create secure frameworks, run security code reviews, simulate intrusions and test protections, all of which are still omitted because the business is, as usual, in a hurry.
Security defines trust, and trust is a fundamental part of any business. Neglecting security exposes the business to a constant risk of broken trust. Who is interested in dealing with a business at risk?
Michael Poulin, software architect