Today’s columnist, Rustom Kanga of iOmniscient, says Americans will have to make some tough ethical choices about face recognition technology as political tensions and divisions mount across the country. (Credit: iOmniscient)

The attack on the U.S. Capitol on January 6 raises many deep ethical questions for companies like us that offer face recognition (FR) systems based on artificial intelligence (AI) to our customers around the world.

Law enforcement agencies in Washington, D.C., and elsewhere are using the technology available to them to identify those who may have been involved in violent criminal activity, such as the events at the Capitol earlier this month.
