Having unenthusiastically acquiesced to the bring-your-own-device (BYOD) revolution and, more recently, to the bring-your-own-app (BYOA) trend, enterprises have begun to recognize that they need to gain more control over permitted mobile app versions, app licensing and its associated costs, and app security. To that end, many have begun considering and, in many cases, implementing their own mobile enterprise app stores (EAS). In fact, this year analyst firm Gartner projected that 25 percent of enterprises will have an EAS in place by 2017.
Enterprises with their own private app store will have a better handle on app licensing and more consistent app versions across enterprise devices. Hopefully, that will ease the IT department's app management burden while cutting costs. However, getting the app security that enterprises need may be more elusive.
Yes, having an EAS reduces the likelihood that employees will install malicious apps. Controlling which apps can be installed, where those apps come from, and limiting the catalog to apps vetted by the enterprise makes the mobile ecosystem safer. And when an EAS can analyze apps for known malware, the enterprise's overall state of security is further improved.
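One of the simplest forms of the malware analysis described above is a hash-based screen run before an app is published to the store. The sketch below is a minimal, hypothetical illustration: the "known bad" digest set is a placeholder (a real EAS would query a threat-intelligence feed), and the function names are assumptions, not any vendor's actual API.

```python
# Hypothetical sketch of a hash-based malware screen an EAS might run
# before publishing an app package. The known-bad set is a placeholder;
# real deployments would consult a maintained threat-intelligence feed.
import hashlib

# Placeholder digest set (this entry is the SHA-256 of an empty file,
# used here purely for demonstration).
KNOWN_BAD_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_malware(package_bytes: bytes) -> bool:
    """Return True if the package's SHA-256 matches a known-bad digest."""
    digest = hashlib.sha256(package_bytes).hexdigest()
    return digest in KNOWN_BAD_SHA256
```

Hash matching only catches packages already identified as malicious, which is one reason the article goes on to argue that it cannot be the whole vetting story.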
Yet, there is more to worry about when it comes to app security than maliciousness. Sure, some apps are malicious and make headlines, but what percentage of the mobile apps that enterprise users actually run wind up being malicious? How much of a threat is that? It may become more of a problem as anti-malware systems struggle to detect things like malicious mobile ad networks that get linked into apps at run time. But for the most part, malicious mobile apps aren't a problem for enterprises with an EAS.
But it turns out that the level of security an EAS provides is only as good as the vetting the enterprise performs on each individual app.
A large multinational may already have many custom-built applications, and it's very likely that no single employee has a handle on all of them. Once those apps do somehow get corralled into the EAS, one wonders what testing was done on them, and by whom.
Unfortunately, most testing tools look at code quality, not at how the code actually behaves. Can the enterprise say with certainty that its apps are invulnerable to attacks, follow best practices, aren't over-permissioned, and adequately protect stored and transmitted sensitive information?
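The over-permissioning question, at least, lends itself to automation during vetting. As a minimal sketch, the check below parses an Android manifest and flags any requested permission outside an enterprise-approved allowlist. The allowlist here is hypothetical; a real policy would come from the security team, and a full review would also cover how the app uses the permissions it is granted.

```python
# Minimal sketch of one vetting check: flag Android permissions an app
# requests beyond an enterprise-approved allowlist. The allowlist below
# is a stand-in, not a recommended policy.
import xml.etree.ElementTree as ET

# ElementTree expands namespaced attributes to this Clark-notation form.
ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

APPROVED = {
    "android.permission.INTERNET",
    "android.permission.ACCESS_NETWORK_STATE",
}

def excess_permissions(manifest_xml: str) -> list[str]:
    """Return requested permissions that fall outside the allowlist."""
    root = ET.fromstring(manifest_xml)
    requested = {
        elem.get(ANDROID_NS + "name")
        for elem in root.iter("uses-permission")
    }
    return sorted(requested - APPROVED)

sample_manifest = """<manifest
    xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.READ_CONTACTS"/>
</manifest>"""
```

A check like this is cheap to run on every submission, but it only answers one of the questions above; the others still require human review.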
Take, for example, a hospital system with a network of physicians covering a wide geographical area. Doctors may be recommending a conflicting set of apps for asthma, diabetes care, etc. Yes, an EAS solves this app-consistency problem when the organization standardizes on a few appropriately vetted apps. But, even then, how well versed in HIPAA regulations are the third-party app developers? Did the developer take into consideration the FDA's latest requirements in terms of mobile apps? What sort of processes and procedures do the app developers have in place to verify that the apps take industry regulatory compliance requirements into consideration? Will developers be around a month – or year – from now to provide support for the apps on which the enterprise standardized?
In the end, it's not the enterprise app store's raison d'être to provide all the needed mobile app security. Each app has to be individually tested and analyzed. There's no way around this if proper security and privacy protection are goals for the apps in an enterprise's app store.