Mobile applications have become a rite of passage for many enterprise development teams, at least until those teams face the reality of supporting multiple devices and platforms. If your company doesn't have one or two enterprise mobile applications, the perception is that the organization is being left behind.
Porting a desktop application to mobile doesn't always fill an immediate business need, however, and developing new mobile software can add fundamental security and privacy concerns, especially when these applications take advantage of third-party services such as mapping, or of device capabilities such as sensors (GPS, cameras, accelerometers), Bluetooth and Near Field Communication. Are IT security teams actually thinking through the ramifications before the company starts building the next Snapchat photo messaging or Chase Mobile banking application?
Enterprise organizations have to look at a number of security and privacy issues before their developers even begin to write code. These security gaps stretch from the distributed architecture of the platform to the privacy concerns that could make an application fodder for the next set of headlines (as Snapchat discovered when it ignored security warnings and its software exposed users' names and phone numbers).
Keep in mind that these security and privacy concerns are fundamental to the idea of a mobile application. They do not change whether the development team is targeting Google Android, Apple iOS or BlackBerry 10. (OK, we know that last one is not happening …) Enterprises need to evaluate the development platform and the security testing that developers and quality assurance teams will perform, and ensure that everyone understands the privacy concerns up front.
The first concern an organization needs to address is which development platforms will be used. This discussion is not about whether the application should support iOS, Android or some other mobile platform (even though that question will be considered at some point). It is about deciding whether developers will write native code for each mobile platform, using each mobile operating system's native development environment.
Early on, the only option was to build native applications for each device's mobile platform, which meant that for iOS smartphones and tablets, the developer would program in Objective-C using Apple Xcode, and for Android devices the application would typically be written in Java using Eclipse (or some other IDE) with the Android Development Tools plug-in. This development model has its pros and cons. The application runs natively, so as long as it's well written, it performs as fast and stably as the platform allows. This approach also allows full access to native interfaces and features within the mobile platform. The primary drawback to native development is that the software development tools and processes differ for each platform. Therefore, the IT organization needs to build integrated development and test environments for each mobile platform it wants to support. The developers also have to know how to program within each platform and understand its vulnerabilities across multiple versions of the operating system. For this reason, most applications get rolled out to one mobile platform first and then ported to others.
The second fundamental concern an organization must address is security testing. If IT security teams are going to expose the application, its data and the back-end services to the Internet, they have to know that it's prepared for the potential onslaught of malicious actors and curious users. With every interface a potential source of attack, development teams need to make sure they understand the risks these applications add and the vulnerabilities that exist.
As with the development platform, the organization must create a testing plan that evaluates the security of the application, and this plan should be developed before the first line of code is written. It is harder and more expensive to bolt security onto the application after it's deployed, or even once development has started. Keep in mind that security testing techniques used against Web applications can be leveraged to evaluate the app's back-end services, because those services are typically Web-based.
The security testing also needs to determine what issues may exist on enterprise smartphones and tablets, as well as on employees' bring-your-own devices. Does the application leverage the encryption available on the device, or does it store sensitive information outside of a secure area on the phone? Teams need to ensure that the security testing covers at least the major known mobile risks. One great source of testing guidance is the OWASP Mobile Security Project. (Full disclosure: one of my open source projects, MobiSec, is part of this overarching project.) This is where the Top 10 Mobile Risks list is maintained; it runs the gamut from weak server-side controls and insecure data storage to improper session handling and missing binary protections.
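To make the insecure-data-storage risk concrete, here is a minimal sketch of the kind of check a testing plan might include: walking a copy of an app's local storage directory and flagging files that appear to contain credentials or personal data in plain text. The directory layout and the SUSPECT_PATTERNS rules are illustrative assumptions for this article, not part of any OWASP tool.

```python
import re
from pathlib import Path

# Illustrative patterns suggesting sensitive data stored in plain text.
# A real test plan would maintain a much richer ruleset.
SUSPECT_PATTERNS = {
    "password": re.compile(r"password\s*[:=]", re.IGNORECASE),
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "api_key": re.compile(r"api[_-]?key\s*[:=]", re.IGNORECASE),
}

def scan_for_plaintext_secrets(app_data_dir):
    """Walk an app's local storage directory and return (file, pattern)
    pairs for every file whose contents match a suspect pattern."""
    findings = []
    for path in Path(app_data_dir).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; skip rather than abort the scan
        for name, pattern in SUSPECT_PATTERNS.items():
            if pattern.search(text):
                findings.append((str(path), name))
    return findings
```

Any hit is a prompt for a follow-up question, not proof of a vulnerability: the point is to force the team to explain why that data is on the device unprotected.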
Finally, enterprise organizations need to consider the privacy concerns that exist within the mobile space. Privacy is a major issue today. With headlines about the National Security Agency infiltrating personal data on smartphones, or Snapchat and Starbucks exposing user data, organizations have to get in front of potential privacy issues before they start adding enterprise mobile applications to the fold. These concerns can be categorized within two overarching issues: what you ask for and what you store.
What you ask for is a commonly overlooked part of mobile development. As applications are developed, they often require access to features of the phone; an application may want to use the GPS sensor to provide location-based information, for example. When the application uses these features, the user is prompted for permission (or has already granted it during installation). If an application asks for too many permissions, users start to worry about what the organization plans to do with that access. This type of response can cause an uproar online that reflects badly on the organization and threatens its brand, as LinkedIn discovered after a recent mobile application upgrade asked for extended permissions to calendar and personal information.
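One way to keep "what you ask for" in check is to audit the requested permissions against an explicit allow-list before each release. The sketch below parses an Android manifest and flags anything the team has not justified; the APPROVED_PERMISSIONS set here is a hypothetical example for illustration, not a recommended baseline.

```python
import xml.etree.ElementTree as ET

# Android manifest attributes are namespaced under this URI.
ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

# Hypothetical allow-list: only permissions the product team has justified.
APPROVED_PERMISSIONS = {
    "android.permission.INTERNET",
    "android.permission.ACCESS_FINE_LOCATION",
}

def audit_permissions(manifest_xml):
    """Return requested permissions missing from the approved list, so
    reviewers must justify each one before the app ships."""
    root = ET.fromstring(manifest_xml)
    requested = [
        elem.attrib.get(ANDROID_NS + "name", "")
        for elem in root.iter("uses-permission")
    ]
    return [p for p in requested if p not in APPROVED_PERMISSIONS]

SAMPLE_MANIFEST = """<manifest
    xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.READ_CALENDAR"/>
  <uses-permission android:name="android.permission.READ_CONTACTS"/>
</manifest>"""
```

Wiring a check like this into the build pipeline means an over-broad permission request becomes a failed build and a conversation, rather than a headline.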
What you store is another privacy concern. Applications often store sensitive data on mobile devices, and if this is not handled correctly and safely, it can cause a significant problem for the organization. Starbucks' iPhone mobile payment application was recently found to store users' names, passwords, email addresses and GPS location data in plain text. This not only caused a backlash against the company; it led people to question whether other data security issues existed within the application.
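The standard remedy for the plain-text password problem is to never store the password at all, only a random salt and a slow, salted hash of it. Here is a minimal sketch using Python's standard library; it illustrates the general technique, not any specific vendor's fix.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted PBKDF2 hash. Only the (salt, digest) pair is
    stored; the password itself is never written to disk."""
    if salt is None:
        salt = os.urandom(16)  # fresh random salt per user
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode("utf-8"), salt, 200_000
    )
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Recompute the hash with the stored salt and compare in
    constant time to avoid timing leaks."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored_digest)
```

On an actual device the (salt, digest) pair would still live inside the platform's protected storage (the iOS Keychain or the Android Keystore), so even a stolen backup reveals nothing directly reusable.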
The bandwagon can be a great place
Mobile applications are popular today and heightened security issues shouldn't prevent enterprise adoption. If an organization thinks through its development and security testing plans, and developers build mobile applications that meet identifiable business needs, the benefits often outweigh the concerns. Personally, I am going to start building out an Apache Cordova application because it will run easily on so many different platforms. <grin>
About the author:
Kevin Johnson is the founder and CEO of Secure Ideas, an IT security consulting firm specializing in identifying companies' cybersecurity vulnerabilities. In a career spanning over 20 years, Kevin has worn almost every imaginable IT security hat, including instructor, consultant, public speaker, administrator and architect. You can find him on Twitter at @secureideas.