Mobile developers building applications for Android devices are making many of the same mistakes as enterprise developers, and those poor coding practices may be rendering encryption and other security features ineffective, according to a new study. Flawed apps coupled with Android vulnerabilities could be just the right recipe for an attack, the study found.
An analysis of mobile applications conducted by Veracode Inc., a Burlington, Mass.-based application testing company, found that 40% of Android applications contain at least one hard-coded cryptographic key. The practice gives every user of an application the same encryption key, much as if everyone in an organization used the same password to secure their data, said Chris Wysopal, co-founder and CTO of Veracode. Because Android applications are easy to decompile, an attacker can readily extract and publicize hard-coded keys, Wysopal said.
“If someone loses their phone and an attacker gets access to that application, the attacker could basically get access to all the data that everyone in the organization can access,” Wysopal said.
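To illustrate the anti-pattern Wysopal describes, here is a minimal sketch in Java. The class name, key value and data are hypothetical, and the fixed IV is shown only to keep the example short; the point is that the key is compiled into the app, so every installed copy carries it and a decompiler exposes it.

```java
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;

public class HardCodedKeyDemo {
    // ANTI-PATTERN: every copy of the app ships this identical key.
    // Decompiling the APK's bytecode reveals it to any attacker.
    static final byte[] SHARED_KEY = "0123456789abcdef".getBytes(StandardCharsets.UTF_8);
    static final byte[] IV = new byte[16]; // fixed IV, also bad practice

    static byte[] crypt(int mode, byte[] data) throws Exception {
        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
        c.init(mode, new SecretKeySpec(SHARED_KEY, "AES"), new IvParameterSpec(IV));
        return c.doFinal(data);
    }

    public static void main(String[] args) throws Exception {
        byte[] ct = crypt(Cipher.ENCRYPT_MODE,
                "corporate records".getBytes(StandardCharsets.UTF_8));
        // Any attacker who extracts SHARED_KEY from the APK can do exactly this:
        byte[] pt = crypt(Cipher.DECRYPT_MODE, ct);
        System.out.println(new String(pt, StandardCharsets.UTF_8));
    }
}
```

Because the same key protects every user's data, recovering it once is equivalent to learning the shared password Wysopal compares it to.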
The Veracode analysis focused mainly on enterprise-grade Android applications used by employees of financial and health care firms. Wysopal said developers sometimes hard-code cryptographic keys to simplify development of the application.
“They don’t want to reinvent their authentication system when they move to mobile,” Wysopal said. “They’re not typically writing a from-scratch system, they’re building a new thick client for an older system that was probably client-server and then became Web and is now mobile.”
The practice of hard-coding the encryption key is more common in Web applications running on the server side, where the server is authenticating to back-end data sources. The risk of an attacker gaining access to the cryptographic key is much lower because the only people with access to the Web server are the administrators running the server, Wysopal said. “With mobile everyone has access to the software with the key in it,” he said.
Coding errors abound in mobile apps because the tools and frameworks for building them are less mature, Wysopal said. Specialty mobile application development firms are getting the bulk of the work because the development teams at most organizations are highly skilled in developing J2EE, Java or .NET Web applications, not mobile applications. As a result, mobile apps are being built outside the organization and at a much quicker pace, Wysopal said.
“We have customers who tell us they actually built their mobile app in two weeks,” Wysopal said. “That’s an indicator that a lot of security thinking isn’t going into this kind of development.”
Wysopal, who is giving a presentation on mobile security at the 2012 RSA Conference, said organizations are just beginning to understand the new risks of mobile platforms and how they are going to be attacked. It took about five years to understand the most problematic flaws in Web applications, Wysopal said, and today SQL injection and cross-site scripting continue to plague Web applications.
“It will take a few years to understand how these attacks are going to occur on mobile and how we can prevent them,” he said. “Even though we don’t know how everything is going to get attacked, we can still be prudent in how we go about building mobile applications.”
In addition to hard-coded passwords, the coding errors that can cause the most problems on mobile devices include sensitive data leakage, unsafe data storage and insecure data transmission, according to Veracode. Functionality issues can also cause problems: poorly coded apps can be altered by an attacker to conduct activity monitoring or send unauthorized text messages.
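One common way to avoid shipping a shared secret — a sketch of a standard technique, not a specific Veracode recommendation — is to derive each user's key from that user's own credentials at runtime, so no key material appears in the code at all:

```java
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import java.security.SecureRandom;
import java.util.Arrays;

public class PerUserKeyDemo {
    // Derive a 256-bit AES key from a passphrase and a random per-user salt
    // using PBKDF2. Nothing secret is compiled into the application.
    static byte[] deriveKey(char[] passphrase, byte[] salt) throws Exception {
        PBEKeySpec spec = new PBEKeySpec(passphrase, salt, 100_000, 256);
        return SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
                .generateSecret(spec).getEncoded();
    }

    public static void main(String[] args) throws Exception {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt); // stored alongside the data, not secret
        byte[] k1 = deriveKey("correct horse".toCharArray(), salt);
        byte[] k2 = deriveKey("correct horse".toCharArray(), salt);
        byte[] k3 = deriveKey("wrong phrase".toCharArray(), salt);
        System.out.println(Arrays.equals(k1, k2)); // same inputs -> same key
        System.out.println(Arrays.equals(k1, k3)); // different passphrase -> different key
    }
}
```

Decompiling an app built this way yields only the derivation algorithm, not any usable key, which removes the "one key for every user" failure mode described above.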
Some security features that have been built into the platforms – code signing, sandboxing and permission notifications – could vastly improve security if used properly by software developers, Wysopal said. Code signing enables users to verify that an app comes from a legitimate source; sandboxing isolates applications from critical processes; and permission notifications warn users when an application wants to access location data, messaging or other data sources. All of these security features have weaknesses, but they are a good start, Wysopal said.
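The permission notifications Wysopal mentions are driven by declarations an app must make up front. A hypothetical app that reads location data and sends text messages, for example, would carry entries like these in its AndroidManifest.xml (an illustrative fragment, not taken from any app in the study):

```xml
<!-- AndroidManifest.xml (fragment). The platform surfaces these requests
     to the user, which is the "permission notification" described above. -->
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.SEND_SMS" />
```

This is also why an altered app that sends unauthorized text messages, as in the earlier example, may still look legitimate: if the original app already declared the SEND_SMS permission, the user sees no new warning.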
“I think a lot of these things are good conceptually, but in practice it falls down because there’s not a good enforcement framework around them,” he said.