
Embedded application security: Inside OWASP's best practices

OWASP released a draft of new guidelines for creating secure code within embedded software. Expert Judith Myerson discusses best practices, pitfalls to avoid and auditing tools.

Due to the growth of the internet of things industry, the Open Web Application Security Project drafted a set of guidelines for creating secure code within embedded software.

While the guidelines are meant to increase security standards, there are still several best practices that should be followed, along with potential pitfalls to avoid. In this tip, we'll start by exploring the recommended best practices from the Open Web Application Security Project's (OWASP) Embedded Application Security project.

Best practices for embedded application security

First, memory must be protected against buffer and stack overflows, along with other vulnerabilities. Unsafe C functions -- strcat, strcpy, sprintf and scanf -- must be avoided; strncat, for example, can mitigate strcat's overflow problem by bounding the number of bytes copied.

The caller specifies the maximum number of bytes to copy to the receiving buffer, and that maximum should not exceed the size of the buffer, minus one byte for the null terminator. The recommended format is as follows:

char buffer[SOME_SIZE] = "";  /* initialize so strncat appends to an empty string */
strncat(buffer, SOME_DATA, sizeof(buffer) - 1);
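This pattern can be wrapped in a small helper so the size arithmetic lives in one place. The sketch below is illustrative only; the `safe_append` name and the buffer sizes are assumptions, not part of OWASP's guidance:

```c
#include <string.h>

/* Illustrative helper: append src to dst without overflowing.
   dstsize is the total size of the dst buffer, including room
   for the null terminator. Returns the resulting string length. */
size_t safe_append(char *dst, size_t dstsize, const char *src) {
    size_t used = strlen(dst);
    if (used + 1 < dstsize) {
        /* Copy at most the remaining space, minus one byte for the
           terminator; strncat always null-terminates the result. */
        strncat(dst, src, dstsize - used - 1);
    }
    return strlen(dst);
}
```

With an 8-byte buffer already holding "abc", appending a long string stops at 7 characters plus the terminator instead of overrunning the buffer.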

Furthermore, embedded application security must protect against code injections, both malicious and accidental. All data and user input must be validated and sanitized, as OS command injections occur more frequently than cross-site scripting, SQL injections and XPath injections. The OWASP guidelines recommend avoiding calls to a command processor via system(), exec() or ShellExecute(), and instead creating a whitelist of accepted commands via a lookup map.
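A minimal sketch of such a lookup map follows. The command names and action functions are hypothetical; the point is that user input only selects from pre-approved actions and is never handed to a shell:

```c
#include <string.h>
#include <stddef.h>

/* Hypothetical whitelist: map user-facing command names to fixed,
   pre-approved actions instead of passing input to system(). */
typedef void (*action_fn)(void);

static void do_reboot(void) { /* device-specific reboot logic */ }
static void do_status(void) { /* device-specific status logic */ }

struct command { const char *name; action_fn run; };

static const struct command whitelist[] = {
    { "reboot", do_reboot },
    { "status", do_status },
};

/* Runs the action and returns 1 if the input exactly matches a
   whitelisted command name; returns 0 (rejects) otherwise. */
int dispatch(const char *input) {
    for (size_t i = 0; i < sizeof(whitelist) / sizeof(whitelist[0]); i++) {
        if (strcmp(input, whitelist[i].name) == 0) {
            whitelist[i].run();
            return 1;
        }
    }
    return 0;
}
```

Because matching is exact, an injection attempt such as "reboot; rm -rf /" fails the lookup and is rejected outright.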


Likewise, downloadable firmware updates shouldn't be modifiable: updates must consist of cryptographically signed firmware images and should be downloaded over the most recent TLS version. Anti-rollback protection should also be put in place to prevent reverting to a vulnerable version of the embedded software.
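The anti-rollback part of that check is conceptually simple, assuming the installed version number is recorded somewhere an attacker can't rewind, such as an eFuse or monotonic counter. The function name and version encoding below are illustrative:

```c
#include <stdint.h>

/* Hypothetical anti-rollback check: accept a candidate update only if
   its version is strictly newer than the version recorded in secure
   storage. Verifying the image's cryptographic signature would happen
   before this step ever runs. */
int update_allowed(uint32_t installed_version, uint32_t candidate_version) {
    return candidate_version > installed_version;
}
```

Equal or older versions are refused, so an attacker can't push a signed-but-vulnerable older image back onto the device.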

Sensitive information should also be secured, and certificates and passwords should not be hardcoded into devices or written to disk.

To further improve security, secrets should not be kept in unprotected storage, such as external EEPROM or flash. Hardware security elements or a trusted execution environment, if available, should be used to store sensitive data. If these features are not available, strong cryptography should be used, and any sensitive data that must exist in clear text should be confined to volatile memory and cleared as soon as possible.
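Clearing clear-text secrets from volatile memory has a well-known pitfall: a plain memset may be optimized away when the compiler sees the buffer is never read again. A common workaround, sketched below with an illustrative `secure_zero` name, is to write through a volatile pointer (C11 Annex K's memset_s or the platform's explicit_bzero are preferable where available):

```c
#include <stddef.h>

/* Best-effort zeroization: writing through a volatile pointer prevents
   the compiler from eliding the stores as a dead-store optimization. */
void secure_zero(void *buf, size_t len) {
    volatile unsigned char *p = buf;
    while (len--) {
        *p++ = 0;
    }
}
```

Call it on key material and password buffers the moment they are no longer needed, before the stack frame or heap block is released.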

Since passwords are often considered sensitive data, enforcing password policies is crucial, as they protect individual identities. Weak passwords shouldn't be allowed, and users should know not to send passwords over insecure protocols, such as HTTP and FTP. Similarly, storing session IDs in a cookie is safe only when the HttpOnly flag for the cookie is set.
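For a device exposing an embedded web interface, the HttpOnly flag is set when the session cookie header is emitted. The sketch below is an assumption about how such a header might be built; the `make_session_cookie` helper and SESSIONID name are illustrative:

```c
#include <stdio.h>
#include <string.h>

/* Build a Set-Cookie header that marks the session ID HttpOnly (not
   readable by client-side script) and Secure (sent only over TLS).
   Returns 1 on success, 0 if the output buffer is too small. */
int make_session_cookie(char *out, size_t outsize, const char *session_id) {
    int n = snprintf(out, outsize,
                     "Set-Cookie: SESSIONID=%s; HttpOnly; Secure",
                     session_id);
    return n > 0 && (size_t)n < outsize;
}
```

The Secure attribute is included alongside HttpOnly so the cookie is never sent over plain HTTP in the first place.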

Another way to ensure platform security is to harden embedded frameworks. Linux build systems are often used to restrict frameworks and C-based toolchains to only the libraries and functions needed to build the firmware. Tools such as Lynis should be used to audit hardening, and insecure libraries, dead code, unused shell interpreters and unneeded file compression utilities must be removed.

Backdoor accounts should also be removed if they give root privileges to developers to debug and test code. Furthermore, OEMs should sign a master service agreement that signifies backdoors have been removed. The agreement should also confirm that all code has been reviewed for software security vulnerabilities prior to firmware release and, if possible, embedded device firmware analysis tools, such as Binwalk, should be used.

To further ensure that sensitive data is not tampered with, the latest versions of TLS should be used, as deprecated SSL and early TLS versions are not acceptable. Likewise, private keys and certificates must be secured before they are stored on disk, and the maximum certificate lifetime before expiration should be set appropriately. OpenSSL library functions can also be used to validate a basic certificate against a root certificate.
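The expiration piece of certificate validation reduces to a validity-window comparison, which full chain validation (e.g., OpenSSL's X509_verify_cert) performs against the certificate's notBefore and notAfter fields. A standalone sketch of just that check, with an illustrative function name:

```c
#include <time.h>

/* Hypothetical validity-window check mirroring X.509 notBefore /
   notAfter enforcement: a certificate is usable only between its
   start and expiry times. Chain-of-trust and signature verification
   would be layered on top of this. */
int cert_time_valid(time_t not_before, time_t not_after, time_t now) {
    return now >= not_before && now <= not_after;
}
```

A device should also fail closed here: if its clock is unset or untrusted, accepting an expired certificate defeats the check.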

Personally identifiable information and sensitive personal information should also be collected and stored securely, and disposed of as soon as it is no longer needed. The device owner should perform a factory reset before transferring the device to another user. If the data belongs to users in the European Union, General Data Protection Regulation guidelines should be followed to avoid noncompliance penalties.

The last best practice is keeping third-party code and components updated to protect against vulnerabilities after the tool chain is set up. You should also check the National Vulnerability Database (NVD) and Open Hub for new vulnerabilities.

Auditing tools include Retire.js for JavaScript libraries, OWASP ZAP for web application testing, Lynis for basic kernel hardening audits, package managers such as opkg, and LibScanner for enumerating dependencies and cross-referencing them with the NVD.

Potential pitfalls

When put to use, OWASP's guidelines can greatly improve security. However, there are several pitfalls to the new rules.

First and foremost, enterprises shouldn't rely on firmware being secure simply because malicious third-party code is supposed to be removed before firmware is released to market. Backdoors that give developers administrative rights to debug vulnerabilities should be verified as eliminated, and firmware update downloads should be protected so that hackers cannot access the code.

In terms of auditing tools, it is tempting to assume a Linux kernel doesn't have security issues, since the National Institute of Standards and Technology recommends Lynis to harden and audit a basic Linux kernel. However, the enterprise should still check the NVD for the latest Linux issues and verify whether flaws found in embedded software on Linux-based devices have been fixed.

While casual attackers may not be able to reverse-engineer code, skilled hackers can reverse-engineer firmware binary updates that are downloaded in full. Delivering only recent changes to the binary firmware can discourage reverse engineering, but enterprises should also remember that a hacker can simply buy a vulnerable device, download its code and reverse-engineer it to exploit a flaw.


With new internet of things and connected devices hitting networks every day, along with emerging threats that seek to exploit them, securing embedded software has become increasingly important. As OWASP's embedded application security standard continues to evolve, new technology will continue to emerge and challenge the standards set in place.

This was last published in February 2018
