Much of the software that runs the internet and the devices we use every day follows the open source model: the code is available for anyone to study, change and distribute. Everyone uses applications that are either open source or include open source code; commercial applications typically contain only about 65% custom code, with the remainder drawn from third-party and open source components. Apache, OpenSSL and MySQL are open source, as are development frameworks and libraries such as Bootstrap, WordPress, Play Framework and Node.js. Two of the best-known operating systems, Linux and Android, are also open source, but it still came as a huge shock when Apple decided for the first time not to encrypt the central core, or kernel, of the preview version of iOS 10. Apple certainly hasn't decided to make iOS open source -- at most it is now partly source-available -- but what are the security pros and cons of allowing access to such sensitive code? In this tip, we look at the potential benefits and drawbacks of leaving the iOS kernel unencrypted.
Allowing anyone and everyone to view such critical code is a significant move by the normally secretive Apple; some observers even thought it might have been a mistake. Apple eventually confirmed that it intentionally left the iOS kernel unencrypted and made the preview code source-available -- not for security reasons, but to boost performance. An Apple spokesman said: "By unencrypting it, we're able to optimize the operating system's performance without compromising security." Those last three words are the subject of huge debate.
Many argue that opening a program's source code to public scrutiny results in more secure software than keeping it unpublished -- closed source, or proprietary, software. The rationale is that public scrutiny ensures almost every problem and fix will be obvious to someone -- this is Linus' Law: "given enough eyeballs, all bugs are shallow." Of the 4,000-plus vulnerabilities reported each year through NIST, only about 5% are found by automated scanning tools; the rest are uncovered by dedicated research teams. For the open source model to work, however, it requires large, active groups of skilled developers and researchers. The Heartbleed vulnerability was a surprisingly small bug -- a missing bounds check -- in OpenSSL's implementation of the TLS heartbeat extension, yet it went undetected for two years before anyone spotted what turned out to be a classic coding error.
Apple's iOS kernel will certainly not be short of people willing to pore over the code looking for flaws -- researchers and developers seeking to find and report security weaknesses, and hackers seeking to find and exploit them. Opening up the code will mean more flaws are found and disclosed to Apple so it can fix them, which in turn makes it harder for hackers and government agencies to exploit knowledge of vulnerabilities over long periods. As new security and privacy features are added, they will be reviewed and tested to see whether they actually work or can be circumvented -- a free code review for Apple. Apple also recently joined the likes of Google, Microsoft and other big vendors in operating its own bug bounty program, offering cash payments to anyone who discloses critical vulnerabilities in its products; rewards for certain flaws run as high as $200,000. This means some who discover important security vulnerabilities may turn them over to Apple instead of selling their findings elsewhere, since the incentives for finding and selling exploits can be much higher than those for finding and publishing them. The FBI reportedly paid more than $1.3 million to an unidentified third party for a way to open the San Bernardino iPhone after Apple refused to help the agency.
Many enterprises were concerned when Microsoft open-sourced its .NET Framework in 2014. The number of reported vulnerabilities did rise slightly in 2015, but 2016 has so far seen an average number, and there was certainly no immediate outbreak of attacks exploiting newly discovered flaws after the source code was made public.
Open source code lets hackers explore weaknesses more easily than closed source does, but no statistics or surveys prove that either model is inherently more secure. Allowing anyone to see the source code increases both the chances of researchers finding vulnerabilities -- and Apple fixing them -- and of hackers finding and exploiting them. As always, finding software vulnerabilities remains a big part of the digital arms race.