I recently heard about the "Melbourne Shuffle" algorithm, which can reportedly improve data access pattern security. Can you please explain how it works and why enterprises might find it useful?
Before I explain what the Melbourne Shuffle is -- other than a popular dance move -- it's probably best if I describe the problem it's trying to solve.
Professional cloud service providers go to great lengths to protect their data centers, but with the amount of data they hold, it's become worthwhile for certain groups to expend a considerable amount of time and effort trying to find ways to illicitly access it. As more and more enterprises store ever-increasing amounts of data in the cloud, improving the security controls protecting that data has become a major priority.
Sensitive data should always be encrypted, as encryption provides a means of controlling access to it; be aware, though, that encryption doesn't necessarily prevent a determined attacker from gathering information about the encrypted data. In other words, encryption may make data files unreadable, but it doesn't hide the way they are accessed. By eavesdropping on users accessing data stored in the cloud, an attacker may be able to uncover access patterns that reveal information about the content of the data. For example, prior to a company making stock price-sensitive announcements, there may be an increase in the number of times files at certain locations are accessed. An observant attacker could spot this correlation and potentially take advantage of it, even without any idea of what information those particular files contain. Access patterns can also give away the types of programs being run against data in a database.
This type of attack is quite sophisticated and requires specialist skills, but it has to be taken seriously by enterprises storing large amounts of sensitive data in the cloud. To combat this form of snooping, enterprises need to conceal data access patterns. For example, they could mix real accesses with a sequence of random "dummy" accesses, or they could continually move items around in the server's memory space so files do not permanently reside in the same place. Data-oblivious algorithms can shuffle and move items to new locations that are independent of their previous ones, while hiding the correlation between the two. However, this can be expensive, slow access times and increase costs where hosting charges are based on the number of data requests.
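The dummy-access idea described above can be sketched in a few lines of Python. This is purely an illustrative toy -- the function name and dictionary-based storage are my own inventions, and a real deployment would use a proper ORAM-style construction rather than plain dummy reads:

```python
import random

def oblivious_read(storage, real_key, num_dummies=3):
    """Fetch real_key while also issuing random dummy reads, so an
    eavesdropper watching the request stream cannot tell which of the
    accesses was the real one. (Toy sketch, not a production scheme.)"""
    # Pick some decoy keys, then hide the real request among them.
    keys = random.sample([k for k in storage if k != real_key], num_dummies)
    keys.append(real_key)
    random.shuffle(keys)  # request order reveals nothing about the target
    results = {k: storage[k] for k in keys}  # every request hits the server
    return results[real_key]
```

Note the cost trade-off the answer mentions: every real read now generates `num_dummies` extra requests, which matters when the provider bills per request.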
Now, as to your initial question, the Melbourne Shuffle is an algorithm that its authors believe will improve existing oblivious storage options as it's computationally more efficient than previous models.
The algorithm, published by Olga Ohrimenko, Michael T. Goodrich, Roberto Tamassia and Eli Upfal in a paper entitled "The Melbourne Shuffle: Improving Oblivious Storage in the Cloud," works by moving small pieces of data from the cloud server into the user's local memory, where they are rearranged before being returned to the server. This process is repeated until all the data has been moved to a new location on the cloud server. It makes data access pattern recognition ineffectual: even when a user accesses the same data over and over, that data keeps being moved to a different location.
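The overall shape of the process -- download batches small enough to fit in private local memory, permute locally, write back to fresh server positions -- can be sketched as follows. To be clear, this is a simplified illustration and not the authors' actual algorithm, which additionally uses dummy items and carefully distributed writes so that the write pattern itself is provably oblivious; the names here are my own:

```python
import random

def shuffle_in_passes(server, chunk_size, seed):
    """Toy pass-based shuffle: pull small batches into local memory,
    rearrange them there, and write each item back to a new server slot.
    In this naive form the write positions leak the permutation; the
    real Melbourne Shuffle hides them with dummy items and extra passes."""
    rng = random.Random(seed)
    n = len(server)
    perm = list(range(n))
    rng.shuffle(perm)  # secret target position for each item
    new_server = [None] * n
    for start in range(0, n, chunk_size):
        local = server[start:start + chunk_size]  # download to local memory
        for offset, item in enumerate(local):     # rearrange locally
            new_server[perm[start + offset]] = item  # upload to new slot
    return new_server
```

The point of the chunking is that the client never needs more than `chunk_size` items of private memory at once, which is what makes the approach practical against large cloud-hosted datasets.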
The authors claim their algorithm is provably secure and computationally more efficient than previous methods, so for enterprises with high security requirements, it holds the promise of improving data security; but it remains to be seen whether the algorithm will be integrated into any commercially available products. The researchers see it being deployed as an on-premises software application or hardware device, or in some form of user-controlled, tamper-proof chip installed in the cloud provider's data center.
Ask the Expert!
SearchSecurity expert Michael Cobb is ready to answer your application security questions -- submit them now! (All questions are anonymous).