
Information Security: The more things change, the more they stay the same (December 16, 2013)

Posted by 8237mcraew in Security.

A review of Saltzer and Schroeder’s paper The Protection of Information in Computer Systems

Summary and comments

Written in 1974 and published in 1975, this paper presents a focused view on information protection in the context of the technical trends of the late ’60s and early ’70s.  In that time frame, user-to-computer interaction typically centered on terminal access to mainframe computers, and the majority of “mini-computers” were the size of refrigerators.  One of the first personal computers, the Xerox Alto, came out just prior to the publication of this paper.  The Alto carried a whopping 128 KB of main memory with a mass storage capacity of 2.5 MB [1].  In addition, access to computing resources was largely limited to large institutions.  Fast forward nearly 40 years and we have pocket-sized computing devices with multi-core processors, gigabytes of RAM, and expandable storage of up to 32 GB.  These devices are globally connected and have a user base that numbers in the billions [2].  However, despite the vast disparities in computing resources and user scope, great value can still be ascribed to the information protection functions and design principles described in this paper.

The intent of this paper, which is delivered effectively, is to explain the system architecture needed to support robust information protection.  The authors, in an attempt to avoid confusion, offer definitions of two terms of great relevance to the topic of information protection: privacy and security.  Privacy, as defined by the authors, “denotes a socially defined ability of an individual (or organization) to determine whether, when, and to whom personal (or organizational) information is to be released” [3].  In contrast, security is defined as the “techniques that control who may use or modify the computer or the information contained in it” [3].  With these definitions in mind, the authors spell out the three categories of information security violations [3]:

1.  Unauthorized information release

2.  Unauthorized modification of information

3.  Unauthorized denial of use

The use of the term unauthorized is key here, and can be defined as contrary to the desire of the person who controls the information and/or contrary to the usage constraints imposed on the computing resource.  This idea of authorization, or the lack thereof, can be cross-referenced across the multiple functional categories of information protection.  These functional areas are affected by both the computing platform and user demographics.  First, there is the unprotected system.  With this function there are no built-in mechanisms to prevent any user from accessing all information.  In the time period of this paper such systems were less common and were somewhat protected by the lack of physical access.  Today, billions of mobile and personal computing devices present unprotected systems unless the user takes specific steps to restrict access.

Second, the paper describes the all-or-nothing system, which is predicated on the isolation of users.  In some cases this includes access to a public library mechanism with some level of security.  A contemporary example is a personal workstation configured by an organization for an individual user.

The third function, as described by the authors, refers to the controlled sharing of resources through a list of authorized users and the varying authority each user has in relation to each file and/or application.  This function, along with the fourth function of user-programmed sharing controls, is a common use case in professional organizations.  The case of user-programmed sharing controls, where access is restricted by time, data aggregation, data partitions, and modification criteria, accurately describes the information lifecycle in a data warehouse/business intelligence environment.
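To make this third function concrete, here is a minimal Python sketch of controlled sharing through an access list, with per-user authority for each file.  The file names, users, and permission labels are invented for illustration; they are not drawn from the paper.

```python
# Minimal sketch of controlled sharing via an access control list (ACL).
# Users, file names, and permission labels are hypothetical examples.

ACL = {
    "quarterly_report.xlsx": {"alice": {"read", "write"}, "bob": {"read"}},
    "payroll.db":            {"carol": {"read", "write"}},
}

def is_authorized(user, resource, action):
    """Return True only if the user appears on the resource's access list
    with the requested permission (anything else is denied)."""
    entry = ACL.get(resource, {})
    return action in entry.get(user, set())

# Example: bob may read the report but not modify it.
print(is_authorized("bob", "quarterly_report.xlsx", "read"))   # True
print(is_authorized("bob", "quarterly_report.xlsx", "write"))  # False
```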

The final function is “putting strings on information”, the process of maintaining control of information after its release.  Running across all of these functions is the challenge of the dynamics of use, that is, establishing and changing user access parameters.  This challenge, along with putting strings on information, is the most difficult to address.  The increasing use of personal computing devices at home and in business settings can make strict adherence to information protection policies difficult to enforce.

With these functions in mind, Saltzer and Schroeder recommend the following design principles [3]:

1. Economy of mechanism: the KISS principle applies to information protection as well.  Even with today’s more complex systems, it is important to design mechanisms that are easy to implement and to verify.

2. Fail-safe defaults: Base access decisions on permission rather than exclusion.   This was a good policy then, and remains so today.  As mentioned in the paper, security failures are difficult to identify.

3. Complete mediation: Every access to every object must be checked for authority.  Today’s challenge for this principle is the pervasiveness of mobile devices; comprehensive information security policies must also take BYOD rules into account.  A short sketch after this list illustrates this principle together with fail-safe defaults.

4.  Open design: The mechanisms of security should not depend on the ignorance of potential attackers, but rather on the possession of specific, more easily protected, keys or passwords.  This sounds scary, but is spot on.  Determined malicious users will locate the design no matter what.  Transparency, along with the simplicity described in the first principle, also helps to ensure that unintended access paths are not created by confused authorized users.

5. Separation of privilege: Where feasible, a protection mechanism that requires two keys to unlock it is more robust and flexible than one that allows access to the presenter of only a single key.

6. Least privilege: Every program and every user of the system should operate using the least set of privileges necessary to complete the job.  The challenge comes from integrating many varied applications with many users, which can create a situation where the dynamics of use must be managed carefully.

7. Least common mechanism: Of all the principles listed, this one is at the greatest risk of being deprecated.  The increasing use of collaborative and crowd-sourcing mechanisms challenges the relevance of this principle.  However, the appropriate and controlled use of collaborative mechanisms is a necessity.

8. Psychological acceptability: It is essential that the human interface be designed for ease of use, so that users routinely and automatically apply the protection mechanisms correctly.  Users faced with overly clumsy interfaces will inevitably seek means to circumvent them.

9. Work factor: Compare the cost of circumventing the mechanism with the resources of a potential attacker.  The resources of attackers have grown exponentially since this paper was published, something that needs to be planned for; a rough back-of-the-envelope estimate follows after this list.

10. Compromise recording: It is sometimes suggested that mechanisms that reliably record that a compromise of information has occurred can be used in place of more elaborate mechanisms that completely prevent loss.  Intrusion detection programs began development in the mid-to-late ’80s and are now commonplace.  Logging is a key aspect of intrusion detection [4].
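The sketch below illustrates fail-safe defaults (principle 2) and complete mediation (principle 3) together: every object access is routed through one guard, and anything not explicitly permitted is denied.  The permission names, user records, and functions are assumptions made for illustration, not an API from the paper.

```python
# Sketch of complete mediation with fail-safe defaults: every access goes
# through a single check, and anything not explicitly permitted is denied.
# The permission names and user records below are hypothetical.

PERMISSIONS = {
    "alice": {"orders:read", "orders:write"},
    "bob":   {"orders:read"},
}

class AccessDenied(Exception):
    pass

def mediate(user, permission):
    """Fail-safe default: unknown users or missing permissions fall through
    to denial rather than to access."""
    if permission not in PERMISSIONS.get(user, set()):
        raise AccessDenied(f"{user} lacks {permission}")

def read_orders(user):
    mediate(user, "orders:read")    # checked on every call, never skipped
    return ["order-1", "order-2"]

def update_order(user, order_id):
    mediate(user, "orders:write")
    return f"{order_id} updated by {user}"

print(read_orders("bob"))            # allowed: bob holds orders:read
try:
    update_order("bob", "order-1")   # denied: bob has no write permission
except AccessDenied as exc:
    print("denied:", exc)
```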
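As a rough illustration of the work-factor principle (principle 9), the snippet below estimates how long an exhaustive offline search of a password space would take.  The guess rate is an assumed figure for a well-resourced attacker, not a measured one.

```python
# Back-of-the-envelope work-factor estimate for brute-forcing passwords.
# The guess rate below is an assumed figure, chosen only for illustration.

def brute_force_years(alphabet_size, length, guesses_per_second):
    keyspace = alphabet_size ** length          # total candidate passwords
    seconds = keyspace / guesses_per_second     # worst-case search time
    return seconds / (60 * 60 * 24 * 365)

# 8 lowercase letters vs. 12 mixed-case letters and digits, at 10^10 guesses/s.
print(brute_force_years(26, 8, 1e10))   # ~6.6e-07 years, i.e. about 21 seconds
print(brute_force_years(62, 12, 1e10))  # ~1.0e+04 years, i.e. ~10,000 years
```

The point of the comparison is that the work factor, not the mechanism’s secrecy, is what separates a trivially circumvented control from a practically sound one.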

Conclusion

Despite vast differences in technological capabilities, Saltzer and Schroeder’s focused view of information protection provides an excellent backdrop to current information security topics.  Designing your information protection policy around the above principles may not answer all of your security needs; however, it provides a functional baseline that identifies persistent concerns that need monitoring.  In closing, I pose the following question:  Is it possible to maintain a balance between information “lock-down” and value-added sharing and collaboration in today’s mobile communication environment, or must you sacrifice one for the other?

Works Cited

[1] “History of Computing Hardware (1960s–present).” Wikipedia. Wikimedia Foundation, 28 Nov. 2013. Web. 16 Dec. 2013.

[2] “List of Countries by Number of Mobile Phones in Use.” Wikipedia. Wikimedia Foundation, 13 Dec. 2013. Web. 16 Dec. 2013.

[3] Saltzer, J.H., and M.D. Schroeder. “The Protection of Information in Computer Systems.” Proceedings of the IEEE 63.9 (1975): 1278-308. Print.

[4] Scarfone, Karen, and Peter Mell. “Guide to Intrusion Detection and Prevention Systems (IDPS).” NIST Special Publication 800-94, Feb. 2007.
