For example, somebody who stores a spare key under the doormat in case they are locked out of the house is relying on security through obscurity. The theoretical vulnerability is that anybody could break into the house by unlocking the door with the spare key. However, the owner believes that the location of the key is not publicly known, and that a burglar is unlikely to find it.
The opposite of security by obscurity is Kerckhoffs' principle, which states that system designers should assume that the entire design of a security system is known to all attackers, with the exception of the cryptographic keys themselves.
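The principle can be illustrated with a short Python sketch (the function names here are illustrative, not part of any standard API). HMAC-SHA256 message authentication uses an algorithm that is fully public and standardized, so the security of the scheme rests entirely on the secrecy of the key:

```python
import hashlib
import hmac
import secrets

# Kerckhoffs' principle in practice: the HMAC-SHA256 algorithm is public
# and standardized (RFC 2104); the key is the only secret in the scheme.
key = secrets.token_bytes(32)

def tag(message: bytes) -> bytes:
    """Compute an authentication tag for a message under the secret key."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, candidate_tag: bytes) -> bool:
    """Check a tag in constant time; knowing the algorithm does not help
    an attacker who lacks the key."""
    return hmac.compare_digest(tag(message), candidate_tag)

msg = b"the design of the system is assumed to be public"
t = tag(msg)
assert verify(msg, t)
assert not verify(b"tampered message", t)
```

An attacker who knows every line of this code, but not the key, gains no advantage; by contrast, a scheme whose safety depends on its design remaining secret fails as soon as the design is discovered.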
The full disclosure movement goes further, suggesting that security flaws should be disclosed as soon as possible, withholding the information no longer than is necessary to fix or work around the immediate threat.
It is sometimes argued that security by obscurity is better than no security. In the above example, it is better to hide a spare key under the mat than to leave the door unlocked.
Many people, however, believe that security by obscurity is flawed, for the following reasons:
Operators of systems that rely on security by obscurity often keep secret the fact that their system has been broken, so as not to destroy confidence in their service or product. In some cases this may amount to fraudulent misrepresentation of the security of their products.
The designers typically believe that they have ensured security simply by keeping the design of the system secret. Those who approach security by trying to keep things secret rarely have enough perspective to realize that they are inviting trouble, sometimes very serious trouble. Self-delusion or ignorance of this kind is a difficult problem to address, and its consequences are almost universally unfortunate.
This practice has occasionally set the stage for debacles such as the RTM worm of 1988 (see Morris worm).
There are conflicting stories about the origin of this term. It has been claimed that it was first used in the Usenet newsgroup news:comp.sys.apollo during a campaign to get HP/Apollo to fix security problems in its Unix clone Aegis/DomainOS (they didn't change a thing). ITS fans, on the other hand, say it was coined years earlier in opposition to the incredibly paranoid Multics people down the hall, for whom security was everything. In the ITS culture it referred to (1) the fact that by the time a tourist figured out how to make trouble, he had generally got over the urge to make it, because he felt part of the community; and (2) (self-mockingly) the poor coverage of the documentation and the obscurity of many commands.

One instance of deliberate security through obscurity is recorded: the command to allow patching the running ITS system (altmode altmode control-R) echoed as $$^D. If you actually typed alt alt ^D, that set a flag that would prevent patching the system even if you later got it right.