Shannon's definition of a good encryption algorithm (1949)
1 The amount of secrecy needed should determine the amount of labor appropriate for the encryption and decryption
Even a simple cipher may be strong enough to deter a casual interceptor for a short time
2 The set of keys and the enciphering algorithm should be free from complexity
The algorithm should not restrict the selection of keys
All keys should be equally secure
The algorithm should not place restrictions on the plaintext either
3 The implementation of the process should be as simple as possible
It is easier to make mistakes when the implementation is complex (this was especially important when encryption was mostly done by hand)
A complex algorithm also easily slows down the operation
4 Errors in ciphering should not propagate and cause corruption of further information in the message
This was an important rule at the time when encryption and decryption were done by hand.
Today it is important in time-critical communication systems where retransmissions are not feasible, e.g. live interviews.
5 The size of the enciphered text should be no larger than the text of the original message
The encrypted message does not carry more information than the original, but a longer message gives the cryptanalyst more data to analyse
Knudsen's classification of algorithm breaks
Lars Knudsen's PhD thesis, 1994
Total break: the attacker finds the key k and can decrypt any message
Global deduction: the cryptanalyst finds an alternative algorithm that can be used instead of the original one, without knowing the key
Instance (local) deduction: the analyst can find the plaintext of an intercepted ciphertext, but not the key
Information deduction: the analyst gets some information about the key or the plaintext.
Unconditionally secure: cannot be broken no matter how much ciphertext the cryptanalyst can obtain
The one-time pad (OTP) is the only unconditionally secure algorithm
all the others can be broken by brute force
Computationally secure: cannot be broken with available resources, now or in the foreseeable future
unless something ground-breaking happens in computing
Currently used algorithms are computationally secure
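The perfect secrecy of the one-time pad can be sketched in a few lines. The message and candidate text below are illustrative placeholders; the point is that for any observed ciphertext, every equal-length plaintext is consistent with some key, so the ciphertext alone reveals nothing:

```python
import os

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with the corresponding key byte."""
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Encrypt a message with a truly random, message-length key.
message = b"ATTACK AT DAWN"
key = os.urandom(len(message))
ciphertext = otp_encrypt(message, key)

# Decryption is the same XOR operation.
assert otp_encrypt(ciphertext, key) == message

# Perfect secrecy: for ANY candidate plaintext of the same length there
# exists a key mapping it to the observed ciphertext, so the ciphertext
# gives no information about which plaintext was actually sent.
guess = b"RETREAT AT TEN"
key_for_guess = otp_encrypt(ciphertext, guess)   # c XOR guess
assert otp_encrypt(guess, key_for_guess) == ciphertext
```

Brute force fails here for the same reason: trying all keys simply enumerates every possible plaintext, with no way to tell which one was sent.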
Measuring the hardness of breaking
How much data is required to conduct the attack
Time needed to break the system
required amount of work
How much processing memory is required
How much other storage space is required
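The time dimension can be made concrete with a back-of-the-envelope estimate of exhaustive key search. The testing rate below is a hypothetical assumption, not a measured figure:

```python
# Rough work-factor estimate for brute-force key search, assuming a
# (hypothetical) attacker who can test 10**12 keys per second.
RATE = 10**12                      # keys tested per second (assumption)
SECONDS_PER_YEAR = 365 * 24 * 3600

for bits in (40, 56, 80, 128):
    keys = 2 ** bits
    # On average, half of the key space must be searched.
    years = (keys / 2) / RATE / SECONDS_PER_YEAR
    print(f"{bits:3d}-bit key: ~{years:.3g} years on average")
```

The exponential growth is the point: each added key bit doubles the work, which is why 128-bit keys are considered computationally secure against brute force.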
The basic assumption is that the encryption algorithm is known (Kerckhoffs' principle)
1. Ciphertext-only attack
The analyst sees only the ciphertext and tries to find the key and the plaintext
A protocol should be designed so that the cryptanalyst is forced into this type of attack
not always possible
Some sources define a ciphertext-only attack as one where even the encryption algorithm is not known
2. Known-plaintext attack
The cryptanalyst knows some of the plaintext that was encrypted
The goal is to find the key that was used to encrypt several documents
Known parts can be headers, templates, or even whole documents
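A toy demonstration of the idea, using a repeating-key XOR "cipher" (not a real-world algorithm): knowing a standard header is enough to recover the key, which then decrypts every other message encrypted under it. The messages and key are invented for illustration, and the key length is assumed to be known (in practice it could be guessed, e.g. by Kasiski examination):

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy cipher: XOR data with a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"SECRET"
c1 = xor_cipher(b"Subject: weekly report", key)   # intercepted
c2 = xor_cipher(b"Launch codes: 000000", key)     # intercepted

# The attacker knows messages begin with the standard header "Subject: ".
known = b"Subject: "
keystream = bytes(c ^ p for c, p in zip(c1, known))
# Assuming a 6-byte key, the repeating key is visible in the keystream.
guessed_key = keystream[:6]
assert guessed_key == b"SECRET"
# Every other message under the same key now falls.
assert xor_cipher(c2, guessed_key) == b"Launch codes: 000000"
```

Real ciphers resist this far better, but the principle stands: predictable plaintext plus a weak algorithm or key schedule hands the attacker the key.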
3. Chosen-plaintext attack
The attacker can select what will be encrypted and uses the result for analysis
4. Adaptive-chosen-plaintext attack
The attacker can make small changes to the data to be encrypted, based on earlier results
No one checks whether the data to be encrypted makes sense
5. Chosen ciphertext attack
The cryptanalyst can select the data that will be decrypted and see the result
e.g. against tamper-proof hardware solutions
applies mainly to asymmetric cryptography
6. Chosen-key attack
The cryptanalyst has some knowledge of the relationship between generated keys
7. Rubber-hose cryptoanalysis
Threaten, blackmail, bribe (purchase key)
Passive attacks against protocol
Follow the protocol behaviour in order to get additional information
e.g. Password sniffing
Hard to notice
Prevention is easier than detection in many cases.
It is hard to decide which information requires protection
Active attacks against protocol
The goal is to change the protocol behaviour for the attacker's benefit
Adding a new message in communication
Resending an earlier message or repeating the same message several times (replay)
Breaking communications links
Changing stored data.
Pretending to be someone else in the protocol
An outsider pretends to be A
A pretends to be T (the trusted third party)
A man-in-the-middle M pretends to be A to B and to be B to A
A legitimate partner in the protocol
does not follow the protocol as it states
A passive cheater tries to get more information from the protocol than he is allowed to
e.g. attacks against zero-knowledge protocols (see authentication lectures)
Sending illegal messages
Two partners co-operating against a third one
Even if the protocol and the algorithms used are secure, the implementation may open up flaws.
Most security vulnerabilities are in fact due to bad implementation, not to a flaw in the protocol or algorithm.
Transmitted data not verified
e.g. SQL injection attack
Automatically encrypting or signing any data that is offered
Flaws in the generated authentication key
The key is not really random
Due to a bad implementation, Netscape 1.1 generated 128-bit encryption keys whose entropy equalled that of a 20-bit key
The key is based on a low-entropy source
password-based keys: although 8 bits are used to describe one character, the entropy is far below 8 bits per character (roughly 100 different values can be produced from a keyboard, while the full 8 bits can represent 256 different values)
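The entropy gap can be checked with a quick calculation; the figure of roughly 100 typable symbols is the same assumption as in the text:

```python
import math

chars = 8                # an 8-character password
keyboard_symbols = 100   # distinct values a user realistically types (assumption)
full_byte_values = 256   # what 8 bits per character could represent

password_entropy = chars * math.log2(keyboard_symbols)   # ~53 bits
raw_key_entropy = chars * math.log2(full_byte_values)    # 64 bits

print(f"8-char password: ~{password_entropy:.1f} bits of entropy")
print(f"8 random bytes:  {raw_key_entropy:.0f} bits of entropy")
```

In practice the gap is even larger, since users pick dictionary words and common patterns rather than uniformly random keyboard characters.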
Misuse of a primitive
CBC-MAC used as a hash function by publishing the key.
Use of a common (fixed) IV rather than a random one.
Assumptions on security and trust
Protocols are usually designed under assumptions that must hold for the protocol to be secure
e.g. if no authentic time source is available, then protocols based on authentic time are not secure.
Secure as long as no-one has access to hardware
Secure as long as no-one is able to get to the hard drive and change the keys
Secure as long as Trent can be fully trusted.
Or administrator is trustworthy
or communicating partners are trustworthy
Secure as long as no-one can eavesdrop on the communication
Closed systems may rely on their fiber links not being tappable.
Secure as long as the public keys are authentic
The protocol may not define authentication of public keys, so that has to be done separately.
Wireless communication has different requirements than wired
Consider which assumptions you can make in your intended environment and what you can trust.