Symmetric ciphers were developed so that two parties sharing knowledge of a secret key can communicate privately over an insecure channel, e.g. radio frequencies. But if this approach is used for private communication between many parties, difficult key management problems arise.
Using a symmetric cipher, the same key is used both to encrypt and to decrypt a message. Using asymmetric cryptography, keys are created in related pairs. One key in a pair can be used to encrypt a message and the other key in the pair is used to decrypt the same message. If it is possible to create these key pairs so that knowledge of one can't be used to discover the other, then one key in the pair can be kept secret, while the other is made public. For Alice to send a secret message to Bob, Bob generates a keypair and publishes his public key, keeping the other key in the pair secret. Alice uses Bob's public key to encrypt the message, sends the encrypted message to Bob, and only Bob can decrypt it using his secret key.
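The encrypt/decrypt relationship between the two keys in a pair can be sketched using textbook RSA with deliberately tiny primes. The numbers here are purely illustrative: real keys use primes hundreds of digits long, together with random padding, and none of this is secure as written.

```python
# Toy RSA sketch of asymmetric encryption (educational only).
p, q = 61, 53                      # two small primes (illustrative values)
n = p * q                          # 3233, the public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # secret exponent (Python 3.8+ modular inverse)

def encrypt(m, public_key):
    e, n = public_key
    return pow(m, e, n)            # c = m^e mod n

def decrypt(c, secret_key):
    d, n = secret_key
    return pow(c, d, n)            # m = c^d mod n

message = 123                      # a message encoded as a number < n
ciphertext = encrypt(message, (e, n))    # Alice uses Bob's public key
recovered = decrypt(ciphertext, (d, n))  # only Bob's secret key recovers it
assert recovered == message
```

Knowing (e, n) does not reveal d unless n can be factored, which is what makes publishing one key of the pair safe in practice.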
An important feature of using paired keys to encrypt and decrypt the same message is that Bob can sign a message he sends to Alice. He generates a digest of the message using a cryptographic hash function and then encrypts this digest with his secret key. Anyone who receives the message together with the encrypted digest, and who knows Bob's public key, can then use that public key to decrypt the digest and confirm that Bob signed the message digest using his secret key. As they can use the same secure hash function to obtain the same message digest, confirming that Bob signed the digest can be considered equivalent to Bob having signed the message.
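Bob's signing procedure can be sketched the same way, reusing a tiny toy RSA keypair and the SHA-256 hash function. The key sizes are illustrative only, and a real signature scheme would also apply a padding standard rather than a bare modular reduction of the digest.

```python
import hashlib

# Toy RSA signature sketch (educational only).
p, q = 61, 53
n, e = p * q, 17                     # Bob's public key (toy values)
d = pow(e, -1, (p - 1) * (q - 1))    # Bob's secret signing exponent

def digest(message):
    # Hash the message, reduced modulo n so the toy key can process it.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message):
    return pow(digest(message), d, n)    # Bob encrypts the digest with his secret key

def verify(message, signature):
    return pow(signature, e, n) == digest(message)   # anyone checks with Bob's public key

msg = b"Meet at noon -- Bob"
sig = sign(msg)
assert verify(msg, sig)   # Alice confirms Bob signed the digest
```

Note that verification recomputes the digest from the received message, so any alteration of the message in transit also invalidates the signature.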
Binding a public key to its owner's identity isn't such a difficult problem if Alice and Bob can arrange to meet to exchange keys. Bob could then give Alice a copy of his public key, and assuming Alice can identify Bob, she knows she is using Bob's public key when she encrypts a secret message to send to Bob, or when she checks Bob's signature on a message he sends her. Alternatively, if Alice recognises Bob's voice on the telephone, she can phone Bob and ask him for enough details concerning the public key to enable her to bind her knowledge of the key to her knowledge of Bob's identity. ("Enough details" here means a fingerprint of the key taken using a cryptographic-strength hash function, which is easier to read out over the telephone than the whole key.)
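Such a fingerprint can be sketched as follows. The key bytes below are a stand-in for Bob's real encoded public key, and the grouping into fours is just one readable convention.

```python
import hashlib

# Sketch of a key fingerprint Alice could read out over the telephone.
# `bobs_public_key` is a placeholder byte string standing in for the
# real encoded key material.
bobs_public_key = b"-----BEGIN PUBLIC KEY----- example key bytes -----END PUBLIC KEY-----"

fingerprint = hashlib.sha256(bobs_public_key).hexdigest()   # 64 hex digits
# Group the digits in fours so they are easier to read aloud and compare.
spoken = " ".join(fingerprint[i:i + 4] for i in range(0, 64, 4))
print(spoken)   # 16 short groups instead of the whole key
```

Reading out 16 four-digit groups is far quicker than dictating an entire key, while any tampering with the key changes the fingerprint completely.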
How much effort Alice and Bob should put into verifying each other's keys depends upon how important it is to keep the messages Alice sends to Bob secret. If Alice and Bob are willing to go to the trouble of using cryptography, then it probably is important, but it still might not be convenient for them to meet in person to exchange public keys.
Our growing cast of crypto use-case actors and actresses includes the evil spy Ethel, who is able to intercept messages sent by Bob to Alice or by Alice to Bob. Ethel knows the public keys of both Alice and Bob but not their secret keys. Ethel's evil plans include creating bogus websites containing public keys which she claims belong to Alice and Bob but which she has really created herself. If she is successful at persuading Bob and Alice to accept her bogus keys as genuine, then she can intercept a message Alice meant to send to Bob, decrypt it using her bogus "Bob" secret key, and then re-encrypt it using Bob's genuine public key before sending it on to him. Alice and Bob send messages to each other thinking these messages are secret, while all along the wicked Ethel is reading them, and maybe altering them in transit.
Ethel's devastating attack is called a man-in-the-middle attack.
If both Alice and Bob know and trust Dave to identify the other party and to sign their respective keys, then Alice and Bob can use Dave's signatures on each other's public keys to verify who those keys really belong to. A signature verifying the identity of the owner of a cryptographic key is called a public key certificate, or just a certificate.
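A certificate in this sense can be sketched as Dave's signature over the pair (name, key). The toy RSA values and helper names below are illustrative stand-ins, not a real certificate format such as X.509.

```python
import hashlib

# Minimal certificate sketch: Dave signs the binding between a name
# and a public key with his toy RSA secret key (educational only).
p, q = 61, 53
n, e = p * q, 17                              # Dave's public key (toy values)
dave_secret = pow(e, -1, (p - 1) * (q - 1))   # Dave's secret key

def h(data):
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def issue_certificate(name, key_bytes):
    # The "certificate" is Dave's signature binding the name to the key.
    return pow(h(name + b"|" + key_bytes), dave_secret, n)

def check_certificate(name, key_bytes, cert):
    # Anyone holding Dave's public key (e, n) can verify the binding.
    return pow(cert, e, n) == h(name + b"|" + key_bytes)

bobs_key = b"bob-public-key-bytes"
cert = issue_certificate(b"Bob", bobs_key)
assert check_certificate(b"Bob", bobs_key, cert)
```

Alice never has to meet Bob: as long as she trusts Dave and knows Dave's public key, the certificate transfers that trust to Bob's key.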
For Dave to do his job well he will either have to know Alice and Bob personally, or he will have to take the usual steps to confirm their identities, i.e. by checking relevant documentation such as student cards, passports, driving licences, and letters from utility companies sent to their home addresses, etc. Dave is acting in the role of a Trusted Third Party in respect of the cryptography used in connection with Alice and Bob's communications.
Identity theft has become a lucrative crime in recent years. If an impersonator can "steal an identity", then for our purposes identity concerns how the individual is seen by gullible others and by systems, rather than the whole person to whom the identity should be connected.
A good way to think about this is by combining the terms "identified" and "entity". Identity in this sense is in the eye of the person or agent who identifies another entity. So when we start considering binding a cryptographic key to a personal identity, it may help us avoid unrealistic expectations about what cryptography can do if we digress for a moment to consider the implications of identity being a subjective matter.
First of all, how many names can a person have? I'm called Rich, or Richard, or sometimes Richard Kay. I also have a middle name which I use occasionally and optionally in connection with my first and family names. I have every right to use my middle name, and to change my name by deed poll. My personal Internet DNS domain is called copsewood.net. DNS names are very useful, as these involve a nearly infinitely extensible, scalable, delegated and globally unique namespace. I have home and work email addresses and a number of role-based email addresses usable in connection with systems I manage (e.g. email@example.com), as well as other email addresses which I use occasionally for testing purposes. Email addresses are useful hooks for on-line identity, as security tokens can automatically be sent to them, but as with names, keys and passwords, they are multiple for one individual.
You may say I still have the same identity regardless of the relationship of others to me and the names used in many connections. But I might not want to be identified the same way regardless of the connection.
Taxpayers don't want governments to pay benefits more than once to the same person operating under multiple identities. Taxation systems dependent upon personal allowances don't want one individual to claim allowances twice in respect of different tax identities. In a sense the biometric (e.g. a photo) is the identity, as it is the information most likely to be matched when catching someone attempting to commit crimes using multiple or stolen identities.
Organisations, including governments, have often not been efficient at preventing people from having multiple identities as separate records within the same administrative systems. At one time within UCE it was possible for a student studying more than one course to have more than one student number. If someone loses their memory they will still need a National Insurance number. And to the extent that civil liberty concerns prevail in democratic opinion, some of us will resist governments using every technically available means of preventing multiple-identity crime, such as the use and correlation of strong biometrics, where we don't find these means politically acceptable.
It can also be argued, for privacy reasons, that the identity by which you interact with one organisation should not have to be the same as the identity with which you interact with every other organisation.
In reality, many names and records apply to the same individual. A number of people might be recognised as having a similar face to a particular photo. If you have an identical twin, even your DNA isn't necessarily the one and only definitive biometric usable to identify an individual without other evidence. A researcher in Japan managed to mould gelatine finger caps, using a fingerprint collected from a glass, which fooled a fingerprint reader.
How many eggs should you keep in one cryptographic basket? The practical consequence of this is that cryptographic keys relating to a single individual seem likely to be multiple. You probably don't want to use the same key on your passport as on your bank card or for your private correspondence.
Notice how we started by thinking about asymmetric cryptography as being a solution to having to manage too many keys. This is true when we think about the use of keys needed between a number of potential communicators within a bounded network. However, when we consider individual privacy needs, we ended up potentially still needing multiple keys to authenticate ourselves and handle our communications privately with the multiple identifiers we legitimately have in different connections.
Key revocation is what you do when your secret key is compromised. In this situation you don't want it to be used for privacy purposes and you certainly don't want anything signed by this key to be accepted as being something you authorised.
Repudiation is what you do if someone forges your authority: you deny having given the authorisation, to prevent that authority being misused. An example from banking:
The reason we are willing to trust our money to a bank in preference to keeping a stash of banknotes under the mattress is that we trust the bank will keep our money more securely. Unfortunately this trust hasn't always been justified. One reason we believe our money is safer in a bank than under the mattress is that we can repudiate a payment made in error which we didn't authorise. (Unfortunately, in the UK the onus of proving non-authorisation in connection with phantom ATM withdrawals has been on the bank customer, rather than the onus being on the bank to prove authorisation. For UK non-ATM transactions the onus has traditionally been on the bank, and in the US the onus has been on the bank in respect of both ATM and non-ATM transactions.)
Repudiation is a difficult issue in connection with public key cryptography. The problem concerns the degree of guarantee that can be obtained after a secret key is notified as having been compromised, that it won't be used in error. It's probably easiest to illustrate this problem with a story.
Let's imagine that our evil cracker Ethel has placed a very large bet on the shares of ACME Holdings going down. Technically this is called "shorting" shares: it involves selling shares Ethel has borrowed and will have to buy back later (possibly at a time earlier than the one she would have chosen). She wins this bet if the price of ACME shares goes down and loses if the price goes up.
Bob is a stockbroker and Alice is a very wealthy shareholder who owns 2,000,000 shares in ACME. On a normal day's trading perhaps 100,000 ACME shares might be bought and sold. Bob and Alice have a contract such that Alice instructs Bob to buy and sell shares, using emailed instructions signed using Alice's secret key, which Bob confirms using Alice's public key after checking to see if the key has been revoked.
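The checking procedure Bob's contract requires might be sketched like this, where `verify` and `fetch_revocation_list` are hypothetical stand-ins for signature verification and a query to the revocation servers, not real library calls.

```python
# Sketch of the check a broker performs before acting on a signed
# instruction. Both callables are assumptions supplied by the caller.
def accept_instruction(instruction, signature, alices_key,
                       verify, fetch_revocation_list):
    try:
        revoked = fetch_revocation_list()   # may fail if the servers are down
    except OSError:
        return False   # the safe (but contractually unstated) choice in the story
    if alices_key in revoked:
        return False   # key revoked: the instruction may be forged
    return verify(instruction, signature, alices_key)

# Usage with stand-in implementations:
assert accept_instruction(b"sell", b"sig", "alice-key",
                          lambda i, s, k: True, lambda: set())
assert not accept_instruction(b"sell", b"sig", "alice-key",
                              lambda i, s, k: True, lambda: {"alice-key"})
```

The `except` branch is exactly the case the story turns on: what the verifier should do when the revocation servers cannot be reached needs to be decided, and written into the contract, in advance.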
One morning, Alice becomes aware that her key has been compromised on finding that the safe containing it has been blasted open. So she finds the key revocation certificate which she had kept for this eventuality, and she uploads this onto a key revocation server, which immediately replicates this revocation record to a number of clustered key revocation backup servers.
Meanwhile, evil Ethel has found Dmitri Nyetovitch, the enigmatic botnet operator, and hired him to launch a distributed denial of service attack on the key revocation servers. Ethel had also arranged for Joe Semtex, the notorious safe cracker, to carry out the theft of Alice's secret key, and Ethel uses this key to sign an emailed instruction to Bob to sell all of Alice's ACME shares.
As the number of shares to be sold is abnormally large, Bob first checks the key revocation list to see if Alice's key has been revoked, but as all the replicated servers providing this list are inaccessible, Bob uses a copy of the list from his browser cache, and this stale copy doesn't show Alice's key as revoked. So Bob sells Alice's ACME shares based on the forged instruction, which he believes to be genuine. The volume of trading causes the price of ACME shares to fall through the floor. Ethel buys back the 1,000,000 ACME shares she shorted at half the $10 price she sold them at, making a profit of $5,000,000.
When Alice discovers that Bob has sold these shares based on a forged instruction and has lost $10,000,000 on them, she demands that Bob recompense her for this loss. But Bob has logged all of his actions based on the written contract he has with Alice, and believes that he checked whether Alice's key had been revoked.
The contract between Bob and Alice had specified that for transactions greater than $100,000 Bob would have to check the key revocation servers to see if Alice's key had been revoked prior to using this as authority to buy and sell shares. But the contract between Bob and Alice didn't specify what Bob should do in the event of all the key revocation servers being out of service. This contractual ambiguity will be one that Bob and Alice's lawyers are likely to fight out in court and could either cost Bob everything he has or a substantial part of Alice's net assets. (Alternatively the legal fees will be enough that both Alice and Bob will lose and the lawyers will win.)
To avoid the problems of not being able to guarantee access to a key revocation server, and of key revocation lists growing indefinitely, in practice keys have to be given a finite lifetime, so that they expire naturally and information about good and bad ones can be managed. Some security researchers are still sent messages encrypted using public keys whose private keys they lost many years ago, before it became widely accepted that public encryption keys should be given a limited lifetime. One approach to the key revocation problem is for all cryptographic keys to be issued with a very short lifetime, similar to the approach Kerberos uses for session keys with a lifetime of a few hours. In this situation the complexity of revocation certificates and servers is unnecessary.
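Expiry checking can be sketched as a simple comparison against validity timestamps. The `not_before`/`not_after` field names and the timestamp values below are illustrative assumptions, not a standard key format.

```python
import time

# Sketch of expiry checking: each key record carries validity
# timestamps, so a stale key rejects itself without any need to
# consult a revocation server.
def key_is_valid(key_record, now=None):
    now = time.time() if now is None else now
    return key_record["not_before"] <= now < key_record["not_after"]

# A key issued with a one-hour lifetime, Kerberos-style:
issued = 1_700_000_000
key = {"not_before": issued, "not_after": issued + 3600}
assert key_is_valid(key, now=issued + 10)        # still within its lifetime
assert not key_is_valid(key, now=issued + 7200)  # expired: must be reissued
```

The trade-off is that short-lived keys must be reissued frequently, so this approach swaps the revocation-server availability problem for a key-issuing availability problem.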