CS255 Notes

Title: CS255 Notes
Course: Introduction to Cryptography
Institution: Stanford University

Summary

Notes I took as a student in Dan Boneh's class at Stanford University, CS255: Introduction to Cryptography.


Description

Lecture 1 1/7/19:
- Crypto in use
  - TLS (HTTPS)
    - Session setup: public-key crypto + certificate signing
    - Use the resulting secret key to exchange info: symmetric encryption
  - Encrypted file systems / volumes on a computer: typically symmetric encryption
  - Chat systems: digital signatures
  - User authentication (2-factor / 2FA, challenge-response / U2F): digital signatures
  - Wireless systems
  - And more! (elections, private auctions, zero-knowledge, copyright protection)
- Symmetric ciphers
  - Encryption E takes a key and a message (k, m) and produces a ciphertext c
  - Decryption D takes a key and a ciphertext and outputs the message
  - A cipher defined over (K, M, C) is a pair of "efficient" algorithms (E, D) with E: K x M -> C and D: K x C -> M such that for all m in M and all k in K, D(k, E(k, m)) = m
  - E is often randomized and takes a third input, a truly random string; D is always deterministic
- Example of a bad cipher: the substitution cipher
  - The key is a mapping of character replacements, so |K| = 26! (roughly 2^88)
  - Very insecure: letter frequencies, digram frequencies, etc. break it
- "Secure" cipher: the first example was the one-time pad (OTP)
  - M = C = K = {0,1}^n; the secret key is just a random bit string of length n
  - c = E(k, m) = k XOR m, and m = D(k, c) = k XOR c
  - Considered secure if the ciphertext (CT) reveals no information about the plaintext (PT)
  - A cipher has perfect secrecy if for all m0, m1 in M of the same length and for all c in C, Pr[E(k, m0) = c] = Pr[E(k, m1) = c] when k is chosen uniformly at random from K
  - Perfect secrecy requires |K| >= |M|: if some message has no key mapping it to a given ciphertext, that ciphertext gives away information (namely, that this message is not the plaintext)
  - The key must therefore be at least as long as the message
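A minimal sketch of the one-time pad in Python, illustrating the E/D definitions above (not part of the original notes); os.urandom stands in for a truly random key source.

```python
import os

def keygen(n: int) -> bytes:
    """Sample a uniformly random n-byte key (the OTP key must be as long as the message)."""
    return os.urandom(n)

def encrypt(k: bytes, m: bytes) -> bytes:
    """c = E(k, m) = k XOR m."""
    assert len(k) == len(m), "OTP key must match the message length"
    return bytes(ki ^ mi for ki, mi in zip(k, m))

def decrypt(k: bytes, c: bytes) -> bytes:
    """m = D(k, c) = k XOR c (same operation as encryption)."""
    return encrypt(k, c)

if __name__ == "__main__":
    msg = b"attack at dawn"
    key = keygen(len(msg))
    ct = encrypt(key, msg)
    assert decrypt(key, ct) == msg
```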

Lecture 2 1/9/19:
- Stream ciphers: make the OTP useful
  - Replace the random key with a pseudorandom key produced by a pseudorandom generator (PRG)
  - A PRG is a function G that stretches a short seed into a longer output
  - This means the key no longer needs to be as long as the message
  - E(k, m) = m XOR G(k), D(k, c) = c XOR G(k)
  - Can't be "perfectly secret" any more, but can still be secure with a good PRG
  - The PRG must be unpredictable: G(k) must be indistinguishable from a truly random string
  - If a PRG is secure, then it is unpredictable (proof by contrapositive)
  - If the PRG is secure, then the stream cipher is secure against a one-time CT-only attack
- Attacks on misused stream ciphers: the two-time pad
  - c1 = m1 XOR k and c2 = m2 XOR k
  - c1 XOR c2 = m1 XOR m2
  - Can then extract info from letter frequencies or find spaces (ASCII space XOR ASCII A-Z = ASCII a-z and vice versa)

Lecture 3 1/14/19:
- Block ciphers: the workhorse of cryptography
- Modern stream ciphers also take an extra nonce parameter that lets you reuse k with different nonces
  - ChaCha20 (used on Android for encrypting messages) does this
- A block cipher is also a pair of algorithms E and D
  - E takes a key k and an n-bit PT block and produces n encrypted bits
  - D takes n bits and the key and produces n bits
- Two commonly used block ciphers
  - 3DES (triple Data Encryption Standard) used to be widely used (still is by banks): n = 64 bits, key = 168 bits
  - AES is still used almost everywhere (browsers, phones, etc.): n = 128 bits, key = 128, 192, or 256 bits (AES-128, AES-192, AES-256; a tradeoff between performance and security)
- A block cipher (E, D) defined over (K, X) is a pair E, D: K x X -> X, computable "efficiently" by deterministic algorithms, such that for all keys k in K and all x in X, D(k, E(k, x)) = x
  - Once the key k is fixed, F(x) = E(k, x) is a one-to-one function (it always produces the same output, and no other input maps to that output)
- Block ciphers are built by iteration
  - A key schedule expands k into a fixed set of round keys k1, k2, ..., kd
  - A round function R(ki, m) is applied to the message repeatedly (d times, once per round key)
  - 3DES: 48 rounds; AES-128: 10 rounds; AES-192: 12 rounds; AES-256: 14 rounds
- DES uses a Feistel network with functions F1, ..., Fd (see the sketch after this lecture's notes)
  - Each round takes (L0, R0) and produces (L1, R1) with L1 = R0 and R1 = F1(R0) XOR L0
  - Invertible: R0 = L1 and L0 = R1 XOR F1(L1), which recovers the original
  - Repeat d of these rounds
  - Drawing the same network in the opposite order with L and R swapped gives the decryption network
  - The key determines F1, ..., Fd: Fi(x) = f(ki, x)
  - DES is a 16-round Feistel network; L0 and R0 come from splitting the 64-bit input into two 32-bit halves
- AES is what's called an iterated Even-Mansour (IEM) cipher
  - IEM uses pi1, ..., pid, which are fixed permutations (mappings) of 128 bits, plus a key schedule that produces the round keys
  - Encrypt by taking x XOR k0, applying pi1, XORing k1, applying pi2, and so on to get the encrypted block
  - Also easily invertible by applying the inverses of the pi's in the opposite order

Lecture 4 1/16/19:
- Block ciphers should produce CT that looks like a random permutation on X
- AES uses n = 128 bits = 16 bytes, organized into a 4x4 byte array
  - Each pi specifies a table (S-box) saying what to replace each byte with (256 entries, one for each possible byte value)
  - The next step is to shift/rotate rows: the first row shifts left by 0, the second by 1, the third by 2, the fourth by 3
  - The last step is to multiply each column by a fixed matrix c(x) to produce the output columns (this step is skipped in the last permutation pid)
- To implement AES in hardware, Intel added new instructions called AES-NI
  - Almost all mainstream processors use this to encrypt TLS web traffic
  - Some low-end Android phones use ChaCha / stream ciphers instead
  - Encryption/decryption can be done in parallel on optimized hardware
- Don't use ECB mode with AES, because it gives away information (if m1 = m2, then c1 = c2)
  - It would only be secure if you did not reuse the key across different blocks
- Deterministic counter mode lets us use a PRF to replicate having a one-time pad

Lecture 6 1/23/19:
- PRF: F: K x X -> Y; PRP: (E, D): K x X -> X
  - f(x) = F(k, x) should be indistinguishable from a random function
  - pi(x) = E(k, x) should be indistinguishable from a random permutation
- Semantic security game: if the adversary sends 2 messages and we send back an encryption of one of them, the adversary should not be able to accurately guess which message was encrypted
- One-time eavesdropping security
  - The OTP has semantic security
  - Deterministic counter mode has semantic security when F is a secure PRF
- Semantic security for a many-time key: CPA security
  - If a secret key is used multiple times, the encryption algorithm must produce different outputs for the same message; otherwise the adversary can learn leaked info
  - The cipher should also take a nonce (a non-repeating value)
  - A nonce is a value that changes from message to message; a (k, n) pair should never be repeated
- CBC: cipher block chaining with a random IV (nonce)
  - Chain blocks of the message together by XORing the output of one block's encryption into the next block of the message
  - Can get a bound on how many times we can reuse a key before we may be at risk
  - Also typically encrypt the nonce/IV with a different key
  - If the message length is not divisible by the block size, you need to pad to the end of the block; typically if you need n+1 bytes of padding, every pad byte holds the value n so you know how many bytes to skip from the back
  - If the message length is divisible by the block size, pad with a whole extra dummy block
  - Because each block depends on the previous one, CBC cannot be parallelized
- Randomized (nonce-based) counter mode with CPA security (sketched below)
  - Take a random IV and add 1 to it for each subsequent block
  - The upper 96 bits hold the actual nonce, the lower 32 bits are the counter that increments
- Message integrity: confirm a checksum with a shared secret key
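A sketch of nonce-based counter mode illustrating the 96-bit nonce / 32-bit counter layout above. The PRF here is an HMAC-SHA256 stand-in chosen for a self-contained example; a real deployment would use AES-CTR or AES-GCM.

```python
import hmac, hashlib, os

BLOCK = 32  # our stand-in PRF outputs 32 bytes per counter value

def prf(k: bytes, x: bytes) -> bytes:
    # Stand-in PRF F(k, x); real counter mode would use AES here.
    return hmac.new(k, x, hashlib.sha256).digest()

def ctr_encrypt(k: bytes, nonce: bytes, m: bytes) -> bytes:
    """XOR the message with F(k, nonce || counter) blocks; never reuse a (k, nonce) pair."""
    assert len(nonce) == 12                                    # 96-bit nonce
    stream = b""
    for ctr in range((len(m) + BLOCK - 1) // BLOCK):
        stream += prf(k, nonce + ctr.to_bytes(4, "big"))       # 32-bit counter
    return bytes(mi ^ si for mi, si in zip(m, stream))

ctr_decrypt = ctr_encrypt   # counter mode is its own inverse

if __name__ == "__main__":
    key, nonce = os.urandom(32), os.urandom(12)
    msg = b"counter mode turns a PRF into a stream cipher"
    ct = ctr_encrypt(key, nonce, msg)
    assert ctr_decrypt(key, nonce, ct) == msg
```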

Lecture 7 1/28/19:
- Message authentication codes (MACs)
  - Alice wants to make sure a message gets to Bob without being tampered with; A and B have a shared key
  - A computes a tag over the message and sends it along with the message: t <- S(k, m)
  - B verifies with V(k, m, t), which outputs Yes or No
- Secure MAC
  - The attacker's power is a chosen-message attack: submit m1, ..., mq and get back tags t1, ..., tq, with the goal of producing a valid pair (m, t) different from anything seen
  - Game: the adversary sends a bunch of queries, then sends (m, t) and wins if V(k, m, t) = Yes and (m, t) is not among the past message/tag pairs; the advantage is the probability of the adversary winning
  - Lemma: if F is a secure PRF, then I_F is a secure MAC
  - To prove MAC security, use the contrapositive: for any MAC adversary A there exists a PRF adversary B such that MACadv[A, I_F] <= PRFadv[B, F] + 1/|Y|
  - AES gives a secure MAC, I_AES, for 16-byte messages
- Big question: how to go from a small MAC to a big MAC
  - Two families: CBC-MAC (banks) and HMAC (internet)
- CBC-MAC (ECBC, encrypted CBC); see the toy sketch after this lecture's notes
  - Let F be a secure PRF on (K, X, X) where X = {0,1}^n
  - Take the first message block and put it through F, XOR the result with the second block and put that through F, etc.; at the end, put the result through F with a second key to get the tag
  - CBC theorem: if F is a secure PRF over (K, X, X), then F_ECBC is a secure PRF over (K^2, X^(<=L), X)
  - Why do the last step with another key? Raw CBC (without it) is insecure (see lecture)
  - CBC-MAC is sequential, so we can't parallelize it
  - PMAC is parallelizable but not used much: each block is XORed with a padding function, put through F, all results are XORed together, then put through F with another key to get the tag
- HMAC (hash MAC, built from a collision-resistant hash function)
  - Example CRHFs: SHA-256, SHA-512
  - Application: let I = (S, V) be a secure MAC on small inputs; define I' = (S', V') with S'(k, m) = S(k, H(m))
  - Voila! A small MAC becomes a big MAC as long as H is collision resistant
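A toy ECBC-MAC sketch following the two-key construction above. The block PRF here is a truncated HMAC-SHA256 stand-in (an assumption for a self-contained example), not AES as in the real construction, and the padding step is omitted.

```python
import hmac, hashlib

N = 16  # block size in bytes

def F(k: bytes, x: bytes) -> bytes:
    # Stand-in PRF F: K x X -> X with X = {0,1}^128 (real ECBC-MAC would use AES).
    return hmac.new(k, x, hashlib.sha256).digest()[:N]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def ecbc_mac(k1: bytes, k2: bytes, m: bytes) -> bytes:
    assert len(m) % N == 0, "toy version: message must be a whole number of blocks"
    t = bytes(N)                            # running CBC value, starts at 0
    for i in range(0, len(m), N):
        t = F(k1, xor(t, m[i:i + N]))       # raw CBC chaining
    return F(k2, t)                         # final application under the second key

def verify(k1: bytes, k2: bytes, m: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(ecbc_mac(k1, k2, m), tag)

if __name__ == "__main__":
    k1, k2 = b"A" * 16, b"B" * 16
    msg = b"0123456789abcdef" * 3
    tag = ecbc_mac(k1, k2, msg)
    assert verify(k1, k2, msg, tag)
```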

Lecture 8 1/30/19:
- AES-GCM mode helps detect tampering, ensuring both encryption and integrity!
- A hash function H is a CRHF if it is hard to find even a single collision
- Storing the hash of data (with a CRHF) helps ensure integrity: compare against the hash and you can detect whether the file changed
- Attack on a CRHF is the birthday paradox: a collision is likely once n is roughly 1.2*sqrt(B), where n is the number of samples (people) and B is the number of possible values (birthdays)
  - If the distribution is not uniform/independent, it happens even sooner
  - Birthday attack: sample r1, ..., rn with n roughly 1.2*sqrt(|T|) (T is the tag space), look for ri, rj that collide, and repeat if none is found
  - For security, the gold standard is to require 2^128 attempts; that's why SHA-256 outputs 256 bits, because sqrt(2^256) = 2^128
- Merkle-Damgard (MD) hash functions break the message into blocks and pad the last block
  - The pad must contain the length of the message (1000...00 || length)
  - Take a fixed IV and the first block as input to a compression function; the output and the next block are fed into the same function again, and so on
  - The compression function maps {0,1}^n x {0,1}^b -> {0,1}^n, where b is the block size in bits
  - MD is a CRHF as long as the compression function is collision resistant (proof in lecture)
- MACs from a hash function
  - S(k, m) = H(k || m)
  - Envelope method: S(k, m) = H(k || m || k)
  - HMAC: S(k, m) = H((k XOR opad) || H((k XOR ipad) || m)), where opad and ipad are fixed outer and inner pads (checked against Python's hmac module in the sketch below)
- CPA security without integrity is worthless: you must also ensure integrity
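A sketch of the HMAC formula above, checked against Python's hmac module. It assumes SHA-256's 64-byte block size and omits the hash-the-key step that real HMAC applies to keys longer than a block.

```python
import hashlib, hmac

BLOCK = 64  # SHA-256 block size in bytes

def hmac_sha256(k: bytes, m: bytes) -> bytes:
    """S(k, m) = H((k XOR opad) || H((k XOR ipad) || m))."""
    assert len(k) <= BLOCK, "sketch omits the hash-the-key step for long keys"
    k = k.ljust(BLOCK, b"\x00")                  # pad the key to the block size
    opad = bytes(b ^ 0x5C for b in k)            # outer pad
    ipad = bytes(b ^ 0x36 for b in k)            # inner pad
    inner = hashlib.sha256(ipad + m).digest()
    return hashlib.sha256(opad + inner).digest()

if __name__ == "__main__":
    key, msg = b"shared secret", b"message to authenticate"
    assert hmac_sha256(key, msg) == hmac.new(key, msg, hashlib.sha256).digest()
```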

Lecture 9 2/4/19:
- Authenticated encryption: encrypt to provide both confidentiality and integrity
  - Encryption without integrity gives no real confidentiality
- Definition: CPA security plus ciphertext integrity (a game for the cipher (E, D))
  - The challenger has a random key k
  - The challenger sends back "reject" if asked to decrypt something that has been tampered with
  - The adversary A may submit any message mi and gets back its encryption ci (repeated many times)
  - Then A submits one more ciphertext c and wins if D(k, c) != reject and c != any previous ci
- Constructions combining a MAC with a CPA-secure cipher
  - MAC-then-encrypt (TLS 1.0): t = S(km, m), c = E(ke, m || t), send c; the decryptor decrypts, then checks the tag... INCORRECT: can be insecure, no CT integrity (POODLE attack)
  - Encrypt-then-MAC (GCM): c = E(ke, m), t = S(km, c), send (c, t); the decryptor checks t, then decrypts... CORRECT: even if the MAC leaks something, it only leaks info about the ciphertext
  - Encrypt-and-MAC (SSH): t = S(km, m), c = E(ke, m), send (c, t); the decryptor decrypts, then checks t... INCORRECT: insecure if t leaks bits of m (a bad MAC means no CPA security)
- GCM (Galois/Counter Mode)
  - Uses nonce-based counter mode (CPA secure), then a MAC (a Carter-Wegman MAC)
  - Required in TLS 1.3
  - Supports additional associated data (authenticated but not encrypted): AEAD
    - The AEAD header is sent in the clear but is authenticated along with the encrypted message body
- Important that ke (encryption key) and km (MAC key) are independent; derive both from another key k via a PRF
- The next half of the class looks at methods of key exchange

Lecture 10 2/6/19:
- Key management: public-key crypto
  - One method is a trusted third party (TTP) that shares a key with each user
  - The TTP gives u1 and u3 a shared key k13 if those two want to talk
  - Can be vulnerable to replay attacks
- Key exchange without a TTP
  - The original idea took too much back-and-forth to be practical
  - The Diffie-Hellman protocol was created at Stanford
- Modular arithmetic recap: primes
  - Let p be a prime of about 600 digits
  - Z_p = {0, 1, 2, ..., p-1}; addition and multiplication mod p keep us in Z_p
  - Theorem (Fermat): for all nonzero a in Z_p, a^(p-1) = 1 mod p
  - The inverse of x in Z_p is the y in Z_p with x*y = 1 mod p; so x^(-1) = x^(p-2) by the theorem above, for nonzero x
  - Better inversion algorithm: given x and p with gcd(x, p) = 1, Euclid's algorithm finds a, b in Z such that a*x + b*p = 1; reducing mod p, the b*p term goes away and we are left with a*x = 1, so a is the inverse of x
- Quadratics
  - Euler's theorem: for every prime p >= 2, Z_p* is a cyclic group; there is a generator g such that Z_p* = {1, g, g^2, ..., g^(p-2)}
    - 3 is a generator of Z_7*; 2 is not a generator, but generates a subgroup
  - The order of g in Z_p is the smallest positive a such that g^a = 1
  - Lagrange's theorem: for all g in Z_p*, the order of g divides p-1 (p-1 is a multiple of the order of g)
  - A square root of x in Z_p is a y in Z_p such that y^2 = x in Z_p
    - sqrt(2) = 3 in Z_7; sqrt(3) doesn't exist in Z_7
  - If x in Z_p has a square root, then x is a quadratic residue (QR)
  - Euler: x in Z_p* is a QR iff x^((p-1)/2) = 1 in Z_p (otherwise it equals -1)
- Computing square roots
  - If p = 3 mod 4 and x in Z_p is a QR, then sqrt(x) = x^((p+1)/4) in Z_p
  - If p = 1 mod 4, we can find a square root with a randomized algorithm that may fail sometimes but will eventually succeed
- We can now solve quadratic equations a*x^2 + b*x + c = 0 because we can invert and take square roots: x = (-b +/- sqrt(b^2 - 4ac)) * (2a)^(-1)
- Multi-precision arithmetic
  - Represent x in Z_p with 32-bit blocks: (1/32)*log2(p) blocks
  - Addition mod p takes O(log p); multiplication and division mod p take O(log^2 p)
  - Exponentiation: use repeated squaring to get g raised to powers of 2
    - g^13 = g^(1101 in binary) = g^(8+4+1) = g^8 * g^4 * g^1, so compute g, g^2, g^4, g^8 by repeated squaring and then multiply
    - Requires at most 2*log2(x) multiplications
    - So if x is the same order as p, computing g^x takes O(log^3 p), or more precisely O(log^2 p * log x)
- How to generate large primes (see the sketch below)
  - Fact (density of primes): choose m at random from {1, 2, 3, ..., 2^n}; the probability that m is prime is roughly 1/n
  - Fermat test: given a random m, accept it as prime if 2^(m-1) = 1 mod m, 3^(m-1) = 1 mod m, and 5^(m-1) = 1 mod m (all primes pass, few other numbers do)

Lecture 11 2/11/19:
- Z_p* is the set of invertible elements in Z_p
  - For a prime p, it is everything in Z_p except 0
  - Z_p* is a finite cyclic group
- Let h in Z_p* be an element of prime order q, so h^q = 1 mod p but h^l != 1 for any 0 < l < q
  - Then q | p-1 (q divides p-1)
  - The group generated by h, {1, h, h^2, ...}, is a subgroup of Z_p* of prime order q
  - This group is closed under multiplication
- Key exchange (toy version, secure only against passive eavesdropping)
  - A and B have never met, but want a shared key
  - Need some protocol that results in both knowing the key without an eavesdropper E learning it
  - Merkle puzzles could do this using block ciphers, but the gap between honest work and attacker work is only quadratic, so it's impractical
- Example 1: basic Diffie-Hellman protocol (1976); toy sketch below
  - Fix a cyclic group G of order q with generator g in G (fixed forever)
  - A chooses a random alpha in {1, ..., q-1}; B does the same for beta
  - A sends g^alpha, B sends g^beta
  - A raises B's message to alpha, B raises A's message to beta, and (g^alpha)^beta = (g^beta)^alpha = g^(alpha*beta)
  - Both then hash the result with a function H to make it look random
  - Why is this secure? Can E compute g^(alpha*beta) from G, g, g^alpha, and g^beta?
    - The DH function DH_g(g^alpha, g^beta) = g^(alpha*beta) has never been solved efficiently, so we assume it's hard
    - Computational Diffie-Hellman (CDH) assumption: CDH holds for a finite cyclic group G of order q with generator g in G if the probability of computing g^(alpha*beta) is negligible
  - Related problem: the discrete-log problem
    - Given h = g^alpha in G, find alpha in Z_q; Dlog is hard for G if the chance of doing this is negligible
    - If Dlog is easy, then CDH is easy, because if E can find alpha then E can compute (g^beta)^alpha
    - If CDH is easy, is Dlog easy? Almost always true, but not known in general
  - This is a toy protocol: E could swap in its own exponents alpha' and beta' and mount a man-in-the-middle attack, intercepting and reading messages; the protocol is anonymous (unauthenticated), so it wouldn't actually hold up against an active attacker
  - Non-interactive key exchange (NIKE) is another property of this
    - Given a board of users, each with their own secret key, who post their public keys g^alpha / g^beta / etc., any pair of users automatically has their own shared key without needing any direct interaction
    - There is a known scheme letting 3 users get a shared key this way, but nothing practical for 4 or more
- How hard are CDH and Dlog?
  - A generic algorithm can do it in O(sqrt(q)) = O(e^(0.5*log q)) work
  - In Z_p* there is an algorithm running in roughly O(e^(cuberoot(log p))) time: sub-exponential!
  - Z_p* is not actually used much in the real world; elliptic curve groups mod p are used instead
  - Elliptic-curve Diffie-Hellman = ECDH
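A toy Diffie-Hellman exchange in Python. The parameters (a Mersenne prime, base 5, exponents mod p-1) are assumptions chosen only so the arithmetic runs; they are nowhere near secure, and real deployments use standardized groups or ECDH as noted above.

```python
import hashlib, secrets

# Toy parameters, for illustration only (a real group would have large prime order q).
p = 2**127 - 1
g = 5
q = p - 1          # exponents drawn from {1, ..., q-1}

def keypair():
    alpha = secrets.randbelow(q - 1) + 1     # secret exponent
    return alpha, pow(g, alpha, p)           # (secret, public = g^alpha mod p)

def shared_key(my_secret: int, their_public: int) -> bytes:
    # Both sides compute g^(alpha*beta); hash it with H so the result "looks random".
    return hashlib.sha256(str(pow(their_public, my_secret, p)).encode()).digest()

if __name__ == "__main__":
    a_sk, a_pk = keypair()   # Alice
    b_sk, b_pk = keypair()   # Bob
    assert shared_key(a_sk, b_pk) == shared_key(b_sk, a_pk)
```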

Lecture 12 2/13/19:
- Breaking Dlog generically uses baby-step giant-step, which runs in sqrt(q) time; if we want attacks to take at least 2^128 work, we need q >= 2^256
- Building a CRHF from Dlog
  - Fix g, h in G
  - Hash function H: Z_q^2 -> G, H(alpha, beta) = g^alpha * h^beta
  - Theorem: if we can find a collision for H, then we can break Dlog
  - Proof: let (alpha, beta) and (alpha', beta') be a collision with (alpha, beta) != (alpha', beta'); then g^alpha * h^beta = g^alpha' * h^beta', so g^(alpha - alpha') = h^(beta' - beta), and therefore Dlog_g(h) = (alpha - alpha') / (beta' - beta) mod q, so we've found the Dlog
  - Therefore we have a CRHF!
- Public-key encryption (PKE)
  - PKE is a triple of efficient algorithms (G, E, D)
    - G: generates a public/private key pair (pk, sk)
    - E(pk, m) -> c
    - D(sk, c) -> m
    - D(sk, E(pk, m)) = m
  - Also called asymmetric encryption
- Semantic security (security against eavesdropping)
  - The adversary sends two messages
  - The challenger sends back the encryption of one of them under its public key
  - A wins if they can guess which message was encrypted
- Toy PKE key exchange
  - B generates its own (pk, sk) pair; A generates a key k and sends it to B encrypted under B's pk; B decrypts k using sk
  - Safe from eavesdropping, but not from MITM
- ElGamal PKE encryption scheme (toy sketch below)
  - Group G, symmetric cipher (Es, Ds), hash function H
  - sk is alpha, pk is g^alpha = h (alpha can't be recovered from h because Dlog is hard)
  - Encrypt E(pk = g^alpha, m):
    - Choose a random beta from Z_q, then compute mu = g^beta and v = h^beta = g^(alpha*beta)
    - beta is chosen at random every time, so there is a different key for each message
    - k = H(mu, v), c = Es(k, m); the ciphertext is (mu, c)
  - Decrypt D(sk = alpha, (mu, c)):
    - v = mu^alpha = g^(alpha*beta) = h^beta
    - k = H(mu, v)
    - m = Ds(k, c)
- PKE security against active adversaries
  - Security against chosen-ciphertext attacks (CCA)
  - A gets the challenger's pk, sends a type-1 query with 2 messages, and gets back an encryption of one of them
  - Then A gets to send type-2 queries: ciphertexts that the challenger decrypts and sends back (this can be repeated, but A can't send the ciphertext received from the type-1 query)
  - A wins if they can guess which of the two messages was encrypted
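A sketch of the ElGamal hybrid scheme above over the same kind of toy group as the Diffie-Hellman sketch. The symmetric cipher Es is a throwaway XOR-with-SHA-256-keystream stand-in and the group parameters are illustrative assumptions, not secure choices; u and v here play the roles of mu and v in the notes.

```python
import hashlib, secrets

p = 2**127 - 1          # toy group parameters, illustration only
g = 5
q = p - 1

def H(u: int, v: int) -> bytes:
    return hashlib.sha256(f"{u},{v}".encode()).digest()

def Es(k: bytes, m: bytes) -> bytes:
    # Throwaway symmetric cipher: XOR with a SHA-256-derived keystream (toy stand-in).
    stream = b"".join(hashlib.sha256(k + i.to_bytes(4, "big")).digest()
                      for i in range((len(m) + 31) // 32))
    return bytes(a ^ b for a, b in zip(m, stream))

Ds = Es                  # the XOR stream cipher decrypts the same way

def keygen():
    alpha = secrets.randbelow(q - 1) + 1
    return alpha, pow(g, alpha, p)            # sk = alpha, pk = h = g^alpha

def encrypt(pk: int, m: bytes):
    beta = secrets.randbelow(q - 1) + 1       # fresh beta for every message
    u, v = pow(g, beta, p), pow(pk, beta, p)  # u = g^beta, v = h^beta = g^(alpha*beta)
    return u, Es(H(u, v), m)

def decrypt(sk: int, ct):
    u, c = ct
    v = pow(u, sk, p)                         # v = u^alpha = g^(alpha*beta)
    return Ds(H(u, v), c)

if __name__ == "__main__":
    sk, pk = keygen()
    msg = b"ElGamal = Diffie-Hellman + symmetric encryption"
    assert decrypt(sk, encrypt(pk, msg)) == msg
```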

