Title: EEL 5930 sec. 5, Spring 05
1 EEL 5930 sec. 5, Spring 05: Physical Limits of
Computing
http://www.eng.fsu.edu/mpf
- Slides for a course taught by Michael P. Frank in
the Department of Electrical & Computer
Engineering
2Basics of Quantum Theory
3Systems and Subsystems
- Intuitively speaking, a physical system consists
of a region of spacetime & all the entities (e.g.
particles & fields) contained within it. - The universe (over all time) is a physical system
- Transistors, computers, & people are also phys. systs.
- One physical system A is a subsystem of another
system B (write A⊆B) iff A is completely
contained within B. - Later, we will make these definitions more
formal & precise.
[Figure: subsystem A nested inside system B]
4Closed vs. Open Systems
- A subsystem is closed to the extent that no
particles, information, energy, or entropy (terms
to be defined) enter or leave the system. - The universe is (presumably) a closed system.
- Subsystems of the universe may be almost closed
- Often in physics we consider statements about
closed systems. - These statements may often be perfectly true only
in a perfectly closed system. - However, they will often also be approximately
true in any nearly closed system (in a
well-defined way)
5Concrete vs. Abstract Systems
- Usually, when reasoning about or interacting with
a system, an entity (e.g. a physicist) has in
mind a description of the system. - A description that contains every property of the
system is an exact or concrete description. - That system (to the entity) is a concrete system.
- Other descriptions are abstract descriptions.
- The system (as considered by that entity) is an
abstract system, to some degree. - We nearly always deal with abstract systems!
- Based on the descriptions that are available to
us.
6System Descriptions
- Classical physics
- A system could be completely described by giving
a single state S out of the set Ω of all possible
states. - Statistical mechanics
- Instead, give a probability distribution function
p: Ω → [0,1] stating that the system is in state S
with probability p(S). - Quantum mechanics
- Give a complex-valued wavefunction ψ: Ω → C,
|ψ(S)| ≤ 1, implying the system is in state S with
probability |ψ(S)|².
7 States & State Spaces
- A possible state S of an abstract system A
(described by a description D) is any concrete
system C that is consistent with D. - I.e., it is possible that the system in question
could be completely described by the description
of C. - The state space of A is the set of all possible
states of A. - So far, the concepts we've discussed can be
applied to either classical or quantum physics - Now, let's get to the uniquely quantum stuff
8Distinguishability of States
- Classical & quantum mechanics differ crucially
regarding the distinguishability of states. - In classical mechanics, there is no issue
- Any two states s,t are either the same (s=t), or
different (s≠t), and that's all there is to it. - In quantum mechanics (i.e. in reality)
- There are pairs of states s≠t that are
mathematically distinct, but not 100% physically
distinguishable. - Such states cannot be reliably distinguished by
any number of measurements, no matter how
precise. - But you can know the real state (with high
probability), if you prepared the system to be in
a certain state.
9 State Vectors & Hilbert Space
- Let S be any maximal set of distinguishable
possible states s, t, … of an abstract system A. - I.e., no possible state that is not in S is
perfectly distinguishable from all members of S. - Identify the elements of S with unit-length,
mutually-orthogonal (basis) vectors in an
abstract complex vector space H. - The system's Hilbert space
- Postulate 1: Each possible state |ψ⟩ of system A
can be identified with a unit-length vector in
the Hilbert space H.
[Figure: two basis vectors s, t and a general unit state vector ψ in the Hilbert space]
10(Abstract) Vector Spaces
- A concept from abstract linear algebra.
- A vector space, in the abstract, is any set of
objects that can be combined like vectors, i.e. - You can add them
- Addition is associative & commutative
- Identity law holds for addition to zero vector 0
- You can multiply them by scalars (incl. −1)
- Associative, commutative, and distributive laws
hold - Note: There is no inherent basis (set of axes)
- The vectors themselves are the fundamental
objects, rather than being just lists of
coordinates
11Hilbert spaces
- A Hilbert space H is a vector space in which the
scalars are complex numbers, with an inner
product (dot product) operation · : H×H → C - See Hirvensalo p. 107 for defn. of inner product
- x·y = (y·x)* (* = complex conjugate)
- x·x ≥ 0
- x·x = 0 if and only if x = 0
- x·y is linear, under scalar multiplication
and vector addition within both x and y
[Figure: component picture - the projection of y onto x has length x·y/|x|; another notation often used is the "bracket" ⟨x|y⟩]
12 Review: The Complex Number System
- It is the extension of the real number system via
closure under exponentiation. - (Complex) conjugate
- c* = (a + bi)* ≡ (a − bi)
- Magnitude or absolute value
- |c|² = c*c = a² + b²
[Figure: the complex plane, with real and imaginary axes, the imaginary unit i and −i, and c = a + bi at angle θ]
13 Review: Complex Exponentiation
- Powers of i are complex units
- Note
- e^(πi/2) = i
- e^(πi) = −1
- e^(3πi/2) = −i
- e^(2πi) = e^0 = 1
[Figure: the unit circle in the complex plane - e^(θi) at angle θ, with the points i, −1, 1, and −i marked]
14Vector Representation of States
- Let S = {s0, s1, …} be any maximal set of
mutually distinguishable states, indexed by i. - A basis vector vi identified with the ith such
state can be represented as a list of numbers - s0 s1 s2 … si−1 si si+1 …
- vi = (0, 0, 0, …, 0, 1, 0, …)
- Arbitrary vectors v in the Hilbert space H can
then be defined by linear combinations of the vi, v = Σi ci vi - And the inner product is given by x·y = Σi xi* yi (see the sketch below)
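A minimal numpy sketch (not from the original slides) of these definitions: basis kets for a 3-state system, a linear combination with illustrative amplitudes, and the conjugated inner product.

```python
import numpy as np

# Basis kets v_i for a 3-state system: a 1 in position i, 0 elsewhere.
v = [np.eye(3, dtype=complex)[i] for i in range(3)]

# An arbitrary state as a linear combination of basis vectors (example amplitudes).
c = np.array([0.5, 0.5j, np.sqrt(0.5)], dtype=complex)
psi = sum(ci * vi for ci, vi in zip(c, v))
psi /= np.linalg.norm(psi)          # make it a unit-length state vector

# Inner product x.y = sum_i x_i* y_i; numpy's vdot conjugates its first argument.
print(np.vdot(v[0], psi))           # amplitude of basis state s0 in psi
print(np.vdot(psi, psi).real)       # = 1 for a unit vector
```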
15 Dirac's Ket Notation
- Note: The inner product definition is the
same as the matrix product of x, as a
conjugated row vector, times y, as a normal
column vector. - This leads to the definition, for state s, of
- The "bra" ⟨s| means the row matrix (c0* c1* …)
- The "ket" |s⟩ means the column matrix (c0 c1 …)ᵀ
- The adjoint operator † takes any matrix M to its
conjugate transpose M† ≡ (M*)ᵀ, so ⟨s| can be
defined as |s⟩†, and x·y = ⟨x|y⟩ = x†y ("bracket").
16Distinguishability of States, again
- State vectors s and t are (perfectly)
distinguishable or orthogonal (write s⊥t)
iff ⟨s|t⟩ = 0. (Their inner product is zero.) - State vectors s and t are perfectly
indistinguishable or identical (write s=t)
iff ⟨s|t⟩ = 1. (Their inner product is one.) - Otherwise, s and t are both non-orthogonal, and
non-identical. Not perfectly distinguishable. - We say, the amplitude of state s, given state t,
is ⟨s|t⟩. Note amplitudes are complex numbers.
17Probability and Measurement
- A yes/no measurement is an interaction
designed to determine whether a given system
is in a certain state s. - The amplitude of state s, given the actual state
t of the system, determines the probability
of getting a "yes" from the measurement. - Postulate 2: For a system prepared in state t,
any measurement that asks "is it in state s?"
will say yes with probability P(s|t) = |⟨s|t⟩|² - After the measurement, the state is changed, in a
way we will define later.
18A Simple Example
- Suppose abstract system S has a set of only 4
distinguishable possible states, which we'll
call s0, s1, s2, and s3, with corresponding ket
vectors |s0⟩, |s1⟩, |s2⟩, and |s3⟩. - Another possible state is then the unit vector
- Which is equal to the column matrix
- If measured to see if it is in state s0, we
have a 50% chance of getting a yes. (See the check below.)
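The slide's actual column vector is not legible in this extraction, so the sketch below assumes, purely for illustration, amplitudes 1/√2 on |s0⟩ and |s1⟩ (and 0 on the rest); that choice reproduces the stated 50% result via Postulate 2.

```python
import numpy as np

# Hypothetical stand-in for the slide's unit vector (the original isn't shown here):
# amplitude 1/sqrt(2) on s0 and on s1, zero on s2 and s3.
psi = np.array([1, 1, 0, 0], dtype=complex) / np.sqrt(2)

s0 = np.array([1, 0, 0, 0], dtype=complex)       # the ket |s0>

# Postulate 2: P("yes" to "is it in state s0?") = |<s0|psi>|^2
p_yes = abs(np.vdot(s0, psi)) ** 2
print(p_yes)                                     # 0.5, i.e. a 50% chance of "yes"
```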
19 Schrödinger's Cat
- A thought experiment that illustrates the weird
nature of quantum states. - An apparatus is set up to kill a cat if an atom
decays in a certain time (50% prob.). - The system enters the quantum superposition
state |live cat⟩ + |dead cat⟩. - We can't say that the cat is really either
alive or dead until we open the box and observe
it. - Even then, the true state can validly be
considered to be |we see live cat⟩ + |we see
dead cat⟩. - Outwardly-spreading entanglement → Many-worlds
picture
20Linear Operators
- V, W: Vector spaces.
- A linear operator A from V to W is a
linear function A: V→W. An operator on V is
an operator from V to itself, A: V→V. - Given bases for V and W, we can represent linear
operators as matrices. - An Hermitian operator H on V is a linear operator
that is self-adjoint (H† = H). - Its diagonal elements are real.
21 Eigenvalues & Eigenvectors
- v is called an eigenvector of linear operator A
iff A just multiplies v by a scalar a, i.e. Av = av
- "eigen" (German) means "characteristic"
- a, the eigenvalue corresponding to
eigenvector v, is just the scalar that A
multiplies v by - a is degenerate if it is shared by 2
eigenvectors that are not scalar multiples of
each other - Any Hermitian operator has all real-valued
eigenvalues, and its eigenvectors form an orthogonal set (see the check below)
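A small numpy check (not part of the original slides) of these facts, using an arbitrary example Hermitian matrix: the eigenvalues come out real, the eigenvectors orthonormal, and Av = av holds for each pair.

```python
import numpy as np

# An arbitrary Hermitian (self-adjoint) matrix: H equals its conjugate transpose.
H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(H, H.conj().T)

eigvals, eigvecs = np.linalg.eigh(H)              # eigensolver for Hermitian matrices

print(eigvals)                                    # real-valued eigenvalues
print(np.allclose(eigvecs.conj().T @ eigvecs,     # eigenvector columns form an
                  np.eye(2)))                     # orthonormal set: True
for a, vec in zip(eigvals, eigvecs.T):            # defining property Av = av
    print(np.allclose(H @ vec, a * vec))          # True, True
```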
22Observables
- A Hermitian operator H on the set V is called an
observable if there is an orthonormal (all
unit-length, and mutually orthogonal) subset of
its eigenvectors that forms a basis of V. - Postulate 3: Every measurable physical property
of a system can be described by a corresponding
observable H. Measurement outcomes correspond to
eigenvalues of H. - The measurement can also be thought of as a set
of yes-no tests that compares the state with each
of the observable's normalized eigenvectors.
23Wavefunctions
- Given any set S⊆H of system states,
- Whether all mutually distinguishable, or not,
- a quantum state vector v can be translated to a
wavefunction ψ: S→C, giving, for each state s∈S,
the amplitude ψ(s) of that state. - When s is some other state vector, and the
actual state is v, then ψ(s) is just ⟨s|v⟩. - Whenever S includes a basis set, ψ also
determines v. - ψ is called a wavefunction because its dynamics
takes the form of a wave equation when S ranges
over a space of positional states.
24Unitary Transformations
- A matrix (or linear operator) U is unitary iff
its inverse equals its adjoint: U⁻¹ = U† - Some nice properties of unitary transformations
- Invertible, bijective, one-to-one.
- The set of row vectors comprises an orthonormal
basis. - Ditto for the set of column vectors.
- Preserves vector length: |Uψ| = |ψ|
- Therefore also preserves total probability over
all states - Implements a change of basis,
- from one orthonormal basis to another.
- Can be thought of as a kind of generalized
rotation of ψ in Hilbert space (see the check below).
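A minimal numpy check of these properties for one illustrative unitary; the Hadamard matrix is used here only as a convenient example, it is not singled out by the slides.

```python
import numpy as np

# Example unitary: the Hadamard matrix, a "generalized rotation" of a 2-D Hilbert space.
U = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

print(np.allclose(np.linalg.inv(U), U.conj().T))  # inverse equals adjoint: True
print(np.allclose(U.conj().T @ U, np.eye(2)))     # rows/columns orthonormal: True

psi = np.array([0.6, 0.8j])                       # some unit-length state
print(np.linalg.norm(U @ psi))                    # vector length preserved: 1.0
```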
25Time Evolution
- Postulate 4: (Closed) systems evolve (change
state) over time via unitary transformations. - ψ(t2) = U(t1→t2) ψ(t1)
- Note that since U is linear, a small-factor
change in the amplitude of a particular state at
t1 leads to a correspondingly small change in the
amplitude of the corresponding state at t2! - Chaotic sensitivity to initial conditions
requires an ensemble of initial states that are
different enough to be distinguishable (in the
sense we defined) - Indistinguishable initial states never beget
distinguishable outcomes - ⇒ true chaotic/analog computing is physically
impossible
[Figure: evolution forward via U, reversed via U⁻¹]
26Schrödinger's Wave Equation
- Start w. classical Hamiltonian energy
equation: H = K + P (K = kinetic, P =
potential) - Express K in terms of momentum: K = ½mv² = p²/2m
- Substitute H → iħ∂t and p → −iħ∂x
- Apply to wavefunction ψ over position states x
(where ∂a ≡ ∂/∂a): iħ∂t ψ = −(ħ²/2m)∂x²ψ + Pψ
27Multidimensional Form
- For a system with states given by (x,t) where t
is a global time coordinate, and x describes N/3
particles (p0,…,pN/3−1) with masses (m0,…,mN/3−1)
in a 3-D Euclidean space, where each pi is
located at coordinates (x3i, x3i+1, x3i+2), and
where particles interact with potential energy
function P(x,t), the wavefunction ψ(x,t) obeys
the following (2nd-order, linear, partial)
differential equation: iħ∂t ψ(x,t) = −Σi (ħ²/2m⌊i/3⌋) ∂xi²ψ(x,t) + P(x,t)ψ(x,t)
28Features of the wave equation
- Particles' momentum state p is encoded by their
wavelength λ, as per p = h/λ - The energy of a state is given by the frequency
f of rotation of the wavefunction in
the complex plane: E = hf. - By simulating this simple equation, one can
observe basic quantum phenomena, such as - Interference fringes
- Tunneling of wave packets through potential
energy barriers - Demo of SCH simulator
29 Gaussian wave packet moving to the right. Array
of small, sharp potential-energy barriers.
30Initial reflection/refraction of wave packet
31A little later
32Aimed a little higher
33A faster-moving particle
34Relativistic Wave Equations
- Unfortunately, despite its many practical
successes, the literal Schrödinger's equation is
not relativistically invariant. - That is, it does not retain the same form in a
boosted frame. - However, solutions to the free Schrödinger's
equation (where V=0) can be given a
self-consistent relativistic interpretation. - Let p = −iħ∂x be relativistic momentum,
- Let m = iħ∂τ be rest mass in the particle's frame
of ref. - Taking the derivative along an isospatial, the
proper-time τ axis - Let E = iħ∂t be relativistic energy of the
particle - Then, E² = p² + m² is easily shown to be true for
plane-wave solutions - Lines of constant phase angle are the isochrones
of the moving particle. - And everything transforms properly to a new
reference frame. - In fact, the solutions to the free Schrödinger's
equation closely correspond to solutions of the
relativistic Klein-Gordon equation (∂µ∂µ +
m²)φ(xµ) = 0. - This describes a free, massive scalar particle.
35Compound Quantum Systems
- Let C = AB be a system composed of two separate
subsystems A, B with vector spaces A, B with bases
|ai⟩, |bj⟩. - The state space of C is a vector space C = A⊗B
given by the tensor product of spaces A and B,
with basis states labeled as |aibj⟩ = |ai⟩|bj⟩. - E.g., if A has state ψa = ca0|a0⟩ + ca1|a1⟩,
while B has state ψb = cb0|b0⟩ + cb1|b1⟩,
then C has state ψc = ψa⊗ψb = ca0cb0|a0b0⟩ +
ca0cb1|a0b1⟩ + ca1cb0|a1b0⟩ + ca1cb1|a1b1⟩
(Use distributive law; see the sketch below)
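A minimal numpy sketch of this tensor-product construction (the amplitudes are arbitrary examples); np.kron carries out the distributive-law expansion in the ordered basis |a0b0⟩, |a0b1⟩, |a1b0⟩, |a1b1⟩.

```python
import numpy as np

psi_a = np.array([0.6, 0.8], dtype=complex)             # (ca0, ca1) for subsystem A
psi_b = np.array([1, 1j], dtype=complex) / np.sqrt(2)   # (cb0, cb1) for subsystem B

# State of the compound system C = A (x) B.
psi_c = np.kron(psi_a, psi_b)

print(psi_c)                    # components ca_i * cb_j, as the distributive law gives
print(np.linalg.norm(psi_c))    # still a unit vector: 1.0
```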
36Entanglement
- If the state of compound system C can be
expressed as a tensor product of states of two
independent subsystems A and B, ψc = ψa⊗ψb, - then, we say that A and B are not entangled, and
they have definite individual states. - E.g. |00⟩+|01⟩+|10⟩+|11⟩ = (|0⟩+|1⟩)⊗(|0⟩+|1⟩)
- Otherwise, A and B are entangled (quantumly
correlated); their states are not independent. - E.g. |00⟩+|11⟩
(State has entropy 0 but mutual information 2!)
37Size of Compound State Spaces
- Note that a system composed of many separate
subsystems has a very large state space. - Say it is composed of N subsystems, each with k
basis states - The compound system has kN basis states!
- Many possible states of the compound system will
have nonzero amplitude in all these k^N basis
states! - In such states, all the distinguishable basis
states are (simultaneously) possible outcomes - each with some corresponding probability
- This illustrates the many worlds nature of
quantum mechanics. - And the enormous number of possible worlds
involved.
38After a Measurement?
- After a system or subsystem is measured from
outside, its state appears to collapse to exactly
match the measured outcome - the amplitudes of all states perfectly
distinguishable from states consistent w. that
outcome drop to zero - states consistent with measured outcome can be
considered renormalized so their probs. sum to
- This collapse appears nonunitary (& nonlocal)
- However, this behavior is now explicable as the
expected consensus phenomenon that would be
experienced even by entities within a closed,
perfectly unitarily-evolving world (Everett,
Zurek).
39Pointer States
- For a given system interacting with a given
environment, - The system-environment interactions can be
considered measurements of a certain observable
of the system by the environment, and vice-versa. - Any each observable, there are certain basis
states that are characteristic of that
observable. - These are just the eigenstates of the observable.
- A pointer state of a system is an eigenstate of
the system-environment interaction observable. - The pointer states are the inherently stable
states.
40Key Points to Remember
- An abstractly-specified system may have many
possible states; not all pairs are
distinguishable. - A quantum state/vector/wavefunction ψ assigns a
complex-valued amplitude ψ(s) to each state s. - The probability of state s is |ψ(s)|², the square
of ψ(s)'s length in the complex plane. - Quantum states evolve over time via unitary
(invertible, length-preserving) transformations.
41Quantum Information
- Generalizing classical information theory
concepts to fit quantum reality
42Density Operators
- For any given state |ψ⟩, the probabilities of all
the basis states si are determined by an
Hermitian operator or matrix ρ (called the
density matrix) - Note that the diagonal elements ρi,i are just the
probabilities of the basis states i. - The off-diagonal elements are called
"coherences." - They describe the quantum entanglements that
exist between basis states. - The density matrix describes the state |ψ⟩
exactly! - It (redundantly) expresses all of the quantum
info. in |ψ⟩.
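A minimal numpy sketch (with an assumed example state) of forming the density matrix ρ = |ψ⟩⟨ψ| and reading basis-state probabilities off its diagonal.

```python
import numpy as np

# An example pure state over two basis states.
psi = np.array([1, 1j], dtype=complex) / np.sqrt(2)

# Density matrix rho = |psi><psi| (outer product of the ket with its conjugate, the bra).
rho = np.outer(psi, psi.conj())

print(np.allclose(rho, rho.conj().T))   # Hermitian: True
print(rho.diagonal().real)              # diagonal = basis-state probabilities [0.5 0.5]
print(rho)                              # off-diagonal elements are the "coherences"
```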
43Mixed States
- Suppose the only thing one knows about the true
state of a system is that it is chosen from a
statistical ensemble or mixture of state vectors
vi (called pure states), each with a derived
density matrix ρi, and a probability Pi. - In such a situation, in which one's knowledge
about the true state is expressed as a probability
distribution over pure states, we say the system
is in a mixed state. - Such a situation turns out to be completely
described, for all physical purposes, by simply
the expectation value (weighted average) of the
vi's density matrices, ρ = Σi Pi ρi - Note: Even if there were uncountably many vi
going into the calculation, the situation remains
fully described by O(n²) complex numbers, where n
is the number of basis states!
44Von Neumann Entropy
- Suppose our probability distribution over states
comes from the diagonal elements of some density
matrix ?. - But, we will generally also have additional
information about the state hidden in the
coherences. - The off-diagonal elements of the density matrix.
- The Shannon entropy of the distribution along the
diagonal will generally depend on the basis used
to index the matrix. - However, any density matrix can be (unitarily)
rotated into another basis in which it is
perfectly diagonal! - This means, all its off-diagonal elements are
zero. - The Shannon entropy of the diagonal probability
distribution is always minimized in the diagonal
basis, and so this minimum is selected as being
the true (basis-independent) entropy of the mixed
quantum state ρ. - It is called the von Neumann entropy.
45V.N. entropy, more formally
- The trace Tr M just means the sum of M's diagonal
elements. - The ln of a matrix M just denotes the inverse
function to e^M. See the logm function in
Matlab - The exponential e^M of a matrix M is defined via
the Taylor-series expansion Σi≥0 M^i/i!
S(ρ) = −Tr(ρ log₂ ρ) (Shannon S)
S(ρ) = −k_B Tr(ρ ln ρ) (Boltzmann S)
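A minimal numpy sketch of the von Neumann entropy computed from ρ's eigenvalue spectrum, which is equivalent to rotating ρ into its diagonal basis first, as described above; the two test states are standard textbook examples, not taken from the slides.

```python
import numpy as np

def von_neumann_entropy(rho, base=2):
    """S(rho) = -Tr(rho log rho), computed from rho's (real) eigenvalue spectrum."""
    p = np.linalg.eigvalsh(rho)          # rho is Hermitian
    p = p[p > 1e-12]                     # treat 0*log(0) as 0
    return float(-np.sum(p * np.log(p)) / np.log(base))

pure  = np.outer([1, 0], [1, 0]).astype(complex)   # a pure state |0><0|
mixed = np.eye(2) / 2                              # a maximally mixed qubit

print(von_neumann_entropy(pure))    # 0.0 bits
print(von_neumann_entropy(mixed))   # 1.0 bit
```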
46 Quantum Information & Subsystems
- A density matrix for a particular subsystem may
be obtained by tracing out the other
subsystems. - Means, summing over state indices for all systems
not selected. - This process discards information about any
quantum correlations that may be present between
the subsystems! - Entropies of the density matrices so obtained
will generally sum to gt that of the original
system. (Even if the original state was pure!) - Keeping this in mind, we may make these
definitions - The unconditioned, reduced or marginal quantum
entropy S(A) of subsystem A is the entropy of
the reduced density matrix ?A. - The conditioned quantum entropy S(AB)
S(AB)-S(B). - Note this may be negative! (In contrast to the
classical case.) - The quantum mutual information I(AB)
S(A)S(B)-S(AB). - As in the classical case, this measures the
amount of quantum information that is shared
between the subsystems - Each subsystem knows this much information
about the other.
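A minimal numpy sketch (not from the original slides) of the "tracing out" operation for a two-qubit density matrix; the helper name partial_trace_B is a hypothetical choice for illustration.

```python
import numpy as np

def partial_trace_B(rho_AB, dA, dB):
    """Trace out subsystem B: sum over B's matched row/column indices."""
    r = rho_AB.reshape(dA, dB, dA, dB)      # indices (a, b, a', b')
    return np.einsum('ijkj->ik', r)         # sum over b = b'

# Bell state (|00> + |11>)/sqrt(2) of a compound 2-qubit system AB.
v = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_AB = np.outer(v, v.conj())

rho_A = partial_trace_B(rho_AB, 2, 2)
print(rho_A)    # maximally mixed [[0.5, 0], [0, 0.5]]: the quantum correlations are gone
```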
47Tensors and Index Notation
- For our purposes, a tensor is just a generalized
matrix that may have more than one row and/or
column index. - We can also define a tensor recursively as a
number or a matrix of tensors. - Tensor signature An (r,c) tensor has r row
indices and c column indices. - Convention Row indices are shown as subscripts,
and column indices as superscripts. - Tensor product An (l,k) tensor T times an (n,m)
tensor U is a (ln,km) tensor V formed from all
products of an element of T times an element of
U - Tensor trace The trace of an (r,c) tensor T with
respect to index k (where 1 k r,c) is given
by contracting (summing over) the kth row index
together with the kth column index -
Example a (2,2)tensor T in which all 4indices
take on values from the set 0,1
(I is the set of legal values of indices rk and
ck) ?
48Quantum Information Example
- Consider the state vAB = |00⟩+|11⟩ of compound
system AB. - Let ρAB = |v⟩⟨v|.
- Note that the reduced density matrices ρA & ρB are
fully classical - Let's look at the quantum entropies
- The joint entropy S(AB) = S(ρAB) = 0 bits.
- Because vAB is a pure state.
- The unconditioned entropy of subsystem A is S(A)
= S(ρA) = 1 bit. - The entropy of A conditioned on B is S(A|B) =
S(AB) − S(B) = −1 bit! - The mutual information I(A:B) = S(A) + S(B) −
S(AB) = 2 bits! (Verified numerically below.)
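A short, self-contained numpy check (not from the original slides) of this slide's numbers, using the eigenvalue form of the von Neumann entropy and partial traces; the 1/√2 normalization, left implicit on the slide, is written out here.

```python
import numpy as np

def S(rho):
    """Von Neumann entropy in bits, from rho's eigenvalue spectrum."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

v = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
rho_AB = np.outer(v, v.conj())
r = rho_AB.reshape(2, 2, 2, 2)                 # indices (a, b, a', b')
rho_A = np.einsum('ijkj->ik', r)               # trace out B
rho_B = np.einsum('ijik->jk', r)               # trace out A

print(S(rho_AB))                        # S(AB)  = 0 bits (pure joint state)
print(S(rho_A), S(rho_B))               # S(A) = S(B) = 1 bit
print(S(rho_AB) - S(rho_B))             # S(A|B) = -1 bit
print(S(rho_A) + S(rho_B) - S(rho_AB))  # I(A:B) = 2 bits
```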
49Quantum vs. Classical Mutual Info.
- 2 classical bit-systems have a mutual information
of at most one bit, - Occurs if they are perfectly correlated, e.g.,
00, 11 - Each bit considered by itself appears to have 1
bit of entropy. - But taken together, there is really only 1 bit
of entropy shared between them - A measurement of either extracts that one bit of
entropy, - Leaves it in the form of 1 bit of incompressible
information (to the measurer). - The real joint entropy is 1 bit less than the
apparent total entropy. - Thus, the mutual information is 1 bit.
- 2 quantum bit-systems (qubits) can have a mutual
info. of two bits! - Occurs in maximally entangled states, such as
00?11?. - Again, each qubit considered by itself appears to
have 1 bit of entropy. - But taken together, there is no entropy in this
pure state. - A measurement of either qubit leaves us with no
entropy, rather than 1 bit! - If done carefully see next slide.
- The real joint entropy is thus 2 bits less than
the apparent total entropy. - Thus the mutual information is (by definition) 2
bits. - Both of the apparent bits of entropy vanish if
either qubit is measured. - Used in a communication tech. called quantum
superdense coding. - 1 qubits worth of prior entanglement between two
parties can be used to pass 2 bits of classical
information between them using only 1 qubit!
50Why the Difference?
- Scenario: Entity A hasn't yet measured B and C,
which (A knows) are initially correlated with
each other, quantumly or classically - A has measured B and is now correlated with both
B and C - A can use his new knowledge to uncompute
(compress away) the bits from both B and C,
restoring them to a standard state
[Figure: entities in order A, B, C; classical case vs. quantum case]
Knowing he is in state |0⟩+|1⟩, A can unitarily
rotate himself back to state |0⟩. Look ma, no
entropy!
A, being in a mixed state, still holds a bit of
information that is either unknown (external
view) or incompressible (A's internal view), and
thus is entropy, and can never go away (by the
2nd law of thermo.).
51 Simulating the Schrödinger Wave Equation
- A Perfectly Reversible Discrete Numerical
Simulation Technique
52Simulating Wave Mechanics
- The basic problem situation
- Given
- A (possibly complex) initial wavefunction ψ(x, t0)
in an N-dimensional position basis,
and - a (possibly complex and time-varying) potential
energy function V(x, t), - a time t after (or before) t0,
- Compute
- the wavefunction ψ(x, t) at that time t
- Many practical physics applications...
53The Problem with the Problem
- An efficient technique (when possible)
- Convert V to the corresponding Hamiltonian H.
- Find the energy eigenstates of H.
- Project ψ onto eigenstate basis.
- Multiply each component by its phase factor e^(−iEt/ħ).
- Project back onto position basis.
- Problem
- It may be intractable to find the eigenstates!
- We resort to numerical methods...
54History of Reversible Schrödinger Sim.
See http://www.cise.ufl.edu/mpf/sch
- Technique discovered by Ed Fredkin and student
William Barton at MIT in 1975. - Subsequently proved by Feynman to exactly
conserve a certain probability measure - Pt Rt2 It?1It1
- 1-D simulations in C/Xlib written by Frank at MIT
in 1996. Good behavior observed. - 1- and 2-D simulations in Java, and proof of
stability by Motter at UF in 2000. - User-friendly Java GUI by Holz at UF, 2002.
(R = real, I = imag., t = time step index)
55Difference Equations
- Consider any system with state x that evolves
according to a diff. eq. that is 1st-order in
time: dx/dt = f(x) - Discretize time to finite scale Δt, and use a
difference equation instead: x(t + Δt) = x(t) +
Δt f(x(t)) - Problem: Behavior not always numerically stable.
- Errors can accumulate and grow exponentially.
56Centered Difference Equations
- Discretize derivatives in a symmetric fashion
- Leads to update rules like x(t + Δt) = x(t −
Δt) + 2Δt f(x(t)) - Problem: States at odd- vs. even-numbered time
steps not constrained to stay close to each other!
[Figure: chain of states x1 → x2 → x3 → x4, each linked by the update map g with step factor 2Δt·f]
57Centered Schrödinger Equation
- Schrödinger's equation for 1 particle in 1-D
- Replace time (& also space) derivatives with
centered differences. - Centered difference equation has real part at odd
times that depends only on imaginary part at even
times, & vice-versa. - Drift not an issue - real & imaginary parts
represent different state components!
[Figure: leapfrog update chain R1 → I2 → R3 → I4 → …; real and imaginary parts are updated on alternating time steps. A numpy sketch of this update follows below.]
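A minimal 1-D numpy sketch of this kind of leapfrog, centered-difference update, with the real part stored on whole time steps and the imaginary part on half steps. The grid parameters, units (ħ = m = 1), barrier, and Gaussian initial packet are illustrative assumptions only; this is a sketch of the idea, not the course's SCH simulator.

```python
import numpy as np

# Illustrative grid and units (hbar = m = 1).
N, dx, dt = 400, 0.1, 0.002
x = np.arange(N) * dx
V = np.zeros(N)
V[250:255] = 8.0                     # a small, sharp potential-energy barrier

def H(f):
    """Apply H = -(1/2) d^2/dx^2 + V with periodic boundaries (discrete Laplacian)."""
    lap = (np.roll(f, -1) - 2 * f + np.roll(f, 1)) / dx**2
    return -0.5 * lap + V * f

# Gaussian wave packet moving to the right with momentum k0.
x0, sigma, k0 = 10.0, 1.0, 5.0
psi0 = np.exp(-((x - x0) ** 2) / (4 * sigma**2)) * np.exp(1j * k0 * x)
psi0 /= np.sqrt(np.sum(np.abs(psi0) ** 2) * dx)

# Leapfrog: R on whole steps, I on half steps (the initial half-step offset of I
# is neglected here for simplicity).
R, I = psi0.real.copy(), psi0.imag.copy()
for _ in range(2000):
    R += dt * H(I)     # R(t+dt)    = R(t)      + dt * H I(t+dt/2)
    I -= dt * H(R)     # I(t+3dt/2) = I(t+dt/2) - dt * H R(t+dt)

prob = R**2 + I**2                               # approximate probability density
print("total probability ~", np.sum(prob) * dx)  # conserved to good accuracy
```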
58Proof of Stability
- Technique is proved perfectly numerically stable
& convergent assuming V is 0 and Δx²/Δt > ħ/m
(an angular velocity) - Elements of proof
- Lax-Richtmyer equivalence: convergence ⇔ stability.
- Analyze amplitudes of Fourier-transformed basis
- This is sufficient due to Parseval's relation
- Use theorem (cf. Strikwerda) equating stability
to certain conditions on the roots of an
amplification polynomial Φ(g,θ), which are
satisfied by our rule. - Empirically, technique looks perfectly stable
even for more complex potential energy funcs.
59Phenomena Observed in Model
- Perfect reversibility
- Wave packet momentum
- Conservation of probability mass
- Harmonic oscillator
- Tunneling/reflection at potential energy barriers
- Interference fringes
- Diffraction
60Interesting Features of this Model
- Can be implemented perfectly reversibly, with
zero asymptotic spacetime overhead - Every last bit is accounted for!
- As a result, algorithm can run adiabatically,
with power dissipation approaching zero - Modulo leakage frictional losses
- Can map it to a unitary quantum algorithm
- Direct mapping
- Classical reversible ops only, no quantum speedup
- Indirect (implicit) mapping
- Simulate p particles on k^d lattice sites using pd
lg k qubits - Time per update step is order pd lg k instead of
k^(pd)