CX4240 Homework 1
Le Song
Deadline: Thursday 2/06, 9:30 am (before the start of class)

• No unapproved extension of the deadline is allowed. Late submissions will receive 0 credit.
• Typing with LaTeX is highly recommended. Typing with MS Word is also acceptable. If you handwrite, be as clear as possible. No credit may be given for unreadable handwriting.
• Explicitly mention your collaborators, if any.

1 Probability

On the morning of September 30, 1982, the won-lost records of the three leading baseball teams in the
western division of the National League of the United States were as follows:
Team                  Won  Lost
Atlanta Braves         87    72
San Francisco Giants   86    73
Los Angeles Dodgers    86    73

Each team had 3 games remaining to be played. All 3 of the Giants games were with the Dodgers, and
the 3 remaining games of the Braves were against the San Diego Padres. Suppose that the outcomes of all
remaining games are independent and each game is equally likely to be won by either participant. If two
teams tie for first place, they have a playoff game, which each team has an equal chance of winning.
(a) What is the probability that the Atlanta Braves win the division? [5 pts]
(b) What is the probability that the San Francisco Giants win the division? [5 pts]
(c) What is the probability that the Los Angeles Dodgers win the division? [5 pts]
(d) What is the probability of an additional playoff game? [5 pts]
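A short Monte Carlo simulation is one way to sanity-check your analytic answers to (a)-(d). The Python sketch below is illustrative only (not part of the required submission, and not a substitute for the derivation); it simply encodes the stated assumptions as fair coin flips.

```python
import random

# Monte Carlo sanity check for parts (a)-(d). Assumptions match the problem
# statement: each remaining game is a fair coin flip, the Giants' 3 games are
# all against the Dodgers, the Braves' 3 are against the Padres, and a two-way
# tie for first place is broken by one fair playoff game.

def simulate(trials=200_000, seed=0):
    rng = random.Random(seed)
    wins = {"ATL": 0, "SF": 0, "LA": 0}
    playoffs = 0
    for _ in range(trials):
        atl = 87 + sum(rng.random() < 0.5 for _ in range(3))  # Braves vs. Padres
        g = sum(rng.random() < 0.5 for _ in range(3))         # Giants' wins vs. Dodgers
        sf, la = 86 + g, 86 + (3 - g)
        best = max(atl, sf, la)
        leaders = [t for t, w in (("ATL", atl), ("SF", sf), ("LA", la)) if w == best]
        if len(leaders) > 1:   # SF and LA can never tie: they play each other
            playoffs += 1
            wins[rng.choice(leaders)] += 1
        else:
            wins[leaders[0]] += 1
    return {t: w / trials for t, w in wins.items()}, playoffs / trials

probs, p_playoff = simulate()   # compare these estimates with your analytic answers
```

With 200,000 trials the estimates should match the exact probabilities to about two decimal places.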

2 Maximum Likelihood

Suppose we have n i.i.d. (independent and identically distributed) data samples from each of the following
probability distributions. For each one, this problem asks you to build the log-likelihood function and find
the maximum likelihood estimator of the parameter(s).

(a) Poisson distribution [5 pts]
The Poisson distribution is defined as

P(x_i = k) = λ^k e^{−λ} / k!,  k = 0, 1, 2, ….

What is the maximum likelihood estimator of λ?
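One way to sanity-check a closed-form MLE is to maximize the log-likelihood numerically and compare. The illustrative Python sketch below does this for the Poisson case with a simple grid search; the same pattern adapts directly to parts (b) and (c). The sampler, sample size, and grid bounds are arbitrary choices for the demonstration.

```python
import math
import random

# Draw Poisson samples without external libraries (Knuth's method),
# then find the grid point maximizing the log-likelihood.

def sample_poisson(lam, rng):
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def poisson_mle_grid(samples, grid):
    n, s = len(samples), sum(samples)
    # log-likelihood up to the constant -sum(log k_i!), which does not
    # affect the location of the maximum
    return max(grid, key=lambda lam: s * math.log(lam) - n * lam)

rng = random.Random(42)
data = [sample_poisson(3.0, rng) for _ in range(5000)]
grid = [0.01 * i for i in range(1, 1001)]   # candidate λ in (0, 10]
lam_hat = poisson_mle_grid(data, grid)      # compare with your closed form
```

The grid maximizer should agree with your analytic estimator to within the grid spacing.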
(b) Exponential distribution [5 pts]
The probability density function of the Exponential distribution is given by

f(x) = λ e^{−λx} for x ≥ 0,  and  f(x) = 0 for x < 0.

What is the maximum likelihood estimator of λ?
(c) Gaussian normal distribution [10 pts]
Suppose we have n i.i.d. data samples from a univariate Gaussian normal distribution N(μ, σ²), whose
density is given by

N(x; μ, σ²) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²)).
What is the maximum likelihood estimator of μ and σ²?
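The same style of numerical check extends to two parameters. Below is an illustrative grid search over (μ, σ²); expanding ∑(x − μ)² = ∑x² − 2μ∑x + nμ² makes each grid evaluation O(1). The data-generating parameters and grid ranges are arbitrary.

```python
import math
import random

# Grid-search the Gaussian log-likelihood in (mu, var) jointly and compare
# the maximizer against your closed-form estimators.

def gaussian_mle_grid(xs, mu_grid, var_grid):
    n = len(xs)
    s1 = sum(xs)
    s2 = sum(x * x for x in xs)
    def loglik(mu, var):
        sq = s2 - 2 * mu * s1 + n * mu * mu   # = sum((x - mu)^2), in O(1)
        return -0.5 * n * math.log(2 * math.pi * var) - sq / (2 * var)
    return max(((m, v) for m in mu_grid for v in var_grid),
               key=lambda p: loglik(*p))

rng = random.Random(7)
xs = [rng.gauss(1.5, 2.0) for _ in range(2000)]      # true mu = 1.5, sigma = 2
mu_grid = [-5 + 0.02 * i for i in range(501)]        # mu in [-5, 5]
var_grid = [0.05 + 0.02 * i for i in range(498)]     # sigma^2 in [0.05, 9.99]
mu_hat, var_hat = gaussian_mle_grid(xs, mu_grid, var_grid)
```

Again, the grid maximizer should sit within one grid step of your analytic answer.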

3 Principal Component Analysis

In class, we learned that Principal Component Analysis (PCA) preserves variance as much as possible. We
are going to explore another way of deriving it: minimizing reconstruction error.
Consider data points x^n (n = 1, …, N) in D-dimensional space. We are going to represent them in the
coordinates of an orthonormal basis {u_1, …, u_D}. That is,

x^n = ∑_{i=1}^{D} α_i^n u_i = ∑_{i=1}^{D} (x^{nT} u_i) u_i.

Here, α_i^n is the length of the projection of x^n onto u_i.
Suppose we want to reduce the dimension from D to M < D. Then the data point x^n is approximated by

x̃^n = ∑_{i=1}^{M} z_i^n u_i + ∑_{i=M+1}^{D} b_i u_i.

In this representation, the first M directions u_i are allowed to have a different coefficient z_i^n for each
data point, while the remaining directions share a constant coefficient b_i. As long as b_i takes the same
value for all data points, it does not need to be 0.
Our goal is to set u_i, z_i^n, and b_i for n = 1, …, N and i = 1, …, D so as to minimize the reconstruction
error. That is, we want to minimize the difference between x^n and x̃^n:

J = (1/N) ∑_{n=1}^{N} ‖x^n − x̃^n‖₂².
(a) What is the assignment of z_j^n for j = 1, …, M minimizing J? [5 pts]
(b) What is the assignment of b_j for j = M + 1, …, D minimizing J? [5 pts]
(c) Express the optimal x̃^n and x^n − x̃^n using your answers for (a) and (b). [2 pts]
(d) What should u_i be for i = 1, …, D to minimize J? [8 pts]
Hint: Use S = (1/N) ∑_{n=1}^{N} (x^n − x̄)(x^n − x̄)^T for the sample covariance matrix.
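Once you have derived (a)-(d), you can verify the conclusion numerically. The numpy sketch below (synthetic data, illustrative only) reconstructs each x^n from M directions plus the mean term, and compares J for the top-M eigenvectors of S against a randomly chosen orthonormal M-frame.

```python
import numpy as np

# Numerical check of the reconstruction-error view of PCA: project centered
# data onto M directions, add back the mean component, and compare J for the
# top-M covariance eigenvectors against random orthonormal directions.
# Synthetic data only; the mean term plays the role of the constant b_i part.

rng = np.random.default_rng(0)
N, D, M = 500, 10, 3
X = rng.normal(size=(N, D)) @ rng.normal(size=(D, D))  # correlated data

xbar = X.mean(axis=0)
S = (X - xbar).T @ (X - xbar) / N                      # sample covariance
evals, evecs = np.linalg.eigh(S)                       # eigenvalues, ascending

def recon_error(U):
    # reconstruct each x^n from its projection onto span(U) plus the mean
    Xc = X - xbar
    Xrec = xbar + Xc @ U @ U.T
    return np.mean(np.sum((X - Xrec) ** 2, axis=1))

U_top = evecs[:, -M:]                                  # top-M eigenvectors
Q, _ = np.linalg.qr(rng.normal(size=(D, M)))           # random orthonormal frame
J_top, J_rand = recon_error(U_top), recon_error(Q)
# J_top should equal the sum of the D-M smallest eigenvalues of S,
# and should never exceed J_rand.
```

Seeing that J with the top-M eigenvectors equals the sum of the discarded eigenvalues is a useful cross-check on your answer to (d).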

4 Image Compression using Principal Component Analysis

For this section, you will use PCA to perform dimensionality reduction on the given dataset (q4.mat).
This dataset contains vectorized grayscale photos of all members of the class. The file contains a matrix
'faces' of size 62 × 4500, with one row for each of the 59 students in the class (as well as the 2 TAs and
the professor). You are to use Principal Component Analysis to perform image compression.
• Submit a plot of the eigenvalues in ascending order (visualize the increase of the eigenvalues across all
eigenvectors).
• Select a cutoff to choose the top n eigenfaces (or eigenvectors) based on the graph. Discuss the reasoning
for choosing this cutoff.
• For your chosen eigenfaces, calculate the reconstruction error (the squared distance between the original
and reconstructed images) for the first two images in the dataset. (They are images of the two TAs.)
• Vary the number of eigenfaces to view the differences in reconstruction error and in the quality of the
image. Use imshow() to display the two images for your chosen n eigenfaces. Attach the two images.
Hint: Use the Matlab function eig or eigs to calculate the eigenvalues and eigenvectors. For reconstructing
the images, you can convert the row vectors to matrices using reshape(rowVector, 75, 60).
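The assignment expects Matlab, but for orientation here is a hypothetical numpy version of the same pipeline. Random data stands in for q4.mat (load the real matrix with scipy.io.loadmat), and n_eig = 20 is a placeholder cutoff that you would instead read off your eigenvalue plot.

```python
import numpy as np

# Hypothetical numpy sketch of the eigenface pipeline. The random 'faces'
# matrix below is a stand-in for the real 62 x 4500 matrix in q4.mat.

rng = np.random.default_rng(1)
faces = rng.normal(size=(62, 4500))            # stand-in for the real data

mean_face = faces.mean(axis=0)
centered = faces - mean_face

# With only 62 images, at most 61 covariance eigenvalues are nonzero, so an
# economy SVD of the centered data gives the same eigenvectors as eig on the
# full 4500 x 4500 covariance matrix, far more cheaply.
U, sing, Vt = np.linalg.svd(centered, full_matrices=False)
eigvals = sing ** 2 / faces.shape[0]           # covariance eigenvalues, descending
                                               # (reverse for the ascending plot)

n_eig = 20                                     # placeholder cutoff
top = Vt[:n_eig]                               # top-n eigenfaces, one per row
coeffs = centered @ top.T                      # compressed representation (62 x n_eig)
recon = mean_face + coeffs @ top               # reconstructed images (62 x 4500)
errors = np.sum((faces - recon) ** 2, axis=1)  # squared error per image
```

In Matlab, eig/eigs on the covariance matrix followed by reshape(rowVector, 75, 60) and imshow() perform the equivalent steps for viewing a reconstruction.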
