EECE5644 Assignment
Please submit in Canvas a single PDF file showing all results, and include your code as an appendix, as a separate ZIP file, or as an external repository link in the PDF file. Cite all sources you benefit from. Directly exchanging code/results between classmates is not acceptable, but you may talk to each other in class and during office hours, and benefit from each other's questions and answers there.
Question 1 (60%)
Train and test Support Vector Machine (SVM) and Multi-layer Perceptron (MLP) classifiers that aim for minimum probability of classification error (i.e., we are using 0-1 loss; all error instances are equally bad). You may use a trusted implementation of training, validation, and testing in your choice of programming language. The SVM should use a Gaussian (sometimes called radial-basis) kernel. The MLP should be a single-hidden-layer model with your choice of activation functions for all perceptrons.
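For example, model selection and training could be carried out with a trusted library. The sketch below is one hypothetical way to do this in Python with scikit-learn; the library choice, the hyperparameter grids, and the use of 10-fold cross-validation are illustrative assumptions, not requirements of the assignment.

```python
# Hypothetical sketch using scikit-learn as the "trusted implementation".
# The hyperparameter grids and 10-fold cross-validation are illustrative choices.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

def select_svm(X_train, y_train):
    # Gaussian (RBF) kernel SVM; tune box constraint C and kernel width gamma
    # by cross-validation on the training set (accuracy = 1 - error probability).
    grid = {"C": np.logspace(-2, 3, 6), "gamma": np.logspace(-3, 1, 5)}
    search = GridSearchCV(SVC(kernel="rbf"), grid, cv=10)
    search.fit(X_train, y_train)
    return search.best_estimator_

def select_mlp(X_train, y_train):
    # Single-hidden-layer MLP; tune the number of perceptrons in the hidden layer.
    grid = {"hidden_layer_sizes": [(p,) for p in (2, 4, 8, 16, 32)]}
    search = GridSearchCV(MLPClassifier(activation="relu", max_iter=2000), grid, cv=10)
    search.fit(X_train, y_train)
    return search.best_estimator_
```

The selected models would then be evaluated on the independent test set to estimate the probability of error.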
Generate 1000 independent and identically distributed (iid) samples for training and 10000 iid
samples for testing. All data for class l ∈ {−1,+1} should be generated as follows:
\[
x = r_l
\begin{bmatrix}
\cos(\theta) \\
\sin(\theta)
\end{bmatrix}
+ n
\tag{1}
\]
where θ ∼ Uniform[−π, π] and n ∼ N(0, σ²I). Use r₋₁ = 2, r₊₁ = 4, σ = 1.
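As a concrete illustration, the data in equation (1) could be generated as in the minimal NumPy sketch below. It assumes equal class priors (the assignment does not specify them); the function and variable names are illustrative.

```python
# Minimal sketch of the data generator in equation (1).
# Equal class priors are an assumption; the assignment does not specify them.
import numpy as np

def generate_data(n_samples, rng, r_minus=2.0, r_plus=4.0, sigma=1.0):
    # Class labels l in {-1, +1}, drawn with equal probability (assumption).
    labels = rng.choice([-1, +1], size=n_samples)
    radii = np.where(labels == -1, r_minus, r_plus)
    # theta ~ Uniform[-pi, pi], n ~ N(0, sigma^2 I) in 2D.
    theta = rng.uniform(-np.pi, np.pi, size=n_samples)
    noise = sigma * rng.standard_normal((n_samples, 2))
    X = radii[:, None] * np.column_stack((np.cos(theta), np.sin(theta))) + noise
    return X, labels

rng = np.random.default_rng(0)
X_train, y_train = generate_data(1000, rng)    # 1000 iid training samples
X_test, y_test = generate_data(10000, rng)     # 10000 iid test samples
```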
Note: The two class sample sets will form two highly overlapping concentric disks, and due to angular symmetry we anticipate the best classification boundary to be a circle between the two disks. Your SVM and MLP models will try to approximate it. Since the optimal boundary is expected to be a quadratic curve, quadratic polynomial activation functions in the hidden layer of the MLP may be considered an appropriate modeling choice. If you have time (optional, not required for the assignment), experiment with different activation function choices to see the effect of this decision.
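For the optional experiment, one way to prototype a quadratic hidden-layer activation is with a small custom model. The PyTorch sketch below is purely illustrative; the framework, layer sizes, and loss pairing are my assumptions, and any framework that allows custom activations would work equally well.

```python
# Illustrative only: single-hidden-layer MLP with an elementwise quadratic activation.
# PyTorch and the layer sizes here are assumptions, not requirements of the assignment.
import torch
import torch.nn as nn

class QuadraticMLP(nn.Module):
    def __init__(self, n_hidden=8):
        super().__init__()
        self.hidden = nn.Linear(2, n_hidden)   # single hidden layer on 2D inputs
        self.output = nn.Linear(n_hidden, 1)   # one logit for the two classes

    def forward(self, x):
        h = self.hidden(x) ** 2                # elementwise quadratic activation
        return self.output(h)                  # pair with BCEWithLogitsLoss,
                                               # mapping labels {-1,+1} to {0,1}
```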