CS 486/686 Assignment 3
(135 marks)
Blake VanBerlo
Due Date: 11:59 PM ET on Wednesday, March 23, 2022
Changes
• v1.1: Small changes to hyperparameters in Q2.2a and Q2.2b. Fixed some typos.
• v1.2: Fixed typos in function names.
• v1.3: Added instructions for calculating the average cross-entropy loss.
• v1.4: Updated questions Q2.2a and Q2.2b to improve clarity.
CS 486/686 Winter 2022 Assignment 3
Academic Integrity Statement
If your written submission on Learn does not include this academic integrity statement with
your signature (typed name), we will deduct 5 marks from your final assignment mark.
I declare the following statements to be true:
• The work I submit here is entirely my own.
• I have not shared and will not share any of my code with anyone at any point.
• I have not posted and will not post my code on any public or private forum or website.
• I have not discussed and will not discuss the contents of this assessment with anyone
at any point.
• I have not posted and will not post the contents of this assessment and its solutions
on any public or private forum or website.
• I will not search for assessment solutions online.
• I am aware that misconduct related to assessments can result in significant penalties,
possibly including failure in the course and suspension.
Failure to accept the integrity policy will result in your assignment not being graded.
By typing or writing my full legal name below, I confirm that I have read and understood
the academic integrity statement above.
©Blake VanBerlo 2022 v1.4
Instructions
• Submit any written solutions in a file named writeup.pdf to the A3 Dropbox on Learn.
If your written submission on Learn does not contain one file named writeup.pdf, we
will deduct 5 marks from your final assignment mark.
• No late assignment will be accepted. This assignment is to be done individually.
• I strongly encourage you to complete your write-up in LaTeX, using this source file.
If you do, in your submission, please replace the author with your name and student
number. Please also remove the due date, the Instructions section, and the Learning
goals section. Thanks!
• Lead TAs:
Learning goals
Decision trees
• Compute the entropy of a probability distribution.
• Trace the execution of the algorithm for learning a decision tree.
• Determine valid splits for real-valued features.
• Apply overfitting prevention strategies for decision trees.
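As a quick illustration of the first goal above, the entropy of a discrete probability distribution can be sketched as follows. This is a minimal example assuming Python with NumPy; the assignment does not necessarily require this exact interface, and the function name is my own.

```python
import numpy as np

def entropy(probs):
    """Entropy (in bits) of a discrete probability distribution."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]  # convention: 0 * log2(0) is treated as 0
    return float(-np.sum(p * np.log2(p)))

# A fair coin carries exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0
```

A uniform distribution maximizes entropy for a given number of outcomes, while a point-mass distribution (one outcome with probability 1) has entropy 0; this is the quantity used to score candidate splits when learning a decision tree.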
Neural networks
• Implement a multilayer perceptron.
• Implement the backpropagation algorithm, including the forward and backward passes.
• Understand and interpret performance metrics in supervised learning.
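The goals above mention the forward pass of a multilayer perceptron and (per the v1.3 change note) the average cross-entropy loss. A minimal sketch of both is below; the function names, sigmoid activations, and array shapes are my own assumptions for illustration, not the assignment's required interface.

```python
import numpy as np

def sigmoid(z):
    """Elementwise logistic sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer MLP with sigmoid activations.

    X: (n_examples, n_features); W1, b1 and W2, b2 are the layer weights/biases.
    Returns output probabilities of shape (n_examples, n_outputs).
    """
    h = sigmoid(X @ W1 + b1)     # hidden activations
    return sigmoid(h @ W2 + b2)  # output probabilities

def average_cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy loss averaged over all examples."""
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    return float(-np.mean(y_true * np.log(y_pred)
                          + (1 - y_true) * np.log(1 - y_pred)))
```

Averaging (rather than summing) the per-example losses makes the value comparable across datasets of different sizes; for a prediction of 0.5 on a positive example, the loss is ln 2 ≈ 0.693.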