ISE 530: Optimization for Analytics
1. Compute the square root of 1013 (i.e., find a root of g(x) = x^2 − 1013) using:
   (a) Binary search (compute to 1 decimal digit of precision). Use a lower bound of 10 and an upper bound of 50 in the initial iteration.
   (b) Newton's method.
   Show the details of all your iterations in both cases.
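   For reference, a minimal Python sketch of both schemes (the starting point of 50 for Newton's method is an assumption; the problem does not fix one):

    import math

    # Bisection for a root of g(x) = x^2 - 1013 on [10, 50]: g(10) < 0 < g(50).
    def g(x):
        return x**2 - 1013

    lo, hi = 10.0, 50.0
    while hi - lo > 0.1:                 # stop at 1 decimal digit of precision
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid                     # root lies in the lower half
        else:
            lo = mid                     # root lies in the upper half
    print("bisection estimate:", round(0.5 * (lo + hi), 1))

    # Newton's method: x_{k+1} = x_k - g(x_k) / g'(x_k), with g'(x) = 2x.
    x = 50.0                             # assumed starting point
    for k in range(5):
        x = x - g(x) / (2 * x)
        print(f"Newton iteration {k + 1}: x = {x:.6f}")

    print("check: math.sqrt(1013) =", math.sqrt(1013))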
2. Find an optimal solution of the one-dimensional optimization problem
       $\min_x \; f(x) = x^2 + \dfrac{1}{1 + e^{-1-2x}}$,
   i.e., find a point where f'(x) = 0, using either binary search or Newton's method.
Detail each iteration of the algorithm.
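   As a numerical illustration (not the required hand computation), a Newton sketch applied to f'(x) = 0, written in terms of the sigmoid σ(t) = 1/(1 + e^{−t}); the starting point x0 = 0 is an assumption:

    import math

    def sigma(t):                       # sigma(t) = 1 / (1 + exp(-t))
        return 1.0 / (1.0 + math.exp(-t))

    def fprime(x):                      # f'(x) for f(x) = x^2 + sigma(1 + 2x)
        s = sigma(1 + 2 * x)
        return 2 * x + 2 * s * (1 - s)

    def fsecond(x):                     # f''(x), positive everywhere
        s = sigma(1 + 2 * x)
        return 2 + 4 * s * (1 - s) * (1 - 2 * s)

    x = 0.0                             # assumed starting point
    for k in range(6):
        x = x - fprime(x) / fsecond(x)  # Newton step on f'(x) = 0
        print(f"iteration {k + 1}: x = {x:.8f}, f'(x) = {fprime(x):.2e}")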
3. Consider the lasso problem
       $\min_x \; \|y - Ax\|_2^2 + \lambda \sum_{i=1}^{n} |x_i|$.
The parameter λ needs to be chosen carefully in practice such that the solution is
sparse but still fits the data well.
   In particular, using the diabetes dataset (diabetes.dat) on Blackboard, use binary search to find the largest value of λ such that the R^2 value is at least 0.5. You may assume that this value lies in the interval 0 ≤ λ ≤ 100. You can use the file (lasso.mod) to solve this question.
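   The intended workflow presumably goes through lasso.mod; purely to illustrate the binary-search idea, here is a rough Python/scikit-learn sketch. The data layout (whitespace-separated, response in the last column), the rescaling of λ into scikit-learn's α, and the assumption that R^2 is nonincreasing in λ are all assumptions, not part of the problem:

    import numpy as np
    from sklearn.linear_model import Lasso
    from sklearn.metrics import r2_score

    data = np.loadtxt("diabetes.dat")        # assumed layout: response in last column
    A, y = data[:, :-1], data[:, -1]

    def r2_for(lam):
        # scikit-learn's Lasso minimizes (1/(2m))*||y - Ax||^2 + alpha*||x||_1,
        # so alpha = lam / (2m) matches the objective in the problem statement.
        m = A.shape[0]
        model = Lasso(alpha=lam / (2 * m), max_iter=100000)
        model.fit(A, y)
        return r2_score(y, model.predict(A))

    # Binary search for the largest lambda in [0, 100] with R^2 >= 0.5,
    # assuming R^2 does not increase as lambda grows.
    lo, hi = 0.0, 100.0
    while hi - lo > 1e-3:
        mid = 0.5 * (lo + hi)
        if r2_for(mid) >= 0.5:
            lo = mid                         # feasible: try a larger lambda
        else:
            hi = mid                         # infeasible: shrink lambda
    print("largest feasible lambda is approximately", lo)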
4. Find an optimal solution of the optimization problem
       $\min_x \; f(x) = (x_1 + x_2)^2 + (3 - 2x_1 + x_2)^2 + (1 + 2x_2)^2$
by setting the gradient to 0.
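   A quick symbolic check of the stationarity condition (a SymPy sketch, not a substitute for the hand derivation):

    import sympy as sp

    x1, x2 = sp.symbols('x1 x2')
    f = (x1 + x2)**2 + (3 - 2*x1 + x2)**2 + (1 + 2*x2)**2

    # Stationarity: grad f = 0. Since f is a convex quadratic (its Hessian is
    # positive definite), the stationary point is the global minimizer.
    grad = [sp.diff(f, v) for v in (x1, x2)]
    sol = sp.solve(grad, [x1, x2], dict=True)[0]
    print("gradient:", [sp.expand(gi) for gi in grad])
    print("minimizer:", sol, " optimal value:", sp.simplify(f.subs(sol)))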
5. Starting at the point (0, 0), do three iterations of gradient descent for the problem
       $\min_x \; f(x) = (x_1 + x_2)^2 + (3 - 2x_1 + x_2)^2 + (1 + 2x_2)^2.$
   At each iteration, compute the step size by solving the one-dimensional problem
       $\min_{\lambda} \; f(x^k + \lambda d^k)$
   for the current point x^k and descent direction d^k.
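   For reference, a minimal NumPy sketch of the three steps; the closed-form step length uses the fact that f is quadratic with constant Hessian (written out below), so this is only a numerical check of the hand computation:

    import numpy as np

    def f(x):
        x1, x2 = x
        return (x1 + x2)**2 + (3 - 2*x1 + x2)**2 + (1 + 2*x2)**2

    def grad(x):
        x1, x2 = x
        t1, t2, t3 = x1 + x2, 3 - 2*x1 + x2, 1 + 2*x2
        return np.array([2*t1 - 4*t2, 2*t1 + 2*t2 + 4*t3])

    H = np.array([[10.0, -2.0],              # constant Hessian of f
                  [-2.0, 12.0]])

    x = np.array([0.0, 0.0])
    for k in range(3):
        g = grad(x)
        d = -g                               # steepest-descent direction
        lam = (g @ g) / (d @ H @ d)          # exact minimizer of f(x + lam*d)
        x = x + lam * d
        print(f"iteration {k + 1}: lambda = {lam:.6f}, x = {x}, f(x) = {f(x):.6f}")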
6. Solve the problem
       $\min_x \; f(x) = (x_1 + x_2)^2 + (3 - 2x_1 + x_2)^2 + (1 + 2x_2)^2$
   using Newton's method.
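   For a strictly convex quadratic, a single Newton step x − H^{-1}∇f(x) reaches the minimizer from any starting point; a short NumPy sketch (the starting point (0, 0) is an assumption):

    import numpy as np

    def grad(x):                             # gradient of f above
        x1, x2 = x
        t1, t2, t3 = x1 + x2, 3 - 2*x1 + x2, 1 + 2*x2
        return np.array([2*t1 - 4*t2, 2*t1 + 2*t2 + 4*t3])

    H = np.array([[10.0, -2.0],              # constant Hessian of f
                  [-2.0, 12.0]])

    x = np.array([0.0, 0.0])                 # assumed starting point
    x = x - np.linalg.solve(H, grad(x))      # Newton step: x - H^{-1} grad(x)
    print("Newton minimizer:", x, " gradient there:", grad(x))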