Q1. What is the state space of local search for a CSP?

Answer: It is the space of complete assignments to the variables of the CSP.

Q2. What are two limitations of local search?

Answer: First, it is not guaranteed to find a solution even if one exists (it is not complete). Second, it cannot show that no solution exists.

Q3. A plateau is a set of connected states {n1, …, nk} with h(n1) = h(n2) = … = h(nk). Why are plateaus a problem for local search?

Answer: Because local search relies on h(n) when deciding which neighbour to choose. If all neighbours have the same heuristic value, the heuristic gives no guidance, so the search wanders aimlessly on the plateau and may never leave it.

Q4. Why does randomization help in local search?

Answer: Randomness helps the search escape local minima (and plateaus) in which purely greedy moves would otherwise get stuck.

Q5. Explain how the next state is chosen in simulated annealing.

Answer: First we pick a variable at random and assign it a random value. If the resulting state is an improvement over the current state, we adopt it; otherwise we adopt it with probability e^((h(n) - h(n'))/T), where n is the current state, n' is the proposed state, and T is the temperature parameter.
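The acceptance rule above can be sketched as follows. This is a minimal illustration, not the full annealing loop: it assumes a minimization setting where h counts constraint violations, and the function names (`anneal_step`, `neighbours`) are illustrative, not from the original notes.

```python
import math
import random

def anneal_step(state, h, neighbours, T):
    """One simulated-annealing step: propose a random neighbour,
    accept it if it improves h, otherwise accept it with
    probability exp((h(n) - h(n')) / T)."""
    candidate = random.choice(neighbours(state))
    delta = h(state) - h(candidate)  # positive if candidate is better
    if delta >= 0 or random.random() < math.exp(delta / T):
        return candidate
    return state
```

Note that worsening moves (delta < 0) are accepted with probability below 1, and the probability shrinks as T decreases, so the search becomes greedier over time.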

Q6. How would you change a local search algorithm to better find global maxima that are surrounded by local maxima? Assume that the neighbor relation is fixed.

Answer: We can add randomization: random restarts and random walks help the search escape local maxima.

Q7. How would you change a local search algorithm to better find global maxima that are surrounded by plateaus? Assume that the neighbor relation is fixed.

Answer: Allow sideways and random steps so the search can drift across a plateau instead of stalling; random restarts can also help when plateaus are wide.

Q8. How would you design a local search algorithm for a problem where there are no plateaus and no local minima that are not global minima? Assume that the neighbor relation is fixed.

Answer: Greedy descent alone suffices, with no randomization: since every local minimum is a global minimum and there are no plateaus, following improving moves always leads to a solution.
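A minimal sketch of pure greedy descent under these assumptions (the function names and the integer example are illustrative, not from the original notes):

```python
def greedy_descent(state, h, neighbours):
    """Repeatedly move to the best neighbour while it strictly
    improves h; stop at a local minimum, which by assumption
    is also a global minimum."""
    while True:
        best = min(neighbours(state), key=h)
        if h(best) >= h(state):
            return state
        state = best
```

For example, minimizing h(x) = (x - 3)^2 over the integers with neighbours x - 1 and x + 1 descends directly to x = 3.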

Q9. What is the difference between a random step and a random restart?

Answer: In a random step (random walk), you move to a randomly chosen neighbour, changing only one variable. In a random restart, you restart the search by randomly reassigning values to all variables.
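The contrast can be shown concretely. A sketch assuming assignments are dicts from variables to values and `domains` maps each variable to its allowed values (both names are illustrative):

```python
import random

def random_step(assignment, domains):
    """Random step: change ONE randomly chosen variable to a random value."""
    new = dict(assignment)
    var = random.choice(list(new))
    new[var] = random.choice(domains[var])
    return new

def random_restart(assignment, domains):
    """Random restart: reassign ALL variables at random."""
    return {var: random.choice(domains[var]) for var in assignment}
```

A random step differs from the current assignment in at most one variable, while a random restart can change every variable at once.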

Q10. Consider two local search algorithms, A and B. A solves x% of the problems it is given within (.25x)^2 minutes. B solves y% of the problems it is given within y minutes. Is one algorithm always better than the other? If not, which algorithm would you use when?

Answer: Neither is always better. For A, the time to solve x% of problems is (0.25x)^2 = 0.0625x^2 minutes, and 0.0625x^2 < x exactly when x < 16. So A is faster when the target success rate is below 16%, and B is faster above 16%. Use A when quickly solving a small fraction of instances suffices (for example, when combined with restarts over many instances); use B when you need a high success rate.
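The crossover point can be checked with a couple of lines (the function names are illustrative):

```python
# Time for A to solve x% of its problems: (0.25 * x) ** 2 minutes.
def time_A(x):
    return (0.25 * x) ** 2

# Time for B to solve y% of its problems: y minutes.
def time_B(y):
    return y

# Setting 0.0625 * x**2 = x gives x = 16: below 16% A is faster,
# above 16% B is faster.
```

For instance, time_A(10) = 6.25 minutes versus time_B(10) = 10 minutes, but time_A(40) = 100 minutes versus time_B(40) = 40 minutes.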

Q11. What is the key weakness of stochastic local search?

Answer: With stochastic local search (SLS) we cannot show that no solution exists; it is incomplete.

Q12. In local search, how do we determine neighbours?

Answer: A neighbour is usually a small incremental change to the current assignment. For example, a neighbour might be an assignment in which one variable's value has been changed.
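The one-variable-change neighbour relation can be sketched as below, again assuming dict assignments and a `domains` map (both illustrative names):

```python
def neighbours(assignment, domains):
    """All assignments reachable by changing a single variable's value."""
    result = []
    for var, value in assignment.items():
        for alt in domains[var]:
            if alt != value:
                new = dict(assignment)
                new[var] = alt
                result.append(new)
    return result
```

Each variable contributes (domain size - 1) neighbours, so for an assignment over variables x and y with domains of sizes 2 and 3 there are 1 + 2 = 3 neighbours.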