
Optimizing a Simple Function with Simulated Annealing

Simulated annealing is a probabilistic technique for finding a good approximation to the global optimum of a given function. It's inspired by the annealing process in metallurgy, where a material is heated and then slowly cooled to minimize defects and achieve a low-energy state. This challenge asks you to implement simulated annealing to find the minimum of a simple, one-dimensional function.

Problem Description

You are tasked with implementing the simulated annealing algorithm in Python to find the minimum value of the function f(x) = (x - 2)^2 + 5. The algorithm should start at a random initial point, iteratively explore the neighborhood of the current point, and accept or reject moves based on a probability that depends on the change in energy (function value) and a temperature parameter. The temperature gradually decreases over time, reducing the probability of accepting worse solutions and guiding the algorithm towards a minimum.

Key Requirements:

  • Neighborhood Generation: The neighborhood of a point x is defined as x + random.uniform(-1, 1). This means a random value between -1 and 1 is added to the current x value.
  • Acceptance Probability: The probability of accepting a move to a new state x' is calculated using the Metropolis criterion: exp(-(f(x') - f(x)) / T), where f(x) and f(x') are the function values at the current and new states, respectively, and T is the temperature. If f(x') - f(x) <= 0 (the new state is better or equal), the move is always accepted.
  • Temperature Schedule: The temperature T should be decreased linearly over time. Start with an initial temperature T_initial and decrease it to T_final over num_iterations. The temperature at each iteration i can be calculated as T(i) = T_initial - (T_initial - T_final) * i / num_iterations.
  • Return Value: The function should return the best x value found during the annealing process.
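Putting the requirements above together, a minimal sketch might look like the following. The function name, signature, and the starting range of [-10, 10] are assumptions for illustration, not part of the specification:

```python
import math
import random

def f(x):
    """The provided objective: minimum value 5 at x = 2."""
    return (x - 2) ** 2 + 5

def simulated_annealing(t_initial, t_final, num_iterations):
    x = random.uniform(-10, 10)        # assumed starting range
    best_x, best_energy = x, f(x)
    for i in range(num_iterations):
        # Linear temperature schedule: T(i) = T_initial - (T_initial - T_final) * i / num_iterations
        t = t_initial - (t_initial - t_final) * i / num_iterations
        candidate = x + random.uniform(-1, 1)   # neighborhood generation
        delta = f(candidate) - f(x)
        # Metropolis criterion: always accept improvements (delta <= 0),
        # accept worse moves with probability exp(-delta / T).
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = candidate
            if f(x) < best_energy:
                best_x, best_energy = x, f(x)
    return best_x
```

Calling `simulated_annealing(100, 0.1, 1000)` should typically return a value close to 2, though the exact result varies run to run because the algorithm is probabilistic.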

Expected Behavior:

The algorithm should converge towards the minimum of the function f(x) = (x - 2)^2 + 5, which is at x = 2. The returned x value should be close to 2.

Edge Cases to Consider:

  • Initial Temperature: A high initial temperature allows for more exploration, while a low temperature focuses on exploitation.
  • Temperature Decay Rate: A faster decay rate can lead to premature convergence to a local minimum.
  • Number of Iterations: Insufficient iterations may prevent the algorithm from finding a good solution.
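To see how the decay rate behaves, the linear schedule from the requirements can be written as a standalone helper (the name `temperature` is an assumption):

```python
def temperature(i, t_initial, t_final, num_iterations):
    """Linear schedule: T(i) = T_initial - (T_initial - T_final) * i / num_iterations."""
    return t_initial - (t_initial - t_final) * i / num_iterations
```

With T_initial = 100, T_final = 0.1, and num_iterations = 1000, the temperature falls from 100 at i = 0 to 50.05 at the halfway point; fewer iterations make each step of the same decline larger, which is what drives premature convergence.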

Examples

Example 1:

Input: T_initial = 100, T_final = 0.1, num_iterations = 1000
Output: x ≈ 2.0 (within a reasonable tolerance, e.g., 0.1)
Explanation: The simulated annealing algorithm explores the function space, gradually decreasing the temperature to converge towards the minimum at x=2.

Example 2:

Input: T_initial = 50, T_final = 0.01, num_iterations = 500
Output: x ≈ 2.0 (within a reasonable tolerance)
Explanation: A lower initial temperature and fewer iterations can still find a good solution, though convergence may be slower.

Example 3: (Edge Case - Low Iterations)

Input: T_initial = 100, T_final = 0.1, num_iterations = 10
Output: x (likely further from the true minimum than with more iterations)
Explanation: With only 10 iterations, the algorithm has limited time to explore the function space, so the result might be less accurate.

Constraints

  • T_initial must be a positive floating-point number.
  • T_final must be a positive floating-point number less than T_initial.
  • num_iterations must be a positive integer.
  • The function f(x) = (x - 2)^2 + 5 is provided and should not be modified.
  • The algorithm should run within a reasonable time (e.g., less than 5 seconds).

Notes

  • The random.uniform() function should be used to generate random numbers for neighborhood generation.
  • The math.exp() function should be used to calculate the exponential term in the Metropolis criterion.
  • Consider using a small tolerance when comparing the final x value to the expected minimum (2) due to the probabilistic nature of the algorithm.
  • Experiment with different values of T_initial, T_final, and num_iterations to observe their impact on the algorithm's performance.
  • The goal is to find a value of x that minimizes f(x). The algorithm doesn't guarantee finding the exact minimum, but it should get close.
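The Metropolis acceptance test mentioned in the notes can be isolated into a small helper (the name `accept` is an assumption); this is a common way to keep the acceptance logic testable on its own:

```python
import math
import random

def accept(delta, temperature):
    """Metropolis criterion: always accept improvements (delta <= 0);
    accept worse moves with probability exp(-delta / temperature)."""
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / temperature)
```

At high temperature, exp(-delta / T) is close to 1 even for large positive delta, so worse moves are accepted often (exploration); as T shrinks, the probability collapses toward 0 (exploitation).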