Optimization of Benchmark Functions Using Local Search and Stochastic Hill Climbing

This post covers the optimization of five benchmark functions using the Local Search (LS) and Stochastic Hill Climbing (SHC) algorithms.

In Local Search, a set of neighbors is generated within a given radius around the current solution, and the best neighbor is selected.

Local Search (LS):

1- Generate an initial solution.

2- In each iteration,

  • Generate a set of neighboring solutions based on the current solution using a predefined radius size.
  • Select the best solution from the generated neighbors.
  • Compare the selected neighbor with the current best solution.
  • If the selected neighbor is superior to the current solution, update the current solution and proceed to the next iteration; otherwise, terminate the algorithm (see the sketch after this list).
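Below is a minimal Python sketch of this loop. It is not the repository implementation: the sphere objective, the function and parameter names, and the uniform perturbation inside the radius are assumptions chosen for illustration; the neighborhood count, radius, and domain are taken from the settings listed further down.

```python
import numpy as np

def sphere(x):
    """De Jong's 1st (sphere) function: global minimum 0 at the origin."""
    return np.sum(x ** 2)

def local_search(objective, dim=2, bounds=(-5.12, 5.12),
                 n_neighbors=1000, radius=0.1, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    low, high = bounds

    # Step 1: random initial solution inside the domain.
    current = rng.uniform(low, high, size=dim)
    current_value = objective(current)

    while True:
        # Step 2a: neighbors are uniform perturbations scaled by the radius.
        neighbors = current + rng.uniform(-radius, radius, size=(n_neighbors, dim))
        neighbors = np.clip(neighbors, low, high)

        # Step 2b: select the best neighbor.
        values = np.array([objective(n) for n in neighbors])
        best = int(np.argmin(values))

        # Steps 2c/2d: accept only a strict improvement, otherwise terminate.
        if values[best] < current_value:
            current, current_value = neighbors[best], values[best]
        else:
            return current, current_value

best_x, best_f = local_search(sphere)
print("LS on sphere:", best_x, best_f)
```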

Stochastic Hill Climbing (SHC):

1- Generate an initial solution.

2- In each iteration,

  • Generate a set of neighboring solutions based on the current solution using a predefined radius size.
  • Randomly select one of the neighboring solutions.
  • Compare the selected neighbor with the current best solution.
  • If the selected neighbor is superior to the current solution, update the current solution; in either case, continue to the next iteration until the maximum number of iterations is reached (see the sketch after this list).
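A matching sketch of Stochastic Hill Climbing under the same assumptions (illustrative names, sphere objective via a lambda, uniform perturbations); the iteration budget of 1000 is also an assumption, since the post does not state one.

```python
import numpy as np

def stochastic_hill_climbing(objective, dim=2, bounds=(-5.12, 5.12),
                             n_neighbors=1000, radius=0.1,
                             max_iterations=1000, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    low, high = bounds

    # Step 1: random initial solution inside the domain.
    current = rng.uniform(low, high, size=dim)
    current_value = objective(current)

    for _ in range(max_iterations):
        # Step 2a: generate the neighborhood around the current solution.
        neighbors = current + rng.uniform(-radius, radius, size=(n_neighbors, dim))
        neighbors = np.clip(neighbors, low, high)

        # Step 2b: pick one neighbor at random instead of the best one.
        candidate = neighbors[rng.integers(n_neighbors)]
        candidate_value = objective(candidate)

        # Steps 2c/2d: keep the candidate only if it improves the current
        # solution; either way, continue until the iteration budget runs out.
        if candidate_value < current_value:
            current, current_value = candidate, candidate_value

    return current, current_value

best_x, best_f = stochastic_hill_climbing(lambda x: np.sum(x ** 2))
print("SHC on sphere:", best_x, best_f)
```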

Neighborhood count: 1000

Radius scale size: 0.1

Domain: (-5.12, 5.12)

Table 1 - Functions and Global Minimum Values

- De Jong 1st Function: global minimum 0
- 2nd De Jong Rosenbrock’s Saddle Function: global minimum 0
- 4th De Jong Function: global minimum 0
- Rastrigin’s Function: global minimum 0
- Griewangk’s Function: global minimum 0
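For reference, the standard definitions of these five functions are sketched below in NumPy. The noiseless form of De Jong's 4th function is used here, and the exact definitions in the linked repository may differ slightly.

```python
import numpy as np

def de_jong_1(x):
    """Sphere function; global minimum 0 at the origin."""
    return np.sum(x ** 2)

def rosenbrock(x):
    """De Jong's 2nd (Rosenbrock's saddle); global minimum 0 at (1, ..., 1)."""
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def de_jong_4(x):
    """Quartic function (noiseless form); global minimum 0 at the origin."""
    i = np.arange(1, x.size + 1)
    return np.sum(i * x ** 4)

def rastrigin(x):
    """Rastrigin's function; global minimum 0 at the origin."""
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))

def griewangk(x):
    """Griewangk's function; global minimum 0 at the origin."""
    i = np.arange(1, x.size + 1)
    return 1.0 + np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i)))
```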

Local Search Results

Local Search was run on each of the five functions:

- De Jong 1st Function
- 2nd De Jong Rosenbrock’s Saddle Function
- 4th De Jong Function
- Rastrigin’s Function
- Griewangk’s Function

Stochastic Hill Climbing Results

Stochastic Hill Climbing was run on each of the five functions:

- De Jong 1st Function
- 2nd De Jong Rosenbrock’s Saddle Function
- 4th De Jong Function
- Rastrigin’s Function
- Griewangk’s Function

Comparison Tables


Local Search (LS) demonstrated high computational efficiency, whereas Stochastic Hill Climbing (SHC) generally yielded better results. Both methods are variants of local search; however, their neighborhood selection strategies differ. SHC selects a neighboring solution randomly, while LS always chooses the best available option. Additionally, LS terminates if no improvement is found, whereas SHC continues until the maximum number of iterations is reached.

Local Search (LS) also employs a greedy approach in this context: it tries to improve the current solution at each step by always selecting the best neighboring solution. This often causes the algorithm to become stuck in a local optimum, since it consistently chooses the most favorable option available without exploring alternative paths.
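As a rough way to see this difference, the two sketches above can be run on the same function from the same seed. The driver below reuses local_search, stochastic_hill_climbing, and rastrigin from the earlier code blocks; the concrete numbers depend on the seed and parameters and are not the results reported in the tables above.

```python
import numpy as np

# Reuses local_search, stochastic_hill_climbing, and rastrigin from the
# sketches above; the seed value 42 is arbitrary.
ls_x, ls_f = local_search(rastrigin, dim=2, rng=np.random.default_rng(42))
shc_x, shc_f = stochastic_hill_climbing(rastrigin, dim=2,
                                        rng=np.random.default_rng(42))

print("LS :", ls_f)   # stops at the first iteration with no improving neighbor
print("SHC:", shc_f)  # keeps sampling random neighbors for the full budget
```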

The code is available at: https://github.com/gokhanergen-tech/python-local-search-stochastic-hill-climbing
