The Arithmetic Optimization Algorithm (AOA)

Introduction

 In the ever-evolving world of artificial intelligence and optimization, researchers continue to develop algorithms inspired by both nature and mathematics. One such algorithm is the Arithmetic Optimization Algorithm (AOA), introduced in 2021 as a mathematical model-based optimization technique. Unlike nature-inspired algorithms such as the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and the Grey Wolf Optimizer (GWO), AOA is built on the four basic arithmetic operations: Addition, Subtraction, Multiplication, and Division. It uses these operations to explore and exploit the solution space efficiently.

Since its introduction, AOA has attracted considerable attention thanks to its relative simplicity, fast convergence, and effectiveness across a wide range of engineering, industrial, and real-world optimization problems. It has been successfully applied to structural design, machine learning hyperparameter tuning, and financial forecasting.

Understanding AOA

Key Characteristics of AOA

AOA stands out from other optimization algorithms due to its unique features:

  1. Mathematical Inspiration – Unlike bio-inspired or physics-based approaches, AOA solely depends on fundamental arithmetic principles.

  2. Exploration-Exploitation Balance – The algorithm dynamically adjusts its search strategy using Math Optimizer Acceleration (MOA) and Math Optimizer Probability (MOP).

  3. Fast Convergence – AOA converges quickly, and its dynamically shifting search strategy helps it avoid the premature convergence and stagnation issues that affect many heuristic algorithms.

  4. Scalability – It can efficiently handle both low-dimensional and high-dimensional optimization problems.

How the Arithmetic Optimization Algorithm (AOA) Works

The Arithmetic Optimization Algorithm (AOA) works by using basic arithmetic operations (Addition, Subtraction, Multiplication, and Division) to search for the best solution in an optimization problem. Let’s break it down into simple steps.

Step 1: Initializing the Solutions (Candidates)

Imagine you are searching for the best answer to a problem. AOA starts with multiple random guesses. These guesses are called candidate solutions.

Mathematically, each solution is represented as:

Initialization Formula: xi,j = LBj + r × (UBj - LBj)

where:

  • LB (Lower Bound): The smallest value a solution can take.

  • UB (Upper Bound): The largest value a solution can take.

  • r: A random number between 0 and 1 to spread the solutions.
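
As a quick illustration, here is a minimal NumPy sketch of this initialization step (the names num_candidates, LB, and UB are illustrative; the full implementation appears later in this article):

import numpy as np

num_candidates = 5              # population size (illustrative choice)
LB, UB = -10.0, 10.0            # lower and upper bounds of the search space

# x_i = LB + r * (UB - LB), with each r drawn uniformly from [0, 1]
r = np.random.rand(num_candidates)
candidates = LB + r * (UB - LB)
print(candidates)               # five random starting points inside [-10, 10]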

Step 2: Controlling the Search Using MOA (Math Optimizer Acceleration)

AOA balances between searching widely (exploration) and focusing on good solutions (exploitation). This is done using MOA, a value that increases over time and gradually shifts the search from exploration toward exploitation.

MOA Formula: MOA = 0.2 + ((t / T) × (0.9 - 0.2))

where:

  • t: The current iteration number.

  • T: The total number of iterations.

Step 3: Defining MOP (Math Optimizer Probability)

This helps control how much the best solution influences new candidates.

MOP Formula: MOP = 1 - (t^(1/5) / T^(1/5))

This value starts high (allowing more exploration) and decreases over time (allowing better exploitation).
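
To see how these two control values behave together, the short sketch below (assuming the 0.2/0.9 limits and the 1/5 exponent from the formulas above, and a 20-iteration run) prints MOA and MOP at a few iterations:

T = 20                                          # total iterations (example value)
for t in [1, 5, 10, 20]:
    MOA = 0.2 + (t / T) * (0.9 - 0.2)           # rises from 0.2 toward 0.9
    MOP = 1 - (t ** (1 / 5)) / (T ** (1 / 5))   # falls from about 0.45 toward 0
    print(t, round(MOA, 3), round(MOP, 3))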

Step 4: Updating Candidate Solutions

Each solution is updated based on two rules:

  1. Exploration Phase (Broad Search):

xi,j(t+1) = (best(xj) / (MOP + ε)) × ((UBj - LBj) × μ + LBj)

  2. Exploitation Phase (Focused Search):

xi,j(t+1) = best(xj) × MOP × ((UBj - LBj) × μ + LBj)

Where:

  • best(xj): Best solution found so far.

  • μ (mu): A random number in [0, 1].

  • ε (epsilon): A very small number to avoid division by zero.
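
A minimal sketch of this update step for a single candidate, following the two-rule form used throughout this article, could look as follows (the helper name update_candidate is illustrative). Note that the original AOA paper uses division and multiplication as two exploration operators and addition and subtraction as its exploitation operators; the simplified version here keeps one operator per phase.

import numpy as np

def update_candidate(best_x, MOA, MOP, LB, UB, epsilon=1e-8):
    r = np.random.rand()     # decides between exploration and exploitation
    mu = np.random.rand()    # scales the step inside the search range
    if MOA < r:              # exploration: division-based, broad search
        new_x = best_x / (MOP + epsilon) * ((UB - LB) * mu + LB)
    else:                    # exploitation: multiplication-based, focused search
        new_x = best_x * MOP * ((UB - LB) * mu + LB)
    return np.clip(new_x, LB, UB)   # keep the new candidate inside [LB, UB]

# example call, assuming the best solution so far is 4 in the range [-10, 10]
print(update_candidate(4.0, MOA=0.375, MOP=0.242, LB=-10, UB=10))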

Step 5: Checking Convergence

AOA repeats these steps until it reaches the maximum number of iterations or finds the best possible solution. As iterations progress, the solutions become more refined and close to the optimal value.



Mathematical Problem 

Problem Statement:

Find the minimum value of the function: f(x) = x² - 4x + 4 within the range x ∈ [-10, 10].

Solution:

Step 1: Initialization

The algorithm starts by generating random candidate solutions within the defined search space using the formula:

xᵢ = LB + r × (UB - LB)

  • LB: Lower Bound = -10
  • UB: Upper Bound = 10
  • r: Random number ∈ [0, 1]

Suppose we generate 5 candidate values:

Candidate   Random r   xᵢ
x₁          0.2        -6
x₂          0.5         0
x₃          0.7         4
x₄          0.9         8
x₅          0.1        -8

Step 2: Calculate MOA and MOP

MOA (Math Optimizer Accelerated):

MOA = 0.2 + (t / T) × (0.9 - 0.2)

If t = 5, T = 20:

MOA = 0.2 + (5 / 20) × 0.7 = 0.375

MOP (Math Optimizer Probability):

MOP = 1 - (t^(1/5) / T^(1/5))
MOP ≈ 1 - (1.38 / 1.82) ≈ 0.242
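
These two values are easy to verify with a couple of lines of Python:

t, T = 5, 20
MOA = 0.2 + (t / T) * (0.9 - 0.2)            # 0.375
MOP = 1 - (t ** (1 / 5)) / (T ** (1 / 5))    # about 0.242
print(MOA, round(MOP, 3))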

Step 3: Update Candidate Solutions

Exploration Phase (if MOA < random r):

xi(t+1) = [best(x) / (MOP + ε)] × ((UB - LB) × μ + LB)

Assume best(x) = 4, μ = 0.4, ε = 10⁻⁸

x₁(t+1) = 4 / 0.242 × (20 × 0.4 - 10) = 16.53 × (-2) = -33.06

Clipped to -10 (since x ∈ [-10, 10])

Exploitation Phase (if MOA ≥ random r):

xi(t+1) = best(x) × MOP × ((UB - LB) × μ + LB)

With a fresh random draw μ = 0.3:

x₂(t+1) = 4 × 0.242 × (20 × 0.3 - 10) = 0.968 × (-4) ≈ -3.87

Step 4: Evaluate Solutions

Plug updated x values into f(x) and choose the best:

x        f(x)
-10      100 + 40 + 4 = 144
-3.87    ≈ 34.46
2        0 (minimum)

The best solution is updated each time to get closer to the minimum.
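
For completeness, the hand calculations in Steps 3 and 4 can be reproduced with a short snippet (using μ = 0.4 for the exploration update and μ = 0.3 for the exploitation update, as assumed above):

import numpy as np

def f(x):
    return x**2 - 4*x + 4

best_x, MOP, eps = 4, 0.242, 1e-8
LB, UB = -10, 10

x1 = best_x / (MOP + eps) * ((UB - LB) * 0.4 + LB)   # about -33.06 before clipping
x1 = np.clip(x1, LB, UB)                             # clipped to -10
x2 = best_x * MOP * ((UB - LB) * 0.3 + LB)           # about -3.87

print(f(x1))             # 144.0
print(round(f(x2), 2))   # 34.48 (close to the ~34.46 above, which used the rounded -3.87)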

Step 5: Convergence

  • AOA repeats the update & evaluation steps
  • Stops when the best x is found or max iterations reached
Optimal x = 2
Minimum f(x) = 0

Implementation:

A Python implementation of the Arithmetic Optimization Algorithm (AOA), explained line by line:

import numpy as np
Import NumPy, a powerful library for numerical computing (especially useful for arrays and 
random number generation).
# Function to minimize: f(x) = ax^2 + bx + c
def f(x, a=1, b=-4, c=4):
    return a * x**2 + b * x + c
This defines the objective function f(x) that we want to minimize.
Default values are set: a = 1, b = -4, c = 4, i.e., f(x) = x² - 4x + 4.

# AOA parameters
LB = -10 # Lower Bound of x
UB = 10 # Upper Bound of x
num_candidates = 5 # Number of candidate solutions (population size)
max_iterations = 50 # Maximum number of iterations for optimization
a, b, c = 1, -4, 4 # Coefficients of the quadratic function
These define:

  • The search range [LB, UB],

  • The number of random solutions (agents) to test,

  • The maximum iterations to run the optimizer,

  • The coefficients for the function you're minimizing.

# Random initialization of candidates
candidates = LB + np.random.rand(num_candidates) * (UB - LB)
This initializes num_candidates random x values between LB and UB
# Begin optimization
for t in range(1, max_iterations + 1):
Start the optimization loop. t is the current iteration index.
    # Evaluate all candidates
    fitness = f(candidates, a, b, c)
    best_idx = np.argmin(fitness)
    best_x = candidates[best_idx]

Compute the fitness value for each candidate using f(x):
  • best_idx: index of the best (minimum) fitness
  • best_x: value of x that gives the lowest f(x) so far

    # Update MOA and MOP
    MOA = 0.2 + (t / max_iterations) * (0.9 - 0.2)
    MOP = 1 - ((t ** (1/5)) / (max_iterations ** (1/5)))
    epsilon = 1e-8
Compute the Math Optimizer Accelerated (MOA) and Math Optimizer Probability (MOP)
epsilon avoids division by zero.
    # Generate new candidates
    new_candidates = []
Create a fresh list to store updated candidate values for this iteration.
    for x in candidates:
        r = np.random.rand()
        mu = np.random.rand()

For each current candidate x, generate random values:

  • r: to decide between exploration or exploitation

  • mu: used in updating the value of x

        if MOA < r:  # Exploration
            new_x = best_x / (MOP + epsilon) * ((UB - LB) * mu + LB)
If MOA is less than the random number r, perform exploration: Try exploring new areas far from
the current best by combining best_x, MOP, and mu.
        else:  # Exploitation
            new_x = best_x * MOP * ((UB - LB) * mu + LB)
If MOA >= r, then we do exploitation: Search closer around the current best value using a 
different update formula.

        # Clip to bounds
        new_x = np.clip(new_x, LB, UB)
        new_candidates.append(new_x)
Ensure that the new candidate stays within bounds using np.clip()
Store it in the new candidate list
    candidates = np.array(new_candidates)

Replace the old population with the newly generated values.

# Final best solution
final_fitness = f(candidates, a, b, c)
best_idx = np.argmin(final_fitness)
best_x = candidates[best_idx]
best_fx = final_fitness[best_idx]
After the loop ends, evaluate the final population
Extract the best solution and its function value
print(f"Optimal x: {best_x:.4f}")
print(f"Minimum f(x): {best_fx:.4f}")
Print the final result to four decimal places.
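
Because the initial candidates and the r and mu draws are random, the printed numbers vary from run to run; calling np.random.seed() before the loop makes a run reproducible. The true optimum of this test function is x = 2 with f(x) = 0, which gives a reference point for judging how close a given run gets.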

Time & Space Complexity Analysis

  • Best Case: O(N × M)

  • Worst Case: O(N × M × L)

  • Average Case: O(N × M)

  • Space Complexity: O(N × L)

where:

  • N = Population size

  • M = Number of iterations

  • L = Number of dimensions

Real-World Applications of AOA 

The Arithmetic Optimization Algorithm (AOA) isn't just a theoretical model—it has real, practical uses in a variety of domains. Here are some key applications with short explanations:

1. Feature Selection in Machine Learning

Use: Selecting the most relevant features from a dataset to improve model accuracy and reduce overfitting.

How AOA helps: AOA optimizes feature subsets by minimizing classification error and improving learning efficiency.

2. Scheduling and Job Allocation

Use: Efficiently assigning tasks to machines in manufacturing or cloud computing systems.

How AOA helps: AOA minimizes processing time and maximizes utilization by finding optimal task-to-machine matches.

3. Wireless Sensor Networks (WSN)

Use: Reducing energy consumption and improving data routing in smart sensor environments.

How AOA helps: AOA selects efficient paths and sensor clusters to extend network lifespan and speed.

4. Portfolio Optimization in Finance

Use: Allocating assets for maximum return and minimal risk.

How AOA helps: AOA evaluates investment weights that meet constraints such as budget and volatility.

5. Power Systems & Smart Grids

Use: Managing voltage levels, power flows, and load balancing in electrical systems.

How AOA helps: AOA finds efficient configurations to reduce energy losses and enhance stability.

6. Medical Diagnosis & Image Segmentation

Use: Identifying regions of interest in MRI or CT scans.

How AOA helps: AOA optimizes thresholds for precise image segmentation and fast computation.

7. Route Planning in Autonomous Vehicles

Use: Calculating shortest or safest paths for self-driving cars.

How AOA helps: AOA evaluates multiple path options based on fuel, traffic, and distance to find the best route.

8. Cybersecurity – Intrusion Detection

Use: Identifying abnormal or malicious activity in networks.

How AOA helps: AOA fine-tunes detection parameters and rules to accurately classify threats.


Conclusion

The Arithmetic Optimization Algorithm (AOA) is a groundbreaking approach in the field of optimization. Its arithmetic-based foundation provides an innovative and efficient way to explore and exploit solutions. With its balanced approach, it has shown promising results in engineering, finance, machine learning, and industrial applications.

As optimization problems continue to evolve, AOA’s potential for adaptation and hybridization with other techniques will make it an essential tool for solving complex real-world challenges. 
