Project author: JustinLovinger

Project description:
A Python metaheuristic optimization library. Currently supports Genetic Algorithms, Gravitational Search, Cross Entropy, and PBIL.
Language: Python
Repository: git://github.com/JustinLovinger/optimal.git
Created: 2014-10-31T22:49:00Z
Project page: https://github.com/JustinLovinger/optimal

License: MIT License

Optimal (beta)

A Python metaheuristic optimization library, built for easy extension and usage.

Warning: Optimal is in beta. The API may change. I will do my best to note any breaking changes in this readme, but no guarantee is given.

Supported metaheuristics:

  • Genetic algorithms (GA)
  • Gravitational search algorithm (GSA)
  • Cross entropy (CE)
  • Population-based incremental learning (PBIL)

Installation

  1. pip install optimal

Usage

    import math

    from optimal import GenAlg
    from optimal import Problem
    from optimal import helpers

    # The genetic algorithm uses binary solutions.
    # A decode function is useful for converting the binary solution to real numbers
    def decode_ackley(binary):
        # Helpful functions from helpers are used to convert binary to float
        # x1 and x2 range from -5.0 to 5.0
        x1 = helpers.binary_to_float(binary[0:16], -5.0, 5.0)
        x2 = helpers.binary_to_float(binary[16:32], -5.0, 5.0)
        return x1, x2

    # ackley is our fitness function
    # This is how a user defines the goal of their problem
    def ackley_fitness(solution):
        x1, x2 = solution

        # Ackley's function
        # A common mathematical optimization problem
        output = -20 * math.exp(-0.2 * math.sqrt(0.5 * (x1**2 + x2**2))) - math.exp(
            0.5 * (math.cos(2 * math.pi * x1) + math.cos(2 * math.pi * x2))) + 20 + math.e

        # You can prematurely stop the metaheuristic by returning True
        # as the second return value
        # Here, we consider the problem solved if the output is <= 0.01
        finished = output <= 0.01

        # Because this function is trying to minimize the output,
        # a smaller output has a greater fitness
        fitness = 1 / output

        # First return value must be a real number
        # The higher the number, the better the solution
        # Second return value is a boolean, and optional
        return fitness, finished

    # Define a problem instance to optimize
    # We can optionally include a decode function
    # The optimizer will pass the decoded solution into your fitness function
    # Additional fitness function and decode function parameters can also be added
    ackley = Problem(ackley_fitness, decode_function=decode_ackley)

    # Create a genetic algorithm with a chromosome size of 32,
    # and use it to solve our problem
    my_genalg = GenAlg(32)
    best_solution = my_genalg.optimize(ackley)
    print(best_solution)
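To see what the decode step is doing, here is a rough, self-contained sketch of binary-to-float decoding: read the bit list as an unsigned integer and scale it linearly into the target range. This is my own illustration, not the library's `helpers.binary_to_float` implementation, which may differ in details.

```python
def binary_to_float(bits, lower_bound, upper_bound):
    """Sketch: scale a list of 0/1 bits linearly into [lower_bound, upper_bound]."""
    as_int = int("".join(str(b) for b in bits), 2)  # bits as an unsigned integer
    max_int = 2 ** len(bits) - 1                    # largest value len(bits) bits can hold
    return lower_bound + (as_int / max_int) * (upper_bound - lower_bound)

# All-zero bits map to the lower bound, all-one bits to the upper bound
print(binary_to_float([0] * 16, -5.0, 5.0))  # -5.0
print(binary_to_float([1] * 16, -5.0, 5.0))  # 5.0
```

With 16 bits per variable, as in the example above, the range [-5.0, 5.0] is discretized into 65536 evenly spaced values.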

Important notes:

  • The fitness function must take a solution as its first argument
  • The fitness function must return a real number as its first return value
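As a minimal illustration of this contract, here is a fitness function for minimizing the sphere function (my own example, not part of the library) that satisfies both rules and also uses the optional early-stop flag:

```python
def sphere_fitness(solution):
    x1, x2 = solution
    output = x1 ** 2 + x2 ** 2  # sphere function; minimum of 0.0 at the origin

    # First return value: a real number, higher is better.
    # Adding 1 to the denominator avoids division by zero at the optimum.
    fitness = 1.0 / (1.0 + output)

    # Optional second return value: True tells the optimizer to stop early
    finished = output <= 0.01
    return fitness, finished

print(sphere_fitness((0.0, 0.0)))  # (1.0, True)
```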

For further usage details, see the comprehensive docstrings.

Breaking Changes

09/26/2017

Renamed helpers.binary_to_int offset option to lower_bound,
and renamed helpers.binary_to_float minimum and maximum options to
lower_bound and upper_bound respectively.
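Under the new names, a keyword-argument call from the usage example above would read as follows (keyword names taken from this changelog entry; positional calls are unaffected):

    # Before: helpers.binary_to_float(binary[0:16], minimum=-5.0, maximum=5.0)
    # After:
    x1 = helpers.binary_to_float(binary[0:16], lower_bound=-5.0, upper_bound=5.0)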

08/27/2017

Moved a number of options from Optimizer to Optimizer.optimize.

07/26/2017

Renamed common.random_solution_binary to common.random_binary_solution,
and common.random_solution_real to common.random_real_solution.

11/10/2016

problem and max_iterations are now arguments of Optimizer.optimize, instead of Optimizer.__init__.
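Sketched from this changelog entry (parameter names as given here), the post-change call pattern is:

    # Previously, problem and max_iterations were passed to the constructor.
    # Now, construct the optimizer alone, then pass both to optimize:
    my_genalg = GenAlg(32)
    best_solution = my_genalg.optimize(ackley, max_iterations=100)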

11/8/2016

Optimizer now takes a problem instance, instead of a fitness function and kwargs.

11/5/2016

Library reorganized with greater reliance on __init__.py.

Optimizers can now be imported with:

    from optimal import GenAlg, GSA, CrossEntropy

Etc.