Ryan Hughes

AI 3D Modeler

Software Engineer

AI Developer

C#

C++

Google Cloud Functions

A synchronous evolutionary solver for Rhino Compute, written in C# for Grasshopper. It adapts example code from Microsoft, wraps everything in native RhinoCommon types, and modifies the runtime order to prevent blocking and avoid cyclic errors.

An *evolutionary optimization algorithm* is an implementation of a meta-heuristic modeled on the behavior of biological evolution. These algorithms can be used to find approximate solutions to difficult or impossible numerical minimization problems.

```csharp
using System;
using System.Collections.Generic;
using Grasshopper.Kernel; // IGH_Param

public class Individual : IComparable<Individual>
{
    public double[] Chromosome;   // Represents a solution.
    public double Error;          // Smaller values are better for minimization.
    private int NumGenes;         // Problem dimension.
    private double MutateRate;    // Used during reproduction by Mutate.
    private double MutateChange;  // Used during reproduction.
    static Random rnd = new(0);   // Used by ctor.

    public Individual(List<IGH_Param> genomes,
                      double mutateRate,
                      double mutateChange,
                      double minGene,  // Lower bound for each gene.
                      double maxGene,  // Upper bound for each gene.
                      double error)
    {
        this.NumGenes = genomes.Count;
        this.MutateRate = mutateRate;
        this.MutateChange = mutateChange;
        this.Chromosome = new double[genomes.Count];
        for (int i = 0; i < this.Chromosome.Length; ++i)
            this.Chromosome[i] =
                (maxGene - minGene) * rnd.NextDouble() + minGene;
        this.Error = error;
    }

    // Orders individuals from smallest error (better) to largest.
    public int CompareTo(Individual other)
    {
        if (this.Error < other.Error)
            return -1;
        else if (this.Error > other.Error)
            return 1;
        else
            return 0;
    }
}
```

The mutation rate value controls how many genes in the chromosome will be modified. One heuristic for the value of the `mutateRate` field is to use `1.0 / numGenes`, so that on average one gene in the chromosome will be mutated every time `Mutate` is called.
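The `Mutate` method itself is not shown in this snippet. A minimal, self-contained sketch of the idea (the signature, bounds handling, and perturbation formula here are assumptions for illustration, not the repo's implementation) might look like:

```csharp
using System;

// Sketch: each gene mutates with probability mutateRate; mutateChange scales
// the perturbation relative to the gene's allowed range, and the result is
// clamped back into [minGene, maxGene].
static class MutationDemo
{
    static readonly Random rnd = new(0);

    public static void Mutate(double[] chromosome, double mutateRate,
                              double mutateChange, double minGene, double maxGene)
    {
        double hi = mutateChange * (maxGene - minGene); // largest possible shift
        double lo = -hi;
        for (int i = 0; i < chromosome.Length; ++i)
        {
            if (rnd.NextDouble() < mutateRate) // expect mutateRate * length hits
            {
                chromosome[i] += (hi - lo) * rnd.NextDouble() + lo;
                chromosome[i] = Math.Clamp(chromosome[i], minGene, maxGene);
            }
        }
    }
}
```

With `mutateRate = 1.0 / chromosome.Length`, the loop mutates one gene per call on average, matching the heuristic above.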

With arbitrary minimization problems, the target function to be minimized is usually called a cost function. In the context of evolutionary and genetic algorithms, however, the function is usually called a fitness function. Notice the terminology is a bit awkward because lower values of fitness are better than higher values. In this example, the fitness function is completely self-contained. In many optimization problems, the fitness function requires additional input parameters, such as a matrix of data or a reference to an external data file.
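As an illustration of a completely self-contained fitness function, a hypothetical example is the sphere function (the sum of squared genes), which needs nothing beyond the chromosome itself:

```csharp
using System.Linq;

// Hypothetical self-contained fitness (error) function: the sphere function.
// Its minimum, 0.0, is at the all-zero chromosome; lower values are better.
static class Fitness
{
    public static double Error(double[] chromosome) =>
        chromosome.Sum(g => g * g); // no matrix of data or external file needed
}
```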

The `Evolve` method begins by initializing the best fitness and best chromosomes to the first ones in the population. The method iterates exactly `maxGenerations` times, using `gen` (generation) as a loop counter. One of several alternatives is to stop when no improvement has occurred after some number of iterations. The `Select` method returns two good, but not necessarily best, individuals from the population.

These two parents are passed to `Reproduce`, which creates and returns two children. The `Accept` method places the two children into the population, replacing two existing individuals. The `Immigrate` method generates a new random individual and places it into the population. The new population is then scanned to see if any of the three new individuals is a new best solution.
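The loop described above can be sketched in a self-contained form. Here `Select`, `Reproduce`, `Accept`, and `Immigrate` are simplified inline stand-ins minimizing the sphere function; the repo's actual methods differ, so treat this purely as a structural illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Structural sketch of an Evolve loop (not the repo's implementation).
static class EvolveDemo
{
    static readonly Random rnd = new(0);

    public static double Error(double[] c) => c.Sum(g => g * g); // sphere function

    static double[] RandomChromosome(int n) =>
        Enumerable.Range(0, n).Select(_ => rnd.NextDouble() * 20 - 10).ToArray();

    public static double[] Evolve(int numGenes, int popSize, int maxGenerations)
    {
        var pop = Enumerable.Range(0, popSize)
                            .Select(_ => RandomChromosome(numGenes)).ToList();
        pop.Sort((a, b) => Error(a).CompareTo(Error(b)));

        // Initialize best fitness/chromosome to the first (sorted: best) individual.
        double[] best = (double[])pop[0].Clone();
        double bestError = Error(best);

        for (int gen = 0; gen < maxGenerations; ++gen) // exactly maxGenerations iterations
        {
            // Select: two good, but not necessarily best, parents (top half).
            var p1 = pop[rnd.Next(popSize / 2)];
            var p2 = pop[rnd.Next(popSize / 2)];

            // Reproduce: averaging crossover plus a small random mutation.
            var child = new double[numGenes];
            for (int i = 0; i < numGenes; ++i)
                child[i] = (p1[i] + p2[i]) / 2.0 + (rnd.NextDouble() - 0.5);

            // Accept: child replaces a weak individual; Immigrate: a fresh
            // random individual replaces another.
            pop[popSize - 1] = child;
            pop[popSize - 2] = RandomChromosome(numGenes);
            pop.Sort((a, b) => Error(a).CompareTo(Error(b)));

            // Scan the new population for a new best solution.
            if (Error(pop[0]) < bestError)
            {
                bestError = Error(pop[0]);
                best = (double[])pop[0].Clone();
            }
        }
        return best;
    }
}
```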

The `Select` method uses a technique called *tournament selection*. A subset of random candidate individuals is generated and the best `n` of them are returned. The number of candidates is computed into variable `tournSize`, which is some fraction, `tau`, of the total population size. Larger values of `tau` increase the probability that the best two individuals will be selected.
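A minimal sketch of tournament selection along these lines, operating on an array of error values (the names and return shape are illustrative, not the repo's implementation):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch: draw tournSize = tau * popSize random candidates, then return the
// indices of the two candidates with the lowest error (the best).
static class TournamentDemo
{
    static readonly Random rnd = new(0);

    public static (int, int) Select(double[] errors, double tau)
    {
        int popSize = errors.Length;
        int tournSize = (int)(tau * popSize); // fraction tau of the population
        if (tournSize < 2) tournSize = 2;     // need at least two parents

        // Draw tournSize distinct random candidate indices.
        var candidates = Enumerable.Range(0, popSize)
                                   .OrderBy(_ => rnd.Next())
                                   .Take(tournSize)
                                   .ToList();

        // Sort candidates by error (smaller is better); take the best two.
        candidates.Sort((a, b) => errors[a].CompareTo(errors[b]));
        return (candidates[0], candidates[1]);
    }
}
```

With `tau = 1.0` the whole population competes and the two true best individuals are always chosen; smaller `tau` values make selection less greedy.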

Some of the original demonstration code and description text in this repo was taken from here.