Minimize a nonlinear, differentiable black-box function with linear constraints - Python


I'm just getting started with Python, and I'm trying to solve a problem that Fmincon solves in Matlab.

Basically, my problem has 12 variables, from which a list of 2000 values is computed (each value is a linear function of the variables), and my objective is to maximize the largest value of this list.

In addition, the problem has a linear constraint.

I have tried scipy without success: in none of the gradient-free or approximate-gradient solvers I tried was it possible to add linear constraints.

I've also tried cvxopt, but I did not find any gradient-free or approximate-gradient solver there.

In addition, I would rather not use metaheuristics such as genetic algorithms, PSO, harmony search, etc.

I would like to use a gradient-free or approximate-gradient solver that accepts linear constraints, like Fmincon in Matlab.

This is my objective function:

import numpy as np

def max_receita(X, f, CONSTANTE_CVAR):
    # f is a matrix with 2000 rows and 12 columns
    # CONSTANTE_CVAR is a matrix with 2000 rows and 12 columns
    NSERIES = len(f)
    REC = np.zeros(NSERIES)
    X = np.transpose(X)

    # Annual constant: sum of each row of CONSTANTE_CVAR
    CONST_ANUAL = np.sum(CONSTANTE_CVAR, 1)

    for i in range(NSERIES):
        REC[i] = np.dot(f[i], X) + CONST_ANUAL[i]

    # Negated so that minimizing this function maximizes max(REC)
    return -max(REC)
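For completeness, the same objective can also be written without the explicit loop; this is just an equivalent vectorized form, assuming f and CONSTANTE_CVAR are NumPy arrays of shape 2000x12:

import numpy as np

def max_receita_vec(X, f, CONSTANTE_CVAR):
    # Same objective as above, written with array operations:
    # REC[i] = dot(f[i], X) + sum of row i of CONSTANTE_CVAR
    REC = f @ np.asarray(X) + CONSTANTE_CVAR.sum(axis=1)
    # Negated so that a minimizer maximizes the largest value of REC
    return -REC.max()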

The 12 variables are represented by the vector X (12 elements), and the elements of this vector must sum to 1. That's my constraint.

Besides this, each variable has different bounds, so the solver must also support bounds.
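For concreteness, a sum-to-one equality constraint and per-variable bounds can be written like this for scipy.optimize (the bound values below are just placeholders, not my real bounds):

import numpy as np
from scipy.optimize import LinearConstraint

# Equality constraint: x1 + x2 + ... + x12 == 1
sum_to_one = LinearConstraint(np.ones((1, 12)), lb=1.0, ub=1.0)

# The same constraint in the dict form accepted by minimize (e.g. for SLSQP)
sum_to_one_dict = {'type': 'eq', 'fun': lambda x: np.sum(x) - 1.0}

# One (lower, upper) pair per variable; replace with the real bounds
bounds = [(0.0, 1.0)] * 12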

The vector X (all 12 variables), together with the inputs f and CONSTANTE_CVAR, produces the vector REC (1x2000), and my objective is to maximize the largest value of REC.

In short, I need a solver that supports all of the following (a sketch of the kind of call I have in mind follows this list):

  • Gradient-free or approximate-gradient methods
  • Linear constraints
  • Nonlinear objective functions
  • Bounds
  • Python
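To make the call shape concrete, here is a minimal sketch using scipy.optimize.minimize with SLSQP as a stand-in (it approximates the gradient by finite differences and accepts bounds and linear constraints); the data, bounds and starting point below are random placeholders, not my real inputs:

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
f = rng.normal(size=(2000, 12))               # placeholder for the real f
CONSTANTE_CVAR = rng.normal(size=(2000, 12))  # placeholder for the real data

def objective(X):
    # Same as max_receita: negative of the largest value of REC
    REC = f @ X + CONSTANTE_CVAR.sum(axis=1)
    return -REC.max()

x0 = np.full(12, 1.0 / 12)                    # feasible start (sums to 1)
bounds = [(0.0, 1.0)] * 12                    # placeholder bounds
constraints = [{'type': 'eq', 'fun': lambda x: np.sum(x) - 1.0}]

res = minimize(objective, x0, method='SLSQP',
               bounds=bounds, constraints=constraints)
print(res.x, -res.fun)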

Could anyone suggest a solver?
