mathopt6.gms : MathOptimizer Example 6

Description

The Hundred-dollar, Hundred-digit Challenge Problems as stated by
N. Trefethen, Oxford University.

Several random points are used to test the robustness of global and
local codes. You may want to run GAMS with lo = 0 or lo = 2 to reduce
output to the log.
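The model minimizes Trefethen's Problem 4 objective, f(x, y) = exp(sin 50x) + sin(60 e^y) + sin(70 sin x) + sin(sin 80y) - sin(10(x + y)) + (x^2 + y^2)/4, over the box [-3, 3]^2. As a quick cross-check outside GAMS, the function can be evaluated at the best known point recorded in the model; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

def f(x, y):
    """Trefethen Problem 4 objective, matching equation objdef in the model."""
    return (np.exp(np.sin(50 * x)) + np.sin(60 * np.exp(y))
            + np.sin(70 * np.sin(x)) + np.sin(np.sin(80 * y))
            - np.sin(10 * (x + y)) + (x**2 + y**2) / 4)

# Best known solution as reported in the model's summary parameter
x0, y0 = -0.0244030796935730, 0.2106124271552849
print(f(x0, y0))  # close to -3.306868647475235
```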

References

• Mathematica, MathOptimizer - An Advanced Modeling and Optimization System for Mathematica Users.
• Pinter, J D, Global Optimization in Action - Continuous and Lipschitz Optimization: Algorithms, Implementations, and Applications. Kluwer Academic Publishers, Nonconvex Optimization and Its Applications, 1996.
• Pinter, J D, Computational Global Optimization in Nonlinear Systems - An Interactive Tutorial. Lionheart Publishing, Atlanta, GA, 2001.
• Trefethen, N, A Hundred-dollar, Hundred-digit Challenge. SIAM News 35, 1 (2002).

Small Model of Type : DNLP

Category : GAMS Model library

Main file : mathopt6.gms

```
$title MathOptimizer Example 6 (MATHOPT6,SEQ=260)

$onText
The Hundred-dollar, Hundred-digit Challenge Problems as stated by
N. Trefethen, Oxford University.

Several random points are used to test the robustness of global and
local codes. You may want to run gams with lo = 0 or lo = 2 to reduce
output to the log.

N. Trefethen, SIAM News, January - February 2002, page 3.

Mathematica, MathOptimizer - An Advanced Modeling and Optimization System
for Mathematica Users, http://www.wolfram.com/products/applications/mathoptimizer/

Janos D Pinter, Global Optimization in Action, Kluwer Academic Publishers,
Dordrecht/Boston/London, 1996.

Janos D Pinter, Computational Global Optimization in Nonlinear Systems,
Lionheart Publishing, Inc., Atlanta, GA, 2001

Keywords: nonlinear programming, discontinuous derivatives, mathematics, global
optimization
$offText

$eolCom //

Variable x, y, obj;

Equation objdef;

objdef.. obj =e= exp(sin(50*x)) + sin(60*exp(y))  + sin(70*sin(x))
+  sin(sin(80*y)) - sin(10*(x + y)) + (sqr(x) + sqr(y))/4;

x.lo = -3;
x.up =  3;
y.lo = -3;
y.up =  3;

Model m / objdef /;

Parameter report(*,*) 'summary report';
report('best','x0')  = -0.0244030796935730;
report('best','y0')  =  0.2106124271552849;
report('best','obj') = -3.306868647475235;
report('best','x.l') = report('best','x0');
report('best','y.l') = report('best','y0');

Scalar global 'best known solution';
global = report('best','obj');

Set i 'random samples' / rand1*rand100 /;

* You may want to run gams with lo = 0 or lo = 2 to reduce output to the log
m.limRow   = 0;
m.limCol   = 0;
m.solPrint = %solPrint.Report%;

Scalar best / inf /;

* try random starting points and report better solution only
loop(i$(best > (global + 1e-6)),
   x.l = uniform(x.lo,x.up);  // get
   y.l = uniform(y.lo,y.up);  // random
   report(i,'x0') = x.l;      // starting point
   report(i,'y0') = y.l;      // and save

   solve m using dnlp min obj;
   m.solPrint = %solPrint.quiet%; // turn off solution listing

   if(m.solveStat <> %solveStat.normalCompletion%,
      display 'solver failed - no further solutions';
      best = -inf;           // stop the loop
   );
   if(obj.l >= best or not(m.modelStat = %modelStat.optimal% or
                           m.modelStat = %modelStat.feasibleSolution% or
                           m.modelStat = %modelStat.locallyOptimal%),
      report(i,'x0') = 0;    // remove entries from report
      report(i,'y0') = 0;    // remove entries from report
   else
      best = obj.l;
      report(i,'obj')   = obj.l;
      report(i,'x.l')   = x.l;
      report(i,'y.l')   = y.l;
      report(i,'optcr') = -(obj.l - report('best','obj'))/report('best','obj');
      report(i,'cpu')   = m.resUsd;
   );
);

display report;
```
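The multistart strategy in the loop above — draw a random point in the box, run a local solver, keep improvements only, and stop once the best known objective is matched within 1e-6 — can be mirrored outside GAMS. A hedged sketch using SciPy's L-BFGS-B as the local solver (the seed, restart count, and solver choice are illustrative assumptions, not taken from the model):

```python
import numpy as np
from scipy.optimize import minimize

def f(v):
    """Trefethen Problem 4 objective, matching equation objdef."""
    x, y = v
    return (np.exp(np.sin(50 * x)) + np.sin(60 * np.exp(y))
            + np.sin(70 * np.sin(x)) + np.sin(np.sin(80 * y))
            - np.sin(10 * (x + y)) + (x**2 + y**2) / 4)

GLOBAL_BEST = -3.306868647475235     # best known objective from the model
rng = np.random.default_rng(42)      # illustrative seed
bounds = [(-3, 3), (-3, 3)]

best = np.inf
for _ in range(100):                 # mirrors the set / rand1*rand100 /
    v0 = rng.uniform(-3, 3, size=2)  # random starting point in the box
    res = minimize(f, v0, method='L-BFGS-B', bounds=bounds)
    if res.fun < best:               # keep improvements only
        best = res.fun
    if best <= GLOBAL_BEST + 1e-6:   # same stopping rule as the GAMS loop
        break

print(best)
```

As in the GAMS model, a purely local solver started from enough random points can stumble into the global basin, but no single run is guaranteed to; that fragility is exactly what the model is designed to probe.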