
In this section, two metrics are considered in comparing the computational effort of different solvers: (i) the number of function evaluations required by each solver, and (ii) each solver's execution time (CPU time). By combining the earlier analysis of solution quality with computational effort, a further analysis is then proposed to show solver efficiency on the problems of the testbed. Depending on whether or not an evaluation of the objective function is time-consuming, different metrics are more important for obtaining a solution faster.

Since the test problems are algebraically and computationally simple and small, the total time required for function evaluations for all runs was negligible. Most of the solvers' execution time was spent on processing function values and determining the sequence of iterates. In cases where the evaluation of the objective function is not time-consuming, the execution time of the solvers is more important. However, in applications where an evaluation of the objective function requires a significant amount of time, the number of function evaluations that the solvers perform is the factor that determines computational efficiency. Global optimization methods will perform more iterations and will likely require more execution time than local methods.
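For illustration only (this is not the instrumentation used in the study), the sketch below shows one way both metrics are typically collected: a wrapper counts evaluations and remembers at which evaluation the incumbent was found, while CPU time is measured around the solver run. The wrapper class, the toy objective, and the sample points are all hypothetical.

```python
import time

class CountingObjective:
    """Wraps a black-box objective to record the two effort metrics used here:
    total evaluations (tot) and the evaluation at which the best value appeared (opt)."""
    def __init__(self, func):
        self.func = func
        self.n_evals = 0                # total function evaluations
        self.best_value = float("inf")  # incumbent objective value
        self.best_eval = 0              # evaluation index of the incumbent

    def __call__(self, x):
        self.n_evals += 1
        value = self.func(x)
        if value < self.best_value:
            self.best_value = value
            self.best_eval = self.n_evals
        return value

def toy_objective(x):
    # Hypothetical mixed-integer test function: x[0] continuous, x[1] integer.
    return (x[0] - 0.5) ** 2 + (x[1] - 3) ** 2

objective = CountingObjective(toy_objective)
start = time.process_time()   # CPU time, as reported in Table 5
objective([0.4, 3])           # a derivative-free solver would drive these calls
objective([0.6, 2])
cpu_seconds = time.process_time() - start
print(objective.n_evals, objective.best_eval, cpu_seconds)
```

Recording the evaluation index of the incumbent is what separates the "tot" and "opt" columns reported in Table 4 below.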

Tables 4 and 5 present the computational efficiency of the solvers in terms of function evaluations and execution time. Table 4 shows that the function evaluations of the various solvers differ greatly. MIDACO, MISO, SNOBFIT, and all TOMLAB solvers require more than 2000 function evaluations. In some cases, this is due to the solver performing a large number of samples at early function evaluations. In other cases, solvers employ restart strategies to minimize the likelihood of getting trapped in local solutions.

Fig. 15 Efficiency of solvers as a function of the number of function evaluations

As mentioned before, these methods are global optimization methods and thus require more function evaluations. BFO, DAKOTA/MADS, DAKOTA/SOGA, and NOMAD require 1400 to 2000 function evaluations, while DFLBOX and DFLGEN require fewer than 1138 function evaluations. On the other hand, most solvers find their best solution within relatively few function evaluations.

DFLBOX, DFLGEN, and NOMAD find their best solution in less than 650 function evaluations on average. In terms of execution time, all solvers, except for the DAKOTA solvers, MISO, NOMAD, and SNOBFIT, need less than 43 seconds on average to solve the instances, regardless of problem size. The DAKOTA solvers, MISO, NOMAD, and SNOBFIT require considerably more time to solve the problems. DAKOTA, MISO, and SNOBFIT demand CPU times that increase with problem size. Interestingly, NOMAD requires fewer iterations for large problems than for medium ones and solves large problems faster. Even though MIDACO and the TOMLAB solvers implement global optimization methods, they do not require much time compared to other global methods like MISO, NOMAD, and SNOBFIT.

Figure 15 presents the efficiency of each solver in terms of the number of function evaluations performed. The average number of function evaluations, represented by the horizontal axis, measures the computational effort of each solver, while the vertical axis indicates the quality of solutions in terms of the percentage of problems solved. Solvers located in the upper left corner of the figure exhibit good efficiency; approaches found in the lower right area exhibit poor efficiency. BFO, DAKOTA/MADS, DAKOTA/SOGA, DFLGEN, MIDACO, MISO, NOMAD, SNOBFIT, TOMLAB/GLCDIRECT, TOMLAB/GLCFAST, and TOMLAB/GLCSOLVE required more than 1135 function evaluations and solved more than 31% of the problems. The least efficient solver is TOMLAB/MSNLP, which required 2474 function evaluations on average and solved 12% of the problems. DFLBOX required only 750 function evaluations on average and solved 38% of the problems. Figure 15 can also be interpreted as a Pareto front, in which case DAKOTA/MADS, DFLBOX, and NOMAD dominate all others.
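The Pareto-front reading can be checked mechanically. The sketch below implements the dominance test on (average function evaluations, percentage of problems solved) pairs; the four entries are transcribed from the text and the tables, and the remaining solvers would be filled in from Table 4 and Figure 15. This is illustrative, not the scripting behind the figure.

```python
def pareto_front(points):
    """Return the solvers not dominated by any other: a solver dominates
    another if it needs no more evaluations and solves at least as many
    problems, with strict improvement in at least one metric."""
    front = []
    for name, evals, solved in points:
        dominated = any(
            e2 <= evals and s2 >= solved and (e2 < evals or s2 > solved)
            for _, e2, s2 in points
        )
        if not dominated:
            front.append(name)
    return front

solvers = [  # (name, avg. function evaluations, % of problems solved)
    ("DAKOTA/MADS", 1430, 56),
    ("DFLBOX", 750, 38),
    ("NOMAD", 1730, 77),
    ("TOMLAB/MSNLP", 2474, 12),
]
print(pareto_front(solvers))  # ['DAKOTA/MADS', 'DFLBOX', 'NOMAD']
```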

Figure 16 presents the efficiency of each solver in terms of execution time. Even though MISO solved 76% of the problems, it required 4385 seconds on average. Similarly, NOMAD solved 77% of the problems, but it required 2229 seconds on average. DAKOTA/MADS solved 56% of the problems requiring 1266 seconds on average, while SNOBFIT solved 69% of the problems, requiring 208 seconds.

Table 4 Computational effort of solvers in terms of function evaluations (tot: average total number of function evaluations, opt: average function evaluations by which the best solution was found)

Solver              All problems   Small problems   Medium problems   Large problems
                    tot    opt     tot    opt       tot    opt        tot    opt
BFO                 1898   1788    1430   1247      1682   1580       2309   2225
DAKOTA/MADS         1430   1247     152     77       944    787       2459   2201
DAKOTA/SOGA         1977   1797    1302   1041      1970   1748       2295   2190
DFLBOX               750    361     131     92       387    168       1365    659
DFLGEN              1136    482     237    113       841    241       1818    869
MIDACO              2500   1642    2500    720      2500   1547       2500   2110
MISO                2498   1084    2492    677      2500    809       2500   1518
NOMAD               1730    649     586     63      2251    412       1787   1132
SNOBFIT             2464    618    2373     52      2471    463       2500   1018
TOMLAB/GLCDIRECT    2162   1065    2311    862      2291   1245       1979    995
TOMLAB/GLCFAST      2418   1440    2325    831      2406   1357       2473   1794
TOMLAB/GLCSOLVE     2474   1235    2377    887      2496   1374       2500   1269
TOMLAB/MSNLP        2474   1347    2377    873      2496   1393       2500   1524

Table 5 Computational effort of solvers in terms of average execution time (s)

Solver              All problems   Small problems   Medium problems   Large problems
BFO                   16             12               13                 21
DAKOTA/MADS         1266              1             1228               2781
DAKOTA/SOGA          693             68              247               1382
DFLBOX                26              3               14                 48
DFLGEN                42              6               33                 68
MIDACO                21             19               20                 23
MISO                4385           1120             1723               8290
NOMAD               2229           1042             3613               1526
SNOBFIT              208              5               78               1391
TOMLAB/GLCDIRECT      19             18               18                 20
TOMLAB/GLCFAST        21             19               19                 24
TOMLAB/GLCSOLVE       22             20               21                 25
TOMLAB/MSNLP          21             18               19                 23

Fig. 16 Efficiency of solvers in terms of execution time

The best solvers in terms of time efficiency were MIDACO, TOMLAB/GLCDIRECT, TOMLAB/GLCFAST, and TOMLAB/GLCSOLVE, which solved 37–43% of the problems while requiring less than 23 seconds. BFO, DFLBOX, and DFLGEN were less efficient than the aforementioned solvers, as they solved more than 31% of the problems while requiring less than 43 seconds. The remaining solvers were not very efficient because of either large execution times (DAKOTA/SOGA) or the small fraction of problems that they solved (TOMLAB/MSNLP).
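Dividing the times in Table 5 by the evaluation counts in Table 4 gives a rough per-evaluation overhead, which makes this contrast concrete: the surrogate- and MADS-based methods spend on the order of a second per iterate on model building and poll-step logic, while MIDACO and the DIRECT-type TOMLAB solvers spend milliseconds. A small sketch with a few all-problems averages transcribed from the two tables:

```python
# (average execution time in s, average total function evaluations)
effort = {
    "MIDACO": (21, 2500),
    "MISO": (4385, 2498),
    "NOMAD": (2229, 1730),
    "TOMLAB/GLCFAST": (21, 2418),
}
for solver, (seconds, evals) in effort.items():
    # Function evaluations were negligible here, so this is almost
    # entirely solver-internal overhead per iterate.
    print(f"{solver}: {seconds / evals:.3f} s per evaluation")
```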

6 Conclusions

Although significant progress has been made on the algorithmic and theoretical aspects of derivative-free optimization over the past three decades, algorithms addressing problems with discrete variables have not yet attracted much attention. In order to assess the current state of the art in this area, in this paper we presented a systematic comparison of thirteen mixed-integer derivative-free optimization implementations on a large collection of test problems.

Our computational results show that the existing solvers cannot solve large problems and pure-integer problems efficiently.

Figure 17 presents the fraction of problems solved to optimality by all solvers collectively. Solvers were able to solve 93% of the problems. Mixed-integer problems are easier to solve to optimality than pure-integer problems: solvers found the optimal solution for 95% of the mixed-integer problems and 92% of the pure-integer problems. Moreover, non-binary discrete problems are easier to solve to optimality than binary problems; solvers were able to solve 85% and 99% of the binary and non-binary discrete problems, respectively.

Additionally, solvers are very efficient when solving problems with up to 50 variables. More specifically, solvers found the optimal solution on all small problems (one to ten variables), 98% of the medium problems (11 to 50 variables), and 85% of the large problems (51 to 500 variables). Moreover, solvers found the optimal solution on 79% of the large pure-integer problems.

Fig. 17 Problems solved by all solvers collectively as a function of problem type and size

The open-source solvers MISO and NOMAD provided the best solutions among all the solvers tested. MISO is the best performer on large and binary problems, while NOMAD outperforms all solvers on mixed-integer, non-binary discrete, small, and medium-sized problems. DAKOTA/MADS and SNOBFIT also performed well on most types of problems. BFO, DFLBOX, MISO, NOMAD, and SNOBFIT collectively solve 93% of the problems used in this study.

Overall, existing algorithms in this field are complementary in their performance. They should be used collectively rather than individually. The collection of current solvers is capable of solving most small and even medium-sized problems. Clearly, there is a need for the development of algorithms and implementations for large-scale problems. Future work should also investigate the computational performance of derivative-free optimization algorithms on constrained problems.
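As a sketch of what using the solvers collectively could look like in practice, the snippet below runs a small portfolio on the same problem and keeps the best incumbent; the common solver signature is hypothetical, since the packages compared here each expose their own interfaces.

```python
def run_portfolio(problem, solvers, budget_per_solver):
    """Run several derivative-free solvers on one problem and return
    the best point found by any of them."""
    best_x, best_f = None, float("inf")
    for solve in solvers:
        x, f = solve(problem, max_evals=budget_per_solver)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Hypothetical stand-ins for wrappers around two actual solvers.
solver_a = lambda p, max_evals: ([1, 2], p([1, 2]))
solver_b = lambda p, max_evals: ([0, 3], p([0, 3]))

objective = lambda x: x[0] ** 2 + (x[1] - 3) ** 2
print(run_portfolio(objective, [solver_a, solver_b], budget_per_solver=2500))
```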

Supplementary Information The online version contains supplementary material available at https://doi.org/10.1007/s10898-021-01085-0.

Acknowledgements We thank the anonymous referees, whose constructive comments helped improve our manuscript.

Data availability All data generated or analyzed during this study are available from https://sahinidis.coe.gatech.edu/bbo?q=midfo.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
