Abstract
Optimizing industrial processes often involves gray‐box models that couple algebraic glass‐box equations with black‐box components lacking analytic derivatives. Such systems challenge derivative‐based solvers. The classical trust‐region filter (TRF) algorithm provides a robust framework but requires extensive parameter tuning and numerous black‐box evaluations. This work introduces four Hessian‐informed TRF variants that use projected positive definite Hessians for automatic step scaling with minimal tuning, combined with both low‐fidelity (linear, quadratic) and high‐fidelity (Taylor series, Gaussian process) surrogates for local black‐box approximation. Tested on 25 gray‐box benchmarks and five engineering case studies, the new variants achieved up to order‐of‐magnitude reductions in iterations and black‐box evaluations, with reduced sensitivity to tuning parameters relative to the classical TRF algorithm. High‐fidelity surrogates solved 92%–100% of the problems, compared with 72%–84% for low‐fidelity surrogates. The developed TRF methods also outperformed classical derivative‐free optimization solvers. These results show that the new variants offer robust, scalable alternatives for gray‐box optimization.