The space of generalized gradient approximation (GGA) and meta-GGA (mGGA) exchange functionals is systematically explored by training 25 new functionals to yield accurate lattice parameter, cohesive energy, and bandgap predictions. The trained functionals are shown to reproduce exact constraints in a data-driven way and to reveal the accuracy trade-offs among these three properties. They are compared to established mGGA functionals to analyze how changes in the enhancement-factor maps influence prediction accuracy. Some of the trained functionals perform on par with specialized bandgap functionals while outperforming them on the other two properties. The error surface of our trained functionals can serve as a soft limit on what mGGA functionals can achieve.
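As context for the enhancement-factor language above, a minimal sketch of the standard (m)GGA exchange form from the density-functional literature (the symbols \(s\) and \(\alpha\) below follow the common convention and are assumptions, not notation taken from this abstract):

\[
E_x^{\mathrm{mGGA}} = \int n(\mathbf{r})\,\varepsilon_x^{\mathrm{UEG}}\!\big(n(\mathbf{r})\big)\,F_x(s,\alpha)\,\mathrm{d}\mathbf{r},
\qquad
s = \frac{|\nabla n|}{2(3\pi^2)^{1/3}\, n^{4/3}},
\qquad
\alpha = \frac{\tau - \tau_W}{\tau_{\mathrm{unif}}},
\]

where \(\tau\) is the kinetic-energy density, \(\tau_W = |\nabla n|^2/(8n)\) is its von Weizsäcker form, and \(\tau_{\mathrm{unif}} = \tfrac{3}{10}(3\pi^2)^{2/3} n^{5/3}\) is its uniform-gas value. A GGA enhancement factor depends on \(s\) alone, while an mGGA enhancement factor also depends on the iso-orbital indicator \(\alpha\); the enhancement-factor maps compared here are these \(F_x\) surfaces.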