Andreas Graefe writes (see here, here, here):

The usual procedure for developing linear models to predict any kind of target variable is to identify a subset of most important predictors and to estimate weights that provide the best possible solution for a given sample. The resulting “optimally” weighted linear composite is then used when predicting new data. This approach is useful in situations with large and reliable datasets and few predictor variables. However, a large body of analytical and empirical evidence since the 1970s shows that the weighting of variables is of little, if any, value in situations with small and noisy datasets and a large number of predictor variables. In such situations, including all relevant variables is more important than their weighting. These findings have yet to impact many fields. This study uses data from nine established U.S. election-forecasting models whose forecasts are regularly published in academic journals to demonstrate the value of weighting all predictors equally and including all relevant variables in the model. Across the ten elections from 1976 to 2012, equally weighted predictors reduced the forecast error of the original regression models on average by four percent. An equal-weights model that includes all variables provided well-calibrated forecasts that reduced the error of the most accurate regression model by 29 percent.
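The core claim — that in small, noisy samples with many predictors, estimated weights hurt out-of-sample accuracy — is easy to illustrate in a toy simulation. This is not Graefe's election data; the sample sizes, noise level, and coefficient range below are all invented for illustration. The sketch fits "optimal" least-squares weights on a tiny training set and compares them out of sample to an equal-weights composite (sum the predictors, estimate a single scale factor):

```python
import numpy as np

def compare(seed, n_train=15, n_test=1000, p=10, noise=3.0):
    """One replication: fit on a small noisy sample, score out of sample."""
    rng = np.random.default_rng(seed)
    # Pre-selected "relevant" predictors: all true effects positive, similar size
    b_true = rng.uniform(0.5, 1.5, p)
    X_tr = rng.normal(size=(n_train, p))
    X_te = rng.normal(size=(n_test, p))
    y_tr = X_tr @ b_true + rng.normal(0, noise, n_train)
    y_te = X_te @ b_true + rng.normal(0, noise, n_test)

    # "Optimal" regression weights, estimated from the tiny training sample
    b_ols, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)

    # Equal weights: sum the predictors, estimate only one scale factor
    z = X_tr.sum(axis=1)
    a = (z @ y_tr) / (z @ z)

    mse_ols = np.mean((y_te - X_te @ b_ols) ** 2)
    mse_eq = np.mean((y_te - a * X_te.sum(axis=1)) ** 2)
    return mse_ols, mse_eq

results = np.array([compare(s) for s in range(200)])
mse_ols_avg, mse_eq_avg = results.mean(axis=0)
print(f"average out-of-sample MSE, OLS weights:   {mse_ols_avg:.2f}")
print(f"average out-of-sample MSE, equal weights: {mse_eq_avg:.2f}")
```

With 10 predictors and only 15 observations, the variance of the estimated weights swamps whatever bias equal weighting introduces, so the one-parameter composite typically wins — which is the pattern the abstract describes.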

I haven’t actually read the paper, but I have no reason to disbelieve it. I assume that you could get even better performance using a Bayesian approach that puts a strong prior distribution on the coefficients being close to each other. This can be done, for example, in a multiplicative model like this:

Suppose your original model is y = b_0 + b_1*x_1 + b_2*x_2 + . . . + b_10*x_10, and suppose you want the coefficients b_3,…,b_10 to be close to each other. Then you can write b_j = a*g_j, for j=3,…,10. The equal-weighting model sets g_j=1 for all j. A Bayesian version could set g_j ~ N(1,s^2), where s is some small value such as 0.2. Or something like that. This is just an idea I’ve had; I’ve never actually tried it out.
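That idea can be sketched as a MAP (posterior-mode) fit, with some simplifications the post doesn't specify: the multiplicative structure b_j = a*g_j is applied to all p coefficients rather than just b_3,…,b_10, the noise scale sigma is held fixed, and the data are synthetic. Flat priors on a and the intercept are omitted (no intercept here), and the N(1, s^2) prior on the g_j appears as a quadratic penalty:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, p = 15, 10
b_true = rng.uniform(0.5, 1.5, p)   # invented "true" coefficients
X = rng.normal(size=(n, p))
sigma = 3.0                          # noise sd, held fixed for simplicity
y = X @ b_true + rng.normal(0, sigma, n)

s = 0.2  # prior scale: how far each g_j may drift from 1

def neg_log_post(theta):
    a, g = theta[0], theta[1:]
    resid = y - X @ (a * g)
    # Gaussian likelihood plus the N(1, s^2) prior on g; the prior also
    # pins down the otherwise unidentified split of scale between a and g
    return resid @ resid / (2 * sigma**2) + (g - 1) @ (g - 1) / (2 * s**2)

theta0 = np.ones(1 + p)  # start at the equal-weights solution g_j = 1
fit = minimize(neg_log_post, theta0, method="L-BFGS-B")
a_hat, g_hat = fit.x[0], fit.x[1:]
b_hat = a_hat * g_hat
print("estimated coefficients:", np.round(b_hat, 2))
```

Setting s to 0 recovers pure equal weighting; letting s grow recovers ordinary least squares, so the prior scale interpolates between the two extremes discussed above.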