Population and Ecological Models
 
 
Parsimony

 

     
   

Models with more free parameters typically 'fit' the data better (i.e., the model's error variance is reduced), but this improvement comes at a cost that has to be considered.

The trade-off is that the uncertainty of parameter estimates and of predictions increases as the likelihood of the model improves (i.e., as the data are better explained by the model).

Too few parameters tend to underfit the data and too many overfit them; underfitting inflates estimation bias, while overfitting inflates estimation variance.
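The underfitting/overfitting trade-off can be sketched numerically. The example below is a hypothetical illustration (it assumes a quadratic signal with Gaussian noise; none of the data come from the text): polynomials of increasing order are fitted by least squares, and training error always falls as parameters are added, while error on held-out data eventually worsens.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a quadratic signal plus Gaussian noise (illustration only).
x = np.linspace(0, 1, 40)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0, 0.2, x.size)

# Held-out points from the same process, for checking prediction error.
x_new = np.linspace(0.05, 0.95, 25)
y_new = 1.0 + 2.0 * x_new - 3.0 * x_new**2 + rng.normal(0, 0.2, x_new.size)

def fit_errors(degree):
    """Fit a polynomial of the given degree; return (training, held-out) MSE."""
    coef = np.polyfit(x, y, degree)
    train_mse = np.mean((np.polyval(coef, x) - y) ** 2)
    test_mse = np.mean((np.polyval(coef, x_new) - y_new) ** 2)
    return train_mse, test_mse

for d in (1, 2, 6):
    train, test = fit_errors(d)
    print(f"degree {d}: training MSE {train:.4f}, held-out MSE {test:.4f}")
```

Because the polynomial models are nested, the training MSE can never increase as the degree grows; only the held-out error reveals the cost of the extra parameters.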

You may know the principle of parsimony in non-statistical terms if you are familiar with 'Occam's razor', which advocates favouring the simplest hypothesis consistent with your knowledge.

Provided the model passes goodness-of-fit diagnostics, the 'best' model is the one that optimally balances the competing pressures of improved model fit against increased bias and parameter uncertainty.

Fortunately, inference and ranking tools such as Akaike's Information Criterion (AIC and its variants AICc, QAIC, and QAICc), the Bayesian Information Criterion (BIC), and likelihood-ratio tests assist investigators in deciding upon the 'best' model.
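These criteria are simple functions of a model's maximized log-likelihood, its number of estimated parameters k, and the sample size n. A minimal sketch (the model names and log-likelihood values below are hypothetical, not taken from the text):

```python
import math

def aic(log_lik, k):
    """Akaike's Information Criterion: AIC = -2 log L + 2k."""
    return -2.0 * log_lik + 2.0 * k

def aicc(log_lik, k, n):
    """Small-sample corrected AIC (requires n - k - 1 > 0)."""
    return aic(log_lik, k) + 2.0 * k * (k + 1) / (n - k - 1)

def bic(log_lik, k, n):
    """Bayesian Information Criterion: BIC = -2 log L + k log n."""
    return -2.0 * log_lik + k * math.log(n)

# Hypothetical fits: (name, maximized log-likelihood, no. of parameters).
n = 50
fits = [("M1", -112.4, 2), ("M2", -108.9, 4), ("M3", -108.2, 7)]

# Rank models by AICc: lower is better.
ranked = sorted(fits, key=lambda f: aicc(f[1], f[2], n))
print("AICc ranking:", [name for name, *_ in ranked])
```

In this toy comparison M3 has the highest likelihood, yet the penalty for its extra parameters moves a simpler model ahead of it, which is parsimony in action.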

However, the 'best' model may still not be a good model if it fails goodness-of-fit diagnostics or retrospective analysis.