I haven't read the linked article, but weighting the front end (inputs) of the system under optimization with a covariance matrix is an excellent idea. In fact, this is exactly what the Kalman filter does to deal with stochastic inputs. (Note: the general Kalman filter solution is MIMO, multi-input multi-output. But the "lame" solution used in finance is single-input, single-output [SISO], so today's Kalman solution employed in finance just weights by the reciprocal of the variance [one input], not by the covariance [multiple inputs].)
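To make the SISO point concrete, here's a minimal sketch (in Python, purely illustrative; the function and variable names are my own, not from any library) of the scalar Kalman measurement update. The reciprocal-of-the-variance weighting shows up in the gain: a noisy input (large measurement variance) gets a small weight.

```python
# Scalar (SISO) Kalman measurement update -- illustrative sketch only.
def kalman_update(x_est, p_est, z, r):
    """x_est: prior state estimate, p_est: prior estimate variance,
    z: new measurement, r: measurement noise variance."""
    k = p_est / (p_est + r)          # gain: large r (noisy input) -> small weight
    x_new = x_est + k * (z - x_est)  # blend prior and measurement
    p_new = (1.0 - k) * p_est        # updated estimate variance
    return x_new, p_new

# A very noisy measurement (r = 100) barely moves the estimate:
x, p = kalman_update(0.0, 1.0, z=10.0, r=100.0)
```

With `r = 100` the gain is 1/101, so the estimate moves only about 0.1 toward the measurement of 10. The MIMO version replaces the scalars with vectors and the variance `r` with a full measurement covariance matrix.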
But I love the idea of weighting the optimization problem by the covariance matrix. If you can find a paper that solves this problem with matrix algebra, I may look at it (but not this year). Of course, you'll need to install a linear-systems package (e.g., Math.NET) to solve that part of the problem. I don't think Math.NET is multi-core, but I don't think multi-core will help with the linear-systems part anyway; a large on-chip cache will. You could, however, optimize several stocks independently with multi-core, which would speed up the collective optimization of the individual DataSets.
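For illustration, here's what that covariance weighting looks like in the objective. This is a NumPy sketch rather than Math.NET, and the names are hypothetical: the cost weights a residual vector by the inverse covariance of the inputs (a Mahalanobis-style cost), which collapses to reciprocal-of-variance weighting when the covariance is diagonal.

```python
# Covariance-weighted cost: r^T C^{-1} r -- illustrative NumPy sketch.
import numpy as np

def weighted_cost(residuals, cov):
    """Weight residuals by the inverse covariance matrix.
    With a diagonal cov this reduces to reciprocal-of-variance weighting."""
    w = np.linalg.solve(cov, residuals)  # computes C^{-1} r without inverting C
    return float(residuals @ w)

r = np.array([1.0, 2.0])
cov_diag = np.diag([0.5, 4.0])  # independent inputs (diagonal covariance)
# Diagonal case: 1**2 / 0.5 + 2**2 / 4.0 = 2 + 1 = 3
cost = weighted_cost(r, cov_diag)
```

Note the use of `np.linalg.solve` rather than forming the matrix inverse explicitly; that's the standard numerically-stable route, and it's the kind of call a linear-systems package like Math.NET provides on the .NET side.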
QUOTE:
... is important especially [since] PSO is not compatible with the BTUtils multi-core tool.
And that's the real fix. The PSO works pretty well, but it needs to be updated. Any idea when that might happen?
Lastly, I wouldn't try optimizing more than six Preferred Values at a time with any technique. For the fastest, most time-variant variables, I would employ adaptive indicators. If you don't have one, then write one by adapting an old indicator. Remember, any parameter optimizer is going to set that parameter to a constant value for the entire Data Range, and in most cases that's not what you want.
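As one concrete example of the adaptive-indicator idea, here's a sketch of Kaufman's Adaptive Moving Average (KAMA), a well-known adaptive indicator: the smoothing constant is re-derived from the data on every bar via the efficiency ratio, instead of being optimized once to a single constant over the whole Data Range. The implementation below is my own simplified Python version, not taken from any particular platform.

```python
# Kaufman's Adaptive Moving Average (KAMA) -- simplified illustrative sketch.
def kama(prices, er_period=10, fast=2, slow=30):
    fast_sc = 2.0 / (fast + 1)   # fastest allowed smoothing constant
    slow_sc = 2.0 / (slow + 1)   # slowest allowed smoothing constant
    out = [prices[0]]
    for i in range(1, len(prices)):
        j = max(0, i - er_period)
        change = abs(prices[i] - prices[j])                # net move
        vol = sum(abs(prices[k] - prices[k - 1])
                  for k in range(j + 1, i + 1))            # path length
        er = change / vol if vol > 0 else 0.0              # efficiency ratio, 0..1
        sc = (er * (fast_sc - slow_sc) + slow_sc) ** 2     # adaptive constant
        out.append(out[-1] + sc * (prices[i] - out[-1]))
    return out
```

In a clean trend the efficiency ratio approaches 1 and the average speeds up; in choppy, sideways data it approaches 0 and the average slows down. That per-bar adaptation is exactly what a constant parameter from an optimizer can't give you.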