QUOTE:
Building a new PC. One of the goals is to get backtests (esp. on minute data going way back) to be as fast as possible.
One-minute data is a great deal of data. You'll need a CPU with as much cache as possible so the problem fits in cache.
And you may want to tweak some of the internal caching behavior of WL so it isn't caching anything more than necessary, so you're using every bit of your cache as judiciously as possible.
The other thing you'll be fighting is the Garbage Collection (GC) behavior of the .NET framework. Be sure that no array created with the "new" operator ever changes size. This includes any declared DataSeries, since they all use the new operator. As long as the new operator isn't called (either to create an object or to resize one), the GC won't need to run. Remember the rule Hewlett Packard uses for all their avionics: all dynamic arrays must be declared and sized before the aircraft takes off. That's primarily to avoid calling the GC so deterministic execution time is realized when servicing hard real-time events. But in your case, you want to avoid calling the GC because it will greatly increase your overall execution time.
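To make the "size everything up front" idea concrete, here's a minimal illustration in plain C# (no WealthLab dependency, and the series math is made up purely for demonstration): the arrays are allocated once before the loop, so the hot loop only writes into existing storage and generates no new allocations for the GC to collect.

```csharp
using System;

class Preallocation
{
    const int Bars = 100_000;

    static void Main()
    {
        // Allocated once, sized up front -- no further "new" calls in the loop,
        // so no garbage is produced while it runs.
        double[] close = new double[Bars];
        double[] smoothed = new double[Bars];

        var rng = new Random(42);
        for (int i = 0; i < Bars; i++)
            close[i] = 100.0 + rng.NextDouble();

        int before = GC.CollectionCount(0);

        // The hot loop: reads and writes existing storage only.
        for (int bar = 1; bar < Bars; bar++)
            smoothed[bar] = 0.9 * smoothed[bar - 1] + 0.1 * close[bar];

        int after = GC.CollectionCount(0);
        Console.WriteLine($"Gen-0 collections during loop: {after - before}");
    }
}
```

Contrast this with appending to a growing List&lt;double&gt; inside the loop, which forces repeated reallocation and copying as the backing array outgrows its capacity.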
You can start by creating any DataSeries
before entering the trading loop.
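A hedged sketch of what that looks like in a WealthScript strategy (the exact class names, constructors, and signal methods here are assumptions that vary by WL version -- treat this as the shape of the idea, not a drop-in script):

```csharp
protected override void Execute()
{
    // Create every DataSeries ONCE, before the trading loop, sized to the
    // full bar count, so no "new" (and hence no GC) happens inside the loop.
    DataSeries spread = new DataSeries(Bars, "spread");  // pre-sized custom series
    DataSeries fast = SMA.Series(Close, 10);             // build indicators up front too
    DataSeries slow = SMA.Series(Close, 50);

    for (int bar = 50; bar < Bars.Count; bar++)
    {
        spread[bar] = fast[bar] - slow[bar];             // writes into existing storage

        if (!IsLastPositionActive && spread[bar] > 0)
            BuyAtMarket(bar + 1);
        else if (IsLastPositionActive && spread[bar] < 0)
            SellAtMarket(bar + 1, LastPosition);
    }
}
```

The key point is that every series on the right-hand side of the loop already exists at its final size before the first iteration.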
If you're streaming charts, then set your data ranges relative to a fixed number of bars, so the size of the dynamic arrays remains constant. Streaming intrabar activity during the trading day over a date range (instead of a fixed bar range) will dynamically change array sizes, which is why the GC is called (pausing WL) with each new bar that comes into the streaming chart.