IQFeed provides second data, and 30-second data files are much smaller than 1-minute files: the 30-second files contain fewer than 100K bars (about 6 months of data) versus 1.3M bars for 1-minute data. Updates are much quicker, and backtesting over the past few weeks is faster all around. Programmatically, second data converts to any data scale, but the WLD GUI does not work nicely with it. For example, 1-minute data converts to 5-minute, 10-minute, ..., Daily, Weekly via the dropdown in the upper-left corner of the display; anything reasonable is possible. Second data, however, will not convert to minute data using that dropdown. To display 5-minute data, you must use 300 seconds. The conversions exist in WLD but need to be connected to the GUI. It would also help if the GUI remembered that the last "Custom intraday scale" was seconds rather than minutes.
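Since the dropdown won't do the conversion, the custom intraday scale has to be typed in as seconds; the arithmetic is just minutes × 60 (e.g. 5-minute bars = 300 seconds, as noted above). A trivial helper, purely illustrative and not part of WLD:

```python
def minute_scale_to_seconds(minutes):
    """Convert a minute-based bar scale to the equivalent value for
    WLD's "Custom intraday scale" field, in seconds.
    (Hypothetical helper; the conversion is simply minutes * 60.)"""
    return minutes * 60

# 5-minute bars must be entered as 300 seconds
print(minute_scale_to_seconds(5))
```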
QUOTE:
Second data will not convert to minute data using the dropdown.
Thank you for reporting this. There are two workarounds to make WL pick up the proper scale:
1. Enable on-demand data updates.
2. Update an existing IQFeed DataSet with 1-minute (5-minute, etc.) data.
QUOTE:
IQFeed provides second data, and 30 second datafiles are much smaller than 1 minute datafiles because the 30 second files only contain less than 100K bars or the data for 6 months rather than the 1.3M bars for 1 Minute data.
File size is the downside of having extensive minute-based history going back many years. In a future version we will consider optimizing the 1-minute data files.
How about using the Data Tool to truncate the first part of your 1-minute histories? (Is there a better word than "truncate" for removing data at the beginning?)
Select your DataSet, and select Truncate All data... Before 1/1/2020, for example.
There are times when being able to run a backtest on historical data is very useful, but when using WL as a highly customized RT analysis screen it helps to keep the data small. Optimizing the 60-second data files for RT analysis while leaving the 1-minute files full for backtesting would be highly desirable. However, making the charting work properly is the starting point. Updating all DataSet timeframes for every symbol would be unreasonable, but not impossible. If I enable the on-demand update, does it download all data for a symbol in that timeframe?
QUOTE:
(Does another word exist to "truncate" at the beginning?)
Regarding Cone's question, maybe "Trim". Trim before date.
I am trying to use the Data Tool's truncation feature and must be using it incorrectly. I would like to limit a DataSet's symbols to the last N bars. I have tried Truncate Before and After with 100K bars, and both appear to do the same thing: they keep the first 100K bars, and then the symbol is automatically updated when charted, undoing the work. I have also tried Before and After with Days, and it behaves just like Bars, always keeping only the first 100K bars. What does work is eliminating data before a date, but it is painful: trimming a symbol with about 1.4M bars down to the last 128K bars takes just over 7 minutes per symbol. What am I doing wrong?
In my testing the feature worked both ways. If the Date option works for you, that's fine. Sorry, but we probably cannot increase the speed: the Data Tool makes a standard API call, and I don't see any shortcuts.
You might try to create a new .WL file using WealthScript's SaveToFile.
1. Set up a loop with necessary Bars.Count (e.g. last 100K bars)
2. Loop backwards over your original Bars object
3. Keep only the last bars
4. Save
5. Replace the original .WL file
See the QuickRef for a code sample.
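The steps above can be sketched in plain Python. This is not the WealthScript API (Bars and SaveToFile are WL's names; refer to the QuickRef for the real calls) and it only illustrates the keep-the-last-N-bars logic on a generic list of bar records:

```python
# Illustrative sketch only -- not the WealthScript Bars/SaveToFile API.
# Demonstrates keeping only the most recent N bars from a full history.

def keep_last_bars(bars, n):
    """Return only the most recent n bars (or all bars if fewer exist)."""
    if n <= 0:
        return []
    return bars[-n:]

# Hypothetical bar records: (timestamp, open, high, low, close, volume)
bars = [(t, 100.0, 101.0, 99.0, 100.5, 1000) for t in range(1_400_000)]

# Keep the last 128K bars, then the result would be saved over the
# original .WL file (the save step is WL-specific and omitted here).
trimmed = keep_last_bars(bars, 128_000)
```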
QUOTE:
What am I doing wrong?
Check the Date option to provide a reference to truncate Before your specified Date. Forget about the number of bars.
Robert's answer is precise. You can disregard SaveToFile.