Welcome to NexusFi: the best trading community on the planet, with over 150,000 members
I am impressed with all the contributions in this thread. I will read this again and try to make use of it...
However, the real problems I have had with NT 7 so far came from another source: NT sometimes used more than 50% of the CPU after I added some custom indicators to various charts in my workspace, even though all of the indicators had CalculateOnBarClose = true.
The reason for the high CPU usage was the customized plot. Custom plots are recalculated for every incoming tick, and this can have a huge impact on performance.
The solution is to define an external method that writes the new input values to a DataSeries or array object, and then have the Plot() override check for changed values before performing the drawing calculations.
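As a rough illustration of that idea, here is a minimal sketch (hypothetical names, not the poster's actual code) of an NT7 indicator that writes values into a DataSeries from OnBarUpdate() and has the Plot() override bail out early when nothing has changed since the last repaint:

```csharp
// Sketch only: cache the last value that was actually drawn, and skip the
// expensive drawing work in Plot() when the cached value is unchanged.
// ComputeValue() is a placeholder for the indicator's real calculation.

private DataSeries cachedValues;            // written once per update below
private double lastPlottedValue = double.NaN;

protected override void OnBarUpdate()
{
    // write the new input value to the DataSeries; no drawing happens here
    cachedValues.Set(ComputeValue());
}

public override void Plot(Graphics graphics, Rectangle bounds, double min, double max)
{
    // only repaint when the most recent cached value has actually changed
    if (cachedValues.Count == 0 || cachedValues[cachedValues.Count - 1] == lastPlottedValue)
        return;

    lastPlottedValue = cachedValues[cachedValues.Count - 1];
    base.Plot(graphics, bounds, min, max);  // or custom GDI+ drawing here
}
```

Note that even with CalculateOnBarClose = true, custom plotting can still be triggered on every tick, which is exactly why the change check above pays off.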
Great participation and help from everyone who has been visiting here lately!
I attempted to incorporate Richard's latest suggestions into the DoubleStochasticOptimized. I posted it briefly until I realized it had problems. Will re-post after it is fixed.
Regarding the previous post by FatTails: if the PlotGraphics method is used, the update frequency can be throttled by enforcing a minimum number of milliseconds between screen refreshes, as is done in the RealStats indicator:
if (lastRefresh.AddMilliseconds(LevelIIRefreshDelay) < DateTime.Now)
{
    lastRefresh = DateTime.Now;
    // do the expensive refresh/redraw work here, at most once per delay interval
}
It looks like the two biggest drains on performance are excessive graphics demand, and excessive and redundant use of calls to external indicators. Addressing both as applicable is very productive, especially combined with the efficiency improvements of NT7.
This version of the Double Stochastics has every coding trick I can think of.
All calls to external indicators are pre-instantiated.
It uses Richard's FastMAX and FastMIN indicators instead of MAX and MIN.
The redundant p2 DataSeries from the earlier version has been removed.
Calls to the FastMIN and FastMAX indicators can only happen on the first tick of a bar, and only if the internal logic can no longer be sure of the actual minimum and maximum values. Usually it WILL be sure, and will not need to call these external indicators at all.
I consider this to be a coding exercise. Is it worth it? Not sure....
.... I am Ming the Merciless. I will RULE THE UNIVERSE!
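The "only call the external indicator when the internal logic can no longer be sure" trick described above can be sketched like this (hypothetical field and variable names, not the posted indicator's code): keep a running minimum over the lookback window, and only fall back to a full re-scan when the bar dropping out of the window was itself the old minimum.

```csharp
// Sketch of incremental min tracking for an NT7 indicator.
// A full lookback scan (the stock MIN indicator, or Richard's FastMIN)
// is only needed when the cached minimum has expired out of the window.

private double runningMin    = double.MaxValue;
private int    runningMinBar = -1;              // bar index of the cached minimum

protected override void OnBarUpdate()
{
    if (!FirstTickOfBar)
        return;                                  // external calls only on the first tick

    if (Low[1] <= runningMin)
    {
        // the just-closed bar sets a new minimum: cache it, no scan needed
        runningMin    = Low[1];
        runningMinBar = CurrentBar - 1;
    }
    else if (CurrentBar - runningMinBar >= Period)
    {
        // the old minimum just left the window, so ONLY NOW do we pay for
        // the expensive external call to re-scan the whole lookback
        runningMin    = MIN(Low, Period)[1];
        runningMinBar = CurrentBar - 1;          // conservative: forces a re-check sooner
    }
    // otherwise the cached runningMin is still guaranteed to be correct
}
```

The same pattern applies symmetrically to the running maximum with High and MAX/FastMAX.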
Another hint: if you use collections in your code, you have to be aware of the complexity of accessing and inserting data.
Here are the objects I use the most.
The simplest object is List.
It's basically a checked, auto-growing, typed array.
Access/read is very quick (if you access by index, of course).
You can sort the List, which is O(n log n) complexity.
Then you have the Dictionary.
It's a generic, typed hashtable.
Access/read is O(1) complexity, meaning access time doesn't depend on the size of the Dictionary, and is very quick (though slower than an array, since there is the hashing to compute).
Then you have the sorted objects: SortedList and SortedDictionary.
SortedList and SortedDictionary both have O(log n) access.
SortedList has O(n) insert; SortedDictionary has O(log n) insert.
You can access a SortedList by index (not possible with a Dictionary).
So remember that when you're processing tick data you will be manipulating large structures, so choose the object with the lowest complexity that fits your need.
In special cases, if you do lots of random inserts and modifications and only rarely need to access the sorted values, it can be quicker to use a List and sort it when needed than to use a SortedList.
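To make the trade-offs above concrete, here is a small self-contained C# sketch (illustrative names only) showing each collection and the sort-on-demand trick from the last point:

```csharp
// Illustration of the complexity trade-offs described above.
// The comments give complexities, not measured benchmarks.
using System;
using System.Collections.Generic;

class CollectionComplexityDemo
{
    static void Main()
    {
        // List<T>: a typed, auto-growing array.
        var ticks = new List<double> { 101.25, 101.00, 101.50 };
        double first = ticks[0];              // O(1) access by index
        ticks.Sort();                         // O(n log n) when you sort it

        // Dictionary<K,V>: a typed hashtable. O(1) lookup regardless of
        // size, but unordered and not accessible by index.
        var volumeAtPrice = new Dictionary<double, long>();
        volumeAtPrice[101.25] = 500;          // O(1) insert
        long v = volumeAtPrice[101.25];       // O(1) lookup

        // SortedDictionary<K,V>: O(log n) insert and lookup, keys kept
        // sorted. SortedList<K,V> also has O(log n) lookup but O(n)
        // insert, and unlike Dictionary it CAN be read by index.
        var sorted = new SortedDictionary<double, long>();
        sorted[101.50] = 200;
        sorted[101.00] = 300;                 // stays ordered by price key

        // The tip from the post: with many random inserts and only an
        // occasional need for sorted output, List + one Sort() call can
        // beat paying SortedList's O(n) cost on every single insert.
        var prices = new List<double>();
        for (int i = 0; i < 1000; i++)
            prices.Add(1000 - i);             // amortized O(1) per insert
        prices.Sort();                        // one O(n log n) pass at the end

        Console.WriteLine(prices[0]);         // smallest value after the sort
    }
}
```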
Since MAX and MIN maintain the max and min, I'm not sure why the indicator code keeps checking whether the low is lower than minLow[0]... if it's working correctly, it never should be. So without running the code, I imagine OnBarUpdate() should look more like this:
Much simpler. Maybe I misunderstood something, but glancing at the code, that's what it looks like to me.
I'm still trying to get my head around this, so can you guys help me with something simple? An example of something that I find to be painfully slow is a Market Analyzer grid I use every week or so that looks at the 14-, 21-, 50-, and 200-day ROC of a basket of roughly 70 ETFs. The default ROC code is this:
Is there an example of how this code could be optimized? I'm still not getting what types of calculations should be taken out of OnBarUpdate... or am I totally missing something? Thanks for the enlightenment.
The above will execute every time a BarUpdate is triggered. However, during the lifetime of a bar, the value of CurrentBar never changes. So after executing the first time, on successive updates to the same bar, this statement calculates the same thing over and over and over again.
It should only execute ONCE per bar, as soon as a new bar starts, i.e. on FirstTickOfBar.
So try this (and before doing the division on the Value.Set line, be sure that the value of the denominator cannot be zero):
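The optimized code itself was not preserved here, so as a sketch only: assuming the stock NT7 ROC formula, the part that depends solely on CurrentBar can be hoisted so it is computed once per bar instead of on every tick, with the zero-denominator guard the poster asks for:

```csharp
// Sketch, not the poster's attachment. barsAgo depends only on CurrentBar,
// so it is recomputed just once per bar rather than on every incoming tick.
private int barsAgo;

protected override void OnBarUpdate()
{
    if (FirstTickOfBar || CalculateOnBarClose)
        barsAgo = Math.Min(CurrentBar, Period);   // constant for this bar

    double past = Input[barsAgo];
    if (past != 0)                                // guard the division
        Value.Set(100 * ((Input[0] - past) / past));
}
```

The `|| CalculateOnBarClose` term is a defensive assumption so the cached value is still set when the indicator runs once per closed bar.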
I get the FirstTickOfBar thing if CalculateOnBarClose == false, but in this particular case it's historical EOD data.
In retrospect, my performance issue is probably because I'm calling four indicators per instrument (one each for the 14, 21, 50, and 200 ROC) on roughly 90 instruments.
So perhaps that was a poor choice on my part to use an example.
Let me look at some of the code posted here and I'll post back if I'm still confused.