Artificial Intelligence (Machine Learning) - Advanced computer algorithms used to solve generally complex problems.
Internal nexusfi.com (formerly BMT) AI Links
External Research AI Links
ABC Information:
Particle Swarm Optimization:
AI/Machine Learning Libraries:
- AI - Artificial Intelligence (Machine Learning)
- ABC - Artificial Bee Colony
- ACO - Ant Colony Optimization
- ANN - Artificial Neural Network
- CGP - Cartesian Genetic Programming
- GA - Genetic Algorithm
- GP - Genetic Programming
- HMM - Hidden Markov Model
- NN - Neural Network
- ML - Machine Learning (Artificial Intelligence)
- PSO - Particle Swarm Optimization
- SBC - Simulated Bee Colony
- SVM - Support Vector Machine
Generally, to apply Artificial Intelligence/Machine Learning, the problem to be solved needs to be defined first. This seems logical, but it is sometimes skipped and coders rush straight to a solution. Consider what the goal of the system is (a small labeling sketch follows this list):
- Predict the next close price
- Predict market movement X bars out (Up/Down/Chop)
- Is the market condition Choppy?
- What is the confidence the current trend will continue?
- etc.
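As a minimal sketch of turning one of these goals into a supervised learning target, the "market movement X bars out" question could be labeled from a plain Close price series as below. The function name label_direction and the bars_out and chop_threshold parameters are illustrative assumptions, not part of any particular library:

```python
import numpy as np

def label_direction(closes, bars_out=5, chop_threshold=0.001):
    """Label each bar +1 (up), -1 (down) or 0 (chop) based on the
    fractional change of the Close 'bars_out' bars into the future."""
    closes = np.asarray(closes, dtype=float)
    future = closes[bars_out:]              # Close[t + bars_out]
    current = closes[:-bars_out]            # Close[t]
    change = (future - current) / current   # fractional move over the horizon
    return np.where(change > chop_threshold, 1,
           np.where(change < -chop_threshold, -1, 0))

# Example: label a short synthetic Close series 3 bars out
closes = [100.0, 100.5, 101.2, 100.9, 101.8, 102.5, 102.4, 103.0]
print(label_direction(closes, bars_out=3, chop_threshold=0.005))
```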
Once the problem has been defined, one can then turn to preparing a hypothesis to solve it.
The hypothesis is the formula of the proposed solution. So if, in real life, the solution to a time-based problem is Y(t), the hypothesis, generally written as h(t), will be the computer's approximate solution. The difference between these two values makes up the error in the system, and it is usually expressed as the mean squared error:
Mean Squared Error = average of (h(t) - Y(t))^2 over the training examples ("the difference between the hypothesis solution and the actual value, squared, then averaged")
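A minimal sketch of this error measure, assuming the hypothesis outputs and the actual values are available as simple arrays:

```python
import numpy as np

def mean_squared_error(predicted, actual):
    """Average of the squared differences between hypothesis output h(t)
    and the actual value Y(t) over all examples."""
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return float(np.mean((predicted - actual) ** 2))

# Example: three hypothesis predictions vs. the actual next-bar Closes
print(mean_squared_error([101.0, 101.5, 102.0], [100.8, 101.9, 101.7]))
```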
The hypothesis is a formula made up of parameters (usually referred to as X) chosen to help solve the problem and their associated weights (usually referred to as Theta, although other symbols are also used):
h(t) = Theta0 + Theta1*X1 + Theta2*X2 + ... + ThetaN*XN
Notice the Theta0 value with no parameter (or, equivalently, a parameter fixed at 1) is referred to as the Bias term. This helps the system converge when the solution doesn't pass through the origin (0,0,0 if you will).
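A minimal sketch of evaluating this hypothesis, assuming the weights and parameters are held in NumPy arrays; the leading 1 stands in for the bias term's parameter:

```python
import numpy as np

def hypothesis(theta, x):
    """h(t) = Theta0 + Theta1*X1 + ... + ThetaN*XN, written as a dot
    product with a leading 1 acting as the bias term's parameter."""
    x_with_bias = np.concatenate(([1.0], np.asarray(x, dtype=float)))
    return float(np.dot(theta, x_with_bias))

# Example: a bias weight plus two weighted parameters
theta = np.array([0.5, 0.8, -0.2])   # Theta0 (bias), Theta1, Theta2
x = np.array([101.3, 2.1])           # X1, X2 (e.g. last Close, a trend measure)
print(hypothesis(theta, x))
```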
The parameters X1, X2, ..., XN need to be selected such that an expert in the proposed problem might agree there is enough data to reasonably supply an accurate hypothesis value. The expert might not know how to combine the data reliably themselves, but needs to agree that this data should help provide a solution and is sufficient; finding the actual weights is left to the computer optimization, which works over the complex hyperplane (an N-dimensional virtual surface) formed by the hypothesis.
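As an illustration of that optimization step (a sketch, not the only possible method), batch gradient descent can search for the Theta values that minimize the mean squared error; the learning_rate and iteration count below are illustrative assumptions:

```python
import numpy as np

def fit_theta(X, y, learning_rate=0.01, iterations=5000):
    """Batch gradient descent on the mean squared error.
    X: (m, n) matrix of parameters, y: (m,) vector of actual values Y(t)."""
    m, n = X.shape
    Xb = np.hstack([np.ones((m, 1)), X])   # prepend 1 for the bias term
    theta = np.zeros(n + 1)                # Theta0 ... ThetaN, start at zero
    for _ in range(iterations):
        predictions = Xb @ theta           # h(t) for every training example
        gradient = (2.0 / m) * Xb.T @ (predictions - y)
        theta -= learning_rate * gradient  # step downhill on the error surface
    return theta

# Example: recover known weights from synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 1.5 + 2.0 * X[:, 0] - 0.5 * X[:, 1]    # true Theta = [1.5, 2.0, -0.5]
print(fit_theta(X, y))
```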
A good example of a poor hypothesis for predicting the next bar's Close price would be:
h(t) = Theta0 + Theta1*Close[t-1]
Experts will look at the above, which effectively says that knowing only the current Close is enough to predict the next one, and conclude it is simply not a good enough hypothesis statement. Clearly, if the market is in a strong up trend or down trend, the Close will tend to move in the direction of that trend, yet this trend data isn't present in the hypothesis to aid in predicting the next Close. Such a model is likely to behave like a random number generator that has little to do with market price and will not generalize to predict market Close prices.
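To make the contrast concrete, here is a minimal sketch of a richer parameter set that does carry trend information, such as the last return and the slope of a simple moving average alongside the last Close. The helper name build_features and the trend_window parameter are hypothetical:

```python
import numpy as np

def build_features(closes, t, trend_window=10):
    """Parameters X for predicting Close[t+1]: last Close, last return,
    and the slope of a simple moving average as a crude trend measure."""
    closes = np.asarray(closes, dtype=float)
    last_close = closes[t]
    last_return = closes[t] / closes[t - 1] - 1.0
    sma_now = closes[t - trend_window + 1 : t + 1].mean()
    sma_prev = closes[t - trend_window : t].mean()
    return np.array([last_close, last_return, sma_now - sma_prev])

# Example: features for the latest bar of a synthetic uptrending series
closes = 100.0 + np.cumsum(np.full(30, 0.2))
print(build_features(closes, t=len(closes) - 1))
```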
Artificial Bee Colony (ABC)
Place Holder
Particle Swarm Optimization (PSO)
Place Holder