VB Neural Net V1.12
Please note that VB Neural Net is an older development.
VB BackProp is not an attempt to write yet another neural net tutorial. There are many very good ones on the Internet already, which we would only plagiarize poorly anyway. We therefore recommend that users who are not yet familiar with neural nets search the web and/or take a look at: ftp://ftp.sas.com/pub/neural/ (the neural nets newsgroup FAQ).
The following tools are here to provide the beginner/intermediate VBA programmer with some code to get going with their own backprop neural nets.
VB BackProp is a simple and versatile little neural net application for MS Excel, written entirely in Visual Basic for Applications (MS Excel 97 / 2000 / XP / 2K3). The possible drawback is, to some extent, speed when compared with C/C++ equivalents. However, it does a fair job in small neural applications, where one tries to predict a value from small patterns (ideally 5, and up to 10, inputs) over around 1,000 examples in the training set. As the provided example shows, mapping nonlinear functions with VB BackProp is simple and effective. Another example (a stock market predictor applied to AOL) is also available.
Many of its features make it a little more than your ordinary feed-forward MLP. Its learning algorithm is the well-known BackProp, with either incremental or batch learning, using the popular Generalized Delta Rule, i.e. with learning rate and momentum.
There are, we believe, two main benefits in using this program: its versatility, and optional access to the source code at a very reasonable price.
Many commercial software packages give the user very little to play with. It can indeed be difficult for most traders to fully comprehend algorithm details, and there is perhaps no real need to, as long as the neural net is used in the circumstances it was designed for. Too many traders, however, put their faith in programs without really knowing what they are doing. VB BackProp users, and more particularly programmers, will quickly realize the importance of finding the right settings to get their neural nets to function properly. For those of you who have explored the world of neural nets, there are indeed faster algorithms than BackProp (QuickProp, RProp, conjugate gradient, for instance), and we shall implement them later according to demand.
It is true that the BackProp algorithm requires setting a learning rate, as well as a momentum factor, to guide its training process, whereas other algorithms like those cited above don't. However, the real complexity lies primarily in setting many training parameters other than the learning rate, like the optimal size of the neural net, or when to stop training. And this applies to all neural techniques.
The good old BackProp is still pretty good once one has experimented with the problem and come up with a good learning rate. In practice, and for specific applications like stock market prediction, one usually keeps the learning rate around 0.1 to 0.3, with a momentum a little higher (0.3 to 0.7), and rather focuses on input selection. Whatever the model, algorithm, and parameter settings, noise in your input variables is likely to be your worst enemy.
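The Generalized Delta Rule mentioned above boils down to a very small update formula per weight. A minimal sketch follows, in Python rather than the tool's own VBA, with purely illustrative names and constants:

```python
# Generalized Delta Rule sketch: each weight update combines the gradient
# term (scaled by the learning rate) with a fraction of the previous
# update (the momentum term). Names and constants are illustrative only.

def delta_rule_update(weight, gradient, prev_delta, lr=0.1, momentum=0.5):
    """Return (new_weight, delta) for a single weight."""
    delta = -lr * gradient + momentum * prev_delta
    return weight + delta, delta

# Example: repeated updates on a 1-D quadratic error E(w) = w**2,
# whose gradient is 2*w. Momentum smooths the successive steps.
w, d = 5.0, 0.0
for _ in range(50):
    w, d = delta_rule_update(w, 2.0 * w, d, lr=0.1, momentum=0.5)
```

With the learning rate and momentum in the ranges suggested above, the iterate spirals down toward the minimum at w = 0.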
We do not pretend to give users all the answers here. However, as many traders are more familiar with Visual Basic and Excel than with C/C++, this tool can serve as an excellent stepping stone into the field of neural nets for traders who wish to craft a neural net tool of their own. There is of course the inevitable compromise between understandability and speed, but the flexibility and versatility of this tool make it definitely worth having.
Weight updating can indeed occur incrementally, i.e. after each example presentation to the net, or in batch (epoch) mode, i.e. after one or several passes through the whole training set. The implementation of the batch mode lays the groundwork for faster learning techniques like QuickProp or RProp.
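The two update modes can be contrasted in a few lines. This is an illustrative Python sketch for a single weight, not the VBA source; `grad` stands in for whatever per-example gradient the net computes:

```python
# Incremental vs batch weight updating, sketched for one weight.
# `grad(w, x)` is a stand-in for the per-example error gradient.

def train_incremental(w, examples, grad, lr):
    # The weight changes after every example presentation.
    for x in examples:
        w -= lr * grad(w, x)
    return w

def train_batch(w, examples, grad, lr):
    # Gradients are accumulated over the whole epoch, then applied once.
    # This accumulated sum is what QuickProp/RProp-style methods build on.
    total = sum(grad(w, x) for x in examples)
    return w - lr * total
```

Incremental mode uses the freshest weight for each example, while batch mode sees one clean gradient per epoch; the two generally converge to similar answers but along different paths.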
Among the less common features included here:
Although the basic model only covers the conventional 3-layer net, the code allows several hidden layers (more than two is a waste of time and effort anyway), as well as jump links, to be added very easily.
It is possible to start with no hidden layer and slowly populate the hidden layer with nodes in order to find the optimum size for your network (akin to the Cascade Correlation technique).
Nodes are added one by one until convergence slows to a crawl. This helps determine the optimal net size. We have implemented one node-addition technique, but programmers may choose to be more aggressive and decide to add nodes according to, for instance, the shape of the error curve.
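The growing loop itself is simple. Below is a hedged Python sketch of the idea; `train_and_score` is a hypothetical stand-in for retraining the net at a given hidden-layer size and returning its error, and the stopping threshold is illustrative:

```python
# Constructive sketch: grow the hidden layer one node at a time and stop
# once adding a node no longer improves the error by at least min_gain.
# `train_and_score(n)` is a hypothetical callback: train a net with n
# hidden nodes and return its training error.

def grow_hidden_layer(train_and_score, max_nodes=20, min_gain=0.01):
    best_err = train_and_score(0)        # start with no hidden layer
    nodes = 0
    for n in range(1, max_nodes + 1):
        err = train_and_score(n)
        if best_err - err < min_gain:    # convergence has slowed to a crawl
            break
        best_err, nodes = err, n
    return nodes, best_err
```

A more aggressive variant could, as noted above, inspect the shape of the error curve rather than a single improvement threshold.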
The learning rate can be reduced gradually or according to the error curve.
Here again, programmers can use various adaptive learning-rate techniques to accelerate or decelerate the learning process. We have implemented a single learning rate for the whole neural net, but one can easily define a learning rate at layer or node level, for those who would want to implement Delta-Bar-Delta, SuperSAB, or variations thereof.
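The core of a Delta-Bar-Delta-flavoured scheme fits in a few lines. A minimal Python sketch, with illustrative constants (the classic rule raises a per-weight rate additively while the gradient keeps its sign, and cuts it multiplicatively when the sign flips):

```python
# Per-weight adaptive learning rate, Delta-Bar-Delta flavour (sketch).
# Constants `up` and `down` are illustrative, not tuned values.

def adapt_rate(rate, grad, prev_grad, up=0.01, down=0.5):
    if grad * prev_grad > 0:      # same direction as last step: speed up
        return rate + up
    if grad * prev_grad < 0:      # sign flip suggests overshoot: slow down
        return rate * down
    return rate                   # zero gradient: leave the rate alone
```

Applied at node or layer level instead of per weight, the same rule gives the coarser-grained variants mentioned above.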
All are similar S-shaped sigmoid functions. The logistic computes slightly faster than tanh. Tanh is sometimes considered better in trading because of its [-1, 1] output range; we have, however, not noticed any significant improvement in most tests. Inverse Abs, f(x) = x / (1 + |x|), is often used in brain simulation. It is in our view often just as good as the others, and it is much faster to compute.
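For reference, the three activations compare as follows. A small Python sketch (the "Inverse Abs" fast sigmoid is usually written f(x) = x / (1 + |x|), which squashes into (-1, 1) without an exponential):

```python
import math

# The three S-shaped activation functions discussed above.

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))   # output in (0, 1)

def tanh_act(x):
    return math.tanh(x)                 # output in (-1, 1)

def inverse_abs(x):
    return x / (1.0 + abs(x))           # output in (-1, 1), no exp() call
```

The speed advantage of `inverse_abs` comes from avoiding the `exp` evaluation entirely, at the cost of a slightly flatter curve near the origin.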
This is a technique used to confirm a local optimum. We have not implemented the weight penalty factor commonly seen in research papers, though, but our own variant.
Not to be confused with input jitter, i.e. noise addition. This option is not supported by research, but it should help confirm a global/local minimum of the error function.
Inputs/outputs must be present in the active Excel workbook. To keep things simple, inputs are assumed to occupy the first contiguous columns of the worksheet, with a blank column separating the inputs from the output. We have not implemented multiple outputs, but the algorithm allows for this.
Inputs/outputs are scaled according to their min/max. An option to correct the min/max according to a number of standard deviations from the mean has been added; this can reduce the effect of outliers in your data set.
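The scaling scheme can be sketched as follows, in illustrative Python rather than the VBA source; the `k_std` clamp is the outlier-softening option described above:

```python
import statistics

# Min/max scaling with an optional clamp of the min/max at
# mean +/- k_std standard deviations, to soften the effect of outliers.
# Function and parameter names are illustrative.

def scale(values, k_std=None):
    lo, hi = min(values), max(values)
    if k_std is not None:
        m = statistics.fmean(values)
        s = statistics.pstdev(values)
        lo = max(lo, m - k_std * s)
        hi = min(hi, m + k_std * s)
    span = hi - lo or 1.0
    # Values beyond the corrected min/max are clipped into [0, 1].
    return [min(1.0, max(0.0, (v - lo) / span)) for v in values]
```

With the clamp active, a single extreme value no longer compresses the rest of the column into a narrow band; it is simply clipped at 1.0 (or 0.0).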
In order to best balance training and generalization, the neural net can be trained against a test set. Although this neural net utility could be used for many purposes, we have kept a historical split: assuming older data is at the top of your data set, the training set is made of the first n% of records, with the following records used for testing and/or evaluation.
Cross-validation is used here as an early-stopping technique. In a nutshell, the neural net can stop training on a time limit, a number of iterations (data scans), a maximum error swing, or error stagnation, calculated on either the training or the test set.
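The stagnation criterion, for instance, can be sketched like this. A Python illustration under assumed names (`patience` and `min_delta` are hypothetical knobs, not settings from the tool):

```python
# Early-stopping sketch: given a sequence of per-epoch errors (monitored
# on either the training or the test set), stop once the error has not
# improved by at least min_delta for `patience` consecutive epochs.

def stopping_epoch(errors, patience=3, min_delta=1e-4):
    """Return the epoch index at which training would stop."""
    best, stale = float("inf"), 0
    for epoch, err in enumerate(errors):
        if best - err > min_delta:
            best, stale = err, 0      # real improvement: reset the counter
        else:
            stale += 1                # stagnation
            if stale >= patience:
                return epoch
    return len(errors) - 1            # never stagnated long enough
```

Monitoring the test-set error with such a rule is what keeps the net from memorizing the training data outright.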
Some users will find Visual Basic to be a little slow compared to C/C++, and this is true to an extent, but it really isn't that bad. VB handles numerical variables pretty well. The speed difference would be more significant if we dealt with string variables. We encourage experienced VB programmers to try and improve on the code performance. Although this is not a dedicated programmers' tool, the real value of VB BackProp undoubtedly lies in the optional access to source code.
The main benefit is definitely the ability to learn more about neural nets, in particular feed-forward MLPs. Users of other neural net software will better appreciate the complexity of finding the right parameters, and will take a second look at their software's hidden or default parameters. Most, if not all, learning algorithms are converging processes that fit a model to the data set provided. Unlike many other programs, we have given very extensive control over this process here, both on the user forms and, obviously, in the VB code.
Released as is. There will be no update.
Price: VB BackProp is available in 3 different versions, all ALMOST FREE OF CHARGE: we invite you instead to contribute to a welfare organization such as the W.W.F., UNESCO, Greenpeace, or similar.
Spreadsheet neural net sample: Click here to display an Excel sample of a mapping of a quadratic function of 4 variables: VB Net Sample.
Installation: Like any MS Excel add-in (contact us for assistance). A new icon, represented by an upper-case bold N, will appear on the main menu bar.
More Info: Some technical details are available here
Page last modified:
May 08, 2008