NXL2 API Specifications

In previous versions, we had kept the Microsoft VC "decorated" function names, allegedly for performance reasons, with the caveat that aliases were needed to indicate memory usage for the parameters. In the latest version, we have recompiled the API with a definition (.def) file, which is more in line with good coding practice, i.e. making things easier for the user.  The impact on performance is barely noticeable, although the DLL is certainly bigger.  In the good ol' days, many would argue that a DLL must be as small and as fast as possible.

I take no side here, and will gladly recompile the DLL for anyone who wants to do extensive performance analysis in this respect.  In the meantime, you'll see below the original declarations, with due notice that you may have to remove ALL the aliases below to use the latest version; otherwise the DLL will return the error "Can't find entry point for ... in NXL2 DLL".

In the rare event that the API specs change, please refer to the sample spreadsheets, which will always be updated to the latest specs.

Enabling the Library

Public Declare Function EnableLibrary Lib "NXL2.DLL" Alias "_EnableLibrary@4" (cn As String12) As Integer

The EnableLibrary call should be made with your personal code (a 12-character string) at the beginning of your code to enable NXL2 in its full mode.  If this call is omitted, NXL2 will remain in demo mode.

Please note that demo mode is no longer enforced.  There is a freeware mode that will work for most neural exercises, and a commercial unrestricted version will be compiled on request... and payment... :)
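As a minimal sketch, enabling the library might look like the following. The placeholder code string and the assumption that a non-positive return indicates demo/freeware mode are illustrative only; consult the sample spreadsheet for the exact return convention.

```vb
' Hypothetical usage sketch. "XXXXXXXXXXXX" stands in for your personal
' 12-character code; the <= 0 test for demo mode is an assumption.
Dim code As String12
Dim ret As Integer
code.s = "XXXXXXXXXXXX"          ' must be exactly 12 characters
ret = EnableLibrary(code)
If ret <= 0 Then
    Debug.Print "NXL2 is running in demo/freeware mode."
End If
```

Note that the declaration takes a `String12` parameter, a fixed-length string type which must be defined in the VB client (e.g. `Private Type String12 : s As String * 12 : End Type`, as assumed above).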

Input/Output Settings

 

Public Declare Function Load_TestPatternOutput Lib "NXL2.DLL" _
Alias "_Load_TestPatternOutput@12" (ByRef outno As Integer, ByRef patno As Long, ByRef out As Double) As Double
Public Declare Function Load_TrainPatternOutput Lib "NXL2.DLL" _
Alias "_Load_TrainPatternOutput@12" (ByRef outno As Integer, ByRef patno As Long, ByRef out As Double) As Double
Public Declare Function Load_TestPatternInput Lib "NXL2.DLL" _
Alias "_Load_TestPatternInput@12" (ByRef inpno As Integer, ByRef patno As Long, ByRef inp As Double) As Double
Public Declare Function Load_TrainPatternInput Lib "NXL2.DLL" _
Alias "_Load_TrainPatternInput@12" (ByRef inpno As Integer, ByRef patno As Long, ByRef inp As Double) As Double
Public Declare Function Save_TrainPatternInput Lib "NXL2.DLL" _
Alias "_Save_TrainPatternInput@8" (ByRef inpno As Integer, ByRef patno As Long) As Double
Public Declare Function Save_TestPatternOutput Lib "NXL2.DLL" _
Alias "_Save_TestPatternOutput@8" (ByRef outno As Integer, ByRef patno As Long) As Double
Public Declare Function Save_TrainPatternOutput Lib "NXL2.DLL" _
Alias "_Save_TrainPatternOutput@8" (ByRef outno As Integer, ByRef patno As Long) As Double
Public Declare Function Save_TestPatternInput Lib "NXL2.DLL" _
Alias "_Save_TestPatternInput@8" (ByRef inpno As Integer, ByRef patno As Long) As Double

These calls can be used in loops (For-Next or Do-While). This gives better control on the VB side, with negligible overhead, as loading data is fast and is done only once.  The sample spreadsheet client accesses patterns sequentially in a loop, but patterns can also be accessed individually and in any order if so desired.  Please note that it is pointless to load any data before building the neural net structure (the function call is described below).

Please note that Inputs, Outputs, and Pattern Numbers are all 1-based, i.e.
    Inputs range from 1 to i,
    Outputs from 1 to o,
    Patterns Numbers from 1 to size.

From version 1.3, the loading calls are functions (previously Subs) that return the value received by the DLL.  This can help verify that the correct data was sent to the NXL2 DLL.

Please note that NXL2 ONLY TAKES NUMERICAL DATA.  In most commercial neural software, conversion is done as part of data pre-processing, so here it has to be done within the VB client or on the Excel sheet itself.
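The sequential-loading loop described above could be sketched as follows. The array names (`trainData`, `trainTarget`) and the dimension variables are illustrative; only the NXL2 calls and the 1-based indexing come from the specs.

```vb
' Sketch: load the training set sequentially, assuming the net is built
' and trainData/trainTarget hold numerical values (illustrative names).
Dim i As Integer, o As Integer
Dim p As Long
Dim echoed As Double
For p = 1 To numPatterns                    ' pattern numbers are 1-based
    For i = 1 To numInputs                  ' inputs are 1-based
        echoed = Load_TrainPatternInput(i, p, trainData(i, p))
        ' From v1.3 the return value echoes what the DLL received,
        ' so it can be compared to trainData(i, p) as a sanity check.
    Next i
    For o = 1 To numOutputs                 ' outputs are 1-based
        echoed = Load_TrainPatternOutput(o, p, trainTarget(o, p))
    Next o
Next p
```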

Sizes

Public Declare Sub Set_TrainingSize Lib "NXL2.DLL" Alias "_Set_TrainingSize@4" (ByVal numpat As Integer)
Public Declare Sub Set_TestSize Lib "NXL2.DLL" Alias "_Set_TestSize@4" (ByVal numpat As Integer)
Public Declare Sub Set_MaxNumNodes Lib "NXL2.DLL" Alias "_Set_MaxNumNodes@4" (ByVal numnodes As Integer)
Public Declare Sub Set_NumInputs Lib "NXL2.DLL" Alias "_Set_NumInputs@4" (ByVal NumInputs As Integer)
Public Declare Sub Set_NumOutputs Lib "NXL2.DLL" Alias "_Set_NumOutputs@4" (ByVal NumOutputs As Integer)

The library uses dynamic arrays internally to allocate enough memory for the problem at hand.  Simpler neural libraries (mostly shareware/freeware) often use fixed-size arrays, which limit the number of inputs, outputs, etc., and can run into problems if array boundaries are not properly controlled from the VB client.  NXL2 users do not have to worry about these issues.

Before the net is built internally, NXL2 creates all the required data structures, so the number of inputs, the number of outputs, and the size of the training set must be defined first.  Failing to do so will safely abort the building process.

MaxNumNodes does not need to be set in the current version.  By default, a net will be constructed with just enough nodes for the input, hidden, and output layers.
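Putting the above together, a typical setup sequence might look like this; the dimensions chosen are illustrative only.

```vb
' Sketch: define problem dimensions before building the net.
' The actual numbers depend on your data set.
Set_NumInputs 5            ' 5 input columns
Set_NumOutputs 1           ' 1 output column
Set_TrainingSize 500       ' 500 training patterns
Set_TestSize 100           ' 100 test patterns
' Set_MaxNumNodes is optional in the current version: the net is
' sized automatically for inputs + hidden + outputs.
```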

Training Parameters ("Accessors")

Public Declare Function Get_NumInputs Lib "NXL2.DLL" Alias "_Get_NumInputs@0" () As Integer
Public Declare Function Get_NumOutputs Lib "NXL2.DLL" Alias "_Get_NumOutputs@0" () As Integer
Public Declare Function Get_NumHiddens Lib "NXL2.DLL" Alias "_Get_NumHiddens@0" () As Integer
Public Declare Function Get_MaxNumNodes Lib "NXL2.DLL" Alias "_Get_MaxNumNodes@0" () As Integer

Public Declare Sub Set_Epsilon Lib "NXL2.DLL" Alias "_Set_Epsilon@8" (ByVal e As Double)
Public Declare Sub Set_WeightRange Lib "NXL2.DLL" Alias "_Set_WeightRange@8" (ByVal wr As Double)
Public Declare Sub Set_SigOffset Lib "NXL2.DLL" Alias "_Set_SigOffset@8" (ByVal o As Double)
Public Declare Sub Set_HyperErr Lib "NXL2.DLL" Alias "_Set_HyperErr@4" (ByVal h As Integer)
Public Declare Sub Set_Decay Lib "NXL2.DLL" Alias "_Set_Decay@8" (ByVal d As Double)
Public Declare Sub Set_Seed Lib "NXL2.DLL" Alias "_Set_Seed@4" (ByVal s As Integer)
Public Declare Sub Set_OFMagnitude Lib "NXL2.dll" Alias "_Set_OFmagnitude@4" (ByVal of As Integer)
Public Declare Sub Set_SplitEpsilon Lib "NXL2.dll" Alias "_Set_SplitEpsilon@4" (ByVal se As Integer)
Public Declare Sub Set_Tolerance Lib "NXL2.dll" Alias "_Set_Tolerance@8" (ByVal tol As Double)
Public Declare Sub Set_MaxFactor Lib "NXL2.DLL" Alias "_Set_MaxFactor@8" (ByVal mf As Double)
Public Declare Sub Set_Pruning Lib "NXL2.DLL" Alias "_Set_Pruning@4" (ByVal p As Integer)
Public Declare Sub Set_PruningCutOff Lib "NXL2.DLL" Alias "_Set_PruningCutOff@8" (ByVal pco As Double)
Public Declare Sub Set_JumpLinks Lib "NXL2.DLL" Alias "_Set_JumpLinks@4" (ByVal j As Integer)

Public Declare Function Get_Epsilon Lib "NXL2.DLL" Alias "_Get_Epsilon@0" () As Double
Public Declare Function Get_WeightRange Lib "NXL2.DLL" Alias "_Get_WeightRange@0" () As Double
Public Declare Function Get_SigOffset Lib "NXL2.DLL" Alias "_Get_SigOffset@0" () As Double
Public Declare Function Get_HyperErr Lib "NXL2.DLL" Alias "_Get_HyperErr@0" () As Integer
Public Declare Function Get_Decay Lib "NXL2.DLL" Alias "_Get_Decay@0" () As Double
Public Declare Function Get_Seed Lib "NXL2.DLL" Alias "_Get_Seed@0" () As Integer
Public Declare Function Get_MaxFactor Lib "NXL2.DLL" Alias "_Get_MaxFactor@0" () As Double
Public Declare Function Get_Pruning Lib "NXL2.DLL" Alias "_Get_Pruning@0" () As Integer
Public Declare Function Get_PruningCutOff Lib "NXL2.DLL" Alias "_Get_PruningCutOff@0" () As Double
Public Declare Function Get_JumpLinks Lib "NXL2.DLL" Alias "_Get_JumpLinks@0" () As Integer

Most training parameters have default values as prescribed in the neural network literature. It is however possible to change most of the internal training parameters after the net is built and before training starts.

  1. Epsilon is the equivalent of the learning rate in conventional backpropagation (generally a low positive value, from 0.1 to 0.5).
  2. WeightRange sets the range of the initial weights.  In most cases, any value in the region of 0.5 to 1 seems to have little effect on neural performance.  Initial node weights will be generated randomly between -WeightRange and +WeightRange.
  3. Seed can be used to reset the random number generator.  If Seed is set to 0, the random generator will be re-initialised for every new net; otherwise the fixed seed will be used.  Only advanced users will be interested in this feature, in order to test training parameters under exactly the same initial conditions.
  4. SigOffset is used to reduce the effect of flat spots on the error curve, which can slow down the learning process. A value of 0.1 is often used.
  5. HyperErr is set to 0 or 1.  A value of zero tells the neural net to use straight deltas, i.e. Net Output - Actual Output, as the error level (cost function).  A value of 1 applies a hyperbolic arctangent to that difference to accentuate error levels. HyperErr = 1 is recommended by Prof. S. Fahlman.
  6. Decay is a penalty factor applied to larger weights in the neural network.  In most instances, keep the minute negative default value (-0.0001), or disable decay altogether (0).
  7. MaxFactor is a parameter set to 1.75 by default, to control possible algorithm overshooting, which could cause unnecessary oscillations while training.
  8. Pruning is used to delete insignificant connections, hence simplifying the neural network structure. Use 0 to disable pruning.
  9. PruningCutOff is the weight cut-off value.  Pruning is generally a good measure to simplify neural architectures.  It is recommended to train a few nets and check average node weights before applying pruning.
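The parameters above can be set between BuildNet and training; the specific values below are just plausible choices based on the ranges discussed, not recommendations.

```vb
' Sketch: override a few defaults after BuildNet and before training.
Set_Epsilon 0.25          ' learning rate, typically 0.1 to 0.5
Set_WeightRange 0.5       ' initial weights drawn in [-0.5, +0.5]
Set_SigOffset 0.1         ' reduce flat-spot slowdown
Set_HyperErr 1            ' arctan error accentuation (Fahlman)
Set_Decay -0.0001         ' minute negative decay, or 0 to disable
Set_Seed 12345            ' fixed seed for reproducible runs
' Read a value back to confirm the DLL registered it:
Debug.Print "Epsilon = "; Get_Epsilon()
```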

Saving / Loading the Neural Net
Public Declare Function Get_Weight Lib "NXL2.DLL" Alias "_Get_Weight@8" (ByVal i As Integer, ByVal j As Integer) As Double
Public Declare Sub Set_Weight Lib "NXL2.DLL" Alias "_Set_Weight@12" (ByVal i As Integer, ByVal j As Integer, ByVal val As Double)

Please note that the weight array is meaningless unless the neural net structure is also saved.  The sample spreadsheet builds a net and then either trains it or runs it.  In real-life applications, the two processes are generally separated.  It is therefore the VB programmer's responsibility to regenerate a neural net identical to the original one.
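A hedged sketch of dumping and restoring the weights follows. The assumption that the weight matrix is indexed by node pairs from 1 to Get_MaxNumNodes is mine; check the sample spreadsheet for the true index ranges.

```vb
' Sketch: dump the weight matrix so an identical net can be rebuilt
' later.  ASSUMPTION: indices run over node pairs 1..Get_MaxNumNodes;
' verify the exact bounds against the sample spreadsheet.
Dim n As Integer, i As Integer, j As Integer
n = Get_MaxNumNodes()
ReDim w(1 To n, 1 To n) As Double
For i = 1 To n
    For j = 1 To n
        w(i, j) = Get_Weight(i, j)
    Next j
Next i
' To restore: rebuild the identical net structure first, then call
' Set_Weight i, j, w(i, j) inside the same loops.
```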

Input Contributions

Public Declare Function Contribution Lib "NXL2.DLL" Alias "_Contribution@8" (ByVal inputno As Integer, ByVal outputno As Integer) As Double

This function retrieves the individual contribution of each input. The formula used is available from our Documents section (in PDF format).  Our CF calculation is a truer reflection of input participation than other commonly used methods.  You will also notice that the contributions do not sum to 100%, as our contribution factors also carry direction (sign).  It is however trivial to normalize the factors to 100% if so desired.  The individual participation of inputs can then easily be calculated (CF(i)/SumCF).

Should you prefer the simpler method often used elsewhere, it can easily be derived from the saved net by taking individual weights in the hidden layer divided by the sum of weights in that same layer.
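The CF(i)/SumCF normalisation mentioned above might be sketched as follows. Using absolute values in the denominator is my assumption, since the factors are signed; `numInputs` is an illustrative variable.

```vb
' Sketch: normalise contribution factors for output 1 to a 100% scale.
' ASSUMPTION: SumCF is taken over absolute values because the factors
' carry sign; adjust if you prefer a signed sum.
Dim i As Integer, sumCF As Double
Dim cf() As Double
ReDim cf(1 To numInputs)
For i = 1 To numInputs
    cf(i) = Contribution(i, 1)      ' contribution toward output 1
    sumCF = sumCF + Abs(cf(i))
Next i
For i = 1 To numInputs
    Debug.Print "Input "; i; " share: "; 100# * cf(i) / sumCF; "%"
Next i
```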

Build Net / Release Net

Public Declare Function BuildNet Lib "NXL2.DLL" Alias "_BuildNet@16" _
(NumInputs As Integer, NumHidden As Integer, NumOutputs As Integer, _
jumplink As Integer) As Integer

Public Declare Function BuildNetL Lib "NXL2.DLL" Alias "_BuildNetL@12" _
(NumInputs As Integer, NumHidden As Integer, NumOutputs As Integer) As Integer

Public Declare Sub ReleaseNet Lib "NXL2.DLL" Alias "_ReleaseNet@0" ()

Two calls are provided to build a net: the first specifies in its parameter list whether jump links will be used, the second does not.  Jump links combine searches for linear and non-linear solutions, and often give better results.  The NXL2 library does not specify individual jump links between a particular input and an output.   If a link is redundant, training will attribute a low weight to it, which can eventually be pruned.

Jump links help detect linear features in the data set.

Both BuildNet and BuildNetL return a value:

  •     0     Build OK
  •     -1    Demo mode
  •     -2    Neural net construct failed: wrong parameters like negative number of inputs, etc
  •     -3    DLL memory already allocated (has previous net's memory been released?)
  •     -4    DLL memory allocation failure (very rare event... is the neural net huge?)

Public Declare Function Get_MemAllocated Lib "NXL2.DLL" Alias "_Get_MemAllocated@0" () As Integer

If a neural net is in memory, the function will return a positive value, otherwise 0.
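The return codes above can be handled in the client along these lines; the 5-3-1 topology is illustrative. Note that the BuildNet declaration passes its parameters ByRef, so variables (not literals) are used.

```vb
' Sketch: build a 5-3-1 net with jump links, checking the return code.
Dim ni As Integer, nh As Integer, nOut As Integer, jl As Integer
Dim rc As Integer
ni = 5: nh = 3: nOut = 1: jl = 1
If Get_MemAllocated() > 0 Then ReleaseNet   ' avoid error -3
rc = BuildNet(ni, nh, nOut, jl)
Select Case rc
    Case 0:  Debug.Print "Build OK"
    Case -1: Debug.Print "Demo mode"
    Case -2: Debug.Print "Construct failed: check parameters"
    Case -3: Debug.Print "Memory already allocated"
    Case -4: Debug.Print "Memory allocation failure"
End Select
```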

Training / Testing the Neural Net

Public Declare Function Train_Epochs Lib "NXL2.DLL" Alias "_Train_Epochs@4" (ByVal NumEpochs As Integer) As Double
Public Declare Function Train_1Epoch Lib "NXL2.DLL" Alias "_Train_1Epoch@0" () As Double
Public Declare Function RunTrainingPatternThroughNet Lib "NXL2.DLL" Alias "_RunTrainingPatternThroughNet@8" _
(ByVal patno As Long, ByVal outno As Integer) As Double
Public Declare Function RunTestPatternThroughNet Lib "NXL2.DLL" Alias "_RunTestPatternThroughNet@8" _
(ByVal patno As Long, ByVal outno As Integer) As Double

Public Declare Function TotalNetError Lib "NXL2.DLL" Alias "_TotalNetError@0" () As Double
Public Declare Function AvgTrainNetError Lib "NXL2.DLL" Alias "_AvgTrainNetError@0" () As Double
Public Declare Function AvgTestNetError Lib "NXL2.DLL" Alias "_AvgTestNetError@0" () As Double

' Testing Procedure. TestNet returns TotalTestError
Public Declare Function TestNet Lib "NXL2.DLL" Alias "_TestNet@4" (ByVal outno As Single) As Double

' Get/Reset Test Error
Public Declare Function GetTestError Lib "NXL2.DLL" Alias "_GetTestError@0" () As Double
Public Declare Sub ResetTestError Lib "NXL2.DLL" Alias "_ResetTestError@0" ()

Training can be executed one epoch at a time in a VB loop using Train_1Epoch, or many epochs at once using a single Train_Epochs call.  The latter is more efficient and gives the user better control over network calibration. Calibration is the number of training epochs run before the net is validated against the Test Set (an epoch is one presentation of the entire Training Set to the neural network).  In some cases, when predictions or classifications are relatively easy, the Calibration parameter can be set high, up to 100 or more.

For better flexibility, training and test patterns are fed into the neural network within VB loops, using the RunTrainingPatternThroughNet and RunTestPatternThroughNet function calls.
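A training loop with a calibration of 10 epochs between validations could be sketched as below. The stopping logic (tracking the best test error) is my addition, not part of the API; the round count and output number are illustrative.

```vb
' Sketch: train in rounds of 10 epochs, validating after each round.
' Best-error tracking is a simple early-stopping idea, not an API feature.
Dim bestErr As Double, trainErr As Double, testErr As Double
Dim cal As Integer, r As Integer
cal = 10: bestErr = 1E+30
For r = 1 To 50
    trainErr = Train_Epochs(cal)        ' run 10 epochs in the DLL
    testErr = TestNet(1)                ' validate against output 1
    If testErr < bestErr Then bestErr = testErr
    Debug.Print r, trainErr, testErr
Next r
```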


NXL Version Number

Public Declare Function NXL_VersionNo Lib "NXL2.DLL" Alias "_NXL_VersionNo@0" () As Double

Lastly, for those of you interested in the boring technicalities, the actual original DLL function exports are available here.


Last update: 2007/12/08

Copyright ForeTrade Technologies 21st century and thereafter