The program for the neural network tool is structured in such a way
that training classes selected in the neural network tool can also
be used in the maximum likelihood classifier. This facilitates
running i.maxlik within GRASS to validate the neural network output.
In GRASS, the maximum likelihood classifier assumes a Gaussian
distribution for the training data; this is a widely used method for
satellite imagery classification. When using the tool, the user is
asked to enter the name of the output map layer, the number of
output classes, and the names of the input map layers.
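As a point of reference for validation, the per-class quantity a
Gaussian maximum likelihood classifier evaluates can be sketched as
follows. This is a minimal illustration, not the i.maxlik source: it
assumes diagonal class covariances, and the function name
gaussian_discriminant is hypothetical.

    #include <math.h>

    /* Hypothetical sketch: log-likelihood discriminant for one class,
     * assuming a diagonal covariance matrix (a full covariance would
     * be used in practice; this is a simplified illustration).
     * x:     pixel values across the input bands
     * mean:  class mean per band
     * var:   class variance per band
     * nband: number of input bands
     * The pixel is assigned to the class with the largest value. */
    static double gaussian_discriminant(const double *x, const double *mean,
                                        const double *var, int nband)
    {
        double g = 0.0;
        int b;

        for (b = 0; b < nband; b++) {
            double d = x[b] - mean[b];
            g += -0.5 * log(var[b]) - 0.5 * d * d / var[b];
        }
        return g;
    }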
Using the lump option of the menu, the tool selects the "dominant" category
within a specified window and generates a new map layer. The user can
reset the region resolution to that of the specified window, or retain
the resolution that was in effect when the tool was entered. In
existing GRASS routines, when the resolution (window) of a region is
enlarged, the value of the middle pixel of the window at the lower
resolution is selected.
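The "dominant" category can be read as the modal category of the
window. A minimal sketch of such a selection, assuming categories are
small non-negative integers; the function name pick_dominant is
hypothetical, not taken from the tool's source.

    #include <stdlib.h>

    /* Hypothetical sketch of the lump operation: return the category
     * value that occurs most often among the cells of one window.
     * cells:  category values of the window's cells
     * n:      number of cells in the window
     * maxcat: largest category value that can occur */
    static int pick_dominant(const int *cells, int n, int maxcat)
    {
        int *count = calloc(maxcat + 1, sizeof(int));
        int i, best = 0;

        for (i = 0; i < n; i++)
            count[cells[i]]++;
        for (i = 1; i <= maxcat; i++)
            if (count[i] > count[best])
                best = i;
        free(count);
        return best;
    }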
Training areas are selected using the define areas option.
Using the zoom option, the user can zoom in on parts of the output map
in which training areas are to be delineated. Training areas can be
delineated by clicking on points, drawing circles, or drawing
polygons.
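Deciding which cells fall inside a drawn polygon reduces to a
point-in-polygon test. A minimal sketch using the standard even-odd
(ray crossing) rule; this is illustrative, not the tool's actual hit
test.

    /* Hypothetical sketch: even-odd test for whether point (x, y) lies
     * inside the polygon given by nvert vertices (vx[i], vy[i]). */
    static int point_in_polygon(double x, double y,
                                const double *vx, const double *vy, int nvert)
    {
        int i, j, inside = 0;

        for (i = 0, j = nvert - 1; i < nvert; j = i++) {
            if ((vy[i] > y) != (vy[j] > y) &&
                x < (vx[j] - vx[i]) * (y - vy[i]) / (vy[j] - vy[i]) + vx[i])
                inside = !inside;
        }
        return inside;
    }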
Using the delete function, the user can interactively select
and delete polygons; the number of samples remaining after deletion is
shown in the window.
Histograms of the training sites can be examined, and signatures can
be saved for use with i.maxlik. Once the user is satisfied with the
selected training sites, all input map layers are sampled: at the
intersections of the training areas with the input map layers,
training data for the neural network are gathered.
Training data are stored in an
ASCII file so that the user may examine and, if necessary, change
them. Input data to the network are obtained cell-wise from all
areas of the input maps.
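One way to picture this sampling and storage step: for every cell that
falls inside a training area, one pattern is written per line, holding
one value per input layer followed by the class label. The exact file
format is defined in nntool.c; the layout and names below are
illustrative assumptions.

    #include <stdio.h>

    /* Hypothetical sketch: write one ASCII training pattern per cell.
     * layers[l][c] holds the value of input layer l at cell c;
     * label[c] holds the class of cell c (cells outside training
     * areas are assumed to carry a negative label). */
    static void dump_training_data(FILE *fp, double **layers, int nlayers,
                                   const int *label, int ncells)
    {
        int c, l;

        for (c = 0; c < ncells; c++) {
            if (label[c] < 0)          /* not in a training area */
                continue;
            for (l = 0; l < nlayers; l++)
                fprintf(fp, "%g ", layers[l][c]);
            fprintf(fp, "%d\n", label[c]);
        }
    }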
The classes option of the neural network tool
lets the user examine the distribution of the data when two input map
layers are used. For higher input dimensions, it is necessary to link
the tool to a more sophisticated program such as xgobi. The user may
eliminate outliers and data conflicts by drawing rectangular boxes
around data points. If necessary, a whitening and diagonalization
operation can be performed on the data to achieve better class
separability. Unlike with traditional classifiers, careful
preprocessing of the training data should be performed, since neural
networks give equal consideration to all data.
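For the two-layer case, whitening and diagonalization can be pictured
as rotating the data onto the eigenvectors of its covariance matrix
and rescaling each axis to unit variance. A minimal two-dimensional
sketch, assuming non-degenerate data; the function name whiten2d is
hypothetical.

    #include <math.h>

    /* Hypothetical sketch: whiten 2-D training data in place, i.e.
     * rotate onto the covariance eigenvectors (diagonalization) and
     * rescale each axis to unit variance.  Assumes both eigenvalues
     * are nonzero; not the tool's actual code. */
    static void whiten2d(double *x, double *y, int n)
    {
        double mx = 0, my = 0, sxx = 0, syy = 0, sxy = 0;
        int i;

        for (i = 0; i < n; i++) { mx += x[i]; my += y[i]; }
        mx /= n; my /= n;
        for (i = 0; i < n; i++) {
            double dx = x[i] - mx, dy = y[i] - my;
            sxx += dx * dx; syy += dy * dy; sxy += dx * dy;
        }
        sxx /= n; syy /= n; sxy /= n;

        /* eigen-decomposition of the 2x2 covariance matrix */
        double theta = 0.5 * atan2(2 * sxy, sxx - syy);
        double c = cos(theta), s = sin(theta);
        double l1 = c * c * sxx + 2 * c * s * sxy + s * s * syy;
        double l2 = s * s * sxx - 2 * c * s * sxy + c * c * syy;

        for (i = 0; i < n; i++) {
            double dx = x[i] - mx, dy = y[i] - my;
            x[i] = ( c * dx + s * dy) / sqrt(l1);
            y[i] = (-s * dx + c * dy) / sqrt(l2);
        }
    }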
Once the user is
satisfied with the class distributions, the configure option is
selected. Here the user selects either a quick propagation network or
traditional back propagation. The quick propagation network uses
gradient descent to adjust the weights and assumes a parabolic shape
for the error surface near its minimum. The network iterates for the
number of training cycles set by the user. Back propagation uses
gradient descent and converges to a root mean square error value set
by the user.
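The two update rules differ mainly in how a weight step is derived
from the error gradient. The sketch below follows the usual
formulations (plain gradient descent, and Fahlman-style quadratic
extrapolation for quick propagation); the function names, the fallback
step, and the learning-rate handling are assumptions rather than the
tool's actual code.

    /* Hypothetical sketch: one weight update under each rule.
     * g:      current gradient dE/dw
     * g_prev: gradient from the previous training cycle
     * d_prev: weight step taken in the previous cycle
     * eta:    learning rate */

    /* back propagation: step against the gradient */
    static double backprop_step(double g, double eta)
    {
        return -eta * g;
    }

    /* quick propagation: fit a parabola through the two most recent
     * gradients and jump toward its minimum */
    static double quickprop_step(double g, double g_prev, double d_prev)
    {
        if (g_prev == g)               /* degenerate: fall back */
            return -0.1 * g;
        return d_prev * g / (g_prev - g);
    }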
In r.nntool, the performance of the network as training progresses
is shown on the left half of the GRASS screen. Once training of the
neural network is complete, the user propagates the cell values of the
input map layers through the network. The new map layer generated by
the neural network can then be queried. The user may also save the
neural network structure, such as the number of input, hidden, and
output units, and the network weights.
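Propagating a cell through a trained network with one hidden layer
amounts to two weighted sums with a squashing function in between. A
minimal sketch, assuming a logistic activation and a flat weight
layout with the bias stored last; every name here is illustrative, not
taken from nntool.c.

    #include <math.h>

    static double sigmoid(double z) { return 1.0 / (1.0 + exp(-z)); }

    /* Hypothetical sketch: forward pass for one cell.
     * in[nin]         scaled input layer values at the cell
     * wih[h*(nin+1)]  input->hidden weights, bias last
     * who[o*(nhid+1)] hidden->output weights, bias last
     * out[nout]       class activations; the largest one gives
     *                 the output category */
    static void propagate_cell(const double *in, int nin,
                               const double *wih, int nhid,
                               const double *who, int nout,
                               double *hid, double *out)
    {
        int h, o, i;

        for (h = 0; h < nhid; h++) {
            double z = wih[h * (nin + 1) + nin];        /* bias */
            for (i = 0; i < nin; i++)
                z += wih[h * (nin + 1) + i] * in[i];
            hid[h] = sigmoid(z);
        }
        for (o = 0; o < nout; o++) {
            double z = who[o * (nhid + 1) + nhid];      /* bias */
            for (h = 0; h < nhid; h++)
                z += who[o * (nhid + 1) + h] * hid[h];
            out[o] = sigmoid(z);
        }
    }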
For example, say there are 5 classes. Spread the attribute values for
the classes from 0 to 100, so that:

    0   - class 1
    25  - class 2
    50  - class 3
    75  - class 4
    100 - class 5

This is a limitation of GRASS, since the color intensities of a map
are determined by the attribute values. The input values to the
network are scaled by the highest attribute value of each input. Users
may wish to try other schemes, such as sgn(x)(1 + ln|x|), or to
transform the data with a squashing function such as tanh(x). Users
will have to modify the source code to do this (see nntool.c).
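The scalings mentioned above would look roughly as follows; a minimal
sketch, with the two alternative transforms taken directly from the
text and the function names invented for illustration.

    #include <math.h>

    /* Scale by the highest attribute value of the input (the default). */
    static double scale_linear(double x, double max_attr)
    {
        return x / max_attr;
    }

    /* sgn(x)(1 + ln|x|), as given in the text: compresses large
     * magnitudes logarithmically. */
    static double scale_signed_log(double x)
    {
        if (x == 0.0)
            return 0.0;
        return (x > 0 ? 1.0 : -1.0) * (1.0 + log(fabs(x)));
    }

    /* tanh squashing: maps any value into (-1, 1). */
    static double scale_tanh(double x)
    {
        return tanh(x);
    }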
Last changed: $Date: 2002/01/25 05:45:34 $