Subsymbolic Knowledge Extraction Environment for a Time-Series Prediction Parallel Neural Network
The goal of this research is to build an integrated environment that can serve as an assistant for a teacher during the educational process. Extracting useful information for the teaching process is very difficult when dealing with unstructured knowledge.
The complete Subsymbolic Knowledge Extraction Environment (SSKEE) system contains three building blocks: the neural network module, the knowledge extraction module and the user interface module.
As the system was built from scratch, its design and implementation yielded several contributions and results, described below.
The first module described is the neural network module.
As a building block for the neural network, I used an original energy function.
In this thesis I also present the construction and use of this energy
function for supervised learning on feedforward networks,
based on error restrictions.
The focus is on the mathematical derivation of the energy function,
based on the Lyapunov norm (also called the infinity norm), from these error restrictions.
I will show how the movement equations derived from this
energy function improve the learning and generalization capacity of
the neural tool in the case of stock exchange (SE) prediction,
treated as a time-series (TS) prediction problem.
I will also show comparative results of my method and the
classical backpropagation (BP) method, obtained by means of the Theil
test and the correlation computation.
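The two evaluation measures named above can be computed as follows. Theil's statistic exists in several variants; the sketch below uses the U1 inequality coefficient (0 for a perfect forecast, 1 in the worst case), which may differ from the exact variant used in the thesis:

```python
import numpy as np

def theil_u(actual, predicted):
    """Theil's U1 inequality coefficient: RMSE of the forecast
    divided by the sum of the root-mean-squares of the two
    series. 0 means a perfect fit. The thesis may use another
    Theil variant; this one is an assumption."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    denom = np.sqrt(np.mean(actual ** 2)) + np.sqrt(np.mean(predicted ** 2))
    return rmse / denom

def correlation(actual, predicted):
    """Pearson correlation between the actual and the
    predicted time series."""
    return np.corrcoef(actual, predicted)[0, 1]
```

A model whose Theil U is closer to 0 and whose correlation is closer to 1 tracks the actual series better, which is how the proposed method and BP can be ranked.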
The verification of the proposed energy function is done through the stock exchange time-series prediction case study.
Moreover, in the context of the ANN implementation, the parallel aspects of an ANN, as well as
the advantages of a parallel implementation of an ANN, will be discussed. The different levels of
parallelism, namely coarse, medium and fine grain parallelism, are put in correspondence with the
ANN functions and structure. At the coarse grain, layer parallelism
is considered. At the medium grain, I studied neuron-level parallelism. At the fine grain,
I had to consider the very functions of the ANN, down to instruction-level parallelism.
Out of these types, an original combined optimization is built and described.
The second module referred to in this thesis is the knowledge extraction module. ANNs
can store sub-symbolic knowledge, but until recently it was believed to be accessible only in a
"black-box" format. Knowledge extraction from ANNs is a relatively new
field, which tries to reduce these disadvantages and build a bridge between
sub-symbolic and symbolic knowledge.
As the teaching process requires only symbolic
knowledge, I believe this new tool gives teachers a chance to significantly improve their
teaching materials and/or style by combining the symbolic knowledge of the domain
theory with the rules extracted from the empirical sub-symbolic knowledge stored in ANNs
trained on examples.
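The bridge from sub-symbolic weights to symbolic rules can be hinted at with a deliberately simple decompositional sketch: inputs whose weights are large in magnitude become antecedents of an IF-THEN rule. Real extraction algorithms (subset search, M-of-N rules, etc.) are far more careful; this toy version, with hypothetical names, only shows the idea:

```python
def extract_rule(weights, names, threshold=0.5):
    """Toy decompositional rule extraction for one neuron.
    An input whose weight exceeds `threshold` becomes a positive
    antecedent; one below -`threshold` becomes a negated one.
    Weights near zero are dropped. Illustrative only; not the
    thesis's extraction algorithm."""
    antecedents = []
    for w, name in zip(weights, names):
        if w > threshold:
            antecedents.append(name)
        elif w < -threshold:
            antecedents.append("NOT " + name)
    if not antecedents:
        return None
    return "IF " + " AND ".join(antecedents) + " THEN neuron fires"
```

For a trained neuron with weights [0.9, -0.8, 0.1] over hypothetical inputs ("volume_up", "price_falling", "noise"), this yields the readable rule "IF volume_up AND NOT price_falling THEN neuron fires", the kind of symbolic artifact a teacher could actually use.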
The third module, the user interface module, is the one to make the link to the end user, in this case, the teacher.
The whole system thus constitutes a Sub-Symbolic Knowledge Extraction Environment for a possible application in the educational process, based on a
time-series prediction parallel neural network, verified on a case study of teaching stock exchange developments
in a classroom, within the subject of Economics.