Transcript and Presenter's Notes

Title: Archaeological Auto Classification System


1
Archaeological Auto Classification System
2
History of Development
  • Spring 2002: Concepts of Data Management collides
    with Information Systems.
  • Skip Lohse and Corey Schou discuss automation of
    classification systems.
  • Summer 2002: Corey Schou proposes having the
    Advanced System Analysis and Design class build
    an Expert System for Skip.
  • Fall 2002: The Advanced System Analysis and
    Design class is given the assignment of
    developing an application to classify Projectile
    Points.

3
Development Team
4
Development Strategy
  • Identify all the potential classification Types
  • Identify the Classification Method
  • Generate an Image Manipulation system from the
    Image Pro Plus Image Library
  • Identify the information needed from each image
  • Develop a Database to hold the information
    gathered from the Images

5
Results of Initial Development
  • 18 types were identified.
  • Discriminant Analysis was dropped in favor of
    Logistic Regression.
  • Decided to write our own Image Manipulation code
    to avoid Licensing Issues.
  • Prototype the Database in MS Access, then upsize
    to MS SQL Server or another industrial database.
  • Eventually dropped Logistic Regression in favor
    of Artificial Neural Networks due to problems
    getting the computer to measure an image.

6
Initial Required Measurements
7
What is an Artificial Neural Network?
  • ANNs are an attempt to model the way a brain
    works by creating a series of artificial Neurons,
    called Nodes, that are densely interconnected,
    much like the neurons in a biological brain.
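  As a rough illustration of that structure (a minimal sketch in Python, not the AACS code), a Node can be represented by the Weights on its incoming connections, and the network by layers of such Nodes:

    # Minimal sketch of Nodes and Layers (hypothetical, not the AACS implementation).
    class Node:
        def __init__(self, num_inputs):
            # One Weight for each connection coming from the previous layer.
            self.weights = [0.0] * num_inputs
            self.activated = False  # the Node's current state

    class Layer:
        def __init__(self, num_nodes, num_inputs):
            self.nodes = [Node(num_inputs) for _ in range(num_nodes)]

    # A tiny network: 4 inputs, a hidden layer of 3 Nodes, and 2 output Nodes.
    hidden = Layer(num_nodes=3, num_inputs=4)
    output = Layer(num_nodes=2, num_inputs=3)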

8
Comparison
9
ANNs Continued
  • Just like real Neurons, Nodes can have two
    states:
  • Activated
  • Un-Activated
  • Their state depends on the result of an
    Activation Function.

10
Activation Functions
  • There are two common types of Activation Function:
  • Step Activation
  • Sigmoid Activation
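  As a minimal sketch (illustrative Python, not the project's code; the threshold of 0 is an assumed default), the two functions could look like this:

    import math

    def step_activation(x, threshold=0.0):
        # Step: the Node is either fully Activated (1) or not (0).
        return 1.0 if x >= threshold else 0.0

    def sigmoid_activation(x):
        # Sigmoid: a smooth curve between 0 and 1, which later makes
        # gradient-descent training (Backpropagation) possible.
        return 1.0 / (1.0 + math.exp(-x))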

11
Comparison of Activation Functions
12
How does a Node Activate?
  • The Node sums the values of all its inputs,
    applies the result to the Activation function,
    and then sends the resulting value through its
    outputs.
  • Each input value is multiplied by the Weight of
    the path connecting it to the output of the
    previous node.
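  A small sketch of that computation (illustrative Python; the function and variable names are assumptions, not the AACS code):

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def node_output(inputs, weights, activation=sigmoid):
        # Multiply each input by the Weight of its path from the previous
        # node, sum the products, and pass the sum through the Activation
        # function to get this Node's output value.
        total = sum(x * w for x, w in zip(inputs, weights))
        return activation(total)

    # Example: three weighted inputs feeding one Node.
    print(node_output([0.5, 1.0, 0.2], [0.4, -0.6, 1.1]))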

13
Inputs modified by Weights
  (Diagram: Inputs → Hidden Layer → Outputs, with a
  Weight on each connection between layers)

14
Results of the Neural Network
  • As the initial input values travel through the
    ANN, they are modified until they reach the
    Output Layer.
  • The various output values are examined, and the
    output with the largest value is the Result.
  • The values of all the output nodes can also be
    normalized so that they fall between 0 and 1,
    which gives a probability for each node rather
    than a single Result for the whole network.
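  One common way to do that normalization is a softmax over the raw output values; this is a hedged sketch of the idea, not necessarily how AACS does it:

    import math

    def normalize_outputs(raw_outputs):
        # Softmax: rescales the raw output values so each falls between
        # 0 and 1 and they sum to 1, i.e. a probability per output node.
        exps = [math.exp(v) for v in raw_outputs]
        total = sum(exps)
        return [e / total for e in exps]

    raw = [2.1, 0.3, 1.2]              # raw values from three output nodes
    probs = normalize_outputs(raw)
    result = probs.index(max(probs))   # the Result is still the largest value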

15
Training a Neural Network
  • Training can be Supervised or Unsupervised.
  • We use Backpropagation, a supervised training
    method.

16
Backpropagation
  • Training on a single pattern uses the following
    steps.
  1) Present the pattern at the input layer.
  2) Let the hidden units evaluate their output using
     the pattern.
  3) Let the output units evaluate their output using
     the result in step 2) from the hidden units.
  • Steps 1) - 3) are collectively known as the
    forward pass, since information is flowing
    forward, in the natural sense, through the
    network.
  4) Apply the target pattern to the output layer.
  5) Calculate the error on the output nodes.
  6) Train each output node using gradient descent.
  7) For each hidden node, calculate its error.
  8) For each hidden node, use the error found in
     step 7) to train according to gradient descent.
  • Steps 4) - 8) are collectively known as the
    backward pass.
  • Step 7) involves propagating the error from the
    output nodes that the hidden node feeds into back
    towards that node so that it can process them.
    This is where the name of the algorithm comes
    from.
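  A compact sketch of steps 1) - 8) for a network with one hidden layer of sigmoid units (an illustration of the algorithm described above, not the project's implementation; the layer sizes and learning rate are assumptions):

    import math, random

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def train_pattern(pattern, target, w_hidden, w_output, rate=0.5):
        # Forward pass (steps 1-3): inputs -> hidden outputs -> output outputs.
        hidden_out = [sigmoid(sum(x * w for x, w in zip(pattern, ws)))
                      for ws in w_hidden]
        output_out = [sigmoid(sum(h * w for h, w in zip(hidden_out, ws)))
                      for ws in w_output]

        # Backward pass (steps 4-8).
        # Error on each output node (steps 4-5), scaled by the sigmoid derivative.
        out_err = [(t - o) * o * (1 - o) for t, o in zip(target, output_out)]

        # Error on each hidden node (step 7): propagate the output errors back
        # through the weights that connect this hidden node to the output nodes.
        hid_err = [h * (1 - h) * sum(e * w_output[k][j]
                                     for k, e in enumerate(out_err))
                   for j, h in enumerate(hidden_out)]

        # Gradient-descent weight updates (steps 6 and 8).
        for k, ws in enumerate(w_output):
            for j in range(len(ws)):
                ws[j] += rate * out_err[k] * hidden_out[j]
        for j, ws in enumerate(w_hidden):
            for i in range(len(ws)):
                ws[i] += rate * hid_err[j] * pattern[i]

    # Tiny example: 4 inputs, 3 hidden nodes, 2 output nodes, random start weights.
    w_hidden = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(3)]
    w_output = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    train_pattern([0.1, 0.9, 0.4, 0.0], [1.0, 0.0], w_hidden, w_output)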

17
Backpropagation
18
What are the inputs?
  • An image of a point is taken and its outline is
    extracted.
  • The outline is then split up into a set of
    smaller pieces.
  • Those pieces are then reduced to Tokens, which
    are fed into the network.
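  The slides do not spell out how the Tokens are built, but as a rough sketch (hypothetical names and features, not the AACS tokenizer), each piece of the outline could be reduced to a few numbers, such as its length and direction, which then become the network's inputs:

    import math

    def tokenize_outline(outline, pieces=16):
        # outline: list of (x, y) points traced around the point's edge.
        # Split the outline into roughly equal pieces and reduce each piece
        # to a Token: here, the piece's length and its overall direction.
        size = max(1, len(outline) // pieces)
        tokens = []
        for i in range(0, len(outline) - size, size):
            (x0, y0), (x1, y1) = outline[i], outline[i + size]
            tokens.append((math.hypot(x1 - x0, y1 - y0),   # length
                           math.atan2(y1 - y0, x1 - x0)))  # direction (angle)
        return tokens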

19
The steps in generating inputs for the network
20
Demo of the Project
  • AACS Project

21
Problems with the current version
  • Various display problems
  • Problems training the network
  • The tokenizer was wrong
  • The network training function was bad
  • The training set is incomplete
  • The training set lacks definition

22
Definition problems
Rabbit Island Stemmed
Mahkin Shouldered Lanceolate
23
The Future
  • Attach a database to the interface to capture the
    outputs.
  • Modify the interface to allow for multiple
    artifact types and classification systems.
  • Design a web interface to allow internet access.
  • Two levels of availability: Research and Public.

24
  • The End