IEEE 2015 MATLAB MODELING NEURON SELECTIVITY OVER SIMPLE MIDLEVEL FEATURES FOR.pptx - PowerPoint PPT Presentation

Provided by: pgembedded (PG Embedded Systems, www.pgembeddedsystems.com)
Slides: 7
Transcript and Presenter's Notes

1
MODELING NEURON SELECTIVITY OVER SIMPLE MIDLEVEL
FEATURES FOR IMAGE CLASSIFICATION
2
ABSTRACT
  • It is widely accepted that good mid-level
    features can greatly enhance the performance of
    image classification, but how to efficiently
    learn them is still an open question. In this
    paper, we present an efficient unsupervised
    mid-level feature learning approach (MidFea)
    that involves only simple operations: k-means
    clustering, convolution, pooling, vector
    quantization, and random projection. We show
    that this simple feature already achieves good
    performance on traditional classification
    tasks. To further boost performance, we model
    the neuron-selectivity (NS) principle by
    building an additional layer over the mid-level
    features, prior to the classifier. The NS-layer
    learns category-specific neurons in a
    supervised manner with both bottom-up inference
    and top-down analysis,
3
  • and thus supports fast inference for a query
    image. Through extensive experiments, we
    demonstrate that this higher-level NS-layer
    notably improves the classification accuracy of
    our simple MidFea, achieving comparable
    performance on face recognition, gender
    classification, age estimation, and object
    categorization. In particular, our approach
    runs an order of magnitude faster in inference
    than sparse-coding-based feature learning
    methods. In conclusion, we argue that not only
    do carefully learned features (MidFea) improve
    performance, but a more sophisticated mechanism
    (NS-layer) at a higher level boosts it further.
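The operations named in the abstract (k-means filter learning, convolution, thresholding, pooling, vector quantization, random projection) can be strung together end to end. The project itself targets MATLAB R2015a; the following is a rough NumPy sketch of such a pipeline, where all sizes, the threshold value, and the function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def learn_filters(patches, k, iters=10):
    """Learn k convolution filters as plain k-means centroids of patches."""
    centers = patches[rng.choice(len(patches), k, replace=False)].copy()
    for _ in range(iters):
        dists = ((patches[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for j in range(k):
            members = patches[labels == j]
            if len(members):
                centers[j] = members.mean(0)
    return centers

def midfea(img, filters, codebook, proj, patch=5, pool=2, thresh=0.1):
    """Feed-forward mid-level feature: convolve -> threshold -> max-pool
    -> vector-quantize -> randomly project. All sizes are illustrative."""
    H, W = img.shape
    k = len(filters)
    # 'valid' correlation of every filter with the image
    maps = np.zeros((k, H - patch + 1, W - patch + 1))
    for i in range(maps.shape[1]):
        for j in range(maps.shape[2]):
            maps[:, i, j] = filters @ img[i:i + patch, j:j + patch].ravel()
    maps = np.maximum(maps - thresh, 0.0)          # thresholding nonlinearity
    ph, pw = maps.shape[1] // pool, maps.shape[2] // pool
    pooled = maps[:, :ph * pool, :pw * pool] \
        .reshape(k, ph, pool, pw, pool).max(axis=(2, 4))   # max-pooling
    vecs = pooled.reshape(k, -1).T                 # one k-dim response per cell
    # vector quantization: nearest codeword per cell -> histogram of codes
    codes = ((vecs[:, None, :] - codebook[None]) ** 2).sum(-1).argmin(1)
    hist = np.bincount(codes, minlength=len(codebook)).astype(float)
    return proj @ hist                             # random projection to fixed dim
```

For instance, with a 16x16 image, 8 learned filters, a 16-word codebook, and a 10x16 random projection matrix, `midfea` returns a fixed 10-dimensional descriptor per image; every step is a simple feed-forward operation, which is the point of the approach.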

4
EXISTING SYSTEM
  • Existing methods usually employ a hierarchical
    architecture in which each layer accumulates
    information from the layer beneath to form more
    complex features. The nonlinearity in our
    approach comes from convolution with
    thresholding, max-pooling, and vector
    quantization (VQ). Although these operations
    are also adopted by existing methods, ours is a
    purely feed-forward approach that runs faster
    and, unlike PSD and CDBN, does not introduce
    heavily parameterized functions. Through
    comparison with SIFT and HMAX, we explain why
    our MidFea learns desirable features.

5
PROPOSED SYSTEM
  • We propose a simple and efficient method to
    learn mid-level features, and explain why the
    proposed approach generates desirable features.
    We model the neuron-selectivity principle over
    the mid-level features with the NS-layer to
    boost performance. The NS-layer is a general
    layer that supports both top-down analysis and
    bottom-up inference, an appealing property in
    real-world applications. Through extensive
    experiments, we demonstrate that our framework
    not only achieves comparable or even
    state-of-the-art results on public databases,
    but also runs an order of magnitude faster than
    related sparse-coding-based methods.
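The actual NS-layer is trained in a supervised manner with coupled bottom-up and top-down passes, which is beyond a slide sketch. Purely to illustrate the neuron-selectivity idea, the toy layer below gives each category a few "neurons" (class-conditional feature averages — an assumption of this sketch, not the paper's training procedure) and classifies a query by its single strongest neuron response, i.e. one fast feed-forward pass.

```python
import numpy as np

class ToyNSLayer:
    """Toy stand-in for the NS-layer: each class owns a few 'neurons';
    inference is one feed-forward pass that fires the most responsive
    neuron. Illustration only, not the paper's trained layer."""

    def __init__(self, neurons_per_class=2):
        self.m = neurons_per_class
        self.neurons = {}          # class label -> (m, d) neuron weights

    def fit(self, X, y):
        for c in np.unique(y):
            Xc = X[y == c]
            # crude category-specific neurons: split the class and average
            chunks = np.array_split(np.arange(len(Xc)), self.m)
            self.neurons[c] = np.stack(
                [Xc[idx].mean(0) for idx in chunks if len(idx)])
        return self

    def predict(self, X):
        labels = sorted(self.neurons)
        preds = []
        for x in X:
            # bottom-up inference: best neuron response per class
            scores = [(self.neurons[c] @ x).max() for c in labels]
            preds.append(labels[int(np.argmax(scores))])
        return np.array(preds)
```

The design point this mirrors is that once the neurons are learned, classifying a query needs only dot products and a max, which is why such a layer supports fast inference compared with solving a sparse-coding problem per query.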

6
SOFTWARE REQUIREMENTS
  • MATLAB R2015a
  • Image Processing Toolbox 7.1