Information Architecture

1
Information Architecture Design
  • Week 11 Schedule
  • IA and Web Verification/Evaluation and
    Maintenance
  • Rosenfeld Chapters 20 & 21
  • Other Primary Readings
  • BREAK
  • Research Topic Presentations (2)
  • 10 Minute Drill
  • 3 Every Week
  • Other 3 Next Week
  • Design Critiques Returned & Discussed

2
IA Methodology
Planning
Analysis
Design
Verification
Construction
Maintenance
3
The Verification Phase
  • Verification is ensuring the usefulness of the
    product.
  • Testing the product with the target user to
    uncover weaknesses in the product.
  • Implementing solutions to iron out these
    weaknesses.
  • Planning when to return to the Construction phase
    to apply those fixes.

4
Verification/Evaluation
  • Error Tracking
  • Logging
  • Notification
  • User Testing
  • Test Plan
  • Functional tests
  • Completeness tests (see sketch below)
  • Evaluating Test Results
  • Metrics
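
A minimal sketch of the kind of completeness test and error logging listed above, assuming a hypothetical list of site URLs; the URLs and log file name are placeholders, not part of the deck.

    # completeness_check.py - sketch of an error-tracking completeness test
    # (URLs and log file name are illustrative placeholders)
    import logging
    import urllib.error
    import urllib.request

    logging.basicConfig(filename="verification.log", level=logging.INFO)

    SITE_URLS = [
        "https://example.com/",
        "https://example.com/about",
    ]

    def check(url):
        """Return True if the page responds with 200; log an error otherwise."""
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                logging.info("%s -> %s", url, resp.status)
                return resp.status == 200
        except urllib.error.URLError as exc:   # HTTPError is a subclass
            logging.error("%s failed: %s", url, exc)
            return False

    if __name__ == "__main__":
        broken = [u for u in SITE_URLS if not check(u)]
        print(f"{len(broken)} of {len(SITE_URLS)} pages failed the completeness test")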

5
The Maintenance Phase
  • Maintenance is providing for future releases of
    the product.
  • Establishing some intervals and responsibilities
    to keep the product up to date.
  • Deciding if it is necessary to return to or
    modify other phases to improve the product or the
    methodology itself.

6
Maintenance
  • Support
  • Post-Mortem
  • Versions
  • Mixed Lifecycle Versioning
  • Maintenance is always more difficult than planned

7
MS Web Intranet Study
  • 3 Million Pages
  • 50,000 (Potential) Users
  • 74 Countries
  • 8,000 Separate Intranet Sites
  • 2.3 Hours a Day Used
  • 50% of Users' Time Spent Looking for Information

8
MS Web Intranet Problems
  • Starting Points
  • Navigation Systems
  • Labels
  • Answers & Resolution
  • Portal Design
  • Diverse Authoring Tools
  • Diverse Authorship
  • Age of Information
  • Massive Team Approach To Solving Problems

9
MS Web Taxonomies
  • The Language of Clients
  • Descriptive Vocabularies
  • Server Log Analysis
  • Pre-Existing Work
  • Political and Content Experts
  • Universal Applicability
  • Metadata (see sketch below)
  • Basics (URL, Desc, Dates, Contact, Status)
  • Extensions (Importance, Categories, Keywords)
  • Category Labels
  • Site Maps
  • Page Terms
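
A sketch of the basic and extended page metadata fields named above, gathered into a single record; the field names follow the slide, while the types, defaults, and example values are assumptions.

    # metadata_record.py - sketch of the Basics + Extensions metadata fields
    # (types, defaults, and example values are assumptions; names follow the slide)
    from dataclasses import dataclass, field
    from datetime import date
    from typing import List, Optional

    @dataclass
    class PageMetadata:
        # Basics
        url: str
        description: str
        created: date
        modified: date
        contact: str
        status: str = "draft"
        # Extensions
        importance: Optional[int] = None            # e.g. 1 (high) .. 5 (low)
        categories: List[str] = field(default_factory=list)
        keywords: List[str] = field(default_factory=list)

    record = PageMetadata(
        url="https://example.com/benefits",
        description="Employee benefits overview",
        created=date(1999, 4, 1),
        modified=date(1999, 6, 15),
        contact="owner@example.com",
        status="published",
        categories=["HR"],
        keywords=["benefits", "insurance"],
    )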

10
MS Web Construction/Evaluation
  • Search Log Analysis for Taxonomy Development
  • Controlled Vocabulary Use
  • Set of Tools
  • Metadata Registry
  • Vocabulary Manager
  • URL Catalog
  • Tools Enforce Processes
  • What Other Tools Would Be Appropriate for
    Construction, Evaluation and Maintenance?

11
MS Web Verification For Improvement
  • Helping Where It Hurts (p 403)
  • Fix Major Broken Areas
  • Search
  • Often the Most Broken
  • Often the First To Be Fixed
  • Collection and Analysis Services
  • Portable Search Technologies
  • Any Tool With Import and Export
  • XML
  • Analysis Fixes Problems and Helps Future Design
  • Best Bets: Most Likely Applicable Result (see
    sketch below)
  • Interaction Analysis Before and After
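
Best Bets pair the most common queries with hand-picked destination pages. A minimal sketch, assuming a hypothetical query-to-URL mapping and a stand-in search_engine() function.

    # best_bets.py - sketch: serve hand-curated "Best Bets" ahead of engine results
    # (the mapping and the search_engine() stub are illustrative assumptions)
    BEST_BETS = {
        "benefits": "https://example.com/hr/benefits",
        "payroll": "https://example.com/hr/payroll",
    }

    def search_engine(query):
        """Stand-in for the site's full-text search."""
        return [f"https://example.com/search?q={query}"]

    def search(query):
        """Prepend the curated Best Bet, when one exists, to the engine results."""
        results = search_engine(query)
        best = BEST_BETS.get(query.strip().lower())
        if best:
            results.insert(0, best)
        return results

    print(search("Benefits"))   # Best Bet first, then regular results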

12
evolt.org Adaptive Verification
  • Online Community
  • Atypical Users
  • Atypical Development?
  • Different Possible Users & Tasks
  • Site Functions Added Variably
  • Gradual Shift in User Functions
  • IA Should Support Community by Sharing and
    Monitoring
  • Let Members Verify IA Structures and Construct
    Content
  • Use Determines What Gets Fixed or Added

13
IA Evaluation Using Heuristics
  • Nielsen's Discount Usability Engineering
  • Quick
  • Dependent on Experience of Eval Team
  • Done Throughout the IA Methodology (at Design)
  • Group Work: Different People Find Different
    Problems
  • Follow Basic Usability Principles
  • Find More Problems Than Time To Fix
  • IA Plan Determines Ranking of Problems to Fix
    (see sketch below)
  • Severity Ratings Are Good, But Ranking Is Better
  • Often Too Arbitrary
  • Tie to IA Plan and User Analysis
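
A minimal sketch of turning severity ratings into a ranking tied to the IA plan, as the slide suggests; the finding records, fields, and weighting are assumptions.

    # rank_findings.py - sketch: rank heuristic-evaluation problems by tying
    # severity ratings to IA-plan priority (records and weighting are assumptions)
    findings = [
        {"problem": "Ambiguous label on 'Resources'", "severity": 3, "plan_priority": 1},
        {"problem": "Search box missing on subpages", "severity": 4, "plan_priority": 2},
        {"problem": "Footer contrast too low", "severity": 2, "plan_priority": 3},
    ]

    # Lower plan_priority = more central to the IA plan; higher severity = worse.
    ranked = sorted(findings, key=lambda f: (f["plan_priority"], -f["severity"]))

    for i, f in enumerate(ranked, 1):
        print(f"{i}. {f['problem']} (severity {f['severity']}, plan priority {f['plan_priority']})")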

14
Web Usage Mining
  • VL Verification
  • Data Mining to Discover Patterns of Use
  • Pre-Processing
  • Pattern Discovery
  • Pattern Analysis
  • Site Analysis, Not User Analysis
  • Srivastava, J., Cooley, R., Deshpande, M., & Tan,
    P. N. (2000)

15
Web Usage Discovery
  • Content
  • Text
  • Graphics
  • Features
  • Structure
  • Content Organization
  • Templates and Tags
  • Usage
  • Patterns
  • Page References
  • Dates and Times
  • User Profile
  • Demographics
  • Customer Information

16
Web Usage Collection
  • Types of Data
  • Web Servers
  • Proxies
  • Web Clients
  • Data Abstractions
  • Sessions (see sketch below)
  • Episodes
  • Clickstreams
  • Page Views
  • The Tools for Web Use Verification
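
A sketch of the session abstraction: grouping one visitor's timestamped page views into sessions with an idle timeout; the 30-minute cutoff and the sample data are assumptions.

    # sessionize.py - sketch: split one visitor's clickstream into sessions
    # using a 30-minute idle timeout (cutoff and sample data are assumptions)
    from datetime import datetime, timedelta

    TIMEOUT = timedelta(minutes=30)

    def sessionize(page_views):
        """page_views: list of (timestamp, url) tuples sorted by time."""
        sessions, current, last_time = [], [], None
        for ts, url in page_views:
            if last_time is not None and ts - last_time > TIMEOUT:
                sessions.append(current)
                current = []
            current.append(url)
            last_time = ts
        if current:
            sessions.append(current)
        return sessions

    views = [
        (datetime(1999, 5, 1, 9, 0), "/"),
        (datetime(1999, 5, 1, 9, 2), "/products"),
        (datetime(1999, 5, 1, 14, 0), "/support"),   # long gap -> new session
    ]
    print(sessionize(views))   # [['/', '/products'], ['/support']]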

17
Web Usage Preprocessing
  • Usage Preprocessing
  • Understanding the Web Use Activities of the Site
  • Extract from Logs (see sketch below)
  • Content Preprocessing
  • Converting Content Into Formats for Processing
  • Understanding Content (Working with Dev Team)
  • Structure Preprocessing
  • Mining Links and Navigation from Site
  • Understanding Page Content and Link Structures
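
A sketch of the extract-from-logs step: pulling (host, time, path, status) out of Common Log Format lines; the regular expression assumes standard CLF and the sample line is invented.

    # parse_log.py - sketch: extract usage records from Common Log Format lines
    # (assumes standard CLF; the sample line is invented)
    import re
    from datetime import datetime

    CLF = re.compile(
        r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
    )

    def parse_line(line):
        m = CLF.match(line)
        if not m:
            return None   # malformed lines get dropped during preprocessing
        ts = datetime.strptime(m.group("time"), "%d/%b/%Y:%H:%M:%S %z")
        return m.group("host"), ts, m.group("path"), int(m.group("status"))

    sample = '192.0.2.1 - - [01/May/1999:09:00:00 -0500] "GET /products HTTP/1.0" 200 1043'
    print(parse_line(sample))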

18
Web Usage Pattern Discovery
  • Clustering for Similarities (see sketch below)
  • Pages
  • Users
  • Links
  • Classification
  • Mapping Data to Pre-defined Classes
  • Rule Discovery
  • Association Rules
  • Computation Intensive
  • Many Paths to Similar Answers
  • Pattern Detection
  • Ordering By Time
  • Predicting Use With Time
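
One simple pattern-discovery step, sketched: counting which pages co-occur in the same sessions, the raw material for the clustering and rule discovery listed above; the sessions are invented.

    # cooccurrence.py - sketch: count page pairs that co-occur within sessions,
    # raw material for clustering and rule discovery (sessions are invented)
    from collections import Counter
    from itertools import combinations

    sessions = [
        ["/", "/products", "/support"],
        ["/", "/products"],
        ["/support", "/contact"],
    ]

    pair_counts = Counter()
    for session in sessions:
        for a, b in combinations(sorted(set(session)), 2):
            pair_counts[(a, b)] += 1

    # Pairs seen in at least two sessions suggest pages users treat as related
    for pair, n in pair_counts.most_common():
        if n >= 2:
            print(pair, n)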

19
Web Usage Applications
  • Application Goals
  • Improved Design
  • Improved Delivery
  • Improved Content
  • Personalization (XMod Data)
  • System Improvement (Tech Data)
  • Site Modification (IA Data)
  • Business Intelligence (Market Data)
  • Usage Characterization (User Behavior Data)

20
Real Life Information Retrieval
  • 51K Queries from Excite (1997)
  • Average Search Terms per Query: 2.21
  • Number of Terms per Query (tally sketch below)
  • 1 term: 31%
  • 2 terms: 31%
  • 3 terms: 18% (80% Combined)
  • Logic Modifiers (by User)
  • Infrequent
  • AND, +, -
  • Logic Modifiers (by Query)
  • 6% of Users
  • Less Than 10% of Users
  • Lots of Mistakes
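
The term-count percentages above come from tallying query lengths in the log; a sketch of that tally, using a small invented query list in place of the 51K-query Excite sample.

    # query_lengths.py - sketch: tally query lengths as the Excite study reports them
    # (the query list is a small stand-in for the real log)
    from collections import Counter

    queries = ["jobs", "cheap flights", "used car prices", "weather", "mp3 players"]

    lengths = Counter(len(q.split()) for q in queries)
    total = len(queries)
    for n_terms in sorted(lengths):
        print(f"{n_terms} term(s): {100 * lengths[n_terms] / total:.0f}%")
    print("average terms/query:", sum(len(q.split()) for q in queries) / total)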

21
Real Life Information Retrieval
  • Sessions
  • Flawed Analysis (User ID)
  • Some Revisits to Query (Result Page Revisits)
  • Page Views
  • Accurate, but not by User
  • Use of Relevance Feedback
  • Not Used Much (11%)
  • Terms Used Were Typical
  • Mistakes
  • Typos
  • Misspellings
  • Bad (Advanced) Query Formulation
  • Jansen, B. J., Spink, A., Bateman, J., &
    Saracevic, T. (1998)

22
Analysis of a Very Large Search Log
  • 280 GB: Six Weeks of Web Queries
  • 1 Billion Search Requests
  • 285 Million User Sessions
  • Web Users
  • Use Short Queries
  • Mostly Look at the First Ten Results only
  • Seldom Modify Queries
  • Traditional IR Isn't Accurately Describing Web
    Search
  • Phrase Searching Could Be Augmented
  • Silverstein, Henzinger, Marais, & Moricz (1998)

23
Analysis of a Very Large Search Log
  • 2.35 Average Terms Per Query
  • 0 terms: 20.6% (?)
  • 1 term: 25.8%
  • 2 terms: 26.0% (72.4% Combined)
  • Operators Per Query
  • 0 operators: 79.6%
  • Terms Predictable
  • Only the First Set of Results Viewed: 85%
  • Some (Single Term Phrase) Query Correlation
  • Augmentation
  • Taxonomy Input
  • Robots vs. Humans

24
Scent of a (Web) Site
  • Exploring Hypotheses About Web Site Use
  • Goals Analysis and Prediction
  • Predicting Usability of Alternate Designs
  • What is the Overall Site Traffic Flow?
  • Where Do Visitors Come From?
  • What Pages Are Related?
  • What Are the User Interests for a Page?
  • Information Foraging and Information Scent
  • Paths of Web Use Capture User Goals and Behavior

25
Scent of a (Web) Site
  • Look for Longest Repeating Subsequences (see
    sketch below)
  • Among Different Users
  • The Same User Over Time
  • For One Web Site Only
  • Assume User Has Information Goal
  • Users Like Ants Exploring and Foraging
  • Paths are Links from Page to Page
  • Analyze All the Paths and Which Were Used
  • Visualization Methods
  • Prediction
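
A sketch in the spirit of the longest-repeating-subsequence analysis above: finding contiguous path fragments that repeat across users' navigation paths; the paths are invented and this is not the paper's exact algorithm.

    # repeated_paths.py - sketch: find path fragments repeated across user paths,
    # in the spirit of longest-repeating-subsequence analysis
    # (paths are invented; counts contiguous subpaths, not the paper's exact method)
    from collections import Counter

    paths = [
        ["/", "/docs", "/docs/api", "/download"],
        ["/", "/docs", "/docs/api"],
        ["/", "/about"],
    ]

    def subpaths(path, min_len=2):
        for i in range(len(path)):
            for j in range(i + min_len, len(path) + 1):
                yield tuple(path[i:j])

    def contains(longer, shorter):
        return any(longer[k:k + len(shorter)] == shorter
                   for k in range(len(longer) - len(shorter) + 1))

    counts = Counter(sp for p in paths for sp in subpaths(p))
    repeated = [sp for sp, n in counts.items() if n >= 2]
    # Keep only maximal fragments (not contained in a longer repeated fragment)
    maximal = [sp for sp in repeated
               if not any(sp != other and contains(other, sp) for other in repeated)]
    print(maximal)   # [('/', '/docs', '/docs/api')]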

26
Using Web Use Evaluation for IA
  • How Can These Ideas Be Used for IA?
  • Verification for Design and Construction
  • Web Usage Clustering and Classification
  • Web Site Design Rules
  • Web Searching
  • Web Scent and Foraging
  • Web Use Goal Prediction

27
Evaluating the Utility & Usability of an Adaptive
Hypermedia System
  • As Web Sites, Web Users & IA Advance, How Do You
    Evaluate Them?
  • Help With Large Info Structures
  • Somewhere between System & User Control
  • Adaptive Systems Influence User Behavior
  • Fewer Actions
  • Fewer Decisions
  • Preferred

28
Adaptive Systems Evaluation
  • Ways to Evaluate
  • Part of Iterative Design Process
  • Time to Task Measurement
  • Diagnostic Testing
  • Goal Measurement
  • How Is This Different?
  • User Perceptions of Adaptation
  • Variable Experience for Each User
  • Longer Evaluation Times
  • Selected Goals and Tasks That Show Adaptation
  • Interfaces and Content Changes!
  • More Users and Evaluations May Be Needed
  • Work Environments, Not Labs
  • Real Content

29
Bonus Points IA Tool Reviews
  • Each Review Contributes 1 point to Final Grade
  • Up to Five IA Tool Reviews
  • Submit Notice of IA Tool Review and Name of Tool
  • IA Tool Review Format
  • One IA Tool (or class of tool)
  • Key IA Functions of Tool
  • Describe Interface and Unique Features
  • Menus
  • Commands
  • Formats
  • Screen Capture of Main Tool Interface &
    Additional Noteworthy Features
  • Example of Tool Output (graphic, search results, ...)

30
Next Week
  • Non-Web IA
  • Readings
  • Two Questions for Class Discussion
  • Class Participation
  • Presentations
  • 10-Minute Drills Continue & Project Group Check-In