LREC 2004, 2004-05-28

1
From Acts and Topics to Transactions and Dialogue Smoothness

Hans Dybkjær, SpeechLogic, Prolog Development Center A/S
Laila Dybkjær, NISLab, University of Southern Denmark
2
Automating annotation
  • Key concerns for commercial spoken dialogue
    systems (SDSs)
    • high transaction success rate
    • smooth dialogue
  • Very time-consuming and costly to measure
    manually
  • Manual annotation is more or less the only
    possibility today
  • Two-step approach towards automatic annotation
    (see the sketch below)
    • annotate utterances with a basic act-topic
      structure; can be automated using a parser
    • transform act-topic patterns into transaction
      segments, using a rule engine

Automation only way to serious statistics
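
For illustration only, a minimal Python sketch of the two-step idea; the function names (parse_utterance, apply_rules) and the data layout are our assumptions, not part of the described system:

# Minimal sketch of the two-step approach (all names are hypothetical,
# not taken from the system described in the slides).

def parse_utterance(speaker, text):
    # Step 1 (assumed parser): map an utterance to a speaker/act/topics record.
    if text.strip().lower().startswith("yes"):
        return {"speaker": speaker, "act": "accept", "topics": []}
    return {"speaker": speaker, "act": "inform", "topics": ["T.unknown"]}  # placeholder topic

def apply_rules(annotated, rules):
    # Step 2 (assumed rule engine): rewrite act-topic patterns into composite acts.
    for rule in rules:
        annotated = rule(annotated)
    return annotated

dialogue = [(".u", "I am a student"), (".u", "yes what about maternity leave")]
step_one = [parse_utterance(spk, txt) for spk, txt in dialogue]
step_two = apply_rules(step_one, rules=[])   # act-topic rules are shown on later slides
print(step_two)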
3
Background
  • Over-the-phone FAQ SDS on holiday allowance
  • General (non-person related) questions, e.g.
    • is Saturday considered a holiday?
    • when unemployed, who should sign my certificate?
  • 2700 lines of grammar, 800 (full) words in the
    vocabulary
  • 85 semantic concepts in input, 100 stories in
    output
  • Contractual minimum transaction success rate, but
    • transaction not clearly defined
    • no baseline from human-human dialogues

Complex domain but simple tasks
4
Transaction success
  • Dialogue-level task completion?
    • works if the task is well defined and the goal
      state is clear
    • but there are many independent tasks
    • so no single clear goal state
  • Need to define transactions at sub-task level
  • What constitutes a transaction?
    • initiation and conclusion?
    • is miscommunication part of a transaction?
  • Sub-task level transactions may also indicate
    which parts of the system are problematic
  • Start and end alone do not tell about dialogue
    smoothness

Provide users with required information
5
Our first, manual approach
  • Created an act-topic annotation scheme
    • transactions defined in terms of act-topic
      patterns
    • a transaction is either a success or a failure
  • PDC created a web annotation tool
  • 225 dialogues from test iterations and 217
    dialogues from the production system were
    manually annotated
    • a time-consuming process
  • Decided to investigate the following claims
    • act-topic annotation at utterance level can be
      automated
    • transactions can be derived from act-topic
      patterns
    • act-topic patterns can inform on smoothness

Investigation still in progress
6
Simple act-topic annotation
  • Simplest scheme distinguishes topic names T
  • Distinguishes 6 acts
    • accept, reject
    • inform T: any topical information
    • other: e.g. "I'm talking to a computer!"
    • pause: silence
    • hangup: the user or the system disconnects
  • A pattern is a list of utterance-level act-topics
    • supplemented with a set of conditions
  • Acts are added context-independently to utterances
  • System utterances may be annotated at design time,
    user utterances by using a parser (a representation
    sketch follows below)

Extremely simple yet illuminating
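
As an illustration only, the six acts and an utterance-level annotation could be represented roughly like this in Python; the data layout is our assumption, not the authors' format:

# Hypothetical representation of the simplest act-topic scheme.
ACTS = {"accept", "reject", "inform", "other", "pause", "hangup"}

def annotate(speaker, act, topics=()):
    # One utterance-level annotation: speaker is ".u" (user) or ".s" (system),
    # act is one of the six basic acts, topics is a list of T.<name> labels.
    assert act in ACTS, "unknown act: %s" % act
    return {"speaker": speaker, "act": act, "topics": list(topics)}

# Example: the user says "I am a student".
print(annotate(".u", "inform", ["T.student"]))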
7
First step: annotation
  • .u .inform T.student
    "I am a student"
  • .s .inform T.student, T.payment, T.money
    "If you are a student living from a grant
    you may get your holiday allowance while still
    studying. If you also work you need to take
    holiday to get your money."
  • .s .inform T.more
    "Is there anything else you want to ask
    about?"
  • .u .accept, .inform T.leave
    "Yes, what about maternity leave"
  • .s .inform T.leave
    "You asked about - leave"

8
Second step: transformation
  • Apply act-topic rules to step-one annotated
    dialogues
  • Transform basic acts into composite acts (a
    Python reading of the rule follows below)

.s .inform T.pay   "Payment in general"
.u .accept         "Yes"

rule select1
  _y .select Ts_a <-
    _x .inform Ts_a
    _y .accept
  where _x ! _y
end rule

.u .select T.pay  <- select1  .s .inform T.pay  .u .accept
Formal rules
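
One possible Python reading of the select1 rule above, using an assumed dictionary layout rather than the actual rule engine:

# Sketch of the select1 pattern: if speaker _x informs about topics Ts_a and
# a different speaker _y accepts, rewrite _y's utterance as .select on Ts_a.
def select1(annotated):
    result = list(annotated)
    for i in range(len(result) - 1):
        x, y = result[i], result[i + 1]
        if (x["act"] == "inform" and y["act"] == "accept"
                and x["speaker"] != y["speaker"]):        # the "_x ! _y" condition
            result[i + 1] = {"speaker": y["speaker"],
                             "act": "select",
                             "topics": list(x["topics"])}
    return result

step_one = [
    {"speaker": ".s", "act": "inform", "topics": ["T.pay"]},  # "Payment in general"
    {"speaker": ".u", "act": "accept", "topics": []},         # "Yes"
]
print(select1(step_one)[1])   # -> .u .select T.pay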
9
Problem with only identifying topics
  • Basically only two composite acts can be
    distinguished
    • select: continue with the same topic
    • request: change to a new topic
  • So success cannot be distinguished from failure

No success:
  .u .inform T.phone   "Your phone number?"
  .s .inform T.phone   "Phone number"
Success:
  .u .inform T.phone   "Your phone number?"
  .s .inform T.phone   "Phone 48204910"
More distinction needed
10
Name and value
  • Solution: distinguish
    • topic name N: the mentioning of a topic
    • topic value V: details about a topic (a toy
      parsing sketch follows below)

.u .inform N.phone   "Your phone number?"
.s .inform N.phone   "Phone number"

.u .inform N.phone   "Your phone number?"
.s .inform V.phone   "Phone 48204910"
Simple yet powerful distinction may still be
parseable
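
Purely as an illustration of why the distinction may still be parseable, a toy Python heuristic for the phone topic; the rule itself is our assumption, not the system's grammar:

import re

# Toy name/value split for the phone topic:
# N.phone = the topic is merely mentioned, V.phone = an actual value is given.
def phone_label(text):
    if re.search(r"\d{8}", text):          # an eight-digit number counts as a value
        return "V.phone"
    if "phone" in text.lower():            # the topic is only mentioned
        return "N.phone"
    return None

print(phone_label("Your phone number?"))   # N.phone
print(phone_label("Phone 48204910"))       # V.phone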
11
Rule examples
rule select2
  _y .select N_b <-
    _x .inform N_b
    _y .inform N_b
  where _x ! _y
end rule

rule answer
  _y .request N_b
  _x .inform Vs_a <-
    _y .inform N_b
    _x .inform V_b
  where _x ! _y
end rule
(_x ! _y) rules out talking with yourself...
12
Transaction rules
rule success1
  _y .success N_b <-
    _x .select N_b
    _y .inform Vs_a
  where _b in Vs_a
        _x ! _y
end rule

rule success3
  _y .success N_b <-
    _x .request N_b
    _y .inform Vs_a
  where _b in Vs_a
        _x ! _y
end rule
Examples of second level transformations
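
One possible Python reading of the success1 rule above, under the same assumed data layout (illustrative only):

# Sketch of the success1 pattern: a selected topic name N_b counts as a
# success when a different speaker informs a value set that contains b.
def success1(annotated):
    successes = []
    for i in range(len(annotated) - 1):
        x, y = annotated[i], annotated[i + 1]
        if (x["act"] == "select" and y["act"] == "inform"
                and x["speaker"] != y["speaker"]):             # _x ! _y
            values = {t.split(".", 1)[1] for t in y["topics"]} # "V.phone" -> "phone"
            for name in x["topics"]:                           # e.g. "N.phone"
                if name.split(".", 1)[1] in values:            # the "_b in Vs_a" condition
                    successes.append({"speaker": y["speaker"],
                                      "act": "success",
                                      "topics": [name]})
    return successes

dialogue = [
    {"speaker": ".u", "act": "select", "topics": ["N.phone"]},
    {"speaker": ".s", "act": "inform", "topics": ["V.phone"]},
]
print(success1(dialogue))   # one success on N.phone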
13
Status (1)
  • 13 act-topic rules
  • 9 success/failure rules
  • Tested on 12 FAQ dialogues selected to be very
    different
  • Further testing is of course needed
  • Generalisability is an issue
  • Quick test on a single flight ticket reservation
    dialogue
    • sub-topics: a city name can relate to both
      departure and arrival, so we added a sub-topic
      relation, e.g. from < place
    • composite tasks: contrary to FAQ, the sub-tasks
      in ticket reservation are hierarchically related
      to a super-task

No serious trouble yet
14
Status (2)
  • Some rules in the act-topic scheme produce acts
    pointing to problems (lack of smoothness), e.g.
    (see the counting sketch below)
    • repair
    • repeated requests for the same information
  • Transaction failure also contributes negatively
    to smoothness
  • Open question whether requests for repetition and
    system clarification questions count against
    smoothness
  • Further investigation of the negative contributors
    is needed

Smoothness is related to transactions, but how?
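
A small, purely illustrative Python tally of such problem indicators; which indicators should count negatively is exactly the open question above, so this is an assumption, not the authors' measure:

from collections import Counter

# Hypothetical smoothness indicators: count problem acts produced by the
# act-topic rules (repairs, failures, repeated requests).
def problem_counts(annotated):
    counts = Counter()
    seen_requests = set()
    for u in annotated:
        if u["act"] in {"repair", "failure"}:
            counts[u["act"]] += 1
        if u["act"] == "request":
            key = (u["speaker"], tuple(u["topics"]))
            if key in seen_requests:                 # same information requested again
                counts["repeated_request"] += 1
            seen_requests.add(key)
    return counts

dialogue = [
    {"speaker": ".u", "act": "request", "topics": ["N.phone"]},
    {"speaker": ".u", "act": "request", "topics": ["N.phone"]},   # asked again
    {"speaker": ".s", "act": "repair",  "topics": []},
]
print(problem_counts(dialogue))   # one repeated request, one repair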
15
What is still needed
  • To obtain a fully automated annotation process
    • parse dialogues to produce the basic act-topic
      annotation
    • combine parser and pattern transformer into an
      automatic batch system
    • test parser and pattern transformer on a larger
      number of FAQ dialogues
    • test on other kinds of dialogue
  • Both theoretical/conceptual work and practical
    tool work are needed

Input and help from others welcome
16
Smooth dialogues
  • More precise overview of problems, their causes,
    and their seriousness
    • the same topic may fail and succeed in the same
      call
    • few or many repairs
    • information blocks may contain more than asked
      for
  • Distinguish unwanted and erroneous information
    • erroneous information is unacceptable ("tomorrow
      is Friday", "phone 36 36 00 01")
    • information other than what was asked for may be
      more or less serious (fax instead of phone, fax
      instead of email)
  • Mistaking a yes for a no is usually not so serious
    (repairable) but can be a nuisance
  • Misrecognitions

Laila, you are beyond your time frame!