Measuring Financial Success Using Program Evaluation
Jump$tart Coalition for Personal Financial Literacy

Transcript and Presenter's Notes
1
Measuring Financial Success Using Program Evaluation
Jump$tart Coalition for Personal Financial Literacy
Fall 2007
2
Instructor
  • Dr. Angela Lyons
  • Associate Professor
  • Department of Agricultural and Consumer Economics
  • University of Illinois Urbana-Champaign
  • (217) 244-2612
  • anglyons@uiuc.edu

3
The Million Dollar Question
  • At the end of the day, does
    financial education make a difference?

4
Lessons Learned from Current Research
  • Jump$tart Survey of Financial Literacy Among High School Students: captures knowledge levels
  • NEFE High School Financial Planning Program: impact of formal financial education on confidence levels and behaviors of high school students
  • Bernheim, Garrett, and Maki (2001): effect of mandated financial education during high school (longitudinal study)
  • FDIC's Money Smart Program: moving the unbanked into the financial mainstream
  • See complete reference list for recent research
    on financial education and program evaluation.

5
Becoming a critical evaluator.
  • Read media stories carefully
  • Look at the samples being used
  • Information vs. education
  • Planned behavior vs. actual behavior
  • Avoid focusing only on the successes
  • Think beyond participants' finances
  • Be aware of the barriers and challenges related
    to measuring program impact

6
An Overview of the Training Session
  • Setting the Stage for Program Success
  • The Evaluation Process
    Creating Your Toolkit
  • Putting It All Together
    Sample Evaluations
  • NEFE Financial Education and
    Program Evaluation Toolkit
  • Barriers and Challenges to
    Building Successful Programs
  • Building Program Success
    Reporting Program Impact

7
Setting the Stage for Program Success
8
Current State of Program Evaluation
  • Current evaluation efforts are still far from
    satisfactory
  • General lack of evaluation capacity and
    understanding of how to conduct effective
    evaluations
  • Evaluation is still often treated as an afterthought; it needs to be built into the design of the program upfront
  • Lack of attention given to evaluation at all
    levels
  • Need for industry standards for program
    evaluation

Source: Lyons, A. C., Palmer, L., Jayaratne, K.S.U., and Scherpf, E. (2006). "Are We Making the Grade? A National Overview of Financial Education and Program Evaluation." The Journal of Consumer Affairs, 40(2), 208-235.
9
One non-profit administrator commented:
  • "The people that typically end up being told that they have to do evaluation, it's dumped on them and it's usually not a person that has any experience with financial education or expertise in evaluation. They're pretty much told 'here's your new hat, we've been told we have to do this and here's your new hat,' and they don't know. It's not for lack of wanting to do a good evaluation or trying to do a good evaluation. They just don't know...it's not the right person trying to oversee it."

10
On the frontlines.
  • What even is an evaluation?
  • What do we mean by evaluation?
  • How do we know if participants are getting better? It's difficult to assess.
  • What are we trying to measure? There's a lot of confusion out there.
  • What constitutes a successful, or even
    acceptable, evaluation?

11
Getting Started: Thinking like an evaluator (Strategic Planning Guide)
  • Take stock of what you know
  • Identify your signature program(s)
  • Conduct a needs assessment
  • Collect baseline information from your target
    audience
  • Identify your program objectives. Be realistic!
  • What do you want to accomplish? At the end of
    the day, what do you want to show?

12
Outcome-Based Evaluations
  • Outcomes are benefits to clients from
    participating in the program.
  • What do you want your participants to know or be
    able to do when they have finished the program?
  • Outcomes are usually in terms of enhanced
    learning and improved behaviors.
  • Outcomes are often confused with program outputs
    or units of service (i.e., number of clients who
    went through the program)

13
The Logic Model
  • A picture of the program
  • Simple representation of the program theory or action that explains the program and what it is intended to accomplish
  • Shows relationships between inputs, outputs, and
    outcomes

14
The Logic Model (cont.)
  • INPUTS: Resources used to develop the program are called inputs. Time and money are the most common inputs needed to implement educational programs.
  • OUTPUTS: If inputs are invested in the financial education program, then learning opportunities will be created for the target audience. The educational materials, services, and opportunities created are called the program outputs.
  • OUTCOMES: Changes in participants' perceptions, knowledge, and behavior that represent real impact in their lives. The benefits derived by the participants from the program are called outcomes.
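To make the inputs-outputs-outcomes chain concrete, the sketch below (Python; not part of the original slides) records a logic model as structured data. The class, field names, and example program are illustrative assumptions only.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    """Simple record of a program's logic model: inputs -> outputs -> outcomes."""
    program: str
    inputs: List[str] = field(default_factory=list)    # resources invested (time, money, staff)
    outputs: List[str] = field(default_factory=list)   # materials, services, learning opportunities created
    outcomes: List[str] = field(default_factory=list)  # changes in participants' knowledge, attitudes, behavior

# Hypothetical example for a first-time home buyer course
model = LogicModel(
    program="First-Time Home Buyer Education",
    inputs=["2 instructors, 20 hours", "$1,500 materials budget"],
    outputs=["4 workshop sessions delivered", "35 participants completed the course"],
    outcomes=["Participants can compare mortgage offers", "Participants open dedicated savings accounts"],
)
print(model.program, "->", model.outcomes)
```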
15
University of Wisconsin-Extension: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
16
Impact Hierarchy of Outcomes
17
Another useful framework: Transtheoretical Model of Behavior Change (TTM)
  • TTM integrates major psychological theories into a theory of behavior change.
  • Used to identify the stage at which individuals are ready and able to change their financial behaviors.
  • Appropriate educational interventions are then tailored to meet individuals' specific needs at that particular stage.

18
5 Stages of Change
  • Precontemplation
  • Individual not ready to take action and change
    behavior in the immediate future.
  • Rarely seeks help and rarely uses information.
  • Contemplation
  • Individual is getting ready to take action and
    intends to change behavior in next
    6 months.
  • Open to education.
  • Preparation
  • Individual is ready to take action and intends to
    change behavior in next 30 days.
  • Practices behavior by taking small steps towards
    the goal.
  • Seeks information and support, but often
    concerned that changing behavior may be too
    difficult and they may not succeed.

19
5 Stages of Change (cont.)
  • Action
  • Individual has changed behavior within the past 6 months and is working to maintain it.
  • Believes they can change.
  • Can control triggers that cause them to relapse
    into old behaviors.
  • Has a support system to get them through
    challenging times.
  • Maintenance
  • Individual has changed behavior and it has lasted
    for more than 6 months.
  • May relapse into old behaviors, but can overcome
    temptations so that behavior becomes permanent.
  • Can assess the conditions under which relapse
    might occur.
  • Can establish successful coping strategies.
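The stage definitions above turn mainly on how soon the individual intends to change and how long the new behavior has been maintained. The following sketch is a hedged illustration of those rules in code; the function name, inputs, and exact cutoffs applied (30 days, 6 months) are assumptions drawn from the slides, not an official TTM instrument.

```python
from enum import Enum
from typing import Optional

class Stage(Enum):
    PRECONTEMPLATION = 1
    CONTEMPLATION = 2
    PREPARATION = 3
    ACTION = 4
    MAINTENANCE = 5

def classify_stage(intends_to_change: bool,
                   days_until_change: Optional[int],
                   months_maintained: float) -> Stage:
    """Rough stage classifier using the timeframes listed on the slides."""
    if months_maintained > 6:
        return Stage.MAINTENANCE        # behavior change has lasted more than 6 months
    if months_maintained > 0:
        return Stage.ACTION             # behavior has changed but is not yet long established
    if not intends_to_change:
        return Stage.PRECONTEMPLATION   # not ready to act in the immediate future
    if days_until_change is not None and days_until_change <= 30:
        return Stage.PREPARATION        # intends to act within 30 days
    return Stage.CONTEMPLATION          # intends to act within roughly 6 months

print(classify_stage(True, 14, 0))      # Stage.PREPARATION
```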

20
Example
21
Identifying Program Objectives
  • Objectives should be
  • Specific
  • Measurable
  • Achievable and observable
  • Reasonable
  • Time specific
  • S.M.A.R.T. objective statements should clearly
    define what you want to achieve with your
    program.
  • They should list the end outcomes the program
    intends to affect or change.
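One way to keep objectives S.M.A.R.T. in practice is to store each one in a form that forces every element to be written down. The sketch below is illustrative only; the field names and the example objective are assumptions, not part of any toolkit.

```python
from dataclasses import dataclass

@dataclass
class SmartObjective:
    """One program objective, with each S.M.A.R.T. element written out explicitly."""
    specific: str       # what exactly the program intends to change
    measurable: str     # indicator used to verify the change
    achievable: str     # why the target is achievable and observable for this audience
    reasonable: str     # why the target is reasonable given resources
    time_specific: str  # when the outcome will be measured

# Hypothetical objective for a first-time home buyer program
objective = SmartObjective(
    specific="Increase first-time home buyers' ability to compare mortgage offers",
    measurable="Share of participants who correctly rank three sample loan offers on the post-test",
    achievable="The skill is taught and practiced during session 2",
    reasonable="A target of 75% correct is in line with past cohorts",
    time_specific="Measured at the end of the 4-week program",
)
print(objective.measurable)
```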

22
Writing objective statements
  • First-time home buyer education program
  • The objectives of this program are to:
  • Develop first-time home buyers' ability to shop for the lowest mortgage interest rate.
  • Teach first-time home buyers how to save money for closing costs.
  • Teach first-time home buyers how to assess affordable housing.
  • Debt reduction education program
  • The objectives of this program are to:
  • Develop participants' ability to identify needs and wants separately.
  • Develop participants' ability to control wants to reduce expenditures.
  • Develop participants' ability to avoid impulse and emotional spending.

23
Achieving your objectives: Selecting appropriate indicators
  • General Indicators (objective and subjective)
  • Number of programs, participants, etc.
  • Knowledge gains
  • Changes in attitudes and satisfaction
  • Changes in skills and confidence
  • Changes in intended and actual behaviors
  • Specific Indicators (objective)
  • Actual dollar changes (reduce debt, increase
    savings)
  • Development of financial plans
  • Changes in spending habits
  • Building or rebuilding credit reports and credit
    scores

24
(No Transcript)
25
Activity: Creating a Road Map for Your Program Evaluation
26
The Evaluation Planning Process: Creating Your Toolkit
27
What evaluation method should you use to collect
impact data?
  • Surveys
  • Focus groups
  • Interviews
  • Observations
  • Case studies
  • Tests of ability
  • Some examples from financial education.

28
Questions to ask yourself…
  • What are the pros and cons of each method?
  • What is the purpose of the evaluation?
  • Who will use the information and how?
  • What information do you want to collect?
  • Who is your target audience?
  • What are your available resources (i.e., time,
    money, and staff)?
  • What is your timeline?
  • What is your expertise and evaluation capacity?
  • Who are your partners, funders, and stakeholders?

29
Common survey methods used to collect impact data
  • Post evaluation only
  • Pre and post evaluation
  • Follow-up
  • Stages to Change (TTM)
  • Control groups and longitudinal studies
  • Key question to ask
  • What is the length of your program?

30
Post evaluation only
  • When to use: Short programs that are less than 2 hours
  • Advantages
  • Only need to survey group once
  • Good for limited-resource audiences and groups that are transient
  • Relatively inexpensive and less time intensive
  • Can document participants' levels of knowledge, skills, and planned behaviors at the end of the program.
  • Disadvantages
  • With no pre assessment, it's difficult to document potential and actual changes in knowledge, attitudes, and behavior.
  • Retrospective pre-tests (RPTs): The Post-Then-Pre Evaluation
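With a retrospective pre-test, each participant rates the same item twice on the end-of-session form: once as they see themselves now and once as they recall themselves before the program. Below is a minimal sketch of tallying those paired ratings, assuming a 1-5 scale and invented responses.

```python
# Each record: retrospective "before" rating and current "after" rating (1-5 scale assumed)
responses = [
    {"before": 2, "after": 4},
    {"before": 3, "after": 4},
    {"before": 1, "after": 3},
]

changes = [r["after"] - r["before"] for r in responses]
avg_change = sum(changes) / len(changes)
pct_improved = 100 * sum(c > 0 for c in changes) / len(changes)

print(f"Average self-reported change: {avg_change:.1f} points")
print(f"{pct_improved:.0f}% of respondents reported improvement")
```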

31
Retrospective Pre-Test (RPT)
  • Examples and more info on RPTs
  • Collecting Evaluation Data: End-of-Session Questionnaires. University of Wisconsin-Extension. www.uwex.edu/ces/pdande/evaluation/evaldocs.html
  • Lyons, A. C., Y. Chang, and E. Scherpf. "Translating Financial Education into Behavior Change for Low-Income Populations." Financial Counseling and Planning Journal, 17(2), 27-45.
  • Chang, Y. and A. C. Lyons. "Are Financial Education Programs Meeting the Needs of Financially Disadvantaged Consumers?" Networks Financial Institute, Indiana State University, 2007-WP-02.

32
Pre and post evaluations
  • When to use: Programs that are 2 hours or longer
  • Advantages
  • Can compare pre and post responses and document
    changes in knowledge, attitudes, and behavior.
  • Can be used to document immediate changes in
    knowledge, skills and planned behaviors following
    the program.
  • Disadvantages
  • More time intensive.
  • Identification numbers are needed to match pre
    and post surveys.
  • May be difficult to show actual behavior change.
  • May be difficult to show that the intervention
    caused the change.
  • Doesn't account for other possible reasons for change.
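Because identification numbers are what link the two forms, the analysis step usually amounts to matching pre and post records on ID and computing each participant's change. Here is a minimal sketch under an assumed data layout (dictionaries keyed by participant ID, invented scores).

```python
pre  = {"P01": 4, "P02": 6, "P04": 5}   # participant ID -> pre-test score (out of 10, assumed)
post = {"P01": 7, "P02": 8, "P03": 9}   # participant ID -> post-test score

# Keep only IDs that appear on both forms; unmatched surveys are dropped
matched = {pid: (pre[pid], post[pid]) for pid in pre.keys() & post.keys()}
gains = {pid: after - before for pid, (before, after) in matched.items()}

print(f"Matched {len(matched)} of {len(pre)} pre-tests")
print("Average gain:", sum(gains.values()) / len(gains))
```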

33
Follow-Ups
  • When to use
  • Program is comprehensive enough to potentially
    result in intermediate and long-term impact.
  • Must have adequate resources and evaluation
    capacity.
  • Usually administered three to six months after
    the program.
  • Can document changes in actual financial
    behaviors, ability to achieve financial goals,
    and overall financial position.

34
Delivery methods for follow-ups
  • Face-to-face
  • Mail (paper survey, post cards)
  • Telephone
  • Internet (e-mail, website)
  • Group interviews

35
Stages to Change (TTM)
  • When to use: Programs that have multiple sessions
  • Advantages
  • Can document intermediate and long-term change.
  • Easier to measure actual behavior change and to
    control for other factors that may lead to change
    over time.
  • Can identify stage at which individual is ready
    and able to change behavior.
  • Behaviors can be recorded at the beginning,
    middle, and end of the program so that changes in
    actual behavior can be observed.
  • Disadvantages
  • Time and resource intensive.
  • May require additional progress reporting and
    long-term follow-up.
  • Can only be used with multi-session programs.
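Since behaviors are recorded at the beginning, middle, and end of a multi-session program, one simple summary is the number of participants who moved forward at least one stage between the first and last recording. The sketch below uses invented stage data (1 = precontemplation through 5 = maintenance).

```python
# participant ID -> stage recorded at the start, middle, and end of the program
stage_records = {
    "P01": [1, 2, 3],
    "P02": [2, 2, 4],
    "P03": [3, 3, 3],
}

moved_forward = sum(stages[-1] > stages[0] for stages in stage_records.values())
print(f"{moved_forward} of {len(stage_records)} participants advanced at least one stage")
```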

36
Train-the-trainer evaluations
  • Similar to pre and post evaluation, but more
    content specific.
  • Covers subject material in more detail to ensure
    that trainers have an adequate level of knowledge
    to teach the program to others.
  • Can be used to document changes in both the instructors' teaching skills and personal financial behaviors.
  • Follow-ups can document how the curriculum
    materials are being used and identify additional
    programming needs.

37
Designing the Evaluation Instrument: Survey Content
  • General reactions to the session
  • Changes in knowledge
  • Changes in motivation, confidence, and abilities
  • Intended changes in behavior
  • Actual changes in behavior
  • Future programming needs and preferences
  • Demographics
  • Qualitative / open-ended responses

38
General reactions to the session
  • Please rate the instructor(s), materials, and the
    overall program
  • by checking the box that best applies.

39
Measuring changes in knowledge
  • Testing Knowledge
  • Please circle your answer to each of the
    following statements.

40
Measuring changes in knowledge (cont.)
  • Format can be True/False or multiple choice.
  • True/False is a reliable indicator for low-literacy audiences and youth.
  • The more questions you ask, the greater the reliability measure.
  • Post-test: 5 questions
  • Pre- and post-test: 5-10 questions
  • Train-the-trainer: 10-25 questions
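Scoring such knowledge items is mechanical once an answer key exists: count the matches and convert to percent correct. The sketch below is illustrative; the items and key are invented, and the same scoring applies to pre-, post-, or train-the-trainer tests.

```python
# Hypothetical True/False answer key
answer_key = {"Q1": True, "Q2": False, "Q3": True, "Q4": False, "Q5": True}

def score(responses: dict) -> float:
    """Return percent correct for one participant's True/False responses."""
    correct = sum(responses.get(q) == ans for q, ans in answer_key.items())
    return 100 * correct / len(answer_key)

print(score({"Q1": True, "Q2": True, "Q3": True, "Q4": False, "Q5": True}))  # 80.0
```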

41
Changes in motivation, confidence, and abilities
  • Building Skills/Confidence Indicators
  • Please check the box that best describes your
    confidence to do
  • the following

42
Changes in motivation, confidence, and abilities (cont.)
  • Recording Participants' Attitudes
  • Please check the box that best describes how much
    you agree
  • with the following statements.

43
Intended changes in behavior
  • Taking Charge Indicators
  • Please check the box that best describes your
    answer.

44
Actual changes in behavior
  • Financial Behavior Indicators
  • For each financial practice, please check the box
    that best describes
  • your current behavior.

45
Using TTM Scale
  • Financial Behavior Indicators
  • For each financial practice, please check the box
    that best describes
  • your current behavior.

46
Capturing behavior change with follow-ups
  • Since completing the program, please check the
    box that best describes
  • how often you are doing each financial practice.
    There is no right or
  • wrong answer. (Choose only one)

47
Capturing behavior change with follow-ups (cont.)
  • Progress Reporting
  • Please record your financial position based on your current progress in the program.

48
A few words about train-the-trainer programs.
  • Testing knowledge
  • Building teaching skills
  • Shaping personal skills
  • Taking action for teaching
  • Taking action for personal financial success
  • Follow-ups

49
Qualitative / Open-Ended Questions (common
examples)
  • Post Evaluation Only and Pre and Post
    Evaluation
  • What did you like the most about this program?
  • What did you like the least about this program?
  • How could this program be improved?
  • Would you recommend this program to others?
  • Stages to Change Evaluation
  • What has made it easier for you to improve your
    financial practices?
  • What has prevented you from improving your
    financial practices?
  • With respect to the overall program, what did you
    like the most?
  • What did you like the least?
  • How could this program be improved?
  • Have you shared what you learned with others?
  • Would you recommend this program to others?

50
Qualitative / Open-Ended Questions (cont.)
  • Train-the-Trainer Evaluation
  • What was the most helpful information you
    received during this training program?
  • How could this training program be improved?
  • How do you plan to share this information with
    your target audience(s)?
  • What information and materials from this training
    do you plan to share with your target
    audience(s)?
  • Will you share what you learned with other
    instructors and colleagues?
  • Would you recommend this training program to
    other instructors and colleagues?

51
Demographic Questions
  • Age
  • Gender
  • Race, Ethnicity, and Language
  • Marital Status
  • Education
  • Employment
  • Family Structure
  • Health Status
  • Income, Assets, and Debts
  • Region/Location
  • Financial Experience
  • Students/Youth
  • Instructors/Educators

52
Common types of survey questions
  • Yes/No questions
  • True/False
  • Agree/Disagree
  • Multiple choice
  • One best answer
  • Multiple answers
  • Rating and ranking questions
  • Qualitative / open-ended questions

53
Choosing measurement scales and scoring
  • Example
  • Resource
  • Collecting Evaluation Data: End-of-Session Questionnaires.
  • University of Wisconsin-Extension, pp. 62-64.
  • www.uwex.edu/ces/pdande/evaluation/evaldocs.html
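As a small illustration of scale scoring (the labels and point values here are assumptions, not the cited resource's official scheme), a Likert-type response set can be mapped to numbers so item means can be compared before and after the program.

```python
# Assumed 5-point agreement scale
scale = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3, "Agree": 4, "Strongly agree": 5}

responses = ["Agree", "Strongly agree", "Neutral", "Agree"]   # invented responses to one item
scores = [scale[r] for r in responses]
print("Item mean:", sum(scores) / len(scores))   # 4.0
```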

54
Other helpful tips on survey design.
  • Think carefully about how to write the questions
    given your target audience. Use plain language.
  • Make the evaluation form easy to complete (i.e.,
    white space and font).
  • Include simple instructions.
  • Start with non-threatening questions.
  • Keep the evaluation as short as possible.
  • Cluster similar items to save time and space.
  • Protect the participants' identity.
  • Consider issues such as sample selection,
    response bias, and measurement error.

55
Putting It All Together! Sample Evaluations
56
Useful references for evaluation design
  • NEFE Financial Education Evaluation Toolkit
  • http://www2.nefe.org/eval/index.php
  • Collecting Evaluation Data: End-of-Session Questionnaires.
  • University of Wisconsin-Extension.
  • www.uwex.edu/ces/pdande/evaluation/evaldocs.html
  • A Step-by-Step Guide to Developing Effective Questionnaires and Survey Procedures for Program Evaluation Research.
  • Rutgers Cooperative Research Extension, FS995.
  • www.rcre.rutgers.edu/pubs/publication.asp?pid=FS995

57
NEFE Financial Education Evaluation Toolkit: http://www2.nefe.org/eval/intro.html
58
NEFE Financial Education Evaluation Toolkit
  • Database
  • Post evaluation only with option for follow-up
  • Pre and post evaluation with option for follow-up
  • Stages to Change Evaluation
  • Train-the-Trainer
  • Testing Knowledge
  • Building Skills
  • Taking Charge
  • Manual
  • How-to-guide for grass-roots level organizations
  • Examples (survey instruments, executive summary,
    reports)
  • Guidance on how to organize and present impact
    data

59
Manual: http://www2.nefe.org/eval/manual.html
60
(No Transcript)
61
Part I Financial Education Overview
62
Part II Understanding Program Evaluation
63
(No Transcript)
64
(No Transcript)
65
Part III The Evaluation Planning Process
66
(No Transcript)
67
Part IV Using the Evaluation Database
68
Part V Reporting Program Impact
69
(No Transcript)
70
(No Transcript)
71
Appendix Sample Evaluation Instruments
72
Database: http://www2.nefe.org/eval/index.php
73
Step 1 Program Info and Follow-up
74
Step 2 Knowledge Questions
75
Step 2a Selecting Questions
76
Step 2b Customizing Questions
77
Step 3 Confidence and Behavior Indicators
78
Step 4a Recommendations
79
Step 4b Selecting Statements
80
Step 4c Customizing Statements
81
Step 5 Qualitative Data
82
Step 6 Demographics
83
(No Transcript)
84
Step 7 Follow-Up Financial Progress Indicators
85
Step 8 Follow-Up Personal Achievements
86
Step 9 Follow-Up Demographics
87
Step 10 Finalizing Evaluation
88
Activity: Creating Your Evaluation Action Plan
89
Implementing Your Evaluation: Putting Your Tools into Action
90
Barriers and Challenges of Conducting Program
Evaluations
  • Defining program success
  • Setting realistic expectations for program
    participants.
  • Choosing appropriate outcomes and indicators
    based on participants financial situation or
    other external constraints.
  • Identifying the stage when a participant
    is ready and willing
    to change.
  • Finding the teachable moment.

91
Barriers and Challenges (cont.)
  • "What is driving this financial education movement? Why is it so important? What are we ultimately trying to address? Is it reducing the poverty gap in this country? Between those that have and those that don't have. And it's widening. And those at the bottom end of the spectrum...what we're asking them is to build wealth. And at the same time, what we're asking people in this country who make $20,000 or less is: Absent us raising your wages in this country, we're asking you to build wealth, to participate in IDA programs. We're asking you to save with the little amount of money you're making. We're asking you to reduce your debt burden, learn how to manage your money, and clean up your credit history with the little amount of money you're working with. And we want you to get from point A to point B with all those constraints."

Source: Lyons, A. C., Palmer, L., Jayaratne, K.S.U., and Scherpf, E. (2006). "Are We Making the Grade? A National Overview of Financial Education and Program Evaluation." The Journal of Consumer Affairs, 40(2), 208-235.
92
Barriers and Challenges (cont.)
  • Collecting data from program participants is challenging
  • Little incentive to complete evaluations ("like pulling teeth").
  • Reluctance to divulge personal information (surveys too personal; lack of trust).
  • High dropout rates, low response rates, and difficult to track.
  • Literacy levels (i.e., ESL, reading level).
  • Tradeoff between participation and evaluation rigor.
  • Collecting sensitive data and information.
  • Participants' rights and human subjects requirements.

93
Barriers and Challenges (cont.)
  • Designing and implementing program evaluations
  • The PUSH for increased rigor.
  • Limitations of one-shot evaluations.
  • (intended vs. actual behavior change)
  • Lack of resources to conduct longitudinal
    studies.
  • (follow-ups and tracking of program
    participants)
  • Control groups help to mitigate selection bias
    but difficult to realistically implement.
  • Evaluation process is cumbersome.
  • A rush to the finish line.
  • Lack of time, staff, and financial resources.

94
Overcoming the Barriers
  • Increase rigor by planning more strategically.
  • Focus on signature programs and on multi-session
    programs.
  • Partner and pool resources.
  • "We're jumping into evaluating everything, instead of taking a couple of projected outcomes or a subset of all that we work with and trying to do evaluations with those."

95
Overcoming the Barriers (cont.)
  • Identify available resources: financial and non-financial.
  • Understand funders' needs and how they fit into your evaluation plan.
  • Take into consideration program delivery methods.

96
Overcoming the Barriers (cont.)
  • Establish a consistent and workable set of standards for measuring program impact.
  • Create evaluation tools that are flexible enough to account for the wide range of programs (i.e., a one-stop shop with survey instruments, best practices, online training workshops, etc.)
  • Reality of program evaluation at all levels (disconnect; need better awareness of resource constraints; continued recognition of traditional evaluation methods).

97
Activity: Overcoming Your Barriers and Challenges
98
Building Program Success: Reporting Program Impact
Preview of coming attractions.
99
The common fear of evaluation
  • It will show what we're doing wrong!
  • Learning from the successes and failures.

100
Putting it all together
  • Look for themes.
  • Work with what you've got.
  • Learn as you go and be flexible.
  • Tell the story, which can be the most powerful
    depiction of the benefits and services of your
    program.
  • Use the findings to improve your program.

101
Tips for telling your story
  • Know your audience.
  • Use simple descriptive statistics (i.e., counts,
    percentages, and averages) when analyzing and
    interpreting data.
  • Don't use jargon. Be straightforward and clearly state major findings.
  • Use language that is suggestive rather than decisive (i.e., "the data suggest" rather than "the data show"). Be careful not to overstate your findings.
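In the spirit of the advice above, counts, percentages, and averages are usually enough for a program report. Here is a minimal sketch with invented 1-5 overall program ratings.

```python
from collections import Counter

ratings = [5, 4, 4, 3, 5, 4, 2, 5]   # assumed 1-5 overall program ratings

counts = Counter(ratings)
average = sum(ratings) / len(ratings)
pct_top = 100 * sum(r >= 4 for r in ratings) / len(ratings)

print("Response counts:", dict(sorted(counts.items())))
print(f"Average rating: {average:.1f}; {pct_top:.0f}% rated the program 4 or higher")
```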

102
  • Blend the presentation with quantitative and
    qualitative data.
  • Do not generalize the findings beyond the group studied. Report the results in terms of the program participants rather than all U.S. families or all New York residents.
  • Clearly describe who the results represent. Provide information and demographics on the sample of program participants.
  • Be honest about your program's strengths and weaknesses, while highlighting the positive.

103
Writing Impact Statements - Examples
  • Statements that reflect intentions
  • As a result of participating in this financial education program, X percent reported that they…
  • plan to do/use/adopt
  • are more knowledgeable
  • are more confident in their ability to do
  • are more likely to do/use/adopt
  • will do/use/adopt
  • …a particular attitude, piece of information, or behavior.

104
  • Statements that reflect actual actions
  • As a result of participating in this financial education program, X percent reported that they…
  • are now doing
  • did
  • used
  • increased knowledge of
  • adopted
  • …a particular attitude, piece of information, or behavior.
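The "X percent" figures in statements like these come straight from simple proportions in the evaluation data. Below is a sketch with assumed follow-up fields showing how the percentages might be computed and dropped into the statement templates above.

```python
# Invented follow-up responses; field names are hypothetical
follow_up = [
    {"now_tracks_spending": True,  "plans_to_open_savings": True},
    {"now_tracks_spending": False, "plans_to_open_savings": True},
    {"now_tracks_spending": True,  "plans_to_open_savings": False},
    {"now_tracks_spending": True,  "plans_to_open_savings": True},
]

def pct(field: str) -> int:
    """Percent of respondents answering yes to the given field."""
    return round(100 * sum(r[field] for r in follow_up) / len(follow_up))

print(f"As a result of participating in this financial education program, "
      f"{pct('now_tracks_spending')} percent reported that they are now tracking their spending.")
print(f"As a result of participating in this financial education program, "
      f"{pct('plans_to_open_savings')} percent reported that they plan to open a savings account.")
```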

105
Analyzing the findings
  • How will you use the findings for program
    improvement and internal reporting?
  • How will the evaluation findings be communicated
    and shared with others?

106
Disseminating the findings
  • Written reports
  • Short summary statements
  • Media releases
  • Internet postings
  • Graphs and visuals
  • Presentations
  • Displays, posters, etc.

107
Useful references for reporting impact
  • Collecting Evaluation Data: Surveys.
  • University of Wisconsin-Extension.
  • www.uwex.edu/ces/pdande/evaluation/evaldocs.html
  • Taking Stock: A Practical Guide to Evaluating Your Own Programs.
  • Horizon Research, Inc.
  • www.horizon-research.com/reports/1997/stock.pdf
  • Tipsheets 66, 80, 81.
  • Penn State Cooperative Extension.
  • www.extension.psu.edu/evaluation/titles.html

108
Where do we go from here? Useful resources at your fingertips
109
Link to Program Evaluation
Cornell University Extension: http://staff.cce.cornell.edu/administration/program/index.htm
110
Cornell University Extension: http://staff.cce.cornell.edu/administration/program/evaluation/evalrefs.htm
111
University of Wisconsin-Extension: http://www.uwex.edu/ces/pdande/evaluation/index.html
112
General Reading List
  • Lyons, A. C., Palmer, L., Jayaratne, K.S.U., and Scherpf, E. (2006). "Are We Making the Grade? A National Overview of Financial Education and Program Evaluation." The Journal of Consumer Affairs, 40(2), 208-235.
  • Lyons, A. C. (2005). "Financial Education and Program Evaluation: The Challenges and Potentials for Financial Professionals." Journal of Personal Finance, 4(4), 56-68.
  • U.S. Government Accountability Office. (2004). The Federal Government's Role in Improving Financial Literacy, GAO-05-93SP.
  • Financial Literacy and Education Commission. (2006). Taking Ownership of the Future: The National Strategy for Financial Literacy. www.mymoney.gov

113
Questions