1
Commercialisation Throughout Clinical Development
Commercial Awareness from Bench to Market: Ethical and Academic Aspects
Peter Paul De Deyn
Middelheim Hospital, University of Antwerp
ACRP - 14.12.02 - Brussels
2
Contents
  • Introduction
  • Examples
  • Planning
  • Execution
  • Data Analysis
  • Reporting
  • Recommendations

3
Industry's mission
  • Industry's Ethical Obligation towards society
  • Contributing to biomedical sciences
  • Providing good products or appliances
  • Improving human well-being worldwide
  • Requires marketing / commercialisation
  • Industry's obligation towards shareholders and
    employees
  • Economic Mission
  • Patents run 20 years; 12 years of development
    leaves 8 years, with a possible extension, for
    profit making
  • Marketing / commercialisation are needed to
    finance new programmes

4
Interests of the industry in conducting research
  • Different individuals or departments in
    pharmaceutical companies can behave differently
    in this process
  • Research and Development personnel will want to
    innovate, and come up with new molecules
  • Marketing personnel will want to increase
    market share
  • Marketing people tend to get involved in earlier
    development stages

5
Phases in the accomplishment of the common
mission of industry, researchers and authorities
  • Preclinical development
  • Clinical development (RCTs)
  • Registration
  • Commercialisation, distribution, marketing
  • Meta-analysis
  • Evidence-based medicine and clinical practice
    guidelines

Planning Execution Data analysis Reporting
6
Actors in marketing and clinical development
  • Patients
  • Researchers
  • Pharmaceutical companies
  • Government and regulatory authorities
  • Journals
  • Financial market

7
What this presentation is not about
  • Fraud and Research
  • Fraud hinders the quest for knowledge by
    promulgating falsehoods
  • It indirectly hinders knowledge advancement by
    destroying trust among researchers
  • It undermines the public support for science
  • There is little ethical doubt that a researcher
    has a duty not to falsify data

Resnick 1996, Science Communication
8
WHO
  • 2001 WHO editorial by Jonathan Quick
  • Clinical trials form the basis of effective
    research
  • Three major perils exist
  • Conflict of interest on the part of the
    investigator
  • Inappropriate involvement of research sponsors in
    study design and management
  • Publication bias in result dissemination

Quick 2001, Editorial, Bulletin of the World
Health Organization, 79 1093
9
Sponsorship, authorship and accountability
  • Position paper by editors of several general
    medical journals
  • Clinical trials can have substantial economic
    impact
  • Until recently, academic, independent clinical
    investigators were key players in the design,
    recruitment and data interpretation of clinical
    trials

Davidoff et al 2001, Lancet 358 854-856 (and 12
other journals)
10
Sponsorship, authorship and accountability
  • However, economic pressures mount
  • Many clinical trials are done to facilitate drug
    approval, rather than to test a novel scientific
    hypothesis
  • Trials have increased in size and cost
  • The average cost of bringing a new drug to market
    in the USA is estimated at $802 million

Davidoff et al 2001, Lancet 358 854-856 (and 12
other journals) Healthcare Economist, April 2006.
11
Sponsorship, authorship and accountability
  • Corporate sponsors can dictate the terms of
    participation in the trial
  • Little or no input into trial design
  • No access to raw data
  • Limited participation in data interpretation
  • Results may be buried if they are unfavourable

Davidoff et al 2001, Lancet 358 854-856 (and 12
other journals)
12
Sponsorship, authorship and accountability
  • Action taken by the editors
  • Guidelines for publication ethics were adapted
  • Authors are requested, upon manuscript
    submission, to confirm in writing that they
  • accept full responsibility for the trial
  • had access to the data
  • and controlled the decision to publish

Davidoff et al 2001, Lancet 358 854-856 (and 12
other journals)
13
Examples
  • Planning
  • Execution
  • Data Analysis
  • Reporting

14
Bias in Research Planning
  • EVIDENCE trial
  • Orphan drugs
  • Non-commercial bias
  • Seeding trials

15
Bias in Research Planning
  • Neurology editorial published November 2002,
    "Needed in MS: Evidence, not EVIDENCE"
  • It is unfortunate that the large investment made
    in this trial was not directed toward comparing
    the impact of different preparations or routes or
    frequencies of administration (while holding the
    other factors constant) on disability
  • The same study generated controversy following
    the 2001 AAN meeting in Philadelphia

16
What research does not get done?
  • Everyone has the right to health and wellbeing
    (Helsinki Declaration)
  • R&D for new treatments is a risky business
  • It is unpredictable whether drugs that pass
    pre-clinical development will lead to a new
    registration (Ernst and Young)
  • Industry prefers
  • topics which lead to end-products with commercial
    application
  • diseases that affect patients who can afford
    treatment
  • Giving products a legal "orphan drug" status is
    one answer

17
What research does not get done? Thioctacid
  • Thioctacid is an agent with presumed efficacy in
    the treatment of diabetic and alcoholic
    polyneuropathy.
  • The compound is being marketed in Germany without
    extremely convincing evidence of its activity;
    the manufacturing and marketing company prefers
    not to distribute the compound throughout Europe,
    since this would entail a registration procedure
    necessitating additional placebo-controlled
    randomized clinical trials.
  • If the compound does have specific efficacy, it
    would be unethical not to provide it to all
    patients worldwide.
  • In effect, this company prefers to keep a
    significant market for a possible placebo or
    pseudo-placebo instead of developing a potential
    standard therapy for patients suffering from
    diabetic and alcoholic neuropathy worldwide.
  • De Deyn et al. In Ethics of animal and human
    experimentation. Eds. De Deyn et al. John Libbey,
    London, 1994.

18
What research gets done or repeated too often?
  • Companies can develop "me too" molecules to
    compete with existing products
  • e.g. triptans, statins, cephalosporins,
    thrombolytics
  • Sometimes these are not actually "me too"
    products
  • Development may occur quasi-simultaneously in two
    or more different companies
  • It can be difficult to assess which one was
    first with the breakthrough idea

19
Bias in research planning: head-to-head studies
  • In this trial design, the choice of dosages is
    crucial
  • For the comparator, sub-optimal dosing would lead
    to a bias in favour of the new product
  • Bias might also occur when the comparator,
    against which superiority is proven, is no
    longer the treatment of choice at completion of
    the study
  • Alternatively, an open-label study could also
    induce bias when patients and investigators
    expect better results for the new product

20
Basic Research Planning
  • Basic research can also be funded by industry
  • This can help to position a certain drug, by
    highlighting mechanisms of action.
  • Correct and objective papers may still lead to
    some bias, if certain directions in research are
    explored more than others

21
Examples
  • Planning
  • Execution
  • Data Analysis
  • Reporting

FDA GCP ICH
22
Commercial Bias: outcome
  • Safety of calcium antagonists in arterial
    hypertension (Stelfox NEJM 1998)
  • researchers had ties with the industry in 96% of
    supportive studies
  • researchers had ties with the industry in 37% of
    critical studies
  • Thrombosis risk with 3rd generation oral
    contraceptives
  • Sponsored studies: relative risk 1.1
  • Publicly financed studies: relative risk 2.5
    (a brief sketch of how a relative risk is
    computed follows below)

Ned Tijdschrift Geneeskunde 2002
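To make the comparison above concrete, here is a minimal sketch of how a relative risk is computed from event rates in two groups. The counts are hypothetical, chosen only to reproduce risk ratios of roughly 2.5 and 1.1; they are not data from the contraceptive studies cited above.

  # Hypothetical counts, for illustration only (not the cited studies' data).
  def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
      # Relative risk = event rate in exposed group / event rate in unexposed group
      return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

  # e.g. 25 thrombosis cases per 10,000 users vs 10 per 10,000 non-users
  print(relative_risk(25, 10_000, 10, 10_000))   # 2.5
  # e.g. 11 cases per 10,000 users vs 10 per 10,000 non-users
  print(relative_risk(11, 10_000, 10, 10_000))   # 1.1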
23
Diagnostic Trials: biological markers, e.g.
Tuberculosis
  • Errors in design or reporting are common
  • The significance of trial results is routinely
    overstated
  • A commercial kit for detecting IgG antibody to
    lipoarabinomannan (LAM) showed 79% sensitivity
    for pulmonary tuberculosis
  • Several studies later, an overall sensitivity of
    26% was found in a well designed trial (the
    sketch below shows how sensitivity is computed)

Small & Perkins 2000, Lancet 356: 1048-1049
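As a quick illustration of what those sensitivity figures mean, here is a minimal sketch. The counts are hypothetical and merely mirror the 79% and 26% figures quoted above.

  # Hypothetical counts, for illustration only.
  def sensitivity(true_positives, false_negatives):
      # Sensitivity = proportion of truly diseased patients that the test detects
      return true_positives / (true_positives + false_negatives)

  # Early, favourable evaluation: 79 of 100 confirmed TB patients test positive
  print(sensitivity(79, 21))   # 0.79
  # Later, well designed trial: 26 of 100 confirmed TB patients test positive
  print(sensitivity(26, 74))   # 0.26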
24
Ten ways to cheat on statistical tests when
writing up results (1)
  • Throw all data into a computer and report as
    significant any relationship where p < 0.05 (the
    simulation sketch after this list shows how noise
    alone produces such "findings")
  • If baseline differences between the groups favour
    the intervention group, remember not to adjust
    for them
  • Do not test your data to see if they are normally
    distributed
  • Ignore all withdrawals
  • Always assume that you can plot one set of data
    against another and calculate an r value
    (Pearson correlation coefficient), and that a
    significant r value proves causation
  • If outliers are messing up your calculations,
    just rub them out

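The first item above can be made concrete with a small simulation: when many unrelated variables are each tested against an outcome at p < 0.05, some will look "significant" by chance alone. Everything below is random noise, so every hit is spurious; the sample sizes are arbitrary and chosen only for illustration.

  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(42)
  n_patients, n_variables = 200, 40

  outcome = rng.normal(size=n_patients)                   # random outcome
  variables = rng.normal(size=(n_variables, n_patients))  # 40 unrelated predictors

  # Count how many pure-noise predictors correlate with the outcome at p < 0.05
  false_positives = sum(
      stats.pearsonr(x, outcome)[1] < 0.05 for x in variables
  )

  # At alpha = 0.05, roughly 2 of 40 noise variables are expected to reach
  # "significance" even though no true association exists.
  print(f"{false_positives} of {n_variables} noise variables reached p < 0.05")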
25
Ten ways to cheat on statistical tests when
writing up results (2)
  • If the confidence intervals of your result
    overlap zero difference between the groups, leave
    them out of your report
  • If the difference between two groups becomes
    significant four and a half months into a
    six-month trial, stop the trial and start writing
    up. Alternatively, if the results are nearly
    significant, extend the trial for another three
    weeks (the sketch after this list shows how such
    interim peeking inflates the false positive rate)
  • If your results prove uninteresting, ask the
    computer to go back and see if any particular
    subgroup behaved differently
  • If analysing your data the way you plan does not
    give the result you wanted, run the figures
    through a selection of other tests.

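The stop-early / extend tactic above is not harmless. Here is a minimal simulation, under assumed noise-only data, of an analyst who tests after every 10 patients per arm and stops at the first p < 0.05. The trial sizes and peeking schedule are invented for illustration; the point is only that repeated unplanned looks push the false positive rate well beyond the nominal 5%.

  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(0)
  n_trials, max_n, peek_every = 2000, 120, 10

  early_stops = 0
  for _ in range(n_trials):
      a = rng.normal(size=max_n)   # control arm: no true effect
      b = rng.normal(size=max_n)   # "treatment" arm: no true effect
      # Peek at the accumulating data after every 10 patients per arm
      for n in range(peek_every, max_n + 1, peek_every):
          _, p = stats.ttest_ind(a[:n], b[:n])
          if p < 0.05:
              early_stops += 1     # a spurious "significant" result
              break

  # With 12 looks at pure noise, far more than 5% of trials stop "significant".
  print(f"False positive rate with interim peeking: {early_stops / n_trials:.1%}")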
26
Bias in Data Analysis
  • Primary versus secondary outcome measures
  • Predefined levels of clinical significance of an
    outcome difference
  • Post-hoc analyses (data dredging)
  • Severity of disease
  • Therapeutic time window
  • Subpopulations
  • Arbitrary use of cut-offs after statistical
    analyses ...

27
Pharmaco-Economic Analysis: Science or Marketing?
  • May be used by industry to justify high prices
  • Positive outcomes can be used as marketing tools
  • Authorities request pharmaco-economic analysis in
    phase III studies

28
Examples
  • Planning
  • Execution
  • Data Analysis
  • Reporting

29
Reporting channels
  • Peer reviewed journals
  • Supplements of peer reviewed journals
  • Sponsored (satellite) symposia
  • Abstracts, posters
  • Proceedings
  • Sponsored monographs
  • Marketing leaflets, booklets
  • Websites, mainstream media
  • Information for financial analysts

30
Publication Bias Examples
  • Satellite symposia and their proceedings can
    create false impressions of objectivity
  • Peer review remains necessary for these symposia,
    and for special issues of journals
  • Sometimes clear deception is used, cf. the
    announcement of EVIDENCE trial results during
    the 2001 AAN meeting in Philadelphia
  • At the meeting, attendees were correctly informed
    of a press release
  • Outside of the meeting, the impression was
    created that the data was presented as a paper
    during the meeting

31
Publication Bias Studies
  • Objective: to determine the extent to which
    publication is influenced by study outcome
  • Method: 748 studies submitted to a hospital
    ethics committee in Australia over 10 years
  • Main outcome measure: time to publication

Stern & Simes 1997, Publication bias: evidence
of delayed publication in a cohort study of
clinical research projects, BMJ 315: 640-645
32
Publication Bias Studies
  • Results response to questionnaire was received
    in 70 of cases
  • 218 studies were statistically analysed
  • Those with positive results (p lt 0.05) were much
    more likely to be published than those with
    negative results (p 0.10)
  • Median time to publication after submission to
    IRB 4.7 and 8.0 years

Stern & Simes 1997, BMJ 315: 640-645
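For readers who want to see how such a time-to-publication comparison can be analysed, here is a hedged sketch: studies still unpublished at the end of follow-up are treated as censored, median time to publication is estimated per group, and the two curves are compared with a log-rank test. The data are simulated for illustration (not the Stern & Simes dataset), and the third-party lifelines package is an assumed dependency.

  import numpy as np
  from lifelines import KaplanMeierFitter
  from lifelines.statistics import logrank_test

  rng = np.random.default_rng(1)

  # Simulated years from ethics-committee approval to publication,
  # truncated at 10 years of follow-up (unpublished studies are censored)
  pos_time = rng.exponential(scale=5.0, size=60).clip(max=10.0)
  neg_time = rng.exponential(scale=9.0, size=60).clip(max=10.0)
  pos_published = pos_time < 10.0   # False = still unpublished (censored)
  neg_published = neg_time < 10.0

  kmf = KaplanMeierFitter()
  kmf.fit(pos_time, event_observed=pos_published, label="positive results")
  print("Median time to publication (positive):", kmf.median_survival_time_)
  kmf.fit(neg_time, event_observed=neg_published, label="negative results")
  print("Median time to publication (negative):", kmf.median_survival_time_)

  result = logrank_test(pos_time, neg_time,
                        event_observed_A=pos_published,
                        event_observed_B=neg_published)
  print("Log-rank p-value:", result.p_value)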
33
Economics and scientific journals
  • The articles editors write give little public
    accounting of how much money their journals make
    and what they do with the profits or surpluses
  • In 1998 the Reuters news agency said it would
    defy the embargo (or Ingelfinger rule) if
    findings in a paper affected the stock market
  • Wigzell on Nature and Science: "behaving just
    like pharmaceutical companies - gagging the
    scientists to protect some commercial interest of
    their own"

Altman 1996, Lancet 347: 1459-1463
34
Partial Diagnosis and now Prevention: Recommendations
  • Orphan drug status (addresses planning)
  • GCP and ICH guidelines (address execution)
  • Conflict of interest disclosure policy (addresses
    execution and reporting)
  • Contract between researcher and industry
    (addresses execution and reporting/publication
    policy)
  • Trial database (addresses reporting)

35
Conflict of interest disclosure policy
  • This may seem trivial, but is often not
    explicitly stated
  • Review Science and Engineering Ethics in 2001
  • Only 16 of 1396 highly ranked scientific and
    biomedica journals had conflict of interest
    disclosure policies
  • Has been implemented on a larger scale during the
    last years
  • Nature did not have a financial disclosure policy
    before 23 August 2001

36
Contract
  • Researchers should be careful about the contracts
    they conclude with industry
  • Agreement to publish only after approval by the
    sponsoring company should be limited
  • The sponsor's advice should be given within a
    predefined and acceptable period
  • The sponsor should not have power of veto
  • An independent publication committee and steering
    committee should be negotiated
  • Publication policy should be stated in the
    contract in very clear terms

37
International Standard Randomised Controlled
Trial Number
  • Negative trials tend not to be published and
    positive trials tend to be published repeatedly
  • The Cochrane olanzapine review found 162 reports
    of 15 trials (Duggan et al 2001)
  • Prospective registration should help to prevent
    publication bias
  • Developed by a working group (UK MRC, UK Cochrane
    Centre, trialists, consumers and industry)
Cochrane 2001 1online pb093
38
Academic advisers
  • Use of academic advisers in R&D is evident
  • Experts in the pathology under study are
    mandatory
  • Needs and tools in the specific domain have to be
    identified with their help
  • Use of academic advisers in marketing efforts
  • Acceptable, provided they contribute to
    accessible and correct information

39
In Conclusion
Industry's and researchers' ethical obligation
towards society, industry's obligation towards
shareholders and employees, and researchers'
personal ambitions should be further reconciled
through increased awareness, self-regulation, GCP
and ICH guidelines, and a series of recently
introduced recommendations.