QTA 2015 & beyond
Now that we have a consensus about what we want to measure, how can we do that?
What & how are MSI peers measuring to assure quality?
What do we want to achieve through QTA?
The Supply–Enabling Environment–Demand (SEED)™ Assessment Guide for Family Planning Programming is a tool for identifying strengths and weaknesses in national FP programs through the identification of programmatic gaps that require intervention or more in-depth assessment through other methodologies.
USAID MEASURE project
The primary objective of this initiative was to develop and test a practical, low-cost methodology for monitoring quality of care (QC) in clinic-based family planning programs in developing countries.
Population Council: situation analysis
WHO: Quality of care in the provision of sexual and reproductive health services
Why are we having this session?
consideration of scope & organisational scale...
If the facility is the unit of analysis, the results will reflect the experience of clients in the average facility. If the client is used as the unit of analysis, the results will reflect the experience of the average client in the network of facilities.
interventions can enhance a facility’s readiness and/or the quality of care provided.
Different ways to conceptualise how QTA fits into the quality agenda...
Bruce & Jain
RamaRao & Mohanam
QAF & the NHS system
Or, is QTA just about these? or more? what should it be? should it also look at impacts?
• A framework for assessing quality from the client’s perspective: The client usually does not see the apparatus behind her experience, all the vital work required to provide services. Thus, the policies, resource allocation decisions, and management tasks that precede the delivery of services are not directly experienced, but their outcome, the service giving, is.
Lord Darzi established a single definition of quality for the NHS in his 2008 review High Quality Care for All. This definition comprises three dimensions of quality, all of which are required for a high-quality service:
• clinical effectiveness;
• patient safety;
• patient experience.
monitoring of which inputs? or is it monitoring of the foundations? and what is the relationship of audit to outcomes represented in the dashboard?
these are the same as the outcomes represented in MSI QAF dashboard
In the QIQ field test, the client was used as the unit of analysis, since many of the indicators attempt to capture the client-provider experience. A basic decision that needs to be made in each future application of these instruments is whether to use the client or the facility as the unit of analysis. If the facility is the unit of analysis, the results will reflect the experience of clients in the average facility. If the client is used as the unit of analysis, the results will reflect the experience of the average client in the network of facilities.
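The trade-off described above can be illustrated with a small numeric sketch. The scores below are hypothetical, not from the QIQ field test; the point is that when facilities differ in size, averaging per facility and averaging per client give different answers.

```python
# Hypothetical quality scores (0-100) for clients, grouped by the facility
# where they were served. Facility sizes differ, which is exactly the
# situation in which the two units of analysis diverge.
scores_by_facility = {
    "Facility A": [90, 88, 92],                      # small facility, high quality
    "Facility B": [60, 55, 65, 58, 62, 57, 61, 59],  # large facility, lower quality
}

# Facility as unit of analysis: average the facility means
# ("the experience of clients in the average facility").
facility_means = [sum(s) / len(s) for s in scores_by_facility.values()]
facility_level = sum(facility_means) / len(facility_means)

# Client as unit of analysis: pool all clients, then average
# ("the experience of the average client in the network of facilities").
all_clients = [score for scores in scores_by_facility.values() for score in scores]
client_level = sum(all_clients) / len(all_clients)

print(f"facility-level estimate: {facility_level:.1f}")
print(f"client-level estimate:   {client_level:.1f}")
```

Because the larger facility scores lower, the client-level estimate sits below the facility-level one; which estimate is "right" depends on whether the question is about facilities or about clients.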
The Situation Analysis approach to research on the supply side of family planning programs has four essential objectives:
1. To describe the potential of current policies and program standards to promote the delivery of quality services;
2. To describe and compare the current readiness of service delivery staff and facilities to provide quality services to clients against the current policies and program standards;
3. To describe the actual quality of care received by clients;
4. To evaluate the impact the provision of quality services has on client satisfaction, contraceptive use dynamics, fulfillment of reproductive intentions, and ultimately, on fertility (in expanded research designs, most often using a panel of respondents).
The relationship between the four objectives and three levels of measurement (national level, program service delivery point level, and client level) is shown in Figure 1.
The second objective implies a description of the existing situation at service delivery points. Typically this "basic" Situation Analysis study describes the extent to which the current service delivery system is "ready" to provide quality services. At the SDP level, readiness to deliver quality of care services means that staff are available, trained, and competent to give services; commodities and equipment are available, functioning, and used; and the facility is adequate to handle the client load. In other words, the conditions for the delivery of quality services exist.
The third objective suggests a more analytic study which 1) describes the quality of services received by the client, that is to say, the interaction between the provider and the client, and 2) compares the potential readiness of the program to provide quality services against the actual receipt by clients of quality services. For example, a clinic that does not have contraceptive commodities, trained staff, and basic equipment does not have the potential to provide quality services. Such a clinic is simply not ready to provide family planning services of any kind, let alone quality services. On the other hand, the presence of commodities, trained staff, and basic equipment is not a guarantee that quality services actually will be received by a client. A provider, even highly trained, may not want to give services to unmarried adolescents, offer clients a full range of methods, fully describe each method, discuss side effects of methods, encourage
A representative sample of SDPs, or all SDPs within a geographic area of interest (country, city, district, province), are visited for at least a full day by a team of at least three people, including at least one with clinical training (a physician, nurse, nurse/midwife) and at least one with a social science background and field interview experience.
A complete inventory is taken of equipment and supplies.
Service statistics (if available) are recorded for the past 12 months.
All family planning service providers are interviewed regarding family planning and other reproductive health issues.
Observations are made of the interaction between service providers and all new and continuing family planning clients on the day of the visit.
All clients observed are subsequently interviewed as they leave the SDP.
A selection of MCH clients are also interviewed.
The basic minimum data collection instruments are:
• Inventory for Facilities Available and Services Provided at the Service Delivery Point
• Observation Guide for Interaction Between Family Planning Clients and Service Providers
• Exit Interview for Family Planning Clients
• Interview for Staff Providing Family Planning/Reproductive Health Services at the Service Delivery Point
• Interview for MCH Clients Attending the Service Delivery Point
Typically, at the beginning of the day, the person with clinical training will begin observing client and provider interactions. Later, this person will also help one of the other team members to complete the inventory of equipment. The social scientist usually will be responsible for client exit interviews and staff interviews. He/she may also help with collecting information on clinic records, reporting, and service statistics.
how does this work?
This paper summarizes the evidence that emerged from 25 research studies completed between 2001 and 2010 and supported by the Social Science and Operations Research Initiative on Quality of Care in Sexual and Reproductive Health
As a first step in ensuring choice, products and services must be available at service-delivery points. In an evaluation of differential contraceptive use among clinics in Chile, investigators found that measures of availability of methods had the greatest predictive power over other aspects of quality
Information given to clients
Counselling sessions are an opportunity for providers to dispel myths, ensure comprehension of treatment instructions and follow-up, tailor treatment to client needs and circumstances, and build rapport for future interactions. Table 3 provides data on the content of counselling sessions. The data indicate that the opportunity to provide essential information to clients during counselling is often missed.
Several studies documented current clinical practices, both recommended practices and the use of non-evidence-based procedures. Overall, the evidence suggests unequal implementation of evidence-based clinical protocols and, in some cases, substandard care. Key to improving the quality of clinical care, however, is understanding the reasons for these observed deficiencies in and barriers to implementation.
Some women who seek free services or subsidized services in the public sector report experiencing mistreatment, disrespect, and even physical abuse (9, 25, 26, 36). In such situations, women lack the power to demand quality services, and even expect mistreatment from care providers. Observers note that providers give differential treatment to clients based on age, social class, or cultural or economic status.
Appropriate constellation of services
This element of quality refers to the organization of services so that they are convenient and acceptable to clients, respond to the cultural concept of health, and meet the health needs of the communities they serve. While there are many aspects of service organization that influence the perceived quality of services, the studies reviewed here highlight the importance of financial cost and availability of
Next, clients need to be able to obtain services and products that are appropriate for their care. Several studies examined women’s ability to obtain their preferred contraceptive method
Table 3 displays selected findings on information provided to clients regarding STIs and condom use – often difficult topics for providers and clients to discuss. Few clients who receive sexual and reproductive health services are given the benefit of an opportunity to discuss condoms.
Based on interviews with providers and their supervisors, many health professionals simply do not view counselling as a priority, or even an essential component of their job. In interviews, providers report time limitations and disinterest as the main barriers to providing counselling.
In Uganda, providers reported differential treatment based on the client’s perceived ability to pay. For example, providers would not bother referring a client for treatment or contraceptives if the provider felt the client could not pay for such services (14).
In Sri Lanka, women of low socioeconomic status (SES) were less likely to receive adequate home-based postnatal care than women of high SES.
Where services are provided through outreach, providers hold even greater power in determining which clients receive care.
written for high- or mid-level FP program managers in ministries of health, donor agencies, or technical organizations
EngenderHealth’s SEED Programming Model is based on the principle that FP/SRH programs will be more successful and sustainable if they comprehensively address the multifaceted determinants of health, and if they include synergistic interventions that:
Attend to the availability and quality of services and other supply-related issues
Strengthen health systems and foster an enabling environment for FP/SRH-seeking behavior
Improve knowledge of FP/SRH and cultivate demand for FP/SRH services
1. FP is offered through a variety of service delivery modalities.
2. Facilities are adequately equipped and staffed to provide quality FP services.
3. Providers and facility staff have the necessary skills to provide quality FP services.
4. Management, supervision, and quality assurance (QA) and quality improvement (QI) systems are operational.
5. A broad mix of FP methods is available.
6. FP services are integrated with other health services.
7. Referral systems are functional where FP methods or services are unavailable.
8. The private sector is involved in the provision of FP services.
9. FP services are inclusive of youth.
10. Clients receive high-quality FP counseling.
each indicator comes with a 1-page guide
a discussion guide for each of these types of people is also provided:
11. The FP program has effective leadership and management.
12. Supportive laws, policies, and guidelines for FP are operational at all levels.
13. Human and financial resources are available for FP and are allocated effectively.
14. Programmatic decision making is evidence-based.
15. Contraceptive security measures are in place.
16. Advocacy efforts support the FP program.
17. Champions at all levels advocate for FP.
18. Communities are engaged in addressing barriers to FP use.
19. The FP program works to foster positive social norms and transform gender roles.
20. The program reduces the cost of FP to increase demand.
21. An SBCC strategy for FP is in place.
22. Commercial and social marketing are used to create demand.
23. The FP program utilizes mass media SBCC approaches.
24. The FP program engages communities and champions in SBCC.
25. The FP program utilizes peer education.
results in a comprehensive, in-depth narrative report
The FHI QA/QI for Program Implementation Initiative was launched to support FHI’s goal of implementing high-quality programs, and in response to the need to operate within more dynamic and decentralized structures. The QA/QI Initiative incorporates the use of strategies, frameworks, minimum standards and SOPs, checklists, proxy indicators and monitoring processes covering the major program areas, including all technical activities as well as program management and administrative functions.
snapshot: msi field
Quick supervision checklist (Oscar)
Obstetrics: monthly reporting template (Monica)
Internal audit checklists developed by country programmes?
Readiness refers to infrastructure, contraceptive supplies, buildings, etc.
quality of care covers intricacies of client-provider contact, including interactions with health care personnel not directly related to service provision.
improvements in quality are hypothesized to result in clients’ greater satisfaction and understanding, and in longer-term effects such as extended practice of contraception.
So, is QTA a way of measuring readiness and care? where for MSI 'interventions' mean clinical standards, training & capacity, and anything locally driven?
groupwork: using one of these frameworks, a combination thereof, or something totally different, what do you think the QTA is measuring now? and which aspect of quality should it be measuring to best reflect MDT's broader conceptual approach to quality?
what's our unit of analysis?
where is our duty of care?
community health workers?
marie stopes ladies?
what about organisational risks?
are there any particular risks we want to bear in mind when thinking about clinical audits? for instance, non-core services...
which aspects of quality service delivery should our indicators look at?
we need a plan
all checklists for all channels in all languages complete, along with corresponding process documents
what are the technical inputs required? and when can they be completed?
who are the stakeholders? and at what point should they be consulted?
from the medical advisers...
groupwork: break into 2 groups and discuss the first two questions, with the aim of reaching consensus. and given our answers to the first two questions, what makes sense re: our unit of analysis? and the way we do site selections?
where could the standards for these aspects of quality service delivery come from?
what are the datasources or assessment methods we could use to assess these standards?
Centres and outreach
pre-brainstorm recap: what does the current tool look at?
what about the new CCS&PT checklist?
We need some modification of the indicators.
CG #19, incident reporting: if there have been no incidents, there should be no score; it should be N/O. The same applies to #18.
FP #111: we need to separate implant insertion from removal. Some SDs can be good at one and not the other.
We need to remove some indicators that only let teams pick up extra scores. For example, emergency management: oxygen indicators 642 to 645. If a team has functioning oxygen, they score 2; but in our experience one team had everything except that the oxygen tubing did not fit well, so we struggled to manage an emergency. This also reiterates the point that emergency equipment should be checked and signed off daily.
Can we change the heading ‘MEM’ to ‘Emergency Preparedness’?
- in addition to looking at the tool specifically, we should also look at different ways of assessing clinical quality objectively and improving it (the new Clinical Supervision Policy being drafted by Rob is one example)
I know of a program that has been in a peer QTA arrangement for more than a year but has recently had large numbers of SSI incidents due to poor IP practices. Poor IP was also observed at trainings hosted by the program.
I have a few ideas for brainstorming, listed below, to make the entire process more supportive:
1. There are projects rendering clinical services that focus on specific services only, mainly in outreach (OR). For example, LAM in Nepal renders long-term reversible methods, including the IUD and implant, in remote OR using public health posts; the IUD Roving team under MSB provides only IUDs in OR using public health facilities; and Pehli Kiran (PK) under MSS (Pakistan) provides temporary methods, including IUDs, at the household level.
The Medical Emergency Management section may be reviewed to come up with a limited list of logistics supplies for these low-resource settings.
2. Under Clinical Governance, there are a few indicators on clients’ satisfaction. However, there is nothing to address providers’ satisfaction.
3. Some elements of Supportive Supervision may be included in the checklist to assess whether internal monitoring/supervision is in line with supportive supervision (good relationships, two-way communication, teamwork, etc.). This will ultimately help sustain improvements in quality and foster a sense of empowerment, ownership, and feeling valued by the organization. Rob may help us in this regard.
5. We may also explore introducing a section assessing providers’ knowledge, as is done in the COPE monitoring system practised by USAID-supported organizations.
Obstetrical and Gynaecological Society of Bangladesh (OGSB) + Marie Stopes Clinics Society (MSCS): ASSESSMENT CHECKLIST FOR Technical support from OGSB on Comprehensive Reproductive Health Care Centre run by MSCS
1x per channel, + 1 group for new channels (i.e. MS Ladies, CHW, social marketing)
what are the datasources within the organisation??? or what sources of information does MDT have available to it (either actually or potentially) to understand the state of quality in a programme? or what new kinds of information could MDT easily look at?
how do these two compare??
master checklist complete, execution of various channels/languages and corresponding documents (TORs, etc.)
QTA: when, where, by whom?
Site selection requirements & process
the bigger picture
for instance: the table below compares the services observed through Centres and Outreach QTA in Ethiopia to projected total service delivery targets in 1x country programme:
note: service delivery information broken down by social franchises is available, however the MDT doesn't consistently capture the number of clients/services observed as part of the SF QTA process... yet...
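A minimal sketch of that kind of comparison, using made-up figures rather than the Ethiopia data, computes the share of projected services that QTA visits actually observe per channel:

```python
# Hypothetical figures, for illustration only (not the Ethiopia data):
# services observed during QTA visits vs projected annual service delivery
# targets, per channel, expressed as observation coverage.
observed = {"Centres": 420, "Outreach": 310}       # services seen during QTA
targets = {"Centres": 52_000, "Outreach": 88_000}  # projected annual services

# Coverage as a percentage of the projected total, per channel.
coverage = {
    channel: 100 * observed[channel] / targets[channel]
    for channel in observed
}

for channel, pct in coverage.items():
    print(f"{channel}: QTA observed {pct:.2f}% of projected services")
```

Even illustrative numbers make the point of the slide: QTA observations cover a very small slice of total service delivery, and (as the note says) the social franchise channel cannot yet be included because observed client/service counts are not consistently captured.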
What's the context?
What & how are MSI peers measuring to assure quality?
What do we want to achieve through QTA?
If we have a consensus about what we want to measure, how can we do that?
Can we agree a top-line project plan, identify key inputs and stakeholders?
a question to bear in mind throughout the session is how to manage scale and a growing organisation
early morning warm-up
TED Talk: Joel Selanikio:
The surprising seeds of a big-data revolution in healthcare
Carrying out this QTA required 10 days of a consultant's time (from Ghana), and cost approximately £3,500.