QTA BEYOND 2015
analyse a programme’s readiness to deliver high quality services
analyse the quality of services delivered to clients
SCOPE & SUBSTANCE
EVIDENCE TO ACTION
Summary of clinical audit results to be presented in a tailored way to each IPD region on a routine basis -- perhaps a regional dashboard of sorts to correspond with quarterly review periods
THE CLINICAL AUDIT BEYOND 2015
COUNTRY PROGRAMME TYPOLOGIES
Based on a programme’s typology there will be a corresponding set of benchmarks and assessment methods: one minimum standard across the organisation, but with different 'stretch' goals that certain typologies would have to achieve beyond that minimum. For instance:
- In a rapidly expanding country programme, clinical governance could carry more weight in the readiness score.
- Assessment methodologies could vary: we might insist on observing procedures in person for certain programme typologies, whereas for others we would be happy with a desk review. Thinking like this will help us manage scale, and is in line with the accreditation work that has already commenced.
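To make the idea concrete, the typology-specific benchmarks described above could be captured as a simple lookup: one organisation-wide minimum standard, plus stretch goals and assessment methods that vary by typology. This is only an illustrative sketch; the typology names, numeric standards, and weights below are hypothetical assumptions, not agreed figures.

```python
# Illustrative sketch: one organisation-wide minimum standard, with
# typology-specific 'stretch' goals and assessment methods.
# All names, numbers, and weights are hypothetical.

MINIMUM_STANDARD = 70  # applies to every programme, regardless of typology

TYPOLOGY_BENCHMARKS = {
    "rapidly expanding": {
        "stretch_goal": 85,
        "assessment_method": "direct observation of procedures",
        "clinical_governance_weight": 0.4,  # weighted more heavily in readiness score
    },
    "stable, low risk": {
        "stretch_goal": 75,
        "assessment_method": "desk review",
        "clinical_governance_weight": 0.2,
    },
}

def target_for(typology: str) -> int:
    """Every programme must meet the minimum; stretch goals vary by typology."""
    stretch = TYPOLOGY_BENCHMARKS.get(typology, {}).get("stretch_goal", MINIMUM_STANDARD)
    return max(MINIMUM_STANDARD, stretch)
```

A structure like this would let the minimum standard stay uniform while the stretch goal and method (observation vs. desk review) follow the typology.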
Create an algorithm, run once per year, to systematically categorise programmes into different risk categories or 'quality typologies'. It would take into consideration things like:
- Warning or suspension notices received in the prior 12 months
- Previous year’s QTA score
- Service delivery growth rate for the previous 3 years
- Projected service delivery growth rate for 2015
- Provision of identified high-risk services?
Note: MDT typologies will likely be different to IPD typologies
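The annual categorisation step above could be sketched as a simple scoring function over the criteria listed. Everything here is a hypothetical illustration: the field names, point values, thresholds, and category labels are assumptions and would need to be set by the actual algorithm design.

```python
# Hypothetical sketch of the once-per-year typology algorithm.
# Field names, weights, thresholds, and labels are illustrative only.

def assign_quality_typology(programme: dict) -> str:
    """Assign a programme to a risk category ('quality typology')."""
    score = 0

    # Warning or suspension notices received in the prior 12 months
    score += 2 * programme.get("notices_last_12_months", 0)

    # Previous year's QTA score (a lower QTA score implies higher risk)
    if programme.get("previous_qta_score", 100) < 70:
        score += 2

    # Service delivery growth rate for the previous 3 years
    if programme.get("growth_rate_3yr", 0.0) > 0.25:
        score += 1

    # Projected service delivery growth rate for 2015
    if programme.get("projected_growth_2015", 0.0) > 0.25:
        score += 1

    # Provision of identified high-risk services
    if programme.get("provides_high_risk_services", False):
        score += 2

    if score >= 5:
        return "high risk"
    if score >= 2:
        return "medium risk"
    return "low risk"
```

Running this once per year over all programmes would yield the typology assignments that the benchmarks and assessment methods then key off; separate scoring rules could be maintained for MDT vs. IPD typologies.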
1) An aggregate score for evaluating readiness to provide quality services, broken down into:
Facilities and supplies management
2) A dashboard/score to analyse quality of services actually delivered, broken down into:
In 2015 this will have to remain an Excel checklist populated once per year. As we move away from assessing exclusively through observation and look at evaluation through other methods, one thing to consider is how to make this continuous.
In 2015 this will have to take the form of a 'beefed-up' QAF dashboard generated once per year as part of the audit process -- which would include annual data. Programmes could continue to run the 'slimmed down' dashboard from CLIC whenever they want.
Meanwhile, a side benefit: by integrating something that looks like the QAF dashboard into the QTA, we will support the institutionalisation of the dashboard.
quality of services received
readiness score: 2015
can include a set of indicators looking at readiness to distribute MA drugs -- this is an easy 'win', as standards have already been developed
can include new incident management and supportive supervision indicators, and potentially also new product quality indicators to be derived from the product quality module launched in 2014
can adapt technical competency indicators to reflect the competency frameworks Rob has already developed for core services
can add cervical cancer screening and treatment indicators
There are no source materials for many of the elements listed under clinical governance. The first step here in 2015 is to re-write the clinical governance policy. Key to this new policy will be setting standards for new data sources, which will pave the way for shifting our methodology away from exclusively resource-intensive observation and towards 'quality accounts' that rely more heavily on desk review and data analysis.
there are no source materials for community health workers from which to develop indicators for 2015
What about MS Ladies?
quality received score: 2015
much of the data required to analyse these indicators already exists, but is only collected annually -- especially in non-CLIC programmes
aspects of readiness:
outcomes associated with quality of care received:
CLIC, partnership stats, country programme MRPs, exit interviews, mystery client surveys, retail audit assessment toolkit, sales invoices, approved products lists, internal audit scores, incident databases, supportive supervision databases, call centre data (for some programmes)
clinical governance policy 2009
infection prevention policy 2009
medical emergency management policy 2009
family planning policy 2014
counseling policy 2014?
incident management policy 2014
supportive supervision policy 2014
safe abortion care policy 2013
cervical cancer screening and treatment policy 2014
tubal ligation policy 2009
vasectomy policy 2009
vocal local policy?
Product quality module 2014
STI policy 2014
Accountability framework (establishing MAT)?
data sources and means of verification
standard quality reports as mandated by a new clinical governance policy, standard internal audit scores as mandated by a new clinical governance policy, audio-recordings (research currently being conducted on this by IBIS), auditing of tele-medicine (as being trialled in Zambia), remote vignettes (USAID research), OTHERS!
where are we at now?
Ultimately, we want some way of ensuring that data related to clinical quality is considered as part of business planning
Ultimately, as the process becomes more continuous than discrete, we want to see country programmes using the data routinely rather than annually -- this could comprise part of the QAF install/use/benefit work that Barbara outlined
so... how much of this can we do in 2015?
indeed! this score could mirror the QAF dashboard in format -- but would only be generated once per year, since it would be supplemented with additional indicators drawing on data sources that are only available annually
hmm... this sounds like the QAF...
As the QTA moves towards becoming a continuous rather than discrete process, due to changes in the availability of data (which will be developed as part of an updated clinical governance policy -- more to come on this), the QAF dashboard and the QTA 'quality of services received' score would merge into a single tool.
readiness to provide services
We can't yet generate this more robust dashboard/score continuously (although programmes with CLIC could continue to generate the 'slimmed down' version of the QAF dashboard and the CLIC operational efficiency dashboard).