The Healthy States, Progressive India (HSPI) report seeks to assess Indian states in terms of their performance in health over time. I have noted below a few quick points from a Karnataka-centred perspective. The note is written mainly to generate discussion and debate on strengthening Karnataka’s health. This assessment is by no means a comprehensive summary of the report. The full report is available on the Niti Aayog website here: http://niti.gov.in/content/healthy-states-progressive-india-report-ranks-states-and-union-territories#
In comparing Indian states, are we not comparing apples with oranges?
Indian states are very diverse in their socio-economic, cultural and political contexts and in how their health services and systems are organised. Health being a state subject, states have also organised the financing, resourcing, management and monitoring of their health systems differently. However, by carefully choosing a mix of inputs and processes into the health system, performance-based indicators (like coverage rates and health service output indicators) and indicators of health outcomes (such as mortality rates for infants, children and mothers), the HSPI report builds a composite index on which states can be ranked. If done periodically, this can serve as a good tool to assess the progress of one state with respect to another. That said, a careful understanding of what goes into these indices will help better interpret why one state is doing better than another.
Figure 1: Categorisation of states (p.12)
The report stratifies states by size to allow fairer comparison: larger states are compared with each other, smaller states with each other, and Union territories form a third comparison group.
The index used is a weighted composite index based on indicators in three domains. Within each domain, indicators have been assigned a weight based on the experts’ assessment of their relative importance. Each indicator is standardised on a scale of 0 to 100 and compared between the baseline year (2014-15) and the reference year (2015-16). The weighting gives higher weightage to health outcomes and key inputs/processes (health systems/service delivery) indicators, with relatively lower weightage to the data integrity of the health monitoring information systems. Interestingly, out-of-pocket expenditure (OOPE) data has not been included in the analysis behind the rankings, so states that may have decreased OOPE for their populations will not see this reflected in better indices. The three broad domains from which indicators have been chosen for the composite index are:
- Health Outcomes
- Governance and Information
- Key Inputs/Processes
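The mechanics of such a weighted composite index can be sketched roughly as follows: each indicator is rescaled onto 0-100 and the rescaled scores are combined with domain weights. The indicator names, raw values and weights below are hypothetical illustrations for a single state, NOT the report’s actual figures or its exact methodology.

```python
# Illustrative sketch of a weighted composite index of the kind the HSPI
# report describes. All numbers and weights here are hypothetical.

def scale_0_100(value, worst, best):
    """Standardise a raw indicator onto a 0-100 scale (100 = best).
    For 'lower is better' indicators, pass worst > best."""
    return 100.0 * (value - worst) / (best - worst)

def composite_score(indicators):
    """Weighted average of (standardised score, weight) pairs."""
    total_weight = sum(w for _, w in indicators)
    return sum(score * w for score, w in indicators) / total_weight

# Hypothetical state: U5MR (lower is better, so worst/best are swapped),
# full-immunisation coverage, and an HMIS data-integrity measure.
u5mr = scale_0_100(31, worst=70, best=10)          # deaths per 1000 live births
immunisation = scale_0_100(62, worst=0, best=100)  # percent coverage
data_integrity = scale_0_100(20, worst=30, best=0) # % HMIS-survey divergence

# Hypothetical weights: outcomes weighted highest, data integrity lowest.
score = composite_score([(u5mr, 0.5), (immunisation, 0.35), (data_integrity, 0.15)])
print(round(score, 2))  # 59.2
```

Because every indicator sits on the same 0-100 scale before weighting, states can be compared on the final score even though the underlying indicators have very different units.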
Figure 2: Page 3 of report showing absolute & incremental scores of larger states
Where is the data from? Do the states agree with this data?
The report makes use of data from 2014-15 and 2015-16 supplied by the states and later validated by experts. In many instances, the data used is from nationwide surveys such as the SRS and NFHS. The authors have also, in some cases, triangulated the same indicators from the HMIS and from independent surveys, and used the comparison to judge how reliable the HMIS systems are.
How did they do the rankings? How can states be compared?
The various indicators in the three domains listed above were converted into a single number on a scale of 0 to 100 (using methods well described in the report). So, each state had a score for each of the chosen indicators. Scores were calculated for both years, the base year (2014-15) and the reference year (2015-16), to allow an assessment of (1) whether the state improved with respect to a given indicator, and (2) by how much. Based on this, an overall composite score on a scale of 0 to 100 was assigned to each state for both years, allowing the states to be ranked as they are all organised on a common scale.
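The two notions of ranking that follow from this can be sketched with a small example. Karnataka’s and Gujarat’s 2015-16 scores and their year-on-year drops below are taken from the report; the figures for Kerala and Jharkhand are made-up placeholders chosen only to reproduce the pattern discussed later (Kerala high but dipping, Jharkhand low but gaining over 6 points).

```python
# Sketch of absolute vs incremental ranking. Kerala's and Jharkhand's
# scores are hypothetical placeholders; Karnataka's and Gujarat's
# 2015-16 scores match the report.

base = {"Kerala": 80.0, "Karnataka": 59.73, "Gujarat": 63.28, "Jharkhand": 47.0}
reference = {"Kerala": 77.0, "Karnataka": 58.70, "Gujarat": 61.99, "Jharkhand": 53.3}

# Absolute rank: order states by reference-year score (highest first).
absolute = sorted(reference, key=reference.get, reverse=True)

# Incremental rank: order states by change from base to reference year.
incremental = sorted(reference, key=lambda s: reference[s] - base[s], reverse=True)

print(absolute)     # Kerala still tops the absolute ranking...
print(incremental)  # ...but sits last on the incremental one.
```

This is the mechanism behind the seemingly odd results discussed below: a high-scoring state that dips slightly lands at the bottom of the incremental ranking while staying near the top of the absolute one.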
What does the ranking capture? How has Karnataka performed?
There are two types of rankings: absolute rankings for a given year (base year 2014-15 or reference year 2015-16), and incremental rankings based on how much states have moved between the two years.
So, positive movement appears easier/larger in states that had a rather low baseline, because indicators become harder to improve the higher they already are (cf. Kerala and, to some extent, Karnataka). See for eg. Jharkhand’s change in the composite index of over 6 points, whereas Karnataka/Gujarat moved just 1-2 points, because Jharkhand is coming up from a lower absolute score. In some instances (fig E1, p.3), states whose score has fallen in the one year have a poor “incremental rank”, although, because their total score is quite high, their absolute ranking has not altered much. See for eg. Kerala, whose “incremental rank” is one of the poorest in India due to a rather minor drop in its performance in the one year; since most worse-off states were moving forwards (rather than dropping), they have ranked much higher than Kerala in “incremental rank”. In terms of absolute rank for the year, however, Kerala is still high up (because of its high overall score). Karnataka, for eg., fell by 1.03 points, whereas Gujarat fell by more, 1.29 points, but the overall score of Gujarat is still higher (61.99 for Gujarat in 2015-16 vs 58.70 for Karnataka). This is also why Jharkhand, Jammu & Kashmir and Uttar Pradesh, despite having low overall scores, are the top “gainers” in terms of improvement over the one-year period.
So, among larger states “Uttarakhand, Himachal Pradesh, Karnataka, Gujarat, Haryana and Kerala have shown a decline in performance from base year to reference year, despite some of them being among the top ten in overall performance”.
Figure 3 Karnataka ranking falling slightly between base and reference year
What explains Karnataka’s ranking and downward trend?
A few explanations based on a rapid reading of the document.
Stagnation of health outcomes
Karnataka registered little movement in terms of health outcomes and key inputs/processes (see fig 4.4)
Karnataka gives the impression of having stagnated in improving health outcomes, and this will need special focus. See for eg. the stagnation in U5MR for Karnataka at 31 under-5 deaths per 1000 live births, whereas Gujarat fell from 41 to 39 deaths per 1000 live births and Telangana from 37 to 34. Or see for eg. the proportion of low birth weight infants, which fell marginally in Gujarat and many other states but saw an alarming rise in Karnataka (from 10.8 to 11.5).
Figure 4 Increase in proportion of newborns with low birth weight
Another example of relative stagnation is seen with TB case notification. Compare, for eg., Kerala, itself an already well-performing state, which increased TB case notification from 87 to 139 per 100,000 population, or Gujarat, which went from 170 to 193.
Things left out of the index: NCDs and OOPE: Non-communicable diseases like diabetes, hypertension and cancer, and mental health achievements (if any), are not captured by this data. If Karnataka has invested in these, the rankings will NOT show them.
The data on out-of-pocket expenditure was not used to assess changes between base and reference year, as data was available only for the reference year. So, states which have invested in bringing down OOPE for their populations will not show up as improving in their incremental rankings.
Poor data integrity: One of the striking losses of ranking for Karnataka is in the data integrity measure under Governance and Information. The report captures the difference between what routine HMIS data shows (say for e.g. institutional deliveries) and what independent population-based surveys show, the latter assumed to be the more robust estimates. Karnataka’s estimates of institutional delivery, for e.g., show a 20% difference between HMIS and population surveys, whereas Gujarat/Assam/Kerala etc. show around 1% (see fig 4.36 on p.50). Karnataka needs to invest heavily in improving the quality of its existing HMIS data, including robust data collection from private sector providers. Closing the gap between what the routine HMIS reports and what independent survey data (in this case NFHS 4) capture for our health outcomes will also improve our score (as well as benefit the health system). Interestingly, on most indicators which are computed entirely from state reports (like many maternal healthcare service provision indicators such as ID, 3-ANC etc.), Karnataka does quite well. So, one may need to carefully assess if this is good information or if our M&E systems need an overhaul to ensure we get good data. See also, for eg., completeness of the IDSP form, which has improved for Karnataka this year (fig 4.6 on p.66).
Figure 5 Comparison of ID percentage per HMIS routine data and NFHS 4
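The data-integrity measure can be thought of as a simple divergence check between the routine HMIS figure and the survey estimate for the same indicator. The percentages below are hypothetical, chosen only to mirror the roughly 20% vs 1% gaps mentioned above.

```python
# Sketch of the kind of data-integrity check the report applies: the
# percentage-point divergence between routine HMIS reporting and an
# independent survey estimate (here, NFHS 4). Values are hypothetical.

def integrity_gap(hmis_pct, survey_pct):
    """Absolute percentage-point gap between HMIS and survey estimates
    of the same indicator; smaller gaps suggest more reliable HMIS data."""
    return abs(hmis_pct - survey_pct)

# Hypothetical institutional-delivery percentages for two states.
karnataka = integrity_gap(hmis_pct=94.0, survey_pct=74.0)
gujarat = integrity_gap(hmis_pct=89.0, survey_pct=88.0)

print(karnataka, gujarat)  # 20.0 1.0
```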
Governance in terms of stability of state and district posts: Again, on governance, stability of key stewardship posts in the health department has been the poorest in the country for Karnataka. In both years (2014-15 and 2015-16), Karnataka has had an average occupancy of less than 7 months over a 3-year period for key posts, whereas West Bengal, Rajasthan, Gujarat and Punjab have had >20 months average occupancy over a three-year period! This is staggering and harms effective health systems governance in Karnataka. Similarly, district-level positions are also more stable in many states like Gujarat, Chhattisgarh, MP and Maharashtra when compared to Karnataka (fig 4.4.3, p.54).
Staggering inefficiencies and delays in the transfer of central funds to implementing agencies have also contributed to a poor score, and the situation appears to be worsening. See for eg. the average number of days that central funds spend at the state level before being transferred to the implementing agency: over 2 months in the case of Karnataka and increasing, whereas it is as low as 2 weeks or less in the case of Gujarat, Uttarakhand, UP or Bihar (Fig. 4.64, p.68).
Falling sex ratio at birth: There is an alarming fall in the sex ratio at birth in most of the larger states. Except for Bihar, Punjab and UP, all states including Karnataka have registered a fall in the sex ratio at birth. Karnataka data (from SRS) shows a decrease from 950 girls per 1000 boys to 939 over the one year compared. In some states, like Gujarat, the fall is rather steep (907 to 854!).
With respect to institutional deliveries, Gujarat has performed extremely well, going from 91 to 98 percent IDs in the year compared, whereas Karnataka has again remained rather stagnant at around 77. However, given that our low birth weights have increased and our under-5 mortality has stagnated, despite a somewhat reasonable institutional delivery percentage, this points sharply towards a huge quality issue in the provision of antenatal, delivery, postnatal and early infancy/childhood care.
Relatively better performance on ART treatment for PLHIV: Among larger states, Karnataka has done well on the HIV/AIDS treatment coverage front. Proportion of people living with HIV/AIDS who are on ART treatment has gone up from 84 to 89 percent, whereas many states in the west and north including Gujarat are still at a coverage of about 50% or less.
High OOP for delivery in public facilities: In terms of OOP for delivery in public facilities, Karnataka still has the sixth highest! This is rather worrisome and indicates heavy outside prescriptions for medicines and laboratory tests, or informal payments. In many states, including TN and Gujarat, OOP for delivery in public facilities is almost half of ours.
High vacancies of health workers across cadres: Vacancies in ANMs and staff nurses have fallen in Karnataka, but doctor and specialist vacancies remain stagnant.
And what should Karnataka do?
Firstly, one must remember what the ranking is and what it is NOT. The ranking is certainly useful for finding gaps and plugging them. What the ranking does NOT do is identify structural problems, either in the health system or in the socio-economic setting of the state. The HSPI report does not go into functions that the state ought to be performing, say for eg. private sector regulation, or improving the complement of services at the primary health care level (including mental health and NCD care, for eg.). Secondly, the rankings are state-level aggregates, so they are completely oblivious to intra-state regional inequalities (north-south in Karnataka, for eg.) or inequities prevailing within district- and taluka-level sub-populations (an inequitable lack of access along caste, socio-economic status and gender lines).
That said, the ranking is an opportunity to do course correction and improve the quality of public services, as well as to identify the need for regulation of the private sector and to engage better with it.
- Quality of care for mothers and children and strengthening of PHCs and Anganwadis: While coverage of ANC and institutional deliveries has indeed improved, as seen above, reducing low birth weight and improving under-5 survival depend heavily on well-staffed and well-resourced PHCs with competent, trained and motivated staff. All measures that increase staff availability, training and capacity, as well as measures that improve motivation (staff pay on time, proper payment of incentives etc.), are important.
- Strengthening district capacity for M&E and HMIS data quality: Clearly, there is a lot of work to be done on improving the data quality of the HMIS. The HSPI report has shown sizeable gaps between performance indicators generated through routine HMIS and those showing up in wider population-based surveys. Improving the capacity of district health management teams to carefully do such assessments locally and implement course corrections is also urgently needed.
- Private sector data: There is an urgent need for better regulation and coordination with the private sector for improving care for TB as well as various other conditions where they are involved. For eg. data on Dengue, TB and other infectious diseases are under-reported in public data. How can we ensure good quality care for populations irrespective of where they are getting the care?
- Good governance and state-level leadership: Meaningful social interventions to address son preference, in addition to better enforcement of laws against sex-selective abortion
- Improving vacancy position for doctors and specialists in public hospitals
- Improving fund flow from centre to districts and implementing agencies