Coding causes of death to statistical categories

Conduct regular coding quality assessments

Poor coding practices detract from the usability of cause of death (COD) data and waste resources. To ensure good-quality coding, the work of coders should be evaluated systematically and regularly to identify and correct any errors or misunderstandings. This process need not be onerous – for example, a sample of coded certificates could be re-coded by different coders and the degree of agreement assessed. If the evaluation reveals a specific problem, an expert coder can deliver targeted training based on the types of error detected.
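As a simple illustration, agreement between the original coder and an independent re-coder could be quantified along the lines of the Python sketch below; the function name, the example ICD-10 codes and the use of raw percent agreement are illustrative assumptions, not part of any prescribed audit method.

def percent_agreement(original_codes, recoded_codes):
    # Share of certificates where the original and re-coded underlying cause match
    matches = sum(1 for a, b in zip(original_codes, recoded_codes) if a == b)
    return 100.0 * matches / len(original_codes)

# Hypothetical underlying-cause codes for five re-coded certificates
original = ["I21.9", "C34.9", "E11.9", "J18.9", "I64"]
recoded  = ["I21.9", "C34.9", "E14.9", "J18.9", "I64"]
print(f"Agreement: {percent_agreement(original, recoded):.1f}%")   # prints "Agreement: 80.0%"

In practice, agreement is often examined separately by ICD chapter or level of code detail, as the country examples below illustrate.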

Coding audit example: Sri Lanka

In 2002, 1067 medical records from hospitals in the Colombo district of Sri Lanka were assessed for coding quality using the Australian Coding Benchmark Audit (ACBA). The original records were re-coded, and overall agreement was found to be only 57.6%.

The figure below shows the auditing procedure used in ACBA. Even though the 2002 study was conducted using morbidity data, the same principles apply to mortality data. ACBA was developed by the National Centre for Classification in Health in Australia in 1998, and recommends the use of a minimum of 40 records or 5% of the total number of records when assessing coding quality. Although ACBA software is no longer available for purchase, the philosophy behind it could usefully be adopted when assessing the quality of mortality coding.

Auditing procedure used by ACBA 

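The ACBA sampling rule mentioned above (a minimum of 40 records, or 5% of the total number of records) can be sketched in code as follows; reading the rule as "whichever is larger" is our interpretation, and the function name is illustrative only.

def audit_sample_size(total_records, fraction=0.05, minimum=40):
    # Assumed reading of the ACBA rule: audit at least 40 records,
    # or 5% of the total if that is larger
    return min(total_records, max(minimum, round(total_records * fraction)))

print(audit_sample_size(500))     # 40  (5% would be only 25 records)
print(audit_sample_size(10000))   # 500 (5% of 10,000 records)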

Coding audit example: Netherlands

In the Netherlands, a ‘double-coding’ study of the reliability of underlying COD coding was conducted using death certificates from May 2005. These were then re-coded manually in 2007 by four coders. Reliability was measured by calculating the degree of agreement between coders (inter-coder agreement) and the consistency of each individual coder over time (intra-coder agreement). 
Evaluation of 10,833 death certificates showed that inter-coder agreement on the underlying cause of death (UCOD) was 78%. The mean intra-coder agreement across all four coders was 89%. The degree of agreement was associated with the specificity of the ICD-10 code (whether at chapter, three-digit or four-digit level), the age of the deceased, the number of coders, and the number of diseases reported on the death certificate. The study also found that the reliability of COD statistics was high (>90%) for major causes of death – mainly cancers and acute myocardial infarction – but low (70%) for chronic diseases such as diabetes and renal insufficiency.

It was concluded that statistical offices should provide coders with additional rules for coding causes of death associated with a low level of reliability. 


Read more

Harteloh P et al (2010). The reliability of cause-of-death coding in The Netherlands. European Journal of Epidemiology.

Coding audit example: Taiwan

In Taiwan, a systematic sample of 5621 death certificates (5% of all death certificates issued in 1994) was reviewed to assess the quality of coding according to ICD-9. The UCOD selected by the reviewer for each death certificate was compared with that selected by the original coder. In both cases, the UCOD was selected according to ACME decision tables.
Overall agreement rates between the reviewers and the coders were 80.9% for three-digit ICD categories and 83.9% for two-digit ICD categories. Agreement was good for malignant neoplasms and for injuries and poisoning, but poorer for nephrotic diseases, hypertension-related diseases and cerebral infarction.
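As an illustration of how agreement can be compared at different levels of code detail, ICD codes can be truncated before comparison, roughly as sketched below; the helper name and the sample ICD-9 codes are assumptions made for the example and are not drawn from the Taiwan study.

def agreement_at_digits(reviewer_codes, coder_codes, digits):
    # Percent agreement after truncating each ICD-9 code to the given number of digits
    pairs = list(zip(reviewer_codes, coder_codes))
    matches = sum(1 for a, b in pairs if a[:digits] == b[:digits])
    return 100.0 * matches / len(pairs)

# Hypothetical reviewer vs original-coder codes for five deaths
reviewer = ["410", "162", "431", "250", "585"]
coder    = ["410", "162", "436", "250", "571"]
print(agreement_at_digits(reviewer, coder, 3))   # 60.0 (exact three-digit agreement)
print(agreement_at_digits(reviewer, coder, 2))   # 80.0 (broader two-digit agreement)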

It was concluded that the national administration should undertake routine internal reviews to check on the quality of underlying COD coding practices. 


Read more

Lu TH et al (2000).  Accuracy of cause-of-death coding in Taiwan: types of miscoding and effects on mortality statistics. International Journal of Epidemiology.

