External quality assessment performance in ten countries: an IFCC global laboratory quality project
Renze Bais
, Jean-Marc Giannoli
and Egon P. Amann
Abstract
Objectives
This study aimed to assess the validity of external quality assessment (EQA) laboratory results across various cultural and environmental contexts and to identify potential improvement areas.
Methods
The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) Task Force on Global Laboratory Quality (TF-GLQ) conducted a 2-year study (2022 and 2023) in which EQA materials, related software and online training were provided by a commercial vendor to 100 laboratories in ten IFCC member society countries. The TF-GLQ analysed the results monthly to determine the number of submissions per country, the number of tests per laboratory, acceptability rates and random failures, and to identify which analytes performed poorly.
Results
The EQA material was dispatched quarterly. In some countries, customs did not release the material in a timely manner, so laboratories received it late and could not submit results. We report here the results for the second year of the survey. The number of examinations varied between laboratories, ranging from seven to 84 analytes. Of the ten countries surveyed, six averaged greater than 90 % acceptable results over the whole 12-month cycle, one had unacceptable results for two of the nine months in which it returned results, and the other four were considered not to perform to an acceptable standard.
Conclusions
All 100 participating laboratories indicated satisfaction with the EQA survey and related services, including on-site training and report handling. However, specimen-receipt issues suggest there would be benefits in dispatching materials for a full 12-month cycle. Significant discrepancies in EQA performance indicate that four countries require long-term assistance, training and guidance. To ensure reliable patient results, promoting EQA in certain countries is essential to achieve the required level of quality.
Introduction
As part of total quality management (TQM), external quality assessment (EQA) is a vital component for verifying the validity of laboratory results and thereby maintaining confidence in patient results.
Internal quality control (IQC) is the primary tool used by medical laboratories to assure the analytical reliability of their results. While IQC is critical for daily operation to evaluate changes in instrumentation and reagent lots, provides a regular check on the continuing precision of the analytical system, and identifies probable analytical errors, it is of limited utility in evaluating result accuracy and often does not encompass the entire testing process.
EQA is a key component of a laboratory’s TQM system, required by many national regulations, for accreditation according to ISO EN 15189 [1], and recommended by the Clinical and Laboratory Standards Institute (CLSI) [2, 3]. When EQA schemes are not available for a specific analyte or methodology, alternative assessment procedures are necessary, and these vary significantly between laboratories and regions [3, 4]. EQA provides an external assessment of laboratory performance over time. Depending on the design, EQA can provide an assessment of trueness, intra- and interlaboratory variation and linearity, identify differences between methods, and monitor continued efforts at harmonization [5, 6, 7, 8].
The IFCC inaugurated a Task Force on Global Laboratory Quality (TF-GLQ) in 2021. The initial scope and mandate were to provide tools for the evaluation of laboratories’ performance in developing countries using IQC and EQA programs, in order to improve medical diagnostics and disease treatment for patients. The intention was to identify vendors capable of providing quality control materials and associated software for ongoing monitoring, and to develop training programs for participating laboratories.
For the EQA program, a call for applications to participate was issued to all IFCC member societies, from which ten countries (Georgia, Serbia, Bosnia/Herzegovina, Bolivia, Colombia, Peru, Indonesia, Sri Lanka, Malawi and Zambia) were selected for the pilot program. Working in collaboration with the respective member societies, five candidate participating laboratories were identified within each country. A repertoire of common clinical biochemistry laboratory tests was agreed and suggested for the EQA program.
A bidding process was undertaken in 2021 to identify a commercial vendor for the EQA program, and OneWorld Accuracy (1WA), Vancouver, Canada, was subsequently selected to provide EQA materials, related software and online training to participating laboratories. All costs were borne by the IFCC. The TF-GLQ reviewed all reports and discussed the results and any performance issues on a monthly basis.
The EQA program started in 2022 with 50 labs (i.e., five per country). Based on the positive results of a survey conducted at the end of 2022 (participants indicated satisfaction with the EQA program), participating countries were encouraged to extend the program from five to ten laboratories at the beginning of 2023. The results reported here are for the program run in 2023.
Materials and methods
Participating laboratories
A call to participate in this EQA pilot programme was sent to IFCC member societies in countries deemed low- and middle-income. Not all countries expressed interest in participating (for example, no country from the Middle East applied). The final decision on which countries would participate was made by internal voting of TF-GLQ members, relying on their individual assessments and on some members’ experience with previous IQC and EQA training in low- and middle-income countries. (The countries taking part in this project and the laboratory selection have been described previously [9, 10].) Ten countries were finally selected: three from Europe (Georgia, Serbia, Bosnia/Herzegovina), three from Latin America (Bolivia, Colombia, Peru) and two each from Africa (Malawi, Zambia) and Asia (Indonesia, Sri Lanka). Participating laboratories were also recruited from Greece to serve as a control population from a high-income country. Although the task force worked in collaboration with the respective member societies and suggested selecting laboratories in need of quality improvement, the participating laboratories were ultimately selected by the member societies. Each of the ten countries appointed a country coordinator who acted as the liaison between the TF-GLQ and the individual laboratories, and ten laboratories per country, selected by the national society, took part. All laboratories were enrolled into the 1WA online system under clearly described classifications according to analytical principle, method, instrument, reagent and calibrator. The different units reported for the various methods were taken into account by the data analysis system.
EQA material
The EQA material consisted of commercial lyophilized, human-based sera containing 80 analytes, spiked to provide a range of analyte concentrations covering the analytical range. It was shipped to each laboratory at three-month intervals with three samples per shipment, four shipments in total per cycle. Participants were provided with specific instructions on reconstituting the samples with deionized water before use. The samples were shipped at ambient temperature but were stored at +2–8 °C at the country locations before use. The manufacturer stated a sample stability after reconstitution of 8 days at +2–8 °C. One sample was analysed each month (EQA challenge) and results were entered into the 1WA system online. The TF-GLQ suggested a panel of 15 tests considered important for routine analyses. However, not all laboratories measured all 15 analytes; participants were expected to measure only analytes that were already part of their routine service.
Analytical method used
Methods were classified according to analytical principle, method, instrument, reagent and calibrator. The different units reported were also taken into account during the data analysis. A list of methods is provided in Table 1.
List of instruments used by two or more laboratories.
Biochemistry | |
---|---|
Instrument | Number of laboratory users |
Roche cobas/Integra | 24 |
Beckman AU (Olympus)/DxC | 20 |
Ortho Vitros | 16 |
Mindray BS | 15 |
Abbott Alinity c/ci | 10 |
Awareness Stat Fax | 9 |
BioSystems A/BA/BTS | 9 |
Siemens (Dade) Dimension/Atellica | 9 |
Abbott Architect | 7 |
AI Osmometer 3320 | 6 |
Erba Lachema Lyte/XL | 5 |
Diestro 103 A P | 4 |
IL GEM/Lyte | 4 |
ABX Pentra C200 | 3 |
AVL 9180/9181 | 3 |
EXIAS e|1 Analyzer | 3 |
Human Humalyzer/Humastar | 3 |
Wiener CM | 3 |
Zybio EXC200 | 3 |
DiaSys Response 910 | 2 |
i-Smart 300 | 2 |
Paramedical PKL PPC125 | 2 |
Radiometer ABL | 2 |
Total | 186 a |
Immunoassay | |
---|---|
Instrument | Number of laboratory users |
Roche cobas 6000 | 36 |
Siemens (Dade) Advia/Atellica/BN/Dimension/Immulite | 23 |
Abbott Architect | 20 |
Ortho Vitros | 17 |
Abbott Alinity | 12 |
BioMerieux Mini Vidas | 11 |
Shenzhen Microprofit/Maglumi | 10 |
Mindray BS-240 | 9 |
Awareness Stat Fax | 6 |
Finecare FIA Meter | 6 |
Snibe Maglumi 800 | 6 |
Tosoh AIA | 5 |
Beckman (Olympus) AU | 4 |
Beckman Access 2 | 4 |
BioSystems | 4 |
Monobind Lumax/Neo Lumax | 4 |
IDS iSYS | 3 |
AI Osmometer 3320 | 2 |
Biozek DCR 1000 | 2 |
Boditech AFIAS 10 | 2 |
DiaSorin LIAISON | 2 |
Human Reader | 2 |
Lifotronic eCL 8000 | 2 |
Manual determination | 2 |
Meril Merilyzer AutoQuant 200i | 2 |
NanoEntek FREND System | 2 |
SFRI IRE 96 | 2 |
Total | 200 |
aIncludes a further 22 instruments used by one laboratory only. The table lists instrument manufacturers used in two or more of the 100 laboratories included in the survey. See text for more details.
Assigned target values
Commercial lyophilised material was used for which no claims regarding commutability were provided by the manufacturer. Commutability was not assessed and was considered unlikely. For this reason, peer group assessment was used rather than comparison against a reference method or overall mean. The method peer group mean of participants’ results, after exclusion of outliers, was therefore used as the target value for each sample.
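The target-value calculation described above can be sketched as follows. The exact outlier-exclusion rule used by the EQA provider is not stated in this paper; the sketch below assumes simple iterative trimming at a fixed number of peer standard deviations (the function name and the `z_cut` parameter are our illustrative choices):

```python
import statistics

def peer_group_target(results, z_cut=3.0):
    """Peer-group target value: iteratively drop results lying more
    than z_cut peer SDs from the peer mean, then return the mean of
    the remaining values. The actual exclusion rule used by the EQA
    provider is not published; +/-3 SD trimming is assumed here."""
    values = list(results)
    while len(values) > 2:
        mean = statistics.mean(values)
        sd = statistics.stdev(values)
        if sd == 0:
            break
        kept = [v for v in values if abs(v - mean) <= z_cut * sd]
        if len(kept) == len(values):  # converged: no further outliers
            break
        values = kept
    return statistics.mean(values)
```

With small peer groups a tighter cut (e.g. `z_cut=2.0`) is needed for a single gross outlier to be excluded, since the outlier itself inflates the peer SD.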
Analytical performance specification (APS)
Performance criteria based on the standard deviation of the returned results (SDp), i.e. “state of the art”, were used. Performance was expressed as a z-score, i.e. (laboratory result − target value)/SDp. As these APS were statistical rather than clinically derived, APS based on CLIA criteria, which are widely used throughout North America and the Middle East, were subsequently applied retrospectively [11]. These were selected in preference to the Milan Model biological variation criteria used within Europe and Australia, as the latter were considered unrealistic for low- and middle-income countries. The rationale for the selection of APS was “passable” (everyone should theoretically pass; there may still be benefit from better performance; regulatory requirements or governmental regulations may favour this philosophy) [12].
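The two acceptability rules described, a statistical z-score against SDp and a retrospective CLIA-style percentage limit around the target, can be expressed as follows (the function names and the example limit are ours, not the program’s; the limits actually applied are listed in Table 3):

```python
def z_score(result, target, sd_peer):
    """z-score as defined in the text: (laboratory result - target value) / SDp."""
    return (result - target) / sd_peer

def acceptable_clia(result, target, limit_pct):
    """Acceptability against a CLIA-style percentage limit around the
    target value, e.g. +/-10 % for creatinine. Illustrative helper;
    consult the current CLIA tables for the actual limits."""
    return abs(result - target) <= target * limit_pct / 100.0
```

For example, a creatinine result of 5.4 against a target of 5.0 passes a ±10 % limit, while 5.6 does not.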
List of examinations (analytes)
A repertoire of common clinical biochemistry laboratory tests was agreed by the TF-GLQ and suggested for the EQA program, although this was extended to include all 80 analytes provided in the programme. These comprised 33 biochemistry analytes, 20 endocrine analytes, 12 enzymes and three tumour markers. Clinical chemistry analytes were selected for the pilot because TF-GLQ members visiting Africa had reported that very few EQA programmes were used within this discipline, with the majority of EQA programmes focused on infectious diseases.
Category of EQA design
The evaluation capability of the EQA design was classified as Category 6 as defined by Miller et al. [13]. In this category, the material is not deemed commutable, and no assessment of trueness or metrological traceability is available. The analytical quality assessed was therefore total error (including bias and imprecision, as applied to single results) as defined by Jones et al. [12].
Performance reports
Reports were returned to the laboratories by email as PDF files. The reports included the method mean, standard deviation and coefficient of variation, the z-score, and the acceptable range calculated as the peer group mean ± 2 standard deviations.
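A sketch of the statistics behind such a report, assuming the acceptable range is the peer mean ± 2 SD as stated above (the function and field names are our illustrative choices, not the vendor’s):

```python
import statistics

def report_summary(peer_results, lab_result):
    """Summary statistics as listed in the text: peer method mean,
    SD, CV%, the laboratory's z-score, and the acceptable range
    (peer group mean +/- 2 SD). A sketch, not the vendor's code."""
    mean = statistics.mean(peer_results)
    sd = statistics.stdev(peer_results)
    return {
        "mean": mean,
        "sd": sd,
        "cv_pct": 100.0 * sd / mean,
        "z": (lab_result - mean) / sd,
        "low": mean - 2 * sd,
        "high": mean + 2 * sd,
    }
```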
All members of the TF-GLQ had access to the results from all 100 laboratories and the country coordinator had access to the results for the 10 laboratories in their country.
Results
Figure 1 shows the percentage of survey participation for the ten individual countries for 2023. Although not shown in Figure 1, Countries A and F did not return any results for October, November and December, and Country J for November and December 2023. Country G had zero participation for April, July and October 2023. Country J averaged only 2.4 laboratories per month submitting results.

The rate of participation of countries in the survey for 2023, showing overall participation for each of the ten countries in %.
As part of enrolling in the 1WA results database, the laboratories were required to list the manufacturer and instrument model for each analyte. Table 1 lists the instrument manufacturers used in two or more of the 100 laboratories included in the survey. In total, at least 50 manufacturers and over 100 instrument models were listed as being used by these laboratories, including some used for one analyte only.
For some of the countries there were considerable differences in the number of results reported by different laboratories (Figure 2). As can be seen, the maximum number of results reported by an individual laboratory in Country A is greater than the number of analytes in the material provided, because one laboratory returned results for the same analytes assayed on multiple instruments. Similarly, one laboratory from Country F measured the same analytes on two different instruments. There was large variability in the instruments used by laboratories within any one country.

Results returned per month by laboratories in the ten survey countries. Some laboratories returned more results than the number of analytes in the material provided because the same analytes were assayed on multiple instruments within one laboratory. The minimum and maximum numbers of analytes per laboratory per country are shown. Each laboratory could choose from the 80 analytes in the lyophilised serum provided according to its routine repertoire; where the graph shows a number greater than 80 (i.e., Country A), the same analytes were assayed on multiple instruments.
The percentage of acceptable results submitted was calculated using the z-score (Figure 3).

Acceptability per month for individual countries. Acceptability was determined from the reports provided by 1WA, based on CLIA criteria or ±2 SD where no CLIA limit was available. Of the ten countries surveyed, six averaged greater than 90 % acceptable results over the whole 12-month cycle, one had unacceptable results for two of the nine months in which it returned results, and the other four (shown in red) were considered not to perform to an acceptable standard. See text for additional interpretation.
Of the ten countries surveyed, six averaged greater than 90 % acceptable results over the whole 12-month cycle, one had unacceptable results for two of the nine months in which it returned results, and the other four were considered not to perform to an acceptable standard (shown in red). For instance, the October result of 4 % comes from two laboratories in Country J: one had all 15 results unacceptable and the other had 10 of 11 results unacceptable. Countries A and F had no submissions for October, November and December, Country J for November and December, and Country G for April, July and October.
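The monthly acceptability percentages plotted in Figure 3 reduce to a pass rate over all results submitted by a country in a given month. A minimal sketch, using the ±2 SD z-score rule from the Methods (the helper is ours, for illustration only):

```python
def monthly_acceptability(z_scores, cutoff=2.0):
    """Percent of submitted results with |z| <= cutoff, i.e. the
    +/-2 SD rule applied when no CLIA limit exists for an analyte."""
    accepted = sum(1 for z in z_scores if abs(z) <= cutoff)
    return 100.0 * accepted / len(z_scores)
```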
The results in Table 2 show the variation in acceptability of results for individual laboratories within each country.
The percentage acceptability of submitted results for each individual laboratory in the ten survey countries. The number of analytes per submission can be greater than 80 (the number of analytes in the material) where a laboratory used more than one instrument for the same analytes. Acceptability was determined from the reports provided by 1WA, based on CLIA criteria or ±2 SD where no CLIA limit was available. Results are shown for laboratories that returned four or more monthly submissions.
Country A | Country B | Country C | Country D | Country E | Country F | Country G | Country H | Country I | Country J | |||||||||||||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | |
Laboratory 1 | 6 | 38–42 | 96.7 | 12 | 15–16 | 93.1 | 12 | 21–40 | 89.1 | 12 | 10–16 | 93.9 | 12 | 15 | 99.4 | 9 | 18 | 91.1 | 9 | 20–22 | 78.0 | 11 | 13–18 | 35.1 | 11 | 10–12 | 45.9 | <4 | – | – |
Laboratory 2 | 9 | 39–42 | 99.2 | 12 | 21–23 | 98.9 | 11 | 43–45 | 93.5 | 10 | 24–31 | 99.1 | 11 | 19–20 | 96.8 | 9 | 32–40 | 75.2 | 9 | 15–38 | 64.2 | 12 | 27–31 | 88.6 | 10 | 13–15 | 38.4 | 6 | 16–30 | 73.7 |
Laboratory 3 | 9 | 15–16 | 99.3 | 12 | 15–16 | 92.4 | 12 | 16–18 | 99.0 | 12 | 15–17 | 96.3 | 11 | 15–16 | 98.9 | 5 | 21–26 | 94.8 | 9 | 10–15 | 78.7 | 9 | 12–22 | 62.3 | 11 | 17–19 | 99.0 | 5 | 17–24 | 55.0 |
Laboratory 4 | 5 | 20–24 | 95.2 | 12 | 26–29 | 86.9 | 12 | 19–26 | 89.5 | 12 | 34–44 | 96.8 | 10 | 32–35 | 96.1 | 8 | 32–41 | 96.0 | 8 | 18–22 | 89.3 | 12 | 25–26 | 91.8 | 11 | 16–29 | 97.4 | 8 | 14–20 | 86.3 |
Laboratory 5 | 9 | 17 | 93.3 | 12 | 16–17 | 89.1 | 12 | 51–56 | 98.7 | 12 | 17–18 | 91.7 | 10 | 26–28 | 99.2 | 8 | 24–44 | 96.6 | 8 | 15–17 | 71.4 | 11 | 43–50 | 66.3 | 12 | 45–47 | 94.6 | <4 | – | – |
Laboratory 6 | 9 | 18 | 97.4 | 12 | 45–50 | 97.4 | 12 | 51–71 | 97.2 | 11 | 14–17 | 100.0 | 12 | 21–22 | 99.2 | 9 | 29–35 | 92.0 | 9 | 47–49 | 48.8 | 12 | 23–31 | 70.0 | 12 | 15 | 33.4 | <4 | – | – |
Laboratory 7 | 9 | 52–61 | 97.4 | 12 | 30–37 | 96.9 | 12 | 25–28 | 98.9 | 12 | 37–44 | 99.4 | 11 | 20–21 | 97.4 | <4 | – | – | 9 | 45,633 | 44.4 | 11 | 7–8 | 6.2 | 12 | 49–52 | 90.0 | <4 | – | – |
Laboratory 8 | 9 | 84–87 | 98.4 | 12 | 22–23 | 98.8 | 12 | 24–26 | 99.0 | 12 | 37–53 | 95.5 | 8 | 42,278 | 83.4 | 8 | 30–34 | 90.1 | 9 | 17–21 | 85.2 | 12 | 9–34 | 66.3 | 12 | 15–17 | 79.3 | <4 | – | – |
Laboratory 9 | 9 | 53–63 | 93.2 | 12 | 19–22 | 96.8 | 12 | 38–40 | 99.0 | 11 | 35–53 | 90.1 | 10 | 18–21 | 95.3 | 9 | 25–37 | 96.8 | 9 | 13–21 | 88.7 | 12 | 29 | 91.4 | 12 | 14–16 | 70.7 | 6 | 13–14 | 85.7 |
Laboratory 10 | 8 | 20–21 | 98.1 | 12 | 39–45 | 97.8 | 11 | 47–48 | 91.8 | 11 | 13–20 | 89.0 | 11 | 3–14 | 95.5 | 3 | 17–29 | 93.0 | 9 | 45,540 | 92.8 | 9 | 11–12 | 45.7 | 12 | 13–14 | 63.6 | <4 | – | – |
As can be seen, the number of analyte results submitted by some laboratories varied significantly, e.g., laboratory 5 in Country F and laboratory 8 in Country H. On further investigation, this was found to be due to a lack of reagents, which is a major issue in many developing countries.
The performance for individual analytes measured in the survey is shown in Table 3.
The performance for analytes measured by at least 20 laboratories. Acceptability is based on the new CLIA criteria [11], except for lactate and transferrin, for which ±2 SD of the peer group was used.
Analyte | Number of participants | Total number of results submitted | Acceptable results compared with overall average, % (compared with peer group average) | Acceptable limits (CLIA 2024 criteria) |
---|---|---|---|---|
Albumin | 91 | 898 | 80 | ±8 % |
Alkaline phosphatase | 93 | 889 | 79 | ±20 % |
ALT | 97 | 975 | 84 | ±15 % |
Amylase | 60 | 488 | 76 | ±20 % |
AST | 96 | 960 | 86 | ±15 % |
Bilirubin total | 96 | 900 | 81 | ±20 % |
Calcium | 71 | 666 | 75 | ±5 % |
Chloride | 67 | 656 | 82 | ±5 % |
Cholesterol HDL | 69 | 583 | 83 | ±20 % |
Cholesterol LDL | 43 | 322 | 83 | ±20 % |
Cholesterol total | 98 | 971 | 88 | ±10 % |
CK | 40 | 325 | 88 | ±20 % |
Creatinine | 100 | 982 | 75 | ±10 % |
Ferritin | 44 | 387 | 29 (65) | ±20 % |
GGT | 64 | 590 | 77 | ±15 % |
Glucose | 98 | 985 | 82 | ±8 % |
Iron | 45 | 417 | 91 | ±15 % |
Lactate | 21 | 110 | 90 | ±20 % |
LDH | 59 | 514 | 64 (69) | ±15 % |
Lipase | 29 | 274 | 70 | ±20 % |
Magnesium | 48 | 446 | 86 | ±15 % |
Phosphates | 69 | 624 | 77 | ±10 % |
Potassium | 75 | 767 | 86 | ±7 % |
Proteins total | 93 | 903 | 81 | ±8 % |
PSA | 48 | 441 | 80 | ±20 % |
Sodium | 73 | 747 | 84 | ±4 % |
T4 free | 52 | 505 | 45 (85) | ±15 % |
Transferrin | 20 | 128 | 95 | ±20 % |
Triglycerides | 97 | 960 | 86 | ±15 % |
TSH | 56 | 568 | 84 (96) | ±20 % |
Urea | 96 | 960 | 72 | ±9 % |
Uric acid | 79 | 715 | 76 | ±10 % |
The table shows, for each analyte, the number of participants, the total number of results submitted, and the percentage of satisfactory results according to the CLIA 2024 acceptance-limit criteria. Results are evaluated in relation to the overall average and, where shown in brackets, to the peer group average.
There are 80 analytes available in the material provided, and the breakdown of those assayed by the participants is as follows:
80 % of the participants assayed albumin, alkaline phosphatase, ALT, AST, bilirubin total and conjugated, cholesterol, creatinine, glucose, potassium, total protein, sodium, urea, triglycerides and uric acid.
20 % of the participant laboratories assayed amylase, CO2 total, DHEA sulphate, lithium, and transferrin.
Less than 20 % of the participant laboratories assayed 11 deoxy-cortisol, 17 OH progesterone, aldosterone, androstenedione, cholinesterase, fructosamine, homocysteine, Lp(a), osmolality, prostatic acid phosphatase, total acid phosphatase, SHBG, T3 uptake, TIBC, acetaminophen, carbamazepine, digoxin, gentamicin, phenobarbital, phenytoin, salicylate, theophylline, tobramycin, valproic acid and vancomycin.
As expected, analytes requiring immunoassay were generally the least frequently measured.
Discussion
The scope of this study was to evaluate the EQA performance of laboratories in developing countries in order to improve medical diagnostics and disease treatment for patients. EQA materials and associated software for ongoing monitoring were provided free of charge to participating countries. We wanted to understand daily performance routines and their issues and the acceptability rates of applied assays (pass or fail), and to develop training programs for participating laboratories.
The simple EQA design applied fulfilled our aim of identifying poorly performing laboratories in low- and middle-income countries that required further help and support. We did not need commutable, metrologically traceable material with APS based on clinical or biological goals to achieve this aim. We needed inexpensive, stable material from a provider with a global EQA presence, allowing simple peer review using pragmatic APS criteria.
As an ISO 17043-accredited provider, 1WA assesses the homogeneity and stability of the EQA samples provided in its program. It may assess commutability of the samples based on CLSI EP14, “Evaluation of commutability of processed samples”, but that is not a requirement, and there is no mention in 1WA EQA collateral suggesting that commutable samples are used. Note that an assessment of commutability would be done on the broader data set for each sample, not just that associated with the laboratories in the IFCC pilot program.
As the standardization/harmonization of test methods was not the intent of the EQA Pilot program, commutability was not assessed by the TF-GLQ nor by 1WA on the processed samples. We know that EQA samples supporting large (global) programs will likely have matrix effects (biases) associated with their supplemented additives and processing. That is why consensus-based peer group assessments (instrument/reagent), where possible, are valuable in determining acceptable performance of participants.
There are several different APSs that could be used (some country specific), but the program selected already included the recently revised CLIA criteria. These were selected in preference to the Milan Model biological variation used within Europe and Australia as the latter was considered unrealistic for low- and middle-income countries.
This EQA performance study covered the years 2022 and 2023. In preparation for this study, ten IFCC member countries were recruited in 2021, and liaison managers in these countries were identified and assigned with the help of the respective local clinical chemistry organisations. A suitable commercial vendor of EQA materials was identified (1WA) and a proposal for a panel of 15 commonly and widely used biochemistry and immunoassay tests for routine analysis was developed. However, not every laboratory measured all of these; they were expected to measure only analytes already part of their routine service.
In 2022, 50 laboratories in ten developing countries participated. Based on the initial acceptance of the program (as evidenced by a survey conducted at the end of 2022 in which participating countries expressed interest to expand the program), the program was extended to 100 laboratories in the same ten participating countries. The results reported here are for the program run in 2023.
We observed distinct differences in performance: six of the ten participating countries showed a consistently solid acceptance rate of >90 % over the observed period. In contrast, four countries showed poor acceptance rates of between 60 and 90 %, which is below generally accepted EQA standards and requires improvement. The TF-GLQ intends to focus its activities on continuing aid through online and on-site training, particularly for these countries. The six countries with solid acceptance rates of >90 % do not need further monitoring and do not require additional assistance.
Some countries encountered shipping and customs clearance problems, thus being unable to submit results in a timely fashion. As lyophilized EQA materials are stable, it was concluded that for many developing countries in which logistics and customs clearance may be an issue, a single annual shipment is the most appropriate way to distribute the EQA material.
During the survey period and from subsequent discussions with participating laboratories, a number of common issues were identified as affecting performance:
Material awaiting customs clearance caused delays in laboratories receiving samples and missed submission deadlines, as was the case for laboratories in Country G, which were unable to submit results for April, July and October.
The availability of reagents and other materials, owing to a lack of funds, is a significant problem for laboratories in some countries. In addition, governments in developing countries often prioritise resources elsewhere, and biochemistry may not be seen as being as urgent as the measurement of infectious diseases; in Africa, for instance, there is a major drive to control the spread of malaria, HIV and tuberculosis.
In some countries, there was a lack of understanding of the importance of quality control and external quality assessment, and of the associated procedures for ensuring acceptable laboratory performance, leading to a lack of commitment by the laboratory. This was reflected in the results for Country J, where only four of the ten enrolled laboratories submitted at least four sets of results. In this scenario, previous experience has shown that laboratories will participate if the program is overseen by the relevant authority, such as the Ministry of Health.
In October and November of 2023, scientific support teams from the TF-GLQ, in collaboration with the national societies, organised workshops and laboratory visits and conducted country-specific training in Malawi, Zambia, Peru and Colombia. Workshops included training and case studies in IQC along with interpretation of EQA reports using real case studies and performance data from those countries. These visits were also used to identify issues and barriers to improving the quality of laboratory diagnostics in these and similar countries, and the measures needed to address these concerns.
This study shows that not all countries perform to globally accepted EQA performance standards (note: all participating countries in this study are IFCC member countries) and that additional future efforts should be devoted to improving this unsatisfactory situation, for the good of improved medical diagnostics and disease treatment of patients.
Acknowledgments
The IFCC TF-GLQ would like to thank Silvia Cardinale for her help and expertise in handling country communication and data survey collation and Anna Carobene for critical reading of and useful comments on the manuscript.
Research ethics: Not applicable, since no human subjects were involved in this study. Data analyzed was anonymized.
Informed consent: Not applicable.
Author contributions: The authors have accepted responsibility for the entire content of this manuscript and approved its submission.
Competing interests: JL is an employee of Abbott Labs, RB owns rbaisconsulting, KC is an employee of Bio-Rad Laboratories, JMG is an employee of Technical Direction Biogroup, EA owns Amann Consulting. All other authors state no conflict of interest.
Research funding: None declared.
Data availability: Not applicable.
References
1. Schneider F, Maurer C, Friedberg RC. International Organization for Standardization (ISO) 15189. Ann Lab Med 2017;37:365–70. https://doi.org/10.3343/alm.2017.37.5.365.
2. Sciacovelli L, Secchiero S, Padoan A, Plebani M. External quality assessment programs in the context of ISO 15189 accreditation. Clin Chem Lab Med 2018;56:1644–54. https://doi.org/10.1515/cclm-2017-1179.
3. CLSI. Darcy T, editor. QMS24 Using proficiency testing and alternative assessment to improve medical laboratory quality, 3rd ed. Wayne, PA: Clinical and Laboratory Standards Institute; 2016.
4. Payne DA, Russomando G, Linder MW, Baluchova K, Ashavaid T, Steimer W, et al. External quality assessment (EQA) and alternative assessment procedures (AAPs) in molecular diagnostics: findings of an international survey. Clin Chem Lab Med 2021;59:301–6. https://doi.org/10.1515/cclm-2020-0101.
5. Plebani M. Harmonization in laboratory medicine: the complete picture. Clin Chem Lab Med 2013;51:741–51. https://doi.org/10.1515/cclm-2013-0075.
6. Ceriotti F. The role of external quality assessment schemes in monitoring and improving the standardization process. Clin Chim Acta 2014;432:77–81. https://doi.org/10.1016/j.cca.2013.12.032.
7. Ceriotti F, Cobbaert C. Harmonization of external quality assessment schemes and their role – clinical chemistry and beyond. Clin Chem Lab Med 2018;56:1587–90. https://doi.org/10.1515/cclm-2018-0265.
8. James D, Ames D, Lopez B, Still R, Simpson W, Twomey P. External quality assessment: best practice. J Clin Pathol 2014;67:651. https://doi.org/10.1136/jclinpath-2013-201621.
9. Blasutig I, Wheeler S, Bais R, Kumar Dabla P, Lin J, Perret-Liaudet A, et al. External quality assessment practices in medical laboratories: an IFCC global survey of member societies. Clin Chem Lab Med 2023;61:1404–10. https://doi.org/10.1515/cclm-2023-0057.
10. Wheeler S, Blasutig I, Kumar Dabla P, Giannoli J, Vassault A, Lin J, et al. Quality standards and internal quality control practices in medical laboratories: an IFCC global survey of member societies. Clin Chem Lab Med 2023;61:2094–101. https://doi.org/10.1515/cclm-2023-0492.
11. 2024 CLIA acceptance limits for proficiency testing, quoted from the Federal Register Vol 87, No 131, 2022. https://www.westgard.com/2024-clia-requirements.htm.
12. Jones G, Albarede S, Kesseler D, MacKenzie F, Mammen J, Pedersen M, et al. Analytical performance specifications for external quality assessment – definitions and descriptions. Clin Chem Lab Med 2017;55:949–55. https://doi.org/10.1515/cclm-2017-0151.
13. Miller W, Jones G, Horowitz G, Weykamp C. Proficiency testing/external quality assessment: current challenges and future directions. Clin Chem 2011;57:12. https://doi.org/10.1373/clinchem.2011.168641.
© 2024 Walter de Gruyter GmbH, Berlin/Boston
Introduction
As part of total quality management (TQM), external quality assessment (EQA) is a vital component in verifying the validity of laboratory results and thereby maintaining confidence in patient results.
Internal quality control (IQC) is the primary tool used by medical laboratories to assure the analytical reliability of results. While IQC is critical for daily operation, evaluating changes in instrumentation and reagent lots, providing a regular check on the continuing precision of the analytical system, and identifying probable analytical errors, it is of limited utility in evaluating result accuracy and often does not encompass the entire testing process.
EQA is a key component of a laboratory’s TQM system, required by many national regulations and for accreditation according to ISO EN 15189 [1], and recommended by the Clinical and Laboratory Standards Institute (CLSI) [2, 3]. When EQA schemes are not available for a specific analyte or methodology, alternative assessment procedures are necessary; these vary significantly between laboratories and regions [3, 4]. EQA provides an external assessment of laboratory performance over time. Depending on its design, EQA can provide an assessment of trueness, intra- and interlaboratory variation and linearity, identify differences between methods, and monitor continuing efforts at harmonization [5], [6], [7], [8].
The IFCC inaugurated a Task Force on Global Lab Quality (TF-GLQ) in 2021. The initial scope and mandate were to provide tools for the evaluation of laboratories’ performance in developing countries using IQC and EQA programs in order to improve the medical diagnostics and disease treatment of patients. The intention was to identify vendors capable of providing quality control materials and associated software for on-going monitoring and to develop training programs for participating laboratories.
For the EQA program, a call for applications to participate was issued to all IFCC member societies, from which ten countries (Georgia, Serbia, Bosnia/Herzegovina, Bolivia, Colombia, Peru, Indonesia, Sri Lanka, Malawi and Zambia) were selected for the pilot program. Working in collaboration with the respective member societies, five candidate participating laboratories were identified within each country. A repertoire of common clinical biochemistry laboratory tests was agreed upon and suggested for the EQA program.
A bidding process was undertaken in 2021 to identify a commercial vendor for the EQA program; subsequently, OneWorld Accuracy (1WA), Vancouver, Canada, was selected to provide EQA materials, related software and online training to participating laboratories. All costs were borne by the IFCC. The TF-GLQ reviewed all reports and discussed the results and any performance issues on a monthly basis.
The EQA program started in 2022 with 50 labs (i.e., five per country). Based on the positive results of a survey conducted at the end of 2022 (participants indicated satisfaction with the EQA program), participating countries were encouraged to extend the program from five to ten laboratories at the beginning of 2023. The results reported here are for the program run in 2023.
Materials and methods
Participating laboratories
A call to participate in this EQA pilot programme was sent to IFCC member states deemed low- and middle-income countries. Not all countries expressed interest in participating (for example, no country from the Middle East applied). The final decision on which countries would participate was made by internal vote of the TF-GLQ members, who relied on their individual assessments and, for some members, on their experience with previous IQC and EQA training in low- and middle-income countries. (The countries taking part in this project and the laboratory selection process have been described previously [9, 10].) Ten countries were selected: three from Europe (Georgia, Serbia, Bosnia/Herzegovina), three from Latin America (Bolivia, Colombia, Peru) and two each from Africa (Malawi, Zambia) and Asia (Indonesia, Sri Lanka). Laboratories were also recruited from Greece to serve as a control population from a high-income country. Although the task force worked in collaboration with the respective member societies and suggested selecting laboratories in need of improving their quality, the participating laboratories were ultimately selected by the member societies. Each of the ten countries appointed a country coordinator who acted as the liaison between the TF-GLQ and the individual laboratories. Ten laboratories took part in each country, selected by the national society for that country. All laboratories were enrolled in the 1WA online system under clearly described classifications according to analytical principle, method, instrument, reagent and calibrator. The different units reported for the various methods were taken into account by the data analysis system.
EQA material
Commercial lyophilized, human-based sera containing 80 analytes, spiked to provide a range of analyte concentrations covering the analytical range, were used. EQA material was shipped to each laboratory at three-monthly intervals with three samples per shipment, giving four shipments per cycle. Participants were provided with specific instructions for reconstituting the samples with deionized water before use. The samples were shipped at ambient temperature and then stored at +2–8 °C at the country locations before use. The manufacturer stated a sample stability after reconstitution of 8 days at +2–8 °C. One sample was analysed each month (EQA challenge), and results were entered online into the 1WA system. The TF-GLQ suggested a panel of 15 tests considered important for routine analyses. However, not all laboratories measured all 15 analytes, and participants were expected to measure only those analytes already part of their routine service.
Analytical method used
Methods were classified according to analytical principle, method, instrument, reagent and calibrator. The different units reported were also taken into account during the data analysis. A list of methods is provided in Table 1.
List of instruments used by two or more laboratories.
Biochemistry | |
---|---|
Instrument | Number of laboratory users |
Roche cobas/Integra | 24 |
Beckman AU (Olympus)/DxC | 20 |
Ortho Vitros | 16 |
Mindray BS | 15 |
Abbott Alinity c/ci | 10 |
Awareness Stat Fax | 9 |
BioSystems A/BA/BTS | 9 |
Siemens (Dade) Dimension/Atellica | 9 |
Abbott Architect | 7 |
AI Osmometer 3320 | 6 |
Erba Lachema Lyte/XL | 5 |
Diestro 103 A P | 4 |
IL GEM/Lyte | 4 |
ABX Pentra C200 | 3 |
AVL 9180/9181 | 3 |
EXIAS e|1 Analyzer | 3 |
Human Humalyzer/Humastar | 3 |
Wiener CM | 3 |
Zybio EXC200 | 3 |
DiaSys Response 910 | 2 |
i-Smart 300 | 2 |
Paramedical PKL PPC125 | 2 |
Radiometer ABL | 2 |
Total | 186 a |
Immunoassay | |
---|---|
Instrument | Number of laboratory users |
Roche Cobas 6000 | 36 |
Siemens (Dade) Advia/Atellica/BN/Dimension/Immulite | 23 |
Abbott Architect | 20 |
Ortho Vitros | 17 |
Abbott Alinity | 12 |
BioMerieux Mini Vidas | 11 |
Shenzhen Microprofit/Maglumi | 10 |
Mindray BS-240 | 9 |
Awareness Stat Fax | 6 |
Finecare FIA Meter | 6 |
Snibe Maglumi 800 | 6 |
Tosoh AIA | 5 |
Beckman (Olympus) AU | 4 |
Beckman Access 2 | 4 |
BioSystems | 4 |
Monobind Lumax/Neo Lumax | 4 |
IDS iSYS | 3 |
AI Osmometer 3320 | 2 |
Biozek DCR 1000 | 2 |
Boditech AFIAS 10 | 2 |
DiaSorin LIAISON | 2 |
Human Reader | 2 |
Lifotronic eCL 8000 | 2 |
Manual determination | 2 |
Meril Merilyzer AutoQuant 200i | 2 |
NanoEntek FREND System | 2 |
SFRI IRE 96 | 2 |
Total | 200 |
-
a Includes a further 22 instruments used by one laboratory only. The Table lists the instrument manufacturers used in two or more of the 100 laboratories included in the survey. See text for more details.
Assigned target values
Commercial lyophilised material was used for which no claims regarding commutability were provided by the manufacturer. Commutability was not assessed and was considered unlikely. For this reason, peer group assessment was used rather than comparison against a reference method or overall mean. The method peer group mean of participants’ results, after exclusion of outliers, was therefore used as the target value for each sample.
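The target-setting step described above can be sketched as follows. This is an illustrative reconstruction, not 1WA’s actual algorithm; in particular, the 2-SD iterative exclusion threshold is an assumption made for the example.

```python
from statistics import mean, stdev

def peer_target(results, z_cut=2.0):
    """Illustrative peer-group target value: iteratively exclude results
    more than z_cut standard deviations from the group mean (assumed
    cut-off), then use the trimmed mean as the target."""
    kept = list(results)
    while len(kept) > 2:
        m, s = mean(kept), stdev(kept)
        if s == 0:
            break
        trimmed = [r for r in kept if abs(r - m) <= z_cut * s]
        if len(trimmed) == len(kept):  # no further outliers: converged
            break
        kept = trimmed
    return mean(kept), stdev(kept)

# One gross outlier (9.9 mmol/L) is excluded before the target is set
target, sd_peer = peer_target([5.0, 5.1, 4.9, 5.2, 5.0, 9.9])
```

With the outlier removed, the target settles on the trimmed mean (5.04) and the peer SD shrinks accordingly.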
Analytical performance specification (APS)
Performance criteria based on the standard deviation of the returned results (SDp), or “state of the art”, were used. Performance was expressed as a z-score, i.e. (laboratory result − target value)/SDp. As these APS were statistical rather than clinically derived, APS based on the CLIA criteria, which are widely used throughout North America and the Middle East, were subsequently applied retrospectively [11]. These were selected in preference to the Milan Model biological variation criteria used within Europe and Australia, as the latter were considered unrealistic for low- and middle-income countries. The rationale for the selection of APS was that they should be passable: everyone should theoretically pass, although there may still be benefit from better performance, a philosophy that regulatory or governmental requirements may favour [12].
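The two acceptability checks can be illustrated with a short sketch. The CLIA percentage limits used here are the values quoted in Table 3; the dictionary is a truncated illustration, not a complete limit set.

```python
def z_score(result, target, sd_peer):
    """z-score as defined above: (laboratory result - target value)/SDp."""
    return (result - target) / sd_peer

# Illustrative subset of the CLIA 2024 percentage limits (see Table 3)
CLIA_PCT_LIMIT = {"glucose": 8.0, "sodium": 4.0, "cholesterol total": 10.0}

def acceptable_clia(result, target, analyte):
    """A result passes if it lies within the CLIA percentage limit
    of the target value."""
    return abs(result - target) / target * 100 <= CLIA_PCT_LIMIT[analyte]
```

For example, a glucose result 6 % above its target passes the ±8 % limit, while a sodium result 8 % above its target fails the ±4 % limit.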
List of examinations (analytes)
A repertoire of common clinical biochemistry laboratory tests was agreed by the TF-GLQ and suggested for the EQA program, although this was extended to include all 80 analytes provided in the programme: 33 biochemistry analytes, 20 endocrine analytes, 12 enzymes and three tumour markers. Clinical chemistry analytes were selected for the pilot because TF-GLQ members visiting Africa had reported that very few EQA programmes were used within this discipline, with the majority of EQA programmes focused on infectious diseases.
Category of EQA design
The evaluation capability of the EQA design was classified as Category 6 as defined by Miller et al. [13]. In this category, the material is not deemed commutable, and no assessment of trueness or metrological traceability is available. The analytical quality being assessed was therefore total error (which includes bias and imprecision, as applied to single results) as defined by Jones et al. [12].
Performance reports
Reports were returned to the laboratories by email as PDF files. The reports included the method mean, standard deviation and coefficient of variation, the z-score, and the acceptable range calculated as the peer group mean ±2 standard deviations.
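A minimal sketch of the per-analyte summary statistics included in those reports (method mean, SD, CV and the ±2 SD acceptable range), computed here from a hypothetical set of peer results:

```python
from statistics import mean, stdev

def report_row(peer_results):
    """Summary statistics for one analyte, as included in the reports:
    method mean, SD, CV (%) and acceptable range (mean +/- 2 SD)."""
    m, s = mean(peer_results), stdev(peer_results)
    return {
        "mean": m,
        "sd": s,
        "cv_pct": 100 * s / m,
        "acceptable_range": (m - 2 * s, m + 2 * s),
    }

# Five hypothetical peer results for one analyte
row = report_row([98, 100, 102, 99, 101])
```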
All members of the TF-GLQ had access to the results from all 100 laboratories and the country coordinator had access to the results for the 10 laboratories in their country.
Results
Figure 1 shows the percentage of survey participation for the ten individual countries for 2023. Although not shown in Figure 1, Country A and Country F did not return any results for October, November and December and Country J for November and December 2023. Country G had zero participation for April, July and October 2023. Country J averaged only 2.4 laboratories per month submitting results.

The rate of participation of countries in the survey for 2023. The Figure shows overall participation for each of the ten countries in % for 2023.
As part of enrolling in the 1WA results database, the laboratories were required to list the manufacturer and instrument model for each analyte. Table 1 lists the instrument manufacturers used in two or more of the 100 laboratories included in the survey. In total, at least 50 manufacturers and over 100 instruments were listed as being used by these laboratories, including some used for one analyte only.
For some of the countries there were considerable differences in the number of results reported by different laboratories (Figure 2). As can be seen, the maximum number of results reported by an individual laboratory in Country A is greater than the number of analytes in the material provided, because one laboratory returned results for the same analyte assayed on multiple instruments. Similarly, one laboratory from Country F measured the same analytes on two different instruments. There was large variability in the instruments used by laboratories within any one country.

Results returned per month by laboratories in the 10 survey countries. Some laboratories returned more results than the number of analytes in the material provided because the same analyte was assayed on multiple instruments within one laboratory. Minimum and maximum numbers of analytes per laboratory per country are shown. Each laboratory could choose from the 80 analytes provided in the lyophilized serum according to its routine repertoire; where the graph shows a number greater than 80 (e.g., Country A), the same analytes were assayed on multiple instruments.
The percentage of acceptable results submitted was calculated using the z-score (Figure 3).

Acceptability per month for individual countries. Acceptability was determined from the reports provided by 1WA based on CLIA criteria, or ±2 SD where no CLIA’88 limit was available. Of the ten countries surveyed, six averaged greater than 90 % acceptable results over the whole 12-month cycle, one had unacceptable results for two of the nine months in which it returned results, and the other four were considered not to perform to an acceptable standard (shown in red). See text for additional interpretation.
Of the ten countries surveyed, six averaged greater than 90 % acceptable results over the whole 12-month cycle, one had unacceptable results for two of the nine months in which it returned results, and the other four were considered not to perform to an acceptable standard (shown in red). For instance, the October result of 4 % for Country J derives from two laboratories: one had all 15 results unacceptable and the other had 10 of 11 results unacceptable. Countries A and F had no submissions for October, November and December, Country J for November and December, and Country G for April, July and October.
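The monthly roll-up behind these percentages can be sketched as below. This is an illustrative reconstruction of the aggregation, using the Country J October figures quoted above (0 of 15 and 1 of 11 acceptable results).

```python
from collections import defaultdict

def monthly_acceptability(submissions):
    """Pool all submitted results per (country, month) and return the
    percentage of acceptable results, rounded to the nearest integer.
    submissions: iterable of (country, month, n_results, n_acceptable)."""
    totals = defaultdict(lambda: [0, 0])  # (country, month) -> [ok, total]
    for country, month, n_results, n_ok in submissions:
        totals[(country, month)][0] += n_ok
        totals[(country, month)][1] += n_results
    return {key: round(100 * ok / total) for key, (ok, total) in totals.items()}

# Country J, October: one lab with 0/15 acceptable, another with 1/11
rates = monthly_acceptability([("J", "Oct", 15, 0), ("J", "Oct", 11, 1)])
```

Pooling the two laboratories gives 1 acceptable result out of 26, i.e. the 4 % quoted in the text.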
The results in Table 2 show the variation in acceptability of results for individual laboratories within each country.
The percentage acceptability of submitted results for each individual laboratory in the 10 survey countries. The number of analytes per submission can be greater than 80 (the number of analytes in the material) because some laboratories used more than one instrument for the same analytes. Acceptability was determined from the reports provided by 1WA based on CLIA criteria, or ±2 SD where no CLIA limit was available. Results are shown only for laboratories that returned four or more monthly submissions.
Country A | Country B | Country C | Country D | Country E | Country F | Country G | Country H | Country I | Country J | |||||||||||||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | Number of submissions | Minimum-maximum analytes per submission | Acceptable, % | |
Laboratory 1 | 6 | 38–42 | 96.7 | 12 | 15–16 | 93.1 | 12 | 21–40 | 89.1 | 12 | 10–16 | 93.9 | 12 | 15 | 99.4 | 9 | 18 | 91.1 | 9 | 20–22 | 78.0 | 11 | 13–18 | 35.1 | 11 | 10–12 | 45.9 | <4 | – | – |
Laboratory 2 | 9 | 39–42 | 99.2 | 12 | 21–23 | 98.9 | 11 | 43–45 | 93.5 | 10 | 24–31 | 99.1 | 11 | 19–20 | 96.8 | 9 | 32–40 | 75.2 | 9 | 15–38 | 64.2 | 12 | 27–31 | 88.6 | 10 | 13–15 | 38.4 | 6 | 16–30 | 73.7 |
Laboratory 3 | 9 | 15–16 | 99.3 | 12 | 15–16 | 92.4 | 12 | 16–18 | 99.0 | 12 | 15–17 | 96.3 | 11 | 15–16 | 98.9 | 5 | 21–26 | 94.8 | 9 | 10–15 | 78.7 | 9 | 12–22 | 62.3 | 11 | 17–19 | 99.0 | 5 | 17–24 | 55.0 |
Laboratory 4 | 5 | 20–24 | 95.2 | 12 | 26–29 | 86.9 | 12 | 19–26 | 89.5 | 12 | 34–44 | 96.8 | 10 | 32–35 | 96.1 | 8 | 32–41 | 96.0 | 8 | 18–22 | 89.3 | 12 | 25–26 | 91.8 | 11 | 16–29 | 97.4 | 8 | 14–20 | 86.3 |
Laboratory 5 | 9 | 17 | 93.3 | 12 | 16–17 | 89.1 | 12 | 51–56 | 98.7 | 12 | 17–18 | 91.7 | 10 | 26–28 | 99.2 | 8 | 24–44 | 96.6 | 8 | 15–17 | 71.4 | 11 | 43–50 | 66.3 | 12 | 45–47 | 94.6 | <4 | – | – |
Laboratory 6 | 9 | 18 | 97.4 | 12 | 45–50 | 97.4 | 12 | 51–71 | 97.2 | 11 | 14–17 | 100.0 | 12 | 21–22 | 99.2 | 9 | 29–35 | 92.0 | 9 | 47–49 | 48.8 | 12 | 23–31 | 70.0 | 12 | 15 | 33.4 | <4 | – | – |
Laboratory 7 | 9 | 52–61 | 97.4 | 12 | 30–37 | 96.9 | 12 | 25–28 | 98.9 | 12 | 37–44 | 99.4 | 11 | 20–21 | 97.4 | <4 | – | – | 9 | 45,633 | 44.4 | 11 | 7–8 | 6.2 | 12 | 49–52 | 90.0 | <4 | – | – |
Laboratory 8 | 9 | 84–87 | 98.4 | 12 | 22–23 | 98.8 | 12 | 24–26 | 99.0 | 12 | 37–53 | 95.5 | 8 | 42,278 | 83.4 | 8 | 30–34 | 90.1 | 9 | 17–21 | 85.2 | 12 | 9–34 | 66.3 | 12 | 15–17 | 79.3 | <4 | – | – |
Laboratory 9 | 9 | 53–63 | 93.2 | 12 | 19–22 | 96.8 | 12 | 38–40 | 99.0 | 11 | 35–53 | 90.1 | 10 | 18–21 | 95.3 | 9 | 25–37 | 96.8 | 9 | 13–21 | 88.7 | 12 | 29 | 91.4 | 12 | 14–16 | 70.7 | 6 | 13–14 | 85.7 |
Laboratory 10 | 8 | 20–21 | 98.1 | 12 | 39–45 | 97.8 | 11 | 47–48 | 91.8 | 11 | 13–20 | 89.0 | 11 | 3–14 | 95.5 | 3 | 17–29 | 93.0 | 9 | 45,540 | 92.8 | 9 | 11–12 | 45.7 | 12 | 13–14 | 63.6 | <4 | – | – |
As can be seen, the number of analyte results submitted varied significantly for some laboratories, e.g., laboratory 5 in Country F and laboratory 8 in Country H. On further investigation, this was found to be due to a lack of reagents, which is a major issue in many developing countries.
The performance for individual analytes measured in the survey is shown in Table 3.
The performance for analytes measured by at least 20 laboratories. Acceptability is based on the new CLIA criteria [11], except for lactate and transferrin, for which ±2 SD of the peer group was used.
Analyte | Number of participants | Total number of results submitted | % of acceptable results compared with the overall average (peer group average in brackets) | Acceptable limits (CLIA 2024 criteria) |
---|---|---|---|---|
Albumin | 91 | 898 | 80 | ±8 % |
Alkaline phosphatase | 93 | 889 | 79 | ±20 % |
ALT | 97 | 975 | 84 | ±15 % |
Amylase | 60 | 488 | 76 | ±20 % |
AST | 96 | 960 | 86 | ±15 % |
Bilirubin total | 96 | 900 | 81 | ±20 % |
Calcium | 71 | 666 | 75 | ±5 % |
Chloride | 67 | 656 | 82 | ±5 % |
Cholesterol HDL | 69 | 583 | 83 | ±20 % |
Cholesterol LDL | 43 | 322 | 83 | ±20 % |
Cholesterol total | 98 | 971 | 88 | ±10 % |
CK | 40 | 325 | 88 | ±20 % |
Creatinine | 100 | 982 | 75 | ±10 % |
Ferritin | 44 | 387 | 29 (65) | ±20 % |
GGT | 64 | 590 | 77 | ±15 % |
Glucose | 98 | 985 | 82 | ±8 % |
Iron | 45 | 417 | 91 | ±15 % |
Lactate | 21 | 110 | 90 | ±20 % |
LDH | 59 | 514 | 64 (69) | ±15 % |
Lipase | 29 | 274 | 70 | ±20 % |
Magnesium | 48 | 446 | 86 | ±15 % |
Phosphates | 69 | 624 | 77 | ±10 % |
Potassium | 75 | 767 | 86 | ±7 % |
Proteins total | 93 | 903 | 81 | ±8 % |
PSA | 48 | 441 | 80 | ±20 % |
Sodium | 73 | 747 | 84 | ±4 % |
T4 free | 52 | 505 | 45 (85) | ±15 % |
Transferrin | 20 | 128 | 95 | ±20 % |
Triglycerides | 97 | 960 | 86 | ±15 % |
TSH | 56 | 568 | 84 (96) | ±20 % |
Urea | 96 | 960 | 72 | ±9 % |
Uric acid | 79 | 715 | 76 | ±10 % |
-
The Table shows the percentage of satisfactory results according to the CLIA 2024 acceptable-limits criteria, together with the number of participants and the total number of results submitted for each analyte. Results are evaluated relative to the overall average and, where shown in brackets, relative to the peer group average.
There are 80 available analytes in the material that was provided and the breakdown of these that were assayed by the participants is as follows:
80 % of the participants assayed albumin, alkaline phosphatase, ALT, AST, bilirubin total and conjugated, cholesterol, creatinine, glucose, potassium, total protein, sodium, urea, triglycerides and uric acid.
20 % of the participant laboratories assayed amylase, CO2 total, DHEA sulphate, lithium, and transferrin.
Less than 20 % of the participant laboratories assayed 11 deoxy-cortisol, 17 OH progesterone, aldosterone, androstenedione, cholinesterase, fructosamine, homocysteine, Lp(a), osmolality, prostatic acid phosphatase, total acid phosphatase, SHBG, T3 uptake, TIBC, acetaminophen, carbamazepine, digoxin, gentamicin, phenobarbital, phenytoin, salicylate, theophylline, tobramycin, valproic acid and vancomycin.
As expected, analytes requiring immunoassay were generally the least frequently measured.
Discussion
The scope of this study was to evaluate laboratories’ EQA performance in developing countries in order to improve medical diagnostics and disease treatment for patients. EQA materials and associated software for ongoing monitoring were provided free of charge to participating countries. We wanted to understand daily performance routines and their issues, assess the acceptability rates of applied assays (pass or fail), and develop training programs for participating laboratories.
The simple EQA design applied fulfilled our aim of identifying poorly performing laboratories in low- to middle-income countries that required further help and support. We did not need commutable, metrologically traceable material with APS based on clinical or biological goals to achieve this aim. We needed inexpensive, stable material from a provider with a global EQA presence, allowing simple peer review using pragmatic APS criteria.
As an ISO 17043 accredited provider, 1WA assesses the homogeneity and stability of the EQA samples provided in its program. It may assess commutability of the samples based on CLSI EP14 “Evaluation of commutability of processed samples”, but that is not a requirement, and there is no mention in 1WA’s EQA documentation suggesting that commutable samples are utilized. Note that any assessment of commutability would be done on the broader data set for each sample, not just that associated with the laboratories in the IFCC pilot program.
As the standardization/harmonization of test methods was not the intent of the EQA Pilot program, commutability was not assessed by the TF-GLQ nor by 1WA on the processed samples. We know that EQA samples supporting large (global) programs will likely have matrix effects (biases) associated with their supplemented additives and processing. That is why consensus-based peer group assessments (instrument/reagent), where possible, are valuable in determining acceptable performance of participants.
There are several different APSs that could be used (some country specific), but the program selected already included the recently revised CLIA criteria. These were selected in preference to the Milan Model biological variation used within Europe and Australia as the latter was considered unrealistic for low- and middle-income countries.
This EQA performance study covered the years 2022 and 2023. In preparation for this study, ten IFCC member countries were recruited in 2021, and liaison managers in these countries were identified and assigned with the help of the respective local clinical chemistry organisations. A suitable commercial vendor of EQA materials was identified (1WA) and a proposal for a panel of 15 commonly and widely used biochemistry and immunoassay tests for routine analysis was developed. However, not every laboratory measured all of these, and laboratories were expected to measure only those analytes already part of their routine service.
In 2022, 50 laboratories in ten developing countries participated. Based on the initial acceptance of the program (as evidenced by a survey conducted at the end of 2022 in which participating countries expressed interest to expand the program), the program was extended to 100 laboratories in the same ten participating countries. The results reported here are for the program run in 2023.
We observed distinct differences in performance: six of the ten participating countries showed a consistently solid acceptance rate of >90 % over the observed period. In contrast, four of the ten showed poor acceptance rates of between 60 and 90 %. This is below generally accepted EQA standards and requires improvement. The TF-GLQ intends to focus its activities on continuing aid through online and on-site training, particularly for these countries. The six countries with solid acceptance rates of >90 % do not need further monitoring or additional assistance.
Some countries encountered shipping and customs clearance problems, thus being unable to submit results in a timely fashion. As lyophilized EQA materials are stable, it was concluded that for many developing countries in which logistics and customs clearance may be an issue, a single annual shipment is the most appropriate way to distribute the EQA material.
During the survey period and from subsequent discussions with participating laboratories, a number of common issues were identified as affecting performance:
Material awaiting customs clearance delays laboratories receiving samples and causes them to miss submission deadlines, as was the case for laboratories in Country G, which were unable to submit results for April, July and October.
The availability of reagents and other materials, owing to a lack of funding, is a significant problem for laboratories in some countries. In addition, governments in developing countries often prioritise resources elsewhere, and biochemistry may not be seen as being as urgent as the detection of infectious diseases; in Africa, for instance, there is a major drive to control the spread of malaria, HIV and tuberculosis.
In some countries, there was a lack of understanding of the importance of quality control and external quality assessment, and of the associated procedures, in ensuring acceptable laboratory performance, leading to a lack of commitment by the laboratory. This was reflected in the results for Country J, where only 4 of the 10 laboratories enrolled in the program submitted at least four sets of results. Previous experience has shown that in this scenario laboratories will participate if there is oversight of the program by the relevant authority, such as the Ministry of Health.
In October and November 2023, scientific support teams from the TF-GLQ, in collaboration with the national societies, organized workshops and laboratory visits and conducted country-specific training in Malawi, Zambia, Peru and Colombia. Workshops included training and case studies in IQC along with interpretation of EQA reports using real performance data from those countries. These visits were also used to identify issues and barriers to improving the quality of laboratory diagnostics in these and similar countries, and the measures needed to address them.
This study shows that not all countries perform to globally accepted EQA performance standards (all participating countries are IFCC member countries) and that additional future efforts should be devoted to improving this unsatisfactory situation, for the good of improved medical diagnostics and disease treatment of patients.
Acknowledgments
The IFCC TF-GLQ would like to thank Silvia Cardinale for her help and expertise in handling country communication and data survey collation and Anna Carobene for critical reading of and useful comments on the manuscript.
-
Research ethics: Not applicable, since no human subjects were involved in this study. The data analyzed were anonymized.
-
Informed consent: Not applicable.
-
Author contributions: The authors have accepted responsibility for the entire content of this manuscript and approved its submission.
-
Competing interests: JL is an employee of Abbott Labs, RB owns rbaisconsulting, KC is an employee of Bio-Rad Laboratories, JMG is an employee of Technical Direction Biogroup, EA owns Amann Consulting. All other authors state no conflict of interest.
-
Research funding: None declared.
-
Data availability: Not applicable.
© 2024 Walter de Gruyter GmbH, Berlin/Boston