Validation Blames Human Error in HIV Test Kit Evaluation




The procurement of medical devices to carry out HIV tests across the country has taken a series of back-and-forth turns since 2012. By the Ministry of Health's own admission, this has disrupted national HIV service delivery. The 6.3 million tests carried out over the past nine months fell short of target, owing largely to the unavailability of rapid HIV test kits.

The purchase of 11 million of these kits was mired in disputes over standard setting and in shifting procedures and techniques, despite clear guidelines in international and national policies. The country's algorithm, the method used to determine the accuracy of the kits, went unrevised for close to 10 years, allowing a single private company, Medica Pharma Plc, to position itself as the sole supplier.

An attempt to revise the algorithm in 2012 set off further disputes and a series of tenders and retenders over who is qualified to supply kits that meet the standards established by the World Health Organisation (WHO), the UN health watchdog that prequalifies suppliers of medicines and medical devices. Also involved in the validation process are the US Centers for Disease Control & Prevention (CDC), the Global Fund to Fight AIDS, TB & Malaria, and the Ethiopian Public Health Institute.

The Ministry, responsible for the procurement and distribution of the kits, dropped its own field-tested algorithm, endorsed in January 2015. Yet it went ahead with a purchase through an international agent after the national procurement agency ordered the cancellation of the deal with the announced supplier.

Subsequently, an order was placed for Premier Medical Corporation's First Response® HIV 1-2-0 Card, a product the Ministry says offered a favourable lead time and “100pc specificity and sensitivity” when evaluated by the WHO. It is the same kit tested in Zambia, Ghana, South Africa and India with satisfactory results, the Ministry said.

An evaluation of local conformity, conducted on the imported kits by the Institute in its laboratory and in the field – in Debre Berhan and in Dubti, in the Afar Region – showed 91.3pc sensitivity, short of the WHO's requirement of 99pc. This led to the credibility of the Institute being questioned on three fronts: non-observance of procedures, protocols and supervision. But officials at the Ministry insist such a lapse had never happened before and pledge it will not be repeated.

Consequently, the Ministry's officials have resorted to plasma testing and to a method known as INNO-LIA, a supplemental HIV assay used to confirm results. But INNO-LIA is not a kit, nor an independent technique for proving a kit's suitability. Nor can lab-based plasma testing and field tests on whole blood replace one another; they are complementary, to be certain.

The controversy over the procurement procedures has grown so heated that it has now reached a point of accusations and counter-accusations of manipulating field test results. The Federal Ethics & Anti-Corruption Commission was brought in to ensure corrective measures were taken. The nature of the investigation promised remains vague.

The local bidder, largely owned by Dawit G. Egziyabher, has strong suspicions about why all this is happening and who is behind it. To Taye Tolera (MD), chief of the Office of the State Minister for Health, the incoherent and volatile decisions of his Ministry in the three-year procurement saga are justified by its desire to break the monopoly in supply and to search for better deals that fall within budget.

In these exclusive interviews, SAMRAWIT TASSEW, FORTUNE EDITOR-IN-CHIEF, unearths the facts as the officials try to clear the air around the disquieting challenges of the troubled procurement. Here is the interview with Taye Tolera (MD), chief of the Office of the State Minister for Health.

Fortune: Can you explain the rationale behind the Ministry of Health's decision to revise the algorithm used in the past and kick-start a new competitive bid, while the existing system was working?

Taye Tolera: A single supplier monopolized the supply of rapid HIV test kits for almost 10 years. Three years back, we decided that the country needed to revise its algorithm so as to include maybe three or four products, based on their quality. Once we had confirmed the quality of those products and selected our algorithm, our plan was to select suppliers based on their cost.

For the last 10 years, this did not happen because we used KHB as a screening test, STAT-PAK as confirmatory and Uni-Gold as a tie-breaker [the first two obtained from a pharmaceuticals manufacturing company represented by Medica Pharma]. We also wanted to open the market for others who could win through competitive bidding. It was not really the right way to use a single supplier for 10 or 11 years.

Q: Once convinced that a new algorithm and procurement were needed, why did it take so long? Your media release touched upon this. What happened in those three years?

Once we decided to change the algorithm and to use a competitive bidding process, we asked the Ethiopian Public Health Institute (EPHI) to select some test kits based on their quality and submit these to the Pharmaceuticals Fund & Supply Agency (PFSA). The evaluation process at EPHI took very long – close to eight or nine months – to identify the right algorithm. Then there was another delay in the procurement process.

Right when we were preparing to sign a contract, the winner identified in the algorithm was delisted from the WHO prequalified list of HIV test kits. We asked the second winner, STAT-PAK, again represented by Medica Pharma, to supply the test kits at the price it had offered for KHB. Medica rejected the offer. Given the budget constraints we would face after buying STAT-PAK at a higher price, we decided to drop the whole bid altogether.

Q: Are you saying there were limited resources to buy the kits?

No; but we had to balance. The Global Fund gives a three-year budget cycle for HIV/AIDS test kits and other equipment related to the kits, so we decided to prioritize. PFSA went back to the bidders and identified a supplier that was not on the WHO prequalified list at the time, but made it onto the list before the process was over. PFSA considered negotiating with them. Medica Pharma took the case to the Federal Procurement Agency, which advised us to stop the process. We did.

Q: Was the supplier PFSA approached the one supplying First Response HIV 1-2-0, the controversial product?

No. After all these challenges, and with a shortage of HIV test kits anticipated, we asked our partner, the Global Fund, to procure the kits on behalf of Ethiopia. The funds have always come from the Global Fund, and we actually set prerequisites.

Any test kit to be selected should be ISO 13485 certified; it should fulfill the requirements of the countries that established the Global Fund, which are the US, the EU, Japan, Australia and Canada; and it should also be prequalified by the WHO.

Q: What happened to the WHO-recommended country-specific algorithm? Why did you skip the algorithm established in January 2015, and what was the rationale for leaving the algorithm test out of these three requirements?

In the most recent algorithm, from 2015, KHB was the first line of screening. Unfortunately, it was delisted from the WHO list. In that case, we would have had to repeat the field test and establish another algorithm all over again.

Since we were nearing a shortage, we used a faster track. We invited all suppliers on the WHO prequalified list and compared prices. That is why we put those conditions. For the product to be used in Ethiopia, we actually set a further criterion: lot sample testing on the delivered kits right after delivery.

Q: Can lot testing replace field tests and provide indicators for the country’s algorithm?

Lot testing simply means sample testing. If the product passes the quality test, it will be used in Ethiopia; otherwise, it will not be used. We did this in the interest of time.

Q: Shouldn't that be done before delivery?

We apply lot testing after shipment for all other medical supplies we buy.

Q: But in the case of rapid HIV test kits, isn't an algorithm mandatory, and a sure way of ensuring fitness?

No; it is not really necessary.

Q: How do you know that the product is suitable for certain topography and weather conditions?

We check the sensitivity and the specificity of the product in a laboratory. This tells us about the performance of the diagnostic. For instance, if the sensitivity is more than 99pc, and if the specificity is more than 98pc, then the product is suitable to be used in Ethiopia. We can definitely decide on that.

The product, the First Response HIV 1-2-0 card test, has 100pc sensitivity and specificity on the WHO prequalified list. Whether or not you have an algorithm, the product can certainly be used in Ethiopia.
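[Editor's note: sensitivity and specificity are simple proportions computed from a panel of samples whose HIV status is already known. The minimal sketch below uses hypothetical counts, not figures from any of the evaluations discussed here, to show how the two thresholds the State Minister's office cites would be checked.]

```python
# Illustrative sketch only: how sensitivity and specificity are computed.
# The counts below are hypothetical, not taken from the Ministry's, EPHI's
# or the WHO's evaluations.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Share of truly HIV-positive samples that the kit flags as positive."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Share of truly HIV-negative samples that the kit flags as negative."""
    return true_neg / (true_neg + false_pos)

# Hypothetical panel: 199 of 200 positive samples detected,
# 295 of 300 negative samples correctly cleared.
print(f"sensitivity: {sensitivity(199, 1):.1%}")  # 99.5pc - above the 99pc bar
print(f"specificity: {specificity(295, 5):.1%}")  # 98.3pc - above the 98pc bar
```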

Q: If relying on the WHO list is so certain, why do countries, including Ethiopia, need to go through rigorous field tests, wasting time and resources? And why does the WHO recommend that countries do so?

It is good to have an algorithm; it helps to train professionals accordingly and to avoid errors. But there should be no problem if you use any particular test kit for HIV testing; it is not really a problem. We are currently working on an algorithm. We will start the process all over again and set three or four algorithms based on the available test kits that meet Ethiopia's standards.

Q: How long do you believe your stock – six million kits – will last?

Six to seven months.

Q: Can the algorithm be established within this period?

Yes. All the challenges – the ups and downs – will not be repeated.

Q: Let me take you back to what happened after the first batch was delivered. Why were there contradictory announcements?

The results from laboratory tests on plasma and from field tests on whole blood were irreconcilable. While the kit's sensitivity was very good on plasma, it was very poor in the field tests carried out in Dubti and Debre Berhan, areas selected for their extreme weather conditions. Not only did the two approaches contradict each other, but the field result also contradicted reports from other countries – Ghana, Zambia and South Africa. This raised interest and suspicion in scientific circles, and we decided to involve others.

Q: On what basis was the Centers for Disease Control & Prevention (CDC) selected?

The CDC is like a global quality assurance body. It is the equivalent of the Ministry of Health in the United States. We have close collaboration with them; they support us in the evaluation of other kits as well. You can say that the CDC is like the best laboratory in the world.

Q: Do you believe it is more accepted globally than the WHO?

The WHO does not have a laboratory here. It was not important to involve it when it does not have a lab in Ethiopia. The CDC works closely with EPHI. They have an office inside the EPHI building and use its lab. In this new evaluation, the experts involved were two from the CDC and one from EPHI, and it was supervised by the director of the HIV & TB Research Lab at EPHI.

Of course, it was also closely supervised by officials from the Ministry of Health and from EPHI, including the Director General. We considered the second evaluation ultimately precise because it was done in a clear manner, using international protocols. It was also validated by the CDC in Atlanta. We had no reason to question this result.

Q: Did the joint validation repeat the same process?

They [the technicians in the first evaluation] did not follow the standard, because they did not use the gold standard technique, which is INNO-LIA, but used ELISA instead. An evaluation is not valid without that gold standard. Since ELISA is reported to have high false positivity rates, INNO-LIA was introduced one or two years ago.

It [INNO-LIA] was available in the EPHI laboratory and was later used by the joint revalidation team; it was not bought or imported for the purpose. The other issue is that we actually suspected human error. When the field tests were performed, mistakes were made. We are investigating whether the mistakes were intentional or the result of negligence.

Q: When was the problem identified – from the research report or after the validation test?

After the validation test was carried out. We did the experiments and found that the technicians involved in the test had made some mistakes. After the second test by the joint team, EPHI actually reversed its report and declared it invalid. Supervision was not really adequate during that evaluation; I think that was another big problem.

Q: Were there any suspects – say technicians or supervisors – under investigation for the flaws? Will there be formal charges?

I do not really want to go into details, but the nature of the error is under investigation. Whether there will be charges depends on the findings of these investigations.

Q: Has anyone been suspended?

No. All the decisions we make are based on evidence; we are now undertaking a fact-finding mission.

Q: What lessons were learnt from this?

There was not adequate supervision in the previous evaluation, and we thought that was a serious gap in the process. The technicians involved should go by the book and follow all the protocols when they do their evaluations. These were the lessons we drew from the first evaluation.

Fortunately, this joint evaluation unequivocally and definitively showed that the previous evaluation was wrong.

Q: EPHI has been crucial to the Ministry for at least 20 years, producing research outputs that inform policy decisions. Has there ever been a case where its output was questioned?

We were very surprised by the contradictory results. While the WHO recorded 100pc sensitivity, EPHI recorded 91.3pc, way below the standard. When we looked into how EPHI performed this evaluation, it was totally flawed; it did not follow international standards and protocols, including WHO and UNAIDS recommendations. Even though we were critical of the findings, the delivery process and the dissemination, which was yet to start, were suspended immediately.

A joint evaluation and validation sample test was ordered, this time involving EPHI and the CDC. I am not saying that EPHI cannot be trusted, but the technicians who performed this evaluation made some major mistakes.

Q: Do you see EPHI's reputation in the eyes of the wider public as damaged? How can you make a mother trust the Institute, and buy imported medicine it approves, if you admit such a gross mistake took place?

I understand your concern, but we are not worried about EPHI's credibility. We have full trust in EPHI; the evaluations it has conducted over several years have been valid. It is only this particular evaluation that was wrongly conducted. All the processes EPHI has gone through are trustworthy and reliable. It will continue to be a research arm for the Ministry; it will inform us with evidence, and we will make our decisions.



Published on May 17, 2016 [Vol 17, No 837]

