Journal Article > Review | Full Text
eClinicalMedicine. 11 March 2024; Volume 70; 102508. DOI:10.1016/j.eclinm.2024.102508
Ruef M, Emonet S, Merglen A, Dewez JE, Obama BM, et al.
BACKGROUND
The increasing resistance of Enterobacterales to third-generation cephalosporins and carbapenems in sub-Saharan Africa (SSA) is a major public health concern. We did a systematic review and meta-analysis of studies to estimate the carriage prevalence of Enterobacterales not susceptible to third-generation cephalosporins or carbapenems among paediatric populations in SSA.
METHODS
We performed a systematic literature review and meta-analysis of cross-sectional and cohort studies to estimate the prevalence of childhood (0-18 years old) carriage of extended-spectrum cephalosporin-resistant Enterobacterales (ESCR-E) or carbapenem-resistant Enterobacterales (CRE) in SSA. Medline, EMBASE and the Cochrane Library were searched for studies published from 1 January 2005 to 1 June 2022. Studies with <10 occurrences per bacteria, case reports, and meta-analyses were excluded. Quality and risk of bias were assessed using the Newcastle-Ottawa scale. Meta-analyses of prevalences and odds ratios were calculated using generalised linear mixed-effects models. Heterogeneity was assessed using I2 statistics. The protocol is available on PROSPERO (CRD42021260157).
FINDINGS
Of 1111 studies examined, 40 met our inclusion criteria, reporting on the carriage prevalence of Enterobacterales in 9408 children. The pooled carriage prevalence of ESCR-E was 32.2% (95% CI: 25.2%-40.2%). Between-study heterogeneity was high (I2 = 96%). The main sources of bias pertained to participant selection and the heterogeneity of the microbiological specimens. Carriage proportions were higher among sick children than healthy ones (35.7% vs 16.9%). The pooled proportion of nosocomial acquisition was 53.8% (95% CI: 32.1%-74.1%) among the 922 children without ESCR-E carriage at hospital admission. The pooled odds ratio of ESCR-E carriage after antibiotic treatment within the previous 3 months was 3.20 (95% CI: 2.10-4.88). The pooled proportion of carbapenem-resistant Enterobacterales was 3.6% (95% CI: 0.7%-16.4%).
INTERPRETATION
This study suggests that ESCR-E carriage among children in SSA is frequent. Microbiology capacity and infection control must be scaled up to reduce the spread of these multidrug-resistant microorganisms.
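The pooling described in the methods — combining study-level prevalences on the logit scale and summarising between-study heterogeneity with I² — can be sketched in a few lines. The review itself used generalised linear mixed-effects models; the DerSimonian-Laird estimator below is a simpler stand-in for illustration, and the three studies are invented numbers.

```python
import math

def pooled_prevalence(events, totals):
    """DerSimonian-Laird random-effects pooling of study prevalences on
    the logit scale, plus Cochran's Q and the I^2 statistic.
    Illustrative stand-in: the review fitted generalised linear
    mixed-effects models, which also pool on the logit scale but
    estimate the between-study variance by maximum likelihood."""
    y, v = [], []
    for e, n in zip(events, totals):
        # 0.5 continuity correction guards against 0% or 100% studies.
        e_c, n_c = e + 0.5, n + 1.0
        p = e_c / n_c
        y.append(math.log(p / (1 - p)))          # logit prevalence
        v.append(1 / e_c + 1 / (n_c - e_c))      # variance of the logit

    w = [1 / vi for vi in v]                     # fixed-effect weights
    y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    df = len(y) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-study variance
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0

    w_re = [1 / (vi + tau2) for vi in v]         # random-effects weights
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    return 1 / (1 + math.exp(-y_re)), i2         # back-transform the logit

# Three hypothetical carriage studies (carriers / children sampled).
prev, i2 = pooled_prevalence([30, 120, 45], [100, 300, 400])
print(f"pooled prevalence {prev:.1%}, I^2 {i2:.0%}")
```

Even this toy example shows how easily I² approaches the high values reported above when study prevalences diverge.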
Journal Article > Review | Full Text
eClinicalMedicine. 8 March 2024; Volume 70; 102512. DOI:10.1016/j.eclinm.2024.102512
Kowalski M, Minka Obama B, Catho G, Dewez JE, Merglen A, et al.
BACKGROUND
The burden of antimicrobial resistance (AMR) has been estimated to be the highest in sub-Saharan Africa (SSA). The current study estimated the proportion of drug-resistant Enterobacterales causing infections in SSA children.
METHODS
We searched MEDLINE/PubMed, Embase and the Cochrane Library to identify retrospective and prospective studies published from 01/01/2005 to 01/06/2022 reporting AMR of Enterobacterales causing infections in sub-Saharan children (0-18 years old). Studies were excluded if they had unclear documentation of antimicrobial susceptibility testing methods or fewer than ten observations per bacteria. Data extraction and quality appraisal were conducted by two authors independently. The primary outcome was the proportion of Enterobacterales resistant to antibiotics commonly used in paediatrics. Proportions were combined across studies using mixed-effects logistic regression models per bacteria and per antibiotic. Between-study heterogeneity was assessed using the I2 statistic. The protocol was registered with PROSPERO (CRD42021260157).
FINDINGS
After screening 1111 records, 122 relevant studies were included, providing data on more than 30,000 blood, urine and stool isolates. Escherichia coli and Klebsiella spp. were the predominant species, both presenting high proportions of resistance to third-generation cephalosporins, especially in blood cultures: 40.6% (95% CI: 27.7%-55%; I2: 85.7%, number of isolates (n): 1032) and 84.9% (72.8%-92.2%; I2: 94.1%, n: 2067), respectively. High proportions of resistance to other commonly used antibiotics were also observed. E. coli had high proportions of resistance, especially for ampicillin (92.5%; 95% CI: 76.4%-97.9%; I2: 89.8%, n: 888) and gentamicin (42.7%; 95% CI: 30%-56.5%; I2: 71.9%, n: 968). Gentamicin-resistant Klebsiella spp. were also frequently reported (77.6%; 95% CI: 65.5%-86.3%; I2: 91.6%, n: 1886).
INTERPRETATION
High proportions of resistance to antibiotics commonly used for empirical treatment of infectious syndromes were found for Enterobacterales in sub-Saharan children. There is a critical need to better identify local patterns of AMR to inform and update clinical guidelines for better treatment outcomes.
Journal Article > Research | Full Text
Int J Infect Dis. 1 September 2022; Volume 122; 215-221. DOI:10.1016/j.ijid.2022.05.039
Zheng Q, Luquero FJ, Ciglenecki I, Wamala JF, Abubakar A, et al.
BACKGROUND
Cholera remains a public health threat but is inequitably distributed across sub-Saharan Africa. Lack of standardized reporting and inconsistent outbreak definitions limit our understanding of cholera outbreak epidemiology.
METHODS
From a database of cholera incidence and mortality, we extracted data from sub-Saharan Africa and reconstructed outbreaks of suspected cholera occurring from January 2010 to December 2019, based on location-specific average weekly incidence rate thresholds. We then described the distribution of key outbreak metrics.
RESULTS
We identified 999 suspected cholera outbreaks in 744 regions across 25 sub-Saharan African countries. The outbreak periods accounted for 1.8 billion person-months (2% of the total during this period) from January 2010 to January 2020. Among 692 outbreaks reported from second-level administrative units (e.g., districts), the median attack rate was 0.8 per 1000 people (interquartile range (IQR), 0.3-2.4 per 1000), the median epidemic duration was 13 weeks (IQR, 8-19), and the median early outbreak reproductive number was 1.8 (range, 1.1-3.5). Larger attack rates were associated with longer times to outbreak peak, longer epidemic durations, and lower case fatality risks.
CONCLUSIONS
This study provides a baseline against which progress toward cholera control can be monitored, together with essential statistics to inform outbreak management in sub-Saharan Africa.
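The reconstruction idea in the methods — flagging periods where weekly incidence exceeds a location-specific threshold, then reporting attack rate and duration — can be sketched as follows. The threshold value, the run-based outbreak definition, and the case counts are illustrative assumptions, not the paper's exact algorithm.

```python
def find_outbreaks(weekly_cases, population, threshold_per_1000=0.1):
    """Flag outbreaks as runs of consecutive weeks whose incidence rate
    exceeds a threshold (per 1000 per week), then report each run's
    start, duration, and attack rate (cumulative cases per 1000).
    Hedged sketch; parameters are invented for illustration."""
    outbreaks, start = [], None
    for week, cases in enumerate(weekly_cases):
        above = cases / population * 1000 > threshold_per_1000
        if above and start is None:
            start = week                         # outbreak opens
        elif not above and start is not None:    # outbreak closes
            total = sum(weekly_cases[start:week])
            outbreaks.append({"start_week": start,
                              "duration_weeks": week - start,
                              "attack_rate_per_1000": total / population * 1000})
            start = None
    if start is not None:                        # still open at end of data
        total = sum(weekly_cases[start:])
        outbreaks.append({"start_week": start,
                          "duration_weeks": len(weekly_cases) - start,
                          "attack_rate_per_1000": total / population * 1000})
    return outbreaks

# A district of 100,000 people with one 5-week surge of suspected cases.
obs = find_outbreaks([2, 3, 40, 90, 150, 80, 25, 4, 1], 100_000)
```

Applied district by district, this kind of rule yields exactly the per-outbreak metrics summarised above (attack rate, duration, time to peak).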
Journal Article > Research | Full Text
Trans R Soc Trop Med Hyg. 21 October 2008; Volume 103 (Issue 3); DOI:10.1016/j.trstmh.2008.09.005
Balasegaram M, Young H, Chappuis F, Priotto G, Raguenaud ME, et al.
This paper describes the effectiveness of first-line regimens for stage 2 human African trypanosomiasis (HAT) due to Trypanosoma brucei gambiense infection in nine Médecins Sans Frontières HAT treatment programmes in Angola, Republic of Congo, Sudan and Uganda. Regimens included eflornithine and standard- and short-course melarsoprol. Outcomes for 10461 naïve stage 2 patients fitting a standardised case definition and allocated to one of the above regimens were analysed by intention-to-treat analysis. Effectiveness was quantified by the case fatality rate (CFR) during treatment, the proportion probably and definitely cured and the Kaplan-Meier probability of relapse-free survival at 12 months and 24 months post admission. The CFR was similar for the standard- and short-course melarsoprol regimens (4.9% and 4.2%, respectively). The CFR for eflornithine was 1.2%. Kaplan-Meier survival probabilities varied from 71.4-91.8% at 1 year and 56.5-87.9% at 2 years for standard-course melarsoprol, to 73.0-91.1% at 1 year for short-course melarsoprol, and 79.9-97.4% at 1 year and 68.6-93.7% at 2 years for eflornithine. With the exception of one programme, survival at 12 months was >90% for eflornithine, whilst for melarsoprol it was <90% except in two sites. Eflornithine is recommended where feasible, especially in areas with low melarsoprol effectiveness.
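The Kaplan-Meier probabilities of relapse-free survival quoted above come from the standard product-limit estimator, which a short sketch can illustrate. The cohort below is invented, not HAT programme data.

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of relapse-free survival.
    times: follow-up in months; events: 1 = relapse/death, 0 = censored.
    Returns (time, survival) pairs at each event time.
    Toy sketch, not the programmes' analysis code."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, steps, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        n_t = sum(1 for tt, _ in data if tt == t)           # leaving at t
        if d > 0:
            surv *= 1 - d / at_risk      # multiply by conditional survival
            steps.append((t, surv))
        at_risk -= n_t                   # events and censored leave risk set
        i += n_t
    return steps

# Ten hypothetical patients: relapses at months 6 (twice) and 10,
# the rest censored at their last follow-up visit.
steps = kaplan_meier([3, 6, 6, 10, 12, 12, 18, 24, 24, 24],
                     [0, 1, 1, 1, 0, 0, 0, 0, 0, 0])
```

Note how censored patients still count in the risk set until their censoring time; dropping them instead would bias survival downward.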
Journal Article > Research | Full Text
BMJ. 20 September 2003; Volume 327 (Issue 7416); 650. DOI:10.1136/bmj.327.7416.650
Grein T
OBJECTIVE
To measure retrospectively mortality among a previously inaccessible population of former UNITA members and their families displaced within Angola, before and after their arrival in resettlement camps following the ceasefire of 4 April 2002.
DESIGN
Three-stage cluster sampling for interviews. The recall period for mortality assessment was 21 June 2001 to 15-31 August 2002.
SETTING
Eleven resettlement camps over four provinces of Angola (Bié, Cuando Cubango, Huila, and Malange) housing 149 000 former UNITA members and their families.
PARTICIPANTS
900 consenting family heads of households, or most senior household members, corresponding to an intended sample size of 4500 individuals.
MAIN OUTCOME MEASURES
Crude mortality and proportional mortality, overall and by period (monthly, and before and after arrival in camps).
RESULTS
Final sample included 6599 people. The 390 deaths reported during the recall period corresponded to an average crude mortality of 1.5/10 000/day (95% confidence interval 1.3 to 1.8), and, among children under 5 years old, to 4.1/10 000/day (3.3 to 5.2). Monthly crude mortality rose gradually to a peak in March 2002 and remained above emergency thresholds thereafter. Malnutrition was the leading cause of death (34%), followed by fever or malaria (24%) and war or violence (18%). Most war victims and people who had disappeared were women and children.
CONCLUSIONS
This population of displaced Angolans experienced global and child mortality greatly in excess of normal levels, both before and after the 2002 ceasefire. Malnutrition deaths reflect the extent of the food crisis affecting this population. Timely humanitarian assistance must be made available to all populations in such conflicts.
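The headline crude mortality figure is deaths divided by person-time, scaled to 10,000 person-days. A back-of-envelope sketch using the survey's published totals follows; the recall-period length and the half-period person-time credited to each death are simplifying assumptions, so the result approximates rather than reproduces the published 1.5/10 000/day (which also carries a confidence interval accounting for the cluster design).

```python
def crude_mortality_rate(deaths, survivors, recall_days):
    """Crude mortality rate per 10,000 person-days from a retrospective
    mortality survey. Survivors contribute the full recall period;
    each death is credited half the period (a common simplification).
    Cluster-design effects and confidence intervals are ignored here."""
    person_days = survivors * recall_days + deaths * recall_days / 2
    return deaths / person_days * 10_000

# Published totals: 390 deaths in a final sample of 6599 people, over a
# recall period of roughly 14 months (~426 days, an assumption here).
cmr = crude_mortality_rate(390, 6599 - 390, 426)
```

The same formula applied to the under-5 subgroup yields the separate child mortality rate reported above.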
Journal Article > Research | Full Text
Confl Health. 17 June 2010; Volume 4; 12. DOI:10.1186/1752-1505-4-12
O'Brien DP, Venis S, Greig J, Shanks L, Ellman T, et al.
INTRODUCTION
Many countries ravaged by conflict have substantial morbidity and mortality attributed to HIV/AIDS yet HIV treatment is uncommonly available. Universal access to HIV care cannot be achieved unless the needs of populations in conflict-affected areas are addressed.
METHODS
From 2003 Médecins Sans Frontières introduced HIV care, including antiretroviral therapy, into 24 programmes in conflict or post-conflict settings, mainly in sub-Saharan Africa. HIV care and treatment activities were usually integrated within other medical activities. Project data collected in the Fuchia software system were analysed and outcomes compared with ART-LINC data. Programme reports and other relevant documents and interviews with local and headquarters staff were used to develop lessons learned.
RESULTS
In the 22 programmes where ART was initiated, more than 10,500 people were diagnosed with HIV and received medical care, and 4555 commenced antiretroviral therapy, including 348 children. Complete data were available for adults in 20 programmes (n = 4145). At analysis, 2645 (64%) remained on ART, 422 (10%) had died, 466 (11%) were lost to follow-up, 417 (10%) had transferred to another programme, and 195 (5%) had an unclear outcome. Median 12-month mortality and loss to follow-up were 9% and 11%, respectively, and median 6-month CD4 gain was 129 cells/mm3. Patient outcomes on treatment were comparable to those in stable resource-limited settings, and individuals and communities obtained significant benefits from access to HIV treatment. Programme disruption through instability was uncommon, with only one programme experiencing interruption to services, and programmes were adapted to allow for disruption and population movements. Integration of HIV activities strengthened other health activities, contributing to health benefits for all victims of conflict and increasing the potential sustainability of implemented activities.
CONCLUSIONS
With commitment, simplified treatment and monitoring, and adaptations for potential instability, HIV treatment can be feasibly and effectively provided in conflict or post-conflict settings.
Journal Article > Research | Full Text
Am J Trop Med Hyg. 1 July 2006; Volume 75 (Issue 1); 143-145.
Guthmann JP, Cohuet S, Rigutto C, Fortes F, Saraiva N, et al.
In April 2004, 137 children 6-59 months of age with uncomplicated Plasmodium falciparum (Pf) malaria (Caala, Central Angola) were randomized to receive either artemether-lumefantrine (Coartem) or artesunate + amodiaquine (ASAQ). After 28 days of follow-up, there were 2/61 (3.2%) recurrent parasitemias in the Coartem group and 4/64 (6.2%) in the ASAQ group (P = 0.72), all classified as re-infections after PCR genotyping (cure rate = 100% [95%CI: 94-100] in both groups). Only one patient (ASAQ group) had gametocytes on day 28, versus five (Coartem) and three (ASAQ) at baseline. Compared with baseline, anemia was significantly improved after 28 days of follow-up in both groups (Coartem: from 54.1% to 13.4%; ASAQ: from 53.1% to 15.9%). Our findings support a high efficacy of both combinations in Caala. Now that Coartem has been chosen as the new first-line anti-malarial, the challenge is to ensure that this drug is available and adequately used.
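The jump from a 3.2% recurrence proportion to a 100% PCR-adjusted cure rate follows from genotyping: recurrences classified as re-infections are not treatment failures. The sketch below excludes re-infections from both numerator and denominator, one common convention; the trial's exact statistical handling may differ.

```python
def cure_rates(n_treated, recurrences, reinfections):
    """Crude vs PCR-adjusted cure rate. PCR genotyping distinguishes
    recrudescence (true failure) from re-infection; here re-infections
    are excluded from both the failure count and the denominator.
    Hedged sketch of one common convention, not the trial's exact code."""
    crude = 1 - recurrences / n_treated
    true_failures = recurrences - reinfections
    adjusted = 1 - true_failures / (n_treated - reinfections)
    return crude, adjusted

# Coartem arm: 61 evaluable children, 2 recurrent parasitemias,
# both classified as re-infections by PCR.
crude, adjusted = cure_rates(61, 2, 2)
```

With both recurrences reclassified, the adjusted cure rate is exactly 100%, matching the abstract.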
Journal Article > Research | Full Text
AIDS. 1 August 2019; Volume 33 (Issue 10); 1635-1644. DOI:10.1097/QAD.0000000000002234
Shroufi A, van Cutsem G, Cambiano V, Bansi-Matharu L, Duncan K, et al.
BACKGROUND
Many individuals failing first-line antiretroviral therapy (ART) in sub-Saharan Africa never initiate second-line ART or do so after significant delay. For people on ART with a viral load more than 1000 copies/ml, the WHO recommends a second viral load measurement 3 months after the first viral load and enhanced adherence support. Switch to a second-line regimen is contingent upon a persistently elevated viral load more than 1000 copies/ml. Delayed second-line switch places patients at increased risk for opportunistic infections and mortality.
METHODS
To assess the potential benefits of a simplified second-line ART switch strategy, we use an individual-based model of HIV transmission, progression and the effect of ART which incorporates consideration of adherence and drug resistance, to compare predicted outcomes of two policies, defining first-line regimen failure for patients on efavirenz-based ART as either two consecutive viral load values more than 1000 copies/ml, with the second after an enhanced adherence intervention (implemented as per current WHO guidelines) or a single viral load value more than 1000 copies/ml. We simulated a range of setting-scenarios reflecting the breadth of the sub-Saharan African HIV epidemic, taking into account potential delays in defining failure and switch to second-line ART.
FINDINGS
The use of a single viral load more than 1000 copies/ml to define ART failure would lead to a higher proportion of persons with nonnucleoside reverse-transcriptase inhibitor resistance switched to second-line ART [65 vs. 48%; difference 17% (90% range 14-20%)], resulting in a median 18% reduction in the rate of AIDS-related death over setting scenarios (90% range 6-30%; from a median of 3.1 to 2.5 per 100 person-years) over 3 years. The simplified strategy is also predicted to reduce the rate of AIDS conditions by a median of 31% (90% range 8-49%) among people on first-line ART with a viral load more than 1000 copies/ml in the past 6 months. For a country of 10 million adults (and a median of 880 000 people with HIV), we estimate that this approach would lead to a median of 1322 (90% range 67-3513) AIDS deaths averted per year over 3 years. For South Africa this would represent around 10 215 deaths averted annually.
INTERPRETATION
As a step towards reducing unnecessary mortality associated with delayed second-line ART switch, defining failure of first-line efavirenz-based regimens as a single viral load more than 1000 copies/ml should be considered.
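The trade-off modelled above — a single elevated viral load triggers an earlier switch than a two-test confirmatory rule — can be caricatured with a toy Monte Carlo of time-to-switch. Every parameter below (annual VL testing, 3-month confirmation, 30% transient resuppression at the confirmatory test) is an invented placeholder, not a value from the paper's individual-based model, which also captures transmission, adherence, and resistance.

```python
import random

def mean_months_to_switch(policy, n=100_000, vl_interval=12,
                          confirm_delay=3, p_resuppressed=0.3, seed=1):
    """Toy Monte Carlo of delay from virological failure to second-line
    switch. 'single': switch at the first viral load > 1000 copies/ml.
    'confirmatory': a second elevated result 3 months later is required;
    a transiently suppressed confirmatory test restarts detection at the
    next routine viral load. All parameter values are invented."""
    random.seed(seed)
    total = 0.0
    for _ in range(n):
        t = random.uniform(0, vl_interval)   # wait for first elevated VL
        if policy == "confirmatory":
            t += confirm_delay
            while random.random() < p_resuppressed:
                t += vl_interval + confirm_delay  # failure missed; retry
        total += t
    return total / n

single = mean_months_to_switch("single")
confirmatory = mean_months_to_switch("confirmatory")
```

Even this caricature makes the mechanism concrete: the confirmatory rule adds months of continued failing therapy for every patient whose confirmatory test is transiently suppressed.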
Journal Article > Research | Full Text
Malar J. 23 August 2009; Volume 8 (Issue 1); 203. DOI:10.1186/1475-2875-8-203
Zwang J, Olliaro PL, Barennes H, Bonnet MMB, Brasseur P, et al.
BACKGROUND: Artesunate and amodiaquine (AS&AQ) is at present the world's second most widely used artemisinin-based combination therapy (ACT). Evaluation of the efficacy of this ACT, recently adopted by the World Health Organization (WHO) and deployed in over 80 countries, was necessary to inform evidence-based drug policy.
METHODS: An individual patient data (IPD) analysis was conducted on efficacy outcomes in 26 clinical studies in sub-Saharan Africa using the WHO protocol with similar primary and secondary endpoints.
RESULTS: A total of 11,700 patients (75% under 5 years old), from 33 different sites in 16 countries were followed for 28 days. Loss to follow-up was 4.9% (575/11,700). AS&AQ was given to 5,897 patients. Of these, 82% (4,826/5,897) were included in randomized comparative trials with polymerase chain reaction (PCR) genotyping results and compared to 5,413 patients (half receiving an ACT). AS&AQ and other ACT comparators resulted in rapid clearance of fever and parasitaemia, superior to non-ACT. Using survival analysis on a modified intent-to-treat population, the Day 28 PCR-adjusted efficacy of AS&AQ was greater than 90% (the WHO cut-off) in 11/16 countries. In randomized comparative trials (n = 22), the crude efficacy of AS&AQ was 75.9% (95% CI 74.6-77.1) and the PCR-adjusted efficacy was 93.9% (95% CI 93.2-94.5). The site-weighted, PCR-adjusted risk of failure with AS&AQ was significantly lower than with non-ACT, higher than with dihydroartemisinin-piperaquine (DP; in one Ugandan site), and no different from AS+SP or artemether-lumefantrine (AL). The risk of gametocyte appearance and the carriage rate with AS&AQ were greater only in one Ugandan site compared with AL and DP, and lower compared with non-ACT (p = 0.001 for all comparisons). Anaemia recovery was not different from comparator groups, except in one site in Rwanda where the patients in the DP group had a slower recovery.
CONCLUSION: AS&AQ compares well to other treatments and meets the WHO efficacy criteria for use against falciparum malaria in many, but not all, of the sub-Saharan African countries where it was studied. Efficacy varies between and within countries. An IPD analysis can inform general and local treatment policies. Ongoing monitoring and evaluation is required.
METHODS: An individual patient data (IPD) analysis was conducted on efficacy outcomes in 26 clinical studies in sub-Saharan Africa using the WHO protocol with similar primary and secondary endpoints.
RESULTS: A total of 11,700 patients (75% under 5 years old), from 33 different sites in 16 countries were followed for 28 days. Loss to follow-up was 4.9% (575/11,700). AS&AQ was given to 5,897 patients. Of these, 82% (4,826/5,897) were included in randomized comparative trials with polymerase chain reaction (PCR) genotyping results and compared to 5,413 patients (half receiving an ACT). AS&AQ and other ACT comparators resulted in rapid clearance of fever and parasitaemia, superior to non-ACT. Using survival analysis on a modified intent-to-treat population, the Day 28 PCR-adjusted efficacy of AS&AQ was greater than 90% (the WHO cut-off) in 11/16 countries. In randomized comparative trials (n = 22), the crude efficacy of AS&AQ was 75.9% (95% CI 74.6-77.1) and the PCR-adjusted efficacy was 93.9% (95% CI 93.2-94.5). The risk (weighted by site) of failure PCR-adjusted of AS&AQ was significantly inferior to non-ACT, superior to dihydroartemisinin-piperaquine (DP, in one Ugandan site), and not different from AS+SP or AL (artemether-lumefantrine). The risk of gametocyte appearance and the carriage rate of AS&AQ was only greater in one Ugandan site compared to AL and DP, and lower compared to non-ACT (p = 0.001, for all comparisons). Anaemia recovery was not different than comparator groups, except in one site in Rwanda where the patients in the DP group had a slower recovery.
CONCLUSION: AS&AQ compares well with other treatments and meets the WHO efficacy criteria for use against falciparum malaria in many, but not all, of the sub-Saharan African countries where it was studied. Efficacy varies between and within countries. An IPD analysis can inform both general and local treatment policies. Ongoing monitoring and evaluation is required.
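The distinction between crude and PCR-adjusted efficacy reported above can be illustrated with a minimal sketch: crude efficacy counts every recurrent parasitaemia as a treatment failure, whereas PCR-adjusted efficacy counts only recrudescences (same parasite genotype) as failures and censors PCR-confirmed reinfections in the survival analysis, as in WHO therapeutic-efficacy protocols. The data and function names below are hypothetical, not taken from the study.

```python
def kaplan_meier_failure(records, horizon=28):
    """Cumulative failure probability by `horizon` via the
    Kaplan-Meier product-limit estimator.
    records: list of (day, status) with status in
    {'recrudescence', 'reinfection', 'lost', 'cured'};
    reinfections and losses to follow-up are censored."""
    at_risk = len(records)
    surv = 1.0
    for day in sorted({d for d, _ in records if d <= horizon}):
        events = sum(1 for d, s in records if d == day and s == 'recrudescence')
        censored = sum(1 for d, s in records
                       if d == day and s in ('reinfection', 'lost'))
        if at_risk > 0 and events:
            surv *= 1 - events / at_risk
        at_risk -= events + censored
    return 1 - surv

# Hypothetical cohort: 8 cured at day 28, 1 reinfection, 1 recrudescence.
data = [(28, 'cured')] * 8 + [(14, 'reinfection'), (21, 'recrudescence')]

# Crude: both recurrences count as failures -> 2/10 = 0.20.
crude_failure = sum(1 for _, s in data
                    if s in ('recrudescence', 'reinfection')) / len(data)

# PCR-adjusted: reinfection censored at day 14 -> 1/9 ~ 0.11.
adjusted_failure = kaplan_meier_failure(data)
```

Censoring rather than excluding reinfections is what makes the adjusted failure estimate lower than the crude one, mirroring the gap between the 75.9% crude and 93.9% PCR-adjusted efficacy reported for AS&AQ.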
Journal Article > ResearchFull Text
Am J Clin Nutr. 1 January 2007; Volume 85 (Issue 1); 218-224.; DOI:10.1093/ajcn/85.1.218
Seal AJ, Creeke PI, Dibari F, Cheung E, Kyroussis E, et al.
BACKGROUND
Outbreaks of pellagra were documented during the civil war in Angola, but no contemporary data on the incidence of pellagra or the prevalence of niacin deficiency were available.
OBJECTIVE
The objective was to investigate the incidence of pellagra and the prevalence of niacin deficiency in postwar Angola and their relation with dietary intake, poverty, and anthropometric status.
DESIGN
Admissions data from 1999 to 2004 from the pellagra treatment clinic in Kuito, Angola, were analyzed. New patients admitted over 1 wk were examined, and urine and blood samples were collected. A multistage cluster population survey collected data on anthropometric measures, household dietary intakes, socioeconomic status, and clinical signs of pellagra for women and children. Urinary excretion of 1-methylnicotinamide, 1-methyl-2-pyridone-5-carboxamide, and creatinine was measured, and hemoglobin concentrations were determined with a portable photometer.
RESULTS
The incidence of clinical pellagra has not decreased since the end of the civil war in 2002. Low excretion of niacin metabolites was confirmed in 10 of 11 new clinic patients. Survey data were collected for 723 women aged 15-49 y and for 690 children aged 6-59 mo. Excretion of niacin metabolites was low in 29.4% of the women and 6.0% of the children, and the creatinine-adjusted concentrations were significantly lower in the women than in the children (P < 0.001, t test). In children, niacin status was positively correlated with the household consumption of peanuts (r = 0.374, P = 0.001) and eggs (r = 0.290, P = 0.012) but negatively correlated with socioeconomic status (r = -0.228, P = 0.037).
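The associations reported above (e.g. r = 0.374 between children's niacin status and household peanut consumption) are Pearson product-moment correlations. A minimal sketch of the computation, using entirely hypothetical values rather than the survey data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient:
    covariance of x and y divided by the product of
    their standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: creatinine-adjusted metabolite excretion
# vs weekly household peanut meals for six children.
excretion = [1.2, 1.8, 2.1, 2.6, 3.0, 3.4]
peanut_meals = [0, 1, 1, 2, 3, 3]
r = pearson_r(excretion, peanut_meals)  # close to +1: strong positive association
```

In the survey the coefficients were of course computed on the field data, with the accompanying P values testing the null hypothesis of no correlation.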
CONCLUSIONS
The expected decrease in pellagra incidence after the end of the civil war has not occurred. The identification of niacin deficiency as a public health problem should refocus attention on this nutritional deficiency in Angola and other areas of Africa where maize is the staple.