Defense against infection incurs costs as well as benefits that are expected to shape the evolution of optimal defense strategies. In particular, many theoretical studies have investigated contexts favoring constitutive versus inducible defenses. However, even when one immune strategy is theoretically optimal, it may be evolutionarily unachievable. This is because evolution proceeds via mutational changes to the protein interaction networks underlying immune responses, not by changes to an immune strategy directly. Here, we use a theoretical simulation model to examine how underlying network architectures constrain the evolution of immune strategies, and how these network architectures account for desirable immune properties such as inducibility and robustness. We focus on immune signaling because signaling molecules are common targets of parasitic interference but are rarely studied in this context. We find that in the presence of a coevolving parasite that disrupts immune signaling, hosts evolve constitutive defenses even when inducible defenses are theoretically optimal. This occurs for two reasons. First, there are relatively few network architectures that produce immunity that is both inducible and also robust against targeted disruption. Second, evolution toward these few robust inducible network architectures often requires intermediate steps that are vulnerable to targeted disruption. The few networks that are both robust and inducible consist of many parallel pathways of immune signaling with few connections among them. In the context of relevant empirical literature, we discuss whether this is indeed the most evolutionarily accessible robust inducible network architecture in nature, and when it can evolve.
Schistosomiasis is a major socio-economic and public health problem in many sub-Saharan African countries. After large mass drug administration (MDA) campaigns, prevalence of infection rapidly returns to pre-treatment levels. The traditional egg-based diagnostic for schistosome infections, Kato-Katz, is being replaced in many settings by circulating antigen recognition-based diagnostics, usually the point-of-care circulating cathodic antigen (CCA) test. The relationship between these diagnostics is poorly understood, particularly after treatment, in both drug-efficacy studies and routine monitoring.
We created a model of schistosome infections to better understand and quantify the relationship between these two egg- and adult worm antigen-based diagnostics. We focused particularly on the interpretation of 'trace' results after CCA testing. Our analyses suggest that CCA is generally a better predictor of prevalence, particularly after treatment, and that trace CCA results are typically associated with truly infected individuals.
Even though prevalence rises to pre-treatment levels only six months after MDAs, our model suggests that the average intensity of infection is much lower, probably due in part to a small burden of juvenile worms that survived treatment. This work contributes to a better understanding of CCA diagnostics and the interpretation of post-treatment prevalence estimates.
Davis E.L., Danon L., Prada Joaquin, Gunawardena S.A., Truscott J.E., Vlaminck J., Anderson R.M., Levecke B., Morgan E.R., Hollingsworth T.D. (2018) Seasonally timed treatment programs for Ascaris lumbricoides to increase impact – An investigation using mathematical models, PLoS Neglected Tropical Diseases 12 (1)
Public Library of Science
There is clear empirical evidence that environmental conditions can influence Ascaris spp. free-living stage development and host reinfection, but the impact of these differences on human infections, and on interventions to control them, is variable. A new model framework reflecting four key stages of the A. lumbricoides life cycle, incorporating the effects of rainfall and temperature, is used to describe the level of infection in the human population alongside the environmental egg dynamics. Using data from South Korea and Nigeria, we conclude that settings with extreme fluctuations in rainfall or temperature could exhibit strong seasonal transmission patterns that may be partially masked by the longevity of A. lumbricoides infections in hosts; we go on to demonstrate how seasonally timed mass drug administration (MDA) could impact the outcomes of control strategies. For the South Korean setting, the results predict a comparative decrease of 74.5% in mean worm days (the number of days the average individual spends infected with worms across a 12-month period) between the best and worst MDA timings after four years of annual treatment. The model found no significant seasonal effect on MDA in the Nigerian setting, due to a narrower annual temperature range and no rainfall dependence. Our results suggest that seasonal variation in egg survival and maturation could be exploited to maximise the impact of MDA in certain settings. © 2018 Davis et al.
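The "mean worm days" metric described in this abstract can be sketched numerically. The snippet below is a toy illustration only (the burden trajectories and values are invented, not output of the published model): it sums the population-average daily worm burden over a year and compares two hypothetical MDA timings.

```python
# Toy illustration of the "mean worm days" metric: the sum over a
# 12-month period of the average individual's daily worm burden.
# All numbers below are invented for demonstration purposes.

def mean_worm_days(daily_mean_burden):
    """Sum of the population-average worm burden over each day of a year."""
    return sum(daily_mean_burden)

# Two hypothetical 365-day burden trajectories, representing
# well-timed vs poorly timed MDA in a seasonal setting.
best_timing = [2.0] * 100 + [0.5] * 265   # MDA delivered before peak transmission
worst_timing = [2.0] * 100 + [1.8] * 265  # MDA delivered after peak transmission

mwd_best = mean_worm_days(best_timing)
mwd_worst = mean_worm_days(worst_timing)

# Comparative decrease between the worst and best timing, as a percentage
relative_decrease = 100 * (mwd_worst - mwd_best) / mwd_worst
```

The published 74.5% figure for South Korea is the analogous comparison computed from the fitted transmission model, not from a simple fixed trajectory like this one.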
Measles is a target for elimination in all six WHO regions by 2020, and over the last decade, there has been considerable progress towards this goal. Surveillance is recognised as a cornerstone of elimination programmes, allowing early identification of outbreaks, thus enabling control and preventing re-emergence. Fever-rash surveillance is increasingly available across WHO regions, and this symptom-based reporting is broadly used for measles surveillance. However, as measles control increases, symptom-based cases are increasingly likely to reflect infection with other diseases with similar symptoms, such as rubella, which affects the same populations and can have a similar seasonality. The WHO recommends that cases from suspected measles outbreaks be laboratory-confirmed to identify 'true' cases, corresponding to measles IgM titres exceeding a threshold indicative of infection. Although serological testing for IgM has been integrated into the fever-rash surveillance systems of many countries, the logistics of sending in every suspected case are often beyond the health system's capacity. We show how age data from serologically confirmed cases can be leveraged to infer the status of untested samples, thus strengthening the information we can extract from symptom-based surveillance. Applying an age-specific confirmation model to data from three countries with divergent epidemiology across Africa, we identify the proportion of cases that need to be serologically tested to achieve target levels of accuracy in estimated infected numbers, and discuss how this varies depending on the epidemiological context. Our analysis provides an approach to refining estimates of incidence by leveraging all available data, which has the potential to improve allocation of resources and thus contribute to rapid and efficient control of outbreaks.
© Cambridge University Press 2018. This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
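The core idea of the age-specific confirmation approach above can be sketched in a few lines. This is a minimal illustration with invented counts and age groups, not the authors' statistical model: estimate the probability of serological confirmation per age group from the tested subset, then apply those probabilities to the age distribution of untested suspected cases.

```python
# Minimal sketch of age-specific confirmation inference (all counts invented):
# estimate P(confirmed | age group) from serologically tested cases, then
# impute the expected number of true cases among untested suspected cases.

# Tested suspected cases: age_group -> (n_tested, n_confirmed)
tested = {"<5": (200, 160), "5-15": (150, 60), ">15": (100, 10)}

# Untested suspected cases by age group
untested = {"<5": 400, "5-15": 300, ">15": 100}

# Age-specific confirmation probability from the tested subset
p_confirm = {age: c / n for age, (n, c) in tested.items()}

# Expected true cases among the untested, plus the observed confirmed cases
expected_untested = sum(p_confirm[a] * n for a, n in untested.items())
total_estimate = expected_untested + sum(c for _, c in tested.values())
```

The published model additionally accounts for uncertainty in the estimated confirmation probabilities, which is what drives the trade-off between testing effort and accuracy discussed in the abstract.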
Stolk W.A., Prada Joaquin, Smith M.E., Kontoroupis P., De Vos A.S., Touloupou P., Irvine M.A., Brown P., Subramanian S., Kloek M., Michael E., Hollingsworth T.D., De Vlas S.J. (2018) Are alternative strategies required to accelerate the global elimination of lymphatic filariasis? Insights from mathematical models, Clinical Infectious Diseases 66 pp. S260-S266
Oxford University Press
Background. With the 2020 target year for elimination of lymphatic filariasis (LF) approaching, there is an urgent need to assess how long mass drug administration (MDA) programs with annual ivermectin + albendazole (IA) or diethylcarbamazine + albendazole (DA) would still have to be continued, and how elimination can be accelerated. We addressed this using mathematical modeling. Methods. We used 3 structurally different mathematical models for LF transmission (EPIFIL, LYMFASIM, TRANSFIL) to simulate trends in microfilariae (mf) prevalence for a range of endemic settings, both for the current annual MDA strategy and alternative strategies, assessing the required duration to bring mf prevalence below the critical threshold of 1%. Results. Three annual MDA rounds with IA or DA and good coverage (≥65%) are sufficient to reach the threshold in settings that are currently at mf prevalence
Mathematical models are increasingly being used to evaluate strategies aiming to achieve the control or elimination of parasitic diseases. Recently, owing to a growing realization that process-oriented models are useful for ecological forecasts only if the biological processes are well defined, attention has focused on data assimilation as a means to improve the predictive performance of these models.
Background: Co-infection with multiple soil-transmitted helminth (STH) species is common in communities with a high STH prevalence. The life histories of STH species share important characteristics, particularly in the gut, and there is the potential for interaction, but evidence on whether interactions may be facilitating or antagonistic is limited. Methods: Data from a pretreatment cross-sectional survey of STH egg deposition in a tea plantation community in Sri Lanka were analysed to evaluate patterns of co-infection and changes in egg deposition. Results: There were positive associations between Trichuris trichiura (whipworm) and both Necator americanus (hookworm) and Ascaris lumbricoides (roundworm), but N. americanus and Ascaris were not associated. N. americanus and Ascaris infections had lower egg deposition in single infections than in co-infections. There was no clear evidence of a similar effect of co-infection on Trichuris egg deposition. Conclusions: Associations in prevalence and egg deposition in STH species may vary, possibly indicating that the effects of co-infection are species dependent. We suggest that between-species interactions that differ by species could explain these results, but further research in different populations is needed to support this theory. © The Author(s) 2018. Published by Oxford University Press on behalf of the Royal Society of Tropical Medicine and Hygiene.
Introduction All six WHO regions currently have goals for measles elimination by 2020. Measles vaccination is delivered via routine immunization programmes, which in most sub-Saharan African countries reach children around 9 months of age, and supplementary immunization activities (SIAs), which target a wider age range at multi-annual intervals. In the absence of endemic measles circulation, the proportion of individuals susceptible to measles will gradually increase through accumulation of new unvaccinated individuals in each birth cohort, increasing the risk of an epidemic. The impact of SIAs, and the financial investment they require, depend on coverage and target age range. Materials and methods We evaluated the impact of target population age range for periodic SIAs, evaluating outcomes for two different levels of coverage, using a demographic and epidemiological model adapted to reflect populations in 4 sub-Saharan African countries. Results We found that a single SIA can maintain elimination over short time-scales, even with low routine coverage. However, maintaining elimination for more than a few years is difficult, even with large (high coverage/wide age range) recurrent SIAs, due to the build-up of susceptible individuals. Across the demographic and vaccination contexts investigated, expanding SIAs to target individuals over 10 years of age did not significantly reduce outbreak risk. Conclusions Elimination was not maintained in the contexts we evaluated without a second opportunity for vaccination. In the absence of an expanded routine program, SIAs provide a powerful option for providing this second dose. We show that a single high-coverage SIA can deliver most key benefits in terms of maintaining elimination, with follow-up campaigns potentially requiring smaller investments. This makes post-campaign evaluation of coverage increasingly relevant to correctly assess future outbreak risk. © 2017 The Author(s)
Background: In this study, two traits related to resistance to gastrointestinal nematodes (GIN) were measured in 529 adult sheep: faecal egg count (FEC) and activity of immunoglobulin A in plasma (IgA). In dry years, FEC can be very low in semi-extensive systems, such as the one studied here, which makes identifying animals that are resistant or susceptible to infection a difficult task. A zero-inflated negative binomial (ZINB) model was used to calculate the extent of zero inflation for FEC; the model was extended to include information from the IgA responses. Results: In this dataset, 64% of animals had zero FEC, while the ZINB model suggested that 38% of sheep had not been recently infected with GIN. Therefore, 26% of sheep were predicted to be infected animals with egg counts that were zero or below the detection limit, and likely to be relatively resistant to nematode infection. IgA activities of all animals were then used to decide which of the sheep with zero egg counts had been exposed and which had not been recently exposed. Animals with zero FEC and high IgA activity were considered resistant, while animals with zero FEC and low IgA activity were considered not recently infected. For the animals considered to have been exposed to infection, the correlations among the studied traits were estimated, and the influence of these traits on the discrimination between unexposed and infected animals was assessed. Conclusions: The model presented here improved the detection of infected animals with zero FEC. The correlations calculated here will be useful in developing a reliable index of GIN resistance, which could assist the study of host resistance under natural infection, especially in adult sheep, as well as the design of breeding programs aimed at increasing resistance to parasites. © 2016 The Author(s).
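The distinction that the ZINB model draws between "unexposed" zeros and "infected but below detection" zeros can be sketched with a small mixture calculation. This is a toy example with assumed parameter values (the zero-inflation proportion and negative binomial parameters below are not the estimates fitted in the study), using scipy's negative binomial distribution:

```python
from scipy.stats import nbinom

# Toy sketch: posterior probability that a zero FEC reflects no recent
# infection (a "structural" zero) rather than an infected animal whose
# egg count fell below the detection limit. Parameter values are assumed
# for illustration, not taken from the study.

pi_zero = 0.38      # assumed proportion of structural zeros (unexposed animals)
n, p = 0.5, 0.05    # assumed negative binomial parameters for FEC in infected animals

# P(FEC = 0) under the negative binomial component (infected, zero count)
p_zero_nb = nbinom.pmf(0, n, p)

# Bayes' rule: probability a zero-count animal is truly unexposed
p_unexposed_given_zero = pi_zero / (pi_zero + (1 - pi_zero) * p_zero_nb)

# An independent IgA measurement can then refine the classification:
# a zero-FEC animal with high IgA is more plausibly exposed and resistant.
```

This mirrors the study's logic of first quantifying zero inflation from the count data alone, then using IgA as the extra signal to separate resistant animals from genuinely unexposed ones.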