Although the theory behind herd immunity had been developed decades earlier, the development of vaccines, coupled with advances in mathematical modeling in epidemiology, found a new synergy in a later paper (9). Four years earlier, the World Health Organization had declared its intent to eradicate smallpox within 10 years, and the U.S. Public Health Service had declared its intent to eliminate measles from the United States within 1 year. Both of these tasks were theoretically to be achieved by the induction of herd immunity with vaccines.
The year 1976 saw the beginning of flexible computing in public health. To address the swine flu crisis (11), an auditorium at CDC was filled with epidemiologists and a Digital Equipment PDP-11 minicomputer the size of a large refrigerator. In the years that followed, public health saw an expanded emphasis on statistical methods and greater statistical sophistication among epidemiologists and analysts.
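The herd immunity idea in these models reduces to a simple threshold: in a basic compartmental model, transmission can no longer sustain itself once the immune fraction of the population exceeds 1 − 1/R0, where R0 is the basic reproduction number. A minimal sketch (the R0 values shown are illustrative textbook ranges, not figures from this article):

```python
# Herd immunity threshold from the basic reproduction number R0.
# In simple compartmental models, an epidemic cannot sustain itself
# once the immune fraction of the population exceeds 1 - 1/R0.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt spread."""
    if r0 <= 1:
        return 0.0  # with R0 <= 1, transmission dies out on its own
    return 1.0 - 1.0 / r0

# Illustrative R0 values (commonly quoted ranges, not from this article):
for disease, r0 in [("measles", 15.0), ("smallpox", 6.0)]:
    print(f"{disease}: immunize at least {herd_immunity_threshold(r0):.0%}")
```

The high threshold implied by measles' large R0 is exactly why elimination demanded near-universal vaccination.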
Punched cards were gradually replaced by magnetic tape as the primary means for data storage as better computers became available (Figure 2). Punched cards were still commonly used for data entry and programming at CDC until the combination of lower-cost magnetic disk storage and affordable interactive terminals on less expensive minicomputers made them obsolete.
However, their influence persists through many standard conventions and file formats. For example, the terminals that replaced the mainframe card readers displayed 80 columns of text, the same number of columns as a punched card.
Epidemiology and Medical Statistics, Volume 27
In this investigation of typhoid fever in Michigan, the model was unable to identify risk associated with any food item because of a small number of cases and little variation in food-consumption patterns. Since this first use, logistic regression has become a standard technique in public health and has contributed to policy formulation in many areas. For example, the results from a logistic regression analysis were used to implement a requirement that tobacco-control programs should include opportunities for community participation and interaction for maximal impact.
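To give a flavor of the technique, logistic regression on a single binary food-exposure variable recovers the familiar odds ratio from a 2×2 table. The sketch below uses invented case-control counts and a plain gradient-ascent fit rather than any particular statistical package:

```python
import math

# Invented case-control data: y = ill (1) or well (0),
# x = ate a given food item (1) or not (0).
# 2x2 table: exposed ill 30, exposed well 15, unexposed ill 10, unexposed well 45,
# so the table odds ratio is (30 * 45) / (15 * 10) = 9.0.
data = [(1, 1)] * 30 + [(1, 0)] * 10 + [(0, 1)] * 15 + [(0, 0)] * 45

def fit_logistic(data, lr=1.0, epochs=2000):
    """Fit P(ill) = sigmoid(b0 + b1 * x) by gradient ascent on the log-likelihood."""
    b0 = b1 = 0.0
    n = len(data)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for y, x in data:
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient w.r.t. intercept
            g1 += (y - p) * x    # gradient w.r.t. exposure coefficient
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

b0, b1 = fit_logistic(data)
odds_ratio = math.exp(b1)  # should match the 2x2 table odds ratio
```

With a single binary predictor the maximum-likelihood estimate of exp(b1) equals the table odds ratio exactly; the regression framework earns its keep when several exposures must be adjusted for simultaneously.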
To address these problems, statistical scientists adapted methods from correlation analysis (18) and developed a technique known as back-calculation. The approach relies on the distribution of the incubation period from HIV infection to AIDS diagnosis; this incubation distribution must be estimated from cohort studies.
On the basis of these data, back-calculation methods provide estimates of the number of persons infected with HIV during each month or calendar quarter necessary to account for the number of persons in whom AIDS has been diagnosed during those same periods. The number of persons in whom AIDS will be diagnosed in the future can then be projected from the estimated HIV epidemic curve and the incubation-period distribution. Another complicating development was the introduction and widespread use of pharmacotherapy (zidovudine), which altered the incubation distribution. With the increasing availability of microcomputers, CDC epidemiologists first began using computers during field investigations, but no user-friendly software existed for the purpose.
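The core of back-calculation is that observed diagnoses are the convolution of past infections with the incubation-period distribution; inverting that relationship yields the infection curve. A deliberately simplified sketch with invented counts and an assumed incubation distribution (real applications must estimate infections statistically from noisy counts, typically with smoothing or EM-type methods):

```python
# Illustrative incubation distribution: P(diagnosis k quarters after infection).
incubation = [0.05, 0.15, 0.30, 0.30, 0.20]

def expected_diagnoses(infections, pmf):
    """Convolve the infection curve with the incubation distribution."""
    n = len(infections)
    return [sum(infections[s] * pmf[t - s]
                for s in range(max(0, t - len(pmf) + 1), t + 1))
            for t in range(n)]

def back_calculate(diagnoses, pmf):
    """Invert the convolution by forward substitution (requires pmf[0] > 0)."""
    infections = []
    for t, d in enumerate(diagnoses):
        # Diagnoses at time t already explained by earlier infections:
        known = sum(infections[s] * pmf[t - s]
                    for s in range(max(0, t - len(pmf) + 1), t))
        infections.append((d - known) / pmf[0])
    return infections

# Round trip on invented quarterly infection counts:
true_infections = [100, 150, 220, 300, 260, 180]
diagnoses = expected_diagnoses(true_infections, incubation)
recovered = back_calculate(diagnoses, incubation)
```

The exact inversion shown here is unstable in the presence of reporting noise, which is why the statistical versions of back-calculation were a genuine methodological advance rather than simple arithmetic.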
To remedy this problem, CDC began development of Epi Info, a general-purpose computer program that could be used for epidemic investigations and surveillance (Table). Early versions of Epi Info were used in field investigations on large "luggable" computers (23) (Figure 3). The widespread distribution of Epi Info and the responsiveness of its developers to the needs of epidemiologists in the field drove the application of statistical methods in field investigations throughout the world. Add to this countless other citations in reports not indexed, and the impact of its development on the field of statistics is apparent.
In addition, Epi Info aided in early efforts to coordinate surveillance activities to reduce the workload of state health departments. During this period, statistical methods for surveillance also advanced. The availability of forecasting by time series methods augmented previous regression results (26). An investigation in response to food poisoning in Peru was the first documented field investigation to implement a time series analysis (CDC, unpublished data). Use of these methods was aided by the availability of computers that allowed computations to be conducted in a reasonable amount of time.
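A minimal illustration of forecasting surveillance counts: fit a least-squares trend to recent observations and project it forward. The weekly counts are invented, and this is far simpler than the time series methods (e.g., Box-Jenkins models) that such investigations actually used:

```python
def linear_trend_forecast(counts, ahead=1):
    """Ordinary least squares of count on time index; project `ahead` periods."""
    n = len(counts)
    t_mean = (n - 1) / 2
    y_mean = sum(counts) / n
    sxy = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(counts))
    sxx = sum((t - t_mean) ** 2 for t in range(n))
    slope = sxy / sxx
    intercept = y_mean - slope * t_mean
    return [intercept + slope * (n - 1 + h) for h in range(1, ahead + 1)]

# Invented weekly case counts with a rising trend:
weekly_cases = [12, 15, 14, 18, 21, 19, 24, 26]
print(linear_trend_forecast(weekly_cases, ahead=2))
```

Even this crude projection shows why computing power mattered: repeating such fits across dozens of conditions and jurisdictions by hand was impractical.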
More broadly, methods were developed to investigate changes in patterns of surveillance data to aid in epidemic detection and control. This development was further aided when the National Center for Health Statistics became part of CDC and brought its expertise in vital statistics and surveys. Innovations continued in such areas as the detection of statistical aberrations and changes in patterns of data reported over time. A Symposium on Statistics in Surveillance (34) became the foundation for ongoing CDC symposia on the statistics of cluster investigations (35), statistics for rare events and small areas (36), statistics as a basis for public health decisions (37), emerging statistical issues (38), complicated designs and data structures (39), methods for decisions under uncertainty (40), methods for addressing health inequities (41), and use of multisource data. Over time, these symposia were accompanied by short courses to educate the public health community about statistical methods. In addition, CDC began giving awards for outstanding statistical work with public health impact (Figure 4).
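Aberration detection can be as simple as flagging a current count that exceeds its historical baseline by more than a fixed number of standard deviations, in the spirit of historical-limits methods. The counts and threshold below are illustrative:

```python
import statistics

def is_aberration(baseline, current, z=2.0):
    """Flag `current` if it exceeds the baseline mean by more than z standard deviations."""
    mean = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)  # sample standard deviation
    return current > mean + z * sd

# Invented counts from comparable past periods:
baseline = [10, 12, 9, 11, 13, 10, 12, 11]
print(is_aberration(baseline, 19))  # well above the historical limit
print(is_aberration(baseline, 12))  # within normal variation
```

Production systems refine this idea with seasonal baselines and adjustments for reporting delay, but the exceedance logic is the same.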
Despite considerable achievements in reducing smoking prevalence as the 20th century closed, tobacco use remained responsible for one of every five U.S. deaths. The National Youth Tobacco Survey measured the tobacco-related beliefs, attitudes, and behaviors of youth and was the first to gather data from both high school and middle school students.
Findings were used to design strategies for youth-focused antitobacco campaigns. Economic methods were also used to measure smoking-attributable costs. In 1991, Anderson and May published Infectious Diseases of Humans (47), documenting their work in mathematically modeling the transmission of infectious diseases, which was critically important to understanding the ongoing work in fighting the global HIV epidemic, as well as malaria and tuberculosis.
Subsequent work on modeling diseases has been used to monitor and model the impact of influenza outbreaks.
During this period, laboratory techniques improved enough that strains of viruses could be mapped and linked to the epidemiologic investigation. Although today the consequences of unhealthy dietary choices, sedentary lifestyles, and "supersized" food portions are familiar, during the late 1990s their potential for harm was underestimated. Research published at the time documented the nation's rapidly increasing obesity rates across the United States. CDC responded through VERB, an innovative and expansive campaign based on behavioral science theory and contemporary principles of marketing, which produced measurable positive results. Once again, CDC epidemiologists were using statistical analytic methods that had previously been used in other disciplines.
For example, Bayesian methods used by businesses and marketers to model personal and community decision-making preferences (50), as well as cluster analysis and market-segmentation methods, were being used to inform health interventions and the evaluation of health programs. Statistical methods for longitudinal analysis and mixed models, used commonly in social research, also contributed to the evaluation of results. Likewise, capture-recapture analysis, a method developed for studies in the biological sciences, was adapted for evaluating surveillance systems (53). This method facilitated the estimation of the total number of cases from two surveillance sources, each of which might be incomplete.
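The two-source case can be illustrated with the classic Lincoln-Petersen estimator, shown here with Chapman's small-sample correction, which keeps the estimate finite even when the overlap between sources is tiny. The counts are invented:

```python
def chapman_estimate(n1, n2, m):
    """Estimate total cases from two incomplete sources.

    n1, n2: cases found by each surveillance source; m: cases found by both.
    Chapman's correction of the Lincoln-Petersen estimator.
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Source A found 120 cases, source B found 90, and 60 appear in both lists,
# so both systems together still miss an estimated ~10 cases.
total = chapman_estimate(120, 90, 60)
```

The simple (uncorrected) Lincoln-Petersen estimate for these counts would be 120 × 90 / 60 = 180; Chapman's version gives a nearly identical but less biased figure. The method assumes the two sources capture cases independently, an assumption that real surveillance evaluations must scrutinize.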
In response to the terrorism events of 2001, statisticians began to develop methods for use in defense and national security. The rise of spatial statistics and geographic information systems meant that epidemiologists could better map prevalence data to reveal gaps in response or the impact of disease or injury. Economic data could be mapped for use in cost-effectiveness studies, and overlaying data types (prevalence, economic costs, demographics) could be used for better decision making and for evaluation of programs.
Mapping seemed to have come full circle since John Snow's investigation of cholera outbreaks. Many of the techniques of spatial analysis depend on statistical measures and methods, including univariate statistical measures and directional analysis. Additionally, statistical methods have been developed to address the specific needs of spatial datasets. The nature of these extensions differs from the ways in which multivariate statistics are derived from their univariate counterparts because of concepts of distance, direction, contiguity, and scale.
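Two of the univariate spatial measures in question, the mean center and standard distance of case locations, can be computed directly. The coordinates below are illustrative projected x/y values (e.g., kilometers), not real case data:

```python
import math

def mean_center(points):
    """Centroid of case locations: the spatial analogue of the mean."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def standard_distance(points):
    """Root-mean-square distance of cases from the mean center:
    the spatial analogue of the standard deviation."""
    cx, cy = mean_center(points)
    n = len(points)
    return math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points) / n)

# Invented case locations in projected coordinates:
cases = [(1.0, 2.0), (2.0, 3.0), (3.0, 2.0), (2.0, 1.0)]
```

Comparing mean centers and standard distances across time periods is one simple way to see whether an outbreak's footprint is shifting or spreading, before reaching for heavier spatial machinery.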
In the future, epidemiologists will continue to pursue new statistical techniques that can increase the impact of their analyses on public health. For example, the coming decades might bring innovations in new data-collection modalities, along with the large body of methods needed to make use of them. However, the use of these new technologies also comes with challenges.
For example, the introduction of parallel sequencing technologies (58) has led to an exponential increase in the amount of available DNA sequence information for epidemiologic investigations. Because sequence data are now produced faster than they can be meaningfully analyzed, developing new approaches to the analysis of this information is one of the most important recent challenges for epidemiologists, bioinformaticians, and statisticians. Beyond methods to carefully sample and organize the massive amount of data, challenges include development of quantitative methods and models to estimate errors for the various sequencing platforms; algorithms and mathematical estimates of the reliability of genomes assembled from short, gapped reads; approaches to distinguish sequence-determination errors from biological polymorphism and mutation; and means to distinguish among multiple genomes within a single dataset, particularly when the relative sizes of those genomes vastly differ.