Book Reviews

Pages 404-412 | Published online: 24 Jul 2017

Stan Lipovetsky

GfK North America, Minneapolis

Advances in Sequence Analysis: Theory, Methods, Applications, by Philippe Blanchard, Felix Bühlmann, and Jacques-Antoine Gauthier. New York: Springer, Life Course Research and Social Policies 2, 2014, xiii + 304 pp., $129.00 (HB), ISBN: 978-3-319-04968-7; $99.00, ISBN: 978-3-319-04969-4 (eBook).

This book consists of 15 articles contributed by 24 sociologists and is devoted to sequence analysis (SA). The monograph is divided into four topics. It begins with an introductory chapter that describes what sequence analysis is, followed by an explanation of the four topics. Topic one, covered in four articles, is "how to compare sequences"; topic two, addressed by three articles, deals with "life course sequences"; topic three, consisting of three articles, deals with "political sequences"; and the last topic, covered in four articles, deals with "visualization of sequences and their use for survey research." Grounded in statistics, SA is the formalized analysis of successions of states or events. Specifically, SA is a subbranch of the social sciences that deals with topics such as understanding trajectories of cohabitation and housing, occupational careers or crucial transitions (from school to work or from employment to retirement), the historical evolution of political institutions, and the historical life course.

In contrast with standard statistical analysis, SA compares chronological sequences of states within a holistic conceptual model rather than treating observations over time as independent; it accounts for both individual and structural dynamics. The aim of SA is to enhance, in a broad sense, the understanding of social processes. The authors were inspired by a European conference that revealed an international adoption of the method. The intention of the monograph is to inform young scholars about the methods of SA, with the expectation of expanding SA to users with diverse views, including sociologists, historians, political scientists, and specialists from other social sciences who use longitudinal data. Some of the variants of SA are based on questions such as "what have you done for us lately?" or variations of the same question, such as "what have you done in the last 14 years?" and so on.

Among the specialized points brought up in the monograph is how SA fits into a motif. Subsequently, optimal matching analysis (OMA) is introduced to show how insertion, deletion, and substitution operations are used in the social sciences. In this context, variants are discussed, including the dynamic Hamming distance and measures of heterogeneity over time. In addition, optimal matching metrics are introduced to measure distances between sequences.
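
To make the edit-distance idea behind optimal matching concrete, here is a minimal Python sketch assuming unit insertion/deletion costs and a constant substitution cost; the state labels and cost values are illustrative choices, not taken from the book, and real OMA implementations allow data-driven substitution-cost matrices.

# Minimal sketch of an optimal-matching (edit) distance between two
# state sequences, assuming unit indel costs and a constant substitution
# cost. The states and costs below are hypothetical illustrations.

def om_distance(seq_a, seq_b, indel=1.0, sub=2.0):
    """Dynamic-programming optimal-matching distance between two sequences."""
    n, m = len(seq_a), len(seq_b)
    # dp[i][j] = cost of aligning seq_a[:i] with seq_b[:j]
    dp = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * indel
    for j in range(1, m + 1):
        dp[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost_sub = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub
            dp[i][j] = min(dp[i - 1][j] + indel,         # deletion
                           dp[i][j - 1] + indel,         # insertion
                           dp[i - 1][j - 1] + cost_sub)  # substitution / match
    return dp[n][m]

if __name__ == "__main__":
    # Hypothetical yearly states: S = in school, E = employed, U = unemployed
    traj_1 = list("SSEEEEUU")
    traj_2 = list("SSSEEEEE")
    print(om_distance(traj_1, traj_2))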

The authors continue by providing algorithms for determining similarities between a pair of sequences, known as the optimal matching (OM) algorithm. Within this framework, localization, duration, adjustment, implementation, and time warping are considered. As an application, the field of family formation is examined; in this example, a sequential methodology is used to link theory and empirical applications in life course research.

Another application of SA discussed in the monograph is in developmental psychology, linking young people's educational and employment activities to psychological resources. In addition, the marriage and household formation process is analyzed from an original perspective in Chapter 8.

The part on political sequences begins with the mobilization of prosopographic methods for studying the Holocaust, introducing trajectories of persecution. Classifying and explaining various aspects of these trajectories are among the elements of the chapter. The next example, covered in Chapter 10, deals with electoral participation as a product of social environments. The causes and consequences of democratization form another subject covered here; the author addresses the nature of political sequences by treating prior regime histories as principal determinants of democratization.

Visualization of sequences and their use for survey research is the subject of the last portion of the book. Displaying time-series marginal distributions of life phenomena is a key way to capture dynamic behaviors. To this end, converting sequences into objects, namely network graphs, is a key approach for exploring how life phenomena evolve. SA has also benefitted from its holistic approach, as opposed to event-history modeling. In conclusion, since surveys have become a fundamental tool for studying the social sciences and participation may be a problem (growing levels of nonresponse), the authors propose to re-establish a sociological approach to survey participation; in this case, the root causes of nonresponse can be adjusted for according to social characteristics.
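
The idea of turning sequences into network objects can be illustrated with a short sketch: count the observed transitions between consecutive states across a cohort and treat them as weighted edges of a graph. The states and cohort below are hypothetical, not examples from the book.

# Sketch: convert state sequences into a weighted transition graph,
# one way of turning sequences into network objects for visualization.
# States and sequences below are hypothetical examples.
from collections import Counter

def transition_edges(sequences):
    """Return a Counter mapping (state_from, state_to) -> observed count."""
    edges = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            edges[(a, b)] += 1
    return edges

if __name__ == "__main__":
    cohort = [list("SSEEU"), list("SEEEE"), list("SSEUE")]
    for (a, b), w in sorted(transition_edges(cohort).items()):
        print(f"{a} -> {b}: {w}")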

In summary, this book provides a global view of sequence analysis, the statistical assessment of successions of states or events. It includes contributions on various applications in life course studies. It is an attempt to present in detail a set of statistical tools and statistical facts that emerged from empirical studies of life courses, including employment, careers, and political trajectories. I believe that this monograph reassesses the use of sequences in the way they are collected, represented, and processed. The articles are well written and cover material that is very useful for anyone to know. The book does not provide sophisticated statistical methodologies, so I would recommend it for general reading rather than as a methodological reference.

Stergios B. Fotopoulos

Washington State University

Sequential Analysis: Hypothesis Testing and Change-point Detection, by Alexander Tartakovsky, Igor Nikiforov, and Michèle Basseville. Boca Raton, FL: Chapman & Hall/CRC Monographs on Statistics and Applied Probability, 2015, xxiii + 579 pp., $101.75 (HB), ISBN: 978-1-4398-3820-4.

The first theoretical studies that followed Wald's (1949) monograph on sequential analysis established methods that are much faster, requiring fewer observations, than equally reliable tests based on a predetermined number of observations. This work was continued by Page (1954), who established sequential detection procedures based on the famous cumulative sum (CUSUM) algorithm. In the last two to three decades, general stochastic models have been successfully used to explore new developments of the known sequential tests, matrix versions of the same type of tests suitable for multiple decision problems, and CUSUM and Shiryaev–Roberts change detection algorithms for both single and multichannel cases. Tartakovsky et al. take up a major theme of sequential theory, first introducing Markov times, basic results on Brownian motion, Itô's stochastic integral and differential equations, point processes, renewal and nonlinear renewal theory for random walks, and distributional properties of stopping times and overshoots. They go on to present core material in sequential theory, including sequential decision rules, minimax rules, and some general hypothesis testing criteria. Using this material as a basis, they develop rigorous methodologies for sequential hypothesis testing and change-point detection, together with their applications; the material on these last topics is drawn mostly from recent articles by the authors and their coauthors. The monograph provides an overview of the sequential hypothesis testing and quickest change-point detection literature as well as specialized coverage of its title topic.

The principal object of Chapter 3 is sequential hypothesis testing, that is, testing which of two densities is the true one. The procedures show that, as observations arrive, knowledge of the true state of the process becomes more refined, at which point one decides whether or not more data are needed to make the final decision. Here, the authors review the most recent developments of the sequential probability ratio test (SPRT) and establish asymptotic optimality under various conditions, including iid and general non-iid scenarios. The ingredients exploited in this chapter are Stein's lemma, ideas from random walks such as ladder heights and epochs, and the Wiener–Hopf decomposition for both the discrete and continuous cases. The results are applied to the Gaussian case, integral equations are presented, and numerical techniques are used for performance evaluation. Bounds and asymptotic approximations for the operating characteristic (OC) and the expected sample size (ESS) are also given. In addition, the SPRT optimality problem is discussed and analyzed using a Bayesian procedure. Sequential hypothesis testing is continued in Chapter 4 for multiple simple hypotheses. Here, a classical classification-type setting is formulated: multiple hypotheses are introduced, and one of the corresponding densities needs to be identified as the true one. Again, log-likelihood ratios (LLR) are formed, and the given densities are compared against a dominating measure that may or may not be one of the corresponding densities. As in Chapter 3, asymptotic weak and strong first- and second-order optimality properties, for either the symmetric or asymmetric case in the iid and non-iid settings, are discussed and formulated for the multiple SPRT (MSPRT). Finally, the sequential hypothesis testing part finishes with composite hypotheses. In this case, the Kiefer–Weiss problem is considered, discussing tests that are asymptotically optimal at an intermediate point. Further, mixture-based sequential LR tests are presented, and minimax open-ended sequential tests for the exponential family and discrete spaces are analyzed.
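
The SPRT logic described above can be sketched in a few lines: accumulate the log-likelihood ratio observation by observation and stop once it crosses Wald's approximate thresholds. The Gaussian-mean hypotheses, error rates, and simulated data below are illustrative assumptions, not values taken from the book.

# Minimal sketch of Wald's SPRT for two simple hypotheses on the mean of
# unit-variance Gaussian observations, using Wald's approximate thresholds.
# mu0, mu1, alpha, beta, and the data stream are illustrative assumptions.
import math
import random

def sprt(stream, mu0=0.0, mu1=1.0, alpha=0.05, beta=0.05):
    """Return ('H0' or 'H1', number of observations used)."""
    a = math.log(beta / (1.0 - alpha))   # lower (accept H0) threshold
    b = math.log((1.0 - beta) / alpha)   # upper (accept H1) threshold
    llr, n = 0.0, 0
    for x in stream:
        n += 1
        # log-likelihood-ratio increment for N(mu1, 1) vs N(mu0, 1)
        llr += (mu1 - mu0) * x - 0.5 * (mu1**2 - mu0**2)
        if llr <= a:
            return "H0", n
        if llr >= b:
            return "H1", n
    return "undecided", n

if __name__ == "__main__":
    random.seed(0)
    data = (random.gauss(1.0, 1.0) for _ in range(10_000))  # data generated under H1
    print(sprt(data))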

Optimality criteria and a number of Bayesian and non-Bayesian methodologies for quickest change-point detection are offered in Chapters 5–9. The change-point detection methods presented are online in nature, and the detection procedures are always subject to a tolerance limit on the risk. The authors provide information on how to identify the time at which the process undergoes an abrupt or gradual change of state. Again, in the sequential setup, observations are obtained one at a time and, on condition that the process remains in its initial target state, one lets the process continue. However, if the state changes and becomes abnormal, then one is interested in detecting that a change is in effect. Thus, upon the arrival of a new observation, one is faced with the question of whether to let the process continue or to stop and raise an alarm. Toward this end, classical and contemporary methodologies are presented for detecting changes in the behavior of a process when the change-point is either unknown and constant or random. False alarms and minimization of the expected average loss associated with the detection delay are some of the topics considered in this part of the monograph. The techniques employed here are of Bayesian form; some of the ingredients consistently used are the minimax criterion and multicyclic detection of a distant change in a stationary regime. The monograph also covers many cases with multiple hypotheses about the postchange distribution. Changes related to the signal or system mean value, known as additive changes, occupy the majority of the monograph; however, nonadditive or spectral changes are also included, involving changes in the variance, correlation, spectral characteristics, or dynamics of the signal or system. The statistics formulated are of LR type and of recurrent form. Many of the performance measures presented include the integral average detection delay (IADD), the average run length (ARL), the average run length to false alarm (ARL2FA), Lorden's risk or the worst-case ADD (ESADD(T)), the supremum ADD (SADD(T)), and the stationary ADD (STADD(T)). Operating characteristics of generic detection procedures involve stopping times, which in turn lead to several integral equations. Thus, on condition that the prior distribution of the change-point is geometric and that the Kullback–Leibler distance is finite, integral equations in terms of the stopping times are analyzed, from which asymptotic approximations of first and higher order are computed for the ADDs. Numerous asymptotic results under false alarm probability constraints are also given for both simple and composite hypotheses. Comparative studies of the Shiryaev procedure against other procedures in a Bayesian context are demonstrated in Chapter 7. In Chapter 8, the monograph continues with change-point detection using non-Bayesian approaches. Based on the LR and given threshold constants, stopping rules are established; then the ARL, the ARL2FA, and various forms of the ADD are computed, from which optimal sample sizes and optimal threshold values are determined. Emphasis is given to the CUSUM procedure and Wald's approximations. Illustrations are presented in the Gaussian case, Siegmund's corrected Brownian motion approximation is stated, and various ARL expressions are introduced and compared. Finally, Chapter 9 covers multichart change-point detection algorithms for composite hypotheses and multipopulation models. Quickest detection of unstructured changes in multiple populations is treated using a linear Gaussian model, χ2 detection schemes, ε-optimal multichart tests, and the linear regression model. The book completes its coverage in Part III, where interesting applications are introduced.
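
To make the quickest-detection setting concrete, here is a minimal sketch of the CUSUM recursion for a known mean shift in Gaussian observations; the pre- and post-change means, threshold, and simulated change-point are illustrative assumptions rather than values from the book.

# Minimal sketch of the CUSUM change-point detection recursion for a
# known mean shift in unit-variance Gaussian data. The means, threshold h,
# and simulated change-point are illustrative assumptions.
import random

def cusum_alarm(stream, mu0=0.0, mu1=1.0, h=5.0):
    """Return the index at which the CUSUM statistic first exceeds h, else None."""
    w = 0.0
    for t, x in enumerate(stream, start=1):
        # log-likelihood-ratio increment for N(mu1, 1) vs N(mu0, 1)
        llr = (mu1 - mu0) * x - 0.5 * (mu1**2 - mu0**2)
        w = max(0.0, w + llr)   # CUSUM recursion: reflected at zero
        if w >= h:
            return t
    return None

if __name__ == "__main__":
    random.seed(1)
    change_point = 200
    data = [random.gauss(0.0, 1.0) for _ in range(change_point)] + \
           [random.gauss(1.0, 1.0) for _ in range(300)]
    print(cusum_alarm(data))   # typically raises an alarm shortly after t = 200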

The monograph by Tartakovsky et al. gives an up-to-date and comprehensive account of its title theme, with both rigorous analysis and description of the subjects in all 11 chapters as a welcome bonus. The mathematical level is graduate and assumes prior exposure to inferential and large-sample methods in theoretical mathematical statistics. This reference book provides a concise path to several major research frontiers in sequential hypothesis testing and change-point detection. I would be more than thrilled to own this book. The authors have selected their topics carefully, have given clear exposition of the methods and their applications, and for some topics have even provided illustrations with numerical examples.

This monograph is a comprehensive treatment of constructing stochastic mathematical models describing financial time series. The author characterizes this field as the study of optimal dynamic modeling of price movements for many financial applications, including portfolio optimization, risk evaluation, and valuation of contingent claims. All these applications require accurate forecasts of returns over a chosen investment period and a closed-form covariance matrix. Thus, the goal of this project is to extract as much information as possible from empirical data in order to build processes that accurately describe financial time series. An additional goal of this monograph is to survey as many optimal model-building methods as possible in a concise manner, without including much mathematical detail, and to provide the most recent and appropriate set of references in the field. It can be noted that 95% of the reference bibliography relies on recent work (post-2000). This is quite commendable and should prove quite useful to most contemporary time series researchers. The references, together with an overview of the breadth of the developmental evolution of research in the area of GARCH-type models and the inclusion of stochastic volatility, make this book extremely useful for both newer and more established readers in the field.

The attraction of this book lies in the presentation of original and comprehensive statistical procedures for empirical financial time series that are repeatedly applied to a wide range of theoretical processes. This methodology is applied at multiple time scales; in this way, one extracts more information and, at the same time, is able to discern one model from another. This kind of treatment distinguishes this project from its competitors. Specifically, it brings theory, visualization, and rich statistical information together to accurately identify the correct model. The most relevant graphs are summarized in mug shots, which show a common set of information about empirical data and processes.

The book's main emphasis is on when fat-tailed distributions are preferable to Gaussian models. It is well known that daily returns are neither independent nor Gaussian; one might expect this to hold only for short time intervals, with returns at longer time intervals converging fairly quickly to iid Gaussian random variables. This reasoning rests on the fact that log-returns can be tested to possess more than two finite moments, so that the central limit theorem (CLT) would hold; the additional conditions for the CLT are essentially weak dependence, which results in asymptotic independence. The author, however, explains in a clear manner that this is not the case, which agrees with the recent literature showing that Gaussian properties do not hold for financial data. The author notes that "The catch in the present line of thought is the difference between linear correlation and general dependency, and the fact that the dependencies in the volatility decay slowly." He continues by saying that the "…CLT fails for financial data. Several statistics are meant to measure the pace of the convergence toward the CLT fixed point." To confirm this, the author provides numerous probability densities (pdfs) and demonstrates their behavior at different time horizons. To emphasize the tail behavior, the shape of the pdf is displayed on both linear and logarithmic scales. The figures suggest that the pdfs differ from the Gaussian at the daily scale and converge to the Gaussian only slowly over longer intervals. Some of the stylized properties of the tail exponent indicate values between 3 and 5, but clearly larger than 2, which excludes the Lévy stable distribution proposed by Mandelbrot. After collecting as many facts as possible, the goal of the book is then to extract many characteristics and discuss various computed parameters and results. This extends to modeling prices and many types of volatilities (with and without leverage effects) with both short- and long-range dependence. The author, in agreement with the current literature, introduces the idea of stochastic volatility processes as genuinely independent processes. To capture market behavior, the book discusses examples of regime-switching models and also addresses the question of whether financial time series satisfy the time-reversal property. In particular, it is shown that a large number of processes cannot reproduce the stylized facts related to the time irreversibility observed in empirical time series.
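
The slow convergence toward Gaussian behavior that the author documents can be probed with a simple diagnostic: aggregate log-returns over longer horizons and track their excess kurtosis, which would be near zero at every horizon for iid Gaussian returns. The synthetic price path below, built from iid heavy-tailed shocks, is only a placeholder for real data, and the horizons are arbitrary choices.

# Sketch: excess kurtosis of log-returns aggregated over increasing horizons.
# For iid Gaussian returns the excess kurtosis would be near zero at every
# horizon; heavy-tailed data decays toward it only slowly as the horizon grows.
# The synthetic prices (iid normal-mixture shocks) stand in for real data.
import math
import random

def log_returns(prices, horizon=1):
    return [math.log(prices[i + horizon] / prices[i])
            for i in range(len(prices) - horizon)]

def excess_kurtosis(xs):
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / (m2 ** 2) - 3.0

if __name__ == "__main__":
    random.seed(2)
    prices, p = [], 100.0
    for _ in range(20_000):
        scale = 0.01 if random.random() < 0.9 else 0.04  # occasional large shocks
        p *= math.exp(random.gauss(0.0, scale))
        prices.append(p)
    for h in (1, 5, 20, 100):
        print(h, round(excess_kurtosis(log_returns(prices, h)), 2))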

The book is an attempt to construct, justify, and settle some old disputes regarding the nature of financial data, but it also generates many new challenges. The author attempts to capture in a meaningful fashion the information and properties contained in the huge amount of financial data. The set of properties common across many instruments, markets, and time periods has been demonstrated by independent studies and classified as stylized facts. The author reaffirms these, and adds many new statements, by using data specific to Switzerland. The book comprises 20 chapters plus references; there are no exercises. Its role is to encompass theory (in stochastic modeling and statistical form), applications, and visualization. The layout of the book is well done and very easy to read. From my experience, there are not many books with a similar approach; I believe it is quite unique in nature. I personally found the repetition of similar figures throughout the text to be very useful, informative, and important. Although the density of mathematical rigor is not as high as one might find in other finance books, it provides an incredible amount of information, and researchers interested in both mathematical and applied finance will find it a useful resource for learning basic asset behavior. The level is right for all researchers in the area with a master's degree in statistics.

In summary, this book is an important step in the right direction in bringing together the recent literature on financial data analysis. It is an attempt to present in detail a set of theoretical tools and statistical facts that emerged from empirical studies of asset returns and that are common to a large set of assets and markets. The author derives and provides many properties related to volatility processes that can be useful tools for forecasting.

Stergios B. Fotopoulos

Washington State University

Editor Reports on New Editions, Proceedings, Collections, and Other Books

Computational Probability Applications, by Andrew G. Glen and Lawrence M. Leemis (Eds.). New York: Springer, 2016, x + 256 pp., $129.00 (HB), $99.00 (ebook), ISBN: 978-3-319-43315-8.

The intention of this edited volume is to promote the use of A Probability Programming Language (APPL), an open-source language. APPL is designed to help both methodologists and practitioners develop new strategies, ideas, and models. This volume successfully merges the use of symbolic algebra with stochastic applications and displays its use in a host of situations. The volume consists of 15 chapters written by scholars in their respective fields. The book is organized sequentially and well structured, and its chapters are self-contained. It includes many useful topics and techniques for practitioners and researchers alike and is a good source of reference material in a multitude of fields. The volume contains original papers, some of which have already been published and some of which are in the pipeline.

Below is a biased sample of the chapters.

  • Chapter 1: Accurate Estimation with One Order Statistic

  • Chapter 2: On the Inverse Gamma as a Survival Distribution

  • Chapter 4: The “Straightforward” Nature of Arrival Rate Estimation?

  • Chapter 5: Survival Distributions Based on the Incomplete Gamma Function Ratio

  • Chapter 7: Maximum Likelihood Estimation Using Probability Density Functions of Order Statistics

  • Chapter 10: Linear Chart Constants for Non-Normal Sampling

  • Chapter 12: Moment-Ratio Diagrams for Univariate Distributions

  • Chapter 14: Parametric Model Discrimination for Heavily Censored Survival Data

  • Chapter 15: Lower Confidence Bounds for System Reliability from Binary Failure Data Using Bootstrapping

One of the interesting features of this book is that its approach is explanatory rather than mathematical. Further, it includes some special topics, such as methods used in finance and neuroscience. Each chapter presents a specific application of computational probability using APPL and ends with a section of concluding remarks and/or future research. In total, the book provides 178 bibliographic references. Its major weakness is that it does not offer exercise and problem sections, an essential tool for adopting any book as a textbook, especially at the undergraduate level.
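
For readers without access to APPL, the flavor of its symbolic computations can be approximated with a general computer-algebra system. The sketch below derives the pdf of the k-th order statistic of an exponential sample with SymPy; it is not APPL itself, and the parent distribution and sample size are arbitrary choices, but it illustrates the kind of closed-form result the chapters rely on.

# Sketch: symbolic pdf of the k-th order statistic from an iid sample,
# here for an Exponential(lambda) parent distribution, using SymPy.
# This is not APPL; it only illustrates the style of symbolic computation.
import sympy as sp

x, lam = sp.symbols("x lambda", positive=True)
n, k = 5, 2                                   # sample size and order (illustrative)

f = lam * sp.exp(-lam * x)                    # parent pdf
F = 1 - sp.exp(-lam * x)                      # parent cdf

# pdf of the k-th order statistic:
# f_(k)(x) = n! / ((k-1)! (n-k)!) * F(x)^(k-1) * (1 - F(x))^(n-k) * f(x)
f_k = (sp.factorial(n) / (sp.factorial(k - 1) * sp.factorial(n - k))
       * F**(k - 1) * (1 - F)**(n - k) * f)

print(sp.simplify(f_k))
# sanity check: the density integrates to one
print(sp.integrate(f_k, (x, 0, sp.oo)))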

Briefly, I can safely conclude that all the chapters are nicely written and well presented for a wide audience. To get the maximum benefit from this book, I would suggest that readers have an introductory course in elementary probability theory and statistics. In summary, this is a good contribution, providing up-to-date coverage of selected topics in a logical and systematic manner. Variability and diversity in research is the spice of life!

S. Ejaz Ahmed

Brock University

Digital Methods and Remote Sensing in Archaeology, by Maurizio Forte and Stefano Campana (Eds.). New York: Springer, 2016, xix + 496 pp., $129.00 (HB), $99.00 (ebook), ISBN: 978-3-319-40656-5.

This edited volume highlights new research on digital methods and remote sensing in archaeology. This is an important and useful area for researchers and professionals alike. The use of remote sensing is increasingly in demand these days due to the availability of large datasets collected at the scale of the object, site, locality, and landscape. The applications of remote sensing are now widespread, as evidenced by the chapters presented in this book.

The editors skillfully divided the volume into six parts with 20 separate chapters in total.

Part I – Data Collection and Technology (2 chapters)

Part II – Image and Digital Processing (2 chapters)

Part III – Landscape Representation and Scales (7 chapters)

Part IV – Simulation, Visualization and Computing (4 chapters)

Part V – Interpretation and Discussion (3 chapters)

Part VI – Cultural Resource Management: Communication & Society (2 chapters)

The editors have done a good job in preparing the introduction of the book, which also gives a useful overview of all the parts and chapters. However, I would have preferred to combine the Preface and Introduction in one place. The volume is composed of chapters on the methodology and application of remote sensing in a host of related fields, each chapter presenting a specific application and methodology of the topic at hand. These works highlight interesting data and figures.

The book is sequentially organized and well structured, and many chapters are self-contained. It includes many useful topics and techniques for graduate students, professionals, and researchers alike. This volume is a rich source of information and an excellent reference book.

Below is a biased sample of the topics.

  • Chapter 1: Terrestrial Laser Scanning in the Age of Sensing

  • Chapter 2: Airborne Laserscanning in Archaeology: Maturing Methods and Democratizing Applications

  • Chapter 4: Applying UAS Photogrammetry to Analyze Spatial Patterns of Indigenous Settlements Sites in the Northern Military Park, Guilford County, North Carolina

  • Chapter 7: What do the Patterns Mean? Archaeological Distributions and Bias in Survey Data

  • Chapter 8: 3D Tool Evaluation and Workflow for an Ecological Approach to Visualizing Ancient Socio-environmental Landscapes

  • Chapter 12: Cyber Archaeology: 3D Sensing and Digital Embodiment

  • Chapter 14: Using 3D GIS Platforms to Analyze and Interpret the Past

  • Chapter 18: Creating a Chronological Model for Historical Roads and Paths Extracted from Airborne Laser Scanning Data

  • Chapter 20: Cultural Heritage and Digital Technologies

In the preface and/or introduction, the editors do not indicate the specific target audience, or am I missing something here? I think this book will be attractive to a broad transdisciplinary research community. The book is useful as a reference book for researchers and practitioners, providing real and interesting color figures, data, and applications. In summary, this is an important contribution, providing up-to-date coverage of remote sensing in a systematic fashion.

S. Ejaz Ahmed

Brock University
