Colloquium Abstracts
Tuesday, Sept. 5, 2006
2:10pm, 421 Neville Hall
Dr. Gerhard Dikta, Fachhochschule Aachen, Germany
Bootstrap Approximation in Model Checks for Binary Data
Consider a binary regression model, where the conditional expectation of the binary variable given an explanatory variable belongs to a parametric family. To check whether a sequence of independent and identically distributed observations of these variables belongs to such a parametric family, we use Kolmogorov-Smirnov and Cramér-von Mises type tests, which are based on maximum likelihood estimation of the parameter and on a marked empirical process introduced by Stute. We study a new bootstrap resampling scheme in this setup to approximate the critical values of these tests. Furthermore, this approach is applied to simulated and real data. In the latter case we check parametric model assumptions for some right-censored data sets.
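For orientation, here is a minimal sketch of this type of procedure, assuming a logistic parametric family and resampling the binary responses from the fitted model; it illustrates the general shape of such a test, not Dr. Dikta's exact resampling scheme:

```python
import numpy as np
from scipy.optimize import minimize

def fit_logistic(x, y):
    """ML fit of a simple logistic model m(x, theta) = 1/(1 + exp(-(a + b*x)))."""
    def nll(theta):
        p = 1.0 / (1.0 + np.exp(-(theta[0] + theta[1] * x)))
        p = np.clip(p, 1e-12, 1 - 1e-12)          # guard against log(0)
        return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    return minimize(nll, x0=np.zeros(2), method="BFGS").x

def ks_statistic(x, y, theta):
    """KS statistic of Stute's marked empirical process
    R_n(t) = n^{-1/2} * sum_i (y_i - m(x_i, theta)) * 1{x_i <= t}."""
    p = 1.0 / (1.0 + np.exp(-(theta[0] + theta[1] * x)))
    order = np.argsort(x)
    cum = np.cumsum((y - p)[order]) / np.sqrt(len(x))
    return np.max(np.abs(cum))

def bootstrap_pvalue(x, y, B=500, seed=None):
    """Model-based bootstrap: draw y* from the fitted Bernoulli model,
    refit, recompute the statistic, and compare with the observed value."""
    rng = np.random.default_rng(seed)
    theta = fit_logistic(x, y)
    t_obs = ks_statistic(x, y, theta)
    p_hat = 1.0 / (1.0 + np.exp(-(theta[0] + theta[1] * x)))
    t_boot = np.array([
        ks_statistic(x, y_star, fit_logistic(x, y_star))
        for y_star in (rng.binomial(1, p_hat) for _ in range(B))
    ])
    return np.mean(t_boot >= t_obs)
```

The bootstrap p-value is simply the fraction of resampled statistics at least as large as the observed one; the critical values mentioned in the abstract are the corresponding quantiles of the resampled statistics.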
Wednesday, Oct. 25, 2006
3:00pm, 101 Neville Hall
Dr. Alain Arneodo, Laboratoire Joliot Curie and Laboratoire de Physique, Ecole Normale Superieure de Lyon, 46 Allee d’Italie, 69364 Lyon Cedex 07, France
Large-scale wavelet analysis of the human genome: from DNA sequence analysis to the modeling of replication in higher eukaryotes
Understanding how chromatin is spatially and dynamically organized in the nucleus of eukaryotic cells and how this affects genome functions is one of the main challenges of cell biology. Recent technical progress in live cell imaging has confirmed that the structure and dynamics of chromatin play an essential role in regulating many biological processes, such as gene activity, DNA replication, recombination and DNA damage repair. The emerging view is that genomes are compartmentalized, at the level of chromosome territories in mammalian nuclei, into sub-chromosomal structural domains that are likely to be fundamental functional units coordinating the spatial organization and timing of replication and transcription. To what extent one may learn about the higher-order structure and dynamics of chromatin directly from the primary DNA sequence and its functional landmarks is a question of fundamental and practical importance.
In this talk, we explore the large-scale compositional heterogeneity of human autosomal chromosomes through the optics of the wavelet transform (WT) microscope. We show that the GC content displays relaxational nonlinear oscillations with two main frequencies, corresponding to 100 kb and 400 kb, which are well-recognized characteristic sizes of the chromatin loops and loop domains involved in the hierarchical folding of the chromatin fiber. These frequencies are also remarkably similar to the size of mammalian replicons. When further investigating deviations from intra-strand equimolarities between A and T and between G and C, we corroborate the existence of these two fundamental frequencies as footprints of the replication and/or transcription mutation bias, and we show that the observed nonlinear oscillations reveal a remarkable cooperative organization of gene location and orientation. When investigating the intergenic and transcribed regions flanking experimentally identified human replication origins and the corresponding homologous mouse and dog regions, we find that for 7 of the 9 known origins, the (TA+GC) skew displays rather sharp upward jumps, with a linearly decreasing profile between two successive jumps. We present a model of replication with well-positioned replication origins and random terminations that accounts for the observed characteristic serrated skew profiles. We further use the singularity-tracking ability of the WT to develop a methodology for detecting origins of replication, and we report the discovery of 1024 putative replication origins in the human genome. The statistical analysis of the distribution of sense and antisense genes around these origins strongly suggests that replication origins play a fundamental role in the organization of mammalian genomes. Taken together, these analyses show that replication and gene expression are likely to be regulated by the structure and dynamics of the chromatin fiber.
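As a concrete illustration of the skew analysis described above, here is a minimal sketch (our own; the window size is arbitrary, and the speaker's pipeline is wavelet-based rather than window-based) of how a (TA+GC) skew profile can be computed from a raw sequence; the sharp upward jumps discussed in the abstract appear as discontinuities in such a profile:

```python
import numpy as np

def skew_profile(seq, window=1000):
    """(TA+GC) skew in non-overlapping windows:
    S = (T - A)/(T + A) + (G - C)/(G + C).
    Upward jumps in S flag putative replication origins; between
    jumps the model in the abstract predicts a linear decrease."""
    seq = seq.upper()
    n = len(seq) // window
    s = np.empty(n)
    for i in range(n):
        w = seq[i * window:(i + 1) * window]
        a, t = w.count("A"), w.count("T")
        g, c = w.count("G"), w.count("C")
        s[i] = (t - a) / max(t + a, 1) + (g - c) / max(g + c, 1)
    return s
```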
Thursday, Dec. 7, 2006
3:10pm, 421 Neville Hall
Zachary J. Smith, Dept. of Mathematics and Statistics, University of Maine
The Bochner Identity in Euclidean Space
In harmonic analysis, we aim to decompose spaces via their isometry groups into invariant irreducible subspaces. We then look at the action of the Fourier transform on those spaces. In Euclidean space, the isometry group of interest is the rotation group. We will motivate our discussion in 2 dimensions, where our questions become those of ordinary Fourier analysis.
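For reference, the Euclidean identity in question can be stated as follows (in one common normalization, following Stein and Weiss; the talk's exact conventions may differ): if f(x) = f_0(|x|)P(x), where P is a solid spherical harmonic of degree k on R^n, then

```latex
\hat{f}(\xi) = i^{-k}\, P(\xi)\, F_0(|\xi|),
\qquad
F_0(r) = 2\pi\, r^{-(n/2+k-1)} \int_0^\infty f_0(s)\, J_{n/2+k-1}(2\pi r s)\, s^{n/2+k}\, ds,
```

where J_v denotes the Bessel function of order v. In particular, the Fourier transform preserves the form f_0(|x|)P(x), acting on the radial part by a Hankel-type transform.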
Friday, Mar. 23, 2007
3:10pm, 100 Neville Hall
Prof. Claude Levesque, Dept. of Mathematics & Statistics, Universite Laval, Quebec
Congruent numbers, Diophantine equations, elliptic curves and a frightening dream of Professor Bresinsky
We will mention what a nightmare of Professor Henrik Bresinsky, congruent numbers, Diophantine equations, Fermat's Last Theorem, cryptography, elliptic curves and modular forms have in common. We will say a few words about each of the above-mentioned mathematical topics and emphasize the importance of elliptic curves. This will be a general-audience lecture, and it will definitely be accessible to graduate students (as well as to advanced undergraduate students).
Friday, April 6, 2007
3:10pm, 421 Neville Hall
Prof. Alex Ghitza, Dept. of Mathematics, Colby College
Serre’s conjecture on Galois representations, and future directions
Modular forms are complex-valued functions. Galois representations translate questions about solutions of Diophantine equations into two-dimensional linear algebra. A priori, these objects should have nothing in common; yet, whenever they come together, wonderful things happen in number theory.
The first part of the talk will be an introduction to this circle of ideas, more precisely to the statement and implications of Serre’s conjecture. In the second part, we will discuss ongoing work on generalizing Serre’s conjecture to higher-dimensional situations.
The emphasis will be on getting a feeling for the objects and ideas involved in the theory, rather than on its technicalities.
Monday, April 9, 2007
2:30pm, 108 Neville Hall
Benjamin Palmer, Dept. of Mathematics & Statistics, University of Maine
An Historical Review of Compactness
As mathematics became more rigorous and more abstract in the late 19th and early 20th centuries, it became clear that an idea like compactness would be necessary to develop the theories that were emerging at the time.
We will explore how the work of mathematicians like Cauchy and Riemann studying highly discontinuous functions led to the definitions of sequential and open-cover compactness that we use today.
Thursday, April 12, 2007
2:10pm, 421 Neville Hall
AbouEl-Makarim Aboueissa, Ph.D., Dept. of Mathematics and Statistics, University of Southern Maine
Estimation of Population Parameters from Censored Samples
In recent years there has been a great deal of interest in the analysis of censored data, both in the context of survival in medical trials and in the context of environmental studies. In clinical trials, where patients often survive beyond the end of the trial period or are lost to follow-up for various reasons, we are unable to observe the variates of interest (the survival times) and instead observe right-censored values. In environmental studies, when chemists cannot quantify the concentration in a field sample, they report a nondetect instead of a numerical measurement; such data, censored at one or more detection limits (DLs), are called left-censored. Environmental data sets are usually not normally distributed, but a suitable transformation (e.g. logarithmic) can often make them approximately normal. The estimation of population parameters from censored samples has been considered by many authors using a variety of methods. Maximum likelihood is the principal method for estimating population parameters from censored samples. In addition, many other methods, such as replacement by a constant value, quantile-quantile regression, and the method of moments, will be described. Numerical examples will also be given.
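As a small illustration of the maximum likelihood approach mentioned above, here is a hedged sketch (our own; it assumes a lognormal model with nondetects recorded at their detection limits, which is one standard setup rather than the speaker's specific method):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def censored_lognormal_mle(values, detected):
    """ML estimation of (mu, sigma) for log-transformed data with
    left-censoring: 'values' holds the measurement where the boolean
    array 'detected' is True, and the detection limit where it is False."""
    logs = np.log(values)

    def nll(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)                      # keep sigma > 0
        ll = norm.logpdf(logs[detected], mu, sigma).sum()   # observed values
        ll += norm.logcdf(logs[~detected], mu, sigma).sum() # P(X < DL) for nondetects
        return -ll

    res = minimize(nll, x0=np.array([logs.mean(), 0.0]), method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])

# hypothetical example: two nondetects at DL = 0.5, two detected values
# mu, sigma = censored_lognormal_mle(np.array([0.5, 0.5, 1.2, 3.4]),
#                                    np.array([False, False, True, True]))
```

Replacement by a constant (e.g. DL/2) amounts to skipping the censored likelihood term entirely, which is why it can bias the estimates that the likelihood-based approach handles directly.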
Thursday, April 19, 2007
2:10pm, 108 Neville Hall
Sergey Lvin, Dept. of Mathematics & Statistics, University of Maine
From X-Ray Pictures to Thermo-Acoustic Tomography: The Triumph of the Unity of Pure and Applied Mathematics
We will discuss the mathematics of X-ray images and medical tomography, including CT scans, SPECT, and TAT*. Surprisingly, this mathematics was created as pure mathematics long before it found its uses in medicine and other applications. Conversely, the development of the applied mathematics of medical imaging has led to new and unexpected achievements in pure algebra and analysis.
We will try to answer questions such as how to see a fracture inside a bone, how to watch for brain activity, and how to hear the location of a microscopic tumor. Previous experience with fractured bones is not required, but hands-on experience with brain activity is a plus. Students with some knowledge of calculus are welcome.
* The last T in all three abbreviations stands for tomography.
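For readers who want the formula behind the pictures: the core object is the Radon transform, which in two dimensions (in a standard formulation, not tied to the talk's notation) reads

```latex
(Rf)(\theta, s) = \int_{\mathbb{R}} f\bigl(s\theta + t\theta^{\perp}\bigr)\, dt,
\qquad \theta \in S^{1},\ s \in \mathbb{R},
```

and the Fourier slice theorem, \widehat{(Rf)(\theta,\cdot)}(\sigma) = \hat{f}(\sigma\theta), is the key to reconstructing f, the image of the body's interior, from its X-ray projections.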
Monday, April 23, 2007
2:30pm, 421 Neville Hall
Zachary J. Smith, Dept. of Mathematics and Statistics, University of Maine
An Introduction to Hyperbolic Space
This talk gives an introduction to the structure of hyperbolic space. We will cover symmetry and isometry groups, touch upon the underlying Lie algebra, and discuss the invariant measure for integration. The Fourier transform and its inversion formula will be covered in 2 dimensions. This talk should be accessible to advanced undergraduates.
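As a concrete example of the structures listed above, here is the standard upper half-plane model of 2-dimensional hyperbolic space (whether the talk uses this particular model is an assumption on our part):

```latex
H^{2} = \{\, x + iy \in \mathbb{C} : y > 0 \,\}, \qquad
ds^{2} = \frac{dx^{2} + dy^{2}}{y^{2}}, \qquad
d\mu = \frac{dx\, dy}{y^{2}},
```

with isometry group PSL(2, R) acting by Möbius transformations z -> (az + b)/(cz + d); the measure dμ is invariant under this action and is the one used for integration.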
Thursday, April 26, 2007
12:30pm, DPC 109
Zachary J. Smith, Dept. of Mathematics and Statistics, University of Maine
The Bochner Identity in Harmonic Analysis
In this defense we will discuss the hyperbolic case of Bochner's identity. Hyperbolic space has an underlying geometry such that it can be associated with S^{n-1} × R. This allows the decomposition of L^2 functions into spaces of spherical harmonics. We will look at the action of the Fourier transform on these spaces, and then demonstrate how this can be used by considering Hardy's theorem.
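The decomposition referred to here is the standard one (in our notation):

```latex
L^{2}(S^{n-1}) = \bigoplus_{k=0}^{\infty} \mathcal{H}_{k},
```

where \mathcal{H}_k denotes the spherical harmonics of degree k; combined with the radial factor coming from the R direction, this splits L^2 functions on hyperbolic space into invariant pieces on which the action of the Fourier transform can be studied one degree at a time.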
Friday, June 15, 2007
11:00am, 421 Neville Hall
Na Wang, Dept. of Mathematics and Statistics, University of Maine
Estimation of Extra Risk and Benchmark Dose in Dose-Response Models
An important goal in quantitative risk/safety analysis of chemical toxins or pharmaceutical agents is determination of toxic risk posed by exposure to the agent. For purpose of assessing exposure risks, the extra risk function is defined as the risk above the background level corrected for non-response in the unexposed population. This talk will introduce the statistical methods for obtaining upper confidence limits on the extra risk and lower confidence limits on the dose level at which a certain benchmark risk is achieved. Several examples from literature will be given and simulations results on the estimation of extra risk and benchmark dose will be presented (Nitcheva et al., 2005).