
Statistics and Probability Seminar Series, Spring 2009
Thursday 4:00–5:00pm, Room MCS 149 (Tea served from 3:30–4:00pm, Room MCS 153)
January 22, 4–5pm, Room MCS 149 (Thursday)
Jeff Hamrick
Department of Mathematics and Statistics,
Boston University
Are different slices of fixed income markets more dependent during times of crisis than during normal times? We call this increase in dependence during times of crisis credit contagion, and define this concept through a local correlation function very similar to the usual correlation coefficient. Surprisingly, an analysis of bond yield spreads and credit default swap premia suggests that fixed income markets have not experienced credit contagion during crises like the Panic of 2008.
Instead, these measures of credit risk have tended to become less correlated, or even conditionally uncorrelated, during crises, a concept we call credit confusion.
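As a crude illustration of the comparison being made (not the talk's local correlation function itself), one can contrast the ordinary correlation coefficient computed over a conditioning set of "crisis" observations with its unconditional counterpart; the function and data below are purely illustrative.

```python
import math

def conditional_correlation(x, y, mask):
    """Ordinary correlation coefficient of (x, y), restricted to the
    observations where mask is True (e.g., crisis days). A crude proxy
    for comparing dependence across regimes; illustrative only."""
    xs = [a for a, m in zip(x, mask) if m]
    ys = [b for b, m in zip(y, mask) if m]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in ys) / n)
    return cov / (sx * sy)

# Toy spreads: perfectly co-moving on the flagged days.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 6.0, 8.0]
print(round(conditional_correlation(x, y, [True] * 4), 6))  # -> 1.0
```

Contagion would correspond to the crisis-conditional value exceeding the unconditional one; the talk's finding is that the opposite tends to occur.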
January 23, 11am–noon, Room MCS 135 (Friday)
Luis Carvalho
Department of Applied Mathematics,
Brown University
Maximum likelihood estimators have long dominated discrete inference. In
this work we apply statistical decision theory to derive a
new contender that minimizes a posterior generalized Hamming loss: the
centroid estimator. The centroid estimator is formally characterized as a
solution to a discrete optimization problem having posterior marginal
distributions as inputs. We discuss both specific constraints of interest and
broad conditions under which this optimization problem becomes tractable and
provide further generalizations to centroid estimation. We illustrate centroid
estimation with simple applications to stochastic grammar parsing,
reconstruction of ancestral states given a phylogeny, and RNA secondary
structure prediction. Finally, we offer a few concluding remarks and
directions for future work.
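In the simplest unconstrained case with plain Hamming loss, the centroid estimator reduces to taking the posterior marginal mode at each position. A minimal sketch of that special case (function name, states and numbers are all illustrative):

```python
def centroid_estimate(marginals):
    """marginals: list of dicts mapping state -> posterior marginal
    probability, one dict per position. With plain Hamming loss and no
    constraints, the minimizer of posterior expected loss is the
    componentwise marginal-mode configuration."""
    return [max(m, key=m.get) for m in marginals]

# Toy example: three positions, two possible states each.
marginals = [
    {"A": 0.7, "B": 0.3},
    {"A": 0.4, "B": 0.6},
    {"A": 0.9, "B": 0.1},
]
print(centroid_estimate(marginals))  # -> ['A', 'B', 'A']
```

The constrained settings discussed in the talk (parse trees, ancestral states, RNA structures) require solving a genuine discrete optimization over the feasible configurations rather than this componentwise shortcut.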
January 27, 4–5pm, Room MCS 149
Donatello Telesca
Department of Biostatistics,
The University of Texas M. D. Anderson Cancer Center
We consider modeling dependent high-throughput expression data arising
from different molecular interrogation technologies. Dependence
between molecules is introduced via the explicit consideration of
informative prior information associated with available pathways,
representing known biochemical regulatory processes. The important
features of the proposed methodology are the ease of representing
typical prior information on the nature of dependencies, model-based
parsimonious representation of the signal as an ordinal outcome, and
the use of coherent probabilistic schemes over both the structure and
the strength of the conjectured dependencies. As part of the inference
we reduce the recorded data to a trinary response representing
underexpression, average expression and overexpression. Inference in
the described model is implemented through Markov chain Monte Carlo
(MCMC) simulation, including posterior simulation over conditional
dependence and independence. The latter involves a variable-dimensional
parameter space, for which we use a reversible jump MCMC scheme. The
motivating example is a data set from ovarian cancer patients.
January 29, 4:15–5:15pm, Room MCS 149
Michael Rosenblum
Center for AIDS Prevention Studies (CAPS)
University of California, San Francisco
Regression models are often used to test for cause-effect relationships from
data collected in randomized trials or experiments. This practice has
deservedly come under heavy scrutiny, since commonly used models such as
linear and logistic regression will often not capture the actual
relationships between variables, and incorrectly specified models potentially
lead to incorrect conclusions. In this paper, we focus on hypothesis tests of
whether the treatment given in a randomized trial has any effect on the mean
of the primary outcome, within strata of baseline variables such as age, sex,
and health status. Our primary concern is ensuring that such hypothesis tests
have correct Type I error for large samples. Our main result is that for a
surprisingly large class of commonly used regression models, standard
regression-based hypothesis tests (but using robust variance estimators) are
guaranteed to have correct Type I error for large samples, even when the
models are incorrectly specified. To the best of our knowledge, this
robustness of such model-based hypothesis tests to incorrectly specified
models was previously unknown for Poisson regression models and for other
commonly used models we consider. Our results have practical implications
for understanding the reliability of commonly used, model-based tests for
analyzing randomized trials.
January 30, 11am–noon, Room MCS 137 (Friday)
Sebastien Darses
Department of Mathematics and Statistics,
Boston University
We will introduce general notions of stochastic derivatives of stochastic processes. We will first show how tools from Malliavin calculus allow one to study these objects. The techniques involved yield applications to various structures of the processes under study: gradient diffusions and asymptotic expansions of fractional SDEs. Second, we will extend ODEs to stochastic processes through these derivatives. We will focus on the relationships between these stochastic equations and various PDEs, including the Navier-Stokes equation.
January 30, 4–5pm, Room MCS 135 (Friday)
Miklós Rásonyi
Vienna University of Technology
We consider a stable Gaussian autoregressive process and
wish to estimate its mean, variance and autoregression coefficient
based on rounded-off observations. We present a law of large numbers,
uniform in the parameters, which is the decisive step towards
establishing consistency of the corresponding maximum likelihood estimates.
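A minimal sketch of the observation scheme being studied, assuming an AR(1) recursion and rounding to a fixed grid (all parameter values and names are illustrative):

```python
import random

def rounded_ar1(n, mean=1.0, rho=0.5, sigma=1.0, step=1.0, seed=0):
    """Simulate a stable Gaussian AR(1) process around `mean` with
    autoregression coefficient `rho` (|rho| < 1) and innovation
    standard deviation `sigma`, returning only the rounded-off
    observations (nearest multiple of `step`). Illustrative sketch."""
    rng = random.Random(seed)
    x = mean
    obs = []
    for _ in range(n):
        x = mean + rho * (x - mean) + rng.gauss(0.0, sigma)
        obs.append(round(x / step) * step)
    return obs
```

The estimation problem addressed in the talk is then to recover the mean, variance and autoregression coefficient from a sample like `obs` alone.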
February 5, 4–5pm, Room MCS 149
Natallia Katenka
Department of Statistics
University of Michigan
Wireless Sensor Networks (WSN) are a new technology with many
applications, including environmental monitoring, habitat monitoring,
surveillance, and health care. In many applications, each sensor
records a signal (temperature, vibration, etc.) emitted from a target,
makes a decision about the target's presence, and then transmits
either the signal or the decision to a central node. Transmission of
decisions instead of signals offers significant savings in
communication costs, but binary decisions are unreliable in noisy
environments. We develop a new algorithm to improve the reliability of
binary decisions, called Local Vote Decision Fusion (LVDF). Using LVDF,
we develop new data fusion algorithms for target detection (making a
network-level decision about the presence of a target), target
localization (estimating the target's position), and target tracking
(estimating trajectories of multiple moving targets over time). We
apply our framework based on binary corrected decisions to two case
studies: an experiment involving tracking people and a project of
tracking zebras. Our tracking approach based on corrected decisions
exhibits a competitive performance even compared to maximum likelihood
estimation based on the signals themselves. Motivated by the success
of the LVDF algorithm in WSNs, we are currently developing a general
classification framework based on data fusion from correlated
classifiers.
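The correction step can be illustrated by a simple local majority vote, in which each sensor replaces its binary decision with the majority decision over itself and its neighbors. This is a sketch of the general idea only; the LVDF algorithm in the talk may differ in detail.

```python
def local_vote_correct(decisions, neighbors):
    """One round of local-vote correction: sensor i adopts the majority
    decision among itself and its neighbors; ties keep the original
    decision. `neighbors[i]` lists the indices adjacent to sensor i.
    Illustrative sketch, not the exact LVDF rule from the talk."""
    corrected = list(decisions)
    for i, d in enumerate(decisions):
        votes = [decisions[j] for j in neighbors[i]] + [d]
        ones = sum(votes)
        if 2 * ones > len(votes):
            corrected[i] = 1
        elif 2 * ones < len(votes):
            corrected[i] = 0
    return corrected

# Four sensors on a line; a lone dissenting decision is flipped.
decisions = [1, 0, 1, 1]
neighbors = [[1], [0, 2], [1, 3], [2]]
print(local_vote_correct(decisions, neighbors))  # -> [1, 1, 1, 1]
```

The corrected decisions then feed into the network-level detection, localization and tracking procedures described above.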
February 6, 10am–noon, Room MCS 135 (Friday)
Mark Veillette
Department of Mathematics and Statistics,
Boston University
This talk will be based on lecture notes by Gérard Ben Arous of NYU. I will begin by discussing the continuous-time random walk (CTRW) on Z^{d}, which is a simple random walk that waits for a random time at each step. The waiting times we consider are sampled from a heavy-tailed distribution. With appropriate scaling of the CTRW, we obtain another process, called the fractional kinetic process, which is Brownian motion with a time change given by the inverse of an alpha-stable subordinator independent of the Brownian motion. We will then introduce the Bouchaud trap model, which generalizes the CTRW by adding dependence between the steps of the random walker and the waiting times. We will look at the scaling limit of this model and see some surprising results.
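A minimal simulation of the one-dimensional case, assuming Pareto waiting times with tail index alpha < 1 (all parameter choices are illustrative):

```python
import random

def ctrw_path(n_steps, alpha=0.5, seed=0):
    """Simulate a one-dimensional CTRW: simple +/-1 steps separated by
    heavy-tailed Pareto(alpha) waiting times (tail index alpha < 1, so
    the waiting times have infinite mean). Returns a list of (time,
    position) pairs. Illustrative sketch."""
    rng = random.Random(seed)
    t, x = 0.0, 0
    path = [(t, x)]
    for _ in range(n_steps):
        u = 1.0 - rng.random()      # u in (0, 1]
        t += u ** (-1.0 / alpha)    # Pareto(alpha) waiting time, >= 1
        x += rng.choice((-1, 1))    # simple random walk step
        path.append((t, x))
    return path
```

Under the appropriate rescaling, the position viewed as a function of elapsed time approximates the fractional kinetic process described above.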
February 19, 4–5pm, Room MCS 149
Hao Xing
Department of Mathematics
University of Michigan
We will discuss two new regularity properties of the optimal stopping
problem for Lévy processes with a nondegenerate diffusion component.
First, under some conditions on the Lévy measure, we will show that
the value function of the optimal stopping problem is a classical
solution of the variational inequality. This result has been known for
Lévy processes of finite activity; we extend it to general Lévy
processes of infinite activity. Second, we will discuss a specific
example of the optimal stopping problem: the American option pricing
problem. We will show that the optimal exercise boundary (free
boundary) of the American put in jump-diffusion models is continuously
differentiable (except at maturity). Moreover, the boundary is shown
to be infinitely differentiable under a regularity assumption on the
jump distribution. This is joint work with Erhan Bayraktar.
February 27, 10am–noon, Room MCS 137 (Friday)
Emmanuel Denis
Department of Mathematics and Statistics,
Boston University
In 1985 Leland suggested an approach to pricing contingent claims under
proportional transaction costs. Its main idea is to use the classical
Black-Scholes formula with a suitably enlarged volatility for a
periodically revised portfolio whose terminal value approximates the
payoff h(S_T) = (S_T - K)^+ of the call option. In subsequent studies,
Lott, Kabanov and Safarian, and Gamys and Kabanov provided a rigorous
mathematical analysis and established that the hedging portfolio
approximates this payoff in the case where the transaction costs
decrease to zero as the number of revisions tends to infinity.
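A sketch of the recipe, using the standard form of Leland's enlarged volatility, sigma_hat^2 = sigma^2 (1 + sqrt(2/pi) k / (sigma sqrt(dt))), where k is the proportional cost and dt the revision interval; all parameter values below are illustrative.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def bs_call(S, K, T, sigma, r=0.0):
    """Classical Black-Scholes price of the call payoff (S_T - K)^+."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def leland_call(S, K, T, sigma, k, dt, r=0.0):
    """Leland's approach: the Black-Scholes formula evaluated at a
    suitably enlarged volatility reflecting proportional transaction
    costs k paid at each periodic revision (interval dt)."""
    sigma_hat = sigma * math.sqrt(
        1.0 + math.sqrt(2.0 / math.pi) * k / (sigma * math.sqrt(dt))
    )
    return bs_call(S, K, T, sigma_hat, r)
```

With k > 0 the enlarged volatility, and hence the price charged for the call, exceeds the frictionless Black-Scholes value; the analyses cited above study how well the corresponding periodically revised portfolio replicates the payoff.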
March 19, 4–5pm, Room MCS 149
Erin Conlon
Department of Mathematics and Statistics,
University of Massachusetts, Amherst
Biologists often conduct multiple independent microarray studies that all target the same biological system or pathway. Pooling information across studies can help identify true target genes more accurately. Here, we introduce a Bayesian hierarchical model that combines microarray data across studies to identify differentially expressed genes. Each study has several sources of variation, i.e., replicate slides within repeated experiments. Our model produces the gene-specific posterior probability of differential expression, which is the basis for inference. We further develop the models to identify up- and down-regulated genes separately, and to include gene dependence information. We evaluate the models using both simulation data and biological data for the model organisms Bacillus subtilis and Geobacter sulfurreducens.
March 26, 4–5pm, Room MCS 149
Ivan Nourdin
Laboratoire de Probabilités et Modèles Aléatoires
Université Paris VI
Let X_{1}, X_{2}, ... denote any sequence of centered independent random variables with unit variance, and suppose moreover that there exists q > 2 such that the q-th absolute moments of the X_{i} are uniformly bounded.
Fix an integer d > 1, and let Q_{d}(n,X) denote the sum, for
i_{1}, ..., i_{d} from 1 to n, of
f_{n}(i_{1},...,i_{d}) X_{i_{1}} ... X_{i_{d}}.
Here, f_{n}: {1,...,n}^{d} -> R denotes a symmetric function, vanishing on diagonals and normalized in such a way that d! times the sum of f_{n}^{2} over all indices equals 1.
During my talk, I will show the following invariance principle: "If Q_{d}(n,G) converges in law to N(0,1), where G denotes a sequence of i.i.d. standard Gaussian random variables, then Q_{d}(n,X) converges in law to N(0,1) for all sequences X as above."
This talk is based on work in progress with Giovanni Peccati (Paris Ouest) and Gesine Reinert (Oxford).
March 27, 10am–noon, Room MCS 137
Constantinos Kardaras
Department of Mathematics and Statistics,
Boston University
A set of weak axioms is proposed to model consumption choice rules of agents that are numeraire-invariant. We obtain that this corresponds to logarithmic utility maximization (albeit in a weaker, behavioral sense) under the agent's subjective probability.
Further, the question of general equilibrium in an incomplete financial market model is undertaken, where economic agents have numeraire-invariant preferences. The market contains a borrowing and lending account in zero net supply, as well as a stock in positive unit net supply providing a certain exogenously specified dividend stream. A characterization of existence and uniqueness of equilibrium is provided in terms of stochastic differential equations. Importantly, the proposed framework naturally allows for equilibria where assets in positive net supply contain bubbles. This is true even in the case of complete markets with unconstrained agents, a fact that appears inconsistent with the traditional "representative agent" framework of asset-pricing theory.
April 2, 4–5pm, Room MCS 149
Mark Veillette
Department of Mathematics and Statistics,
Boston University
The first-passage time of a nondecreasing Lévy process (sometimes
referred to as an inverse subordinator) is a process that arises in many
applications. This first-passage time process is, in general, non-Markovian,
with nonstationary and non-independent increments. In this talk, we give
methods for computing joint moments of these processes. In order to
implement our method, one must invert a Laplace transform which depends
on the characteristic exponent of the original Lévy process. We give two
numerical methods for inverting this Laplace transform, based on the
Bromwich integral and the Post-Widder inversion formula. We will
consider examples including compound Poisson processes and mixtures of
alpha-stable subordinators.
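As a rough illustration of the Post-Widder route (here in its Gaver discretization, which avoids complex arithmetic; the talk's own numerical schemes may differ in detail):

```python
import math

def gaver_post_widder(F, t, n=12):
    """Approximate f(t) from its Laplace transform F(s) using the Gaver
    discretization of the Post-Widder inversion formula:
        f_n(t) = (n ln2 / t) C(2n, n) sum_{k=0}^{n} (-1)^k C(n, k) F((n+k) ln2 / t).
    Convergence is slow (error roughly of order 1/n); illustrative only."""
    a = math.log(2.0) / t
    total = sum((-1) ** k * math.comb(n, k) * F((n + k) * a) for k in range(n + 1))
    return a * n * math.comb(2 * n, n) * total

# Sanity check on F(s) = 1/(s+1), whose inverse transform is exp(-t).
approx = gaver_post_widder(lambda s: 1.0 / (s + 1.0), t=1.0)
```

Note that the alternating sum suffers from cancellation, so for large n this scheme is usually run in extended precision; the modest n used here keeps it within double-precision range.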
April 16, 4–5pm, Room MCS 149
Stuart Geman
Department of Applied Mathematics,
Brown University
Google engineers routinely train query classifiers, for ranking
advertisements or search results, on more words than any human being sees
or hears in a lifetime. A human being who sees a meaningfully new image
every second for one hundred years will not see as many images as Google
has in its libraries, all of which are available for training object
detectors and image classifiers. Yet by human standards the
state of the art in computer understanding of language and
computer-generated image analysis is primitive. What explains the gap?
Why can't learning theory tell us how to make machines that learn as
efficiently as humans? Upper bounds on the number of training samples
needed to learn a classifier as rich and competent as the human visual
system can be derived using the Vapnik-Chervonenkis dimension, or the
metric entropy, but these suggest that not only does Google need more
examples, but all of evolution might fall short. I will make some
proposals for efficient learning and offer some mathematics to support
them.
April 30, 4–5pm, Room MCS 149
Semyon Malamud
ETH Zurich, Switzerland
We solve for the equilibrium dynamics of information sharing in a large population.
Each agent is endowed with signals regarding the likely outcome of a
random variable of common concern. Individuals choose the effort with which
they search for others from whom they can gather additional information. When
two agents meet, they share their information. The information gathered is further
shared at subsequent meetings, and so on. Equilibria exist in which agents
search maximally until they acquire sufficient information precision, and then minimally.
A tax whose proceeds are used to subsidize the costs of search improves
information sharing and can in some cases increase welfare. On the other hand,
endowing agents with public signals reduces information sharing and can in some
cases decrease welfare.
(joint work with Darrell Duffie and Gustavo Manso)
May 1, 11am–noon, Room MCS 137
Semyon Malamud
ETH Zurich, Switzerland
We establish sufficient conditions for dynamic completeness of Arrow-Debreu
equilibria in finite-horizon financial markets with state variables following a multidimensional
diffusion process. Our conditions can be
verified for a large class of multidimensional diffusion processes, including arbitrary,
time-inhomogeneous, multidimensional Ornstein-Uhlenbeck processes as well
as vectors of one-dimensional diffusions. We show that dynamic completeness holds
under a nondegeneracy condition on the dividend rates even when assets do not
have terminal dividends. Furthermore, our conditions can be applied to situations
when the Jacobian of the dividend rates is highly degenerate, e.g., when one of the
assets is an annuity. If the horizon is infinite, we show the market is dynamically
complete for all (except, maybe, a countable set of) discount rates.
We also give examples of dynamic incompleteness
when the dividend rates are smooth, but not analytic. In fact, we show
that any price function can be realized in equilibrium.
(joint work with Julien Hugonnier and Eugene Trubowitz)
