SEMINAR de TEORIA PROBABILITATILOR, STATISTICA si APLICATII
Rezumat: We revisit some classical facts about cumulants and substantially refine some of them in order to obtain fine information about sums of dependent random variables. We illustrate our results with examples involving models from statistical mechanics and mathematical finance.
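As a minimal illustration of the central object in this abstract (not of the speaker's refinements), cumulants can be computed from raw moments by the standard recursion k_n = m_n - sum_{j=1}^{n-1} C(n-1, j-1) k_j m_{n-j}; a short Python sketch:

```python
from math import comb

def cumulants_from_moments(moments):
    # Convert raw moments m_1, ..., m_n into cumulants k_1, ..., k_n via
    # the standard recursion
    #   k_n = m_n - sum_{j=1}^{n-1} C(n-1, j-1) * k_j * m_{n-j}.
    kappa = []
    for n in range(1, len(moments) + 1):
        k = moments[n - 1]
        for j in range(1, n):
            k -= comb(n - 1, j - 1) * kappa[j - 1] * moments[n - j - 1]
        kappa.append(k)
    return kappa
```

A quick sanity check: the raw moments of a Poisson(1) variable are the Bell numbers 1, 2, 5, 15, ..., and all cumulants of a Poisson(lambda) variable equal lambda.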
Rezumat: In this talk we will present a few techniques and approaches in data analysis that have in common the use of geometric/topological ideas and concepts for computation and interpretation. In particular, we will discuss multiscale kernel exploratory data analysis, persistent homology, and manifold regression. We will describe the challenges these techniques present, in theory and practice, as well as some recent progress. Several applications will be discussed, in statistical genetics and other fields.
We present a qualitative analysis for stochastic variational inequalities with oblique subgradients,
by advancing from the convex framework to the non-convex one. We first provide the existence and
uniqueness of a solution for the multivalued equation
dX_t + H(X_t)∂φ(X_t)(dt) ∋ f(t, X_t)dt + g(t, X_t)dB_t,  t ∈ [0, T].  (1)
The mixture between the maximal monotonicity property of the (convex) subdifferential operator ∂φ
and the Lipschitz property of the matrix mapping X → H(X) preserves, for the product H∂φ, neither
the first property nor the second one. The existence result is based on a deterministic approach: a
generalized Skorohod problem with oblique reflection is first analyzed. Replacing in (1) the subdifferential
operator with the (non-convex) Fréchet subdifferential ∂⁻φ of a semiconvex function φ, the
difficulties increase because, even for the penalizing smooth deterministic problem, the existence
of a solution is not provided by existing results. In order to achieve our goals we first extend the
well-known results of Brézis concerning the regularization of convex functions and we obtain the existence
and uniqueness of the solution for the penalized problem. Imposing some geometrical assumptions on
the domain, the study continues with the analysis of a non-convex Skorohod problem with oblique
subgradients, followed by stochastic variational inequalities with generalized reflection.
A similar approach is used for backward stochastic variational inequalities (BSVIs, for short) with multivalued operators of convex type. However, a gap arises when we consider this kind of equation governed by Fréchet-type generalized subgradients. The solution consists of using piecewise deterministic Markov processes to drive the considered backward equations. The study is accompanied by a model aimed at analyzing the infection time in certain multistable gene networks.
Rezumat: In this talk we consider inverse problems both in a continuous and a discrete statistical framework. We review the latest developments in the methodology, emphasizing the similarities but also the specifics related to the nature of the setting. Moreover, several notions of convergence and the corresponding analysis results are presented. In the end, new ideas are suggested for the theoretical study of this class of inverse problems.
Rezumat: We investigate the numerical reconstruction of the missing thermal boundary conditions on an inaccessible part of the boundary in the case of steady-state heat conduction in anisotropic solids from the knowledge of over-prescribed noisy data on the remaining accessible boundary. This inverse boundary value problem is approached by employing a variational formulation which transforms it into an equivalent control problem. Four such approaches are presented, and both a parameter-dependent and a parameter-independent gradient-based algorithm are obtained in each case. The numerical implementation is realized for the 2D case by employing the boundary element method (BEM) and assuming that the available boundary data are either exact or noisy. For perturbed Cauchy data the numerical solution is stabilized/regularized by stopping the iterative procedure according to Morozov's discrepancy principle.
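The Morozov stopping rule mentioned above can be illustrated on a toy linear problem. The sketch below uses a plain Landweber iteration (an illustrative stand-in, not the BEM-based algorithms of the talk), stopped once the residual norm falls below tau times the noise level delta:

```python
def landweber_with_morozov(A, y, delta, tau=1.1, step=0.1, max_iter=10000):
    # Landweber iteration x <- x + step * A^T (y - A x) for A x = y,
    # stopped by Morozov's discrepancy principle: quit once the residual
    # norm drops below tau * delta, where delta is the noise level.
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(max_iter):
        r = [y[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        if sum(v * v for v in r) ** 0.5 <= tau * delta:
            break
        x = [x[j] + step * sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
    return x
```

Stopping early in this way prevents the iteration from fitting the noise in the data, which is exactly the regularizing role the discrepancy principle plays in the abstract.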
Rezumat: In my previous talk in Bucharest (11 January 2018) I presented a new algorithm, ABC Shadow, a versatile method for fitting point processes to data. This talk presents several Gibbs point interaction models (Geyer, Connected Components and Area-Interaction) that are fitted via this algorithm to real three-dimensional datasets from the SDSS galaxy catalogue. Under the hypotheses of the considered models, the fitted point processes allow a morphological and statistical characterization of the galaxy distributions. Several model validation techniques are also used. They are based on Monte Carlo likelihood asymptotics, summary statistics (Ripley's K-function, pair correlation function, etc.) and residual analysis for point processes (residual plots, q-q plots, etc.). Conclusions and perspectives are finally presented.
Rezumat: Sarmanov's family of multivariate distributions recently gained the interest of researchers in various domains due to its flexible structure that can model a large range of dependencies for given marginals. Therefore, we start by presenting the distribution's main characteristics and some of its extensions studied in the literature.
In particular, the flexible dependence structure motivated the consideration of Sarmanov's distribution in the fields of insurance and finance, from which we will present several applications. More precisely, as a first example, we shall discuss the fit of the bivariate Sarmanov distribution with different types of truncated marginal distributions to a bivariate losses data set. As a second example, we introduce some trivariate Sarmanov distributions with Generalized Linear Models for marginals, with the aim of incorporating some individual characteristics of the policyholders when modeling a real trivariate data set of claim frequencies.
Finally, we consider the capital allocation problem, which consists of fairly allocating the capital needed to cover the aggregate loss of a company (e.g., an insurance company) among its various lines of business. Risk measures are well-known tools used for this purpose, and one of the most popular such risk measures is the Tail-Value-at-Risk (TVaR). Based on this risk measure, we present some closed-form allocation formulas for risks modeled by Sarmanov's distribution.
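As a hedged illustration of the TVaR-based allocation idea (the closed-form Sarmanov formulas themselves are not reproduced here), the empirical TVaR and the corresponding allocation by line of business can be sketched as follows:

```python
def tvar(sample, alpha):
    # Empirical Tail-Value-at-Risk: mean of the worst (1 - alpha)
    # fraction of the sorted losses.
    s = sorted(sample)
    tail = s[int(len(s) * alpha):]
    return sum(tail) / len(tail)

def tvar_allocation(lines, alpha):
    # TVaR-based capital allocation: each line of business receives the
    # average of its own losses over the scenarios in which the aggregate
    # loss exceeds its empirical VaR at level alpha.
    totals = [sum(losses) for losses in zip(*lines)]
    var = sorted(totals)[int(len(totals) * alpha)]
    idx = [t for t, s in enumerate(totals) if s > var]
    return [sum(line[t] for t in idx) / len(idx) for line in lines]
```

By construction the per-line allocations sum to the TVaR of the aggregate loss, which is the "fairness" property the abstract alludes to.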
Rezumat: Recent theoretical and empirical studies involving nonparametric conditional frontier models stress the importance of conditional efficiency measures as the only way to appropriately treat the presence of external factors and/or environmental variables (Z) in a production process.
Conditional efficiency measures are based on the idea that the production process can be described as being conditioned by given values of the external/environmental factors. These factors can be included in the frontier model as exogenous variables and can help explain the efficiency differentials and improve the managerial policy of the evaluated units. Conditional efficiency measures are estimated by means of a nonparametric estimator of the conditional distribution function of the inputs and outputs, conditionally on values of Z. To do this, smoothing procedures and smoothing parameters, the bandwidths, are involved. The bandwidths for the conditioning variables play a crucial role in the process of estimating the efficiency measures since they "tune" the localization for computing the conditional efficiencies. Another important aspect is related to the second-stage analysis and the explanation of differences in the efficiency levels achieved by economic producers facing different external/environmental conditions.
We present the most recent methodological developments in nonparametric estimation of conditional efficiency, completed by numerical illustrations on simulated data and useful insights on practical implementation.
*The presentation is based on joint work with Cinzia Daraio (Sapienza University of Rome, Italy) and Léopold Simar (Université Catholique de Louvain, Belgium).
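A minimal sketch of the kernel-smoothed conditional distribution estimator underlying conditional efficiency measures, here with a Gaussian kernel and a single conditioning variable Z — an illustrative simplification of the methodology discussed above:

```python
from math import exp

def cond_cdf(y, z, Y, Z, h):
    # Kernel-smoothed conditional distribution function F(y | Z = z):
    # each observation is weighted by a Gaussian kernel in the conditioning
    # variable, with bandwidth h controlling the localization.
    w = [exp(-0.5 * ((z - zi) / h) ** 2) for zi in Z]
    num = sum(wi for wi, yi in zip(w, Y) if yi <= y)
    return num / sum(w)
```

The bandwidth h plays exactly the "tuning" role described in the abstract: small h localizes the estimate around observations with Z_i close to z, while large h recovers the unconditional empirical distribution.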
Rezumat: In this presentation we consider the nonparametric robust estimation problem for regression models in continuous time with particular semi-Markov noises.
To be more specific, we are interested in estimating an unknown function S on the basis of observations that can be in continuous or discrete time. This problem of nonparametric estimation in regression models is an important chapter of theoretical and applied statistics that has been considered in many frameworks ("signal + white noise" models, "signal + color noise" regressions based on Ornstein-Uhlenbeck processes, etc.). Our main goal is to develop nonparametric adaptive robust estimation when the noise process exhibits strong dependence; to this end, we use particular cases of semi-Markov processes to model the dependent noise.
We construct a series of estimators by projection and thus we approximate the unknown function by a finite Fourier series. As we consider the estimation problem in an adaptive setting, i.e. in situation when the regularity of the function is unknown, we develop a new adaptive method based on the model selection procedure proposed by Konev and Pergamenshchikov (2012). First, this procedure gives us a family of estimators; second, we choose the best possible one by minimizing a cost function. Under general moment conditions on the noise distribution, a sharp non-asymptotic oracle inequality for the robust risks is obtained.
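The projection step described above (approximating the unknown function by a finite trigonometric Fourier series) can be sketched as follows; the basis ordering and the plain averaging of coefficients are illustrative assumptions, not the authors' exact procedure:

```python
from math import cos, sin, pi, sqrt

def trig_basis(j, t):
    # Orthonormal trigonometric basis on [0, 1]:
    # 1, sqrt(2)cos(2*pi*t), sqrt(2)sin(2*pi*t), sqrt(2)cos(4*pi*t), ...
    if j == 0:
        return 1.0
    k = (j + 1) // 2
    if j % 2 == 1:
        return sqrt(2) * cos(2 * pi * k * t)
    return sqrt(2) * sin(2 * pi * k * t)

def projection_estimate(ts, ys, d):
    # Projection estimator: estimate the first d Fourier coefficients of S
    # from observations y_i = S(t_i) + noise on a regular grid of [0, 1],
    # then return the truncated Fourier series as the estimate of S.
    n = len(ts)
    coefs = [sum(trig_basis(j, t) * y for t, y in zip(ts, ys)) / n for j in range(d)]
    return lambda t: sum(c * trig_basis(j, t) for j, c in enumerate(coefs))
```

In the adaptive setting of the talk, the dimension d is not fixed in advance: the model selection procedure chooses among such estimators by minimizing a cost function.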
Our talk is based on:
· V. S. Barbu, S. Beltaief, S. Pergamenshchikov, "Robust adaptive efficient estimation for semi-Markov nonparametric regression models", to appear in Statistical Inference for Stochastic Processes, 1-48, 2018 (also available at https://arxiv.org/abs/1604.04516v2);
· V. S. Barbu, S. Beltaief, S. Pergamenshchikov, "Robust adaptive efficient estimation for a semi-Markov continuous time regression from discrete data", 1-37, 2017 (available at http://arxiv.org/abs/1710.10653).
Rezumat: Stationary processes form an important class of stochastic processes that has been extensively studied in the literature. Their applications include modelling and forecasting numerous real-life phenomena, including natural disasters, sustainable energy sources, sales and market movements.
One of the most essential families of stationary processes is the ARMA family. When modelling existing data with an ARMA process, the first step is to fix the orders of the model. After that, one can estimate the related parameters using standard methods such as maximum likelihood (ML) or least squares (LS) estimation. The final step is to conduct various diagnostic tests in order to assess the quality of the model.
In this talk we present a novel way of fitting a model to data that are assumed to be a realization of a discrete-time stationary process. Our approach is based on a recently proved AR(1) characterisation of stationary processes, in which the noise is not assumed to be white. As a result, we obtain a more general and simpler way to fit a model to a stationary time series, thus outperforming traditional ARMA approaches. In particular, we obtain closed-form consistent estimators of various model parameters and prove their asymptotic normality under general conditions. The results are then applied to an ARCH model with a memory effect. ARCH models can be employed, e.g., in modelling time-varying volatility. We also discuss continuous-time extensions.
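A minimal sketch of closed-form least-squares estimation in the simplest AR(1) case, with Gaussian white noise for simplicity — the talk's setting allows non-white noise, which this toy example does not capture:

```python
import random

def simulate_ar1(phi, n, seed=0):
    # Simulate X_t = phi * X_{t-1} + e_t with i.i.d. standard Gaussian e_t.
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, 1.0))
    return x

def ls_estimate_phi(x):
    # Closed-form least-squares estimator of the AR(1) coefficient:
    # phi_hat = sum_t x_t x_{t-1} / sum_t x_{t-1}^2.
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den
```

The estimator is consistent and asymptotically normal in this classical setting; the point of the talk is that analogous closed-form estimators survive when the white-noise assumption is dropped.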
Rezumat: In financial mathematics, we often model a financial market as a vector of stochastic processes on a given filtered probability space. These processes describe the evolution in time of the prices of financial securities (stocks, bonds, derivatives). Arbitrage pricing theory is a powerful tool for analyzing these prices, and the change of the underlying probability measure has become the classical tool. The aim of this talk is to explain that a different technique, the change of the underlying filtration, provides a characterization of the risk premiums attached to particular events that impact security prices (such as the default event of a firm). Intuitively, changing the filtration redefines the information available within the model.
Rezumat: Unlike empirical propositions (e.g., that ulcers are caused by bacteria), which are refutable and contingent, mathematical propositions are (considered) certain and necessary. The key difference seems to be that in mathematics we have *proofs* - from axioms, using deductive logic. But then what is the status of the axioms, in particular those of the ZFC system? In what sense, if any, do we know them? Is our knowledge of a mathematical truth dependent on our knowledge of the axioms? And in what sense, if any, are the axioms even true? Since we typically do not say that we *prove* the axioms, what kind of justification can we offer for them? (The same questions can of course be raised for the logical truths involved in proofs, such as modus ponens.) This talk will survey some philosophical views proposed to address these questions. A suggestion I will gesture toward is that the difference between empirical and logico-mathematical knowledge may not be as deep as usually thought.
Rezumat: A stick of length 1 is broken into pieces by random i.i.d. cuts X_n. After the n-th cut, sort the pieces in ascending order and form their Lorenz curve. What happens asymptotically?
We prove that the limit of these Lorenz curves does exist in some cases and conjecture that the most egalitarian distribution of the cuts is the uniform one.
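A short simulation sketch of the stick-breaking Lorenz curves described above (for uniform cuts, the empirical Lorenz curve is expected to approach that of the exponential distribution, L(u) = u + (1 - u) log(1 - u)):

```python
import random

def break_stick(n, seed=1):
    # Break the unit stick with n i.i.d. uniform cuts; return the n + 1 pieces.
    rng = random.Random(seed)
    cuts = sorted(rng.random() for _ in range(n))
    pts = [0.0] + cuts + [1.0]
    return [pts[i + 1] - pts[i] for i in range(len(pts) - 1)]

def lorenz_curve(pieces):
    # Lorenz curve: cumulative share of the total length held by the
    # smallest fractions of the pieces, running from 0 up to 1.
    s = sorted(pieces)
    total = sum(s)
    cum, curve = 0.0, [0.0]
    for p in s:
        cum += p
        curve.append(cum / total)
    return curve
```

Plotting `lorenz_curve(break_stick(n))` for growing n gives a quick visual check of the convergence the abstract asks about.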
Rezumat: In the first part, I will illustrate the fundamental role of Hardy's inequalities in the theory of function spaces through two basic examples: functional calculus in Sobolev spaces and the theory of weighted Sobolev spaces. In the second part, I will present applications of these theories to the study of unimodular Sobolev functions.
Rezumat: The object of this talk is to extend the classical definition of the multidimensional discrete scan statistic with the help of a score function. In this new framework, problems like finding the distribution of a monotone run in a sequence of i.i.d. random variables or scanning with different window shapes (rectangle, circle, ellipse or annulus) in a two-dimensional setting will be discussed. We propose several approximations for the distribution of the scan statistic and illustrate their accuracy by conducting a numerical comparison study.
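A brute-force sketch of the basic discrete scan statistic in two dimensions (maximum window score over rectangular windows; the plain sum as score function is an illustrative choice, the talk's generalization allows arbitrary score functions and window shapes):

```python
def scan_statistic(grid, m1, m2, score=sum):
    # Discrete scan statistic: the maximum score over all m1 x m2
    # rectangular windows of the grid, computed by brute force.
    n1, n2 = len(grid), len(grid[0])
    best = None
    for i in range(n1 - m1 + 1):
        for j in range(n2 - m2 + 1):
            window = [grid[i + a][j + b] for a in range(m1) for b in range(m2)]
            s = score(window)
            if best is None or s > best:
                best = s
    return best
```

Approximating the distribution of this maximum under an i.i.d. model for the grid entries is precisely the problem the abstract addresses.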
Rezumat: This paper presents an original ABC algorithm, ABC Shadow, that can be applied to sample posterior densities that are continuously differentiable. The proposed algorithm satisfies the main condition that any ABC algorithm must fulfil in order to be useful in practice: it requires enough samples in the region of the parameter space induced by the observed statistics. The algorithm is tuned on the posterior of a Gaussian model, which is entirely known, and is then applied to the statistical analysis of several spatial patterns. These patterns are, or are assumed to be, outcomes of point processes. The considered models are: Strauss, Candy and area-interaction.
Rezumat: Quasi-stationary distributions describe the motion, conditioned on non-extinction, of Markov processes whose lifetime is almost surely finite. We present recent results on the existence and uniqueness of quasi-stationary distributions, as well as conditions under which the probabilities conditioned on non-extinction converge exponentially to a quasi-stationary distribution.
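A minimal Monte Carlo sketch of a distribution conditioned on non-extinction, for a toy symmetric walk on {0, ..., N} absorbed at 0 (an illustrative model chosen for this note, not one from the talk):

```python
import random

def conditional_distribution(n_paths, t, N=10, seed=2):
    # Symmetric walk on {0, ..., N}, absorbed at 0 and reflected at N:
    # estimate the law of X_t conditioned on non-extinction (X_t > 0)
    # by straightforward Monte Carlo over n_paths independent paths.
    rng = random.Random(seed)
    counts = [0] * (N + 1)
    survivors = 0
    for _ in range(n_paths):
        x = N // 2
        for _ in range(t):
            if x == 0:
                break  # absorbed: this path is extinct
            x = min(N, max(0, x + (1 if rng.random() < 0.5 else -1)))
        if x > 0:
            survivors += 1
            counts[x] += 1
    return [c / survivors for c in counts]
```

As t grows, the conditional law stabilizes; its limit is the quasi-stationary distribution whose existence, uniqueness and exponential attraction are the subject of the talk.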
Rezumat: We present a method for constructing branching-fragmentation Markov processes on the space of fragment sizes, induced by a continuous or discontinuous fragmentation kernel, with applications to a stochastic model for the fragmentation phase of an avalanche. We then present simulations of the trajectories and of the distributions of the processes, using a numerical approximation method for solutions of stochastic differential equations of fragmentation. Finally, we compute the distributions of the branching processes that approximate the fragmentation process. The talk is based on joint work with Lucian Beznea and Madalina Deaconu.
Rezumat: Data represented by curves are considered as paths of an L_2-continuous stochastic process. The Karhunen-Loève expansion is then used in regression and visualisation frameworks with such data. Applications with industrial data illustrate the theory.
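A minimal sketch of the empirical Karhunen-Loève idea on discretized curves: estimate the sample covariance matrix and extract its leading eigenfunction by power iteration (plain Python, an illustrative simplification of functional PCA):

```python
def kl_top_component(curves, iters=200):
    # Empirical Karhunen-Loeve on discretized curves: build the sample
    # covariance matrix of the centered curves and extract its leading
    # eigenvalue and eigenvector by power iteration.
    n, m = len(curves), len(curves[0])
    mean = [sum(c[j] for c in curves) / n for j in range(m)]
    centered = [[c[j] - mean[j] for j in range(m)] for c in curves]
    C = [[sum(x[j] * x[k] for x in centered) / n for k in range(m)]
         for j in range(m)]
    v = [1.0] * m
    for _ in range(iters):
        w = [sum(C[j][k] * v[k] for k in range(m)) for j in range(m)]
        norm = sum(t * t for t in w) ** 0.5
        v = [t / norm for t in w]
    lam = sum(v[j] * sum(C[j][k] * v[k] for k in range(m)) for j in range(m))
    return v, lam
```

Projecting each curve onto the leading eigenfunctions gives the low-dimensional scores used for regression and visualisation with curve data.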