DSpace Collection:
http://hdl.handle.net/2099/3721
2015-01-30T17:19:43Z
http://hdl.handle.net/2099/3748
Title: On-line nonparametric estimation
Authors: Khasminskii, Rafail
Abstract: A survey of some recent results on nonparametric on-line estimation is presented. The first result deals with on-line estimation of a smooth signal S(t) in the classic ‘signal plus Gaussian white noise’ model. An analogous on-line estimator for the regression estimation problem with equidistant design is then described and justified. Finally, some preliminary results on on-line estimation for an observed diffusion process are described.
Date: 2004-01-01
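The on-line idea in this abstract can be illustrated with a minimal one-pass smoother for the equidistant-design regression model y_i = S(t_i) + noise. The trailing-window average below is a toy stand-in for the paper's estimator; the function name, window width, and test signal are all illustrative assumptions.

```python
import numpy as np
from collections import deque

def online_local_average(ys, bandwidth):
    """One-pass smoother: running mean over the last `bandwidth` observations."""
    window = deque(maxlen=bandwidth)   # only O(bandwidth) state is kept
    estimates = []
    for y in ys:                       # observations arrive one at a time
        window.append(y)
        estimates.append(sum(window) / len(window))
    return np.array(estimates)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
signal = np.sin(2 * np.pi * t)                      # smooth S(t)
y = signal + 0.3 * rng.standard_normal(t.size)      # noisy equidistant design
est = online_local_average(y, bandwidth=15)
```

Averaging 15 neighbouring points reduces the noise variance by roughly that factor while adding only a small bias for a slowly varying signal, so the one-pass estimate tracks S(t) much more closely than the raw observations do.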
http://hdl.handle.net/2099/3747
Title: Asymptotic normality of the integrated square error of a density estimator in the convolution model
Authors: Butucea, Cristina
Abstract: In this paper we consider a kernel estimator of a density in a convolution model and give a central limit theorem for its integrated square error (ISE). The kernel estimator is rather classical in minimax theory when the underlying density is recovered from noisy observations. The kernel is fixed and depends heavily on the distribution of the noise, which is supposed entirely known. The bandwidth is not fixed; the results hold for any sequence of bandwidths decreasing to 0. In particular, the central limit theorem holds for the bandwidth minimizing the mean integrated square error (MISE). Rates of convergence differ noticeably between the case of regular noise and that of super-regular noise. The smoothness of the underlying unknown density is relevant for the evaluation of the MISE.
Date: 2004-01-01
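A numerical sketch of a deconvolution kernel density estimate of this flavour (Gaussian noise with known variance and a triangular Fourier-domain window; not the paper's exact kernel or bandwidth choice, and all parameter values are illustrative):

```python
import numpy as np

# Sketch: observations Y_j = X_j + eps_j with eps ~ N(0, noise_sd^2) known.
# The estimate's Fourier transform is  phi_hat_Y(u) * K_ft(h*u) / phi_eps(u),
# inverted numerically on a grid; K_ft is compactly supported, so the
# division by the noise characteristic function stays stable.

def deconv_density(ys, grid, h, noise_sd):
    u = np.linspace(-1.0 / h, 1.0 / h, 512)               # support of K_ft(h*u)
    phi_emp = np.exp(1j * np.outer(u, ys)).mean(axis=1)   # empirical CF of Y
    phi_eps = np.exp(-0.5 * (noise_sd * u) ** 2)          # known noise CF
    k_ft = np.clip(1.0 - np.abs(h * u), 0.0, None)        # triangular window
    integrand = phi_emp * k_ft / phi_eps
    du = u[1] - u[0]
    f = (integrand[None, :] * np.exp(-1j * np.outer(grid, u))).sum(axis=1)
    return np.clip(np.real(f) * du / (2 * np.pi), 0.0, None)

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 2000)          # unobserved sample from the target density
y = x + rng.normal(0.0, 0.3, 2000)      # noisy observations
grid = np.linspace(-4.0, 4.0, 81)
f_hat = deconv_density(y, grid, h=0.4, noise_sd=0.3)
```

The ISE of such an estimate is then simply the grid approximation of the integral of (f_hat minus the true density) squared; the abstract's result concerns the asymptotic distribution of that quantity.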
http://hdl.handle.net/2099/3746
Title: On best affine unbiased covariance-preserving prediction of factor scores
Authors: Neudecker, Heinz
Abstract: This paper gives a generalization of results presented by ten Berge, Krijnen, Wansbeek & Shapiro. They examined procedures and results proposed by Anderson & Rubin, McDonald, Green, and Krijnen, Wansbeek & ten Berge. We consider the same matter under weaker rank assumptions. We allow some moments, namely the variance of the observable scores vector and that of the unique factors, to be singular. We require T′T > 0, where TT′ is a Schur decomposition. As usual, the variance of the common factors and the loadings matrix A will have full column rank.
Date: 2004-01-01
http://hdl.handle.net/2099/3745
Title: Local superefficiency of data-driven projection density estimators in continuous time
Authors: Bosq, Denis; Blanke, Delphine
Abstract: We construct a data-driven projection density estimator for continuous time processes. This estimator reaches superoptimal rates over a class F0 of densities that is dense in the family of all possible densities, and a «reasonable» rate elsewhere. The class F0 may be chosen beforehand by the analyst. Results apply to Rd-valued processes and to N-valued processes. In the particular case where a square-integrable local time exists, it is shown that our estimator is strictly better than the local time estimator over F0.
Date: 2004-01-01
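The projection idea can be sketched in a simpler i.i.d. setting: expand the density in an orthonormal basis, estimate the coefficients by sample means, and keep only coefficients that exceed their estimated noise level. This hard-threshold rule is an illustrative stand-in for the paper's continuous-time, data-driven truncation, and the cosine basis and constants are assumptions.

```python
import numpy as np

def projection_density(xs, grid, max_terms=30):
    """Orthogonal-series density estimate on [0,1] with a cosine basis."""
    n = xs.size
    f = np.ones_like(grid)                            # coefficient of phi_0 = 1
    for k in range(1, max_terms + 1):
        phi_k = np.sqrt(2.0) * np.cos(np.pi * k * xs)
        c_k = phi_k.mean()                            # estimated Fourier coefficient
        if abs(c_k) > 2.0 * phi_k.std() / np.sqrt(n): # keep only "significant" terms
            f += c_k * np.sqrt(2.0) * np.cos(np.pi * k * grid)
    return f

rng = np.random.default_rng(2)
xs = rng.beta(2, 2, 5000)              # true density is 6x(1-x) on [0,1]
grid = np.linspace(0.0, 1.0, 101)
f_hat = projection_density(xs, grid)
```

Densities whose coefficients decay fast need very few retained terms, which is the mechanism behind the superoptimal rates over F0 described in the abstract.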
http://hdl.handle.net/2099/3744
Title: Modelling stock returns with AR-GARCH processes
Authors: Ferenstein, Elzbieta; Gasowski, Miroslaw
Abstract: Financial returns are often modelled as autoregressive time series with random disturbances having conditionally heteroscedastic variances, especially with GARCH-type processes. GARCH processes have been intensively studied in the financial and econometric literature as risk models for many financial time series. Analyzing two data sets of stock prices, we try to fit AR(1) processes with GARCH or EGARCH errors to the log returns. Moreover, hyperbolic or generalized error distributions turn out to be good models for the white noise distributions.
Date: 2004-01-01
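The model structure referred to in the abstract can be made concrete with a small simulation; the coefficients below are illustrative placeholders, not estimates from the paper's stock data.

```python
import numpy as np

# AR(1) returns with GARCH(1,1) errors:
#   r_t = c + a*r_{t-1} + e_t,   e_t = sigma_t * z_t,
#   sigma_t^2 = omega + alpha*e_{t-1}^2 + beta*sigma_{t-1}^2.

def simulate_ar1_garch11(n, c=0.0, a=0.1, omega=1e-5, alpha=0.08, beta=0.9,
                         seed=3):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)                     # Gaussian white noise
    r = np.zeros(n)
    e = np.zeros(n)
    s2 = np.full(n, omega / (1 - alpha - beta))    # start at stationary variance
    for t in range(1, n):
        s2[t] = omega + alpha * e[t - 1] ** 2 + beta * s2[t - 1]
        e[t] = np.sqrt(s2[t]) * z[t]
        r[t] = c + a * r[t - 1] + e[t]
    return r, e, s2

r, e, s2 = simulate_ar1_garch11(5000)
```

In practice such models are fitted by (quasi-)maximum likelihood, for instance with the `arch` Python package; the simulation above just shows the recursion that produces the volatility clustering GARCH models are used for.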
http://hdl.handle.net/2099/3741
Title: Improving both domain and total area estimation by composition
Authors: Costa, Àlex; Satorra, A.; Ventura, Eva
Abstract: In this article we propose small area estimators for both the small and large area parameters. When the objective is to estimate parameters at both levels, optimality is achieved by a sample design that combines fixed and proportional allocation. In such a design, one fraction of the sample is distributed proportionally among the small areas and the rest is evenly distributed. Simulation is used to assess
the performance of the direct estimator and two composite small area estimators, for a range of sample sizes and different sample distributions. Performance is measured in terms of mean squared errors for both small and large area parameters. Small area composite estimators open the possibility of reducing the sample size when the desired precision is given, or of improving precision for a given sample size.
Date: 2004-01-01
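The composition idea, shrinking each direct area estimate toward a synthetic overall estimate, can be sketched as follows; the weighting rule and the constant `k` are a common textbook choice, not the paper's design-based weights.

```python
import numpy as np

def composite_estimates(samples, k=20.0):
    """samples: list of 1-D arrays, one per small area.

    Returns a convex combination of each area's direct mean and the
    overall (synthetic) mean, leaning on the direct estimate more
    as the area's sample size grows.
    """
    overall = np.mean(np.concatenate(samples))        # synthetic estimator
    out = []
    for s in samples:
        lam = len(s) / (len(s) + k)                   # more data -> more direct
        out.append(lam * s.mean() + (1 - lam) * overall)
    return np.array(out)

rng = np.random.default_rng(4)
true_means = np.array([10.0, 12.0, 11.0])
samples = [rng.normal(m, 5.0, n) for m, n in zip(true_means, [5, 50, 500])]
est = composite_estimates(samples)
```

For the area with only 5 observations, the composite estimate borrows heavily from the overall mean; for the area with 500, it is nearly the direct mean, which is the trade-off the simulation study in the abstract quantifies.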
http://hdl.handle.net/2099/3740
Title: Incorporating patients' characteristics in cost-effectiveness studies with clinical trial data: a flexible Bayesian approach
Authors: Vázquez Polo, Francisco J.; Negrín Hernández, Miguel Angel
Abstract: Most published research comparing medical treatment options merely compares the results (effectiveness and cost) obtained in each treatment group. The present work proposes incorporating other patient characteristics into the analysis. Most studies in this context assume normality of both costs and effectiveness; in practice, however, the data are not always distributed according to this assumption, and alternative models have to be developed. In this paper, we present a general model of cost-effectiveness incorporating both binary effectiveness and skewed cost. In a practical application, we compare two highly active antiretroviral treatments applied to asymptomatic HIV patients. We propose a logit model when effectiveness is measured by whether an initial goal is achieved; for this model, the measure used to compare treatments is the difference in the probability of success. Moreover, the cost data usually present right skewness, so we propose a log-transformation for the cost regression model. The three models are fitted, demonstrating the advantages of this modelling approach. The cost-effectiveness acceptability curve is used as a measure for decision-making.
Date: 2004-01-01
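The decision-making tool named at the end of the abstract, the cost-effectiveness acceptability curve (CEAC), has a simple Monte Carlo form: given posterior draws of incremental effectiveness and incremental cost, the CEAC at willingness-to-pay R is the posterior probability that the incremental net benefit R*dE - dC is positive. The draws below are simulated placeholders, not the paper's HIV-trial posteriors.

```python
import numpy as np

def ceac(d_eff, d_cost, wtp_grid):
    """P(incremental net benefit > 0) at each willingness-to-pay value."""
    return np.array([np.mean(wtp * d_eff - d_cost > 0) for wtp in wtp_grid])

rng = np.random.default_rng(5)
d_eff = rng.normal(0.05, 0.02, 10000)     # draws of incremental prob. of success
d_cost = rng.normal(300.0, 150.0, 10000)  # draws of incremental cost
wtp = np.linspace(0.0, 20000.0, 21)
curve = ceac(d_eff, d_cost, wtp)
```

Read off the curve at the decision-maker's willingness-to-pay: a value near 1 means the new treatment is almost certainly cost-effective at that threshold, near 0 that it almost certainly is not.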