
Conclusion and outlook


In this setup, too, we defined and investigated a common linear partial mean impact.

Applications of this common linear mean impact are again polynomial impacts, which account for possible polynomial influences; more generally, we can fit (almost) any additive model in X1 and account for (almost) any influence of X2, ..., Xk that can be expressed by an additive model.
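As an illustration only, the following Python sketch computes a plug-in estimate of such a common mean impact over a user-supplied basis (for instance, polynomial terms in X1 together with additive terms in the remaining covariates). The estimator form used here (the standard deviation of the OLS fitted values) and all function names are assumptions made for this sketch, not the exact definitions and estimators derived in the thesis.

```python
# Hypothetical plug-in sketch of a common linear mean impact over an additive basis.
# Assumption: the impact is estimated by the standard deviation of the OLS fit of Y
# on the basis functions; this is an illustration, not the thesis' exact estimator.
import numpy as np

def common_linear_mean_impact(y, basis):
    """basis: (n, m) matrix of evaluated basis functions, e.g. polynomial
    terms in X1 plus additive terms in X2, ..., Xk (no intercept column)."""
    X = np.column_stack([np.ones(len(y)), basis])   # add intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # ordinary least squares
    fitted = X @ beta
    return np.std(fitted)                           # plug-in impact estimate

# Example: cubic polynomial terms in X1, adjusting additively for X2
rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=200), rng.normal(size=200)
y = np.sin(x1) + 0.5 * x2 + rng.normal(scale=0.3, size=200)
print(common_linear_mean_impact(y, np.column_stack([x1, x1**2, x1**3, x2])))
```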

For the kernel-smoother based mean impact we also derived a partial mean impact.

The partial mean impact from Scharpenberg (2012) uses orthogonal projections. In this thesis we also derived an approach that does not need such projections. It quantifies the influence of X1 on Y that goes beyond the possible influence of the other covariates as the difference between the common mean impact of all variables and the common mean impact of all variables except X1. For all partial non-linear impact analyses we also derived partial non-linear mean slopes and partial non-linear measures of determination.
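As a rough illustration of this difference-based construction, the sketch below estimates the impact of a covariate set by the standard deviation of a Nadaraya-Watson fit of Y on those covariates and takes the difference between the fit with and without X1. The Gaussian product kernel, the fixed bandwidth h, and the function names are placeholders chosen for this example, not the estimators derived in the thesis.

```python
# Illustrative sketch of the difference-based partial impact (assumptions:
# impact of a covariate set = std. dev. of a Nadaraya-Watson fit of Y on it;
# Gaussian product kernel with a fixed, arbitrary bandwidth h).
import numpy as np

def nw_fit(y, X, h=0.5):
    """Nadaraya-Watson fitted values with a Gaussian product kernel."""
    diffs = X[:, None, :] - X[None, :, :]                       # pairwise differences
    weights = np.exp(-0.5 * np.sum((diffs / h) ** 2, axis=2))   # kernel weights
    return (weights @ y) / weights.sum(axis=1)

def partial_impact_difference(y, X, j, h=0.5):
    """Impact of column j beyond the remaining covariates (difference approach)."""
    impact_all = np.std(nw_fit(y, X, h))                               # all covariates
    impact_rest = np.std(nw_fit(y, np.delete(X, j, axis=1), h))        # all except column j
    return impact_all - impact_rest
```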

Simulations indicated that in the single-covariate case the kernel-smoother based mean impact is the most powerful approach, except when the true underlying regression relationship is linear; in that case, as expected, the linear mean impact performed best. The simulation results for the non-linear partial mean impact analysis showed that the performance of the different methods depends more strongly on the underlying scenario than in the single-covariate case. The linear partial mean impact analysis outperformed the other methods by far in a linear scenario. In moderately non-linear setups the polynomial partial mean impact performed best, while the kernel-smoother based partial mean impact analysis was the only method that still had reasonable power in highly non-linear scenarios.

The framework of mean impact analysis still offers many opportunities for further research. First of all, it is desirable to theoretically justify the use of a data-dependent bandwidth in the case of kernel smoothing. We are also interested in a theoretically justified method that allows the use of a kernel smoother without dropping the denominator, uses the full data set available, and maintains the coverage probability. Furthermore, it might be valuable to allow for splines with a data-dependent knot sequence. Another interesting topic for further research is the application of mean impact analysis to high-dimensional setups. It might be possible to apply data-reducing methods such as principal component analysis and use the mean impact analysis to obtain an interpretable and sensible measure of association.

References

Bickel, P. J. and D. Freedman (1981). Some asymptotic theory for the bootstrap. The Annals of Statistics 9.

Brannath, W. and M. Scharpenberg (2014). Interpretation of linear regression coefficients under mean model miss-specification. arXiv:1409.8544v4 [stat.ME].

Cleveland, W. S. (1979). Robust locally weighted regression and smoothing scatterplots. Journal of the American Statistical Association 74.

Davison, A. and D. Hinkley (2009). Bootstrap Methods and their Application (11th printing). Cambridge University Press.

Doksum, K. and A. Samarov (1995). Nonparametric estimation of global functionals and a measure of the explanatory power of covariates in regression. The Annals of Statistics 23.

Efron, B. (1979). Bootstrap methods: Another look at the jackknife. The Annals of Statistics 7.

Efron, B. (1987). Better bootstrap confidence intervals. Journal of the American Statistical Association 82.

Epanechnikov, V. A. (1969). Non-parametric estimation of a multivariate probability density. Theory of Probability & Its Applications 14.

Fischer, G. (2005). Lineare Algebra (15. Auflage). Wiesbaden: Vieweg.

Gasser, T. and H.-G. Müller (1979). Kernel estimation of regression functions. Heidelberg: Springer-Verlag.

Gasser, T., H.-G. Müller, W. Kohler, L. Molinari, and A. Prader (1984). Nonparametric regression analysis of growth curves. The Annals of Statistics 12.

Gasser, T., H.-G. Müller, and V. Mammitzsch (1985). Kernels for nonparametric curve estimation. Journal of the Royal Statistical Society. Series B (Methodological) 47.

Hall, P. (1988). Theoretical comparison of bootstrap confidence intervals. The Annals of Statistics 16.

Hall, P. (1992). The Bootstrap and Edgeworth Expansion. New York, Berlin, Heidelberg, London, Paris, Tokyo, Hong Kong, Barcelona, Budapest: Springer Verlag.

Härdle, W. and J. Marron (1991). Bootstrap simultaneous error bars for nonparametric regression. The Annals of Statistics 19.

Hastie, T., R. Tibshirani, and J. Friedman (2001). The Elements of Statistical Learning (2nd edition). New York, Berlin, Heidelberg: Springer-Verlag.

Hoeffding, W. (1948). A class of statistics with asymptotically normal distribution. Annals of Mathematical Statistics 19.

Huber, P. J. (1981). Robust Statistics. New York, Chichester, Brisbane, Toronto: John Wiley & Sons.

Karunamuni, R. and T. Alberts (2005). On boundary correction in kernel density estimation. Statistical Methodology 2.

Klenke, A. (2008). Wahrscheinlichkeitstheorie (2. Auflage). Berlin, Heidelberg: Springer-Verlag.

Kowalski, J. and X. Tu (2008). Modern Applied U-Statistics. Wiley.

Mack, Y. P. and B. W. Silverman (1982). Weak and strong uniform consistency of kernel regression estimates. Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete 61.

Nadaraya, E. A. (1964). On estimating regression. Theory of Probability & Its Applications 9.

Paulson, D. S. (2007). Handbook of Regression and Modeling: Applications for the Clinical and Pharmaceutical Industries. Chapman & Hall/CRC.

Pollard, D. (1984). Convergence of Stochastic Processes. New York, Berlin, Heidelberg, Tokyo: Springer-Verlag.

Powell, J. L., J. Stock, and T. Stoker (1989). Semiparametric estimation of index coefficients. Econometrica 57.

Scharpenberg, M. (2012). A population-based approach to analyze the influence of covariates. Diploma thesis, University of Bremen.

Serfling, R. J. (1980). Approximation Theorems of Mathematical Statistics. John Wiley & Sons.

van der Vaart, A. (2000). Asymptotic Statistics. Cambridge University Press.

von Mises, R. (1947). On the asymptotic distribution of differentiable statistical functions. Annals of Mathematical Statistics 18.

Watson, G. S. (1964). Smooth regression analysis. Sankhya: The Indian Journal of Statistics 26.

White, H. (1980a). A heteroskedasticity-consistent covariance matrix estimator and a direct test for heteroskedasticity. Econometrica 48.

White, H. (1980b). Using least squares to approximate unknown regression functions. International Economic Review 21.

Wu, C. F. J. (1986). Jackknife, bootstrap and other resampling methods in regression analysis. The Annals of Statistics 14.

A. Methodology

All literature used in this appendix can be found in the main literature list.

A.1. Nonparametric regression

Since we make extensive use of non-linear regression techniques in the course of this thesis, we give an introduction to them in the following sections. Besides polynomial regression, which is expected to be known to the reader, we will make use of kernel- and spline-based methods.
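As a small, self-contained illustration (not taken from the thesis), the following Python snippet fits a smoothing spline to simulated data; the simulated data, the smoothing parameter, and the use of scipy are assumptions made for this example only.

```python
# Minimal smoothing-spline example on simulated data (illustration only).
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-2, 2, size=100))           # design points (sorted)
y = np.sin(2 * x) + rng.normal(scale=0.3, size=100) # noisy non-linear response

spline = UnivariateSpline(x, y, s=100 * 0.3**2)     # smoothing parameter chosen ad hoc
y_hat = spline(x)                                   # fitted nonparametric regression curve
```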
