From High-dimensional Linear Discriminant Analysis to Markowitz Portfolio Optimization
Abstract

High-dimensional linear discriminant analysis (HLDA) suffers from the difficulty of consistently estimating the covariance matrix. Recently, a linear programming discriminant (LPD) rule was proposed for high-dimensional linear discriminant analysis, and it has been shown to be Bayes consistent in high-dimensional settings. We further show that the LPD rule is sign consistent under a sparsity assumption. We then bridge HLDA to high-dimensional Markowitz portfolio optimization and propose a linear portfolio optimizer (LPO). Moreover, the LPO estimator is shown to asymptotically yield the maximum expected return while satisfying the risk constraint. Experiments on both synthetic and empirical data validate the performance of the proposed method.
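
For illustration, a minimal sketch of an LPD-type discriminant rule is given below, assuming the usual ℓ1-minimization formulation (minimize ||β||_1 subject to ||Σ̂β − δ̂||_∞ ≤ λ, where δ̂ is the difference of the sample class means); the function names, the tuning parameter λ, and the SciPy-based linear-programming solver are illustrative choices, not the talk's actual implementation.

```python
import numpy as np
from scipy.optimize import linprog

def lpd_direction(Sigma_hat, delta_hat, lam):
    """Estimate a discriminant direction beta via the linear program
        min ||beta||_1  s.t.  ||Sigma_hat @ beta - delta_hat||_inf <= lam.
    Splitting beta = u - v with u, v >= 0 gives a standard-form LP."""
    p = len(delta_hat)
    c = np.ones(2 * p)                        # sum(u) + sum(v) = ||beta||_1
    A = np.hstack([Sigma_hat, -Sigma_hat])    # Sigma_hat @ (u - v)
    A_ub = np.vstack([A, -A])                 # two-sided sup-norm constraint
    b_ub = np.concatenate([lam + delta_hat, lam - delta_hat])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    u, v = res.x[:p], res.x[p:]
    return u - v

def lpd_classify(x, mu1_hat, mu2_hat, beta_hat):
    """Assign x to class 1 if (x - (mu1 + mu2)/2)^T beta > 0, else class 2."""
    score = (x - (mu1_hat + mu2_hat) / 2) @ beta_hat
    return 1 if score > 0 else 2
```

The same ℓ1-constrained linear-programming template is what allows the abstract's bridge to Markowitz portfolio optimization: replacing the mean-difference vector with expected returns and interpreting the solution as portfolio weights yields an LPO-style estimator under analogous sparsity assumptions.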

Speaker: Dr Zhang Zhen
Date: 29 January 2019 (Tue)
Time: 10:00am - 11:00am

Biography

Dr Zhen Zhang received the Ph.D. degree in Applied Mathematics from The Hong Kong University of Science and Technology in 2013, and the B.S. degree in Mathematics from the University of Science and Technology of China in 2007. He worked as a postdoctoral fellow at the Department of Mathematics, National University of Singapore, from 2013 to 2015. He is the winner of the 2015 Hong Kong Mathematics Society Best Thesis Award and the holder of one US patent. He is now an Associate Professor at the Department of Mathematics, Southern University of Science and Technology, China. His research interests include numerical PDEs, multiscale modeling, and high-dimensional data analysis.