id: cr56n013179
author: Yinan Li
title: Noise Injection and Noise Augmentation for Model Regularization, Differential Privacy and Statistical Learning
date: 2020
pages:
extension: .txt
mime: text/plain
words: 382
sentences: 8
flesch: -4
summary: My dissertation includes: (1) whiteout in Neural Networks, which adaptively injects noise into nodes to achieve regularization effects and promote robustness; (2) fast Converging and Robust Optimal Path Selection (CROPS) in the continuous-time Markov-switching generalized autoregressive conditional heteroskedasticity (COMS-GARCH) process, where CROPS is a Bernoulli noise injection (NI) enhanced Markov Chain Expectation Maximization (MC-EM) algorithm that improves accuracy in both hidden-path identification and volatility estimation and achieves ensemble-learning and robustness effects; (3) AdaPtive Noise Augmentation (PANDA) in Generalized Linear Models (GLMs), which realizes a wide range of existing regularization effects, as well as exact L0 regularization with little computational burden through the orthogonal regularization I proposed, and provides tighter confidence intervals with higher coverage probability for both zero and non-zero estimated parameters under variable-selection regularization; (4) PANDA in Undirected Graphical Models (UGMs), which realizes both the likelihood-based graphical L0 I proposed for Gaussian Graphical Models (GGMs) and existing neighborhood-selection methods in UGMs; and (5) adaptive Noise Augmentation for differentially Private (NAP) empirical risk minimization. As part of the future work, I extended PANDA L0 regularization to Support Vector Machines (SVMs) and generalized the concept of orthogonal regularization to realize rank regularization in both multiple-response GLMs and tensor regressions. I expect the combination of rank regularization and NAP-ERM to show high utility while guaranteeing differential privacy (DP), and I expect that a more rigorous proof for graphical L0 can be obtained through duality.
cache: cache/cr56n013179.txt
txt: txt/cr56n013179.txt
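The core mechanism behind noise augmentation, realizing a regularizer by appending noisy pseudo-observations to the data, can be illustrated in its simplest (non-adaptive) form: for linear regression, appending Gaussian pseudo-rows with zero responses makes ordinary least squares on the augmented data approximate the ridge estimate in expectation. This is only a minimal sketch of the general idea, not PANDA itself; the variable names, the fixed noise scale, and the single-pass (non-iterative, non-adaptive) scheme are illustrative assumptions, whereas PANDA adaptively tunes the augmented noise to target specific regularizers such as L0.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 200, 5, 2.0

# Simulated data (illustrative, not from the dissertation)
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, -2.0, 0.0, 0.0, 3.0])
y = X @ beta_true + rng.normal(size=n)

# Closed-form ridge estimate for reference: (X'X + lam*I)^{-1} X'y
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Noise augmentation: append ne pseudo-rows whose covariates are Gaussian
# noise scaled so that E[Xe' Xe] = lam * I, with zero pseudo-responses.
ne = 20000
Xe = rng.normal(scale=np.sqrt(lam / ne), size=(ne, p))
ye = np.zeros(ne)

# Plain OLS on the augmented data: solves (X'X + Xe'Xe) b = X'y,
# which approximates the ridge solution since Xe'Xe ~ lam * I.
Xa = np.vstack([X, Xe])
ya = np.concatenate([y, ye])
beta_aug, *_ = np.linalg.lstsq(Xa, ya, rcond=None)

print(np.max(np.abs(beta_aug - beta_ridge)))  # small for this seed
```

The same augmented-loss viewpoint is what lets noise augmentation reach other penalties: changing the distribution and adaptive scaling of the pseudo-rows changes which regularizer the expected augmented loss corresponds to.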