光華講壇——社會(huì)名流與企業(yè)家論壇第6721期
主 題:Deep Over-parameterized Smoothing Quantile Regression深度過(guò)參數(shù)化平滑分位數(shù)回歸
主講人:上海財(cái)經(jīng)大學(xué) 馮興東教授
主持人:統(tǒng)計(jì)學(xué)院 林華珍教授
時(shí)間:1月22日 13:00-14:00
舉辦地點(diǎn):柳林校區(qū)弘遠(yuǎn)樓408會(huì)議室
主辦單位:統(tǒng)計(jì)研究中心和統(tǒng)計(jì)學(xué)院 科研處
About the Speaker:
Xingdong Feng received his Ph.D. from the University of Illinois at Urbana-Champaign and is currently Dean of the School of Statistics and Management at Shanghai University of Finance and Economics, where he is a professor of statistics and doctoral supervisor. His research interests include dimension reduction, robust methods, quantile regression and its applications to economic problems, statistical computing for big data, and reinforcement learning. He has published numerous papers in leading international statistics and econometrics journals, including JASA, AoS, JRSSB, Biometrika, and JoE, as well as in the artificial intelligence journal and conference venues JMLR and NeurIPS. He was named an Elected Member of the International Statistical Institute in 2018. In 2019 he became Vice President of the National Association of Young Statisticians and a specialist member (Data Science and Big Data Applications group) of the 7th National Committee for the Compilation and Review of Statistics Textbooks; in 2020 he joined the 8th Discipline Evaluation Group (Statistics) of the State Council; in 2022 he became a member of the National Steering Committee for Master's Education in Applied Statistics; and in 2023 he became Vice President of the National Industrial Statistics Teaching and Research Association and a standing council member of the Probability and Statistics Society of the Chinese Mathematical Society. Since 2022 he has also served as an Associate Editor of the leading international statistics journals Annals of Applied Statistics and Statistica Sinica, and as an editorial board member of the leading domestic statistics journal《統計研究》(Statistical Research).
內(nèi)容簡(jiǎn)介:
In this work, we provide rigorous statistical guarantees (oracle inequalities) for deep nonparametric smoothing quantile regression with over-parameterized ReLU neural networks. Taking the weight-norm control of the networks into account, we derive these oracle inequalities by balancing the trade-off between the approximation and statistical (generalization) errors. We establish a novel result on the approximation capability of over-parameterized ReLU neural networks for Hölder functions. Furthermore, we show that the resulting statistical error is independent of the network size, a notable advantage. Importantly, our findings indicate that the curse of dimensionality can be effectively mitigated when the data are supported on a low-dimensional Riemannian manifold.
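For readers who want a concrete picture of the kind of estimator discussed above, the Python sketch below fits a conditional quantile with a wide (over-parameterized) ReLU network and a Gaussian-kernel convolution-smoothed check loss. This is only an illustrative toy under assumed choices: the Gaussian smoothing, the network width, the bandwidth h, the weight-decay strength, and the synthetic data are hypothetical and are not taken from the talk or the underlying paper.

# Illustrative sketch only (not the speaker's implementation): smoothed quantile
# regression with an over-parameterized ReLU network.
import math
import torch
import torch.nn as nn

def smoothed_check_loss(u, tau, h):
    """Gaussian-smoothed check loss: l_h(u) = h*phi(u/h) + u*(tau - Phi(-u/h))."""
    z = u / h
    phi = torch.exp(-0.5 * z**2) / math.sqrt(2.0 * math.pi)
    Phi = torch.distributions.Normal(0.0, 1.0).cdf(-z)
    return h * phi + u * (tau - Phi)

# Wide (over-parameterized) two-hidden-layer ReLU network; the width 2048 is a
# hypothetical choice.
net = nn.Sequential(
    nn.Linear(5, 2048), nn.ReLU(),
    nn.Linear(2048, 2048), nn.ReLU(),
    nn.Linear(2048, 1),
)

# Weight decay serves here as a crude stand-in for the weight-norm control
# discussed in the abstract.
opt = torch.optim.Adam(net.parameters(), lr=1e-3, weight_decay=1e-4)

# Synthetic data: estimate the tau-th conditional quantile of y given x.
torch.manual_seed(0)
X = torch.randn(1000, 5)
y = torch.sin(X[:, :1]) + 0.5 * torch.randn(1000, 1)

tau, h = 0.5, 0.1
for epoch in range(200):
    opt.zero_grad()
    residual = y - net(X)
    loss = smoothed_check_loss(residual, tau, h).mean()
    loss.backward()
    opt.step()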
在這項(xiàng)工作中,主講人利用超參數(shù)化ReLU神經(jīng)網(wǎng)絡(luò)為深度非參數(shù)平滑分位數(shù)回歸提供了嚴(yán)格的統(tǒng)計(jì)保證(oracle不等式)??紤]到神經(jīng)網(wǎng)絡(luò)的權(quán)范數(shù)控制,主講人通過(guò)平衡近似誤差和統(tǒng)計(jì)(泛化)誤差來(lái)推導(dǎo)這些oracle不等式,并就過(guò)參數(shù)化ReLU神經(jīng)網(wǎng)絡(luò)對(duì)H?lder函數(shù)的逼近能力建立了一個(gè)新的結(jié)果。此外,主講人成果的其中一個(gè)顯著優(yōu)勢(shì)是證明了統(tǒng)計(jì)誤差的實(shí)現(xiàn)與網(wǎng)絡(luò)規(guī)模無(wú)關(guān)。重要的是,研究結(jié)果表明,如果在低維黎曼流形上得到支持?jǐn)?shù)據(jù),可以有效地減輕維數(shù)災(zāi)難。