Title:

Convergence for Kernel Minimum Error Entropy Principle

Speaker:

Prof. Ting Hu (Xi'an Jiaotong University)

Time:

Venue:

Lecture Hall (Room 103), first floor, Northwest Building, School of Sciences

Abstract:

Information-theoretic learning is a learning paradigm built on entropy and divergence concepts from information theory, and a variety of signal processing and machine learning methods fall into this framework. The minimum error entropy (MEE) principle is a typical example. In this talk, we study a kernel version of the minimum error entropy method, which can be used to uncover nonlinear structure in data. We show that kernel minimum error entropy can be implemented by kernel-based gradient descent algorithms, with or without regularization, and we derive convergence rates for both algorithms.
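To make the setting concrete, here is a minimal sketch of one common formulation of kernel MEE, not necessarily the exact algorithm analyzed in the talk: the error entropy is Rényi's quadratic entropy of the residuals, estimated through the empirical information potential V(e) = (1/n²) Σᵢⱼ G_σ(eᵢ − eⱼ) with a Gaussian window G_σ, and the hypothesis f = Σₖ αₖ K(·, xₖ) lives in a Gaussian RKHS. Gradient ascent on V (equivalently, descent on the entropy) updates the coefficients α; all function names, parameter values, and the 1-D toy data below are illustrative assumptions.

```python
import numpy as np

def gaussian(u, sigma):
    """Gaussian window G_sigma(u), up to its normalizing constant."""
    return np.exp(-u**2 / (2 * sigma**2))

def kernel_mee_gd(X, y, bandwidth=0.5, sigma=1.0, lr=0.2, steps=500, lam=0.0):
    """Kernel MEE sketch: gradient ascent on the empirical information
    potential V(e) = (1/n^2) sum_{i,j} G_sigma(e_i - e_j), with the
    hypothesis f = sum_k alpha_k K(., x_k) in a Gaussian RKHS.
    lam > 0 adds a Tikhonov (RKHS-norm) regularization term."""
    n = len(y)
    # Gram matrix of the RKHS kernel K on the inputs (1-D inputs for simplicity)
    D = (X[:, None] - X[None, :])**2
    K = np.exp(-D / (2 * bandwidth**2))
    alpha = np.zeros(n)
    for _ in range(steps):
        e = y - K @ alpha                      # residuals e_i
        diff = e[:, None] - e[None, :]         # pairwise differences e_i - e_j
        W = gaussian(diff, sigma) * diff
        # dV/de_i = -(2 / (n^2 sigma^2)) * sum_j G_sigma(diff_ij) * diff_ij
        grad_e = -2.0 / (n**2 * sigma**2) * W.sum(axis=1)
        # chain rule through e = y - K alpha, minus the regularizer's gradient
        grad_alpha = -K.T @ grad_e - 2 * lam * K @ alpha
        alpha += lr * grad_alpha               # ascent on V = descent on entropy
    return alpha, K

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.linspace(-1, 1, 40)
    y = np.sin(np.pi * X) + 0.05 * rng.standard_normal(40)
    alpha, K = kernel_mee_gd(X, y)
    # MEE determines f only up to an additive constant, so center
    # the residuals before measuring the fit
    e = y - K @ alpha
    e = e - e.mean()
    print(float(np.mean(e**2)))
```

Note the centering step at the end: minimizing the entropy of the errors is invariant under shifting all errors by a constant, which is why MEE-trained models are typically bias-corrected after training.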

Copyright (C) 2018. 正规beat365旧版绿色注册. All rights reserved. Luojia Hill, Wuchang, Wuhan, China.