Tensor methods have been attracting increasing interest over the past decade, finding a plethora of important signal processing, data analysis, and machine learning applications. Tensors have been extensively employed in several research areas such as computer vision, chemometrics, bioinformatics, communications, array processing, network analysis, data mining, and deep learning, among others. The wide success of tensor methods can be attributed to their inherent ability to better model, analyze, predict, recognize, and learn from multi-modal data. This symposium wishes to serve as a global forum for researchers to meet, exchange ideas, and present new theoretical and algorithmic findings related to tensor methods. Building on broad signal processing and machine learning foundations, this symposium aspires to foster interdisciplinary discussions and bring together researchers from both academia and industry.
University of Virginia
We reveal an interesting link between tensors and multivariate statistics. The rank of a multivariate probability tensor can be interpreted as a nonlinear measure of statistical dependence of the associated random variables. Rank equals one when the random variables are independent, and complete statistical dependence corresponds to full rank; but we show that rank as low as two can already model strong statistical dependence. In practice we usually work with random variables that are neither independent nor fully dependent -- partial dependence is typical, and can be modeled using a low-rank multivariate probability tensor. Directly estimating such a tensor from sample averages is impossible even for as few as ten random variables taking ten values each -- yielding ten billion unknowns; but we often have enough data to estimate lower-order marginalized distributions. We prove that it is possible to identify the higher-order joint probabilities from lower-order ones, provided that the higher-order probability tensor has low-enough rank, i.e., the random variables are only partially dependent. We also provide a computational identification algorithm that is shown to work well on both simulated and real data. The insights and results have numerous applications in estimation, hypothesis testing, completion, machine learning, and system identification. Low-rank tensor modeling thus provides a 'universal' non-parametric (model-free) alternative to probabilistic graphical models.
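To make the low-rank probability-tensor idea concrete, here is a minimal numerical sketch (illustrative only; the variable names and construction are our own, not taken from the talk). A rank-2 joint PMF of three discrete variables is built in CP (naive-Bayes mixture) form, and its pairwise marginals, which are what one can realistically estimate from data, are obtained by summing out one mode:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_stochastic(n, r):
    """Hypothetical helper: an n x r matrix whose columns are PMFs."""
    M = rng.random((n, r))
    return M / M.sum(axis=0)

# Rank-2 joint PMF of three variables, each taking n values:
# P[i, j, k] = sum_r w[r] * A[i, r] * B[j, r] * C[k, r]
n, rank = 10, 2
w = np.array([0.6, 0.4])                     # mixture (latent-state) weights
A, B, C = (random_stochastic(n, rank) for _ in range(3))
P = np.einsum('r,ir,jr,kr->ijk', w, A, B, C)

print(P.sum())                               # sums to 1: a valid joint PMF

# A pairwise marginal is cheap to estimate from samples; analytically it
# inherits the CP structure: P_XY = sum_r w[r] * outer(A[:, r], B[:, r])
P_XY = P.sum(axis=2)
print(np.linalg.matrix_rank(P_XY))           # low rank survives marginalization
```

The point of the sketch is that marginalization preserves the low-rank factors, which is what makes recovering the full joint distribution from lower-order marginals plausible when the rank is small.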
Nikos Sidiropoulos received the Diploma in Electrical Engineering from the Aristotle University of Thessaloniki, Greece, and M.S. and Ph.D. degrees in Electrical Engineering from the University of Maryland–College Park, in 1988, 1990, and 1992, respectively. He served as Assistant Professor at the University of Virginia (1997-1999); Associate Professor at the University of Minnesota–Minneapolis (2000-2002); Professor at the Technical University of Crete, Chania–Crete, Greece (2002-2011); and Professor at the University of Minnesota–Minneapolis (2011-), where he holds an ADC Chair in Digital Technology (2015-). His research interests are in signal processing, communications, optimization, tensor decomposition, and factor analysis. His current research focuses primarily on signal and tensor analytics and optimization-based algorithms, with applications in machine learning and communications. He received the NSF/CAREER award in 1998, and the IEEE Signal Processing Society (SPS) Best Paper Award in 2001, 2007, and 2011. He served as IEEE SPS Distinguished Lecturer (2008-2009) and as Chair of the IEEE Signal Processing for Communications and Networking Technical Committee (2007-2008), and was recently elected Vice President - Membership of the IEEE Signal Processing Society. He received the 2010 IEEE Signal Processing Society Meritorious Service Award, and the 2013 Distinguished Alumni Award from the University of Maryland, Dept. of ECE. He is a Fellow of IEEE (2009) and a Fellow of EURASIP (2014).
|Wednesday, November 28||
|09:40 - 10:40|DL-TM.1: Nikos Sidiropoulos: "Tensors and Probability: An Intriguing Union"|
|11:00 - 12:30|TM-L.1: Theory/Algorithms I|
|14:00 - 15:30|TM-L.2: Theory/Algorithms II|
|15:50 - 17:20||
Submissions are welcome on topics including:
Prospective authors are invited to submit full-length papers (up to 4 pages for technical content including figures and possible references, with one additional optional 5th page containing only references) and extended abstracts (up to 2 pages, for paper-less industry presentations and Ongoing Work presentations). Manuscripts should be original (not submitted or published anywhere else) and written in accordance with the standard IEEE double-column paper template. Accepted full-length papers will be indexed in IEEE Xplore. Accepted abstracts will not be indexed in IEEE Xplore; however, the abstracts and/or the presentations will be included in the IEEE SPS SigPort. Accepted papers and abstracts will be scheduled in lecture and poster sessions.
|Paper Submission Deadline||
|Review Results Announced|September 7, 2018|
|Camera-Ready Papers Due|September 24, 2018|
|Hotel Room Reservation Deadline|November 5, 2018|