CMX is a research group aimed at the development and analysis of novel algorithmic ideas underlying emerging applications in the physical, biological, social and information sciences. We are distinguished by a shared value system built on the development of foundational mathematical understanding, and on the deployment of this understanding to address key emerging scientific and technological challenges.


Faculty

Venkat Chandrasekaran
Mathieu Desbrun
Thomas Hou
Houman Owhadi
Peter Schröder
Andrew Stuart
Joel Tropp

Von Karman Instructors

Franca Hoffmann
Ka Chun Lam

Postdoctoral Researchers

Alfredo Garbuno-Inigo
Bamdad Hosseini
Pengfei Liu
Krithika Manohar
Melike Sirlanci

Grad Students

Max Budninskiy
Utkan Candogan
JiaJie Chen
De Huang
Nikola Kovachki
Matt Levine
Riley Murray
Florian Schaefer
Yong Shen Soh
Yousuf Soliman
Armeen Taeb
Gene R. Yoo
Shumao Zhang

Lunch Seminars

(Held at 12 noon in Annenberg 213, unless otherwise specified.)


May 7, 2019
James Saunderson
Certifying polynomial nonnegativity via hyperbolic optimization
     Certifying nonnegativity of multivariate polynomials is fundamental to solving optimization problems modeled with polynomials. One well-known way to certify nonnegativity is to express a polynomial as a sum of squares, and the search for such a certificate can be carried out via semidefinite optimization. An interesting generalization of semidefinite optimization, which retains many of its good algorithmic properties, is hyperbolic optimization. Are there natural certificates of nonnegativity that we can search for via hyperbolic optimization, and that are not obviously captured by sums of squares? If so, these could generate hyperbolic optimization-based relaxations of optimization problems that are, in some sense, stronger than semidefinite optimization-based relaxations.
In this talk, I will describe one candidate for such "hyperbolic certificates of nonnegativity" and discuss what is known about their relationship with sums of squares.
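As a concrete illustration of the sum-of-squares certificates mentioned above (an editorial sketch, not material from the talk): a polynomial p is certified nonnegative by writing it as z^T Q z for a vector of monomials z and a positive semidefinite Gram matrix Q, since any factorization Q = V^T V exhibits p as a sum of squares. The univariate quartic below is a simple worked example.

```latex
% Worked example of an SOS certificate for p(x) = x^4 + 2x^2 + 1.
% With z = (1, x, x^2)^T and the rank-one PSD Gram matrix Q = v v^T,
% where v = (1, 0, 1)^T, we get p(x) = z^T Q z = (v^T z)^2.
\[
p(x) = x^4 + 2x^2 + 1
     = \begin{pmatrix} 1 & x & x^2 \end{pmatrix}
       \begin{pmatrix} 1 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 1 \end{pmatrix}
       \begin{pmatrix} 1 \\ x \\ x^2 \end{pmatrix}
     = (1 + x^2)^2 \;\ge\; 0 .
\]
```

Searching over all PSD matrices Q whose entries are consistent with the coefficients of p is a semidefinite feasibility problem, which is what makes this style of certificate algorithmically tractable.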
September 25, 2019
Jose Antonio Carrillo
Topic to be Announced

Other Seminars


May 16, 2019
C.-C. Jay Kuo
Interpretable Convolutional Neural Networks (CNNs) via Feedforward Design
     Given a convolutional neural network (CNN) architecture, its network parameters are nowadays determined by backpropagation (BP). Despite a large amount of theoretical investigation, the underlying mechanism remains a black box. In this talk, I describe a new interpretable and feedforward (FF) design, with LeNet-5 as an example. The FF design is a data-centric approach that derives network parameters layer by layer, in one pass, from the statistics of the training data. To build the convolutional layers, we develop a new signal transform, called the Saab (Subspace approximation with adjusted bias) transform, in which the bias in the filter weights is chosen to annihilate the nonlinearity of the activation function. To build the fully-connected (FC) layers, we adopt a label-guided linear least-squares regression (LSR) method. The classification performance of BP- and FF-trained CNNs is compared on the MNIST and CIFAR-10 datasets. The computational complexity of the FF design is significantly lower than that of the BP design, making the FF-trained CNN well suited to mobile/edge computing. We also comment on the relationship between the BP and FF designs by examining the cross-entropy values at nodes of intermediate layers.
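To make the Saab construction concrete, here is a minimal numpy sketch (an editorial illustration based on the abstract, not the speaker's implementation): the DC filter is a constant vector, the AC filters are principal components of the DC-removed training patches, and a shared bias shifts all AC responses to be nonnegative, so that a subsequent ReLU leaves them unchanged. The function names (saab_fit, saab_transform), the single-stage setup, and the single shared bias are simplifying assumptions.

```python
import numpy as np

def saab_fit(patches, num_ac_filters):
    """Fit a one-stage Saab-like transform from flattened training patches.

    patches: (n_samples, dim) array.
    Returns the DC filter, the AC filter bank, and a bias chosen so that
    every AC response on the training data is nonnegative (hence ReLU
    acts as the identity, "annihilating" the nonlinearity).
    """
    dim = patches.shape[1]
    dc = np.ones(dim) / np.sqrt(dim)              # DC filter: normalized constant
    residual = patches - np.outer(patches @ dc, dc)
    cov = residual.T @ residual / residual.shape[0]
    _, eigvecs = np.linalg.eigh(cov)              # eigenvalues in ascending order
    ac = eigvecs[:, ::-1][:, :num_ac_filters].T   # top PCs as rows (AC filters)
    bias = max(0.0, -(residual @ ac.T).min())     # shift all AC responses >= 0
    return dc, ac, bias

def saab_transform(patches, dc, ac, bias):
    """Apply the fitted transform followed by ReLU (a no-op on training data)."""
    dc_resp = patches @ dc
    residual = patches - np.outer(dc_resp, dc)
    out = np.concatenate([dc_resp[:, None], residual @ ac.T + bias], axis=1)
    return np.maximum(out, 0.0)

# Example: 1000 random 5x5 grayscale patches, keeping 8 AC channels.
X = np.random.rand(1000, 25)
dc, ac, bias = saab_fit(X, num_ac_filters=8)
features = saab_transform(X, dc, ac, bias)        # shape (1000, 9)
```

In the paper's multi-stage design, these responses would be reshaped and fed to the next convolutional stage, with the FC layers then fit by label-guided least-squares regression; this sketch stops at a single stage.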


Meetings and Workshops


Past Events

Lunch Seminars · Other Seminars · Meetings & Workshops