Spring 2020 SIP Seminars

Prof. Sidharth Jaggi

Title: Covert Communication, or, How to Whisper

Abstract: Covert communication tries to answer the following question: if Alice wishes to whisper to Bob while ensuring that the eavesdropper Eve cannot even detect whether or not Alice is whispering, how much can she whisper? Meeting such a stringent security requirement requires new ideas from information theory, coding theory, and cryptography. In this talk I will survey the state of the existing literature (recent information-theoretic capacity-style results for a variety of settings), and then discuss even more recent results. Specifically, I will highlight:
- Code constructions: computationally efficient code constructions that achieve the information-theoretic capacity bounds.
- Resilience to jamming: in some settings, Eve may not just be a passive eavesdropper, but may actively attempt to jam Alice's communication, even if she isn't sure whether or not Alice is actually whispering. I will discuss covert communication schemes that are resilient to such malicious jamming.
- Impact of environmental uncertainty: often, noise levels on the communication medium are not static, but stochastically varying (for instance, in fading channels). It turns out such natural variation can dramatically impact the capacity; indeed, in general such variation hurts Eve's detector much more than it hurts Bob's decoder.
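A toy numerical sketch (not from the talk) of the flavor of result the covert-communication literature establishes: if Alice scales her per-symbol amplitude like c/sqrt(n), a threshold detector on Eve's side stays confused no matter how long she listens. The AWGN model, the amplitude scaling, and the sum-threshold detector are all assumptions made for illustration; because the sum of n i.i.d. Gaussians is exactly Gaussian, the sum statistic is simulated directly.

```python
import numpy as np

rng = np.random.default_rng(0)

def eve_error(n, c=1.0, trials=4000):
    """Eve observes n channel uses: either pure noise N(0,1) per use, or
    noise plus a whisper of amplitude c/sqrt(n).  She thresholds the
    sample sum, which is exactly Gaussian, so we simulate the sum directly."""
    mu = c / np.sqrt(n)
    quiet = rng.normal(0.0, np.sqrt(n), trials)        # sum when Alice is silent
    whisper = rng.normal(mu * n, np.sqrt(n), trials)   # sum when Alice whispers
    thresh = 0.5 * mu * n                              # midpoint test on the sum
    return 0.5 * (np.mean(whisper < thresh) + np.mean(quiet >= thresh))

# With power scaled as c/sqrt(n), Eve's average detection error stays
# bounded away from 0 for every n -- the heart of "whispering" covertly.
for n in [100, 10_000, 1_000_000]:
    print(n, round(eve_error(n), 3))
```

The printed error hovers near a constant (about Phi(-c/2)) as n grows, which is why the number of covert bits scales only like the square root of the blocklength in many of these settings.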

Biography: Sidharth Jaggi received his B.Tech. from I.I.T. Bombay in 2000, and his M.S. and Ph.D. degrees from Caltech in 2001 and 2006 respectively, all in electrical engineering. He spent 2006 as a Postdoctoral Associate at LIDS, MIT. He joined the Department of Information Engineering at the Chinese University of Hong Kong in 2007, where he is now an Associate Professor. His interests lie at the intersection of network information theory, coding theory, and algorithms. His research group thus (somewhat unwillingly) calls itself the CAN-DO-IT team (Codes, Algorithms, Networks: Design and Optimization for Information Theory). Topics he has dabbled in include network coding, sparse recovery/group testing, and covert communication; his current obsession is adversarial channels.

Prof. Adam Charles

Title: Modern Methods for Calcium Imaging of Neural Population Activity

Abstract: Deciphering how the brain works requires new methods for recording and interpreting activity from large neural populations at single-neuron resolution. In this talk I will focus on one important and widely used optical neural recording modality: two-photon microscopy (TPM). Specifically, I will describe recent methodological advances for increasing the quality and quantity of inferred neural activity from TPM recordings, as well as novel assessment techniques. First, I will discuss new volumetric two-photon imaging of neurons using stereoscopy (vTwINS): a computational imaging (co-designed hardware/algorithm) approach that projects an entire volume onto each image and thus increases the quantity of imaged neurons with no reduction in frame rate. To infer the neural locations and activities, vTwINS relies on a co-designed greedy algorithm we developed that leverages knowledge of the optics to seed an adaptive matching-pursuit-type algorithm. Second, I will discuss the importance of accurate neural activity inference from TPM data and show that the basic model underlying all state-of-the-art algorithms can lead to critical errors. Specifically, bursts of activity, or transients, can contaminate the inferred activity of neighboring cells in ways that impact the ensuing scientific results. I will demonstrate a new algorithm that directly models unexplained activity with spatial structure and significantly reduces such cross-talk, to the benefit of scientific discovery. Finally, all advances in imaging must be understood and rigorously validated. Methods such as TPM, which image otherwise unobservable data, lack the ground truth needed to perform such validation. I will thus present a simulation-based approach to validation that uses known statistics of neural anatomy and optical propagation to generate realistic synthetic data. Multiple microscopy techniques and algorithms can be assessed using such data, enabling more rapid and confident development of new TPM techniques.
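For readers unfamiliar with the "matching-pursuit-type" greedy strategy the abstract mentions, here is a generic matching pursuit sketch (the vTwINS algorithm itself is co-designed with the optics and is more elaborate): repeatedly pick the dictionary atom most correlated with the residual and peel off its contribution. The random dictionary and sparse signal below are purely illustrative.

```python
import numpy as np

def matching_pursuit(D, y, n_iter=20):
    """Greedy sparse recovery: at each step, select the unit-norm atom
    of D most correlated with the residual and subtract its projection."""
    residual = y.copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iter):
        corr = D.T @ residual
        k = np.argmax(np.abs(corr))
        coeffs[k] += corr[k]              # atoms assumed unit-norm
        residual -= corr[k] * D[:, k]
    return coeffs, residual

rng = np.random.default_rng(1)
D = rng.normal(size=(64, 128))
D /= np.linalg.norm(D, axis=0)            # normalize atoms
x_true = np.zeros(128)
x_true[[5, 40]] = [2.0, -1.5]             # a 2-sparse ground truth
y = D @ x_true
coeffs, r = matching_pursuit(D, y)
print(np.linalg.norm(r))                  # residual shrinks toward 0
```

In the imaging setting the "atoms" would be spatial profiles of candidate neurons, and the optics-aware seeding replaces the blind argmax above.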

Biography: Adam Charles received both a B.E. and an M.E. in Electrical and Computer Engineering in 2009 from The Cooper Union in New York City. He received his Ph.D. in Electrical and Computer Engineering in 2015, working under Dr. Christopher Rozell at the Georgia Institute of Technology, where his research was awarded a Sigma Xi Best Doctoral Thesis award as well as an Electrical and Computer Engineering Research Excellence award. After graduating, Adam joined the Princeton Neuroscience Institute, working with Dr. Jonathan Pillow on computational neuroscience and data analysis methods. Adam is now joining the Biomedical Engineering Department at Johns Hopkins University, where his research includes neural imaging technologies, inference and tracking of sparse and structured signals, and mathematical modeling of neural networks.

Dr. Ruobin Gong

Title: Exact Statistical Inference for Differentially Private Data

Abstract: Differential privacy (DP) is a mathematical framework that protects confidential information in a transparent and quantifiable way. I discuss how two classes of approximate computation techniques can be systematically adapted to produce exact statistical inference from DP data. For likelihood inference, we propose an importance sampling implementation of Monte Carlo expectation-maximization; for Bayesian inference, an approximate Bayesian computation (ABC) algorithm suitable for possibly complex likelihoods. Both approaches deliver exact statistical inference with respect to the joint statistical model inclusive of the differential privacy mechanism, yet do not require analytical access to that joint specification. Highlighted is a transformation of the statistical tradeoff between privacy and efficiency into a computational tradeoff between approximation and exactness. Open research questions are posed on two fronts: 1) how to provide DP data users with computationally accessible and (approximately) correct statistical analysis tools; 2) how to understand and remedy the effect of any necessary post-processing on statistical analysis.
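A minimal sketch of the ABC idea in a DP setting (a toy stand-in, not the talk's algorithm): private Bernoulli data are released as a noisy proportion via the classic Laplace mechanism, and rejection ABC accepts a candidate parameter with probability proportional to the Laplace density of the released value given a freshly simulated private statistic. Because that acceptance kernel is the privacy mechanism itself, the accepted draws target the posterior under the joint model (data plus mechanism). The prior, sample size, and privacy budget below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

# Private data: n Bernoulli(theta) responses; the released statistic is
# the sample proportion plus Laplace noise (sensitivity 1/n, budget eps).
n, eps, theta_true = 200, 1.0, 0.3
data = rng.binomial(1, theta_true, n)
release = data.mean() + rng.laplace(scale=1.0 / (n * eps))

def abc_posterior(release, draws=20000):
    """Rejection ABC whose kernel equals the DP mechanism's density, so
    accepted thetas follow the posterior under the joint model."""
    theta = rng.uniform(0, 1, draws)               # uniform prior draws
    sims = rng.binomial(n, theta) / n              # simulated proportions
    scale = 1.0 / (n * eps)
    kernel = np.exp(-np.abs(release - sims) / scale)   # Laplace density, up to a constant
    accept = rng.uniform(size=draws) < kernel
    return theta[accept]

post = abc_posterior(release)
print(post.mean(), post.size)                      # posterior mean near theta_true
```

The posterior concentrates near the released proportion, and no analytical form for the convolved likelihood (binomial plus Laplace) was ever needed.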

Biography: Ruobin Gong is Assistant Professor of Statistics at Rutgers University. Her research interests lie in the theoretical foundations of Bayesian and generalized Bayesian methodologies; statistical modeling, inference, and computation with differentially private data; and the ethical implications of modern data science. Her current research on Bayesian methods for differential privacy is supported by the National Science Foundation. Ruobin received her Ph.D. in statistics from Harvard University in 2018. She is currently an associate editor of the Harvard Data Science Review.

Prof. Min Xu

Title: Inference for the History of a Randomly Growing Tree

Abstract: The spread of infectious disease in a human community or the proliferation of fake news on social media can be modeled as randomly growing tree-shaped networks. The history of the random growth process is often unobserved but contains important information such as the source of the infection. We propose to infer aspects of the latent history through an approximate resampling framework that produces a confidence set with honest frequentist coverage and certain optimality properties. In some common models such as preferential attachment, our sampling method is exact and has runtime linear in the number of nodes in the network.
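To fix ideas, here is a toy simulation (not from the paper) of the generative model behind preferential attachment: each arriving node attaches to an existing node with probability proportional to its degree. The arrival order 0, 1, 2, ... is exactly the latent history the talk's method infers; an observer who sees only the final unlabeled tree must reconstruct it. Treating the root's initial degree as 1 is one common convention, chosen here for simplicity.

```python
import random

def grow_pa_tree(n, seed=0):
    """Grow a preferential attachment tree on n nodes.  Returns a
    parent map; node i arrived at time i, so the map encodes the history."""
    random.seed(seed)
    parent = {0: None}
    degree = {0: 1}                       # root's initial degree: a convention
    for new in range(1, n):
        nodes = list(degree)
        weights = [degree[v] for v in nodes]
        p = random.choices(nodes, weights=weights)[0]
        parent[new] = p
        degree[p] += 1
        degree[new] = 1
    return parent

tree = grow_pa_tree(50)
# Early arrivals accumulate degree ("rich get richer"), which is what
# makes aspects of the history statistically recoverable.
print(sum(1 for v in tree.values() if v == 0))   # number of children of the root
```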

Biography: Min Xu is an Assistant Professor in the Department of Statistics at Rutgers University. He obtained his Ph.D. from the Machine Learning Department at Carnegie Mellon University and was a departmental postdoctoral researcher in the Department of Statistics at the Wharton School of the University of Pennsylvania. His research interests include nonparametric estimation in machine learning and network data analysis.

Dr. Patrick Johnstone

Title: Projective Splitting: A New Breed of First-Order Proximal Algorithms

Abstract: Projective splitting is a proximal operator splitting framework mainly used for solving convex optimization problems. Unlike many optimization methods, projective splitting is not based on a fixed-point iteration. Instead, at each iteration a hyperplane is constructed which separates the current estimate from the solution set. It turns out that this allows for more freedom in stepsize selection, incremental updates, and asynchronous parallel computation than the traditional fixed-point approach. However, prior to this work, projective splitting suffered from an overwhelming drawback that made it impractical in most applications: the method required computing proximal operators for all functions, and for many functions this is an expensive computation. In this work, we develop new calculations based on forward steps (explicit evaluations of the gradient) that can be used whenever the function is Lipschitz differentiable. This extends the scope of the method to a much wider class of problems.
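A small sketch of the contrast the abstract draws, using a function simple enough that both computations are explicit (a toy illustration, not the projective splitting algorithm itself): a proximal step solves an inner minimization, implicit in general, while a forward step only evaluates the gradient.

```python
import numpy as np

# f(x) = 0.5 * ||x - b||^2 : smooth, with grad f(x) = x - b.
# Its prox happens to have a closed form; for many smooth losses it does
# not, which is the cost the forward-step variant avoids.
b = np.array([3.0, -1.0])

def prox_f(x, t):
    """prox_{t f}(x) = argmin_u f(u) + ||u - x||^2 / (2t); closed form here."""
    return (x + t * b) / (1 + t)

def forward_step(x, t):
    """Explicit gradient step, usable whenever f is Lipschitz differentiable."""
    return x - t * (x - b)

x0 = np.zeros(2)
print(prox_f(x0, 1.0))        # implicit step: moves partway toward b
print(forward_step(x0, 1.0))  # explicit step: only needs grad f
```

Both steps move the iterate toward the minimizer b; the point of the new calculations is that projective splitting can build its separating hyperplanes from the cheap explicit kind.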

Biography: Patrick Johnstone is a Postdoctoral Associate in the Management Sciences and Information Systems Department of the Rutgers Business School, where he is advised by Prof. Jonathan Eckstein. In May 2017, he received his Ph.D. in Electrical and Computer Engineering from the University of Illinois at Urbana-Champaign (advised by Prof. Pierre Moulin), where he also received an M.Sc. in ECE. He received a B.Sc. in Electrical Engineering from the University of New South Wales (UNSW) in Sydney, Australia, where he received the University Medal, given to the top student in electrical engineering. He has worked as a research intern at Qualcomm Research, Rambus Labs, and CSIRO. For his work at Qualcomm, he received the Roberto Padovani award for outstanding interns. His research interests are in continuous optimization, first-order proximal splitting methods, parallel, asynchronous, and distributed algorithms, machine learning, and signal processing.

Prof. Ying Hung

Title: Statistical Modeling and Uncertainty Quantification for Computer Simulations with Non-Gaussian Responses

Abstract: Non-Gaussian observations such as binary responses are common in some computer simulations, but most of the work in the literature is limited to the analysis of continuous responses. Motivated by the analysis of a class of cell adhesion experiments, we introduce a generalized Gaussian process model for binary responses, which shares some common features with Gaussian process models. We also propose a new calibration framework for binary responses. Its application to the T cell adhesion data provides insight into the unknown values of the kinetic parameters, which are difficult to determine by physical experiments due to the limitations of existing experimental techniques.
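A toy simulation (not the talk's model) of the kind of data-generating process a generalized Gaussian process for binary responses addresses: a smooth latent Gaussian process is observed only through 0/1 outcomes via a link function, loosely mimicking on/off adhesion events. The squared-exponential kernel, logistic link, and grid are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def rbf_kernel(x, lengthscale=0.5):
    """Squared-exponential covariance on a 1-D grid of inputs."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# Latent Gaussian process z(x), observed only through binary outcomes.
x = np.linspace(0, 1, 100)
K = rbf_kernel(x) + 1e-8 * np.eye(100)   # small jitter for numerical stability
z = rng.multivariate_normal(np.zeros(100), K)
p = 1.0 / (1.0 + np.exp(-3.0 * z))       # logistic link to success probability
y = rng.binomial(1, p)                   # the binary responses actually observed
print(y[:20])
```

Fitting such a model runs this generative picture in reverse: infer the smooth latent surface (and any calibration parameters) from the 0/1 observations alone.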

Biography: Dr. Hung is an Associate Professor in the Department of Statistics at Rutgers. She received her Ph.D. in Industrial and Systems Engineering from the Georgia Institute of Technology in 2008. Dr. Hung received an NSF CAREER Award and the IMS Tweedie Award in 2014. Her research areas include experimental design, statistical modeling for computer experiments, and uncertainty quantification, with applications to science and engineering.