Research

Current projects:

Pure Mathematics:

Past projects:

Brief thesis research summary:

My research is in the area of analysis and geometry in several complex variables, with a focus on \(L^2\)-techniques and their applications. In particular, I have been working on a “twisted” adaptation of Bo Berndtsson’s complex Brunn-Minkowski theory: I show that the same results hold under weaker assumptions on the weight functions defining the \(L^2\) spaces of holomorphic functions. Berndtsson’s Nakano-positivity result has several applications in complex analysis and geometry: notably, plurisubharmonic variation results for Bergman kernels (which can be interpreted in terms of the Bergman metric), a proof of the Ohsawa-Takegoshi theorem with sharp estimates, and applications to Kähler-Einstein geometry.
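To fix ideas, here is a schematic statement of Berndtsson’s theorem in the model case of a product domain (the precise hypotheses, and the general fibered version, are in his papers). Let \(U \subseteq \mathbb{C}^m\) and \(\Omega \subseteq \mathbb{C}^n\) be pseudoconvex, and let \(\varphi\) be plurisubharmonic on \(U \times \Omega\). For \(t \in U\), set
\[
E_t \;=\; A^2\big(\Omega, e^{-\varphi(t,\cdot)}\big) \;=\; \Big\{ f \in \mathcal{O}(\Omega) \;:\; \int_{\Omega} |f|^2\, e^{-\varphi(t,\cdot)}\, dV < \infty \Big\}.
\]
Then the infinite-rank vector bundle \(E \to U\) with fibers \(E_t\), endowed with these \(L^2\) metrics, is (semi-)positive in the sense of Nakano.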

Although Berndtsson presents his results for pseudoconvex domains in \(\mathbb{C}^n\), it is known that the same results hold for Stein manifolds, with the \(L^2\) spaces now consisting of sections of a holomorphic line bundle \(L\) over the Stein manifold \(X\), equipped with a Hermitian metric \(e^{-\varphi}\). The “twist” comes from twisting the line bundle by a trivial bundle and rewriting a metric \(e^{-\psi}\) for \(L\) as \(\tau e^{-\varphi}\) with \(\tau > 0\). Using this device, Donnelly and Fefferman obtained a basic estimate for the \(\bar{\partial}\)-operator different from the classical one, and this estimate yields a theorem with \(L^2\)-estimates for \(\bar{\partial}\) on complete Kähler manifolds that differs from Hörmander’s classical theorem. Indeed, in contrast with Hörmander’s result (which Berndtsson uses to prove his Nakano-positivity theorem), the twisted \(\bar{\partial}\)-theorem with \(L^2\)-estimates does not require the (family of) Hermitian metric(s) on the line bundle to have positive curvature. In fact, when the Stein manifold possesses a negative plurisubharmonic function \(-e^{-\eta}\), the (family of) Hermitian metric(s) may be chosen to have some amount of negative curvature: along the base \(X\), the curvature can be as negative as \(-2e^{\eta}\partial_X \bar{\partial}_X (-e^{-\eta})\).
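In its simplest form (on a domain in \(\mathbb{C}^n\), with all norms taken in \(L^2(e^{-\varphi})\)), the twisted basic estimate underlying this reads, schematically: for a smooth function \(\tau > 0\), a positive function \(A\), and suitable \((0,1)\)-forms \(u\),
\[
\big\|\sqrt{\tau + A}\;\bar{\partial}^{*} u\big\|^2 + \big\|\sqrt{\tau}\;\bar{\partial} u\big\|^2
\;\ge\;
\int \Big\langle \big(\tau\,\partial\bar{\partial}\varphi - \partial\bar{\partial}\tau - A^{-1}\,\partial\tau \wedge \bar{\partial}\tau\big)\, u,\, u \Big\rangle\, e^{-\varphi}.
\]
The point is that the twisted curvature term \(\tau\,\partial\bar{\partial}\varphi - \partial\bar{\partial}\tau\) can be positive even when \(\partial\bar{\partial}\varphi\) has some negativity, provided \(\tau\) is chosen well (for instance, built from \(e^{-\eta}\) as above).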

This allows us to extend Berndtsson’s Nakano-positivity result directly to weights that are not necessarily plurisubharmonic. In the case of Griffiths positivity, I have extended Berndtsson’s result to arbitrary trivial families of Stein manifolds, and we can then recover his plurisubharmonic variation results under these weakened positivity assumptions. For non-trivial families, I have obtained more restricted extensions of the plurisubharmonic variation results.
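The prototype of the plurisubharmonic variation results in question is, stated loosely, the following: if \(D \subseteq \mathbb{C}^m_t \times \mathbb{C}^n_z\) is pseudoconvex, \(\varphi\) is plurisubharmonic on \(D\), and \(K_t\) denotes the Bergman kernel of \(A^2(D_t, e^{-\varphi(t,\cdot)})\) on the diagonal of the slice \(D_t = \{z : (t,z) \in D\}\), then
\[
(t, z) \;\longmapsto\; \log K_t(z)
\]
is plurisubharmonic (or identically \(-\infty\)) on \(D\). The extensions above replace the plurisubharmonicity of \(\varphi\) by the weaker curvature lower bound described earlier.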

Two further questions of interest are:

  1. How large can the lower curvature bound \(-2e^{\eta}\partial_X \bar{\partial}_X (-e^{-\eta})\) be, for example in the case of the unit ball? (A sample computation is sketched after this list.)
  2. How can these results be used to prove \(L^2\)-extension theorems for non-plurisubharmonic weights?
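For question 1, an instructive first computation (my own illustrative choice, not a result from the above) takes \(e^{-\eta} = 1 - |z|^2\) on the unit ball, so that \(-e^{-\eta} = |z|^2 - 1\) is a negative plurisubharmonic exhaustion. Then
\[
\partial_X \bar{\partial}_X(-e^{-\eta}) = \partial\bar{\partial}\,|z|^2 = \sum_{j=1}^{n} dz_j \wedge d\bar{z}_j,
\qquad
-2e^{\eta}\,\partial_X \bar{\partial}_X(-e^{-\eta}) = \frac{-2}{1 - |z|^2} \sum_{j=1}^{n} dz_j \wedge d\bar{z}_j,
\]
so with this choice the permitted negativity grows like \((1 - |z|^2)^{-1}\) near the boundary; question 1 asks, in effect, how much further this can be pushed.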

Answers to question 2 in the case of the unit ball could, in particular, lead to novel \(L^2\) interpolation theorems for the unit ball.

Talks about my research:

Other research interests:

In addition to my research in several complex variables, I have interests in probability and statistical theory, centered on two main themes.

1. Statistical Learning and Applications:

In machine learning, a central problem is overfitting, which can be mitigated by restricting the hypothesis space \(\mathcal{H}\). Reproducing kernel Hilbert spaces (RKHS) are a particularly useful choice for \(\mathcal{H}\), and RKHS of vector-valued functions play a central role in machine learning. The broader setting of my research to date is that of vector bundles, in which Bergman kernels can be thought of as reproducing kernels of Hilbert spaces of vector-valued functions. By considering various learning problems in a functional setting, one can use the theory of RKHS to widen the scope of Hilbert space methods in machine learning.
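As a concrete, standard illustration of how an RKHS restricts the hypothesis space, here is a minimal kernel ridge regression sketch; the Gaussian kernel, the regularization weight lam, and the toy data are my own choices for illustration (vector-valued and operator-valued kernels generalize the same pattern):

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-sq / (2.0 * sigma**2))

def fit_krr(X, y, lam=1e-2, sigma=1.0):
    """Kernel ridge regression: the hypothesis space is the RKHS of the kernel.

    By the representer theorem, the minimizer of
        sum_i (f(x_i) - y_i)^2 + lam * ||f||_RKHS^2
    has the form f = sum_i alpha_i k(x_i, .), where
        alpha = (K + lam * I)^{-1} y.
    """
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, sigma=1.0):
    """Evaluate f(x) = sum_i alpha_i k(x_i, x) at the new points."""
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy usage: recover a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
alpha = fit_krr(X, y)
X_new = np.linspace(-3.0, 3.0, 5)[:, None]
print(predict(X, alpha, X_new))  # predictions approximately tracking sin
```

The regularization term \(\lambda \|f\|_{\mathcal{H}}^2\) is exactly where the RKHS structure enters: it penalizes hypotheses with large RKHS norm, which is what controls overfitting.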

More generally, H. Zhang, Y. Xu and J. Zhang established a theory of reproducing kernel Banach spaces for machine learning. Further research using functional-analytic methods along these lines could lead to more general methods adapted to settings that are naturally less regular, such as time series. I am interested in exploring such methods, given my experience with more elaborate versions of reproducing kernel Hilbert space theory; an example is my current project in collaboration with Pawel Polak.

2. Information Geometry:

Burbea and Rao (1984) showed that when the parameter space of a family of probability distributions is complex, the Fisher information metric coincides with the Bergman metric for a certain class of probability densities. This relation is of considerable importance for applications to statistical inference.
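For concreteness, the two metrics being compared are, in standard notation (the precise class of densities for which they coincide is specified in Burbea and Rao’s paper):
\[
g_{jk}(\theta) \;=\; \mathbb{E}_{\theta}\!\left[ \frac{\partial \log p(x;\theta)}{\partial \theta_j}\, \frac{\partial \log p(x;\theta)}{\partial \theta_k} \right]
\qquad \text{(Fisher information metric)},
\]
\[
b_{j\bar{k}}(z) \;=\; \frac{\partial^2}{\partial z_j\, \partial \bar{z}_k} \log K(z, z)
\qquad \text{(Bergman metric, with \(K\) the Bergman kernel on the diagonal)},
\]
with the natural complexification of the Fisher metric when the parameter \(\theta = z\) is complex.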

I am generally interested in extending the scope of these connections between complex analysis and geometry on the one hand and information geometry on the other. Early instances of such connections in the real-variable setting go back to Efron’s Annals of Statistics paper (1975), which defined the curvature of a statistical problem. I believe that further work in these directions could uncover more natural connections between differential-geometric and statistical quantities in the presence of complex parameters.