
Dr Jesús Rubio
About
Biography
Jesús Rubio is a theoretical physicist and a Surrey Future Fellow. Following his studies for the Spanish Baccalaureate, he completed a Licenciatura degree at the Complutense University of Madrid and an MSc at the Institute for Theoretical Physics of Madrid, specialising in particle physics, cosmology and quantum information theory. He was then awarded a SEPnet scholarship to study for a doctorate at the University of Sussex, where he wrote a thesis on quantum optics and Bayesian metrology, supervised by Jacob Dunningham. Upon receiving his PhD in 2019, he joined the group of Janet Anders at the University of Exeter as a postdoctoral researcher. During this time, he developed global thermometry as a new subfield of quantum thermodynamics. In 2022, Jesús was awarded a Surrey Future Fellowship and is now based at the University of Surrey. His research on quantum sensing, metrology and estimation aims to exploit quantum technologies as a new framework for formulating and answering fundamental questions in physics.
Research interests
Measurements play a fundamental role in physics as they allow us to interrogate nature. Motivated by this, I am developing a new theoretical framework for quantum sensing, metrology and estimation in extreme regimes. The goal is to exploit such a framework as a new path to uncover insights in fundamental physics, keeping in mind that understanding time is one of the most important tasks of science. Relevant themes include:
Quantum estimation and foundations
- Quantum metrology and multiparameter estimation
- Calculus of variations and Bayesian techniques
- Uncertainty relations and correlations
- Quantum optics and thermodynamics
- Relativistic quantum mechanics
Quantum technologies
- Principles of quantum sensing
- Global quantum thermometry
- Quantum imaging and interferometry
- Quantum networks and distributed sensing
- Physics-inspired data science
Publications
We address the propagation of the spin along classical trajectories for a spin-1/2 particle obeying the Dirac equation with scalar potentials. Focusing on classical trajectories as the exact propagation of wave-function discontinuities, we find an explicit spin-transport law for the case of the Dirac oscillator. In the general case we examine the spin propagation along classical trajectories emerging as an approximation of the quantum dynamics via the mechanical analog of the optical eikonal asymptotic approach. Throughout we establish as many parallels as possible with the equivalent situation for the electromagnetic field.
We report a comparison of two photonic techniques for single-molecule sensing: fluorescence nanoscopy and optoplasmonic sensing. As the test system, oligonucleotides with and without fluorescent labels are transiently hybridized to complementary "docking" strands attached to gold nanorods. Comparing the measured single-molecule kinetics helps to examine the influence of the fluorescent labels as well as factors arising from different sensing geometries. Our results demonstrate that DNA dissociation is not significantly altered by the fluorescent labels and that DNA association is affected by geometric factors in the two techniques. These findings open the door to exploiting plasmonic sensing and fluorescence nanoscopy in a complementary fashion, which will aid in building more powerful sensors and uncovering the intricate effects that influence the behavior of single molecules.
Precise temperature measurements on systems of few ultracold atoms are of paramount importance in quantum technologies, but can be very resource intensive. Here, we put forward an adaptive Bayesian framework that substantially boosts the performance of cold atom temperature estimation. Specifically, we process data from real and simulated release-recapture thermometry experiments on few potassium atoms cooled down to the microkelvin range in an optical tweezer. From simulations, we demonstrate that adaptively choosing the release-recapture times to maximize information gain substantially reduces the number of measurements needed for the estimate to converge to a final reading. Unlike conventional methods, our proposal systematically avoids capturing and processing uninformative data. We also find that a simpler nonadaptive method exploiting all the a priori information can yield competitive results, and we put it to the test on real experimental data. Furthermore, our estimates are much more reliable, especially when the measured data are scarce and noisy, and they converge faster to the real temperature in the asymptotic limit. Importantly, the underlying Bayesian framework is not platform specific and can be adapted to enhance precision in other setups, thus opening new avenues in quantum thermometry.
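As an illustration of the adaptive step, here is a minimal sketch in Python. It assumes a simplified Bernoulli recapture model rather than the actual experimental likelihood; the function `p_recapture`, the grids and all parameter values are hypothetical stand-ins chosen for clarity:

```python
import numpy as np

# Toy stand-in likelihood: recapture probability falls with "temperature" T
# and release time t. The true release-recapture model is more involved;
# this Bernoulli form is only a placeholder for illustration.
def p_recapture(T, t):
    return np.exp(-T * t)

rng = np.random.default_rng(1)
T_true = 2.0                            # true "temperature", arbitrary units
T_grid = np.linspace(0.1, 5.0, 500)     # hypothesis grid for the posterior
posterior = np.ones_like(T_grid)        # flat prior
posterior /= posterior.sum()
t_choices = np.linspace(0.05, 2.0, 40)  # candidate release times

def expected_var(post, t):
    """Expected posterior variance after one shot at release time t."""
    ev = 0.0
    for outcome in (0, 1):
        like = p_recapture(T_grid, t) if outcome else 1 - p_recapture(T_grid, t)
        p_out = np.sum(post * like)
        if p_out < 1e-12:
            continue
        new_post = post * like / p_out
        mean = np.sum(new_post * T_grid)
        ev += p_out * np.sum(new_post * (T_grid - mean) ** 2)
    return ev

for shot in range(50):
    # Adaptive choice: the release time expected to shrink the posterior most.
    t = min(t_choices, key=lambda tt: expected_var(posterior, tt))
    click = rng.random() < p_recapture(T_true, t)   # simulated measurement
    like = p_recapture(T_grid, t) if click else 1 - p_recapture(T_grid, t)
    posterior *= like
    posterior /= posterior.sum()

print("temperature estimate:", np.sum(posterior * T_grid))
```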
Many results in the quantum metrology literature use the Cramér-Rao bound and the Fisher information to compare different quantum estimation strategies. However, there are several assumptions that go into the construction of these tools, and these limitations are sometimes not taken into account. While a strategy that utilizes this method can considerably simplify the problem and is valid asymptotically, to have a rigorous and fair comparison we need to adopt a more general approach. In this work we use a methodology based on Bayesian inference to understand what happens when the Cramér-Rao bound is not valid. In particular we quantify the impact of these restrictions on the overall performance of a wide range of schemes including those commonly employed for the estimation of optical phases. We calculate the number of observations and the minimum prior knowledge that are needed such that the Cramér-Rao bound is a valid approximation. Since these requirements are state-dependent, the usual conclusions that can be drawn from the standard methods do not always hold when the analysis is more carefully performed. These results have important implications for the analysis of theory and experiments in quantum metrology.
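A minimal numerical sketch of this kind of comparison, assuming a toy interferometric fringe model p(+|θ) = cos²(θ/2), for which the Fisher information is F = 1 and the Cramér-Rao bound after μ shots is 1/μ; all names and values here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_grid = np.linspace(0.0, np.pi, 400)
prior = np.ones_like(theta_grid) / theta_grid.size   # flat prior on (0, pi)

def p_plus(theta):
    # Fringe model p(+|theta) = cos^2(theta/2); Fisher information F = 1.
    return np.cos(theta / 2) ** 2

def bayes_mse(mu, trials=300):
    """Monte Carlo estimate of the Bayesian mean square error after mu shots."""
    errs = []
    for _ in range(trials):
        theta = rng.choice(theta_grid, p=prior)      # draw a true phase
        n_plus = np.sum(rng.random(mu) < p_plus(theta))
        like = p_plus(theta_grid) ** n_plus * (1 - p_plus(theta_grid)) ** (mu - n_plus)
        post = prior * like
        post /= post.sum()
        estimate = np.sum(post * theta_grid)         # optimal for square error
        errs.append((estimate - theta) ** 2)
    return np.mean(errs)

for mu in (1, 5, 20, 100, 500):
    # Bayesian error vs the Cramér-Rao bound 1/mu; they agree only for large mu.
    print(mu, bayes_mse(mu), 1 / mu)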
We argue that analysing schemes for metrology solely in terms of the average particle number can obscure the number of particles effectively used in informative events. For a number of states we demonstrate that, in both frequentist and Bayesian frameworks, the average number of a state can essentially be decoupled from the aspects of the total number distribution associated with any metrological advantage.
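A toy numerical example of this decoupling, assuming pure superpositions of the vacuum and an N-photon Fock state with the phase encoded by the number operator, so that the quantum Fisher information is 4 Var(n); at fixed average photon number, the QFI grows without bound in N:

```python
import numpy as np

n_bar = 1.0   # fixed average photon number
for N in (2, 10, 100, 1000):
    w = n_bar / N                 # weight on the |N> component
    qfi = 4 * w * (1 - w) * N**2  # QFI = 4 Var(n) for phase encoding exp(i n phi)
    print(N, qfi)                 # equals 4 * n_bar * (N - n_bar): unbounded in N
```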
The theoretical framework for networked quantum sensing has been developed to a great extent in the past few years, but there are still a number of open questions. Among these, a problem of great significance, both fundamentally and for constructing efficient sensing networks, is that of the role of inter-sensor correlations in the simultaneous estimation of multiple linear functions, where the latter are taken over a collection of local parameters and can thus be seen as global properties. In this work we provide a solution to this problem when each node is a qubit and the state of the network is sensor-symmetric. First we derive a general expression linking the amount of inter-sensor correlations and the geometry of the vectors associated with the functions, such that the asymptotic error is optimal. Using this we show that if the vectors are clustered around two special subspaces, then the optimum is achieved when the correlation strength approaches its extreme values, while there is a monotonic transition between such extremes for any other geometry. Furthermore, we demonstrate that entanglement can be detrimental for estimating non-trivial global properties, and that sometimes it is in fact irrelevant. Finally, we perform a non-asymptotic analysis of these results using a Bayesian approach, finding that the amount of correlations needed to enhance the precision crucially depends on the number of measurement data. Our results will serve as a basis to investigate how to harness correlations in networks of quantum sensors operating both in and out of the asymptotic regime.
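A schematic numerical illustration of the geometric trade-off, assuming, purely for illustration and not as the paper's parametrisation, an exchange-symmetric Fisher information matrix F for the local parameters, so that the asymptotic error of a linear function with direction v scales as v^T F^{-1} v:

```python
import numpy as np

d = 4                                               # number of qubit sensors
ones = np.ones((d, d))
v_sum = np.ones(d) / np.sqrt(d)                     # "global average" direction
v_diff = np.array([1.0, -1.0, 0.0, 0.0]) / np.sqrt(2)  # a "difference" direction

# Exchange-symmetric Fisher matrix F = I + b*(J - I); b encodes the strength
# of inter-sensor correlations (an illustrative parametrisation).
for b in np.linspace(-0.3, 0.9, 7):
    F = np.eye(d) + b * (ones - np.eye(d))
    Finv = np.linalg.inv(F)
    # The two directions favour opposite extremes of the correlation strength.
    print(f"b={b:+.2f}  err(sum)={v_sum @ Finv @ v_sum:.3f}"
          f"  err(diff)={v_diff @ Finv @ v_diff:.3f}")
```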
A paradigm shift in quantum thermometry is proposed. To date, thermometry has relied on local estimation, which is useful to reduce statistical fluctuations once the temperature is very well known. In order to estimate temperatures in cases where few measurement data or no substantial prior knowledge are available, we build instead a method for global quantum thermometry. Based on scaling arguments, a mean logarithmic error is shown here to be the correct figure of merit for thermometry. Its full minimization provides an operational and optimal rule to postprocess measurements into a temperature reading, and it establishes a global precision limit. We apply these results to the simulated outcomes of measurements on a spin gas, finding that the local approach can lead to biased temperature estimates in cases where the global estimator converges to the true temperature. The global framework thus enables a reliable approach to data analysis in thermometry experiments.
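In symbols, and as a reconstruction from the description above rather than a quotation, the mean logarithmic error over prior p(T) and outcomes m, together with the estimator that minimises it, take the form:

```latex
\begin{align}
  \bar{\epsilon}_{\mathrm{mle}}
    &= \int \mathrm{d}T \,\mathrm{d}m\; p(T)\, p(m|T)\,
       \ln^{2}\!\left[\frac{\tilde{T}(m)}{T}\right], \\
  \tilde{T}(m)
    &= \exp\!\left[\int \mathrm{d}T\; p(T|m)\, \ln T\right],
\end{align}
```

so the optimal temperature reading is the exponential of the posterior average of ln T.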
A longstanding problem in quantum metrology is how to extract as much information as possible in realistic scenarios with not only multiple unknown parameters, but also limited measurement data and some degree of prior information. Here we present a practical solution to this: We derive a Bayesian multi-parameter quantum bound, construct the optimal measurement when our bound can be saturated for a single shot, and consider experiments involving a repeated sequence of these measurements. Our method properly accounts for the number of measurements and the degree of prior information, and we illustrate our ideas with a qubit sensing network and a model for phase imaging, clarifying the nonasymptotic role of local and global schemes. Crucially, our technique is a powerful way of implementing quantum protocols in a wide range of practical scenarios that tools such as the Helstrom and Holevo Cramér-Rao bounds cannot normally access.
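For context, in the single-parameter, square-error case the single-shot optimal measurement is given by Personick's construction, which bounds of this Bayesian type generalise; a sketch of the standard equations, with ρ_θ the parameter-dependent state:

```latex
\begin{equation}
  S\,\bar{\rho} + \bar{\rho}\,S = 2\,\bar{\rho}_1,
  \qquad
  \bar{\rho} = \int \mathrm{d}\theta\; p(\theta)\,\rho_{\theta},
  \qquad
  \bar{\rho}_1 = \int \mathrm{d}\theta\; p(\theta)\,\theta\,\rho_{\theta}.
\end{equation}
```

Measuring the eigenbasis of S and reporting its eigenvalues attains the minimum mean square error for a single shot.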
We introduce a genetic algorithm that designs quantum optics experiments for engineering quantum states with specific properties. Our algorithm is powerful and flexible, and can easily be modified to find methods of engineering states for a range of applications. Here we focus on quantum metrology. First, we consider the noise-free case, and use the algorithm to find quantum states with a large quantum Fisher information (QFI). We find methods, which only involve experimental elements that are available with current or near-future technology, for engineering quantum states with up to a 100-fold improvement over the best classical state, and a 20-fold improvement over the optimal Gaussian state. Such states are a superposition of the vacuum with a large number of photons (around 80), and can hence be seen as Schrödinger-cat-like states. We then apply the two most dominant noise sources in our setting, photon loss and imperfect heralding, and use the algorithm to find quantum states that still improve over the optimal Gaussian state with realistic levels of noise. This will open up experimental and technological work in using exotic non-Gaussian states for quantum-enhanced phase measurements. Finally, we use the Bayesian mean square error to look beyond the regime of validity of the QFI, finding quantum states with precision enhancements over the alternatives even when the experiment operates in the regime of limited data.
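A minimal genetic-algorithm sketch in the same spirit, though far simpler than the algorithm in the paper: real Fock-space amplitudes are mutated and selected to maximise the QFI 4 Var(n) at a fixed mean photon number, with all parameter values chosen for illustration. The fittest candidates concentrate weight on the vacuum and a high Fock state, echoing the cat-like states described above:

```python
import numpy as np

rng = np.random.default_rng(2)
N, pop_size, n_bar = 40, 60, 2.0   # Fock cutoff, population size, photon budget
n = np.arange(N + 1)

def fitness(c):
    p = c**2 / np.sum(c**2)                   # photon-number distribution
    mean = np.sum(p * n)
    var = np.sum(p * n**2) - mean**2
    # QFI = 4 Var(n) for phase encoding exp(i n phi) on a pure state,
    # penalised for leaving the mean-photon budget.
    return 4 * var - 1e3 * (mean - n_bar) ** 2

pop = rng.random((pop_size, N + 1))           # random initial amplitudes
for gen in range(300):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]   # keep the fittest half
    children = parents + 0.05 * rng.standard_normal(parents.shape)  # mutate
    pop = np.abs(np.vstack([parents, children]))

best = max(pop, key=fitness)
p = best**2 / np.sum(best**2)
mean = np.sum(p * n)
print("mean photons:", mean, "QFI:", 4 * (np.sum(p * n**2) - mean**2))
```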
Quantum metrology protocols are typically designed around the assumption that we have an abundance of measurement data, but recent practical applications are increasingly driving interest in cases with very limited data. In this regime the best approach involves an interesting interplay between the amount of data and the prior information. Here we propose a new way of optimising these schemes based on the practically motivated assumption that we have a sequence of identical and independent measurements. For a given probe state we take our measurement to be the best one for a single shot and we use this sequentially to study the performance of different practical states in a Mach-Zehnder interferometer when we have moderate prior knowledge of the underlying parameter. We find that we recover the quantum Cramér-Rao bound asymptotically, but for low data counts we find a completely different structure. Despite the fact that intra-mode correlations are known to be the key to increasing the asymptotic precision, we find evidence that these could be detrimental in the low data regime and that entanglement between the paths of the interferometer may play a more important role. Finally, we analyse how close realistic measurements can get to the bound and find that measuring quadratures can improve upon counting photons, though both strategies converge asymptotically. These results may prove to be important in the development of quantum enhanced metrology applications where practical considerations mean that we are limited to a small number of trials.
Quantum scale estimation, as introduced and explored here, establishes the most precise framework for the estimation of scale parameters that is allowed by the laws of quantum mechanics. This addresses an important gap in quantum metrology, since current practice focuses almost exclusively on the estimation of phase and location parameters. For given prior probability and quantum state, and using Bayesian principles, a rule to construct the optimal probability-operator measurement is provided. Furthermore, the corresponding minimum mean logarithmic error is identified. This is then generalised so as to accommodate the simultaneous estimation of multiple scale parameters, and a procedure to classify practical measurements into optimal, almost-optimal or sub-optimal is highlighted. As a means of illustration, the new framework is exploited to generalise scale-invariant global thermometry, as well as to address the estimation of the lifetime of an atomic state. On a more conceptual note, the optimal strategy is employed to construct an observable for scale parameters, an approach which may serve as a template for a more systematic search of quantum observables. Quantum scale estimation thus opens a new line of enquiry, the precise measurement of scale parameters such as temperatures and rates, within the quantum information sciences.
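A plausible reconstruction of the optimal strategy's structure, assuming it mirrors Personick's square-error construction with the logarithmic loss (an inference from the abstract, not a quoted result): measure the eigenbasis of S and report e^s for eigenvalue s, where

```latex
\begin{equation}
  S\,\bar{\rho} + \bar{\rho}\,S
    = 2 \int \mathrm{d}\theta\; p(\theta)\, \ln(\theta)\, \rho_{\theta},
  \qquad
  \bar{\rho} = \int \mathrm{d}\theta\; p(\theta)\, \rho_{\theta},
  \qquad
  \tilde{\theta}(s) = e^{s}.
\end{equation}
```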