Non‑polynomial quasi‑topological gravity: a single Hubble function that can mimic dark energy
Researchers have been studying a new class of modified gravity theories called non‑polynomial quasi‑topological gravity (NPQTG). In this framework the large‑scale expansion of the universe is encoded in a single function of the Hubble parameter, the quantity that measures the expansion rate. These theories keep the equations of motion at second order, which avoids the higher‑derivative instabilities that plague many higher‑curvature modifications of general relativity.
To make the idea concrete, the authors build explicit models within the NPQTG family. They show how different choices for that single function lead to different expansion histories. In particular they present polynomial, quartic, power‑law and genuinely non‑polynomial examples, and then focus on the quartic and power‑law cases for a detailed study. The work proceeds by reducing a higher‑dimensional action to an effective two‑dimensional scalar–tensor form. An important simplifying condition in this reduction is an “integrability” relation (denoted ω = 0) that holds for homogeneous and isotropic (Friedmann–Lemaître–Robertson–Walker) cosmologies.
At a high level, the dynamics in NPQTG reduces to an algebraic relation between a curvature invariant and the matter energy density. Put another way, the whole expansion history can be written in terms of that single Hubble function. Modified gravity then appears as an effective dark‑energy component that comes from geometry rather than from a new matter field. The resulting dark energy can have a dynamical equation of state that lies in the quintessence regime (equation of state w > −1) or in the phantom regime (w < −1), depending on the model choice.
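The mechanics of that statement can be illustrated numerically. The sketch below assumes a toy quartic relation H² + αH⁴ = ρ/3 (units 8πG = c = 1); this is an illustrative stand‑in, not the paper's actual Hubble function, and the coupling α is a hypothetical value chosen for demonstration. Given the matter density at each redshift, the relation is solved algebraically for H², and whatever supplements matter in 3H² is read off as a geometry‑induced dark‑energy density with its own effective equation of state:

```python
import numpy as np

def H2_quartic(rho, alpha):
    """Solve the toy algebraic relation H^2 + alpha*H^4 = rho/3 for H^2.

    This is a quadratic in x = H^2: alpha*x^2 + x - rho/3 = 0.
    alpha = 0 recovers the standard Friedmann equation H^2 = rho/3.
    """
    rho = np.asarray(rho, dtype=float)
    if alpha == 0.0:
        return rho / 3.0
    return (-1.0 + np.sqrt(1.0 + 4.0 * alpha * rho / 3.0)) / (2.0 * alpha)

# Pressureless matter diluting with redshift (Omega_m-like normalisation).
z = np.linspace(0.0, 2.0, 201)
rho_m = 0.3 * (1.0 + z) ** 3

alpha = -0.05                       # illustrative coupling, not fitted to data
H2 = H2_quartic(rho_m, alpha)

# Geometry-induced "dark energy": whatever supplements matter in 3H^2.
rho_de = 3.0 * H2 - rho_m

# Effective equation of state: w = -1 + (1+z)/(3*rho_de) * d(rho_de)/dz.
w_de = -1.0 + (1.0 + z) * np.gradient(rho_de, z) / (3.0 * rho_de)
```

The point is structural, not quantitative: once the algebraic relation is fixed, no differential equation needs to be integrated, and the "dark energy" and its equation of state follow entirely from the chosen Hubble function.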
The authors test the quartic and power‑law models against current late‑time data. They use Type Ia supernovae, Cosmic Chronometers (measurements of the expansion rate at different redshifts), and Baryon Acoustic Oscillations. Constraints are obtained through a Bayesian Markov Chain Monte Carlo (MCMC) analysis. According to the paper, both models fit these data sets well and are statistically competitive with the standard ΛCDM model (Lambda Cold Dark Matter), meaning they are compatible with current observational bounds at the level probed.
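The kind of Bayesian MCMC analysis described above can be sketched in a few lines. The example below is a minimal random‑walk Metropolis sampler fitting a flat ΛCDM expansion rate (the baseline the paper compares against) to mock cosmic‑chronometer‑style H(z) points; the data, error bars, priors, and proposal widths are all illustrative assumptions, not the paper's datasets or pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def H_lcdm(z, H0, Om):
    """Flat LambdaCDM expansion rate, the baseline model in the comparison."""
    return H0 * np.sqrt(Om * (1 + z) ** 3 + 1 - Om)

# Mock cosmic-chronometer-style data (illustrative, NOT the paper's dataset).
z_obs = np.linspace(0.1, 1.8, 15)
H_obs = H_lcdm(z_obs, 70.0, 0.3) + rng.normal(0.0, 5.0, z_obs.size)
sigma = np.full(z_obs.size, 5.0)

def log_post(theta):
    """Gaussian log-likelihood plus flat priors on (H0, Om)."""
    H0, Om = theta
    if not (50.0 < H0 < 90.0 and 0.05 < Om < 0.7):
        return -np.inf
    chi2 = np.sum(((H_obs - H_lcdm(z_obs, H0, Om)) / sigma) ** 2)
    return -0.5 * chi2

# Random-walk Metropolis: propose, accept with probability min(1, post ratio).
theta = np.array([65.0, 0.25])
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.8, 0.02])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain[5000:])      # discard burn-in

H0_fit, Om_fit = chain.mean(axis=0)
```

Swapping `H_lcdm` for a modified expansion rate (such as one derived from a quartic or power‑law Hubble function) and stacking supernova and BAO likelihoods onto `log_post` turns the same machinery into the kind of multi‑probe comparison the paper reports.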