Two prominent limitations have long hampered the relevance of optimal transport (OT) methods for machine learning. First, the computational cost of standard sample-based solvers (when used on batches of samples) is prohibitive. Second, the mass conservation constraint makes OT solvers too rigid in practice: because they must match \textit{all} points of both measures, their output can be heavily influenced by outliers. A number of recent works have addressed these computational and modeling limitations, but they have given rise to two separate strains of methods: while the computational outlook was greatly improved by entropic regularization, more recent low-rank \textit{linear-time} solvers promise to scale OT up further. On the modeling side, the rigidity of mass conservation has been relaxed for entropy-regularized OT thanks to unbalanced variants of OT, which penalize couplings whose marginals deviate from those specified by the source and target distributions. The goal of this paper is to merge these two strains, low-rank and unbalanced, to deliver solvers that are both scalable and versatile. We propose custom algorithms to implement these extensions for the linear OT problem and its fused Gromov-Wasserstein generalization, and demonstrate their practical relevance for challenging spatial transcriptomics matching problems. These algorithms are implemented in the \texttt{ott-jax} toolbox.
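For illustration, the sketch below shows how an unbalanced, low-rank linear OT solve might be set up through \texttt{ott-jax}. It is a minimal example only: the point clouds, the rank, and the marginal-relaxation values \texttt{tau\_a}/\texttt{tau\_b} are arbitrary choices, and it assumes an \texttt{ott-jax} version in which the low-rank Sinkhorn solver accepts the unbalanced parameters of \texttt{LinearProblem}, as described in this work.
\begin{verbatim}
import jax
from ott.geometry import pointcloud
from ott.problems.linear import linear_problem
from ott.solvers.linear import sinkhorn_lr

# Two synthetic point clouds of different sizes.
key_x, key_y = jax.random.split(jax.random.PRNGKey(0))
x = jax.random.normal(key_x, (500, 3))
y = jax.random.normal(key_y, (700, 3)) + 1.0

# Squared-Euclidean cost between the two samples.
geom = pointcloud.PointCloud(x, y)

# tau_a, tau_b < 1 relax the marginal constraints (unbalanced OT);
# tau_a = tau_b = 1 recovers the balanced problem.
prob = linear_problem.LinearProblem(geom, tau_a=0.9, tau_b=0.95)

# Low-rank Sinkhorn: the coupling is kept in factored form of rank r,
# so each iteration is linear in the number of points.
solver = sinkhorn_lr.LRSinkhorn(rank=10)
out = solver(prob)

print(out.reg_ot_cost)   # transport cost of the low-rank coupling
print(out.matrix.shape)  # (500, 700); materialized here only for inspection
\end{verbatim}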