Spiking recurrent neural networks (SRNNs) rival gated RNNs on various tasks, yet they still lack several hallmarks of biological neural networks. We introduce a biologically grounded SRNN that implements Dale's law with voltage-dependent AMPA and GABA reversal potentials. These reversal potentials modulate synaptic gain as a function of the postsynaptic membrane potential, and we derive theoretically how they make each neuron's effective dynamics and subthreshold resonance input-dependent. We trained SRNNs on the Spiking Heidelberg Digits dataset and show that SRNNs with reversal potentials cut spike energy by up to 4x while increasing task accuracy. This yields high-performing Dalean SRNNs that substantially improve on Dalean networks without reversal potentials. Thus, Dale's law with reversal potentials, a core feature of biological neural networks, can render SRNNs more accurate and energy-efficient.
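The voltage-dependent synaptic gain described above can be sketched with the standard conductance-based synapse model, in which each synaptic current is scaled by a driving force (reversal potential minus membrane potential). This is a minimal illustration, not the paper's implementation; the reversal-potential values are conventional placeholders:

```python
import numpy as np

# Illustrative reversal potentials (conventional values, in mV).
E_AMPA = 0.0     # excitatory (AMPA) reversal potential
E_GABA = -70.0   # inhibitory (GABA) reversal potential

def synaptic_current(v, g_ampa, g_gaba):
    """Total synaptic current into a postsynaptic neuron at potential v (mV).

    The driving force (E - v) makes the effective synaptic gain
    voltage-dependent: excitation weakens as v approaches E_AMPA,
    and inhibition vanishes (and can reverse sign) near E_GABA.
    """
    return g_ampa * (E_AMPA - v) + g_gaba * (E_GABA - v)

# As the membrane depolarizes from -80 mV toward -50 mV, the excitatory
# driving force shrinks while the inhibitory one grows in magnitude.
for v in np.linspace(-80.0, -50.0, 4):
    print(v, synaptic_current(v, g_ampa=1.0, g_gaba=1.0))
```

Because the gain depends on the postsynaptic voltage, the same presynaptic spike has a different effect depending on the neuron's current state, which is how the reversal potentials make the effective dynamics input-dependent.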