During successful language comprehension, speech sounds (phonemes) are encoded in a series of neural patterns that evolve over time. Here we tested whether these neural dynamics of speech encoding are altered in individuals with a language disorder. We recorded EEG responses from individuals with post-stroke aphasia and healthy age-matched controls (i.e., older adults) during 25 minutes of natural story listening. We estimated the duration of phonetic feature encoding, the speed of its evolution across neural populations, and the spatial location of encoding over EEG sensors. First, we established that phonetic features are robustly encoded in the EEG responses of healthy older adults. Second, when comparing individuals with aphasia to healthy controls, we found significantly decreased phonetic encoding in the aphasia group following a shared initial processing stage (0.08-0.25 s after phoneme onset). Phonetic features were less strongly encoded over left-lateralized electrodes in the aphasia group than in controls, with no difference in the speed of neural pattern evolution. Finally, we observed that healthy controls, but not individuals with aphasia, encoded phonetic features for longer when uncertainty about word identity was high, indicating that this mechanism (encoding phonetic information until word identity is resolved) is crucial for successful comprehension. Together, our results suggest that aphasia may entail a failure to maintain lower-order information long enough to recognize lexical items.
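The abstract describes estimating how long phonetic features remain decodable from EEG. As an illustration only (this is not the authors' actual pipeline, and the data, window, and classifier choices below are assumptions), a common way to estimate such a time course is time-resolved decoding: train a classifier on sensor patterns at each time point relative to phoneme onset and track cross-validated accuracy. A minimal sketch with simulated data and scikit-learn:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Simulated epoched EEG: (epochs, channels, time points relative to phoneme onset).
# Real analyses would use recorded data; these dimensions are illustrative.
rng = np.random.default_rng(0)
n_epochs, n_channels, n_times = 120, 16, 30
y = rng.integers(0, 2, n_epochs)          # binary phonetic feature label per epoch
X = rng.normal(size=(n_epochs, n_channels, n_times))
X[y == 1, :, 8:20] += 0.8                 # inject feature-related signal in a window

def decode_timecourse(X, y, cv=5):
    """Cross-validated decoding accuracy at each time point.

    The span of time points where accuracy exceeds chance gives a simple
    estimate of how long the feature is encoded in the neural patterns.
    """
    scores = np.empty(X.shape[2])
    for t in range(X.shape[2]):
        clf = LogisticRegression(max_iter=1000)
        scores[t] = cross_val_score(clf, X[:, :, t], y, cv=cv).mean()
    return scores

scores = decode_timecourse(X, y)
encoded = scores > 0.6                     # crude above-chance threshold (assumption)
print("above-chance time points:", np.flatnonzero(encoded))
```

Comparing such decoding time courses between groups (e.g., where accuracy drops back to chance) is one way the "duration of encoding" contrast in the abstract could be operationalized; generalization-across-time variants additionally probe how quickly the underlying patterns evolve.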