Bioactive peptides have become strong candidates for a variety of clinical therapies owing to their diverse advantages, spurring the development of deep generative models for peptide design. Because existing methods cannot effectively handle the conformational flexibility of peptides and struggle to capture accurate residue-to-residue interaction dependencies, we propose a unified weakly order-dependent autoregressive language modeling architecture (PepGenWOA) for bioactive peptide generative design, with tolerance of out-of-order input as its inductive bias. The superiority of PepGenWOA is demonstrated by generating three classes of therapeutic peptides: antimicrobial peptides, anticancer peptides, and peptide binders. For antimicrobial and anticancer peptide generation, PepGenWOA not only comprehensively outperforms state-of-the-art baseline models but also exhibits a significant propensity to incorporate specific residue types that benefit antimicrobial or anticancer bioactivity. For characteristic-guided peptide binder generation, the pretrained PepGenWOA is fine-tuned with Mixture-of-Experts-style plugins under a lifelong learning paradigm, achieving the best trade-off between memory stability for world knowledge and learning plasticity for new downstream tasks. Subsequent structure-based virtual screening identifies the 7 most promising candidates from the mega-scale synthetic sequence space, reducing the number of candidates for in vitro experimental validation by five orders of magnitude. A target binding rate of 28.6% with binding specificity further confirms the effectiveness of the fine-tuning strategy. Overall, PepGenWOA is a unified architecture for peptide generative design that can be flexibly customized to different task requirements, thereby harnessing generative language models to reach the "dark space" of biological language.