Generating EEG signals alongside behavioural actions entails substantial biological complexity: an abstract model must mimic rich oscillatory brain dynamics while simultaneously producing dynamic neural activity and the corresponding behavioural responses. In this work, we propose a novel bio-inspired classifier-guided Deep Oscillatory Neural Network (cDONN), designed to simultaneously capture both neural signal dynamics and behavioural patterns. The cDONN integrates a feedforward neural network with Hopf oscillator neurons, which specialize in sequential data-generation tasks. The model features a Y-shaped structure with two output branches: one classifies the input image, signifying behaviour, and the other generates the corresponding EEG signals, representing neural dynamics. We evaluate the behaviour-classification branch using the standard ramp classification accuracy employed for the cDONN. For the EEG-generation branch, we benchmark the generated signals through classification accuracy obtained with a pre-trained DONN signal classifier, and through comparison with real EEG signals using Inception Score, Fréchet Inception Distance (FID), power-spectrum analysis, and topoplot analysis. Our analysis reveals a strong correspondence between the generated and real EEG data, particularly in the midline parietal region within the alpha band and in the left hemisphere within the delta band, indicating high fidelity and neurophysiological realism in the synthesized signals. Our approach offers a novel perspective on modelling the joint relationship between neural activity and behaviour, pushing the boundaries of EEG signal generation and behavioural analysis.
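As a rough illustration of the Y-shaped architecture summarised above, the sketch below pairs a shared feedforward trunk with a behaviour-classification head and a bank of Hopf oscillators whose states are mixed into a multichannel EEG-like output. The layer sizes, oscillator count, Euler integration, and the trunk-to-oscillator coupling (the hypothetical to_drive and mix layers) are illustrative assumptions, not the cDONN's published architecture; the oscillator dynamics follow the standard Hopf normal form.

```python
# Minimal sketch of a Y-shaped classifier-guided model: a shared trunk feeds
# (a) a classification head for behaviour and (b) a bank of Hopf oscillators
# whose states are mixed into an EEG-like signal. Sizes and coupling are
# illustrative assumptions, not the paper's exact design.
import torch
import torch.nn as nn


class HopfOscillatorBank(nn.Module):
    """Bank of Hopf oscillators integrated with forward Euler.

    Each oscillator follows the standard normal-form dynamics
        dx/dt = (mu - r^2) x - omega * y
        dy/dt = (mu - r^2) y + omega * x,   with r^2 = x^2 + y^2,
    where the intrinsic frequencies `omega` are learnable parameters.
    """

    def __init__(self, n_osc: int, dt: float = 1e-3):
        super().__init__()
        self.n_osc, self.dt = n_osc, dt
        self.omega = nn.Parameter(2 * torch.pi * 40 * torch.rand(n_osc))  # 0-40 Hz
        self.mu = nn.Parameter(torch.ones(n_osc))

    def forward(self, drive: torch.Tensor, n_steps: int) -> torch.Tensor:
        # drive: (batch, n_osc) input that biases each oscillator's dynamics
        batch = drive.shape[0]
        x = 0.1 * torch.ones(batch, self.n_osc)
        y = torch.zeros(batch, self.n_osc)
        states = []
        for _ in range(n_steps):
            r2 = x ** 2 + y ** 2
            dx = (self.mu - r2) * x - self.omega * y + drive
            dy = (self.mu - r2) * y + self.omega * x
            x, y = x + self.dt * dx, y + self.dt * dy
            states.append(x)
        return torch.stack(states, dim=1)  # (batch, n_steps, n_osc)


class YShapedModel(nn.Module):
    """Shared encoder with a behaviour-classification branch and an
    oscillator-driven EEG-generation branch (illustrative only)."""

    def __init__(self, in_dim=784, hidden=128, n_classes=10,
                 n_osc=32, n_channels=64):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.classifier = nn.Linear(hidden, n_classes)   # behaviour branch
        self.to_drive = nn.Linear(hidden, n_osc)         # drive to oscillators
        self.oscillators = HopfOscillatorBank(n_osc)
        self.mix = nn.Linear(n_osc, n_channels)          # EEG channel mixing

    def forward(self, image: torch.Tensor, n_steps: int = 256):
        h = self.trunk(image.flatten(1))
        logits = self.classifier(h)
        osc = self.oscillators(self.to_drive(h), n_steps)
        eeg = self.mix(osc)                              # (batch, time, channels)
        return logits, eeg


if __name__ == "__main__":
    model = YShapedModel()
    logits, eeg = model(torch.rand(2, 1, 28, 28))
    print(logits.shape, eeg.shape)  # torch.Size([2, 10]) torch.Size([2, 256, 64])
```

In this sketch the classifier only shares the trunk with the generative branch; how the classification output actually guides the oscillatory branch in the cDONN is described in the paper and is not reproduced here.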