Transformer-Based Intent Recognition in Educational Chatbots: Performance Analysis of BERT Variants

Abstract
AI-driven educational chatbots play a vital role in providing personalized learning support by leveraging Natural Language Understanding (NLU) for accurate intent recognition. This research compares three transformer-based models (BERT, RoBERTa, and DistilBERT) to evaluate their effectiveness in classifying user intents within educational chatbots. Two datasets, Chatbot_Intents (primary) and Conversation_Intents (secondary), were used to assess performance in terms of training loss, validation accuracy, and confusion matrix analysis. The results indicate that DistilBERT achieved the highest accuracy on the primary dataset, balancing efficiency and performance, while RoBERTa performed strongly but required dataset balancing to do so. On the secondary dataset, BERT performed best, whereas RoBERTa struggled with that dataset's complexity. The findings highlight DistilBERT's efficiency for real-world applications, BERT's robustness on complex queries, and RoBERTa's potential for structured data. The study underscores the trade-offs among accuracy, efficiency, and adaptability, offering insights for optimizing chatbot frameworks to enhance student engagement and personalized career guidance.

Keywords - Educational Chatbots, Natural Language Understanding (NLU), Intent Recognition, Transformer Models (BERT, RoBERTa, DistilBERT), Conversational AI