
Multi-Granularity Time-Based Transformer for Knowledge Tracing

EasyChair Preprint no. 9950, version 2

5 pages · Date: April 11, 2023

Abstract

In this paper, we present a transformer architecture for predicting student performance on standardized tests. Specifically, we leverage students' historical data, including their past test scores, study habits, and other relevant information, to create a personalized model for each student. We then use these models to predict their future performance on a given test. Applying this model to the RIIID dataset, we demonstrate that using temporal features at multiple granularities as the decoder input significantly improves model performance. Our approach yields substantial improvements over a LightGBM baseline. Our work contributes to the growing field of AI in education, providing a scalable and accurate tool for predicting student outcomes.
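The abstract's key idea, feeding the decoder the same temporal signal at several granularities, can be illustrated with a small PyTorch sketch. Everything below is an assumption for illustration: the class name MultiGranularityTimeEmbedding, the choice of granularities (seconds, minutes, hours, days), and the bucketing scheme are not taken from the paper.

import torch
import torch.nn as nn

class MultiGranularityTimeEmbedding(nn.Module):
    # Hypothetical sketch of the multi-granularity temporal-feature idea:
    # the same elapsed-time signal is bucketed at several granularities
    # (here seconds, minutes, hours, days; illustrative, not from the paper)
    # and the resulting embeddings are summed into one decoder-input vector.
    def __init__(self, d_model=128, granularities=(1, 60, 3600, 86400),
                 num_buckets=512):
        super().__init__()
        self.granularities = granularities
        self.num_buckets = num_buckets
        self.embeddings = nn.ModuleList(
            nn.Embedding(num_buckets, d_model) for _ in granularities
        )

    def forward(self, elapsed_seconds):
        # elapsed_seconds: (batch, seq_len) integer tensor of time deltas
        out = 0.0
        for divisor, emb in zip(self.granularities, self.embeddings):
            # bucket the delta at this granularity, clamping the tail
            buckets = (elapsed_seconds // divisor).clamp_(0, self.num_buckets - 1)
            out = out + emb(buckets)
        return out  # (batch, seq_len, d_model), added to other decoder inputs

# Usage: embed a batch of 2 sequences of 5 elapsed-time values (in seconds)
emb = MultiGranularityTimeEmbedding()
deltas = torch.randint(0, 7 * 86400, (2, 5))
print(emb(deltas).shape)  # torch.Size([2, 5, 128])

Summing the per-granularity embeddings keeps the decoder-input dimensionality fixed while letting the model pick up both short-term pacing (seconds between interactions) and long-term spacing effects (days between sessions) from a single scalar feature.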

Keyphrases: deep learning, education, multi-granularity, RIIID, transformer

BibTeX entry
BibTeX does not have a suitable entry type for preprints; the following is a workaround that produces the correct reference:
@booklet{EasyChair:9950,
  author = {Tong Zhou},
  title = {Multi-Granularity Time-Based Transformer for Knowledge Tracing},
  howpublished = {EasyChair Preprint no. 9950},
  year = {EasyChair, 2023}}