FFL

Dataset Downloads

Extract the two datasets into `data_codeflaws` and `data_nbl`, respectively.
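The extraction step can be sketched as follows; the archive names here are assumptions, not the actual download filenames, so substitute the files you obtained:

```shell
# Create the target folders the training scripts expect.
mkdir -p data_codeflaws data_nbl

# Assumed archive names -- replace with the real downloads.
# unzip codeflaws-dataset.zip -d data_codeflaws
# unzip nbl-dataset.zip -d data_nbl
```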

Environment:

The following dependencies are required to train or run the model:

  • pytorch (1.6.0+)
  • fasttext (0.9.2)
  • dgl (1.6.0)
  • networkx (2.5.1)

Optional (for visualization):

  • pygraphviz and graphviz
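Pinned with pip, the dependency list above might look like the following requirements fragment (the exact PyPI package names and pins are assumptions where the list is ambiguous):

```
torch>=1.6.0
fasttext==0.9.2
dgl==1.6.0
networkx==2.5.1
pygraphviz    # optional, for visualization; requires a system graphviz install
```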

Java: version 1.11+

Extract `jars.zip` into the `jars` folder.

Training:

```shell
# Codeflaws node-level
python3 -m dgl_version.codeflaws.train_nx_a_nc_old
# NBL node-level
python3 -m dgl_version.nbl.train_nx_a_nc

# Codeflaws statement-level
python3 -m dgl_version.codeflaws.train_nx_astdiff_nocontent_gumtree
# NBL (Prutor) statement-level
python3 -m dgl_version.nbl.train_nx_astdiff_nocontent_gumtree
```

Evaluation

Each training script already contains evaluation: with the train() procedure disabled, the script skips directly to evaluation. To evaluate a pretrained model, copy it into the train_dirs location configured in utils/utils.py.
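The evaluation-only flow described above can be sketched as follows. This is a hypothetical outline, not the repository's actual script: the `train`/`evaluate` function names, the checkpoint filename, and the `train_dirs` value mirror the README's description but are assumptions.

```python
import os

# Assumed to mirror the train_dirs setting in utils/utils.py.
train_dirs = "train_dirs"

def train(model_dir):
    # The real script runs the full training loop here; omitted in this sketch.
    pass

def evaluate(model_dir):
    # Load the pretrained checkpoint copied into train_dirs and report metrics.
    # Here we only check that a checkpoint file is present.
    ckpt = os.path.join(model_dir, "model.pt")  # assumed checkpoint name
    return os.path.exists(ckpt)

if __name__ == "__main__":
    # train(train_dirs)  # disabled: the script skips straight to evaluation
    print("checkpoint found:", evaluate(train_dirs))
```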

Pretrained model:

Others

Please note that, while this is not required in our original settings, a CodeBERT pretrained file can be placed in the preprocess folder so that each node's AST content is used as a feature instead of just its node type.

Please cite the following article if you find FFL to be useful:

@inproceedings{Nguyen2022,
   author = {Thanh-Dat Nguyen and Thanh Le-Cong and Duc-Minh Luong and Van-Hai Duong and Xuan-Bach D Le and David Lo and Quyet-Thang Huynh},
   booktitle = {Proceedings of the 38th IEEE International Conference on Software Maintenance and Evolution (ICSME)},
   address = {Cyprus},
   keywords = {Graph Neural Network, Fault Localization, Programming Education},
   month = {11},
   title = {FFL: Fine-grained Fault Localization for Student Programs via Syntactic and Semantic Reasoning},
   url = {https://github.com/FFL2022/FFL},
   year = {2022},
}
