
ICPC'21 MMTrans

Source code for "A Multi-Modal Transformer-based Code Summarization Approach for Smart Contracts" (ICPC 2021).

A General Introduction to the Whole Framework

  • data_process: data pre-processing.
  • Dictionary: vocabulary-related code.
  • modules: model-related code.
  • DataOutput.py: input data pipeline.
  • Train.py: trains the model.
  • Evaluation.py: evaluates the model.
  • EvaluationMetrics.py: the automated evaluation metrics adopted in the experiments.
  • Configs.py: model hyper-parameter settings (a hypothetical sketch follows this list).
  • An earlier code repo, which is no longer maintained, is available here 👉 Link.
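
For orientation, here is a hypothetical sketch of the kind of hyper-parameter setup a Configs.py for a Transformer model typically holds. Every name and value below is an illustrative placeholder, not the repository's actual configuration.

# Hypothetical Configs.py-style hyper-parameter setup for a Transformer model.
# All names and values are illustrative placeholders.
NUM_LAYERS = 4      # encoder/decoder layer count (placeholder)
D_MODEL = 512       # embedding / hidden dimension (placeholder)
NUM_HEADS = 8       # attention-head count; varied across the experiments
DFF = 2048          # feed-forward inner dimension (placeholder)
DROPOUT_RATE = 0.1  # dropout applied inside the Transformer blocks
BATCH_SIZE = 64     # training batch size (placeholder)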

Data and Trained Models

  • The dataset is available here: dataset. Put the datasets folder under the root directory.

  • Trained models are available here: Models. Put each checkpoint folder under the root directory, and the program will automatically load the latest checkpoint; a minimal sketch of this loading idiom follows. (The number suffixed to each folder name is the attention-head count used in that experiment.)
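
The automatic loading described above matches the standard TensorFlow 2 checkpoint idiom. The following is a minimal sketch of that idiom, assuming a placeholder model and a checkpoint folder named checkpoints_8; it is not the repository's actual code.

import tensorflow as tf

# Minimal sketch of the TF2 "restore the latest checkpoint" idiom.
# `model`, `optimizer`, and the folder name are placeholders.
model = tf.keras.Sequential([tf.keras.layers.Dense(8)])
optimizer = tf.keras.optimizers.Adam()
ckpt = tf.train.Checkpoint(model=model, optimizer=optimizer)
manager = tf.train.CheckpointManager(ckpt, "./checkpoints_8", max_to_keep=5)
if manager.latest_checkpoint:               # newest checkpoint in the folder, if any
    ckpt.restore(manager.latest_checkpoint)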

Other useful tools

  • Tools for generating the SBT sequences and graphs (XML format) are provided here.

  • The model is implemented in TensorFlow 2.3, based on the official Transformer tutorial (whose core attention operation is sketched below).
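
The core operation of that tutorial is scaled dot-product attention; the sketch below reproduces it for orientation only and is not taken from this repository.

import tensorflow as tf

# Scaled dot-product attention, following the TensorFlow Transformer tutorial.
def scaled_dot_product_attention(q, k, v, mask=None):
    matmul_qk = tf.matmul(q, k, transpose_b=True)   # (..., seq_len_q, seq_len_k)
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    scaled_logits = matmul_qk / tf.math.sqrt(dk)    # scale by sqrt(d_k)
    if mask is not None:
        scaled_logits += (mask * -1e9)              # push masked positions toward -inf
    attention_weights = tf.nn.softmax(scaled_logits, axis=-1)
    return tf.matmul(attention_weights, v), attention_weights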

Welcome to Cite!

  • If you find this paper or the related tools useful, the following BibTeX entry would be appropriate for citation:
@misc{yang2021multimodal,
      title={A Multi-Modal Transformer-based Code Summarization Approach for Smart Contracts}, 
      author={Zhen Yang and Jacky Keung and Xiao Yu and Xiaodong Gu and Zhengyuan Wei and Xiaoxue Ma and Miao Zhang},
      year={2021},
      eprint={2103.07164},
      archivePrefix={arXiv},
      primaryClass={cs.SE}
}
