BERT-ASC

A model for implicit aspect sentiment analysis.

This is the source code for the paper: Murtadha, Ahmed, et al. "BERT-ASC: Auxiliary-Sentence Construction for Implicit Aspect Learning in Sentiment Analysis" [1].
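The core idea, building an auxiliary sentence from aspect-related candidate words and pairing it with the review for BERT sentence-pair classification, can be sketched roughly as follows. The template, aspect name, and candidate words here are illustrative assumptions, not the exact construction from the paper:

```python
# Rough sketch of auxiliary-sentence construction for implicit aspect
# learning; the join template and example words are assumptions.

def build_auxiliary_sentence(aspect, candidates):
    """Join an aspect category with its candidate words into one auxiliary sentence."""
    return f"{aspect} {' '.join(candidates)}"

def make_bert_pair(review, aspect, candidates):
    """Return the (sentence A, sentence B) pair fed to a BERT sentence-pair classifier."""
    return review, build_auxiliary_sentence(aspect, candidates)

review = "The waiter took forever to bring the bill."
pair = make_bert_pair(review, "service", ["waiter", "slow", "staff"])
print(pair[1])  # service waiter slow staff
```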

Data

The datasets used in our experiments can be downloaded from the SemEval website.

Prerequisites:

Required packages are listed in the requirements.txt file:

pip install -r requirements.txt

Pre-processing

  • Generate seed words for a given dataset (e.g., semeval):
    • Go to L-LDA/ and run the following command:
     python run_l_LDA.py --dataset semeval
    

The original code of L-LDA is publicly available.

  • Generate the semantic candidates:
    • Go to ASC_generating/
    • The processed data and embeddings for the restaurant domain are available. Note that these files were originally processed by Ruidan He.
    • To process your own data and embeddings, put your data file in datasets and run:
     python preprocessing.py
     python generate_domain_embedding.py
    
    • Run the following command to extract the semantic candidates:
     python semantic_candidate_generating.py --dataset semeval
    
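A minimal sketch of what semantic-candidate extraction might look like, assuming candidates are words whose domain-embedding cosine similarity to an aspect seed word exceeds a threshold. The tiny hand-made vectors below are stand-ins; the real script works with the embeddings produced by generate_domain_embedding.py:

```python
import math

# Toy domain embeddings; these values are made up for illustration only.
EMB = {
    "food":   [0.9, 0.1, 0.0],
    "pizza":  [0.8, 0.2, 0.1],
    "waiter": [0.1, 0.9, 0.0],
    "table":  [0.2, 0.3, 0.8],
}

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def semantic_candidates(seed, words, threshold=0.8):
    """Keep words whose embedding is close enough to the seed word's embedding."""
    sv = EMB[seed]
    return [w for w in words if w in EMB and cosine(EMB[w], sv) >= threshold]

print(semantic_candidates("food", ["pizza", "waiter", "table"]))  # ['pizza']
```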
  • Generate the syntactic candidates:
    • Run the following command to generate the syntactic information:
     python ASC_generating/opinion_words_extracting.py --dataset semeval
    
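As a rough illustration of what opinion-word extraction does, the sketch below collects opinion words near an aspect term using a toy lexicon and a fixed token window. Both the lexicon and the window heuristic are assumptions; opinion_words_extracting.py presumably uses richer syntactic information:

```python
# Toy opinion lexicon; a stand-in assumption for the real extraction logic.
OPINION_WORDS = {"great", "slow", "terrible", "delicious", "friendly"}

def opinion_candidates(tokens, aspect, window=2):
    """Collect opinion words appearing within `window` tokens of the aspect term."""
    if aspect not in tokens:
        return []
    i = tokens.index(aspect)
    nearby = tokens[max(0, i - window): i + window + 1]
    return [t for t in nearby if t in OPINION_WORDS]

tokens = "the pizza was delicious but the waiter was slow".split()
print(opinion_candidates(tokens, "waiter"))  # ['slow']
```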

Training:

  • To train BERT-ASC:
    • Run the following command:
     python code/run.py --dataset semeval
    
    • The supported parameters are:
      • --dataset = {semeval, sentihood}
    • Alternatively, run the script in code/scripts:
     sh training.sh 0 bert-base-uncased
    
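The documented CLI surface is small: --dataset selects the benchmark. A sketch of the kind of argument parsing run.py likely performs (the default value and any other flags are assumptions):

```python
import argparse

def build_parser():
    # Mirrors only the documented CLI surface; other flags in run.py are not shown.
    parser = argparse.ArgumentParser(description="Train BERT-ASC")
    parser.add_argument("--dataset", choices=["semeval", "sentihood"],
                        default="semeval",
                        help="which benchmark to train on")
    return parser

args = build_parser().parse_args(["--dataset", "sentihood"])
print(args.dataset)  # sentihood
```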

Evaluation:

  • To evaluate BERT-ASC:

    • Run the following command:
     python code/evaluate.py --dataset semeval
    
    • The supported parameters are:
      • --dataset = {semeval, sentihood}
    • Alternatively, run the script in code/scripts:
     sh evaluate.sh 0 bert-base-uncased
    
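For reference, a minimal accuracy computation of the kind an aspect-sentiment evaluation script performs; evaluate.py also reports other metrics, so this is only a sketch with made-up labels:

```python
def accuracy(gold, pred):
    """Fraction of aspect-sentiment predictions matching the gold labels."""
    assert len(gold) == len(pred)
    return sum(g == p for g, p in zip(gold, pred)) / len(gold)

# Hypothetical gold labels and model predictions for four aspect instances.
gold = ["positive", "negative", "neutral", "positive"]
pred = ["positive", "negative", "positive", "positive"]
print(accuracy(gold, pred))  # 0.75
```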

If you use the code, please cite the paper:

@article{murtadha2022bert,
  title={BERT-ASC: Auxiliary-Sentence Construction for Implicit Aspect Learning in Sentiment Analysis},
  author={Murtadha, Ahmed and Pan, Shengfeng and Wen, Bo and Su, Jianlin and Zhang, Wenze and Liu, Yunfeng},
  journal={arXiv preprint arXiv:2203.11702},
  year={2022}
}
