Due to the large size of the MNRE dataset, please download the dataset from the original repository. Unzip the data and place it in the data directory, then create a directory for checkpoints:
mkdir ckpt
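As a minimal sketch of the unpacking step, assuming the dataset arrives as an archive named mnre_data.zip (a hypothetical name; use whatever file the original repository actually provides):
unzip mnre_data.zip -d data/   # hypothetical archive name; extract into the data directory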
We also use the detected visual objects provided by previous work, which can be downloaded using the command:
cd data/
wget 120.27.214.45/Data/re/multimodal/data.tar.gz
tar -xzvf data.tar.gz
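If you want to inspect what the archive contains, for example to confirm the expected directory layout, tar can list its entries without extracting:
tar -tzf data.tar.gz | head   # preview the first entries of the archive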
Install all necessary dependencies:
conda create -n focalmre python=3.7
conda activate focalmre
pip install -r requirements.txt
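To sanity-check the environment after installation, assuming the requirements include PyTorch (an assumption; see requirements.txt for the actual dependency list):
python -c "import torch; print(torch.__version__)"   # assumes torch is listed in requirements.txt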
The best hyperparameters we found have been written in the run_mre.sh file.
You can simply run the script for multimodal relation extraction:
sh run_mre.sh
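If you want to tune hyperparameters yourself, edit run_mre.sh. As a purely hypothetical sketch of the kind of wrapper such scripts contain (the actual entry point and flag names in this repo may differ; consult run_mre.sh for the real ones):
# hypothetical run script contents; not the repo's actual flags
python train.py \
    --batch_size 16 \
    --lr 1e-5 \
    --num_epochs 10 \
    --ckpt_dir ckpt/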
You can simply run the script to test a saved checkpoint:
sh run_test.sh
You can also test with our fine-tuned models, which can be downloaded from the following link: https://drive.google.com/file/d/1Nff_sSB4n7p_qoE9ryK7qxmAYSW1TF2z/view?usp=sharing
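If you prefer to fetch the checkpoint from the command line, the gdown utility (installable with pip install gdown) can download Google Drive files by URL; the file ID below is taken from the link above, and placing the file in ckpt/ is an assumption about where the test script looks for checkpoints:
pip install gdown
gdown "https://drive.google.com/uc?id=1Nff_sSB4n7p_qoE9ryK7qxmAYSW1TF2z"
mv <downloaded_file> ckpt/   # placeholder; substitute the actual downloaded file name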