Neural Module Networks for Reasoning over Text

This is the official code for the ICLR 2020 paper Neural Module Networks for Reasoning over Text. This repository contains everything needed to replicate our experiments and can be used to extend our model as you wish.

Live Demo: – Go to ‘Reading Comprehension’ and select ‘NMN (trained on DROP)’ in the Model section.

Resources


  1. Download the data and a trained model checkpoint from here. Unzip the downloaded contents and place the resulting directory iclr_cameraready inside a convenient location, henceforth referred to as – MODEL_CKPT_PATH

  2. Clone the allennlp-semparse repository from here to a convenient location, henceforth referred to as – PATH_TO_allennlp-semparse. Check out the specific commit that this code is built on using git checkout 937d594. This pinning will no longer be necessary once allennlp-semparse becomes pip-installable.


Installation

The code is written in Python using AllenNLP and allennlp-semparse.

The following commands create a miniconda environment, install the required packages, and create symlinks for allennlp-semparse and the downloaded resources.

# Make conda environment
conda create --name nmn-drop python=3.6
conda activate nmn-drop

# Install required packages
pip install allennlp==0.9
pip install dateparser==0.7.2
python -m spacy download en_core_web_lg

# Clone code and make symlinks
git clone
cd nmn-drop/
mkdir resources; cd resources; ln -s MODEL_CKPT_PATH/iclr_cameraready ./; cd ..    
ln -s PATH_TO_allennlp-semparse/allennlp-semparse/allennlp_semparse/ ./ 


Prediction

To make predictions on your own data, format it as a JSON Lines file input.jsonl, where each line is a valid JSON object containing the keys "question" and "passage".
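For example, a minimal input.jsonl with one record can be produced as follows (the question and passage text here are invented for illustration):

```python
import json

# One fabricated example with the two required keys.
examples = [
    {
        "question": "How many yards was the longest field goal?",
        "passage": "In the third quarter, the Bears kicked field goals of 25 and 42 yards.",
    },
]

# Write one JSON object per line.
with open("input.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")

# Sanity check: every line parses back into a dict with both keys.
with open("input.jsonl") as f:
    for line in f:
        record = json.loads(line)
        assert "question" in record and "passage" in record
```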

Run the command

allennlp predict \
    --output-file output.jsonl \
    --predictor drop_demo_predictor \
    --include-package semqa \
    --silent \
    --batch-size 1 \
    resources/iclr_cameraready/ckpt/model.tar.gz \
    input.jsonl

The output file output.jsonl contains, for each input line, the model's prediction in an additional key "answer".
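A short sketch for collecting the predicted answers afterwards (read_answers is a hypothetical helper, and the demo fabricates a one-line predictions file in place of real allennlp predict output):

```python
import json

def read_answers(path):
    """Collect the value of the "answer" key from every line of a predictions file."""
    with open(path) as f:
        return [json.loads(line)["answer"] for line in f if line.strip()]

# Demo on a fabricated predictions file; real lines come from `allennlp predict`
# and also retain the original "question"/"passage" keys.
with open("output.jsonl", "w") as f:
    f.write(json.dumps({"question": "Who threw the longest TD?", "answer": "Brady"}) + "\n")

answers = read_answers("output.jsonl")
```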


Evaluation

To evaluate the model on the dev set, run the command – bash scripts/iclr/

The model_ckpt/data path in the script can be modified to evaluate a different model on a different dataset.


Visualization

To generate a text-based visualization of the model's predictions on the development data, run the command – bash scripts/iclr/

A file drop_mydev_verbosepred.txt is written to MODEL_CKPT_PATH/iclr_cameraready/ckpt/predictions containing this visualization.

An interactive demo of our model will be available soon.


Training

We already provide a trained model checkpoint and the subset of the DROP data used in the ICLR 2020 paper with the resources above.

If you would like to re-train the model on this data, run the command – bash scripts/iclr/

The model checkpoint will be saved at MODEL_CKPT_PATH/iclr_cameraready/my_ckpt.

Note that this code requires the DROP data to be preprocessed with additional information such as tokenization, numbers, and dates. To train a model on a different subset of the DROP data, this preprocessing can be performed by running the python script datasets/drop/preprocess/ on any DROP-formatted json file.
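For orientation, here is a rough sketch of the kind of annotation such preprocessing adds, using whitespace splitting and a naive digit check as stand-ins for the script's actual tokenization and number/date extraction (add_tokens, the added field names, and the sample data are invented for illustration):

```python
import json

def add_tokens(drop_data):
    """Annotate a DROP-formatted dict with token and number fields in place."""
    for passage_id, info in drop_data.items():
        tokens = info["passage"].split()
        info["passage_tokens"] = tokens
        # Record which passage tokens look like plain numbers.
        info["passage_numbers"] = [t for t in tokens if t.replace(",", "").isdigit()]
        for qa in info["qa_pairs"]:
            qa["question_tokens"] = qa["question"].split()
    return drop_data

# Tiny example with the same top-level structure as a DROP json file:
# a dict mapping passage ids to a passage and its qa_pairs.
data = {
    "nfl_0": {
        "passage": "The Bears scored 14 points in the first quarter .",
        "qa_pairs": [{"question": "How many points did the Bears score ?"}],
    }
}
add_tokens(data)
```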


Citation

Please consider citing our work if you found this code or our paper beneficial to your research.

@inproceedings{gupta2020nmn,
  author = {Nitish Gupta and Kevin Lin and Dan Roth and Sameer Singh and Matt Gardner},
  title = {Neural Module Networks for Reasoning over Text},
  booktitle = {International Conference on Learning Representations (ICLR)},
  year = {2020}
}

Contributions and Contact

This code was developed by Nitish Gupta, contact

If you’d like to contribute code, feel free to open a pull request. If you find an issue with the code or require additional support, please open an issue.