code-text-java
Metadata

license: mit
Programming language: Java
version: N/A
Date: CodeSearchNet (Jun 2020, paper release date)
Contaminated: Very Likely
Size: Standard Tokenizer (TreeSitter)

The dataset is imported from CodeXGLUE and pre-processed using their script.

Where to find in Semeru:

The dataset can be found at /nfs/semeru/semeru_datasets/code_xglue/code-to-text/java in Semeru
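If you are working from the Semeru copy (or a local clone with the same layout), the three .jsonl splits described under Data Format below can be loaded with the Hugging Face datasets library. A minimal sketch, assuming the directory contains train.jsonl, valid.jsonl, and test.jsonl:

```python
import os

from datasets import load_dataset

# Assumed layout: the directory holds the three .jsonl splits described
# under "Data Format" below.
DATA_DIR = "/nfs/semeru/semeru_datasets/code_xglue/code-to-text/java"

dataset = load_dataset(
    "json",
    data_files={
        "train": os.path.join(DATA_DIR, "train.jsonl"),
        "valid": os.path.join(DATA_DIR, "valid.jsonl"),
        "test": os.path.join(DATA_DIR, "test.jsonl"),
    },
)
print(dataset)  # split names and example counts
```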

CodeXGLUE -- Code-To-Text

Task Definition

The task is to generate natural language comments for code, and it is evaluated by the smoothed BLEU-4 score.
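CodeXGLUE ships its own evaluation script; the snippet below is only an illustrative sketch of a smoothed BLEU-4 score using NLTK, and the exact smoothing method used by the official evaluator may differ.

```python
from nltk.translate.bleu_score import SmoothingFunction, sentence_bleu

# One reference comment and one generated comment, both tokenized.
reference = "returns the sum of two integers".split()
hypothesis = "return the sum of two ints".split()

# Smoothed BLEU-4: 4-gram BLEU with smoothing, so that hypotheses with no
# higher-order n-gram matches do not collapse to a score of zero.
score = sentence_bleu(
    [reference],
    hypothesis,
    weights=(0.25, 0.25, 0.25, 0.25),
    smoothing_function=SmoothingFunction().method4,
)
print(f"smoothed BLEU-4: {score:.4f}")
```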

Dataset

The dataset we use comes from CodeSearchNet, and we filter it as follows (a minimal sketch of these filters appears after the list):

  • Remove examples whose code cannot be parsed into an abstract syntax tree.
  • Remove examples whose documents have fewer than 3 or more than 256 tokens.
  • Remove examples whose documents contain special tokens (e.g. <img ...> or https:...).
  • Remove examples whose documents are not written in English.
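The official preprocessing script in CodeXGLUE is the authoritative implementation of these rules; the sketch below is a simplified approximation, and its regular expression and English check are assumptions made for illustration.

```python
import re


def keep_example(example):
    """Return True if a CodeSearchNet example passes the filters above (sketch)."""
    doc_tokens = example["docstring_tokens"]

    # 1. Code must be parseable into an AST; here we only check it is non-empty
    #    (the real pipeline parses the code, e.g. with tree-sitter).
    if not example["code_tokens"]:
        return False

    # 2. Document length must be within [3, 256] tokens.
    if len(doc_tokens) < 3 or len(doc_tokens) > 256:
        return False

    doc = " ".join(doc_tokens)

    # 3. Drop documents containing special tokens such as HTML tags or URLs.
    if re.search(r"<\w+[^>]*>|https?:", doc):
        return False

    # 4. Drop documents that are not English; an ASCII check is a crude proxy.
    if not doc.isascii():
        return False

    return True
```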

Data Format

After preprocessing the dataset, you obtain three .jsonl files: train.jsonl, valid.jsonl, and test.jsonl.

In each uncompressed file, every line represents one function. The fields of a row are listed below, and a short example of reading one row follows the list.

  • repo: the owner/repo

  • path: the full path to the original file

  • func_name: the function or method name

  • original_string: the raw string before tokenization or parsing

  • language: the programming language

  • code/function: the part of the original_string that is code

  • code_tokens/function_tokens: tokenized version of code

  • docstring: the top-level comment or docstring, if it exists in the original string

  • docstring_tokens: tokenized version of docstring
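The row structure can be inspected directly. A short sketch that reads the first training example and prints some of the fields listed above, assuming train.jsonl is in the current directory:

```python
import json

# Read the first row of the training split and show which fields it contains.
with open("train.jsonl", encoding="utf-8") as f:
    example = json.loads(next(f))

print(sorted(example.keys()))                 # field names: repo, path, func_name, ...
print(example["func_name"])                   # the function or method name
print(" ".join(example["docstring_tokens"]))  # tokenized docstring
```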

Data Statistics

| Programming Language | Training | Dev   | Test   |
| -------------------- | -------- | ----- | ------ |
| Java                 | 164,923  | 5,183 | 10,955 |
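Since each line of a .jsonl file is one function, the split sizes in the table can be checked by counting lines. A quick sketch:

```python
# Quick check of the split sizes reported above: the line count of each
# .jsonl file equals its number of examples.
for split in ("train", "valid", "test"):
    with open(f"{split}.jsonl", encoding="utf-8") as f:
        print(split, sum(1 for _ in f))
```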

Reference

@article{husain2019codesearchnet,
  title={CodeSearchNet challenge: Evaluating the state of semantic code search},
  author={Husain, Hamel and Wu, Ho-Hsiang and Gazit, Tiferet and Allamanis, Miltiadis and Brockschmidt, Marc},
  journal={arXiv preprint arXiv:1909.09436},
  year={2019}
}