UniXcoder Base Unimodal

This is an unofficial reupload of microsoft/unixcoder-base-unimodal in the SafeTensors format, created with transformers 4.41.2. The goal of this reupload is to keep older models that are still relevant baselines from going stale as the Hugging Face ecosystem changes. It may also include minor corrections, such as the model max length configuration.
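
For reference, a reupload like this can be produced with a short conversion script. The sketch below is illustrative rather than the exact script used; the output directory name is hypothetical, and save_pretrained with safe_serialization=True is what writes the model.safetensors file.

from transformers import AutoModel, AutoTokenizer

# Load the original upstream checkpoint
model = AutoModel.from_pretrained("microsoft/unixcoder-base-unimodal")
tokenizer = AutoTokenizer.from_pretrained("microsoft/unixcoder-base-unimodal")

# safe_serialization=True writes model.safetensors instead of pytorch_model.bin
model.save_pretrained("unixcoder-base-unimodal-safetensors", safe_serialization=True)
tokenizer.save_pretrained("unixcoder-base-unimodal-safetensors")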

Properties

Property                        Value
Number of parameters            124,842,240
Torch dtype                     float32
Architectures                   RobertaModel
Bos token id                    0
Pad token id                    1
Eos token id                    2
Transformers version            4.41.2
Model type                      roberta
Vocab size                      50,000
Hidden size                     768
Num hidden layers               12
Num attention heads             12
Hidden act                      gelu
Intermediate size               3,072
Hidden dropout prob             0.10
Attention probs dropout prob    0.10
Max position embeddings         1,026
Type vocab size                 10
Initializer range               0.02
Layer norm eps                  1e-05
Position embedding type         absolute
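
These values come straight from the checkpoint's config.json and can be verified locally, as in the minimal sketch below (assuming the upstream repo id). The parameter count should match the table because RobertaModel includes the pooler head.

from transformers import AutoConfig, AutoModel

config = AutoConfig.from_pretrained("microsoft/unixcoder-base-unimodal")
print(config.hidden_size, config.num_hidden_layers, config.vocab_size)  # 768 12 50000

model = AutoModel.from_pretrained("microsoft/unixcoder-base-unimodal")
print(sum(p.numel() for p in model.parameters()))  # 124842240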

Original model card of unixcoder-base below:


Model Card for UniXcoder-base

Model Details

Model Description

UniXcoder is a unified cross-modal pre-trained model that leverages multimodal data (i.e., code comments and ASTs) to pretrain code representations.

  • Developed by: Microsoft Team
  • Shared by [Optional]: Hugging Face
  • Model type: Feature Engineering
  • Language(s) (NLP): en
  • License: Apache-2.0
  • Related Models:
    • Parent Model: RoBERTa
  • Resources for more information:
    • Associated paper: https://arxiv.org/abs/2203.03850

Uses

Direct Use

Feature Engineering
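
As a concrete illustration of this use, the sketch below extracts a fixed-size code embedding from the model's hidden states. Mean pooling over the last hidden state is an assumption made for this example, not a pooling strategy prescribed by the model card.

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/unixcoder-base")
model = AutoModel.from_pretrained("microsoft/unixcoder-base")
model.eval()

code = "def add(a, b): return a + b"
inputs = tokenizer(code, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # shape (1, seq_len, 768)

embedding = hidden.mean(dim=1)  # mean pooling -> shape (1, 768)
print(embedding.shape)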

Downstream Use [Optional]

More information needed

Out-of-Scope Use

More information needed

Bias, Risks, and Limitations

Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

Recommendations

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

Training Details

Training Data

More information needed

Training Procedure

Preprocessing

More information needed

Speeds, Sizes, Times

More information needed

Evaluation

Testing Data, Factors & Metrics

Testing Data

More information needed

Factors

The model creators note in the associated paper:

UniXcoder has slightly worse BLEU-4 scores on both code summarization and generation tasks. The main reasons may come from two aspects. One is the amount of NL-PL pairs in the pre-training data […]

Metrics

The model creators note in the associated paper:

We evaluate UniXcoder on five tasks over nine public datasets, including two understanding tasks, two generation tasks and an autoregressive task. To further evaluate the performance of code fragment embeddings, we also propose a new task called zero-shot code-to-code search.
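
Zero-shot code-to-code search can be approximated by ranking candidate snippets by cosine similarity between their embeddings, as in the hedged sketch below. The mean-pooling embed helper is an assumption made for illustration, not the exact setup evaluated in the paper.

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/unixcoder-base")
model = AutoModel.from_pretrained("microsoft/unixcoder-base")
model.eval()

def embed(code):
    # L2-normalized mean-pooled embedding (pooling choice is an assumption)
    inputs = tokenizer(code, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    return F.normalize(hidden.mean(dim=1), dim=-1)

query = "def add(a, b): return a + b"
candidates = [
    "int add(int a, int b) { return a + b; }",
    "def greet(name): print('hi', name)",
]
# Cosine similarity of normalized vectors reduces to a dot product
scores = [float(embed(query) @ embed(c).T) for c in candidates]
print(scores)  # the semantically equivalent snippet should rank higher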

Results

The model creators note in the associated paper:

Taking zero-shot code-code search task as an example, after removing contrastive learning, the performance drops from 20.45% to 13.73%.

Model Examination

More information needed

Environmental Impact

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).

  • Hardware Type: More information needed
  • Hours used: More information needed
  • Cloud Provider: More information needed
  • Compute Region: More information needed
  • Carbon Emitted: More information needed

Technical Specifications [optional]

Model Architecture and Objective

More information needed

Compute Infrastructure

More information needed

Hardware

More information needed

Software

More information needed

Citation

BibTeX:

@misc{https://doi.org/10.48550/arxiv.2203.03850,
  doi = {10.48550/ARXIV.2203.03850},
  url = {https://arxiv.org/abs/2203.03850},
  author = {Guo, Daya and Lu, Shuai and Duan, Nan and Wang, Yanlin and Zhou, Ming and Yin, Jian},
  keywords = {Computation and Language (cs.CL), Programming Languages (cs.PL), Software Engineering (cs.SE), FOS: Computer and information sciences, FOS: Computer and information sciences},
  title = {UniXcoder: Unified Cross-Modal Pre-training for Code Representation},
  publisher = {arXiv},
  year = {2022}
}

Glossary [optional]

More information needed

More Information [optional]

More information needed

Model Card Authors [optional]

Microsoft Team in collaboration with Ezi Ozoani and the Hugging Face Team.

Model Card Contact

More information needed

How to Get Started with the Model

Use the code below to get started with the model.

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/unixcoder-base")
model = AutoModel.from_pretrained("microsoft/unixcoder-base")
 