---
license: apache-2.0
language:
- en
library_name: transformers
---
# AnalysisObjectTransformer Model
This repository contains the implementation of the AnalysisObjectTransformer model, a deep learning architecture designed for event classification with data from the CERN LHC.
The model operates on reconstructed-object and event-level features. Multi-head attention extracts correlations between reconstructed objects in the final state, such as jets or leptons, while event-level features summarize the event as a whole, e.g. total hadronic energy or missing transverse energy.
The model achieves state-of-the-art performance on final states that can be summarized as jets accompanied by missing transverse energy.
## Model Overview
The AnalysisObjectTransformer model processes object-level features (for jets: energy, mass, area, b-tag score) in any order (permutation invariance), together with event-level features (HT, MET), to classify signal against background processes and enhance sensitivity to rare BSM signatures.
### Components
See [**here**](https://excalidraw.com/#json=tCXGu1s6Az9wh4md45JU6,A3ezTIoqB10HVxOt4hhRSA) for the complete architecture.
- **Embedding Layers**: Transform input data into a higher-dimensional space for subsequent processing.
- **Attention Blocks (AttBlock)**: Utilize multi-head attention to capture dependencies between different elements of the input data.
- **Class Blocks (ClassBlock)**: Extend the attention mechanism with a class token, enabling the model to aggregate class-relevant features. The implementation is based on "Going deeper with Image Transformers" (CaiT): https://arxiv.org/abs/2103.17239 (a minimal sketch follows this list).
- **MLP Head**: A sequence of fully connected layers that maps the output of the transformer blocks to the final prediction targets.
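As an illustration of the ClassBlock mechanism, here is a minimal sketch of CaiT-style class attention in plain PyTorch. This is not the repository's implementation (see the code in this repo for that); it only shows the core idea: the class token alone issues queries against the full token sequence, so class-relevant information is pooled into a single token.

```python
import torch
import torch.nn as nn

class ClassAttentionSketch(nn.Module):
    """Minimal CaiT-style class attention (illustrative only):
    only the class token attends to the object sequence."""
    def __init__(self, dim: int, num_heads: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, cls_token: torch.Tensor, objects: torch.Tensor) -> torch.Tensor:
        # cls_token: (batch, 1, dim); objects: (batch, num_objects, dim)
        tokens = self.norm(torch.cat([cls_token, objects], dim=1))
        # Query is the class token only; keys/values span all tokens.
        out, _ = self.attn(tokens[:, :1], tokens, tokens)
        return cls_token + out  # residual update of the class token
```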
## Usage
First, clone the repository:
```bash
git clone https://huggingface.co/maciek-g/AnalysisObjectTransformer
```
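The import in the next snippet assumes the cloned repository is on your Python path. One minimal way to arrange that, assuming you cloned into the current working directory:

```python
import sys

# Path of the cloned repository; adjust if you cloned elsewhere.
sys.path.append("AnalysisObjectTransformer")
```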
You can then import the model class and use it within standard PyTorch and PyTorch Lightning training workflows.
```python
from particle_transformer import AnalysisObjectTransformer
model = AnalysisObjectTransformer(
    input_dim_obj=...,
    input_dim_event=...,
    embed_dims=...,
    linear_dims1=...,
    linear_dims2=...,
    mlp_hidden_1=...,
    mlp_hidden_2=...,
    num_heads=...,
)
```
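As a sketch of the expected inputs: the forward signature below is an assumption based on the constructor arguments (a padded object sequence plus an event-level feature vector); check the repository code for the exact call. The shapes and feature counts are hypothetical placeholders.

```python
import torch

# Hypothetical shapes: a batch of 4 events, up to 10 objects per event,
# input_dim_obj=6 features per object, input_dim_event=3 event-level features.
objects = torch.randn(4, 10, 6)  # (batch, num_objects, input_dim_obj)
events = torch.randn(4, 3)       # (batch, input_dim_event)

model.eval()  # disable dropout etc. so the checks below are deterministic
logits = model(objects, events)  # assumed forward signature

# Permutation invariance: shuffling the object order should leave the
# prediction (approximately) unchanged.
perm = torch.randperm(objects.shape[1])
assert torch.allclose(logits, model(objects[:, perm], events), atol=1e-5)
```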
## Parameter definitions
- `input_dim_obj`: Number of features associated with each event object (features per jet, lepton, etc.).
- `input_dim_event`: Number of features associated with the event as a whole (number of jets, total hadronic energy, total missing transverse energy, etc.).
- `embed_dims`: Embedding dimensions used for the object sequence.