Papers
arxiv:2407.02880

Knowledge Composition using Task Vectors with Learned Anisotropic Scaling

Published on Jul 3
· Submitted by fredzzhang on Jul 10

Abstract

Pre-trained models produce strong generic representations that can be adapted via fine-tuning. The learned weight difference relative to the pre-trained model, known as a task vector, characterises the direction and stride of fine-tuning. The significance of task vectors is such that simple arithmetic operations on them can be used to combine diverse representations from different domains. This paper builds on these properties of task vectors and aims to answer (1) whether components of task vectors, particularly parameter blocks, exhibit similar characteristics, and (2) how such blocks can be used to enhance knowledge composition and transfer. To this end, we introduce aTLAS, an algorithm that linearly combines parameter blocks with different learned coefficients, resulting in anisotropic scaling at the task vector level. We show that such linear combinations explicitly exploit the low intrinsic dimensionality of pre-trained models, with only a few coefficients being the learnable parameters. Furthermore, composition of parameter blocks leverages the already learned representations, thereby reducing the dependency on large amounts of data. We demonstrate the effectiveness of our method in task arithmetic, few-shot recognition and test-time adaptation, with supervised or unsupervised objectives. In particular, we show that (1) learned anisotropic scaling allows task vectors to be more disentangled, causing less interference in composition; (2) task vector composition excels with scarce or no labeled data and is less prone to domain shift, thus leading to better generalisability; (3) mixing the most informative parameter blocks across different task vectors prior to training can reduce the memory footprint and improve the flexibility of knowledge transfer. Moreover, we show the potential of aTLAS as a PEFT method, particularly with less data, and demonstrate its scalability.
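The composition the abstract describes can be sketched in a few lines: each task vector is a set of parameter blocks (the fine-tuned weights minus the pre-trained ones), and instead of one scalar per task vector, a separate coefficient is learned per block. A minimal NumPy sketch, with hypothetical names and toy two-element "blocks" standing in for real weight tensors (in the paper the coefficients are learned, not hand-set):

```python
# Sketch of anisotropic task-vector composition: theta = theta_0 + sum_k lambda_k[b] * tau_k[b],
# where lambda_k[b] is a per-block coefficient. Names here are illustrative, not the paper's API.
import numpy as np

def compose(pretrained, task_vectors, coeffs):
    """Merge task vectors into the pre-trained weights with per-block coefficients."""
    merged = {name: w.copy() for name, w in pretrained.items()}
    for k, tau in enumerate(task_vectors):
        for name, block in tau.items():
            merged[name] += coeffs[k][name] * block  # anisotropic: one scalar per block
    return merged

# Toy example: two parameter blocks, two task vectors.
theta0 = {"w1": np.zeros(2), "w2": np.zeros(2)}
tau_a = {"w1": np.array([1.0, 0.0]), "w2": np.array([0.0, 1.0])}
tau_b = {"w1": np.array([0.0, 2.0]), "w2": np.array([2.0, 0.0])}
coeffs = [{"w1": 0.5, "w2": 1.0}, {"w1": 1.0, "w2": 0.25}]  # learnable in practice
merged = compose(theta0, [tau_a, tau_b], coeffs)
# merged["w1"] -> [0.5, 2.0], merged["w2"] -> [0.5, 1.0]
```

With all coefficients tied to a single value per task vector, this reduces to standard task arithmetic; the per-block coefficients are what make the scaling anisotropic.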

Community

Paper author Paper submitter

The paper exploits the idea of using task vectors as knowledge carriers and presents an algorithm to learn task vector compositions to either produce multi-task models or transfer the knowledge to applications where training data is scarce.

Sorry, I don't quite understand what the difference is between aTLAS and AdaMerging.
https://arxiv.org/abs/2310.02575

Paper author

Hi @s-JoL ,

Thanks for taking an interest in our work.

AdaMerging is indeed similar to our method. However, we presented a more general formulation of the anisotropic scaling of task vectors to scale different parameter blocks differently. Note that these parameter blocks can be very flexible. They could refer to an entire weight matrix, in which case the method is the same as AdaMerging, or they could be parts of a matrix such as rows, columns, or random partitions. We detailed this idea of randomly partitioning a weight matrix and learning different coefficients for different partitions in Section 6.2 of the paper. This effectively scales up the method.
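The random-partitioning idea mentioned above (Section 6.2 of the paper) can be illustrated with a short sketch: each entry of a task-vector block is randomly assigned to one of several partitions, and each partition gets its own coefficient. The function name and shapes below are illustrative assumptions, not the paper's code:

```python
# Sketch: split one task-vector block into random partitions, each with its own
# learned coefficient. This scales the number of coefficients beyond one per matrix.
import numpy as np

def partition_scale(tau, coeffs, rng):
    """Scale each entry of tau by the coefficient of its randomly assigned partition."""
    assignment = rng.integers(0, len(coeffs), size=tau.shape)  # partition id per entry
    return np.asarray(coeffs)[assignment] * tau                # broadcast per-entry scale

rng = np.random.default_rng(0)
tau = np.ones((4, 4))                                   # toy task-vector block
scaled = partition_scale(tau, coeffs=[0.0, 1.0], rng=rng)
# Each entry is now either 0.0 or 1.0, depending on its partition.
```

With a single partition this collapses to AdaMerging-style per-matrix scaling; more partitions give finer-grained coefficients at essentially no extra memory for the weights themselves.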

In addition, AdaMerging only investigated the application of model merging, where the objective is to retain the abilities of different models. Our paper, on the other hand, also investigated transferring the knowledge in task vectors to a new task, and we presented a detailed study on few-shot recognition to demonstrate how such knowledge can be transferred to a data-scarce domain.

Hope that answers your question.

Cheers,
Fred.

