ValueZeroing
The official repo for the EACL 2023 paper "Quantifying Context Mixing in Transformers"
Topics: attention-mechanism, contextualized-representation, feature-attribution, interpretability, pre-trained-language-models, transformers
Created: 2022-10-27T16:24:11
Updated: 2024-11-06T22:42:16
Stars: 12
Stars Increase: 0