Locality-Sensitive Hashing-Based Efficient Point Transformer with Applications in High-Energy Physics
Authors: Siqi Miao, Zhiyuan Lu, Mia Liu, Javier Duarte, Pan Li
Abstract: This study introduces a novel transformer model optimized for large-scale point cloud processing in scientific domains such as high-energy physics (HEP) and astrophysics. Addressing the limitations of graph neural networks and standard transformers, our model integrates local inductive bias and achieves near-linear complexity with hardware-friendly regular operations. One contribution of this work is the quantitative analysis of the error-complexity tradeoff of various sparsification techniques for building efficient transformers. Our findings highlight the superiority of using locality-sensitive hashing (LSH), especially OR & AND-construction LSH, in kernel approximation for large-scale point cloud data with local inductive bias. Based on this finding, we propose the LSH-based Efficient Point Transformer (HEPT), which combines E2LSH with OR & AND constructions and is built upon regular computations. HEPT demonstrates remarkable performance in two critical yet time-consuming HEP tasks, significantly outperforming existing GNNs and transformers in accuracy and computational speed, marking a significant advancement in geometric deep learning and large-scale scientific data processing. Our code is available at https://github.com/Graph-COM/HEPT.
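
For readers unfamiliar with the hashing scheme the abstract refers to, the sketch below illustrates E2LSH with OR & AND constructions in NumPy: points sharing a bucket id in any table are treated as candidate neighbors, which is how LSH can restrict attention to local point pairs. This is a minimal illustration of the general technique, not HEPT's actual implementation; the function name e2lsh_buckets and the parameters r, k_and, and n_or are assumptions made for this example.

    # Minimal sketch of OR & AND-construction E2LSH (not the HEPT code).
    import numpy as np

    def e2lsh_buckets(points, r=1.0, k_and=3, n_or=4, seed=0):
        """Assign each point one bucket id per hash table.

        points : (n, d) array of point coordinates.
        r      : quantization width of each E2LSH hash h(x) = floor((a.x + b) / r).
        k_and  : hashes concatenated per table (AND construction).
        n_or   : number of independent hash tables (OR construction).
        Returns an (n, n_or) integer array; points that share a value in any
        column are candidate neighbors.
        """
        rng = np.random.default_rng(seed)
        n, d = points.shape
        bucket_ids = np.empty((n, n_or), dtype=np.int64)
        for t in range(n_or):                      # OR: independent hash tables
            a = rng.normal(size=(d, k_and))        # random Gaussian projections
            b = rng.uniform(0.0, r, size=k_and)    # random offsets in [0, r)
            h = np.floor((points @ a + b) / r)     # (n, k_and) E2LSH codes
            _, inv = np.unique(h, axis=0, return_inverse=True)
            bucket_ids[:, t] = inv.reshape(-1)     # AND: concatenated codes -> one id
        return bucket_ids

    # Usage: pairs colliding in any of the n_or tables become candidate pairs.
    pts = np.random.default_rng(1).normal(size=(1000, 3))
    buckets = e2lsh_buckets(pts, r=0.5, k_and=4, n_or=8)

The AND construction (concatenating k_and hashes) makes each bucket tighter, reducing false positives, while the OR construction (n_or independent tables) recovers true neighbors that a single table would miss; the tradeoff between the two is the error-complexity tradeoff the abstract analyzes.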