Optimisation of the Data Reduction for the Belle II Pixel Detector and Development of a new Track Finding Algorithm using the Belle II Vertex Detector
Thesis | BELLE2-PTHESIS-2022-005
Christian Wessel ; Jochen Dingfelder ; Florian Bernlochner
2022
Christian Wessel
Bonn
Abstract: The Belle II experiment, located at the SuperKEKB collider in Tsukuba, Japan, is recording data at world-record luminosities in order to provide world-leading results in the field of particle physics in the future. To achieve this goal, it is equipped with a two-layer pixel detector (PXD) and a four-layer strip detector (SVD). Due to the strong final focusing of the beams at the interaction point (IP) and the close proximity of the PXD to the IP, the number of hits on the PXD is dominated by beam backgrounds. The data rate is expected to be on the order of several GB/s, which is too high for the readout to cope with and an unnecessary amount of data to store, as most of the hits originate from background. Thus an online data reduction system is used to discard most of the PXD data already online, at the risk of losing valuable data from interesting collision events. Information from the surrounding tracking detectors is used to reconstruct tracks and extrapolate them to the PXD sensors, and only PXD hits inside Regions of Interest (ROI) around the extrapolated positions are stored. One of the systems for online data reduction is DATCON, which uses the Hough transformation (HT) and is presented in this work. It operates on Field Programmable Gate Arrays (FPGA), which are freely programmable pieces of hardware that can execute several different tasks in parallel. The DATCON algorithms are developed in the C++ programming language; in this work, the algorithms previously developed for the FPGA are implemented in the Belle II software framework in order to evaluate them. After careful optimisation, the likelihood of an ROI containing a PXD hit originating from an interesting collision event, called the ROI efficiency, is estimated to be about 90 %, while achieving a data reduction by a factor of 5.27 ± 2.76 for an occupancy of 1 % on the first PXD layer.
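The ROI-based selection described above can be illustrated with a minimal sketch: hits are kept only if they fall inside a rectangular region around an extrapolated track position. All coordinates, ROI sizes, and function names here are invented for demonstration and do not reflect the actual Belle II data formats or implementation.

```python
def in_roi(hit, roi):
    """Check whether a pixel hit (u, v) lies inside a rectangular ROI.

    roi = (u_min, v_min, u_max, v_max) in pixel coordinates around the
    extrapolated track position; names and layout are illustrative only.
    """
    u, v = hit
    u_min, v_min, u_max, v_max = roi
    return u_min <= u <= u_max and v_min <= v <= v_max

def reduce_pxd_data(hits, rois):
    """Keep only hits that fall inside at least one ROI."""
    return [h for h in hits if any(in_roi(h, r) for r in rois)]

# Toy numbers: one ROI around an extrapolated track; most hits are background.
rois = [(100, 200, 120, 240)]
hits = [(110, 220), (5, 7), (300, 90), (118, 239)]
kept = reduce_pxd_data(hits, rois)
print(len(kept))  # 2 of the 4 hits survive the reduction
```

The data reduction factor in this picture is simply the ratio of all hits to the hits retained inside ROIs.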
While the target data reduction of a factor of 10 is not achieved, several possible future improvements are identified. Based on the HT, a new track finding algorithm, called SVDHoughTracking, is also developed and presented in this thesis. After performing the HT, track candidates are evaluated to distinguish true from false tracks. By employing additional algorithms to discard random combinations of hits that would otherwise be identified as tracks, the newly developed algorithm for finding tracks in the SVD provides a higher finding efficiency than the currently used SVD standalone algorithm VXDTF2. At the same time, the new algorithm finds fewer duplicate, wrong, or background tracks. The same benefits are also seen when using SVDHoughTracking as a replacement for the VXDTF2 in the full tracking chain and for finding ROIs on the PXD. Overall, 0.5 % more tracks are reconstructed while the fake rate is reduced by 0.4 % at the same time.
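The Hough-transformation principle underlying both DATCON and SVDHoughTracking can be sketched in a few lines: each hit votes for all line hypotheses compatible with it, and collinear hits pile up in one accumulator cell. This is a generic straight-line Hough transform with invented hit positions and binning, not the actual detector parametrisation or FPGA implementation.

```python
import math

def hough_accumulate(hits, n_theta=64, n_r=64, r_max=2.0):
    """Fill a (theta, r) Hough accumulator with straight-line hypotheses.

    Each hit (x, y) votes for every line r = x*cos(theta) + y*sin(theta)
    that could pass through it; hits on a common line produce a peak.
    Binning and r_max are illustrative choices, not detector values.
    """
    acc = [[0] * n_r for _ in range(n_theta)]
    for x, y in hits:
        for it in range(n_theta):
            theta = math.pi * it / n_theta
            r = x * math.cos(theta) + y * math.sin(theta)
            ir = int((r + r_max) / (2 * r_max) * n_r)
            if 0 <= ir < n_r:
                acc[it][ir] += 1
    return acc

def find_peak(acc):
    """Return (theta_bin, r_bin, votes) of the highest accumulator cell."""
    best = (0, 0, -1)
    for it, row in enumerate(acc):
        for ir, votes in enumerate(row):
            if votes > best[2]:
                best = (it, ir, votes)
    return best

# Toy example: four hits on the line y = x (theta = 3*pi/4, r = 0),
# plus one random background hit.
hits = [(0.1, 0.1), (0.3, 0.3), (0.5, 0.5), (0.7, 0.7), (0.9, 0.2)]
acc = hough_accumulate(hits)
it, ir, votes = find_peak(acc)
print(votes)  # → 4: the four collinear hits vote into the same cell
```

A peak above some vote threshold yields a track candidate; the additional quality criteria mentioned above would then discard accidental peaks from random hit combinations.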
Note: Presented on 27 June 2022
Note: PhD