The DARWIN experiment (DARk matter WImp search with liquid xenoN) will be a multi-ton liquid xenon time projection chamber (TPC) used for the direct detection of dark matter. With an active 40-ton liquid xenon target, it will be capable of reaching exposures above 200 ton-years, far exceeding the exposure of its predecessor, XENONnT. This large exposure, coupled with a low-background environment, will allow DARWIN to probe new parameter space for the spin-independent WIMP-nucleon cross section, as well as several other science channels.
The dual-phase TPC is a cylindrical volume containing a layer of gaseous xenon above the liquid xenon target. The top and bottom faces of the cylinder are covered with photosensors that record events in the detector. When energy is deposited in the liquid xenon, both scintillation light and ionization electrons are produced. The light is promptly detected by the photosensors and recorded as the “S1” signal. The electrons are drifted through the TPC by an electric field until they reach the liquid-gas interface, where a stronger electric field extracts and accelerates them through the gaseous xenon, producing secondary scintillation light that is recorded as the “S2” signal. The ratio between the S1 and S2 signals is used to distinguish electron-recoil from nuclear-recoil interactions.
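As a rough illustration of how the S2/S1 ratio separates the two event classes, the sketch below classifies toy events with a simple cut on log10(S2/S1). The threshold value and the pulse areas are hypothetical placeholders, not DARWIN calibration parameters; in a real analysis the cut is energy dependent and tuned on calibration data.

```python
import numpy as np

# Toy events: (S1, S2) pulse areas in photoelectrons (PE).
# Values are illustrative only, not measured DARWIN data.
events = np.array([
    (50.0, 4000.0),   # electron recoil: relatively large S2 for its S1
    (50.0,  900.0),   # nuclear recoil: suppressed S2/S1 ratio
    (120.0, 9500.0),
    (120.0, 2000.0),
])

# Hypothetical discrimination threshold on log10(S2/S1).
LOG_S2_S1_CUT = 1.5

def classify(s1: float, s2: float) -> str:
    """Label an event as electron recoil (ER) or nuclear recoil (NR)
    using a simple cut on the logarithm of the S2/S1 ratio."""
    return "ER" if np.log10(s2 / s1) > LOG_S2_S1_CUT else "NR"

for s1, s2 in events:
    print(f"S1={s1:6.1f} PE  S2={s2:7.1f} PE  ->  {classify(s1, s2)}")
```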
As the liquid xenon target scales up, so too must the number of photosensors. DARWIN is expected to have over 1000 photosensors (i.e. readout channels), roughly double the ~500 used in XENONnT. However, merely scaling XENONnT’s data acquisition and readout system up to DARWIN’s channel count presents numerous challenges due to the volume of data that must be read out, processed, and stored. Our group is pursuing data acquisition and readout R&D to investigate methods of reducing or mitigating this data rate, for example parallelizing parts of the data acquisition pipeline, developing intelligent trigger algorithms, and using dynamic sampling rates to lower the amount of data stored, as sketched below.
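To make the scaling challenge concrete, the sketch below estimates the raw digitizer output of an untriggered many-channel readout and the reduction from simple threshold-based zero suppression (one form of self-triggering). Apart from the ~1000-channel count quoted above, every parameter here (sampling rate, ADC depth, occupancy) is an assumed round number for illustration, not a DARWIN design figure.

```python
# Back-of-envelope data-rate estimate for a many-channel readout.
# All numbers below except the channel count are illustrative
# assumptions, not DARWIN design parameters.

N_CHANNELS = 1000          # readout channels (photosensors)
SAMPLING_RATE_HZ = 100e6   # assumed 100 MS/s digitizers
BITS_PER_SAMPLE = 14       # assumed ADC resolution
OCCUPANCY = 0.01           # assumed fraction of samples above threshold

def raw_rate_bytes_per_s() -> float:
    """Continuous, untriggered readout: every sample from every channel."""
    return N_CHANNELS * SAMPLING_RATE_HZ * BITS_PER_SAMPLE / 8

def zero_suppressed_rate_bytes_per_s() -> float:
    """Keep only samples above a per-channel threshold, so the stored
    volume scales with occupancy rather than with live time."""
    return raw_rate_bytes_per_s() * OCCUPANCY

if __name__ == "__main__":
    print(f"untriggered : {raw_rate_bytes_per_s() / 1e9:8.1f} GB/s")
    print(f"zero-suppr. : {zero_suppressed_rate_bytes_per_s() / 1e9:8.1f} GB/s")
```

Under these assumed numbers the untriggered stream is about 175 GB/s, dropping to roughly 1.75 GB/s after zero suppression, which illustrates why online reduction strategies like those listed above are central to the R&D effort.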