Comment on bro pls
Waraugh@lemmy.dbzer0.com 1 year ago
So why don’t they just use post-processing to remove all the known particles, start looking at the particles that remain, discover a new one, remove it, and continue until there are none left?
Sodis@feddit.de 1 year ago
There are multiple reasons for that. We don’t know the decay channels of already-discovered particles precisely, so there might be very rare processes that contribute to already-known particles. It is all a statistical process: you can make statements about a large number of events, but it is nearly impossible to do so for a single event. Most of the particles are very short-lived and won’t be visible themselves in a detector (especially neutral particles), and some will not interact with anything at all (neutrinos).

Then your detectors are not 100% efficient, so you can’t detect all the energy that was released in the interaction or the decay of a particle. The calorimeters designed to completely stop hadrons (particles consisting of quarks) have layers of a very dense material, to force interactions, followed by a detector material. All the energy lost in the dense material is lost for the analysis. In the end you still know how much energy went undetected, because you know the initial energy, but everything else gets calculated from models based on known physics. A neutral, weakly interacting particle would just be attributed to a neutrino.
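To make that missing-energy bookkeeping concrete, here’s a minimal toy sketch in Python (the energies, efficiency value, and function name are all made up for illustration, not how a real analysis works): energy conservation tells you how much was released, the detector only sees part of it, and whatever is unaccounted for gets lumped into the neutrino-like bucket.

```python
import random

INITIAL_ENERGY = 100.0  # GeV, known from the beam settings (illustrative value)
EFFICIENCY = 0.9        # chance a given particle is actually measured (illustrative)

def simulate_event(particle_energies):
    """Return (detected, missing) energy for one toy event.

    particle_energies: energies of the outgoing particles; by energy
    conservation they sum to INITIAL_ENERGY.
    """
    detected = 0.0
    for e in particle_energies:
        if random.random() < EFFICIENCY:  # particle seen by the detector
            detected += e
        # else: energy lost in the dense calorimeter layers, or a
        # neutrino-like particle that never interacted at all
    missing = INITIAL_ENERGY - detected
    return detected, missing

# One toy event: four particles sharing the collision energy.
detected, missing = simulate_event([40.0, 30.0, 20.0, 10.0])
print(f"detected {detected:.1f} GeV, missing {missing:.1f} GeV")
# The missing energy is conventionally attributed to neutrinos --
# a new neutral, weakly interacting particle would look exactly the same,
# which is why you can't just "subtract the knowns" event by event.
```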