On March 13, 2021, in the middle of the installation campaign (from February 17 to April 4), the Minister of Science and Higher Education of the Russian Federation visited the site of the neutrino telescope to officially launch the detector and to sign the Memorandum of Understanding between the Ministry of Science and Higher Education of the Russian Federation and the Joint Institute for Nuclear Research (JINR) on the development of the BAIKAL-GVD Neutrino Telescope. In 2021, BAIKAL-GVD increased its effective volume for showering neutrino interactions to 0.4 km³.
Winter expedition team with distinguished guests. (Credits: Bair Shaybonov, DLNP)
Nowadays, neutrino telescopes are important instruments of multi-messenger astronomy, providing a new and powerful way of exploring the Universe. BAIKAL-GVD is one of four such installations in the world; the other three are IceCube at the South Pole and KM3NeT and ANTARES in the Mediterranean Sea. Together they form the Global Neutrino Network, which aims at exchanging expertise, processing data cooperatively and achieving better overall sensitivity thanks to the different locations of the detectors across the globe.
Neutrino telescopes are intended to investigate the most powerful natural accelerators emitting ultra-high-energy neutrinos, such as Active Galactic Nuclei, which are promising but not the only candidates. This research should help us understand the evolution of galaxies, the formation of supermassive black holes and the mechanisms of particle acceleration.
Only detectors of roughly cubic-kilometre scale are sensitive to the tiny neutrino fluxes from such distant objects. In 2021, BAIKAL-GVD, the largest neutrino telescope in the Northern Hemisphere, is successfully taking data with an effective volume of 0.4 km³. By 2027, BAIKAL-GVD is expected to observe showering neutrino interactions with an effective volume of one cubic kilometre.
Minister Valery Falkov and JINR director Grigory Trubnikov signing the Memorandum of Understanding between the Ministry and JINR. Leftmost: INR director Maxim Libanov. (Credits: Bair Shaybonov, DLNP)
On March 13, 2021, BAIKAL-GVD was officially inaugurated. Valery Falkov, the Minister of Science and Higher Education of the Russian Federation, and Grigory Trubnikov, the Director of JINR, signed the Memorandum of Understanding between the Ministry of Science and Higher Education of the Russian Federation and JINR.
The ceremony took place on the ice just above the neutrino telescope at the special ice table created by a local Siberian artist. Maxim Libanov, the Director of the Institute for Nuclear Research of the Russian Academy of Sciences (INR, RAS), welcomed distinguished guests and talked about the history, current status and development plans of the Baikal Neutrino Telescope.
The Minister officially launched the detector by pressing the start-data-taking button. This event was widely covered by the mass media. In Russia, the year 2021 was declared the Year of Science and Technology, and the inauguration of BAIKAL-GVD is already regarded as one of its top events.
This year, the International Scientific BAIKAL-GVD Collaboration comprises the Institute for Nuclear Research of the Russian Academy of Sciences (Moscow), the Joint Institute for Nuclear Research (Dubna), Irkutsk State University, Skobeltsyn Institute for Nuclear Physics MSU (Moscow), Nizhny Novgorod State Technical University, St. Petersburg State Marine Technical University, the Institute of Experimental and Applied Physics of Czech Technical University in Prague, the Faculty of Mathematics, Physics and Informatics of the Comenius University in Bratislava (Slovakia), the Institute of Nuclear Physics of the Polish Academy of Sciences (Krakow, Poland) and EvoLogics GmbH (Berlin, Germany).
The 2021 expedition was organized by the Institute for Nuclear Research of the Russian Academy of Sciences (Moscow) and the Joint Institute for Nuclear Research (Dubna).
G.V. Domogatsky (INR, RAS), spokesperson of the BAIKAL-GVD Collaboration
On 1 April the APPEC General Assembly came together online for its first meeting of the year. The new Chair, Andreas Haungs, announced some personnel changes at the beginning of the session:
A. Kouchner, new Co-Chair of the APPEC General Assembly.
Antoine Kouchner, director of the APC laboratory, was endorsed as Co-Chair of the General Assembly. He has been closely associated with APPEC for many years and leads the French functional office of the APPEC joint secretariat. Congratulations to Antoine, and we look forward to working with him in his new role!
Additionally we welcome our new representatives Nicu Marginean (IFIN) and Alexandra Saftoiu (IFIN) for Romania, Christos Markou (NCSR Demokritos) for Greece and Matthias Marklund (VR) for Sweden.
Last year the Neutrinoless Double Beta Decay APPEC Sub-Committee gave advice on the European (and global) program, and as a follow-up action a 0νββ European–North American Summit will be organised from 29 September to 1 October 2021.
Then Katharina Henjes-Kunst gave an overview of current activities of the General Secretariat, including the status of the Town Meeting, which will be held on 9-10 June 2022 in Berlin, and the next APPEC TechForum on Robotics in Harsh Environments, which is to take place in Prague.
The last topic on the agenda was the report from the Direct Detection of Dark Matter Sub-Committee, led by Leszek Roszkowski. He gave a summary of the report, after which the chair of the Scientific Advisory Committee, Sijbrand de Jong, congratulated the members of the DDMD sub-committee on their excellent report and thanked them for the hard and thorough work they had put in.
After the report was discussed, the GA endorsed it; it is now published on our website and on arXiv. The follow-up actions will be a topic at the next General Assembly meeting in June.
During a week-long sea campaign, 8-14 April 2021, the seafloor infrastructure offshore Sicily was successfully upgraded. In addition, five new detection units of the cubic-kilometre neutrino telescope KM3NeT/ARCA were connected and are now operational.
Located in the Mediterranean Sea at a depth of 3500 m, about 80 km offshore Capo Passero, Sicily, the ARCA telescope, together with its sister detector ORCA, located offshore Toulon, France, will allow scientists to identify the astrophysical sources of high-energy cosmic neutrinos and to study the fundamental properties of neutrinos, the most elusive and pervasive of the known elementary particles. The two detectors will also provide unprecedented opportunities for Earth and sea science studies.
Once complete, the KM3NeT/ARCA detector will form an array of more than two hundred detection units. Each of these 700 m tall structures comprises 18 modules equipped with ultra-sensitive light sensors that register the faint flashes of light generated by neutrino interactions in the pitch-black abyss of the Mediterranean Sea.
During the first part of the sea operation, a new junction box, a hub for the power distribution and data transmission of the detection units, was added to the sea floor infrastructure. The junction box is connected via an electro-optical cable to the recently renovated onshore INFN laboratory located in Portopalo di Capo Passero.
In the second part of the operation, five new KM3NeT detection units were deployed, individually connected by a remotely operated submersible to the junction box and unfurled to their final vertical configuration. As a final step, the first detection unit of the apparatus, which had been deployed as early as 2015, was connected to the new junction box.
In total, six detection units are now in operation, representing the initial core of the KM3NeT/ARCA neutrino telescope. With the six ORCA detection units already taking data, the KM3NeT neutrino observatory now has a sensitivity comparable to that of its predecessor, the ANTARES neutrino telescope.
KM3NeT is an international collaboration of over 250 scientists from more than fifty scientific institutes around the world. KM3NeT has been included in the list of high-priority projects selected by the European Strategy Forum on Research Infrastructures (ESFRI). Paschal Coyle, Spokesperson of the Collaboration, emphasises: “The successful deployment and operation of multiple ARCA detection units is another major step forward for the KM3NeT project. Now it’s full steam ahead with the construction of the hundreds of detection units to be deployed at the French and Italian sites.”
The five detection units of KM3NeT onboard the deployment ship. (Credits: KM3NeT)
Deployment of a detection unit of KM3NeT. (Credits: KM3NeT)
ECFA is currently organising the development of a Detector R&D Roadmap, structured in nine Task Forces, one of them being on Training. The ECFA early career researcher panel is currently collecting input from early career researchers on opportunities for training on instrumentation-related tasks. Early career researchers from the astroparticle physics community are also invited to give feedback:
“The ECFA early career researcher panel has been asked to collect input from early career researchers of all backgrounds, whether you have worked in instrumentation or not, on the opportunities for training that are available for instrumentation-related tasks. In this spirit, we request that all junior researchers (students, post-docs, non-tenured/non-permanent researchers and engineers) take a few minutes of your time to fill out the following survey. Once again, please fill it out even if you have not worked on instrumentation; you will not be asked for as much information in this case, but we still want to hear from you!
Please make sure to fill out the below-linked survey by the deadline of Thursday, April 22! This is important as we will then review your feedback and prepare to present it on behalf of the early career researcher community at the relevant ECFA working group meeting.
In this survey, you will be asked a series of questions about yourself in order that we can get a high-level overview of who we are representing, followed by a series of questions about the training experiences you may (or may not) have encountered during different stages of your career. In case specific questions are not applicable to your situation, you can choose to not reply to them, as they are not mandatory.
All of the responses are fully anonymous, and results will only ever be shared in aggregate form, thus ensuring your privacy. There are some text boxes, but in such cases we would encourage you to provide answers while keeping the replies anonymous (not mentioning names/etc). If you wish to discuss with us in a non-anonymous way, you are welcome to contact us using the email listed below.
The results of this survey will be shared at the ECFA Detector R&D Roadmap Symposium on April 30th, 2021. You are welcome to participate in that session as well, which can be found at the following link: https://indico.cern.ch/event/1001747/
If you have any questions, or otherwise want to provide non-anonymous feedback, please contact the survey organizers at ecfa-ecr-detector@cern.ch.
Best regards,
The ECFA Early Career Researcher Detector R&D working group”
Interview with Tommaso Dorigo about the MODE collaboration
In reply to the JENAA call for Expressions of Interest, Tommaso Dorigo and his colleagues proposed a program for Machine-learning Optimized Design of Experiments – MODE. Their main goal is the use of differentiable programming in the design optimization of detectors for our research field. The first kick-off meeting took place last September, and they have just published on INSPIRE a preprint of a short article on their research plan, which will be published by Nuclear Physics News International. In this interview, Tommaso Dorigo tells us more about MODE and the next steps.
Can you explain more about the program and aim of MODE?
For over a century now, physicists have designed instruments to detect elementary particles, and radiation in general, exploiting cutting-edge technologies and in some cases developing entirely new ones. As the complexity of the apparatuses and of the required tasks grew, so did our inventiveness. This has brought a stream of new developments, which culminated in the past two decades with the construction and operation of giant detectors like ATLAS and CMS, which are mind-boggling instruments.
Precisely because of their complexity, the design of such apparatuses has followed well-defined paradigms, which have served us well until now and guided us toward robust design choices and well-established techniques. However, those choices are not – and cannot be – perfectly aligned with our true experimental goals. The reason is that the task of optimizing the design of these apparatuses is absolutely super-human, as it requires the study of configuration spaces of hundreds, if not thousands, of dimensions. In fact, a global optimization is usually not even attempted: we use as success metrics simplified surrogates of our real goals, and this potentially results in huge losses in performance.
A proposed pipeline for the optimization of a muon tomography apparatus (Figure taken from the article on the MODE collaboration published in Nuclear Physics News International, March 2021).
Yet today we can, in principle, rely on artificial intelligence for the exploration of those hugely complex parameter spaces. Differentiable programming techniques allow us to navigate through them, provided we build the right interfaces and construct models of the whole experiment: the simulation of the events of interest, particle interactions with matter, detector response, reconstruction, and inference extraction. This is very hard, I am not hiding that. But we need to start doing it. MODE has the goal of proving how such a path can be undertaken, to realign our experimental choices with our true goals and to vastly improve the effectiveness of future detectors.
But MODE is not specifically targeting those giant multipurpose detectors for fundamental physics – quite the contrary, in fact. MODE researchers are starting this ambitious program by working towards smaller-scale practical applications of particle detectors, such as proton therapy or imaging with cosmic muons. In these areas the detectors are relatively small and their geometry is far less complex than that of collider detectors. Nevertheless, design optimization is far from trivial in these applications too. In fact, the first practical implementation of the MODE program may well be in one of these areas, where the typical timescales from design to operation are relatively short.
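To make the idea of gradient-based design optimization more concrete, here is a minimal, purely illustrative Python sketch (not MODE code): the toy detector, the surrogate objective and all parameter values are invented for this example. A single design parameter, the position of a detection plane, is tuned by gradient descent on a differentiable surrogate of the resolution, with the gradient obtained by automatic differentiation (here via the jax library).

```python
# Illustrative sketch only (not MODE code): gradient-based optimization of a
# single, invented detector design parameter via differentiable programming.
# The "detector" is one tracking plane at position z; the surrogate objective
# is a made-up, differentiable proxy for resolution that worsens if the plane
# is too close to the target (short lever arm) or too far away (penalty term).

import jax

def surrogate_resolution(z_plane):
    """Invented differentiable proxy for the resolution: smaller is better."""
    lever_arm_term = 1.0 / (z_plane + 0.1)   # improves with a longer lever arm
    scattering_term = 0.02 * z_plane ** 2    # penalizes placing the plane too far out
    return lever_arm_term + scattering_term

grad_fn = jax.grad(surrogate_resolution)     # automatic differentiation of the objective

z = 0.5                                      # initial design choice (arbitrary units)
learning_rate = 0.1
for step in range(200):                      # plain gradient descent over the design parameter
    z = z - learning_rate * grad_fn(z)

print(f"optimized plane position: {float(z):.3f}, "
      f"surrogate resolution: {float(surrogate_resolution(z)):.3f}")
```

In a realistic pipeline of the kind described above, the hand-written surrogate would be replaced by differentiable models of event generation, particle transport, detector response, reconstruction and inference, and the single parameter by hundreds or thousands of design variables.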
At JENAS 2019 the call for EoIs was issued and you came up with the MODE program. Did you and your colleagues already work on this topic before, or did you only start after this event?
I have worked on machine-learning-driven optimization of physics measurements in the past, but my idea of applying the techniques developed in that context to the design of instruments was born while sitting in board meetings of accelerator physics coordination. There, I observed that the design of new detectors for future colliders was being proposed and starting, by the hands of colleagues with decades of experience in instrumentation, without any consideration for the elephant in the room, AI. In 20 years, the extraction of information from detector signals will be entirely automated and in the hands of much more complex and performant algorithms than those in use today. This means that constructing devices with the same paradigms as before is doomed to be enormously suboptimal.
Of course, and fortunately, I am not the only one who realizes this; indeed, efforts to apply advanced computer science techniques to the optimization of detectors and instruments have started to appear in the past few years. Some of the MODE members are in fact leaders in this area of research, with some important publications already produced. With the help of these colleagues, we formed the MODE collaboration to provide the ground on which to build the required interfaces for a more systematic approach to detector design.
To what extent does your collaboration represent the three communities Particle, Astroparticle and Nuclear Physics?
A view of the CMS experiment at CERN. The complexity of modern particle physics experiments is too high to allow for human-driven optimization. Or, better put, the space of design choices is so vast that the potential for improvement in relevant metrics (discovery potential, data quality) is huge. (Credits: CERN)
Our group is small for now, but highly motivated. I cannot cite everybody here, but MODE includes physicists who are experts in machine learning and already work on calorimetry optimization (Jan Kieseler, at CERN, Fedor Ratnikov, at HSE University and the Yandex Data School, and colleagues at National Research University Moscow), track reconstruction (Mia Tosi, at the University of Padova), inference extraction (Pietro Vischia, at UCLouvain, and Giles Strong, at INFN-Padova), and muon tomography (Andrea Giammanco, at UCLouvain) – all those tasks are important use cases for MODE and are not specific to HEP. And we have computer scientists with experience of collaborating with physicists (Atilim Gunes Baydin, at Oxford University, and Gilles Louppe, at Université de Liège), plus Ph.D. students in HEP (Hevjin Yarar and Lukas Layer, at INFN-Padova). But MODE tries to be as inclusive as possible, because of the extremely challenging nature of its research program. We need the interest of everybody who wants to extract information from devices that work by detecting radiation in any form, and therefore it is only natural to look beyond the playground of some of us, which is HEP. Hence we have started to involve colleagues from the astroparticle and nuclear physics communities, as well as neutrino physics, by inviting them to take part in the advisory committee of a workshop we are organizing, which we hope will be the first of a series, and by asking them to chair sessions there and take part. In parallel, we are advertising our research plan within those communities, as we believe that our studies will benefit them just as much as HEP.
It is important to realize that particle detectors can be improved quite significantly in their performance by studying even very simple choices, such as moving detection elements around. Last year I did an exercise with a simply designed detector, MUonE, which will be built to reduce a theoretical uncertainty on the muon g-2 anomaly. The experiment aims to measure differential muon-electron elastic scattering with layers of silicon exposed to a muon beam at CERN, and it is very simple – so simple that I could study it with a fast simulation and demonstrate that with some optimization a factor-of-two gain in the relevant metric could be achieved without any increase in cost or complexity. A publication ensued, and the collaboration is now using my results for an improved design. But this is just one example.
How can ECFA, NuPECC and in particular APPEC support your activities?
Help in making the MODE research program more visible and better known within the communities is certainly important – we have indeed already benefited from the offer to publish a short manifesto in the Nuclear Physics News International journal. Also, we presently have no explicit funding for MODE, so support for the organization of a yearly workshop would be very welcome.
You plan a MODE Workshop on Differentiable Programming this autumn. What are the aims of the workshop and who should participate?
The workshop aims at making these techniques more widely known, as well as at creating a stable bridge and a common ground for communication with the computer science community. Anybody who realizes that these tools, which today power artificially intelligent devices all around us, are needed for fundamental physics research in the future should consider coming, listening, or giving a contribution. I mention artificial intelligence in everyday-life objects (cellphones, self-driving vehicles, targeted ads, spam filters, etcetera) because these things have changed the paradigms of our society, but this was only possible because it was economically favourable to invest in creating the right interfaces for the problems to be solved. In basic research, we have to create those interfaces ourselves, or we will be stuck in the ice age before we know it.
Are there other events planned or what are the next steps?
Besides the workshop, we are starting to hire: there is a Ph.D. position for a joint doctorate at the University of Padova and the Université Clermont Auvergne, with the call open until May 12 at the University of Padova; the student will work on MODE research. We are also starting our activities in two important use cases, the optimization of muon tomography detectors and the study of hybrid calorimeters. We are writing a white paper on the use of differentiable programming for detector design. And we are participating in a proposal to join the ELLIS society within a larger community of HEP and astro-HEP scientists. Finally, we are applying for competitive funding, to provide ourselves with the fuel needed for a long journey.
How can interested scientists join and benefit from MODE?
To join MODE you only need to declare your genuine interest in our research plan and to devote a fraction of your research time to some of our activities, or to propose others within our interests. We hold online meetings every month or so, and everybody is welcome to attend.
Tommaso Dorigo (Ph.D. 1999) is a particle physicist and machine learning expert who works as a First Researcher for the INFN and teaches Particle Physics and Data Analysis courses at the University of Padova, Italy. He participates in the CMS experiment at the CERN LHC collider, where he is a member of the Statistics Committee, which he chaired in the years of the Higgs boson discovery. In 2020 Dorigo founded, and has since coordinated, the MODE collaboration. He is an author of over 1600 peer-reviewed scientific publications and an editor of the Elsevier journals “Reviews in Physics” and “Physics Open”; since 2006 he has also run a popular blog, visited over 14 million times (http://www.science20.com/quantum_diaries_survivor).
On December 8, 2016, a high-energy particle called an electron antineutrino hurtled to Earth from outer space at close to the speed of light carrying 6.3 petaelectronvolts (PeV) of energy. Deep inside the ice sheet at the South Pole, it smashed into an electron and produced a particle that quickly decayed into a shower of secondary particles. The interaction was captured by a massive telescope buried in the Antarctic glacier, the IceCube Neutrino Observatory.
The electron antineutrino that created the Glashow resonance event traveled quite a distance before reaching IceCube. This graphic shows its journey; the blue dotted line is the antineutrino’s path. (Not to scale.) (Credits: IceCube Collaboration)
IceCube had seen a Glashow resonance event, a phenomenon predicted by Nobel laureate physicist Sheldon Glashow in 1960. With this detection, scientists provided another confirmation of the Standard Model of particle physics. It also further demonstrated the ability of IceCube, which detects nearly massless particles called neutrinos using thousands of sensors embedded in the Antarctic ice, to do fundamental physics. The result was published on March 10 in Nature.
“This result proves the feasibility of neutrino astronomy—and IceCube’s ability to do it—which will play an important role in future multimessenger astroparticle physics,” says Christian Haack, who was a graduate student at RWTH Aachen while working on this analysis. “We now can detect individual neutrino events that are unmistakably of extraterrestrial origin.”
Since IceCube started full operation in May 2011, the observatory has detected hundreds of high-energy astrophysical neutrinos and has produced a number of significant results in particle astrophysics, including the discovery of an astrophysical neutrino flux in 2013 and the first identification of a source of astrophysical neutrinos in 2018. But the Glashow resonance event is especially noteworthy because of its remarkably high energy; it is only the third event detected by IceCube with an energy greater than 5 PeV.
To confirm the detection and make a decisive measurement of the neutrino-to-antineutrino ratio, the IceCube Collaboration wants to see more Glashow resonances. A proposed expansion of the IceCube detector, IceCube-Gen2, would enable the scientists to make such measurements in a statistically significant way. The collaboration recently announced an upgrade of the detector that will be implemented over the next few years, the first step toward IceCube-Gen2.
The IceCube Laboratory at the South Pole. This building holds the computer servers that collect data from IceCube’s sensors under the ice. (Credits: Martin Wolf, IceCube/NSF)
Glashow, now an emeritus professor of physics at Boston University, echoes the need for more detections of Glashow resonance events. “To be absolutely sure, we should see another such event at the very same energy as the one that was seen,” he says. “So far there’s one, and someday there will be more.”
“The detection of this event is another ‘first,’ demonstrating yet again IceCube’s capacity to deliver unique and outstanding results,” says Olga Botner, professor of physics at Uppsala University in Sweden and former spokesperson for the IceCube Collaboration.
Last but not least, the result demonstrates the value of international collaboration. IceCube is operated by over 400 scientists, engineers, and staff from 53 institutions in 12 countries, together known as the IceCube Collaboration. The main analyzers on this paper worked together across Asia, North America, and Europe.
The IceCube Neutrino Observatory is funded primarily by the US National Science Foundation but also with significant European contributions. Research at IceCube, including major contributions to the construction and operation of the detector, is supported in Europe by funding agencies from Belgium, Denmark, Germany, Sweden, Switzerland, and the United Kingdom.
Pierre Auger Observatory (Credits: Pierre Auger Observatory)
The Pierre Auger Collaboration is releasing 10% of the data recorded using the world’s largest cosmic ray detector. These data are being made available publicly with the expectation that they will be used by a wide and diverse community including professional and citizen-scientists and for educational and outreach initiatives. While the Auger Collaboration has released data in a similar manner for over a decade, the present release is much wider with regard to both the quantity and type of data, making them suitable both for educational purposes and for scientific research. The data can be accessed at www.auger.org/opendata.
Operation of the Pierre Auger Observatory, by a Collaboration of about 400 scientists from over 90 institutions in 18 countries across the world, has enabled the properties of the highest-energy cosmic rays to be determined with unprecedented precision. These cosmic rays are predominantly the nuclei of the common elements and reach the Earth from astrophysical sources. The data from the Observatory have been used to show that the highest-energy particles have an extra-galactic origin. The energy spectrum of cosmic rays has been measured beyond 10²⁰ eV, corresponding to a macroscopic value of about 16 joules in a single particle. It has been demonstrated that there is a sharp fall of the flux at high energy, and emerging evidence of emission from particular nearby sources has been uncovered. Analyses of the data have allowed characterisation of the type of particles that carry these remarkable energies, which include elements ranging from hydrogen to silicon. The data can also be used to test particle physics at energies beyond the reach of the LHC.
At the Pierre Auger Observatory, located in Argentina, cosmic rays are observed indirectly, through the extensive air showers of secondary particles produced by the interaction of the incoming cosmic ray with the atmosphere. The Surface Detector of the Observatory covers 3000 km² and comprises an array of particle detectors separated by 1500 m. The area is overlooked by a set of telescopes composing the Fluorescence Detector, which is sensitive to the auroral-like light emitted as the air shower develops, while the Surface Detector is sensitive to the muons, electrons and photons that reach the ground. The data from the Observatory range from raw data, obtained directly from these and other instruments, through reconstructed data sets generated by detailed analysis, up to the results presented in scientific publications. Some of the data are routinely shared with other observatories to allow analyses with full-sky coverage and to facilitate multi-messenger studies.
As pointed out by the spokesperson, Ralph Engel, “the data from the Pierre Auger Observatory, which was founded more than 20 years ago, are the result of a vast and long-term scientific, human, and financial investment by a large international collaboration. They are of outstanding value to the worldwide scientific community.” By releasing data and analysis programs to the public, the Auger Collaboration upholds the principle that open access to the data will, in the long term, allow the maximum realization of their scientific potential.
The Auger Collaboration has adopted a classification of four levels of complexity of its data, following that used in high-energy physics, and adapted it for its open-access policy:
One of the water-Cherenkov detectors (foreground) and a fluorescence-detector station (background). (Credit: Pierre Auger Observatory)
(Level 1) Open-access publication with additional numerical data provided to facilitate re-use;
(Level 2) Regular release of cosmic-ray data in a simplified format, for education and outreach. This began in 2007 when 1% of the data was released and increased to 10% in 2019;
(Level 3) Release of reconstructed cosmic-ray events, selected with the best available knowledge of the detector performance and conditions at the time of data-taking. Example codes derived from those used by the Collaboration for published analyses are also provided;
The last two levels of information are added in the present release, which includes data from the two major instruments of the Observatory, the 1500 m array of the Surface Detector and the Fluorescence Detector. The dataset consists of 10% of all the events recorded at the Observatory, subjected to the same selection and reconstruction procedures used by the Collaboration in recent publications. The periods of data recording are the same as used for the physics results presented at the International Cosmic Ray Conference held in 2019. The examples of analyses use updated versions of the Auger data sets, which differ slightly from those used for the publications because of subsequent improvements to the reconstruction and calibration. On the other hand, since the released fraction is currently 10% of the actual Auger data sample, the statistical significance of measured quantities is reduced with respect to what can be achieved with the full dataset, but the number of events is comparable to that used in some of the first scientific publications by the Auger Collaboration.
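As a purely hypothetical illustration of how a released event summary might be explored, the following Python sketch reads an assumed CSV file of reconstructed Surface Detector events and counts events per logarithmic energy bin; the file name and column names are placeholders, not the actual Auger open data format, which is documented, together with the official example codes, at www.auger.org/opendata.

```python
# Hypothetical sketch of a first look at a released event summary.
# The file name and column names below are assumptions for illustration;
# the real file formats and official example codes are available at
# https://www.auger.org/opendata.

import numpy as np
import pandas as pd

events = pd.read_csv("auger_sd1500_events.csv")   # assumed CSV of reconstructed SD events

# Assumed column: reconstructed energy in EeV (1 EeV = 10^18 eV).
high_energy = events[events["energy_EeV"] > 10.0]

# Crude counts per logarithmic energy bin (no exposure correction,
# so this is not a flux measurement, just a quick look at the sample).
log_e = np.log10(high_energy["energy_EeV"])
counts, edges = np.histogram(log_e, bins=np.arange(1.0, 2.1, 0.1))
for lo, n in zip(edges[:-1], counts):
    print(f"10^{lo + 18.0:.1f} eV  ->  {n} events")
```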
The Pierre Auger Collaboration is committed to its open data policy, in order to increase the diversity of people accessing scientific data and so the common scientific potential for the future.
Interview with Clarisse Aujoux, Kumiko Kotera and Odile Blanchard on the first carbon footprint study of an astroparticle physics experiment
Environmental sustainability is becoming an increasingly important topic, especially in science. The approach of determining the annual carbon footprint of a future astroparticle experiment and identifying possible savings potential is new and will certainly become an important aspect in the future. As pioneers, three scientists have published a study on the carbon footprint of the GRAND experiment, taking a close look at the main emission sources, i.e. travel, digital technologies and hardware equipment. In this interview, we talk to Clarisse Aujoux, Kumiko Kotera and Odile Blanchard about their study.
With your work, you are the first to conduct such a carbon footprint study for an astrophysics experiment. How did it come about?
The GRAND collaboration is concerned about its environmental impact. We had several discussions about this subject in collaboration meetings, and a “GRAND Carbon Committee” was set up. As our experiment is in its prototyping stage, it is a good time to make decisions according to environmental criteria. Still, as long as we don’t have any quantification of the emissions, we cannot make consistent decisions. Therefore, a first step towards taking such measures was to estimate the carbon footprint of our experiment, and assess the major sources of emissions.
Can you briefly explain what GRAND is?
A prototype antenna being tested at the deployment site of the 300-antenna pathfinder, GRANDProto300, in Qinghai Province, China. Credit: GRAND collaboration.
The workings of the most violent phenomena in the Universe (compact object mergers, blazar jets, pulsar winds…) remain mysterious. These objects could be probed by deciphering the ultra-high-energy astroparticle messengers that they send us. The detection of these particles is however very challenging and requires the deployment of large-scale experiments.
The GRAND (Giant Radio Array for Neutrino Detection) project aims primarily at detecting ultra-high-energy neutrinos, cosmic rays and gamma rays with a colossal array of 200,000 radio antennas covering 200,000 km², split into ~20 sub-arrays of ~10,000 km² deployed worldwide. The strategy of GRAND is to detect air showers above 10¹⁷ eV, induced by the interaction of high-energy particles in the atmosphere or in the Earth's crust, through the associated coherent radio emission in the 50-200 MHz range.
A staged construction plan ensures that key techniques are progressively validated, while important science goals in UHECR physics, radioastronomy and cosmology are achieved early during construction. The 300-antenna pathfinder array, GRANDProto300, is planned to be deployed in 2021. It aims at demonstrating autonomous radio detection of inclined air showers, and at measuring the composition and the muon content of cosmic rays around the ankle energy. The first 10,000-antenna sub-array (GRAND10k) is planned to be deployed in the mid 2020s and will have the sensitivity to detect the first ultra-high-energy neutrinos. In its final configuration (GRAND200k), in the 2030s, GRAND plans to increase the sensitivity to neutrino detection by two orders of magnitude compared to current experiments and to reach a sub-degree angular resolution, which should enable us to perform ultra-high-energy neutrino astronomy.
GRAND will also be the largest detector of UHE cosmic rays and gamma rays. It will improve UHECR statistics at the highest energies ten-fold within a few years, and either discover UHE gamma rays or improve their limits ten-fold. Further, it will be a valuable tool in radioastronomy and cosmology, allowing for the discovery and follow-up of large numbers of radio transients — fast radio bursts, giant radio pulses — and for precise studies of the epoch of reionization.
Which parts of the experiment cause the greatest greenhouse gas (GHG) emissions?
Projected distribution of greenhouse gas emissions for all sources for GRANDProto300, GRAND10k and the full GRAND array. The title indicates the total amount of emissions per year due to each source at each experimental stage. (source: Aujoux, Kotera & Blanchard, 2021 https://arxiv.org/pdf/2101.02049.pdf)
In our study, we have focussed on the GHG emissions related to three sources: travel, digital technologies and hardware equipment. Interestingly, we find that these emission sources have a different impact depending on the stages of the experiment. Digital technologies and travel prevail for the small-scale prototyping phase (GRANDProto300), whereas hardware equipment (material production and transportation) and data transfer/storage largely outweigh the other emission sources in the large-scale phase (GRAND200k). In the mid-scale phase (GRAND10k), the three sources contribute equally.
Did you expect these results or was one result particularly surprising?
We did not expect that the emissions related to digital technologies would have such a large impact. We believe that people are generally more aware of the emissions due to travel and hardware equipment production, but tend to forget that large amounts of data can actually lead to a huge carbon footprint.
How can these findings contribute to reducing GRAND’s carbon footprint?
The study has initiated numerous discussions within the collaboration. Various types of actions may be implemented to mitigate the carbon footprint of GRAND, at all stages of the project deployment.
Travel emissions may be reduced by encouraging local collaborators to perform the on-site missions, or by having international collaborators stay longer on the site of the experiment rather than making multiple trips of a few days each; they may also be reduced by optimizing collaboration meetings: optimizing the location of the meetings, limiting the number of attendees from the collaboration, opting for some virtual meetings, and combining virtual and physical meetings.
Options to reduce digital emissions include reducing the volume of data to be archived. The collaboration is already developing data reduction strategies to reduce the carbon footprint of data transfer and storage by 4 or 5 orders of magnitude. It was also found that regularly shipping the archival data by air mail would emit far less than transferring the data via the internet. As for the emissions from simulations and data analysis, the challenge is to reduce the millions of CPU hours projected to be spent yearly. Incentives to weigh the costs and benefits of the simulation runs may contribute to lowering the carbon footprint in the years to come.
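To illustrate the kind of back-of-the-envelope comparison behind such a conclusion, here is a small Python sketch contrasting network transfer with physically shipping storage media; the data volume and emission factors are rough, illustrative placeholders chosen by us, not the values used in the Aujoux, Kotera & Blanchard study.

```python
# Back-of-the-envelope comparison of two ways to move yearly archival data:
# network transfer vs. physically shipping storage drives. All numbers are
# illustrative placeholders, NOT the values used in the GRAND study.

DATA_VOLUME_TB = 2000.0              # assumed yearly archival volume in terabytes

# Assumed emission factors and logistics parameters (orders of magnitude only).
NETWORK_KGCO2E_PER_GB = 0.05         # kgCO2e per GB transferred over the internet
DISK_CAPACITY_TB = 18.0              # capacity of one shipped drive
DISK_EMBODIED_KGCO2E = 30.0          # embodied emissions per drive
AIRFREIGHT_KGCO2E_PER_KG_KM = 0.0006 # air-freight emissions per kg per km
DISK_MASS_KG = 0.7
SHIPPING_DISTANCE_KM = 8000.0        # e.g. array site to archive centre

# Emissions from transferring the full volume over the network.
transfer_kgco2e = DATA_VOLUME_TB * 1000.0 * NETWORK_KGCO2E_PER_GB

# Emissions from buying and air-shipping enough drives to hold the same volume.
n_disks = -(-DATA_VOLUME_TB // DISK_CAPACITY_TB)   # ceiling division
shipping_kgco2e = n_disks * (
    DISK_EMBODIED_KGCO2E
    + DISK_MASS_KG * SHIPPING_DISTANCE_KM * AIRFREIGHT_KGCO2E_PER_KG_KM
)

print(f"network transfer : {transfer_kgco2e / 1000.0:8.1f} tCO2e/yr")
print(f"shipping drives  : {shipping_kgco2e / 1000.0:8.1f} tCO2e/yr")
```

With these placeholder numbers the shipped drives come out roughly an order of magnitude or more below the network transfer, which is the qualitative point made in the interview; the real conclusion of course depends on the actual volumes and emission factors used in the study.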
Mitigating the emissions from manufacturing and hauling the hardware equipment will be a top priority for the design of the GRAND200k phase, as these emissions are projected to weigh most in the carbon footprint of this phase. It is about optimizing the environmental cost of the materials used for the antennas, the solar panels and the batteries, establishing a recycling plan, and monitoring the transportation from the production sites to the array-sites.
The GRAND collaboration will take several actions in response to this study. The various action plans proposed for each emission source will be documented in a GRAND Green Policy, which each collaboration member will be encouraged to follow, in order to reduce the collective carbon footprint.
To what extent does the location of the experiment, in this case China, have an impact on the results?
The GRAND experiment needs to be deployed in radio-quiet areas, which are by nature remote. The emissions related to on-site missions and to the transportation of the hardware equipment therefore have a large impact on the total carbon footprint in the small- and mid-scale phases.
As an international collaboration, GRAND has members from institutes located in several countries. The main countries presently involved are (in alphabetical order): Brazil, China, France, Germany, the Netherlands, and the United States. This geographical spread, not specific to GRAND but common to any international collaboration, raises obvious concerns about communication (e.g., physically gathering collaborators regularly), and hence about travel, but also about the digital infrastructure.
However, in the large-scale phase, travel and hardware transportation have less impact, as emissions due to digital technologies and hardware materials prevail. We caution, however, that the geographical locations of the various sub-arrays – to be scattered around the world at yet undecided locations – were not taken into account.
The location of the experiment also sets the electricity emission factor, which can vary by more than one order of magnitude from one country to another. The high electricity emission factor of China implies that all our GHG emissions related to local energy consumption are particularly enhanced.
Roadmap of the GRAND project. The different stages of the project are presented, with information on the envisioned set-up, the growth of the collaboration, and the major greenhouse gas emission sources with their contribution in tCO2e/yr and their corresponding percentage, as estimated in our work. (source: Aujoux, Kotera & Blanchard, 2021 https://arxiv.org/pdf/2101.02049.pdf)
Particularly through the COVID-19 pandemic, the topic of travel has been discussed a lot, especially in connection with online meetings. How has this pandemic influenced your findings?
While studying the travel habits of the GRAND collaboration members, we clearly saw a drop in their travel activity after March 2020. This obviously resulted in a cut in the GHG emissions due to travel. Our study indicates that travel constitutes one of the main emission sources of the small- and mid-scale stages of the project. Besides, it is our belief that mitigation measures should be taken on all possible fronts. The Covid-19 situation has demonstrated that cutting down on travel is definitely a way to reduce the carbon footprint of the collaboration.
However, we will have to work out hybrid solutions, as we need to maintain a certain level of in-person meetings. It will be about optimizing those meetings and trips. In any case, researchers need to travel to the experimental site in order to make measurements, check that the site is appropriate for the project, and deploy the array. Furthermore, in the process of building a collaboration, personal interactions and conversations at coffee breaks and shared lunches and dinners are viewed as crucial seeds for progress. For students and postdoctoral scholars, networking is often perceived as a sine qua non for a successful career, and this is more challenging to do online.
Do you think that such studies will be part of every experiment in the future?
Large-scale physics and astrophysics experiments gather a large fraction of the scientific staff and absorb a significant share of the science budget. As such, it seems essential to assess their environmental impact. Besides, we believe that these experiments could turn out to be interesting places for other laboratories to elaborate and test ideas, and to identify the best practices to be implemented in other contexts.
By the same token, it is likely that such studies will become part of every experiment in the future, primarily because a majority of scientists feel concerned about these questions.
What can other experiments learn from your study?
The specificity of the methodology presented in our paper is that it is fully transparent and uses open-source data. Hence, the method is replicable by any other scientific consortium. We have already received feedback and solicitations from colleagues who are planning to use our methodology to assess the carbon footprint of their experiments. We also propose several lines of action for the travel and digital emission sources that could be implemented in other experiments. We are looking forward to exchanging ideas, data and methods in order to improve the carbon footprint of the physics and astrophysics community.
Clarisse Aujoux is currently completing her Master’s degree at Ecole des Ponts et Chaussées Paris Tech, with a major in energy transition. Throughout her student years, she progressively developed a strong interest in the environmental impact of human activities and thus specialized in carbon footprinting and Life Cycle Assessment. Joining the GRAND project in 2020 for a six-month period, she provided a systemic approach to the environmental footprint of this collaboration, essential for the decision-making process.
Kumiko Kotera (Credit: Jean Mouette /IAP-CNRS-SU)
Kumiko Kotera is a researcher at the Institut d’Astrophysique de Paris of the French Centre National de la Recherche Scientifique (CNRS). She specializes in astroparticle physics and high-energy astrophysics. Today, she acts as co-spokesperson for the international GRAND project, to try to probe the most violent phenomena of the Universe, via the detection of their extremely energetic messengers (cosmic rays, gamma rays and neutrinos).
Odile Blanchard
Odile Blanchard is an associate professor of economics at Université Grenoble Alpes, France, and specializes in energy and climate economics. She currently facilitates the work of the “Carbon footprint” team of Labos 1point5 and contributes to the development of GES1point5, the carbon footprint calculator for French research laboratories (https://labos1point5.org/ges-1point5).
The Virgo interferometer is officially an IEEE Milestone, along with the two LIGO detectors. On 3 February 2021, the ceremony dedicating an IEEE Milestone to the three gravitational-wave antennas ‘for the first gravitational waves detection and the launching of the era of Multi Messenger Astronomy with the coordinated detection of gravitational waves from a binary neutron star merger’ took place.
Pictured from the left: Giovanni Losurdo – Virgo spokesperson, Marco Pallavicini – EGO Council president, Antonio Zoccoli – INFN President, Stavros Katsanevas – EGO director, Bernardo Tellini – IEEE Italy Section chair, Eugenio Giani – President of Tuscany, Massimo Carpinelli – EGO Deputy Director (Credits: EGO)
The ceremony was held as a global event, during which the Italian site of the European Gravitational Observatory – EGO in Cascina was connected via network with the equivalent US sites in Livingston, Louisiana, and Hanford, in the state of Washington. The event saw the participation of, among others, the president of the IEEE, Kathy Susan Land, the governors of the two US states, the President of Tuscany, Eugenio Giani, the presidents of the US and European funding agencies involved – the American National Science Foundation (NSF), the Italian Istituto Nazionale di Fisica Nucleare (INFN), the French Centre National de la Recherche Scientifique (CNRS) and the Dutch Netherlands Organisation for Scientific Research (NWO) – and the three Nobel laureates for the discovery of gravitational waves: Barry Barish, Kip Thorne and Rainer Weiss.
“The scientific endeavour of the detection of gravitational waves and of Virgo is an extraordinary story – said Stavros Katsanevas, Director of EGO – European Gravitational Observatory – in which the persistence and the visionary spirit of some scientists, like Adalberto Giazotto and Alain Brillet, have opened a new field of knowledge and inaugurated a new era of cosmic observations. Furthermore the same technologies that we have invented to detect echoes from the merging of black holes or stars millions of light years away from Earth can have important applications for society, for example to study earthquakes or climate change. This way gravitational observatories can become antennas listening to the environment near us in addition to exploring the far cosmos.”
The IEEE Milestone program was launched in 1983 by the Institute of Electrical and Electronics Engineers – IEEE to celebrate the most significant achievements in IEEE’s areas of interest.
The AHEAD2020 (Integrated Activities for High Energy Astrophysics) project has been funded under the Horizon 2020 Research Infrastructure Program. The main goal of AHEAD2020 is to integrate and open research infrastructures for high-energy and multi-messenger astrophysics. The project offers a wide program of transnational access (TNA) to the best European test and calibration facilities, as well as training and mentoring on X-ray data analysis and computational astrophysics at AHEAD2020 astronomical institutes and data centres. Moreover, it offers scientists and engineers at all expertise levels the possibility to visit European institutes of their choice through its visitor program call. Proposals will be peer-reviewed by specific AHEAD2020 selection panels and ranked according to their merit. The access costs for the selected facility will be covered by AHEAD2020, as will travel costs and daily allowances for the successful applicants.
AHEAD2020 has issued calls for a program of transnational visits and remote-access activities to be performed starting April 2021. The main objectives are:
fostering new or strengthening existing collaborations on science and technology topics in high energy astrophysics (visitor program);
providing training and/or mentoring on high-energy data analysis, the use of advanced tools, computational astrophysics and multi-messenger astronomy;
providing free access to some of the best European ground test and calibration facilities relevant for high-energy astrophysics.
Visitor grants include full reimbursement of travel and subsistence expenses. To cope with possible travel restrictions due to the pandemic, remote access will be provided for a number of services in the areas of data analysis, tools and computational astrophysics.
AO-1 Calls Opening: 11 January 2021
Submission Deadline: 22 February 2021 (**)
** For activities concerning access to experimental facilities, submission will remain open and proposals can be submitted at any time until August 2023; they will be evaluated typically within one month of delivery.