Scientific practice Archives – CERN Courier https://cerncourier.com/c/scientific-practice/

Ten windows on the future of particle physics https://cerncourier.com/a/ten-windows-on-the-future-of-particle-physics/ Fri, 07 Nov 2025 12:50:23 +0000 https://cerncourier.com/?p=114785 Paris Sphicas highlights key takeaways from the briefing book of the 2026 update of the European Strategy for Particle Physics.

A major step toward shaping the future of European particle physics was reached on 2 October, with the release of the Physics Briefing Book of the 2026 update of the European Strategy for Particle Physics. Despite its 250 pages, it is a concise summary of the vast amount of work contained in the 266 written submissions to the strategy process and the deliberations of the Open Symposium in Venice in June (CERN Courier September/October 2025 p24).

The briefing book compiled by the Physics Preparatory Group is an impressive distillation of our current knowledge of particle physics, and a preview of the exciting prospects offered by future programmes. It provides the scientific basis for defining Europe’s long-term particle-physics priorities and determining the flagship collider that will best advance the field. To this end, it presents comparisons of the physics reach of the different candidate machines, which often have different strengths in probing new physics beyond the Standard Model (SM).

Condensing all this in a few sentences is difficult, though two messages are clear: if the next collider at CERN is an electron–positron collider, the exploration of new physics will proceed mainly through high-precision measurements; and the highest physics reach into the structure of physics beyond the SM via indirect searches will be provided by the combined exploration of the Higgs, electroweak and flavour domains.

Following a visionary outlook for the field from theory, the briefing book divides its exploration of the future of particle physics into seven sectors of fundamental physics and three technology pillars that underpin them.

1. Higgs and electroweak physics

In the new era that has dawned with the discovery of the Higgs boson, numerous fundamental questions remain, including whether the Higgs boson is an elementary scalar, part of an extended scalar sector, or even a portal to entirely new phenomena. The briefing book highlights how precision studies of the Higgs boson, the W and Z bosons, and the top quark will probe the SM to unprecedented accuracy, looking for indirect signs of new physics.

Higgs self-coupling

Addressing these questions requires highly precise measurements of the Higgs boson’s couplings, self-interaction and quantum corrections. While the High-Luminosity LHC (HL-LHC) will continue to improve several Higgs and electroweak measurements, the next qualitative leap in precision will be provided by future electron–positron colliders, such as the FCC-ee, the Linear Collider Facility (LCF), CLIC or LEP3. And while these would provide very important information, it would fall upon the shoulders of an energy-frontier machine like the FCC-hh or a muon collider to access potential heavy states. Using the absolute HZZ coupling from the FCC-ee, such machines would measure the single-Higgs-boson couplings with a precision better than 1%, and the Higgs self-coupling at the level of a few per cent (see “Higgs self-coupling” figure).

This anticipated leap in experimental precision will necessitate major advances in theory, simulation and detector technology. In the coming decades, electroweak physics and the Higgs boson in particular will remain a cornerstone of particle physics, linking the precision and energy frontiers in the search for deeper laws of nature.

2. Strong interaction physics

Precise knowledge of the strong interaction will be essential for understanding visible matter, exploring the SM with precision, and interpreting future discoveries at the energy frontier. Building upon advanced studies of QCD at the HL-LHC, future high-luminosity electron–positron colliders such as FCC-ee and LEP3 would, like LHeC, enable per-mille precision on the strong coupling constant, and a greatly improved understanding of the transition between the perturbative and non-perturbative regimes of QCD. The LHeC would bring increased precision on parton-distribution functions that would be very useful for many physics measurements at the FCC-hh. FCC-hh would itself open up a major new frontier for strong-interaction studies.

A deep understanding of the strong interaction also necessitates the study of strongly interacting matter under extreme conditions with heavy-ion collisions. ALICE and the other experiments at the LHC will continue to illuminate this physics, revealing insights into the early universe and the interiors of neutron stars.

3. Flavour physics

With high-precision measurements of quark and lepton processes, flavour studies test the SM at energy scales far above those directly accessible to colliders, thanks to their sensitivity to the effects of virtual particles in quantum loops. Small deviations from theoretical predictions could signal new interactions or particles influencing rare processes or CP-violating effects, making flavour physics one of the most sensitive paths toward discovering physics beyond the SM.

The book highlights how precision studies of the Higgs boson, the W and Z bosons, and the top quark will probe the SM to unprecedented accuracy

Global efforts are today led by the LHCb, ATLAS and CMS experiments at the LHC and by the Belle II experiment at SuperKEKB. These experiments have complementary strengths: huge data samples from proton–proton collisions at CERN and a clean environment in electron–positron collisions at KEK. Combining the two will provide powerful tests of lepton-flavour universality, searches for exotic decays and refinements in the understanding of hadronic effects.

The next major step in precision flavour physics would require “tera-Z” samples of a trillion Z bosons from a high-luminosity electron–positron collider such as the FCC-ee, alongside a spectrum of focused experimental initiatives at a more modest scale.

4. Neutrino physics

Neutrino physics addresses open fundamental questions related to neutrino masses and their deep connections to the matter–antimatter asymmetry in the universe and its cosmic evolution. Upcoming experiments, including long-baseline accelerator-neutrino experiments (DUNE and Hyper-Kamiokande), reactor experiments such as JUNO (see “JUNO takes aim at neutrino-mass hierarchy”) and astroparticle observatories (KM3NeT and IceCube; see also CERN Courier May/June 2025 p23), will likely resolve the neutrino mass hierarchy and could discover leptonic CP violation.

In parallel, the hunt for neutrinoless-double-beta decay continues. A signal would indicate that neutrinos are Majorana fermions, which would be indisputable evidence for new physics! Such efforts extend the reach of particle physics beyond accelerators and deepen connections between disciplines. Efforts to determine the absolute mass of neutrinos are also very important.

The chapter highlights the growing synergy between neutrino experiments and collider, astrophysical and cosmological studies, as well as the pivotal role of theory developments. Precision measurements of neutrino interactions provide crucial support for oscillation measurements, and for nuclear and astroparticle physics. New facilities at accelerators explore neutrino scattering at higher energies, while advances in detector technologies have enabled the measurement of coherent neutrino scattering, opening new opportunities for new physics searches. Neutrino physics is a truly global enterprise, with strong European participation and a pivotal role for the CERN neutrino platform.

5. Cosmic messengers

Astroparticle physics and cosmology increasingly provide new and complementary information to laboratory particle-physics experiments in addressing fundamental questions about the universe. A rich set of recent achievements in these fields includes high-precision measurements of cosmological perturbations in the cosmic microwave background (CMB) and in galaxy surveys, a first measurement of an extragalactic neutrino flux, accurate antimatter fluxes and the discovery of gravitational waves (GWs).

Leveraging information from these experiments has given rise to the field of multi-messenger astronomy. The next generation of instruments, from neutrino telescopes to ground- and space-based CMB and GW observatories, promises exciting results with important clues for particle physics.

6. Beyond the Standard Model

The landscape for physics beyond the SM is vast, calling for an extended exploration effort with exciting prospects for discovery. It encompasses new scalar or gauge sectors, supersymmetry, compositeness, extra dimensions and dark-sector extensions that connect visible and invisible matter.

Many of these models predict new particles or deviations from SM couplings that would be accessible to next-generation accelerators. The briefing book shows that future electron–positron colliders such as FCC-ee, CLIC, LCF and LEP3 have sensitivity to the indirect effects of new physics through precision Higgs, electroweak and flavour measurements. With their per-mille precision measurements, electron–positron colliders will be essential tools for revealing the virtual effects of heavy new physics beyond the direct reach of colliders. In direct searches, CLIC would extend the energy frontier to 1.5 TeV, whereas FCC-hh would extend it to tens of TeV, potentially enabling the direct observation of new physics such as new gauge bosons, supersymmetric particles and heavy scalar partners. A muon collider would combine precision and energy reach, offering a compact high-energy platform for direct and indirect discovery.

This chapter of the briefing book underscores the complementarity between collider and non-collider experiments. Low-energy precision experiments, searches for electric dipole moments, rare decays and axion or dark-photon experiments probe new interactions at extremely small couplings, while astrophysical and cosmological observations constrain new physics over sprawling mass scales.

7. Dark matter and the dark sector

The nature of dark matter, and the dark sector more generally, remains one of the deepest mysteries in modern physics. A broad range of masses and interaction strengths must be explored, encompassing numerous potential dark-matter phenomenologies, from ultralight axions and hidden photons to weakly interacting massive particles, sterile neutrinos and heavy composite states. The theory space of the dark sector is just as crowded, with models involving new forces or “portals” that link visible and invisible matter.

As no single experimental technique can cover all possibilities, progress will rely on exploiting the complementarity between collider experiments, direct and indirect searches for dark matter, and cosmological observations. Diversity is the key aspect of this developing experimental programme!

8. Accelerator science and technology

The briefing book considers the potential paths to higher energies and luminosities offered by each proposal for CERN’s next flagship project: the two circular colliders FCC-ee and FCC-hh, the two linear colliders LCF and CLIC, and a muon collider; LEP3 and LHeC are also considered as colliders that could potentially offer a physics programme to bridge the time between the HL-LHC and the next high-energy flagship collider. The technical readiness, cost and timeline of each collider are summarised, alongside their environmental impact and energy efficiency (see “Energy efficiency” figure).

Energy efficiency

The two main development fronts in this technology pillar are high-field magnets and efficient radio-frequency (RF) cavities. High-field superconducting magnets are essential for the FCC-hh, while high-temperature superconducting magnet technology, which presents unique opportunities and challenges, might be relevant to the FCC-hh as a second-stage machine after the FCC-ee. Efficient RF systems are required by all accelerators (CERN Courier May/June 2025 p30). Research and development (R&D) on advanced acceleration concepts, such as plasma-wakefield acceleration and muon colliders, also shows much promise, but significant work is needed before these concepts can offer a viable solution for a future collider.

Preserving Europe’s leadership in accelerator science and technology requires a broad and extensive programme of work with continuous support for accelerator laboratories and test facilities. Such investments will continue to be very important for applications in medicine, materials science and industry.

9. Detector instrumentation

A wealth of lessons learned from the LHC and HL-LHC experiments are guiding the development of the next generation of detectors, which must have higher granularity, and – for a hadron collider – a higher radiation tolerance, alongside improved timing resolution and data throughput.

As the eyes through which we observe collisions at accelerators, detectors require a coherent and long-term R&D programme. Central to these developments will be the detector R&D collaborations, which have provided a structured framework for organising and steering the work since the previous update to the European Strategy for Particle Physics. These span the full spectrum of detector systems, with high-rate gaseous detectors, liquid detectors and high-performance silicon sensors for precision timing, precision particle identification, low-mass tracking and advanced calorimetry.

If detectors are the eyes that explore nature, computing is the brain that deciphers the signals they receive

All these detectors will also require advances in readout electronics, trigger systems and real-time data processing. A major new element is the growing role of AI and quantum sensing, both of which already offer innovative methods for analysis, optimisation and detector design (CERN Courier July/August 2025 p31). As in computing, there are high hopes and well-founded expectations that these technologies will transform detector design and operation.

To maintain Europe’s leadership in instrumentation, sustained investment in test-beam infrastructures and engineering is essential. This supports a mutually beneficial symbiosis with industry. Detector R&D is a portal to sectors as diverse as medical diagnostics and space exploration, providing essential tools such as imaging technologies, fast electronics and radiation-hard sensors for a wide range of applications.

10. Computing

Data challenge

If detectors are the eyes that explore nature, computing is the brain that deciphers the signals they receive. The briefing book pays much attention to the major leaps in computation and storage that are required by future experiments, with simulation, data management and processing at the top of the list (see “Data challenge” figure). Less demanding in resources, but equally demanding of further development, is data analysis. Planning for these new systems is guided by sustainable computing practices, including energy-efficient software and data centres. The next frontier is the HL-LHC, which will be the testing ground and the basis for future development, and serves as an example for the preservation of the current wealth of experimental data and software (CERN Courier September/October 2025 p41).

Several paradigm shifts hold great promise for the future of computing in high-energy physics. Heterogeneous computing integrates CPUs, GPUs and accelerators, providing hugely increased capabilities and better scaling than traditional CPU usage. Machine learning is already being deployed in event simulation, reconstruction and even triggering, and the first signs from quantum computing are very positive. The combination of AI with quantum technology promises a revolution in all aspects of software and of the development, deployment and usage of computing systems.

Some closing remarks

Beyond detailed physics summaries, two overarching issues appear throughout the briefing book.

First, progress will depend on a sustained interplay between experiment, theory and advances in accelerators, instrumentation and computing. The need for continued theoretical development is as pertinent as ever, as improved calculations will be critical for extracting the full physics potential of future experiments.

Second, all this work relies on people – the true driving force behind scientific programmes. There is an urgent need for academia and research institutions to attract and support experts in accelerator technologies, instrumentation and computing by offering long-term career paths. A lasting commitment to training the new generation of physicists who will carry out these exciting research programmes is equally important.

Revisiting the briefing book to craft the current summary brought home very clearly just how far the field of particle physics has come – and, more importantly, how much more there is to explore in nature. The best is yet to come!

Polymath, humanitarian, gentleman https://cerncourier.com/a/polymath-humanitarian-gentleman/ Fri, 07 Nov 2025 12:38:07 +0000 https://cerncourier.com/?p=114822 Herwig Schopper, Director-General of CERN from 1981 to 1988, passed away on 19 August at the age of 101.

Towards LEP and the LHC

Herwig Schopper was born on 28 February 1924 in the German-speaking town of Landskron (today, Lanškroun) in the then young country of Czechoslovakia. He enjoyed an idyllic childhood, holidaying at his grandparents’ hotel in Abbazia (today, Opatija) on what is now the Croatian Adriatic coast. It was there that his interest in science was awakened through listening in on conversations between physicists from Budapest and Belgrade. In Landskron, he developed an interest in music and sport, learning to play both piano and double bass, and skiing in the nearby mountains. He also learned to speak English, not merely to read Shakespeare as was the norm at the time, but to be able to converse, thanks to a Jewish teacher who had previously spent time in England. This skill was to prove transformational later in life.

The idyll began to crack in 1938 when the Sudetenland was annexed by Germany. War broke out the following year, but the immediate impact on Herwig was limited. He remained in Landskron until the end of his high-school education, graduating as a German citizen – and with no choice but to enlist. Joining the Luftwaffe signals corps, because he thought that would help him develop his knowledge of physics, he served for most of the war on the Eastern Front, ensuring that communication lines remained open between military headquarters and the troops on the front lines. As the war drew to a close in March 1945, he was transferred west, just in time to see the Western Allies cross the Rhine at Remagen. Recalled to Berlin and given orders to head further west, Herwig instructed his driver to first make a short detour via Potsdam. It was typical of Herwig that, amidst the chaos of the fall of Berlin, he wanted to see Schloss Sanssouci, Frederick the Great’s temple to the Enlightenment, while he had the chance.

Academic overture

By the time Herwig arrived in Schleswig–Holstein, the war was over, and he found himself a prisoner of the British. He later recalled, with palpable relief, that he had managed to negotiate the war without having to shoot at anyone. On discovering that Herwig spoke English, the British military administration engaged him as a translator. This came as a great consolation to Herwig since many of his compatriots were dispatched to the mines to extract the coal that would be used to reconstruct a shattered Germany. Herwig rapidly struck up a friendship with the English captain he was assigned to. This in turn eased his passage to the University of Hamburg, where he began his research career studying optics, and later enabled him to take the first of his scientific sabbaticals when travel restrictions on German academics were still in place (see “Academic overture” image).

In 1951, Herwig left for a year in Stockholm, where he worked with Lise Meitner on beta decay. He described this time as his first step up in energy from the eV-energies of visible light to the keV-energies of beta-decay electrons. A later sabbatical, starting in 1956, would see him in Cambridge, where he worked under Meitner’s nephew, Otto Frisch, in the Cavendish laboratory. As Austrian Jews, both Meitner and Frisch had sought exile before the war. By this time, Frisch had become director of the Cavendish’s nuclear physics department and a fellow of the Royal Society.

Initial interactions

While at Cambridge, Herwig took his first steps in the emerging field of particle physics, and became one of the first to publish an experimental verification of Lee and Yang’s proposal that parity would be violated in weak interactions. His single-author paper was published soon after that of Chien-Shiung Wu and her team, leading to a lifelong friendship between the two (see “Virtuosi” image).

Following Wu’s experimental verification of parity violation, cited by Herwig in his paper, Lee and Yang received the Nobel Prize. Wu was denied the honour, ostensibly on the basis that she was one of a team and the prize can only be shared three ways. It remains in the realm of speculation whether Herwig would have shared the prize had his paper been the first to appear.

Virtuosi

A third sabbatical, arranged by Willibald Jentschke, who wanted Herwig to develop a user group for the newly established DESY laboratory, saw the Schopper family move to Ithaca, New York in 1960. At Cornell, Herwig learned the ropes of electron synchrotrons from Bob Wilson. He also learned a valuable lesson in the hands-on approach to leadership. Arriving in Ithaca on a Saturday, Herwig decided to look around the deserted lab. He found one person there, tidying up. It turned out not to be the janitor, but the lab’s founder and director, Wilson himself. For Herwig, Cornell represented another big jump in energy, cementing Schopper as an experimental particle physicist.

Cornell represented another big jump in energy, cementing Schopper as an experimental particle physicist

Herwig’s three sabbaticals gave him the skills he would later rely on in hardware development and physics analysis, but it was back in Germany that he honed his management skills and established himself as a skilled science administrator.

At the beginning of his career in Hamburg, Herwig worked under Rudolf Fleischmann, and when Fleischmann was offered a chair at Erlangen, Herwig followed. Among the research he carried out at Erlangen was an experiment to measure the helicity of gamma rays, a technique that he’d later deploy in Cambridge to measure parity violation.

Prélude

It was not long before Herwig was offered a chair himself, and in 1958, at the tender age of 34, he parted from his mentor to move to Mainz. In his brief tenure there, he set wheels in motion that would lead to the later establishment of the Mainz Microtron laboratory, today known as MAMI. By this time, however, Herwig was much in demand, and he soon moved to Karlsruhe, taking up a joint position between the university and the Kernforschungszentrum, KfK. His plan was to merge the two under a single management structure as the Karlsruhe Institute for Experimental Nuclear Physics. In doing so, he laid the seeds for today’s Karlsruhe Institute of Technology, KIT.

Pioneering research

At Karlsruhe, Herwig established a user group for DESY, as Jentschke had hoped, and another at CERN. He also initiated a pioneering research programme into superconducting RF and had his first personal contacts with CERN, spending a year there in 1964. In typical Herwig fashion, he pursued his own agenda, developing a device he called a sampling total absorption counter, STAC, to measure neutron energies. At the time, few saw the need for such a device, but this form of calorimetry is now an indispensable part of any experimental particle physicists’ armoury.

In 1970, Herwig again took leave of absence from Karlsruhe to go to CERN. He’d been offered the position of head of the laboratory’s Nuclear Physics Division, but his stay was to be short lived (see “Prélude” image). The following year, Jentschke took up the position of Director-General of CERN alongside John Adams. Jentschke was to run the original CERN laboratory, Lab I, while Adams ran the new CERN Lab II, tasked with building the SPS. This left a vacancy at Germany’s national laboratory, and the job was offered to Herwig. It was too good an offer to refuse.

As chair of the DESY directorate, Herwig witnessed from afar the discovery of both charm and bottom quarks in the US. Although missing out on the discoveries, DESY’s machines were perfect laboratories to study the spectroscopy of these new quark families, and DESY went on to provide definitive measurements. Herwig also oversaw DESY’s development in synchrotron light science, repurposing the DORIS accelerator as a light source when its physics career was complete and it was succeeded by PETRA.

Architects of LEP

The ambition of the PETRA project put DESY firmly on course to becoming an international laboratory, setting the scene for the later HERA model. PETRA experiments went on to discover the gluon in 1979.

The following year, Herwig was named as CERN’s next Director-General, taking up office on 1 January 1981. By this time, the CERN Council had decided to call time on its experiment with two parallel laboratories, leaving Herwig with the task of uniting Lab I and Lab II. The Council was also considering plans to build the world’s most powerful accelerator, the Large Electron–Positron collider, LEP.

It fell to Herwig both to implement a new management structure for CERN and to see the LEP proposal through to approval (see “Architects of LEP” image). Unpopular decisions were inevitable, making the early years of Herwig’s mandate somewhat difficult. In order to get LEP approved, he had to make sacrifices. As a result, the Intersecting Storage Rings (ISR), the world’s only hadron collider, collided its final beams in 1984 and cuts had to be made across the research programme. Herwig was also confronted with a period of austerity in science funding, and found himself obliged to commit CERN to constant funding in real terms throughout the construction of LEP, and as it turns out, in perpetuity.

It fell to Herwig both to implement a new management structure for CERN and to see the LEP proposal through to approval

Herwig’s battles were not only with the lab’s governing body; he also went against the opinions of some of his scientific colleagues concerning the size of the new accelerator. True to form, Herwig stuck with his instinct, insisting that the LEP tunnel should be 27 km around, rather than the more modest 22 km that would have satisfied the immediate research goals while avoiding the difficult geology beneath the Jura mountains. Herwig, however, was looking further ahead – to the hadron collider that would follow LEP. His obstinacy was fully vindicated with the discovery of the Higgs boson in 2012, confirming the Brout–Englert–Higgs mechanism, which had been proposed almost 50 years earlier. This discovery earned the Nobel Prize for Peter Higgs and François Englert in 2013 (see “Towards LEP and the LHC” image).

The CERN blueprint

Difficult though some of his decisions may have been, there is no doubt that Herwig’s 1981 to 1988 mandate established the blueprint for CERN to this day. The end of operations of the ISR may have been unpopular, and we’ll never know what it may have gone on to achieve, but the world’s second hadron collider at the SPS delivered CERN’s first Nobel prize during Herwig’s mandate, awarded to Carlo Rubbia and Simon van der Meer in 1984 for the discovery of W and Z bosons.

Herwig turned 65 two months after stepping down as CERN Director-General, but retirement was never on his mind. In the years that followed, he carried out numerous roles for UNESCO, applying his diplomacy and foresight to new areas of science. UNESCO was in many ways a natural step for Herwig, whose diplomatic skills had been honed by the steady stream of high-profile visitors to CERN during his mandate as Director-General. At one point, he engineered a meeting at UNESCO between Jim Cronin, who was lobbying for the establishment of a cosmic-ray observatory in Argentina, and the country’s president, Carlos Menem. The following day, Menem announced the start of construction of the Pierre Auger Observatory. On another occasion, Herwig was tasked with developing the Soviet gift to Cuba of a small particle accelerator into a working laboratory. That initiative would ultimately come to nothing, but it helped Herwig prepare the groundwork for perhaps his greatest post-retirement achievement: SESAME, a light-source laboratory in Jordan that operates as an intergovernmental organisation following the CERN model (see “Science diplomacy” image). Mastering the political challenge of establishing an organisation that brings together countries from across the Middle East – including long-standing rivals – required a skill set that few possess.

Science diplomacy

Although the roots of SESAME can be traced to a much earlier date, by the end of the 20th century, when the idea was sufficiently mature for an interim organisation to be established, Herwig was the natural candidate to lead the new organisation through its formative years. His experience of running international science coupled with his post-retirement roles at UNESCO made him the obvious choice to steer SESAME from idea to reality. It was Herwig who modelled SESAME’s governing document on the CERN convention, and it was Herwig who secured the site in Jordan for the laboratory. Today, SESAME is producing world-class research – a shining example of what can be achieved when people set aside their differences and focus on what they have in common.

Establishing an organisation that brings together countries from across the Middle East required a skill set few possess

Herwig never stopped working for what he believed in. When CERN’s current Director-General convened a meeting with past Directors-General in 2024, along with the president of the CERN Council, Herwig was present. When initiatives were launched to establish an international research centre in the Balkans, Herwig stepped up to the task. He never lost his sense of what is right, and he never lost his mischievous sense of humour. Following an interview at his house in 2024 for the film The Peace Particle, the interviewer asked whether he still played the piano. Herwig stood up, walked to the piano and started to play a very simple arrangement of Christian Sinding’s “Rustle of Spring”. Just as curious glances started to be exchanged, he transitioned, with a twinkle in his eye, to a beautifully nuanced rendition of Liszt’s “Liebestraum No. 3”.

Herwig Schopper was a rare combination of genius, polymath, humanitarian and gentleman. Always humble, he could make decisions with nerves of steel when required. His legacy spans decades and disciplines, and has shaped the field of particle physics in many ways. With his passing, the world has lost a truly remarkable individual. He will be sorely missed.

The physicist who fought war and cancer https://cerncourier.com/a/the-physicist-who-fought-war-and-cancer/ Fri, 07 Nov 2025 12:34:03 +0000 https://cerncourier.com/?p=114858 Subatomic physics has shaped both the conduct of war and the treatment of cancer. Joseph Rotblat, who left the Manhattan Project on moral grounds and later advanced radiotherapy, embodies this dual legacy.

The courage of his convictions

Joseph Rotblat’s childhood was blighted by the destruction visited on Warsaw, first by the Tsarist Army, followed by the Central Powers and completed by the Red Army from 1918 to 1920. His father’s successful paper-importing business went bankrupt in 1914, and the family became destitute. After a short course in electrical engineering, Joseph and a teenaged friend became jobbing electricians. A committed autodidact, Rotblat found his way into the Free University, where he studied physics under Ludwik Wertenstein. Wertenstein had worked with Marie Skłodowska-Curie in Paris and was the chief of the Radiological Institute in Warsaw as well as a teacher at the Free University. He was the first to recognise Rotblat’s brilliance and retained him as a researcher at the Institute. Rotblat’s main research was neutron-induced artificial radioactivity: he was among the first to produce cobalt-60, which became a standard source in radiotherapy machines before reliable linear accelerators were available.

Chadwick described Rotblat as “very intelligent and very quick”

By the late 1930s, Rotblat had published more than a dozen papers, some in English journals after translation by Wertenstein; the name Rotblat was becoming known in neutron physics. The professor regarded him as the likely next head of the Radiological Institute and thought he should prepare by working outside Poland. Rotblat wanted to gain experience of the cyclotron and, although he could have joined the Joliot–Curie group in Paris, elected to go to Liverpool where James Chadwick was overseeing a machine expected to produce a proton beam within months. He arrived in Liverpool in April 1939 and was shocked by the city’s filth. He also found the scouse dialect of its citizens incomprehensible. Despite the trying circumstances, Rotblat soon impressed Chadwick with his experimental skill and was rewarded with a prestigious fellowship. Chadwick wrote to Wertenstein in June describing Rotblat as “very intelligent and very quick”.

Brimming with enthusiasm

Chadwick had formed a long-distance friendship with Ernest Lawrence, the cyclotron’s inventor, who kept him apprised of developments in Berkeley. At the time of Rotblat’s arrival, Lawrence was brimming with enthusiasm about the potential of neutrons and radioactive isotopes from cyclotrons for medical research, especially in cancer treatment. Chadwick hired Bernard Kinsey, a Cambridge graduate who spent three years with Lawrence, to take charge of the Liverpool cyclotron, and he befriended Rotblat. Liverpool had limited funding: Chadwick complained to Lawrence that the money “this laboratory has been running on in the past few years – is less than some men spend on tobacco.” Chadwick served on a Cancer Commission in Liverpool under the leadership of Lord Derby, which planned to bring cancer research to the Liverpool Radium Institute using products from the cyclotron.

James Chadwick

The small stipend from the Oliver Lodge fellowship encouraged Rotblat to return to Warsaw in August 1939 to collect his wife, Tola, and bring her to England. She was recovering from acute appendicitis; her doctors persuaded Joseph that she was not fit to travel. So he returned alone on the last train allowed to pass through Berlin before the Germans attacked Poland once more. Tola wrote her last letter to Joseph in December 1939. While he was in Warsaw, Rotblat confided in Wertenstein about his belief that a uranium fission bomb was feasible using fast neutrons, and he repeated this argument to Chadwick when he returned to Liverpool. Chadwick eventually became the leader of the British contingent on the Manhattan Project and arranged for Rotblat to come to Los Alamos in 1944 while remaining a Polish citizen. Rotblat worked in Robert Wilson’s cyclotron group and survived a significant radiation accident, receiving an estimated dose of 1.5 J/kg to his upper torso and head. The circumstances of his leaving the project in December 1944 were far more complicated than the moralistic account he wrote in The Bulletin of the Atomic Scientists 40 years later, but no less noble.

Tragedy and triumph

As Chadwick wrote to Rotblat in London, he saw “very obvious advantages” for the future of nuclear physics in Britain from Rotblat’s return to Liverpool. For one thing, “Rotblat has a wider experience on the cyclotron than anyone now in England,” and he also possessed “a mass of information on the equipment used in Project Y [Los Alamos] and Chicago.” Chadwick had two major roles in mind for Rotblat. One was to revitalise the depleted Liverpool department and stimulate cyclotron research in England; the second was to collate the detailed data on nuclear physics brought back by British scientists returning from the Manhattan Project. In 1945, Rotblat discovered that six members of his family had miraculously survived the war in Poland, but tragically not Tola. His state of despair deepened after the news of the atomic bombs being used against Japan: he knew about the possibility of a hydrogen bomb, and remembered conversations with Niels Bohr in Los Alamos about the risks of a nuclear arms race. He made two resolutions: to campaign against nuclear weapons, and to leave academic nuclear physics and become a medical physicist, using his scientific knowledge for the direct benefit of people.

Joseph Rotblat
Robert Wilson

When Chadwick returned to Liverpool from the US, he found the department in a much better state than he expected. The credit for this belonged largely to Rotblat’s leadership; Chadwick wrote to Lawrence praising his outstanding ability, combined with a truly remarkable concern for the staff and students. Chadwick and Rotblat then agreed to build a synchrocyclotron in Liverpool. Rotblat selected the abandoned crypt of an unbuilt Catholic cathedral as the best site, since the local topography would provide some radiation protection. The post-war shortages, especially of steel, made this an extremely ambitious project. Rotblat presented a successful application to the Department of Scientific and Industrial Research for the largest university grant, and despite design and construction problems resulting in spiralling costs, the machine was in active research use from 1954 to 1968.

With the encouragement of physicians at Liverpool Royal Infirmary, Rotblat started to dabble in nuclear medicine to image thyroid glands and treat haematological disorders. In 1949 he saw an advert for the chair in physics at the Medical College of St. Bartholomew’s Hospital (Bart’s) in London and applied. While Rotblat was easily the most accomplished candidate, there was a long delay in his appointment on spurious grounds, such as being over-qualified to teach physics to medical students and likely to be a heavy consumer of research funds, as well as plain xenophobia. Bart’s was a closed, reactionary institution. There was a clear division between the Medical College, with its links to London University, and the hospital, where the post-war teaching was suboptimal as it struggled to recover from the war and adjusted reluctantly to the new National Health Service (NHS). The Medical College, in Charterhouse Square, was severely bombed in the Blitz and the physics department completely destroyed. Rotblat attempted to thwart his main opponent, the dean (described as “secretive and manipulative” in one history), by visiting the hospital and meeting senior clinicians and governors. There was also a determined effort, orchestrated by Chadwick, to retain him in the ranks of nuclear physicists.

When I interviewed Rotblat in 1994, he told me that Chadwick’s final tactic was to tell him that he was close to being elected as a fellow of the Royal Society, but if he took the position at Bart’s, it would never happen. Rotblat poignantly observed: “He was right.” I mentioned this to Lorna Arnold, the nuclear historian, who thought it was a shame. She said she would take it up with her friend Rudolf Peierls. Despite being in poor health, Peierls vowed to correct this omission, and the next year the Royal Society elected Rotblat a fellow at the age of 86.

Full-time medical physicist

Rotblat’s first task at Bart’s, when he finally arrived in 1950, was to prepare a five-year departmental plan: a task he was well-qualified for after his experience with the synchrocyclotron in Liverpool. With wealthy, centuries-old hospitals such as Bart’s allowed to keep their endowments after the advent of the NHS, he also became an active committee member for the new Research Endowment Fund that provided internal grants and hired research assistants. The physics department soon collaborated with the biochemistry, pharmacology and physiology departments that required radioisotopes for research. He persuaded the Medical College to buy a 15 MV linear accelerator from Mullard, an English electronics company, which never worked for long without problems.

Rotblat resolved to campaign against nuclear weapons and use his scientific knowledge for the direct benefit of people

During his first two years, in addition to the radioisotope work, he studied the passage of electrons through biological tissue and the energy dissipation of neutrons in tissue – the 1950s were a golden age for radiobiology in England, and Rotblat forged close relationships with Hal Gray and his group at the Hammersmith Hospital. In the mid-1950s, he was approached by Patricia Lindop, a newly qualified Bart’s physician who had also obtained a first-class degree in physiology. Lindop had a five-year grant from the Nuffield Foundation to study ageing and, after discussions with Rotblat, it was soon arranged that she would study the acute and long-term effects of radiation in mice at different ages. This was a massive, prospective study that would eventually involve six research assistants and a colony of 30,000 mice. Rotblat acted as the supervisor for her PhD, and they published multiple papers together. In terms of acute death (within 30 days of a high, whole-body dose), she found that mice that were one-day old at exposure could tolerate the highest doses, whereas four-week-old mice were the most vulnerable. The interpretation of long-term effects was much less clearcut and provoked major disagreements within the radiobiology community. In a 1994 letter, Rotblat mused on the number of Manhattan Project scientists still alive: “According to my own studies on the effects of radiation on lifespan, I should have been dead a long time, having received a sub-lethal dose in Los Alamos. But here I am, advocating the closure of Los Alamos, Livermore and Sandia, instead of promoting them as health resorts!”

Patricia Lindop

In 1954, the US Bravo test obliterated Bikini Atoll and showered a Japanese fishing boat (Lucky Dragon No. 5), which was outside the exclusion zone in the Pacific, with radioactive dust. American scientists realised that the weapon massively exceeded its designed yield, and there was an unconvincing attempt to allay public fear. Rotblat was invited onto BBC’s flagship current-affairs programme, Panorama, to explain to the public the difference between the original fission bombs and the H-bomb. His lucid delivery impressed Bertrand Russell, a mathematical philosopher and a leading pacifist in World War I, who also spoke on Panorama. The two became close friends. When Rotblat went to a radiobiology conference a few months later, he met a Japanese scientist who had analysed the dust recovered from Lucky Dragon No. 5. The dust was composed of about 60% rare-earth isotopes, leading Rotblat to believe that most of the explosive energy was due to fission, not fusion. He wrote his own report, not based on any inside knowledge and despite official opposition, concluding that this was a fission–fusion–fission bomb and that his TV presentation had underestimated its power by orders of magnitude. Rotblat’s report became public just as the British Cabinet decided in secret to develop thermonuclear weapons. The government was concerned that the Americans would view this as another breach of security by an ex-Manhattan Project physicist. Rotblat’s reputation as a man of the political left grew within the conservative institution of Bart’s.

Russell gave a radio address at the end of 1954 on the global existential threat posed by thermonuclear weapons and urged the public to “remember your humanity and forget the rest”. Six months later, Russell announced the Russell–Einstein Manifesto, with Rotblat one of the signatories and the person Russell relied upon to answer questions from the press. The first Pugwash conference followed in 1957, with Rotblat as a prominent contributor. His active involvement, closely supported by Lindop, would last for the rest of his life, as he encouraged communication across the East–West divide and pushed for international arms control agreements. Much of this work took place in his office at Bart’s. Rotblat and the Pugwash Conferences shared the 1995 Nobel Peace Prize.

The measurement problem, measured https://cerncourier.com/a/the-measurement-problem-measured/ Fri, 07 Nov 2025 12:17:51 +0000 https://cerncourier.com/?p=114742 Nature surveyed more than 1000 researchers about their views on the interpretation of quantum mechanics.

A century on, physicists still disagree on what quantum mechanics actually means. Nature recently surveyed more than a thousand researchers, asking about their views on the interpretation of quantum mechanics. When broken down by career stage, the results show that a diversity of views spans all generations.

Getting eccentric with age

The Copenhagen interpretation remains the most widely held view, placing the act of measurement at the core of quantum theory well into the 2020s. Epistemic or QBist approaches, where the quantum state expresses an observer’s knowledge or belief, form the next most common group, followed by Everett’s many-worlds framework, in which all quantum outcomes continue to coexist without collapse (CERN Courier July/August 2025 p26). Other views maintain small but steady followings, including pilot-wave theory, spontaneous-collapse models and relational quantum mechanics (CERN Courier July/August 2025 p21).

Fewer than 10% of the physicists surveyed declined to express a view. Though this cohort presumably includes proponents of the “shut up and calculate” school of thought, the apparently dwindling population of uninterested working physicists may simply be undersampled.

Crucially, confidence is modest. Most respondents view their preferred interpretation as an adequate placeholder or a useful conceptual tool. Only 24% are willing to describe their preferred interpretation as correct, leaving ample room for manoeuvre in the very foundations of fundamental physics.

Hidden treasures https://cerncourier.com/a/hidden-treasures/ Tue, 09 Sep 2025 08:21:50 +0000 https://cerncourier.com/?p=114208 As the LHC surpasses one exabyte of stored data, Cristinel Diaconu and Ulrich Schwickerath call for new collaborations to join a global effort in data preservation.

Data resurrection

In 2009, the JADE experiment had not operated for 23 years. The PETRA electron–positron collider that served it had already completed a second life as a pre-accelerator for the HERA electron–proton collider and was preparing for a third life as an X-ray source. JADE and the other PETRA experiments were a piece of physics history, well known for seminal measurements of three-jet quark–antiquark–gluon events, and early studies of quark fragmentation and jet hadronisation. But two decades after being decommissioned, the JADE collaboration had yet to publish one of its signature measurements.

At high energies and short distances, the strong force becomes weaker. Quarks behave almost like free particles. This “asymptotic freedom” is a unique hallmark of QCD. In 2009, as now, JADE’s electron–positron data was unique in the low-energy range, with other data sets lost to history. When reprocessed with modern next-to-next-to-leading-order QCD and improved simulation tools, the DESY experiment was able to rival experiments at CERN’s higher-energy Large Electron–Positron (LEP) collider for precision on the strong coupling constant, contributing to a stunning proof of QCD’s most fundamental behaviour. The key was a farsighted and original initiative by Siggi Bethke to preserve JADE’s data and analysis software.

New perspectives

This data resurrection from JADE demonstrated how data can be reinterpreted to give new perspectives decades after an experiment ends. It was a timely demonstration. In 2009, HERA and SLAC’s PEP-II electron–positron collider had been recently decommissioned, and Fermilab’s Tevatron proton–antiproton collider was approaching the end of its operations. Each facility nevertheless had a strong analysis programme ahead, and CERN’s Large Hadron Collider (LHC) was preparing for its first collisions. How could all this data be preserved?

The uniqueness of these programmes, for which no upgrade or follow-up was planned for the coming decades, invited the consideration of data usability at horizons well beyond a few years. A few host labs risked a small investment, with dedicated data-preservation projects beginning, for example, at SLAC, DESY, Fermilab, IHEP and CERN (see “Data preservation” dashboard). To exchange data-preservation concepts, methodologies and policies, and to ensure the long-term preservation of HEP data, the Data Preservation in High Energy Physics (DPHEP) group was created in 2014. DPHEP is a global initiative under the supervision of the International Committee for Future Accelerators (ICFA), with strong support from CERN from the beginning. It actively welcomes new collaborators and new partner experiments, to ensure a vibrant and long-term future for the precious data sets being collected at present and future colliders.

Data preservation

At the beginning of our efforts, DPHEP designed a four-level classification of data abstraction. Level 1 corresponds to the information typically found in a scientific publication or its associated HEPData entry (a public repository for high-energy physics data tables). Level 4 includes all inputs necessary to fully reprocess the original data and simulate the experiment from scratch.
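As a concrete, purely illustrative way to picture this classification, the minimal Python sketch below encodes the four levels as a small enumeration that an archive catalogue might use to tag data sets. Only levels 1 and 4 follow the descriptions given above; the intermediate levels are left as generic placeholders rather than DPHEP's official definitions, and the catalogue entry is hypothetical.

```python
from enum import IntEnum


class DPHEPLevel(IntEnum):
    """Illustrative encoding of the DPHEP data-abstraction levels.

    Only levels 1 and 4 follow the descriptions in the text; levels 2 and 3
    are intermediate stages whose exact scope is defined by DPHEP itself.
    """
    PUBLICATION = 1         # information in a publication or HEPData entry
    INTERMEDIATE_LOW = 2    # placeholder for an intermediate level
    INTERMEDIATE_HIGH = 3   # placeholder for an intermediate level
    FULL_REPROCESSING = 4   # all inputs needed to reprocess and re-simulate


# A hypothetical catalogue entry tagging a preserved data set with its level.
dataset = {"experiment": "JADE", "level": DPHEPLevel.FULL_REPROCESSING}
print(dataset["experiment"], dataset["level"].name)
```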

The concept of data preservation had to be extended too. Simply storing data and freezing software is bound to fail as operating systems evolve and analysis knowledge disappears. A sensible preservation process must begin early on, while the experiments are still active, and take into account the research goals and available resources. Long-term collaboration organisation plays a crucial role, as data cannot be preserved without stable resources. Software must adapt to rapidly changing computing infrastructure to ensure that the data remains accessible in the long term.

Return on investment

But how much research gain could be expected for a reasonable investment in data preservation? We conservatively estimate that for dedicated investments below 1% of the cost of the construction of a facility, the scientific output increases by 10% or more. Publication records confirm that scientific outputs at major experimental facilities continue long after the end of operations (see “Publications per year, during and after data taking” panel). Publication rates remain substantial well beyond the “canonical” five years after the end of the data taking, particularly for experiments that pursued dedicated data-preservation programmes. For some experiments, the lifetime of the preservation system is by now comparable with the data-taking period, illustrating the need to carefully define collaborations for the long term.

Publication records confirm that scientific outputs at major experimental facilities continue long after the end of operations

The most striking example is BaBar, an electron–positron-collider experiment at SLAC that was designed to investigate the violation of charge-parity symmetry in the decays of B mesons, and which continues to publish using a preservation system now hosted outside the original experiment site. Aging infrastructure is now presenting challenges, raising questions about the very-long-term hosting of historical experiments – “preservation 2.0” – or the definitive end of the programme. The other historical b-factory, Belle, benefits from a follow-up experiment on site.

Publications per year, during and after data taking

The publication record at experiments associated with the DPHEP initiative. Data-taking periods of the relevant facilities are shaded, and the fraction of peer-reviewed articles published afterwards is indicated as a percentage for facilities that are not still operational. The data, which exclude conference proceedings, were extracted from Inspire-HEP on 31 July 2025.
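As a rough illustration of how such a publication record could be assembled, the sketch below queries the public Inspire-HEP REST API for a collaboration's papers and tallies them by year, excluding conference papers. The query string, field names and pagination parameters are assumptions to be checked against the Inspire-HEP API documentation; this is not the workflow actually used for the panel above.

```python
"""Minimal sketch: publications per year for one collaboration, via the
public Inspire-HEP REST API. Query syntax, field names and pagination
parameters are assumptions, not the Courier's actual workflow."""
from collections import Counter

import requests

API = "https://inspirehep.net/api/literature"


def publications_per_year(collaboration: str) -> Counter:
    counts, page = Counter(), 1
    while True:
        resp = requests.get(
            API,
            params={
                "q": f"collaboration {collaboration}",    # assumed query string
                "fields": "earliest_date,document_type",  # trim returned metadata
                "size": 250,
                "page": page,
            },
            timeout=30,
        )
        resp.raise_for_status()
        hits = resp.json()["hits"]["hits"]
        if not hits:
            break
        for hit in hits:
            meta = hit.get("metadata", {})
            # mirror the panel above by excluding conference proceedings
            if "conference paper" in meta.get("document_type", []):
                continue
            year = str(meta.get("earliest_date", ""))[:4]
            if year.isdigit():
                counts[int(year)] += 1
        page += 1
    return counts


if __name__ == "__main__":
    for year, n in sorted(publications_per_year("ALEPH").items()):
        print(year, n)
```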

HERA, an electron– and positron–proton collider that was designed to study deep inelastic scattering (DIS) and the structure of the proton, continues to publish and even to attract new collaborators as the community prepares for the Electron Ion Collider (EIC) at BNL, nicely demonstrating the relevance of data preservation for future programmes. The EIC will continue studies of DIS in the regime of gluon saturation (CERN Courier January/February 2025 p31), with polarised beams exploring nucleon spin and a range of nuclear targets. The use of new machine-learning algorithms on the preserved HERA data has even allowed aspects of the EIC physics case to be explored: an example of those “treasures” not foreseen at the end of collisions.

IHEP in China conducts a vigorous data-preservation programme around BESIII data from electron–positron collisions in the BEPCII charm factory. The collaboration is considering using artificial intelligence to rank data priorities and user support for data reuse.

Remarkably, LEP experiments are still publishing physics analyses with archived ALEPH data almost 25 years after the completion of the LEP programme on 4 November 2000. The revival of the CERNLIB collection of FORTRAN data-analysis software libraries has also enabled the resurrection of the legacy software stacks of both DELPHI and OPAL, including the spectacular revival of their event displays (see “Data resurrection” figure). The DELPHI collaboration revised their fairly restrictive data-access policy in early 2024, opening and publishing their data via CERN’s Open Data Portal.

Some LEP data is currently being migrated into the standardised EDM4hep (event data model) format that has been developed for future colliders. As well as testing the format with real data, this will ensure data preservation and support software development, analysis training and detector design for the electron–positron collider phase of the proposed Future Circular Collider using real events.

The future is open

In the past 10 years, data preservation has grown in prominence in parallel with open science, which promotes free public access to publications, data and software in community-driven repositories, and according to the FAIR principles of findability, accessibility, interoperability and reusability. Together, data preservation and open science help maximise the benefits of fundamental research. Collaborations can fully exploit their data and share its unique benefits with the international community.

The two concepts are distinct but tightly linked. Data preservation focuses on maintaining data integrity and usability over time, whereas open data emphasises accessibility and sharing. Both require careful, properly resourced planning, in which the host laboratory plays a crucial role.

Treasure chest

Data preservation and open science both require clear policies and a proactive approach. Beginning at the very start of an experiment is essential. Clear guidelines on copyright, resource allocation for long-term storage, access strategies and maintenance must be established to address the challenges of data longevity. Last but not least, it is crucially important to design collaborations to ensure smooth international cooperation long after data taking has finished. By addressing these aspects, collaborations can create robust frameworks for preserving, managing and sharing scientific data effectively over the long term.

Today, most collaborations target the highest standards of data preservation (level 4). Open-source software should be prioritised, because the uncontrolled obsolescence of commercial software endangers the entire data-preservation model. It is crucial to maintain all of the data and the software stack, which requires continuous effort to adapt older versions to evolving computing environments. This applies to both software and hardware infrastructures. Synergies between old and new experiments can provide valuable solutions, as demonstrated by HERA and EIC, Belle and Belle II, and the Antares and KM3NeT neutrino telescopes.

From afterthought to forethought

In the past decade, data preservation has evolved from simply an afterthought as experiments wrapped up operations into a necessary specification for HEP experiments. Data preservation is now recognised as a source of cost-effective research. Progress has been rapid, but its implementation remains fragile and needs to be protected and planned.

In the past 10 years, data preservation has grown in prominence in parallel with open science

The benefits will be significant. Signals not imagined during the experiments’ lifetime can be searched for. Data can be reanalysed in light of advances in theory and observations from other realms of fundamental science. Education, training and outreach can be brought to life by demonstrating classic measurements with real data. And scientific integrity is fully realised when results are fully reproducible.

The LHC, having surpassed an exabyte of data, now holds the largest scientific data set ever accumulated. The High-Luminosity LHC will increase this by an order of magnitude. When the programme comes to an end, it will likely be the last data at the energy frontier for decades. History suggests that 10% of the LHC’s scientific programme will not yet have been published when collisions end, and a further 10% not even imagined. While the community discusses its strategy for future colliders, it must therefore also bear in mind data preservation. It is the key to unearthing hidden treasures in the data of the past, present and future.

The post Hidden treasures appeared first on CERN Courier.

]]>
Feature As the LHC surpasses one exabyte of stored data, Cristinel Diaconu and Ulrich Schwickerath call for new collaborations to join a global effort in data preservation. https://cerncourier.com/wp-content/uploads/2025/09/CCSepOct25_DATA_feature.jpg
A scientist in sales https://cerncourier.com/a/a-scientist-in-sales/ Tue, 08 Jul 2025 19:22:15 +0000 https://cerncourier.com/?p=113683 Massimiliano Pindo discusses opportunities for high-energy physicists in marketing and sales.

The post A scientist in sales appeared first on CERN Courier.

]]>
Massimiliano Pindo

The boundary between industry and academia can feel like a chasm. Opportunity abounds for those willing to bridge the gap.

Massimiliano Pindo began his career working on silicon pixel detectors at the DELPHI experiment at the Large Electron–Positron Collider. While at CERN, Pindo developed analytical and technical skills that would later become crucial in his career. But despite his passion for research, doubts clouded his hopes for the future.

“I wanted to stay in academia,” he recalls. “But at that time, it was getting really difficult to get a permanent job.” Pindo moved from his childhood home in Milan to Geneva, before eventually moving back in with his parents while applying for his next research grant. “The golden days of academia where people got a fixed position immediately after a postdoc or PhD were over.”

The path forward seemed increasingly unstable, defined by short-term grants, constant travel and an inability to plan long-term. There was always a constant stream of new grant applications, but permanent contracts were few and far between. With competition increasing, job stability seemed further and further out of reach. “You could make a decent living,” Pindo says, “but the real problem was you could not plan your life.”

Translatable skills

Faced with the unpredictability of academic work, Pindo transitioned into industry – a leap that eventually led him to his current role as marketing and sales director at Renishaw, France, a global engineering and scientific technology company. Pindo was confident that his technical expertise would provide a strong foundation for a job beyond academia, and indeed he found that “hard” skills such as analytical thinking, problem-solving and a deep understanding of technology, which he had honed at CERN alongside soft skills such as teamwork, languages and communication, translated well to his work in industry.

“When you’re a physicist, especially a particle physicist, you’re used to breaking down complex problems, selecting what is really meaningful amongst all the noise, and addressing these issues directly,” Pindo says. His experience in academia gave him the confidence that industry challenges would pale in comparison. “I was telling myself that in the academic world, you are dealing with things that, at least on paper, are more complex and difficult than what you find in industry.”

Initially, these technical skills helped Pindo become a device engineer for a hardware company, before making the switch to sales. The gradual transition from academia to something more hands-on allowed him to really understand the company’s product on a technical level, which made him a more desirable candidate when transitioning into marketing.

“When you are in B2B [business-to-business] mode and selling technical products, it’s always good to have somebody who has technical experience in the industry,” explains Pindo. “You have to have a technical understanding of what you’re selling, to better understand the problems customers are trying to solve.”

However, this experience also allowed him to recognise gaps in his knowledge. As he began gaining more responsibility in his new, more business-focused role, Pindo decided to go back to university and get an MBA. During the programme, he was able to familiarise himself with the worlds of human resources, business strategy and management – skills that aren’t typically the focus in a physics lab.

Pindo’s journey through industry hasn’t been a one-way ticket out of academia. Today, he still maintains a foothold in the academic world, teaching strategy as an affiliated professor at the Sorbonne. “In the end you never leave the places you love,” he says. “I got out through the door – now I’m getting back in through the window!”

Transitioning between industry and academia was not entirely seamless. Misconceptions loomed on both sides, and it took Pindo a while to find a balance between the two.

“There is a stereotype that scientists are people who can’t adapt to industrial environments – that they are too abstract, too theoretical,” Pindo explains. “People think scientists are always in the clouds, disconnected from reality. But that’s not true. The science we make is not the science of cartoons. Scientists can be people who plan and execute practical solutions.”

The misunderstanding, he says, goes both ways. “When I talk to alumni still in academia, many think that industry is a nightmare – boring, routine, uninteresting. But that’s also false,” Pindo says. “There’s this wall of suspicion. Academics look at industry and think, ‘What do they want? What’s the real goal? Are they just trying to make more money?’ There is no trust.”

Tight labour markets

For Pindo, this divide is frustrating and entirely unnecessary. Now with years of experience navigating both worlds, he envisions a more fluid connection between academia and industry – one that leverages the strengths of both. “Industry is currently facing tight labour markets for highly skilled talent, and academia doesn’t have access to the money and practical opportunities that industry can provide,” says Pindo. “Both sides need to work together.”

To bridge this gap, Pindo advocates a more open dialogue and a revolving door between the two fields – one that allows both academics and industry professionals to move fluidly back and forth, carrying their expertise across boundaries. Both sides have much to gain from shared knowledge and collaboration. One way to achieve this, he suggests, is through active participation in alumni networks and university events, which can nurture lasting relationships and mutual understanding. If more professionals embraced this mindset, it could help alleviate the very instability that once pushed him out of academia, creating a landscape where the boundaries between science and industry blur to the benefit of both.

“Everything depends on active listening. You always have to learn from the person in front of you, so give them the chance to speak. We have a better world to build, and that comes only from open dialogue and communication.”

The post A scientist in sales appeared first on CERN Courier.

]]>
Careers Massimiliano Pindo discusses opportunities for high-energy physicists in marketing and sales. https://cerncourier.com/wp-content/uploads/2025/07/CCJulAug25_CAR_Pindo_feature.jpg
Charting DESY’s future https://cerncourier.com/a/charting-desys-future/ Mon, 19 May 2025 07:34:51 +0000 https://cerncourier.com/?p=113176 DESY’s new chair, Beate Heinemann, reflects on the laboratory’s evolving role in science and society – from building next-generation accelerators to navigating Europe’s geopolitical landscape.

The post Charting DESY’s future appeared first on CERN Courier.

]]>
How would you describe DESY’s scientific culture?

DESY is a large laboratory with just over 3000 employees. It was founded 65 years ago as an accelerator lab, and at its heart it remains one, though what we do with the accelerators has evolved over time. It is fully funded by Germany.

In particle physics, DESY has performed many important studies, for example to understand the charm quark following the November Revolution of 1974. The gluon was discovered here in the late 1970s. In the 1980s, DESY ran the first experiments to study B mesons, laying the groundwork for core programmes such as LHCb at CERN and the Belle II experiment in Japan. In the 1990s, the HERA accelerator focused on probing the structure of the proton, which, incidentally, was the subject of my PhD, and those results have been crucial for precision studies of the Higgs boson.

Over time, DESY has become much more than an accelerator and particle-physics lab. Even in the early days, it used what is called synchrotron radiation, the light emitted when electrons change direction in the accelerator. This light is incredibly useful for studying matter in detail. Today, our accelerators are used primarily for this purpose: they generate X-rays that image tiny structures, for example viruses.

DESY’s culture is shaped by its very engaged and loyal workforce. People often call themselves “DESYians” and strongly identify with the laboratory. At its heart, DESY is really an engineering lab. You need an amazing engineering workforce to be able to construct and operate these accelerators.

Which of DESY’s scientific achievements are you most proud of?

The discovery of the gluon is, of course, an incredible achievement, but actually I would say that DESY’s greatest accomplishment has been building so many cutting-edge accelerators: delivering them on time, within budget, and getting them to work as intended.

Take the PETRA accelerator, for example – an entirely new concept when it was first proposed in the 1970s. The decision to build it was made in 1975; construction was completed by 1978; and by 1979 the gluon was discovered. So in just four years, we went from approving a 2.3 km accelerator to making a fundamental discovery, something that is absolutely crucial to our understanding of the universe. That’s something I’m extremely proud of.

I’m also very proud of the European X-ray Free-Electron Laser (XFEL), completed in 2017 and now fully operational. Before that, in 2005 we launched FLASH, the world’s first free-electron laser in the extreme-ultraviolet and soft X-ray range, and of course in the 1990s HERA, another pioneering machine. Again and again, DESY has succeeded in building large, novel and highly valuable accelerators that have pushed the boundaries of science.

What can we look forward to during your time as chair?

We are currently working on 10 major projects in the next three years alone! PETRA III will be running until the end of 2029, but our goal is to move forward with PETRA IV, the world’s most advanced X-ray source. Securing funding for that first, and then building it, is one of my main objectives. In Germany, there’s a roadmap process, and by July this year we’ll know whether an independent committee has judged PETRA IV to be one of the highest-priority science projects in the country. If all goes well, we aim to begin operating PETRA IV in 2032.

Our FLASH soft X-ray facility is also being upgraded to improve beam quality, and we plan to relaunch it in early September. That will allow us to serve more users and deliver better beam quality, increasing its impact.

In parallel, we’re contributing significantly to the HL-LHC upgrade. More than 100 people at DESY are working on building trackers for the ATLAS and CMS detectors, and parts of the forward calorimeter of CMS. That work needs to be completed by 2028.

Hunting axions

Astroparticle physics is another growing area for us. Over the next three years we’re completing telescopes for the Cherenkov Telescope Array and building detectors for the IceCube upgrade. For the first time, DESY is also constructing a space camera for the satellite UltraSat, which is expected to launch within the next three years.

At the Hamburg site, DESY is diving further into axion research. We’re currently running the ALPS II experiment, which has a fascinating “light shining through a wall” setup. Normally, of course, light can’t pass through something like a thick concrete wall. But in ALPS II, light inside a magnet can convert into an axion, a hypothetical dark-matter particle that can travel through matter almost unhindered. On the other side, another magnet converts the axion back into light. So, it appears as if the light has passed through the wall, when in fact it was briefly an axion. We started the experiment last year. As with most experiments, we began carefully, because not everything works at once, but two more major upgrades are planned in the next two years, and that’s when we expect ALPS II to reach its full scientific potential.
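
As a rough quantitative guide to “light shining through a wall” – a back-of-the-envelope sketch for a massless axion-like particle in natural units, not the detailed ALPS II design – the probability for a photon to convert into an axion (and back) in a magnetic field B over a length L, with photon coupling g_aγ, is:

\[
P_{\gamma \to a} \simeq \left(\frac{g_{a\gamma} B L}{2}\right)^{2},
\qquad
P_{\gamma \to a \to \gamma} = P_{\gamma \to a}\,P_{a \to \gamma} \simeq \left(\frac{g_{a\gamma} B L}{2}\right)^{4}.
\]

Because this regeneration probability is minuscule for the couplings being probed, ALPS II strings together many superconducting dipoles and uses resonant optical cavities on either side of the wall to boost the rate.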

We’re also developing additional axion experiments. One of them, in collaboration with CERN, is called BabyIAXO. It’s designed to look for axions from the Sun, where you have both light and magnetic fields. We hope to start construction before the end of the decade.

Finally, DESY also has a strong and diverse theory group. Their work spans many areas, and it’s exciting to see what ideas will emerge from them over the coming years.

How does DESY collaborate with industry to deliver benefits to society?

We already collaborate quite a lot with industry. The beamlines at PETRA, in particular, are of strong interest. For example, BioNTech conducted some of its research for the COVID-19 vaccine here. We also have a close relationship with the Fraunhofer Society in Germany, which focuses on translating basic research into industrial applications. They famously developed the MP3 format, for instance. Our collaboration with them is quite structured, and there have also been several spinoffs and start-ups based on technology developed at DESY. Looking ahead, we want to significantly strengthen our ties with industry through PETRA IV. With much higher data rates and improved beam quality, it will be far easier to obtain results quickly. Our goal is for 10% of PETRA IV’s capacity to be dedicated to industrial use. Furthermore, we are developing a strong ecosystem for innovation on the campus and the surrounding area, with DESY in the centre, called the Science City Hamburg Bahrenfeld.

What’s your position on “dual use” research, which could have military applications?

The discussion around dual-use research is complicated. Personally, I find the term “dual use” a bit odd – almost any high-tech equipment can be used for both civilian and military purposes. Take a transistor for example, which has countless applications, including military ones, but it wasn’t invented for that reason. At DESY, we’re currently having an internal discussion about whether to engage in projects that relate to defence. This is part of an ongoing process where we’re trying to define under what conditions, if any, DESY would take on targeted projects related to defence. There are a range of views within DESY, and I think that diversity of opinion is valuable. Some people are firmly against this idea, and I respect that. Honestly, it’s probably how I would have felt 10 or 20 years ago. But others believe DESY should play a role. Personally, I’m open to it.

If our expertise can help people defend themselves and our freedom in Europe, that’s something worth considering. Of course, I would love to live in a world without weapons, where no one attacks anyone. But if I were attacked, I’d want to be able to defend myself. I prefer to work on shields, not swords, like in Asterix and Obelix, but, of course, it’s never that simple. That’s why we’re taking time with this. It’s a complex and multifaceted issue, and we’re engaging with experts from peace and security research, as well as the social sciences, to help us understand all dimensions. I’ve already learned far more about this than I ever expected to. We hope to come to a decision on this later this year.

You are DESY’s first female chair. What barriers do you think still exist for women in physics, and how can institutions like DESY address them?

There are two main barriers, I think. The first is that, in my opinion, society at large still discourages girls from going into maths and science.

Certainly in Germany, if you stopped a hundred people on the street, I think most of them would still say that girls aren’t naturally good at maths and science. Of course, there are always exceptions: you do find great teachers and supportive parents who go against this narrative. I wouldn’t be here today if I hadn’t received that kind of encouragement.

That’s why it’s so important to actively counter those messages. Girls need encouragement from an early age, they need to be strengthened and supported. On the encouragement side, DESY is quite active. We run many outreach activities for schoolchildren, including a dedicated school lab. Every year, more than 13,000 school pupils visit our campus. We also take part in Germany’s “Zukunftstag”, where girls are encouraged to explore careers traditionally considered male-dominated, and boys do the same for fields seen as female-dominated.

Looking ahead, we want to significantly strengthen our ties with industry

The second challenge comes later, at a different career stage, and it has to do with family responsibilities. Often, family work still falls more heavily on women than men in many partnerships. That imbalance can hold women back, particularly during the postdoc years, which tend to coincide with the time when many people are starting families. It’s a tough period, because you’re trying to advance your career.

Workplaces like DESY can play a role in making this easier. We offer good childcare options, flexibility with home–office arrangements, and even shared leadership positions, which help make it more manageable to balance work and family life. We also have mentoring programmes. One example is dynaMENT, where female PhD students and postdocs are mentored by more senior professionals. I’ve taken part in that myself, and I think it’s incredibly valuable.

Do you have any advice for early-career women physicists?

If I could offer one more piece of advice, it’s about building a strong professional network. That’s something I’ve found truly valuable. I’m fortunate to have a fantastic international network, both male and female colleagues, including many women in leadership positions. It’s so important to have people you can talk to, who understand your challenges, and who might be in similar situations. So if you’re a student, I’d really recommend investing in your network. That’s very important, I think.

What are your personal reflections on the next-generation colliders?

Our generation has a responsibility to understand the electroweak scale and the Higgs boson. These questions have been around for almost 90 years, since 1935 when Hideki Yukawa explored the idea that forces might be mediated by the exchange of massive particles. While we’ve made progress, a true understanding is still out of reach. That’s what the next generation of machines is aiming to tackle.

The problem, of course, is cost. All the proposed solutions are expensive, and it is very challenging to secure investments for such large-scale projects, even though the return on investment from big science is typically excellent: these projects drive innovation, build high-tech capability and create a highly skilled workforce.

Europe’s role is more vital than ever

From a scientific point of view, the FCC is the most comprehensive option. As a Higgs factory, it offers a broad and strong programme to analyse the Higgs and electroweak gauge bosons. But who knows if we’ll be able to afford it? And it’s not just about money. The timeline and the risks also matter. The FCC feasibility report was just published and is still under review by an expert committee. I’d rather not comment further until I’ve seen the full information. I’m part of the European Strategy Group and we’ll publish a new report by the end of the year. Until then, I want to understand all the details before forming an opinion.

It’s good to have other options too. The muon collider is not yet as technically ready as the FCC or linear collider, but it’s an exciting technology and could be the machine after next. Another could be using plasma-wakefield acceleration, which we’re very actively working on at DESY. It could enable us to build high-energy colliders on a much smaller scale. This is something we’ll need, as we can’t keep building ever-larger machines forever. Investing in accelerator R&D to develop these next-gen technologies is crucial.

Still, I really hope there will be an intermediate machine in the near future, a Higgs factory that lets us properly explore the Higgs boson. There are still many mysteries there. I like to compare it to an egg: you have to crack it open to see what’s inside. And that’s what we need to do with the Higgs.

One thing that is becoming clearer to me is the growing importance of Europe. With the current uncertainties in the US, which are already affecting health and climate research, we can’t assume fundamental research will remain unaffected. That’s why Europe’s role is more vital than ever.

I think we need to build more collaborations between European labs. Sharing expertise, especially through staff exchanges, could be particularly valuable in engineering, where we need a huge number of highly skilled professionals to deliver billion-euro projects. We’ve got one coming up ourselves, and the technical expertise for that will be critical.

I believe science has a key role to play in strengthening Europe, not just culturally, but economically too. It’s an area where we can and should come together.

The post Charting DESY’s future appeared first on CERN Courier.

]]>
Opinion DESY’s new chair, Beate Heinemann, reflects on the laboratory’s evolving role in science and society – from building next-generation accelerators to navigating Europe’s geopolitical landscape. https://cerncourier.com/wp-content/uploads/2025/05/CCMayJun25_INT_Heinemann.jpg
European strategy update: the community speaks https://cerncourier.com/a/european-strategy-update-the-community-speaks/ Mon, 19 May 2025 07:18:23 +0000 https://cerncourier.com/?p=113032 A total of 263 submissions range from individual to national perspectives.

The post European strategy update: the community speaks appeared first on CERN Courier.

]]>
Community input themes of the European Strategy process

The deadline for submitting inputs to the 2026 update of the European Strategy for Particle Physics (ESPP) passed on 31 March. A total of 263 submissions, ranging from individual to national perspectives, express the priorities of the high-energy physics community (see “Community inputs” figure). These inputs will be distilled by expert panels in preparation for an Open Symposium that will be held in Venice from 23 to 27 June (CERN Courier March/April 2025 p11).

Launched by the CERN Council in March 2024, the stated aim of the 2026 update to the ESPP is to develop a visionary and concrete plan that greatly advances human knowledge in fundamental physics, in particular through the realisation of the next flagship project at CERN. The community-wide process, which is due to submit recom­mendations to Council by the end of the year, is also expected to prioritise alternative options to be pursued if the preferred project turns out not to be feasible or competitive.

“We are heartened to see so many rich and varied contributions, in particular the national input and the various proposals for the next large-scale accelerator project at CERN,” says strategy secretary Karl Jakobs of the University of Freiburg, speaking on behalf of the European Strategy Group (ESG). “We thank everyone for their hard work and rigour.”

Two proposals for flagship colliders are at an advanced stage: a Future Circular Collider (FCC) and a Linear Collider Facility (LCF). As recommended in the 2020 strategy update, a feasibility study for the FCC was released on 31 March, describing a 91 km-circumference infrastructure that could host an electron–positron Higgs and electroweak factory followed by an energy-frontier hadron collider at a later stage. Inputs for an electron–positron LCF cover potential starting configurations based on Compact Linear Collider (CLIC) or International Linear Collider (ILC) technologies. It is proposed that the latter LCF could be upgraded using CLIC, Cool Copper Collider, plasma-wakefield or energy-recovery technologies and designs. Other proposals outline a muon collider and a possible plasma-wakefield collider, as well as potential “bridging” projects to a future flagship collider. Among the latter are LEP3 and LHeC, which would site an electron–positron and an electron–proton collider, respectively, in the existing LHC tunnel. For the LHeC, an additional energy-recovery linac would need to be added to CERN’s accelerator complex.

Future choices

In probing beyond the Standard Model and more deeply studying the Higgs boson and its electroweak domain, next-generation colliders will pick up where the High-Luminosity LHC (HL-LHC) leaves off. In a joint submission, the ATLAS and CMS collaborations presented physics projections which suggest that the HL-LHC will be able to: observe the H → µ+µ− and H → Zγ decays of the Higgs boson; observe Standard Model di-Higgs production; and measure the Higgs’ trilinear self-coupling with a precision better than 30%. The joint document also highlights the need for further progress in high-precision theoretical calculations aligned with the demands of the HL-LHC and serves as important input to the discussion on the choice of a future collider at CERN.

Neutrinos and cosmic messengers, dark matter and the dark sector, strong interactions and flavour physics also attracted many inputs, allowing priorities in non-collider physics to complement collider programmes. Underpinning the community’s physics aspirations are numerous submissions in the categories of accelerator science and technology, detector instrumentation and computing. Progress in these technologies is vital for the realisation of a post-LHC collider, which was also reflected by the recommendation of the 2020 strategy update to define R&D roadmaps. The scientific and technical inputs will be reviewed by the Physics Preparatory Group (PPG), which will conduct comparative assessments of the scientific potential of various proposed projects against defined physics benchmarks.

We are heartened to see so many rich and varied contributions

Key to the ESPP 2026 update are 57 national and national-laboratory submissions, including some from outside Europe. Most identify the FCC as the preferred project to succeed the LHC. If the FCC is found to be unfeasible, many national communities propose that a linear collider at CERN should be pursued, while taking into account the global context: a 250 GeV linear collider may not be competitive if China decides to proceed with a Circular Electron Positron Collider at a comparable energy on the anticipated timescale, potentially motivating a higher energy electron–positron machine or a proton–proton collider instead.

Complex process

In its review, the ESG will take the physics reach of proposed colliders as well as other factors into account. This complex process will be undertaken by seven working groups, addressing: national inputs; diversity in European particle physics; project comparison; implementation of the strategy and deliverability of large projects; relations with other fields of physics; sustainability and environmental impact; public engagement, education, communication and social and career aspects for the next generation; and knowledge and technology transfer. “The ESG and the PPG have their work cut out and we look forward to further strong participation by the full community, in particular at the Open Symposium,” says Jakobs.

A briefing book prepared by the PPG based on the community input and discussions at the Open Symposium will be submitted to the ESG by the end of September for consideration during a five-day-long drafting session, which is scheduled to take place from 1 to 5 December. The CERN Council will then review the final ESG recommendations ahead of a special session to be held in Budapest in May 2026.

The post European strategy update: the community speaks appeared first on CERN Courier.

]]>
News A total of 263 submissions range from individual to national perspectives. https://cerncourier.com/wp-content/uploads/2025/05/CCMayJun25_NA_ESPP.png
PhyStat turns 25 https://cerncourier.com/a/phystat-turns-25/ Fri, 16 May 2025 16:31:48 +0000 https://cerncourier.com/?p=112707 On 16 January, physicists and statisticians met in the CERN Council Chamber to celebrate 25 years of the PhyStat series of conferences, workshops and seminars.

The post PhyStat turns 25 appeared first on CERN Courier.

]]>
Confidence intervals

On 16 January, physicists and statisticians met in the CERN Council Chamber to celebrate 25 years of the PhyStat series of conferences, workshops and seminars, which bring together physicists, statisticians and scientists from related fields to discuss, develop and disseminate methods for statistical data analysis and machine learning.

The special symposium heard from the founder and primary organiser of the PhyStat series, Louis Lyons (Imperial College London and University of Oxford), who together with Fred James and Yves Perrin initiated the movement with the “Workshop on Confidence Limits” in January 2000. According to Lyons, the aim of the series was to bring together physicists and statisticians, a philosophy that has been followed and extended throughout the 22 PhyStat workshops and conferences, as well as numerous seminars and “informal reviews”. Speakers called attention to recognition from the Royal Statistical Society’s pictorial timeline of statistics, starting with the use of averages by Hippias of Elis in 450 BC and culminating with the 2012 discovery of the Higgs boson with 5σ significance.
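
The “5σ” attached to the Higgs discovery is shorthand for a tail probability: the chance that background alone fluctuates up to the observed signal, expressed in Gaussian standard deviations. A minimal sketch of the conversion – illustrative only, using SciPy rather than any of the tools discussed at PhyStat – could read:

```python
# Illustrative sketch: converting between a one-sided p-value and a
# Gaussian significance Z (the convention behind "5 sigma" discoveries).
from scipy.stats import norm

def z_to_p(z: float) -> float:
    """Significance in standard deviations -> one-sided p-value."""
    return norm.sf(z)            # survival function, 1 - CDF

def p_to_z(p: float) -> float:
    """One-sided p-value -> significance in standard deviations."""
    return norm.isf(p)           # inverse survival function

print(z_to_p(5.0))      # ~2.9e-07: roughly one chance in 3.5 million
print(p_to_z(2.9e-07))  # ~5.0
```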

Lyons and Bob Cousins (UCLA) offered their views on the evolution of statistical practice in high-energy physics, starting in the 1960s bubble-chamber era, strongly influenced by the 1971 book Statistical Methods in Experimental Physics by W T Eadie et al., its 2006 second edition by symposium participant Fred James (CERN), as well as Statistics for Nuclear and Particle Physics (1985) by Louis Lyons – reportedly the most stolen book from the CERN library. Both Lyons and Cousins noted the interest of the PhyStat community not only in practical solutions to concrete problems but also in foundational questions in statistics, with the focus on frequentist methods setting high-energy physics somewhat apart from the Bayesian approach more widely used in astrophysics.

Giving his view of the PhyStat era, ATLAS physicist and director of the University of Wisconsin Data Science Institute Kyle Cranmer emphasised the enormous impact that PhyStat has had on the field, noting important milestones such as the ability to publish full likelihood models through the statistical package RooStats, the treatment of systematic uncertainties with profile-likelihood ratio analyses, methods for combining analyses, and the reuse of published analyses to place constraints on new physics models. Regarding the next 25 years, Cranmer predicted the increasing use of methods that have emerged from PhyStat, such as simulation-based inference, and pointed out that artificial intelligence (the elephant in the room) could drastically alter how we use statistics.

Statistician Mikael Kuusela (CMU) noted that PhyStat workshops have provided important two-way communication between the physics and statistics communities, citing simulation-based inference as an example where many key ideas were first developed in physics and later adopted by statisticians. In his view, the use of statistics in particle physics has emerged as “phystatistics”, a proper subfield with distinct problems and methods.

Another important feature of the PhyStat movement has been to encourage active participation and leadership by younger members of the community. With its 25th anniversary, the torch is now passed from Louis Lyons to Olaf Behnke (DESY), Lydia Brenner (NIKHEF) and a younger team, who will guide PhyStat into the next 25 years and beyond.

The post PhyStat turns 25 appeared first on CERN Courier.

]]>
Meeting report On 16 January, physicists and statisticians met in the CERN Council Chamber to celebrate 25 years of the PhyStat series of conferences, workshops and seminars. https://cerncourier.com/wp-content/uploads/2025/03/CCMarApr25_FN_phystat_feature.jpg
Game on for physicists https://cerncourier.com/a/game-on-for-physicists/ Wed, 26 Mar 2025 14:35:42 +0000 https://cerncourier.com/?p=112787 Raphael Granier de Cassagnac discusses opportunities for particle physicists in the gaming industry.

The post Game on for physicists appeared first on CERN Courier.

]]>
Raphael Granier de Cassagnac and Exographer

“Confucius famously may or may not have said: ‘When I hear, I forget. When I see, I remember. When I do, I understand.’ And computer-game mechanics can be inspired directly by science. Study it well, and you can invent game mechanics that allow you to engage with and learn about your own reality in a way you can’t when simply watching films or reading books.”

So says Raphael Granier de Cassagnac, a research director at France’s CNRS and Ecole Polytechnique, as well as a member of the CMS collaboration at the LHC. Granier de Cassagnac is also the creative director of Exographer, a science-fiction computer game that draws on concepts from particle physics and is available on Steam, Switch, PlayStation 5 and Xbox.

“To some extent, it’s not too different from working at a place like CMS, which is also a super complicated object,” explains Granier de Cassagnac. Developing a game often requires graphic artists, sound designers, programmers and science advisors. To keep a detector like CMS running, you need engineers, computer scientists, accelerator physicists and funding agencies. And that’s to name just a few. Even if you are not the primary game designer or principal investigator, understanding the fundamentals is crucial to keep the project running efficiently.

Root skills

Most physicists already have some familiarity with structured programming and data handling, which eases the transition into game development. Just as tools like ROOT and Geant4 serve as libraries for simulating and analysing particle collisions, game engines such as Unreal, Unity or Godot provide a foundation for building games. Prebuilt functionalities are used to refine the game mechanics.

“Physicists are trained to have an analytical mind, which helps when it comes to organising a game’s software,” explains Granier de Cassagnac. “The engine is merely one big library, and you never have to code anything super complicated, you just need to know how to use the building blocks you have and code in smaller sections to optimise the engine itself.”
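
To make the “engine as library” point concrete, here is a toy sketch of a game loop built on the open-source pygame package – chosen purely for brevity; it is not one of the engines named above and has nothing to do with Exographer. The library supplies the window, clock, event queue and drawing calls, while the developer writes the game-specific logic:

```python
# Toy example: the engine provides building blocks (window, events, drawing,
# timing); the "game mechanic" here is just a square bouncing back and forth.
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()
x, dx = 0, 4                                  # position and speed of the square

running = True
while running:
    for event in pygame.event.get():          # engine-provided input handling
        if event.type == pygame.QUIT:
            running = False

    x += dx                                   # game-specific update step
    if x < 0 or x > 600:
        dx = -dx                              # bounce off the window edges

    screen.fill((0, 0, 0))                    # engine-provided drawing calls
    pygame.draw.rect(screen, (200, 200, 50), (x, 220, 40, 40))
    pygame.display.flip()
    clock.tick(60)                            # cap the loop at 60 frames/s

pygame.quit()
```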

While coding is an essential skill for game production, it is not enough to create a compelling game. Game design demands storytelling, character development and world-building. Structure, coherence and the ability to guide an audience through complex information are also required.

“Some games are character-driven, others focus more on the adventure or world-building,” says Granier de Cassagnac. “I’ve always enjoyed reading science fiction and playing role-playing games like Dungeons and Dragons, so writing for me came naturally.”

Entrepreneurship and collaboration are also key skills, as it is increasingly rare for developers to create games independently. Universities and startup incubators can provide valuable support through funding and mentorship. Incubators can help connect entrepreneurs with industry experts, and bridge the gap between scientific research and commercial viability.

“Managing a creative studio and a company, as well as selling the game, was entirely new for me,” recalls Granier de Cassagnac. “While working at CMS, we always had long deadlines and low pressure. Physicists are usually not prepared for the speed of the industry at all. Specialised offices in most universities can help with valorisation – taking scientific research and putting it on the market. You cannot forget that your academic institutions are still part of your support network.”

Though challenging to break into, opportunity abounds for those willing to upskill

The industry is fiercely competitive, with more games being released than players can consume, but a well-crafted game with a unique vision can still break through. A common mistake made by first-time developers is releasing their game too early. No matter how innovative the concept or engaging the mechanics, a game riddled with bugs frustrates players and damages its reputation. Even with strong marketing, a rushed release can lead to negative reviews and refunds – sometimes sinking a project entirely.

“In this industry, time is money and money is time,” explains Granier de Cassagnac. But though challenging to break into, opportunity abounds for those willing to upskill, with the gaming industry worth almost $200 billion a year and reaching more than three billion players worldwide by Granier de Cassagnac’s estimation. The most important aspects for making a successful game are originality, creativity, marketing and knowing the engine, he says.

“Learning must always be part of the process; without it we cannot improve,” adds Granier de Cassagnac, referring to his own upskilling for the company’s next project, which will be even more ambitious in its scientific coverage. “In the next game we want to explore the world as we know it, from the Big Bang to the rise of technology. We want to tell the story of humankind.”

The post Game on for physicists appeared first on CERN Courier.

]]>
Careers Raphael Granier de Cassagnac discusses opportunities for particle physicists in the gaming industry. https://cerncourier.com/wp-content/uploads/2025/03/CCMarApr25_CAREERS_Garnier_feature.jpg
Salam’s dream visits the Himalayas https://cerncourier.com/a/salams-dream-visits-the-himalayas/ Wed, 26 Mar 2025 14:28:34 +0000 https://cerncourier.com/?p=112728 The BCVSPIN programme aims to facilitate interactions between researchers from Bangladesh, China, Vietnam, Sri Lanka, Pakistan, India and Nepal and the broader international community.

The post Salam’s dream visits the Himalayas appeared first on CERN Courier.

]]>
After winning the Nobel Prize in Physics in 1979, Abdus Salam wanted to bring world-class physics research opportunities to South Asia. This was the beginning of the BCSPIN programme, encompassing Bangladesh, China, Sri Lanka, Pakistan, India and Nepal. The goal was to provide scientists in South and Southeast Asia with new opportunities to learn from leading experts about developments in particle physics, astroparticle physics and cosmology. Together with Jogesh Pati, Yu Lu and Qaisar Shafi, Salam initiated the programme in 1989. This first edition was hosted by Nepal. Vietnam joined in 2009 and BCSPIN became BCVSPIN. Over the years, the conference has been held as far afield as Mexico.

The most recent edition attracted more than 100 participants to the historic Hotel Shanker in Kathmandu, Nepal, from 9 to 13 December 2024. The conference aimed to facilitate interactions between researchers from BCVSPIN countries and the broader international community, covering topics such as collider physics, cosmology, gravitational waves, dark matter, neutrino physics, particle astrophysics, physics beyond the Standard Model and machine learning. Participants ranged from renowned professors from across the globe to aspiring students.

Speaking of aspiring students, the main event was preceded by the BCVSPIN-2024 Masterclass in Particle Physics and Workshop in Machine Learning, hosted at Tribhuvan University from 4 to 6 December. The workshop provided 34 undergraduate and graduate students from around Nepal with a comprehensive introduction to particle physics, high-energy physics (HEP) experiments and machine learning. In addition to lectures, the workshop engaged students in hands-on sessions, allowing them to experience real research by exploring core concepts and applying machine-learning techniques to data from the ATLAS experiment. The students’ enthusiasm was palpable as they delved into the intricacies of particle physics and machine learning. The interactive sessions were particularly engaging, with students eagerly participating in discussions and practical exercises. Highlights included a special talk on artificial intelligence (AI) and a career development session focused on crafting CVs, applications and research statements. These sessions ensured participants were equipped with both academic insights and practical guidance. The impact on students was profound, as they gained valuable skills and networking opportunities, preparing them for future careers in HEP.

The BCVSPIN conference officially started the following Monday. In the spirit of BCVSPIN, the first plenary session featured an insightful talk on the status and prospects of HEP in Nepal, providing valuable context for both locals and newcomers to the initiative. Then, the latest and the near-future physics highlights of experiments such as ATLAS, ALICE, CMS, as well as Belle, DUNE and IceCube, were showcased. From physics performance such as ATLAS nailing b-tagging with graph neural networks, to the most elaborate measurement of the W-boson mass by CMS, not to mention ProtoDUNE’s runs exceeding expectations, the audience were offered comprehensive reviews of the recent breakthroughs on the experimental side. The younger physicists willing to continue or start hardware efforts surely appreciated the overview and schedule of the different upgrade programmes. The theory talks covered, among others, dark-matter models, our dear friend the neutrino and the interactions between the two. A special talk on AI invited the audience to reflect on what AI really is and how – in the midst of the ongoing revolution – it impacts the fields of physics and physicists themselves. Overviews of long-term future endeavours such as the Electron–Ion Collider and the Future Circular Collider concluded the programme.

BCVSPIN offers younger scientists precious connections with physicists from the international community

A special highlight of the conference was a public lecture “Oscillating Neutrinos” by the 2015 Nobel Laureate Takaaki Kajita. The event was held near the historical landmark of Patan Durbar Square, in the packed auditorium of the Rato Bangala School. This centre of excellence is known for its innovative teaching methods and quality instruction. More than half the room was filled with excited students from schools and universities, eager to listen to the keynote speaker. After a very pedagogical introduction explaining the “problem of solar neutrinos”, Kajita shared his insights on the discovery of neutrino oscillations and its implications for our understanding of the universe. His presentation included historical photographs of the experiments in Kamioka, Japan, as well as his participation at BCVSPIN in 1994. After encouraging the students to become scientists and answering as many questions as time allowed, he was swept up in a crowd of passionate Nepali youth, thrilled to be in the presence of such a renowned physicist.

The BCVSPIN initiative has changed the landscape of HEP in South and Southeast Asia. With participation made affordable for students, it is a stepping stone for the younger generation of scientists, offering them precious connections with physicists from the international community.

The post Salam’s dream visits the Himalayas appeared first on CERN Courier.

]]>
Meeting report The BCVSPIN programme aims to facilitate interactions between researchers from Bangladesh, China, Vietnam, Sri Lanka, Pakistan, India and Nepal and the broader international community. https://cerncourier.com/wp-content/uploads/2025/03/CCMarApr25_FN_BCVSPIN.jpg
Space oddities https://cerncourier.com/a/space-oddities/ Wed, 26 Mar 2025 14:11:01 +0000 https://cerncourier.com/?p=112823 In his new popular book, Harry Cliff tackles the thorny subject of anomalies in fundamental science.

The post Space oddities appeared first on CERN Courier.

]]>
Space Oddities

Space Oddities takes readers on a journey through the mysteries of modern physics, from the smallest subatomic particles to the vast expanse of stars and space. Harry Cliff – an experimental particle physicist at Cambridge University – unravels some of the most perplexing anomalies challenging the Standard Model (SM), with behind-the-scenes scoops from eight different experiments. The most intriguing stories concern lepton universality and the magnetic moment of the muon.

Theory predicts the muon’s magnetic moment with extraordinary precision, and experiment has verified it to an astonishing 11 significant figures. Over the last few years, however, measurements have suggested a slight discrepancy – the devil lying in the 12th digit. Measurements at Fermilab in 2021 disagreed with the theory prediction at the 4σ level. Not enough to cause a “scientific earthquake”, as Cliff puts it, but enough to suggest that new physics might be at play.
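
For readers curious where a figure like 4σ comes from, the significance is simply the gap between measurement and prediction divided by their combined uncertainty. As a rough illustration – using the 2021 experimental world average (Brookhaven plus Fermilab) and the 2020 data-driven theory consensus for the anomalous moment a_μ = (g − 2)/2, quoted here from the published results rather than from the book – the arithmetic runs:

\[
a_\mu^{\mathrm{exp}} = 116\,592\,061(41)\times 10^{-11},
\qquad
a_\mu^{\mathrm{SM}} = 116\,591\,810(43)\times 10^{-11},
\]
\[
\Delta a_\mu \simeq 251\times 10^{-11},
\qquad
\frac{\Delta a_\mu}{\sqrt{41^{2}+43^{2}}\times 10^{-11}} \approx 4.2\,\sigma.
\]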

Just as everything seemed to be edging towards a new discovery, Cliff introduces the “villains” of the piece. Groundbreaking lattice–QCD predictions from the Budapest–Marseille–Wuppertal collaboration were published on the same day as a new measurement from Fermilab. If correct, these would destroy the anomaly by contradicting the data-driven theory consensus. (“Yeah, bullshit,” said one experimentalist to Cliff when put to him that the timing wasn’t intended to steal the experiment’s thunder.) The situation is still unresolved, though many new theoretical predictions have been made and a new theoretical consensus is imminent (see “Do muons wobble faster than expected“). Regardless of the outcome, Cliff emphasises that this research will pave the way for future discoveries, and none of it should be taken for granted – even if the anomaly disappears.

“One of the challenging aspects of being part of a large international project is that your colleagues are both collaborators and competitors,” Cliff notes. “When it comes to analysing the data with the ultimate goal of making discoveries, each research group will fight to claim ownership of the most interesting topics.”

This spirit of spurring collaborator- competitors on to greater heights of precision is echoed throughout Cliff’s own experience of working in the LHCb collaboration, where he studies “lepton universality”. All three lepton flavours – electron, muon and tau – should interact almost identically, except for small differences due to their masses. However, over the past decade several experimental results suggested that this theory might not hold in B-meson decays, where muons seemed to be appearing less frequently than electrons. If confirmed, this would point to physics beyond the SM.

Having been involved himself in a complementary but less sensitive analysis of B-meson decay channels involving strange quarks, Cliff recalls the emotional rollercoaster experienced by some of the key protagonists: the “RK” team from Imperial College London. After a year of rigorous testing, the RK team unblinded a sanity check of their new computational toolkit: a reanalysis of the prior measurement, which yielded a perfectly consistent RK value of 0.72 with an uncertainty of about 0.08, upholding a 3σ discrepancy. Now was the time to put the data collected since then through the same pasta machine: if it agreed, the tension between the SM and their overall measurement would cross the 5σ threshold. After an anxious wait while the numbers were crunched, the team received the results for the new data: 0.93 with an uncertainty of 0.09.

“Dreams of a major discovery evaporated in an instant,” recalls Cliff. “Anyone who saw the RK team in the CERN cafeteria that day could read the result from their faces.” The lead on the RK team, Mitesh Patel, told Cliff that they felt “emotionally train wrecked”.

One day we might make the right mistake and escape the claustrophobic clutches of the SM

With both results combined, the ratio averaged out to 0.85 ± 0.06, just shy of 3σ away from unity. While the experimentalists were deflated, Cliff notes that for theorists this result may have been more exciting than the initial anomaly, as it was easier to explain using new particles or forces. “It was as if we were spying the footprints of a great, unknown beast as it crashed about in a dark jungle,” writes Cliff.

Space Oddities is a great defence of irrepressible experimentation. Even “failed” anomalies are far from useless: if they evaporate, the effort required to investigate them pushes the boundaries of experimental precision, enhances collaboration between scientists across the world, and refines theoretical frameworks. Through retellings and interviews, Cliff helps the public experience the excitement of near breakthroughs, the heartbreak of failed experiments, and the dynamic interactions between theoretical and experimental physicists. Dispelling the myth that physicists are cold, calculating figures working in isolation, Cliff sheds light on a community driven by curiosity, ambition and (healthy) competition. His book is a story of hope that one day we might make the right mistake and escape the claustrophobic clutches of the SM.

“I’ve learned so much from my mistakes,” read a poster above Cliff’s undergraduate tutor’s desk. “I think I’ll make another.”

The post Space oddities appeared first on CERN Courier.

]]>
Review In his new popular book, Harry Cliff tackles the thorny subject of anomalies in fundamental science. https://cerncourier.com/wp-content/uploads/2025/03/CCMarApr25_REV_Space_feature.jpg
Encounters with artists https://cerncourier.com/a/encounters-with-artists/ Wed, 26 Mar 2025 13:45:55 +0000 https://cerncourier.com/?p=112793 Over the past 10 years, Mónica Bello facilitated hundreds of encounters between artists and scientists as curator of the Arts at CERN programme.

The post Encounters with artists appeared first on CERN Courier.

]]>
Why should scientists care about art?

Throughout my experiences in the laboratory, I have seen how art is an important part of a scientist’s life. By being connected with art, scientists recognise that their activities are very embedded in contemporary culture. Science is culture. Through art and dialogues with artists, people realise how important science is for society and for culture in general. Science is an important cultural pillar in our society, and these interactions bring scientists meaning.

Are science and art two separate cultures?

Today, if you ask anyone: “What is nature?” they describe everything in scientific terms. The way you describe things, the mysteries of your research: you are actually answering the questions that are present in everyone’s life. In this case, scientists have a sense of responsibility. I think art helps to open this dialogue from science into society.

Do scientists have a responsibility to communicate their research?

All of us have a social responsibility in everything we produce. Ideas don’t belong to anyone, so it’s a collective endeavour. I think that scientists don’t have the responsibility to communicate the research themselves, but that their research cannot be isolated from society. I think it’s a very joyful experience to see that someone cares about what you do.

Why should artists care about science?

If you go to any academic institution, there’s always a scientific component, very often also a technological one. A scientific aspect of your life is always present. This is happening because we’re all on the same course. It’s a consequence of this presence of science in our culture. Artists have an important role in our society, and they help to spark conversations that are important to everyone. Sometimes it might seem as though they are coming from a very individual lens, but in fact they have a very large reach and impact. Not immediately, not something that you can count with data, but there is definitely an impact. Artists open these channels for communicating and thinking about a particular aspect of science, which is difficult to see from a scientific perspective. Because in any discipline, it’s amazing to see your activity from the eyes of others.

Creativity and curiosity are the parameters and competencies that make up artists and scientists

A few years back we did a little survey, and most of the scientists thought that by spending time with artists, they took a step back to think about their research from a different lens, and this changed their perspective. They thought of this as a very positive experience. So I think art is not only about communicating to the public, but about exploring the personal synergies of art and science. This is why artists are so important.

Do experimental and theoretical physicists have different attitudes towards art?

Typically, we think that theorists are much more open to artists, but I don’t agree. In my experiences at CERN, I found many engineers and experimental physicists being highly theoretical. Both value artistic perspectives and their ability to consider questions and scientific ideas in an unconventional way. Experimental physicists would emphasise engagement with instruments and data, while theoretical physicists would focus on conceptual abstraction.

By being with artists, many experimentalists feel that they have the opportunity to talk about things beyond their research. For example, we often talk about the “frontiers of knowledge”. When asked about this, experimentalists or theoretical physicists might tell us about something other than particle physics – like neuroscience, or the brain and consciousness. A scientist is a scientist. They are very curious about everything.

Do these interactions help to blur the distinction between art and science?

Well, here I’m a bit radical because I know that creativity is something we define. Creativity and curiosity are the parameters and competencies that make up artists and scientists. But to become a scientist or an artist you need years of training – it’s not that you can become one just because you are a curious and creative person.

Chroma VII work of art

Not many people can chat about particle physics, but scientists very often chat with artists. I saw artists speaking for hours with scientists about the Higgs field. When you see two people speaking about the same thing, but with different registers, knowledge and background, it’s a precious moment.

When facilitating these discussions between physicists and artists, we don’t speak only about physics, but about everything that worries them. Through that, grows a sort of intimacy that often becomes something else: a friendship. This is the point at which a scientist stops being an information point for an artist and becomes someone who deals with big questions alongside an artist – who is also a very knowledgeable and curious person. This is a process rich in contrast, and you get many interesting surprises out of these interactions.

But even in this moment, they are still artists and scientists. They don’t become this blurred figure that can do anything.

Can scientific discovery exist without art?

That’s a very tricky question. I think that art is a component of science, therefore science cannot exist without art – without the qualities that the artist and scientist have in common. To advance science, you have to create a question that needs to be answered experimentally.

Did discoveries in quantum mechanics affect the arts?

Everything is subjected to quantum mechanics. Maybe what it changed was an attitude towards uncertainty: what we see and what we think is there. There was an increased sense of doubt and general uncertainty in the arts.

Do art and science evolve together or separately?

I think there have been moments of convergence – you can clearly see it in any of the avant garde. The same applies to literature; for example, modernist writers showed a keen interest in science. Poets such as T S Eliot approached poetry with a clear resonance of the first scientific revolutions of the century. There are references to the contributions of Faraday, Maxwell and Planck. You can tell these artists and poets were informed and eager to follow what science was revealing about the world.

You can also note the influence of science in music, as physicists get a better understanding of the physical aspects of sound and matter. Physics became less about viewing the world through a lens, and instead focused on the invisible: the vibrations of matter, electricity, the innermost components of materials. At the end of the 19th century and into the 20th, these examples crop up constantly. It’s not just representing the world as you see it through a particular lens, but being involved in the phenomena of the world and these uncensored realities.

From the 1950s to the 1970s you can see these connections in every single moment. Science is very present in the work of artists, but my feeling is that we don’t have enough literature about it. We really need to conduct more research on this connection between humanities and science.

What are your favourite examples of art influencing science?

Feynman diagrams are one example. Feynman was amazing – a prodigy. Many people before him tried to represent things that escaped our intuition visually and failed. We also have the Pauli Archives here at CERN. Pauli was not the most popular father of quantum mechanics, but he was determined to not only understand mathematical equations but to visualise them, and share them with his friends and colleagues. This sort of endeavour goes beyond just writing – it is about the possibility of creating a tangible experience. I think scientists do that all the time by building machines, and then by trying to understand these machines statistically. I see that in the laboratory constantly, and it’s very revealing because usually people might think of these statistics as something no one cares about – that the visuals are clumsy and nerdy. But they’re not.

Even Leonardo da Vinci was known as a scientist and an artist, but his anatomical sketches were not discovered until hundreds of years after his other works. Newton was also paranoid about expressing his true scientific theories because of the social standards and politics of the time. His views were unorthodox, and he did not want to ruin his prestigious reputation.

Today’s culture also influences how we interpret history. We often think of Aristotle as a philosopher, yet he is also recognised for contributions to natural history. The same with Democritus, whose ideas laid foundations for scientific thought.

So I think that opening laboratories to artists is very revealing about the influence of today’s culture on science.

When did natural philosophy branch out into art and science?

I believe it was during the development of the scientific method: observation, analysis and the evolution of objectivity. The departure point was definitely when we developed a need to be objective. It took centuries to get where we are now, but I think there is a clear division: a line with philosophy, natural philosophy and natural history on one side, and modern science on the other. Today, I think art and science have different purposes. They convene at different moments, but there is always this detour. Some artists are very scientific minded, and some others are more abstract, but they are both bound to speculate massively.

It’s really good news for everyone that labs want to include non-scientists

For example, at our Arts at CERN programme we have had artists who were interested in niche scientific aspects. Erich Berger, an artist from Finland, was interested in designing a detector, and scientists whom he met kept telling him that he would need to calibrate the detector. The scientist and the artist here had different goals. For the scientist, the most important thing is that the detector has precision in the greatest complexity. And for the artist, it’s not. It’s about the process of creation, not the analysis.

Do you think that science is purely an objective medium while art is a subjective one?

No. It’s difficult to define subjectivity and objectivity. But art can be very objective. Artists create artefacts to convey their intended message. It’s not that these creations are standing alone without purpose. No, we are beyond that. Now art seeks meaning that is, in this context, grounded in scientific and technological expertise.

How do you see the future of art and science evolving?

There are financial threats to both disciplines. We are still in this moment where things look a bit bleak. But I think our programme is pioneering, because many scientific labs are developing their own arts programmes inspired by the example of Arts at CERN. This is really great, because unless you are in a laboratory, you don’t see what doing science is really about. We usually read science in the newspapers or listen to it on a podcast – everything is very much oriented to the communication of science, but making science is something very specific. It’s really good news for everyone that laboratories want to include non-scientists. Arts at CERN works mostly with visual artists, but you could imagine filmmakers, philosophers, those from the humanities, poets or almost anyone at all, depending on the model that one wants to create in the lab.

The post Encounters with artists appeared first on CERN Courier.

Opinion Over the past 10 years, Mónica Bello facilitated hundreds of encounters between artists and scientists as curator of the Arts at CERN programme. https://cerncourier.com/wp-content/uploads/2025/03/CCMarApr25_INT_Bello.jpg
How to unfold with AI https://cerncourier.com/a/how-to-unfold-with-ai/ Mon, 27 Jan 2025 08:00:50 +0000 https://cerncourier.com/?p=112161 Inspired by high-dimensional data and the ideals of open science, high-energy physicists are using artificial intelligence to reimagine the statistical technique of ‘unfolding’.

The post How to unfold with AI appeared first on CERN Courier.

Open-science unfolding

All scientific measurements are affected by the limitations of measuring devices. To make a fair comparison between data and a scientific hypothesis, theoretical predictions must typically be smeared to approximate the known distortions of the detector. Data is then compared with theory at the level of the detector’s response. This works well for targeted measurements, but the detector simulation must be reapplied to the underlying physics model for every new hypothesis.
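As a toy illustration of this forward-folding workflow (the spectrum, binning and Gaussian response below are invented for the sketch; a real analysis would take the response from a full detector simulation):

import numpy as np

# Toy "theory" prediction: an exponentially falling spectrum in 10 bins (hypothetical).
bins = np.linspace(0.0, 5.0, 11)
centres = 0.5 * (bins[:-1] + bins[1:])
theory = 1e4 * np.exp(-centres)

# Hypothetical detector response: each true bin leaks into neighbouring
# reconstructed bins with Gaussian weights; columns are normalised so that
# R[i, j] approximates P(reconstructed bin i | true bin j).
resolution = 0.5
R = np.exp(-0.5 * ((centres[:, None] - centres[None, :]) / resolution) ** 2)
R /= R.sum(axis=0, keepdims=True)

# "Fold" the prediction to detector level and compare with data there,
# repeating the smearing for every new theoretical hypothesis.
smeared_theory = R @ theory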

The alternative is to try to remove detector distortions from the data, and compare with theoretical predictions at the level of the theory. Once detector effects have been “unfolded” from the data, analysts can test any number of hypotheses without having to resimulate or re-estimate detector effects – a huge advantage for open science and data preservation that allows comparisons between datasets from different detectors. Physicists without access to the smearing functions can only use unfolded data.

No simple task

But unfolding detector distortions is no simple task. If the mathematical problem is solved through a straightforward matrix inversion, statistical fluctuations are amplified, resulting in large uncertainties. Some sort of “regularisation” must be imposed to smooth the fluctuations, but algorithms vary substantially and none is preeminent. Their scope has also remained limited for decades: no traditional algorithm is capable of reliably unfolding detector distortions from data in more than a few observables at a time.

In the past few years, a new technique has emerged. Rather than unfolding detector effects from only one or two observables, it can unfold detector effects from multiple observables in a high-dimensional space; and rather than unfolding detector effects from binned histograms, it unfolds detector effects from an unbinned distribution of events. This technique is inspired by both artificial-intelligence techniques and the uniquely sparse and high-dimensional data sets of the LHC.

An ill-posed problem

Unfolding is used in many fields. Astronomers unfold point-spread functions to reveal true sky distributions. Medical physicists unfold detector distortions from CT and MRI scans. Geophysicists use unfolding to infer the Earth’s internal structure from seismic-wave data. Economists attempt to unfold the true distribution of opinions from incomplete survey samples. Engineers use deconvolution methods for noise reduction in signal processing. But in recent decades, no field has had a greater need to innovate unfolding techniques than high-energy physics, given its complex detectors, sparse datasets and stringent standards for statistical rigour.

In traditional unfolding algorithms, analysts first choose which quantity they are interested in measuring. An event generator then creates a histogram of the true values of this observable for a large sample of events. Next, a Monte Carlo simulation of the detector response accounts for noise, background modelling, acceptance effects, reconstruction errors, misidentification errors and energy smearing. A matrix is constructed that transforms the histogram of the true values of the observable into the histogram of detector-level events. Finally, analysts “invert” the matrix and apply it to data, to unfold detector effects from the measurement.
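A minimal sketch of this traditional pipeline, with an invented exponential spectrum and Gaussian smearing standing in for the event generator and detector simulation, makes the instability of the final step explicit:

import numpy as np

rng = np.random.default_rng(seed=1)
bins = np.linspace(0.0, 5.0, 11)

# Hypothetical event generator: true values of the chosen observable.
true_mc = rng.exponential(scale=1.0, size=500_000)
# Hypothetical detector simulation: Gaussian smearing of each true value.
reco_mc = true_mc + rng.normal(0.0, 0.5, size=true_mc.size)

# Response matrix from the paired sample: R[i, j] = P(reco bin i | true bin j).
counts, _, _ = np.histogram2d(reco_mc, true_mc, bins=(bins, bins))
R = counts / counts.sum(axis=0, keepdims=True)

# Pseudo-data: an independent sample folded through the same detector model.
true_data = rng.exponential(scale=1.0, size=20_000)
reco_data = true_data + rng.normal(0.0, 0.5, size=true_data.size)
data_hist, _ = np.histogram(reco_data, bins=bins)

# Naive unfolding by matrix inversion: formally exact on average, but the
# statistical noise in data_hist is amplified into large, oscillating
# (even negative) bin contents -- hence the need for regularisation.
unfolded_naive = np.linalg.solve(R, data_hist.astype(float))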

How to unfold traditionally

Diverse algorithms have been invented to unfold distortions from data, with none yet achieving preeminence.

• Developed by Soviet mathematician Andrey Tikhonov in the late 1940s, Tikhonov regularisation (TR) frames unfolding as a minimisation problem with a penalty term added to suppress fluctuations in the solution.

• In the 1950s, the physicist Edwin Jaynes took inspiration from information theory to seek solutions with maximum entropy, minimising bias beyond the constraints imposed by the data.

• Between the 1960s and the 1990s, high-energy physicists increasingly drew on the linear algebra of 19th-century mathematicians Eugenio Beltrami and Camille Jordan to develop singular value decomposition as a pragmatic way to suppress noisy fluctuations.

• In the 1990s, Giulio D’Agostini and other high-energy physicists developed iterative Bayesian unfolding (IBU) – a similar technique to Lucy–Richardson deconvolution, which was developed independently in astronomy in the 1970s. An explicitly probabilistic approach well suited to complex detectors, IBU may be considered a forerunner of the neural-network-based technique described in this article.

IBU and TR are the most widely-used approaches in high-energy physics today, with the RooUnfold tool started by Tim Adye serving countless analysts.

At this point in the analysis, the ill-posed nature of the problem presents a major challenge. A simple matrix inversion seldom suffices as statistical noise produces large changes in the estimated input. Several algorithms have been proposed to regularise these fluctuations. Each comes with caveats and constraints, and there is no consensus on a single method that outperforms the rest (see “How to unfold traditionally” panel).
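As one concrete example of a regularised method, iterative Bayesian unfolding can be sketched in a few lines. The function below assumes a response matrix and data histogram like those built in the sketch above; the number of iterations, fixed arbitrarily here, acts as the regularisation strength:

import numpy as np

def ibu(R, data, prior, n_iterations=4):
    # R[i, j]  : P(reco bin i | true bin j), e.g. built from paired simulation.
    # data[i]  : observed counts in reco bin i.
    # prior[j] : starting guess for the true spectrum (often the simulated truth).
    # Stopping after a few iterations regularises the result: more iterations
    # reduce the bias towards the prior but let statistical noise grow.
    truth = np.asarray(prior, dtype=float).copy()
    for _ in range(n_iterations):
        # Bayes' theorem with the current truth estimate:
        # M[i, j] = P(true bin j | reco bin i).
        M = R * truth[None, :]
        M /= M.sum(axis=1, keepdims=True)
        # Redistribute the observed counts back to truth level.
        truth = M.T @ data
    return truth

# Example use, assuming objects like those in the previous sketch:
# prior = np.histogram(true_mc, bins=bins)[0]
# unfolded_ibu = ibu(R, data_hist, prior)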

While these approaches have been successfully applied to thousands of measurements at the LHC and beyond, they have limitations. Histogramming is an efficient way to describe the distributions of one or two observables, but the number of bins grows exponentially with the number of parameters, restricting the number of observables that can be simultaneously unfolded. When unfolding only a few observables, model dependence can creep in, for example due to acceptance effects, and if another scientist wants to change the bin sizes or measure a different observable, they will have to redo the entire process.

New possibilities

AI opens up new possibilities for unfolding particle-physics data. Choosing good parameterisations in a high-dimensional space is difficult for humans, and binning is a way to limit the number of degrees of freedom in the problem, making it more tractable. Machine learning (ML) offers flexibility due to the large number of parameters in a deep neural network. Dozens of observables can be unfolded at once, and unfolded datasets can be published as an unbinned collection of individual events that have been corrected for detector distortions as an ensemble.

Unfolding performance

One way to represent the result is as a set of simulated events with weights that encode information from the data. For example, if there are 10 times as many simulated events as real events, the average weight would be about 0.1, with the distribution of weights correcting the simulation to match reality, and errors on the weights reflecting the uncertainties inherent in the unfolding process. This approach gives maximum flexibility to future analysts, who can recombine them into any binning or combination they desire. The weights can be used to build histograms or compute statistics. The full covariance matrix can also be extracted from the weights, which is important for downstream fits.
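In practice, a downstream analyst receiving such an unbinned, weighted result might work with it roughly as follows. The file names and binning are placeholders, and how the ensemble of weight vectors used for the covariance is produced (for example bootstrap resampling or repeated trainings) depends on the analysis:

import numpy as np

def rebin_unfolded(values, weights, bins):
    # Histogram an unbinned, weighted unfolded sample in any user-chosen binning.
    hist, _ = np.histogram(values, bins=bins, weights=weights)
    return hist

def covariance_from_replicas(values, weight_replicas, bins):
    # Bin-to-bin covariance from an ensemble of weight vectors,
    # one vector per replica (shape: n_replicas x n_events).
    hists = np.array([rebin_unfolded(values, w, bins) for w in weight_replicas])
    return np.cov(hists, rowvar=False)

# Placeholder usage:
# pt = np.load("unfolded_events_pt.npy")        # hypothetical file of event values
# w = np.load("unfolded_weights.npy")           # hypothetical file of unfolding weights
# spectrum = rebin_unfolded(pt, w, np.linspace(0, 200, 21))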

But how do we know the unfolded values are capturing the truth, and not just “hallucinations” from the AI model?

An important validation step for these analyses is to perform tests on synthetic data with a known answer. Analysts take new simulation models, different from the one being used for the primary analysis, and treat them as if they were real data. By unfolding these alternative simulations, researchers are able to compare their results to a known answer. If the biases are large, analysts will need to refine their methods to reduce the model dependence. If the biases are small compared to the other uncertainties, then the remaining difference can be added into the total uncertainty estimate, which is calculated in the traditional way using hundreds of simulations. In unfolding problems, the choice of regularisation method and strength always involves some tradeoff between bias and variance.

Just as unfolding in two dimensions instead of one with traditional methods can reduce model dependence by incorporating more aspects of the detector response, ML methods use the same underlying principle to include as much of the detector response as possible. Learning differences between data and simulation in high-dimensional spaces is the kind of task that ML excels at, and the results are competitive with established methods (see “Better performance” figure).

Neural learning

In the past few years, AI techniques have proven to be useful in practice, yielding publications from the LHC experiments, the H1 experiment at HERA and the STAR experiment at RHIC. The key idea underpinning the strategies used in each of these results is to use neural networks to learn a function that can reweight simulated events to look like data. The neural network is given a list of relevant features about an event such as the masses, energies and momenta of reconstructed objects, and trained to output the probability that it is from a Monte Carlo simulation or the data itself. Neural connections that reweight and combine the inputs across multiple layers are iteratively adjusted depending on the network’s performance. The network thereby learns the relative densities of the simulation and data throughout phase space. The ratio of these densities is used to transform the simulated distribution into one that more closely resembles real events (see “OmniFold” figure).
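The reweighting step at the heart of this strategy can be sketched with a generic classifier; here a small scikit-learn network stands in for the deep neural networks used in practice, the feature arrays are placeholders, and a full OmniFold-style analysis iterates this step between detector level and particle level:

import numpy as np
from sklearn.neural_network import MLPClassifier

def learn_reweighting(sim_features, data_features):
    # Train a classifier to distinguish simulation (label 0) from data (label 1).
    X = np.vstack([sim_features, data_features])
    y = np.concatenate([np.zeros(len(sim_features)), np.ones(len(data_features))])
    clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=200)
    clf.fit(X, y)

    # If the network outputs p = P(data | x), then p / (1 - p) estimates the ratio
    # of the data and simulation densities (up to a constant factor when the two
    # samples have different sizes). Applied as per-event weights, it morphs the
    # simulated distribution towards the data.
    p = clf.predict_proba(sim_features)[:, 1]
    return p / (1.0 - p)

# weights = learn_reweighting(sim_features, data_features)  # placeholder arrays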

Illustration of AI unfolding using the OmniFold algorithm

As this is a recently-developed technique, there are plenty of opportunities for new developments and improvements. These strategies are in principle capable of handling significant levels of background subtraction as well as acceptance and efficiency effects, but existing LHC measurements using AI-based unfolding generally have small backgrounds. And as with traditional methods, there is a risk in trying to estimate too many parameters from not enough data. This is typically controlled by stopping the training of the neural network early, combining multiple trainings into a single result, and performing cross validations on different subsets of the data.

Beyond the “OmniFold” methods we are developing, an active community is also working on alternative techniques, including ones based on generative AI. Researchers are also considering creative new ways to use these unfolded results that aren’t possible with traditional methods. One possibility in development is unfolding not just a selection of observables, but the full event. Another intriguing direction could be to generate new events with the corrections learnt by the network built-in. At present, the result of the unfolding is a reweighted set of simulated events, but once the neural network has been trained, its reweighting function could be used to simulate the unfolded sample from scratch, simplifying the output.

The post How to unfold with AI appeared first on CERN Courier.

Feature Inspired by high-dimensional data and the ideals of open science, high-energy physicists are using artificial intelligence to reimagine the statistical technique of ‘unfolding’. https://cerncourier.com/wp-content/uploads/2025/01/CCJanFeb25_AI_feature.jpg
CERN and ESA: a decade of innovation https://cerncourier.com/a/cern-and-esa-a-decade-of-innovation/ Mon, 27 Jan 2025 07:59:01 +0000 https://cerncourier.com/?p=112108 Enrico Chesta, Véronique Ferlet-Cavrois and Markus Brugger highlight seven ways CERN and ESA are working together to further fundamental exploration and innovation in space technologies.

The post CERN and ESA: a decade of innovation appeared first on CERN Courier.

Sky maps

Particle accelerators and spacecraft both operate in harsh radiation environments, extreme temperatures and high vacuum. Each must process large amounts of data quickly and autonomously. Much can be gained from cooperation between scientists and engineers in each field.

Ten years ago, the European Space Agency (ESA) and CERN signed a bilateral cooperation agreement to share expertise and facilities. The goal was to expand the limits of human knowledge and keep Europe at the leading edge of progress, innovation and growth. A decade on, CERN and ESA have collaborated on projects ranging from cosmology and planetary exploration to Earth observation and human spaceflight, supporting new space-tech ventures and developing electronic systems, radiation-monitoring instruments and irradiation facilities.

1. Mapping the universe

The Euclid space telescope is exploring the dark universe by mapping the large-scale structure of billions of galaxies out to 10 billion light-years across more than a third of the sky. With tens of petabytes expected in its final data set – already a substantial reduction of the 850 billion bits of compressed images Euclid processes each day – it will generate more data than any other ESA mission by far.

With many CERN cosmologists involved in testing theories of beyond-the-Standard-Model physics, Euclid first became a CERN-recognised experiment in 2015. CERN also contributes to the development of Euclid’s “science ground segment” (SGS), which processes raw data received from the Euclid spacecraft into usable scientific products such as galaxy catalogues and dark-matter maps. CERN’s virtual-machine file system (CernVM-FS) has been integrated into the SGS to allow continuous software deployment across Euclid’s nine data centres and on developers’ laptops.

The telescope was launched in July 2023 and began observations in February 2024. The first piece of its great map of the universe was released in October 2024, showing millions of stars and galaxies and covering 132 square degrees of the southern sky (see “Sky map” figure). Based on just two weeks of observations, it accounts for only 1% of the project’s six-year survey, which will be the largest cosmic map ever made.

Future CERN–ESA collaborations on cosmology, astrophysics and multimessenger astronomy are likely to include the Laser Interferometer Space Antenna (LISA) and the NewAthena X-ray observatory. LISA will be the first space-based observatory to study gravitational waves. NewAthena will study the most energetic phenomena in the universe. Both projects are expected to be ready to launch about 10 years from now.

2. Planetary exploration

Though planetary exploration is conceptually far from fundamental physics, its technical demands require similar expertise. A good example is the Jupiter Icy Moons Explorer (JUICE) mission, which will make detailed observations of the gas giant and its three large ocean-bearing moons Ganymede, Callisto and Europa.

Jupiter’s magnetosphere is a million times greater in volume than Earth’s, trapping large fluxes of highly energetic electrons and protons. Before JUICE, the direct and indirect impact of high-energy electrons on modern electronic devices, and in particular their ability to cause “single event effects”, had never been studied. Two test campaigns took place in the VESPER facility, which is part of the CERN Linear Electron Accelerator for Research (CLEAR) project. Components were tested with tuneable beam energies between 60 and 200 MeV, and average fluxes of roughly 10⁸ electrons per square centimetre per second, mirroring expected radiation levels in the Jovian system.

JUICE radiation-monitor measurements

JUICE was successfully launched in April 2023, starting an epic eight-year journey to Jupiter including several flyby manoeuvres that will be used to commission the onboard instruments (see “Flyby” figure). JUICE should reach Jupiter in July 2031. It remains to be seen whether test results obtained at CERN have successfully de-risked the mission.

Another interesting example of cooperation on planetary exploration is the Mars Sample Return mission, which must operate in low temperatures during eclipse phases. CERN supported the main industrial partner, Thales Alenia Space, in qualifying the orbiter’s thermal-protection systems in cryogenic conditions.

3. Earth observation

Earth observation from orbit has applications ranging from environmental monitoring to weather forecasting. CERN and ESA collaborate both on developing the advanced technologies required by these applications and ensuring they can operate in the harsh radiation environment of space.

In 2017 and 2018, ESA teams came to CERN’s North Area with several partner companies to test the performance of radiation monitors, field-programmable gate arrays (FPGAs) and electronics chips in ultra-high-energy ion beams at the Super Proton Synchrotron. The tests mimicked the ultra-high-energy part of the galactic cosmic-ray spectrum, whose effects had never previously been measured on the ground beyond 10 GeV/nucleon. In 2017, ESA’s standard radiation-environment monitor and several FPGAs and multiprocessor chips were tested with xenon ions. In 2018, the highlight of the campaign was the testing of Intel’s Myriad-2 artificial intelligence (AI) chip with lead ions (see “Space AI” figure). Following its radiation characterisation and qualification, in 2020 the chip embarked on the φ-sat-1 mission to autonomously detect clouds using images from a hyperspectral camera.

Myriad 2 chip testing

More recently, CERN joined Edge SpAIce – an EU project to monitor ecosystems from onboard the Balkan-1 satellite and track plastic pollution in the oceans. The project will use CERN’s high-level synthesis for machine learning (hls4ml) AI technology to run inference models on an FPGA that will be launched in 2025.
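For readers curious what such a deployment involves, an hls4ml conversion typically follows the pattern below. The Keras model, FPGA part number and output directory are placeholders, and the actual Edge SpAIce configuration is not described here:

import hls4ml
from tensorflow import keras

# Placeholder network standing in for the on-board inference model.
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(16,)),
    keras.layers.Dense(4, activation="softmax"),
])

# Derive an HLS configuration from the model and convert it into firmware
# that can be synthesised for the target FPGA.
config = hls4ml.utils.config_from_keras_model(model, granularity="model")
hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    output_dir="hls4ml_prj",          # placeholder project directory
    part="xcvu9p-flga2104-2-e",       # placeholder FPGA part number
)
hls_model.compile()  # builds a bit-accurate C simulation for validating the model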

Looking further ahead, ESA’s φ-lab and CERN’s Quantum Technology Initiative are sponsoring two PhD programmes to study the potential of quantum machine learning, generative models and time-series processing to advance Earth observation. Applications may accelerate the task of extracting features from images to monitor natural disasters, deforestation and the impact of environmental effects on the lifecycle of crops.

4. Dosimetry for human spaceflight

In space, nothing is more important than astronauts’ safety and wellbeing. To this end, in August 2021 ESA astronaut Thomas Pesquet activated the LUMINA experiment inside the International Space Station (ISS), as part of the ALPHA mission (see “Space dosimetry” figure). Developed under the coordination of the French Space Agency and the Laboratoire Hubert Curien at the Université Jean-Monnet-Saint-Étienne and iXblue, LUMINA uses two several-kilometre-long phosphorous-doped optical fibres as active dosimeters to measure ionising radiation aboard the ISS.

ESA astronaut Thomas Pesquet

When exposed to radiation, optical fibres experience a partial loss of transmitted power. Using a reference control channel, the radiation-induced attenuation can be accurately measured and related to the total ionising dose, with the sensitivity of the device primarily governed by the length of the fibre. Having studied optical-fibre-based technologies for many years, CERN helped optimise the architecture of the dosimeters and performed irradiation tests to calibrate the instrument, which will operate on the ISS for a period of up to five years.
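In outline, the measurement principle amounts to a simple calculation. All numbers below, including the sensitivity coefficient and the assumption of a linear response, are invented for illustration; real calibration constants come from irradiation tests such as those performed at CERN:

import math

def dose_from_attenuation(p_ref_mw, p_meas_mw, fibre_length_km, sensitivity_db_per_km_per_gy):
    # Radiation-induced attenuation, measured against the reference channel, in dB.
    attenuation_db = 10.0 * math.log10(p_ref_mw / p_meas_mw)
    # Assume a linear response: dose = attenuation / (length x calibrated sensitivity).
    return attenuation_db / (fibre_length_km * sensitivity_db_per_km_per_gy)

# Invented numbers: 2 km of fibre, 0.5% power loss, sensitivity 1.5 dB/km/Gy.
print(dose_from_attenuation(1.000, 0.995, 2.0, 1.5))  # dose estimate in gray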

LUMINA complements dosimetry measurements performed on the ISS using CERN’s Timepix technology – an offshoot of the hybrid-pixel-detector technology developed for the LHC experiments (CERN Courier September/October 2024 p37). Timepix dosimeters have been integrated in multiple NASA payloads since 2012.

5. Radiation-hardness assurance

It’s no mean feat to ensure that CERN’s accelerator infrastructure functions in increasingly challenging radiation environments. Similar challenges are found in space. Damage can be caused by accumulating ionising doses, single-event effects (SEEs) or so-called displacement damage dose, which dislodges atoms within a material’s crystal lattice rather than ionising them. Radiation-hardness assurance (RHA) reduces radiation-induced failures in space through environment simulations, part selection and testing, radiation-tolerant design, worst-case analysis and shielding definition.

Since its creation in 2008, CERN’s Radiation to Electronics project has amplified the work of many equipment and service groups in modelling, mitigating and testing the effect of radiation on electronics. A decade later, joint test campaigns with ESA demonstrated the value of CERN’s facilities and expertise to RHA for spaceflight. This led to the signing of a joint protocol on radiation environments, technologies and facilities in 2019, which also included radiation detectors and radiation-tolerant systems, and components and simulation tools.

CHARM facility

Among CERN’s facilities is CHARM: the CERN high-energy-accelerator mixed-field facility, which offers an innovative approach to low-cost RHA. CHARM’s radiation field is generated by the interaction between a 24 GeV/c beam from the Proton Synchrotron and a metallic target. CHARM offers a uniquely wide spectrum of radiation types and energies, the possibility to adjust the environment using mobile shielding, and enough space to test a medium-sized satellite in full operating conditions.

Radiation testing is particularly challenging for the new generation of rapidly developed and often privately funded “new space” projects, which frequently make use of commercial and off-the-shelf (COTS) components. Here, RHA relies on testing and mitigation rather than radiation hardening by design. For “flip chip” configurations, which have their active circuitry facing inward toward the substrate, and dense three-dimensional structures that cannot be directly exposed without compromising their performance, heavy-ion beams accelerated to between 10 and 100 MeV/nucleon are the only way to induce SEE in the sensitive semiconductor volumes of the devices.

To enable testing of highly integrated electronic components, ESA supported studies to develop the CHARM heavy ions for micro-electronics reliability-assurance facility – CHIMERA for short (see “CHIMERA” figure). ESA has sponsored key feasibility activities such as: tuning the ion flux in a large dynamic range; tuning the beam size for board-level testing; and reducing beam energy to maximise the frequency of SEE while maintaining a penetration depth of a few millimetres in silicon.

6. In-orbit demonstrators

Weighing 1 kg and measuring just 10 cm on each side – a nanosatellite standard – the CELESTA satellite was designed to study the effects of cosmic radiation on electronics (see “CubeSat” figure). Initiated in partnership with the University of Montpellier and ESA, and launched in July 2022, CELESTA was CERN’s first in-orbit technology demonstrator.

Radiation-testing model of the CELESTA satellite

As well as providing the first opportunity for CHARM to test a full satellite, CELESTA offered the opportunity to flight-qualify SpaceRadMon, which counts single-event upsets (SEUs) and single-event latchups (SELs) in static random-access memory while using a field-effect transistor for dose monitoring. (SEUs are temporary errors caused by a high-energy particle flipping a bit and SELs are short circuits induced by high-energy particles.) More than 30 students contributed to the mission development, partially in the frame of ESA’s Fly Your Satellite Programme. Built from COTS components calibrated in CHARM, SpaceRadMon has since been adopted by other ESA missions such as Trisat and GENA-OT, and could be used in the future as a low-cost predictive maintenance tool to reduce space debris and improve space sustainability.

The maiden flight of the Vega-C launcher placed CELESTA on an atypical quasi-circular medium-Earth orbit in the middle of the inner Van Allen proton belt at roughly 6000 km. Two months of flight data sufficed to validate the performance of the payload and the ground-testing procedure in CHARM, though CELESTA will fly for thousands of years in a region of space where debris is not a problem due to the harsh radiation environment.

The CELESTA approach has since been adopted by industrial partners to develop radiation-tolerant cameras, radios and on-board computers.

7. Stimulating the space economy

Space technology is a fast-growing industry replete with opportunities for public–private cooperation. The global space economy will be worth $1.8 trillion by 2035, according to the World Economic Forum – up from $630 billion in 2023 and growing at double the projected rate for global GDP.

Whether spun off from space exploration or particle physics, ESA and CERN look to support start-up companies and high-tech ventures in bringing to market technologies with positive societal and economic impacts (see “Spin offs” figure). The use of CERN’s Timepix technology in space missions is a prime example. Private company Advacam collaborated with the Czech Technical University to provide a Timepix-based radiation-monitoring payload called SATRAM to ESA’s Proba-V mission to map land cover and vegetation growth across the entire planet every two days.

The Hannover Messe fair

Advacam is now testing a pixel-detector instrument on JoeySat – an ESA-sponsored technology demonstrator for OneWeb’s next-generation constellation of satellites designed to expand global connectivity. Advacam is also working with ESA on radiation monitors for Space Rider and NASA’s Lunar Gateway. Space Rider is a reusable spacecraft whose maiden voyage is scheduled for the coming years, and Lunar Gateway is a planned space station in lunar orbit that could act as a staging post for Mars exploration.

Another promising example is SigmaLabs – a Polish startup founded by CERN alumni specialising in radiation detectors and predictive-maintenance R&D for space applications. SigmaLabs was recently selected by ESA and the Polish Space Agency to provide one of the experiments expected to fly on Axiom Mission 4 – a private spaceflight to the ISS in 2025 that will include Polish astronaut and CERN engineer Sławosz Uznański (CERN Courier May/June 2024 p55). The experiment will assess the scalability and versatility of the SpaceRadMon radiation-monitoring technology initially developed at CERN for the LHC and flight tested on the CELESTA CubeSat.

In radiation-hardness assurance, the CHIMERA facility is associated with the High-Energy Accelerators for Radiation Testing and Shielding (HEARTS) programme sponsored by the European Commission. Its 2024 pilot user run is already stimulating private innovation, with high-energy heavy ions used to perform business-critical research on electronic components for a dozen aerospace companies.

The post CERN and ESA: a decade of innovation appeared first on CERN Courier.

Feature Enrico Chesta, Véronique Ferlet-Cavrois and Markus Brugger highlight seven ways CERN and ESA are working together to further fundamental exploration and innovation in space technologies. https://cerncourier.com/wp-content/uploads/2025/01/CCJanFeb25_CERNandESA_pesquet.jpg
A word with CERN’s next Director-General https://cerncourier.com/a/a-word-with-cerns-next-director-general/ Mon, 27 Jan 2025 07:56:07 +0000 https://cerncourier.com/?p=112181 Mark Thomson, CERN's Director General designate for 2025, talks to the Courier about the future of particle physics.

The post A word with CERN’s next Director-General appeared first on CERN Courier.

Mark Thomson

What motivates you to be CERN’s next Director-General?

CERN is an incredibly important organisation. I believe my deep passion for particle physics, coupled with the experience I have accumulated in recent years, including leading the Deep Underground Neutrino Experiment, DUNE, through a formative phase, and running the Science and Technology Facilities Council in the UK, has equipped me with the right skill set to lead CERN though a particularly important period.

How would you describe your management style?

That’s a good question. My overarching approach is built around delegating and trusting my team. This has two advantages. First, it builds an empowering culture, which in my experience provides the right environment for people to thrive. Second, it frees me up to focus on strategic planning and engagement with numerous key stakeholders. I like to focus on transparency and openness, to build trust both internally and externally.

How will you spend your familiarisation year before you take over in 2026?

First, by getting a deep understanding of CERN “from within”, to plan how I want to approach my mandate. Second, by lending my voice to the scientific discussion that will underpin the third update to the European strategy for particle physics. The European strategy process is a key opportunity for the particle-physics community to provide genuine bottom-up input and shape the future. This is going to be a really varied and exciting year.

What open question in fundamental physics would you most like to see answered in your lifetime?

I am going to have to pick two. I would really like to understand the nature of dark matter. There are a wide range of possibilities, and we are addressing this question from multiple angles; the search for dark matter is an area where the collider and non-collider experiments can both contribute enormously. The second question is the nature of the Higgs field. The Higgs boson is just so different from anything else we’ve ever seen. It’s not just unique – it’s unique and very strange. There are just so many deep questions, such as whether it is fundamental or composite. I am confident that we will make progress in the coming years. I believe the High-Luminosity LHC will be able to make meaningful measurements of the self-coupling at the heart of the Higgs potential. If you’d asked me five years ago whether this was possible, I would have been doubtful. But today I am very optimistic because of the rapid progress with advanced analysis techniques being developed by the brilliant scientists on the LHC experiments.

What areas of R&D are most in need of innovation to meet our science goals?

Artificial intelligence is changing how we look at data in all areas of science. Particle physics is the ideal testing ground for artificial intelligence, because our data is complex and there are none of the issues around the sensitive nature of the data that exist in other fields. Complex multidimensional datasets are where you’ll benefit the most from artificial intelligence. I’m also excited by the emergence of new quantum technologies, which will open up fresh opportunities for our detector systems and also new ways of doing experiments in fundamental physics. We’ve only scratched the surface of what can be achieved with entangled quantum systems.

How about in accelerator R&D?

There are two areas that I would like to highlight: making our current technologies more sustainable, and the development of high-field magnets based on high-temperature superconductivity. This connects to the question of innovation more broadly. To quote one example among many, high-temperature superconducting magnets are likely to be an important component of fusion reactors just as much as particle accelerators, making this a very exciting area where CERN can deploy its engineering expertise and really push that programme forward. That’s not just a benefit for particle physics, but a benefit for wider society.

How has CERN changed since you were a fellow back in 1994?

The biggest change is that the collider experiments are larger and more complex, and the scientific and technical skills required have become more specialised. When I first came to CERN, I worked on the OPAL experiment at LEP – a collaboration of less than 400 people. Everybody knew everybody, and it was relatively easy to understand the science of the whole experiment.

My overarching approach is built around delegating and trusting my team

But I don’t think the scientific culture of CERN and the particle-physics community has changed much. When I visit CERN and meet with the younger scientists, I see the same levels of excitement and enthusiasm. People are driven by the wonderful mission of discovery. When planning the future, we need to ensure that early-career researchers can see a clear way forward with opportunities in all periods of their career. This is essential for the long-term health of particle physics. Today we have an amazing machine that’s running beautifully: the LHC. I also don’t think it is possible to overstate the excitement of the High-Luminosity LHC. So there’s a clear and exciting future out to the early 2040s for today’s early-career researchers. The question is what happens beyond that? This is one reason to ensure that there is not a large gap between the end of the High-Luminosity LHC and the start of whatever comes next.

Should the world be aligning on a single project?

Given the increasing scale of investment, we do have to focus as a global community, but that doesn’t necessarily mean a single project. We saw something similar about 10 years ago when the global neutrino community decided to focus its efforts on two complementary long-baseline projects, DUNE and Hyper-Kamiokande. From the perspective of today’s European strategy, the Future Circular Collider (FCC) is an extremely appealing project that would map out an exciting future for CERN for many decades. I think we’ll see this come through strongly in an open and science-driven European strategy process.

How do you see the scientific case for the FCC?

For me, there are two key points. First, gaining a deep understanding of the Higgs boson is the natural next step in our field. We have discovered something truly unique, and we should now explore its properties to gain deeper insights into fundamental physics. Scientifically, the FCC provides everything you want from a Higgs factory, both in terms of luminosity and the opportunity to support multiple experiments.

Second, investment in the FCC tunnel will provide a route to hadron–hadron collisions at the 100 TeV scale. I find it difficult to foresee a future where we will not want this capability.

These two aspects make the FCC a very attractive proposition.

How successful do you believe particle physics is in communicating science and societal impacts to the public and to policymakers?

I think we communicate science well. After all, we’ve got a great story. People get the idea that we work to understand the universe at its most basic level. It’s a simple and profound message.

Going beyond the science, the way we communicate the wider industrial and societal impact is probably equally important. Here we also have a good story. In our experiments we are always pushing beyond the limits of current technology, doing things that have not been done before. The technologies we develop to do this almost always find their way back into something that will have wider applications. Of course, when we start, we don’t know what the impact will be. That’s the strength and beauty of pushing the boundaries of technology for science.

Would the FCC give a strong return on investment to the member states?

Absolutely. Part of the return is the science, part is the investment in technology, and we should not underestimate the importance of the training opportunities for young people across Europe. CERN provides such an amazing and inspiring environment for young people. The scale of the FCC will provide a huge number of opportunities for young scientists and engineers.

We need to ensure that early-career researchers can see a clear way forward with opportunities in all periods of their career. This is essential for the long-term health of particle physics

In terms of technology development, the detectors for the electron–positron collider will provide an opportunity for pushing forward and deploying new, advanced technologies to deliver the precision required for the science programme. In parallel, the development of the magnet technologies for the future hadron collider will be really exciting, particularly the potential use of high-temperature superconductors, as I said before.

It is always difficult to predict the specific “return on investment” on the technologies for big scientific research infrastructure. Part of this challenge is that some of the benefits might be 20, 30, 40 years down the line. Nevertheless, every retrospective that has tried has demonstrated that you get a huge downstream benefit.

Do we reward technical innovation well enough in high-energy physics?

There needs to be a bit of a culture shift within our community. Engineering and technology innovation are critical to the future of science and critical to the prosperity of Europe. We should be striving to reward individuals working in these areas.

Should the field make it more flexible for physicists and engineers to work in industry and return to the field having worked there?

This is an important question. I actually think things are changing. The fluidity between academia and industry is increasing in both directions. For example, an early-career researcher in particle physics with a background in deep artificial-intelligence techniques is valued incredibly highly by industry. It also works the other way around, and I experienced this myself in my career when one of my post-doctoral researchers joined from an industry background after a PhD in particle physics. The software skills they picked up from industry were incredibly impactful.

I don’t think there is much we need to do to directly increase flexibility – it’s more about culture change, to recognise that fluidity between industry and academia is important and beneficial. Career trajectories are evolving across many sectors. People move around much more than they did in the past.

Does CERN have a future as a global laboratory?

CERN already is a global laboratory. The amazing range of nationalities working here is both inspiring and a huge benefit to CERN.

How can we open up opportunities in low- and middle-income countries?

I am really passionate about the importance of diversity in all its forms and this includes national and regional inclusivity. It is an agenda that I pursued in my last two positions. At the Deep Underground Neutrino Experiment, I was really keen to engage the scientific community from Latin America, and I believe this has been mutually beneficial. At STFC, we used physics as a way to provide opportunities for people across Africa to gain high-tech skills. Going beyond the training, one of the challenges is to ensure that people use these skills in their home nations. Otherwise, you’re not really helping low- and middle-income countries to develop.

What message would you like to leave with readers?

That we have really only just started the LHC programme. With more than a factor of 10 increase in data to come, coupled with new data tools and upgraded detectors, the High-Luminosity LHC represents a major opportunity for a new discovery. Its nature could be a complete surprise. That’s the whole point of exploring the unknown: you don’t know what’s out there. This alone is incredibly exciting, and it is just a part of CERN’s amazing future.

The post A word with CERN’s next Director-General appeared first on CERN Courier.

Opinion Mark Thomson, CERN's Director General designate for 2025, talks to the Courier about the future of particle physics. https://cerncourier.com/wp-content/uploads/2025/01/CCJanFeb25_INT_thompson_feature.jpg
Charm and synthesis https://cerncourier.com/a/charm-and-synthesis/ Mon, 27 Jan 2025 07:43:29 +0000 https://cerncourier.com/?p=112128 Sheldon Glashow recalls the events surrounding a remarkable decade of model building and discovery between 1964 and 1974.

The post Charm and synthesis appeared first on CERN Courier.

In 1955, after a year of graduate study at Harvard, I joined a group of a dozen or so students committed to studying elementary particle theory. We approached Julian Schwinger, one of the founders of quantum electrodynamics, hoping to become his thesis students – and we all did.

Schwinger lined us up in his office, and spent several hours assigning thesis subjects. It was a remarkable performance. I was the last in line. Having run out of well-defined thesis problems, he explained to me that weak and electromagnetic interactions share two remarkable features: both are vectorial and both display aspects of universality. Schwinger suggested that I create a unified theory of the two interactions – an electroweak synthesis. How I was to do this he did not say, aside from slyly hinting at the Yang–Mills gauge theory.

By the summer of 1958, I had convinced myself that weak and electromagnetic interactions might be described by a badly broken gauge theory, and Schwinger that I deserved a PhD. I had hoped to spend part of a postdoctoral fellowship in Moscow at the invitation of the recent Russian Nobel laureate Igor Tamm, and sought to visit Niels Bohr’s institute in Copenhagen while awaiting my Soviet visa. With Bohr’s enthusiastic consent, I boarded the SS Île de France with my friend Jack Schnepps. Following a memorable and luxurious crossing – one of the great ship’s last – Jack drove south to work with Milla Baldo-Ceolin’s emulsion group in Padova, and I took the slow train north to Copenhagen. Thankfully, my Soviet visa never arrived. I found the SU(2) × U(1) structure of the electroweak model in the spring of 1960 at Bohr’s famous institute at Blegdamsvej 19, and wrote the paper that would earn my share of the 1979 Nobel Prize.

We called the new quark flavour charm, completing two weak doublets of quarks to match two weak doublets of leptons, and establishing lepton–quark symmetry, which holds to this day

A year earlier, in 1959, Augusto Gamba, Bob Marshak and Susumu Okubo had proposed lepton–hadron symmetry, which regarded protons, neutrons and lambda hyperons as the building blocks of all hadrons, to match the three known leptons at the time: neutrinos, electrons and muons. The idea was falsified by the discovery of a second neutrino in 1962, and superseded in 1964 by the invention of fractionally charged hadron constituents, first by George Zweig and André Petermann, and then decisively by Murray Gell-Mann with his three flavours of quarks. Later in 1964, while on sabbatical in Copenhagen, James Bjorken and I realised that lepton–hadron symmetry could be revived simply by adding a fourth quark flavour to Gell-Mann’s three. We called the new quark flavour “charm”, completing two weak doublets of quarks to match two weak doublets of leptons, and establishing lepton–quark symmetry, which holds to this day.

Annus mirabilis

1964 was a remarkable year. In addition to the invention of quarks, Nick Samios spotted the triply strange Ω baryon, and Oscar Greenberg devised what became the critical notion of colour. Arno Penzias and Robert Wilson stumbled on the cosmic microwave background radiation. James Cronin, Val Fitch and others discovered CP violation. Robert Brout, François Englert, Peter Higgs and others invented spontaneously broken non-Abelian gauge theories. And to top off the year, Abdus Salam rediscovered and published my SU(2) × U(1) model, after I had more-or-less abandoned electroweak thoughts due to four seemingly intractable problems.

Four intractable problems of early 1964

How could the W and Z bosons acquire masses while leaving the photon massless?

Steven Weinberg, my friend from both high-school and college, brilliantly solved this problem in 1967 by subjecting the electroweak gauge group to spontaneous symmetry breaking, initiating the half-century-long search for the Higgs boson. Salam published the same solution in 1968.

How could an electroweak model of leptons be extended to describe the weak interactions of hadrons?

John Iliopoulos, Luciano Maiani and I solved this problem in 1970 by introducing charm and quark-lepton symmetry to avoid unobserved strangeness-changing neutral currents.

Was the spontaneously broken electroweak gauge model mathematically consistent?

Gerard ’t Hooft announced in 1971 that he had proven Steven Weinberg’s electroweak model to be renormalisable. In 1972, Claude Bouchiat, John Iliopoulos and Philippe Meyer demonstrated the electroweak model to be free of Adler anomalies provided that lepton–quark symmetry is maintained.

Could the electroweak model describe CP violation without invoking additional spinless fields?

In 1973, Makoto Kobayashi and Toshihide Maskawa showed that the electroweak model could easily and naturally violate CP if there are more than four quark flavours.

Much to my surprise and delight, all of them would be solved within just a few years, with the last theoretical obstacle removed by Makoto Kobayashi and Toshihide Maskawa in 1973 (see “Four intractable problems” panel). A few months later, Paul Musset announced that CERN’s Gargamelle detector had won the race to detect weak neutral-current interactions, giving the electroweak model the status of a predictive theory. Remarkably, the year had begun with Gell-Mann, Harald Fritzsch and Heinrich Leutwyler proposing QCD, and David Gross, Frank Wilczek and David Politzer showing it to be asymptotically free. The Standard Model of particle physics was born.

Charmed findings

But where were the charmed quarks? Early on Monday morning, 11 November 1974, I was awakened by a phone call from Sam Ting, who asked me to come to his MIT office as soon as possible. He and Ulrich Becker were waiting for me impatiently. They showed me an amazingly sharp resonance. Could it be a vector meson like the ρ or ω and be so narrow, or was it something quite different? I hopped in my car and drove to Harvard, where my colleagues Alvaro de Rújula and Howard Georgi excitedly regaled me with the Californian side of the story. A few days later, experimenters in Frascati confirmed the BNL–SLAC discovery, and de Rújula and I submitted our paper “Is Bound Charm Found?” – one of two papers on the J/ψ discovery printed in Physical Review Letters in January 1975 that would prove to be correct. Among five false papers was one written by my beloved mentor, Julian Schwinger.

Sam Ting at CERN in 1976

The second correct paper was by Tom Appelquist and David Politzer. Well before that November, they had realised (without publishing) that bound states of a charmed quark and its antiquark lying below the charm threshold would be exceptionally narrow due to the asymptotic freedom of QCD. De Rújula suggested to them that such a system be called charmonium in an analogy with positronium. His term made it into the dictionary. Shortly afterward, the 1976 Nobel Prize in Physics was jointly awarded to Burton Richter and Sam Ting for “their pioneering work in the discovery of a heavy elementary particle of a new kind” – evidence that charm was not yet a universally accepted explanation. Over the next few years, experimenters worked hard to confirm the predictions of theorists at Harvard and Cornell by detecting and measuring the masses, spins and transitions among the eight sub-threshold charmonium states. Later on, they would do the same for 14 relatively narrow states of bottomonium.

Abdus Salam, Tom Ball and Paul Musset

Other experimenters were searching for particles containing just one charmed quark or antiquark. In our 1975 paper “Hadron Masses in a Gauge Theory”, de Rújula, Georgi and I included predictions of the masses of several not-yet-discovered charmed mesons and baryons. The first claim to have detected charmed particles was made in 1975 by Robert Palmer and Nick Samios at Brookhaven, again with a bubble-chamber event. It seemed to show a cascade decay process in which one charmed baryon decays into another charmed baryon, which itself decays. The measured masses of both of the charmed baryons were in excellent agreement with our predictions. Though the claim was not widely accepted, I believe to this day that Samios and Palmer were the first to detect charmed particles.

Sheldon Glashow and Steven Weinberg

The SLAC electron–positron collider, operating well above charm threshold, was certainly producing charmed particles copiously. Why were they not being detected? I recall attending a conference in Wisconsin that was largely dedicated to this question. On the flight home, I met my old friend Gerson Goldhaber, who had been struggling unsuccessfully to find them. I think I convinced him to try a bit harder. A couple of weeks later in 1976, Goldhaber and François Pierre succeeded. My role in charm physics had come to a happy ending. 

  • This article is adapted from a presentation given at the Institute of High-Energy Physics in Beijing on 20 October 2024 to celebrate the 50th anniversary of the discovery of the J/ψ.

The post Charm and synthesis appeared first on CERN Courier.

]]>
Feature Sheldon Glashow recalls the events surrounding a remarkable decade of model building and discovery between 1964 and 1974. https://cerncourier.com/wp-content/uploads/2025/01/CCJanFeb25_GLASHOW_lectures.jpg
CLOUD explains Amazon aerosols https://cerncourier.com/a/cloud-explains-amazon-aerosols/ Mon, 27 Jan 2025 07:26:49 +0000 https://cerncourier.com/?p=112200 The CLOUD collaboration at CERN has revealed a new source of atmospheric aerosol particles that could help scientists to refine climate models.

The post CLOUD explains Amazon aerosols appeared first on CERN Courier.

]]>
In a paper published in the journal Nature, the CLOUD collaboration at CERN has revealed a new source of atmospheric aerosol particles that could help scientists to refine climate models.

Aerosols are microscopic particles suspended in the atmosphere that arise from both natural sources and human activities. They play an important role in Earth’s climate system because they seed clouds and influence their reflectivity and coverage. Most aerosols arise from the spontaneous condensation of molecules that are present in the atmosphere only in minute concentrations. However, the vapours responsible for their formation are not well understood, particularly in the remote upper troposphere.

The CLOUD (Cosmics Leaving Outdoor Droplets) experiment at CERN is designed to investigate the formation and growth of atmospheric aerosol particles in a controlled laboratory environment. CLOUD comprises a 26 m³ ultra-clean chamber and a suite of advanced instruments that continuously analyse its contents. The chamber contains a precisely selected mixture of gases under atmospheric conditions, into which beams of charged pions are fired from CERN’s Proton Synchrotron to mimic the influence of galactic cosmic rays.

“Large concentrations of aerosol particles have been observed high over the Amazon rainforest for the past 20 years, but their source has remained a puzzle until now,” says CLOUD spokesperson Jasper Kirkby. “Our latest study shows that the source is isoprene emitted by the rainforest and lofted in deep convective clouds to high altitudes, where it is oxidised to form highly condensable vapours. Isoprene represents a vast source of biogenic particles in both the present-day and pre-industrial atmospheres that is currently missing in atmospheric chemistry and climate models.”

Isoprene is a hydrocarbon containing five carbon atoms and eight hydrogen atoms. It is emitted by broad-leaved trees and other vegetation and is the most abundant non-methane hydrocarbon released into the atmosphere. Until now, isoprene’s ability to form new particles has been considered negligible.

Seeding clouds

The CLOUD results change this picture. By studying the reaction of hydroxyl radicals with isoprene at upper tropospheric temperatures of –30 °C and –50 °C, the collaboration discovered that isoprene oxidation products form copious particles at ambient isoprene concentrations. This new source of aerosol particles does not require any additional vapours. However, when minute concentrations of sulphuric acid or iodine oxoacids were introduced into the CLOUD chamber, a 100-fold increase in aerosol formation rate was observed. Although sulphuric acid derives mainly from anthropogenic sulphur dioxide emissions, the acid concentrations used in CLOUD can also arise from natural sources.

In addition, the team found that isoprene oxidation products drive rapid growth of particles to sizes at which they can seed clouds and influence the climate – a behaviour that persists in the presence of nitrogen oxides produced by lightning at upper-tropospheric concentrations. After continued growth and descent to lower altitudes, these particles may provide a globally important source for seeding shallow continental and marine clouds, which influence Earth’s radiative balance – the amount of incoming solar radiation compared to outgoing longwave radiation (see “Seeding clouds” figure).

“This new source of biogenic particles in the upper troposphere may impact estimates of Earth’s climate sensitivity, since it implies that more aerosol particles were produced in the pristine pre-industrial atmosphere than previously thought,” adds Kirkby. “However, until our findings have been evaluated in global climate models, it’s not possible to quantify the effect.”

The CLOUD findings are consistent with aircraft observations over the Amazon, as reported in an accompanying paper in the same issue of Nature. Together, the two papers provide a compelling picture of the importance of isoprene-driven aerosol formation and its relevance for the atmosphere.

Since it began operation in 2009, the CLOUD experiment has unearthed several mechanisms by which aerosol particles form and grow in different regions of Earth’s atmosphere. “In addition to helping climate researchers understand the critical role of aerosols in Earth’s climate, the new CLOUD result demonstrates the rich diversity of CERN’s scientific programme and the power of accelerator-based science to address societal challenges,” says CERN Director for Research and Computing, Joachim Mnich.

The post CLOUD explains Amazon aerosols appeared first on CERN Courier.

]]>
News The CLOUD collaboration at CERN has revealed a new source of atmospheric aerosol particles that could help scientists to refine climate models. https://cerncourier.com/wp-content/uploads/2025/01/CCJanFeb25_NA_cloudfrontis.jpg
Emphasising the free circulation of scientists https://cerncourier.com/a/emphasising-the-free-circulation-of-scientists/ Mon, 27 Jan 2025 07:23:24 +0000 https://cerncourier.com/?p=112341 The 33rd assembly of the International Union of Pure and Applied Physics took place in Haikou, China.

The post Emphasising the free circulation of scientists appeared first on CERN Courier.

]]>
Physics is a universal language that unites scientists worldwide. No event illustrates this more vividly than the general assembly of the International Union of Pure and Applied Physics (IUPAP). The 33rd assembly convened 100 delegates representing territories around the world in Haikou, China, from 10 to 14 October 2024. Amid today’s polarised global landscape, one clear commitment emerged: to uphold the universality of science and ensure the free movement of scientists.

IUPAP was established in 1922 in the aftermath of World War I to coordinate international efforts in physics. Its logo is recognisable from conferences and proceedings, but its mission is less widely understood. IUPAP is the only worldwide organisation dedicated to the advancement of all fields of physics. Its goals include promoting global development and cooperation in physics by sponsoring international meetings; strengthening physics education, especially in developing countries; increasing diversity and inclusion in physics; advancing the participation and recognition of women and of people from under-represented groups; enhancing the visibility of early-career talents; and promoting international agreements on symbols, units, nomenclature and standards. At the 33rd assembly, 300 physicists were elected to the executive council and specialised commissions for a period of three years.

Global scientific initiatives were highlighted, including the International Year of Quantum Science and Technology (IYQ2025) and the International Decade on Science for Sustainable Development (IDSSD) from 2024 to 2033, which was adopted by the United Nations General Assembly in August 2023. A key session addressed the importance of industry partnerships, with delegates exploring strategies to engage companies in IYQ2025 and IDSSD to further IUPAP’s mission of using physics to drive societal progress. Nobel laureate Giorgio Parisi discussed the role of physics in promoting a sustainable future, and public lectures by fellow laureates Barry Barish, Takaaki Kajita and Samuel Ting filled the 1820-seat Oriental Universal Theater with enthusiastic students.

A key focus of the meeting was visa-related issues affecting international conferences. Delegates reaffirmed the union’s commitment to scientists’ freedom of movement. IUPAP stands against any discrimination in physics and will continue to sponsor events only in locations that uphold this value – a stance that is orthogonal to the policy of countries imposing sanctions on scientists affiliated with specific institutions.

A joint session with the fall meeting of the Chinese Physical Society celebrated the 25th anniversary of the IUPAP working group “Women in Physics” and emphasised diversity, equity and inclusion in the field. Since 2002, IUPAP has established precise guidelines for the sponsorship of conferences to ensure that women are fairly represented among participants, speakers and committee members, and has actively monitored the data ever since. This has contributed to a significant change in the participation of women in IUPAP-sponsored conferences. IUPAP is now building on this still-necessary work on gender by focusing on discrimination on the grounds of disability and ethnicity.

The closing ceremony brought together the themes of continuity and change. Incoming president Silvina Ponce Dawson (University of Buenos Aires) and president-designate Sunil Gupta (Tata Institute) outlined their joint commitment to maintaining an open dialogue among all physicists in an increasingly fragmented world, and to promoting physics as an essential tool for development and sustainability. Outgoing leaders Michel Spiro (CNRS) and Bruce McKellar (University of Melbourne) were honoured for their contributions, and the ceremonial handover symbolised a smooth transition of leadership.

As the general assembly concluded, there was a palpable sense of momentum. From strategic modernisation to deeper engagement with global issues, IUPAP is well-positioned to make physics more relevant and accessible. The resounding message was one of unity and purpose: the physics community is dedicated to leveraging science for a brighter, more sustainable future.

The post Emphasising the free circulation of scientists appeared first on CERN Courier.

]]>
Meeting report The 33rd assembly of the International Union of Pure and Applied Physics took place in Haikou, China. https://cerncourier.com/wp-content/uploads/2025/01/CCJanFeb25_FN_IUPAP.jpg
The value of being messy https://cerncourier.com/a/the-value-of-being-messy/ Mon, 27 Jan 2025 07:20:48 +0000 https://cerncourier.com/?p=112190 Claire Malone argues that science communicators should not stray too far into public-relations territory.

The post The value of being messy appeared first on CERN Courier.

]]>
The line between science communication and public relations has become increasingly blurred. On one side, scientific press officers highlight institutional success, secure funding and showcase breakthrough discoveries. On the other, science communicators and journalists present scientific findings in a way that educates and entertains readers – acknowledging both the triumphs and the inherent uncertainties of the scientific process.

The core difference between these approaches lies in how they handle the inevitable messiness of science. Science isn’t a smooth, linear path of consistent triumphs; it’s an uncertain, trial-and-error journey. This uncertainty, and our willingness to discuss it openly, is what distinguishes authentic science communication from a polished public relations (PR) pitch. By necessity, PR often strives to present a neat narrative, free of controversy or doubt, but this risks creating a distorted perception of what science actually is.

Finding your voice

Take, for example, the situation in particle physics. Experiments probing the fundamental laws of physics are often critiqued in the press for their hefty price tags – particularly when people are eager to see resources directed towards solving global crises like climate change or preventing future pandemics. When researchers and science communicators are finding their voice, a pressing question is how much messiness to communicate in uncertain times.

After completing my PhD as part of the ATLAS collaboration, I became a science journalist and communicator, connecting audiences across Europe and America with the joy of learning about fundamental physics. After a recent talk at the Royal Institution in London, in which I explained how ATLAS measures fundamental particles, I received an email from a colleague. The only question the talk prompted him to ask was whether it is safe to collide protons in the hope of creating undiscovered particles. This reaction reflects how scientific misinformation – such as the idea that experiments at CERN could endanger the planet – can be persistent and difficult to eradicate.

In response to such criticisms and concerns, I have argued many times for the value of fundamental physics research, often highlighting the vast number of technological advancements it enables, from touch screens to healthcare advances. However, we must be wary not to only rely on this PR tactic of stressing the tangible benefits of research, as it can sometimes sidestep the uncertainties and iterative nature of scientific investigation, presenting an oversimplified version of scientific progress.

From Democritus to the Standard Model

This PR-driven approach risks undermining public understanding and trust in science in the long run. When science is framed solely as a series of grand successes without any setbacks, people may become confused or disillusioned when they inevitably encounter controversies or failures. Instead, this is where honest science communication shines – admitting that our understanding evolves, that we make mistakes and that uncertainties are an integral part of the process.

Our evolving understanding of particle physics is a perfect illustration of this. From Democritus’ concept of “indivisible atoms” to the development of the Standard Model, every new discovery has refined or even overhauled our previous understanding. This is the essence of science – always refining, never perfect – and it’s exactly what we should be communicating to the public.

Embracing this messiness doesn’t necessarily reduce public trust. When presenting scientific results to the public, it’s important to remember that uncertainty can take many forms, and how we communicate these forms can significantly affect credibility. Technical uncertainty – expressing complexity or incomplete information – often increases audience trust, as it communicates the real intricacies of scientific research. Conversely, consensus uncertainty – spotlighting disagreements or controversies among experts – can have a negative impact on credibility. When it comes to genuine disagreements among scientists, effectively communicating uncertainty to the public requires a thoughtful balance. Transparency is key: acknowledging the existence of different scientific perspectives helps the public understand that science is a dynamic process. Providing context about why disagreements exist, whether due to limited data or competing theoretical frameworks, also helps in making the uncertainty comprehensible.

Embrace errors

In other words, the next time you present your latest results on social media, don’t shy away from including the error bars. And if you must have a public argument with a colleague about what the results mean, context is essential!

Acknowledging the existence of different scientific perspectives helps the public understand that science is a dynamic process

No one knows where the next breakthrough will come from or how it might solve the challenges we face. In an information ecosystem increasingly filled with misinformation, scientists and science communicators must help people understand the iterative, uncertain and evolving nature of science. As science communicators, we should be cautious not to stray too far into PR territory. Authentic communication doesn’t mean glossing over uncertainties but rather embracing them as an essential part of the story. This way, the public can appreciate science not just as a collection of established facts, but as an ongoing, dynamic process – messy, yet ultimately satisfying.

The post The value of being messy appeared first on CERN Courier.

]]>
Opinion Claire Malone argues that science communicators should not stray too far into public-relations territory. https://cerncourier.com/wp-content/uploads/2025/01/CCJanFeb25_VIEW_malone.jpg
Unprecedented progress in energy-efficient RF https://cerncourier.com/a/unprecedented-progress-in-energy-efficient-rf/ Mon, 27 Jan 2025 07:14:38 +0000 https://cerncourier.com/?p=112349 Forty-five experts from industry and academia met in the magnificent city of Toledo for the second workshop on efficient RF sources.

The post Unprecedented progress in energy-efficient RF appeared first on CERN Courier.

]]>
Forty-five experts from industry and academia met in the magnificent city of Toledo, Spain from 23 to 25 September 2024 for the second workshop on efficient RF sources. Part of the I.FAST initiative on sustainable concepts and technologies (CERN Courier July/August 2024 p20), the event focused on recent advances in energy-efficient technology for RF sources essential to accelerators. Progress in the last two years has been unprecedented, with new initiatives and accomplishments around the world fuelled by the ambitious goals of new, high-energy particle-physics projects.

Out of more than 30 presentations, a significant number featured pulsed, high-peak-power RF sources working at frequencies above 3 GHz in the S, C and X bands. These involve high-efficiency klystrons that are being designed, built and tested for the KEK e/e+ Injector, the new EuPRAXIA@SPARC_LAB linac, the CLIC testing facilities, muon collider R&D, the CEPC injector linac and the C3 project. Reported increases in beam-to-RF power efficiency range from 15 percentage points for the retrofit prototype for CLIC to more than 25 points (expected) for a new greenfield klystron design that can be used across most new projects.

A very dynamic area for R&D is the search for efficient sources for the continuous wave (CW) and long-pulse RF needed for circular accelerators. Typically working in the L-band, existing devices deliver less than 3 MW in peak power. Solid-state amplifiers, inductive output tubes, klystrons, magnetrons, triodes and exotic newly rediscovered vacuum tubes called “tristrons” compete in this arena. Successful prototypes have been built for the High-Luminosity LHC and CEPC with power efficiency gains of 10 to 20 points. In the case of the LHC, this will allow 15% more power without an impact on the electricity bill; in the case of a circular Higgs factory, this will allow a 30% reduction. CERN and SLAC are also investigating very-high-efficiency vacuum tubes for the Future Circular Collider with a potential reduction of close to 50% on the final electricity bill. A collaboration between academia and industry would certainly be required to bring this exciting new technology to light.
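
As a back-of-the-envelope illustration (with purely hypothetical numbers, not figures quoted at the workshop), the grid power needed to deliver a fixed RF power scales inversely with the RF-generation efficiency, so a gain of a few percentage points feeds straight into the electricity bill:

```python
# Hypothetical numbers for illustration only: for a fixed RF output power,
# the wall-plug (grid) power scales as P_grid = P_RF / efficiency.
p_rf_mw = 10.0                    # assumed RF power to be delivered, in MW
eff_old, eff_new = 0.60, 0.75     # assumed efficiencies before/after an upgrade

p_grid_old = p_rf_mw / eff_old
p_grid_new = p_rf_mw / eff_new
print(f"grid power before: {p_grid_old:.1f} MW, after: {p_grid_new:.1f} MW")
print(f"electricity saving: {100 * (1 - p_grid_new / p_grid_old):.0f}%")
```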

Besides the astounding advances in vacuum-tube technology, solid-state amplifiers based on cheap transistors are undergoing a major transformation thanks to the adoption of gallium-nitride technology. Commercial amplifiers are now capable of delivering kilowatts of power at low duty cycles with a power efficiency of 80%, while Uppsala University and the European Spallation Source have demonstrated the same efficiency for combined systems working in CW.

The search for energy efficiency does not stop at designing and building more efficient RF sources. All aspects of operation, power combination, and the use of permanent magnets and efficient modulators need to be folded in, as described by many concrete examples during the workshop. The field is thriving.

The post Unprecedented progress in energy-efficient RF appeared first on CERN Courier.

]]>
Meeting report Forty-five experts from industry and academia met in the magnificent city of Toledo for the second workshop on efficient RF sources. https://cerncourier.com/wp-content/uploads/2025/01/CCJanFeb25_FN_WERFSII.jpg
ICFA talks strategy and sustainability in Prague https://cerncourier.com/a/icfa-talks-strategy-and-sustainability-in-prague-2/ Mon, 27 Jan 2025 07:13:18 +0000 https://preview-courier.web.cern.ch/?p=111309 The 96th ICFA meeting heard extensive reports from the leading HEP laboratories and various world regions on their recent activities and plans.

The post ICFA talks strategy and sustainability in Prague appeared first on CERN Courier.

]]>
ICFA, the International Committee for Future Accelerators, was formed in 1976 to promote international collaboration in all phases of the construction and exploitation of very-high-energy accelerators. Its 96th meeting took place on 20 and 21 July during the recent ICHEP conference in Prague. Almost all of the 16 members from across the world attended in person, making the assembly lively and constructive.

The committee heard extensive reports from the leading HEP laboratories and various world regions on their recent activities and plans, including a presentation by Paris Sphicas, the chair of the European Committee for Future Accelerators (ECFA), on the process for the update of the European strategy for particle physics (ESPP). Launched by CERN Council in March 2024, the ESPP update is charged with recommending the next collider project at CERN after HL-LHC operation.

A global task

The ESPP update is also of high interest to non-European institutions and projects. Consequently, in addition to the expected inputs to the strategy from European HEP communities, those from non-European HEP communities are also welcome. Moreover, the recent US P5 report and the Chinese plans for CEPC, with a potential positive decision in 2025/2026, and discussions about the ILC project in Japan, will be important elements of the work to be carried out in the context of the ESPP update. They also emphasise the global nature of high-energy physics.

An integral part of the work of ICFA is carried out within its panels, which have been very active. Presentations were given by the new panel on the Data Lifecycle (chair Kati Lassila-Perini, Helsinki), the Beam Dynamics panel (new chair Yuan He, IMPCAS) and the Advanced and Novel Accelerators panel (new chair Patric Muggli, Max Planck Munich, represented at the meeting by Brigitte Cros, Paris-Saclay). The Instrumentation and Innovation Development panel (chair Ian Shipsey, Oxford) is setting an example with its numerous schools, the ICFA instrumentation awards and centrally sponsored instrumentation studentships for early-career researchers from underserved world regions. Finally, the chair of the ILC International Development Team panel (Tatsuya Nakada, EPFL) summarised the latest status of the ILC Technological Network, and the proposed ILC collider project in Japan.

ICFA noted interesting structural developments in the global organisation of HEP

A special session was devoted to the sustainability of HEP accelerator infrastructures, considering the need to invest efforts into guidelines that enable better comparison of the environmental reports of labs and infrastructures, in particular for future facilities. It was therefore natural for ICFA to also hear reports not only from the panel on Sustainable Accelerators and Colliders led by Thomas Roser (BNL), but also from the European Lab Directors Working Group on Sustainability. This group, chaired by Caterina Bloise (INFN) and Maxim Titov (CEA), is mandated to develop a set of key indicators and a methodology for the reporting on future HEP projects, to be delivered in time for the ESPP update.

Finally, ICFA noted some very interesting structural developments in the global organisation of HEP. In the Asia-Oceania region, ACFA-HEP was recently formed as a sub-panel under the Asian Committee for Future Accelerators (ACFA), aiming for a better coordination of HEP activities in this particular region of the world. Hopefully, this will encourage other world regions to organise themselves in a similar way in order to strengthen their voice in the global HEP community – for example in Latin America. Here, a meeting was organised in August by the Latin American Association for High Energy, Cosmology and Astroparticle Physics (LAA-HECAP) to bring together scientists, institutions and funding agencies from across Latin America to coordinate actions for jointly funding research projects across the continent.

The next in-person ICFA meeting will be held during the Lepton–Photon conference in Madison, Wisconsin (USA), in August 2025.

The post ICFA talks strategy and sustainability in Prague appeared first on CERN Courier.

]]>
Meeting report The 96th ICFA meeting heard extensive reports from the leading HEP laboratories and various world regions on their recent activities and plans. https://cerncourier.com/wp-content/uploads/2024/09/CCNovDec24_FN_ICFA.jpg
Open-science cloud takes shape in Berlin https://cerncourier.com/a/open-science-cloud-takes-shape-in-berlin/ Fri, 24 Jan 2025 15:16:54 +0000 https://cerncourier.com/?p=112354 Findable, Accessible, Interoperable and Reusable: the sixth symposium of the European Open Science Cloud (EOSC) attracted over 1,000 participants.

The post Open-science cloud takes shape in Berlin appeared first on CERN Courier.

]]>
Findable. Accessible. Interoperable. Reusable. That’s the dream scenario for scientific data and tools. The European Open Science Cloud (EOSC) is a pan-European initiative to develop a web of “FAIR” data services across all scientific fields. EOSC’s vision is to put in place a system for researchers in Europe to store, share, process, analyse and reuse research outputs such as data, publications and software across disciplines and borders.

EOSC’s sixth symposium attracted 450 delegates to Berlin from 21 to 23 October 2024, with a further 900 participating online. Since its launch in 2017, EOSC activities have focused on conceptualisation, prototyping and planning. In order to develop a trusted federation of research data and services for research and innovation, EOSC is being deployed as a network of nodes. With the launch during the symposium of the EOSC EU node, this year marked a transition from design to deployment.

While EOSC is a flagship science initiative of the European Commission, FAIR concerns researchers and stakeholders globally. Via the multiple projects under the wings of EOSC that collaborate with software and data institutes around the world, a pan-European effort can be made to ensure a research landscape that encourages knowledge sharing while recognising work and training the next generation in best practices in research. The EU node – funded by the European Commission, and the first to be implemented – will serve as a reference for roughly 10 additional nodes to be deployed in a first wave, with more to follow. They are accessible using any institutional credentials based on GÉANT’s MyAccess or with an EU login. A first operational implementation of the EOSC Federation is expected by the end of 2025.

A thematic focus of this year’s symposium was the need for clear guidelines on the adaptation of FAIR governance for artificial intelligence (AI), which relies on the accessibility of large and high-quality datasets. It is often the case that AI models are trained with synthetic data, large-scale simulations and first-principles mathematical models, although these may only provide an incomplete description of complex and highly nonlinear real-world phenomena. Once AI models are calibrated against experimental data, their predictions become increasingly accurate. Adopting FAIR principles for the production, collection and curation of scientific datasets will streamline the design, training, validation and testing of AI models (see, for example, Y Chen et al. 2021 arXiv:2108.02214).

EOSC includes five science clusters, from natural sciences to social sciences, with a dedicated cluster for particle physics and astronomy called ESCAPE: the European Science Cluster of Astronomy and Particle Physics. The future deployment of the ESCAPE Virtual Research Environment across multiple nodes will provide users with tools to bring together diverse experimental results, for example, in the search for evidence of dark matter, and to perform new analyses incorporating data from complementary searches.

The post Open-science cloud takes shape in Berlin appeared first on CERN Courier.

]]>
Meeting report Findable, Accessible, Interoperable and Reusable: the sixth symposium of the European Open Science Cloud (EOSC) attracted over 1,000 participants. https://cerncourier.com/wp-content/uploads/2025/01/CCJanFeb25_FN_cloud.jpg
Data analysis in the age of AI https://cerncourier.com/a/data-analysis-in-the-age-of-ai/ Wed, 20 Nov 2024 13:50:36 +0000 https://cern-courier.web.cern.ch/?p=111424 Experts in data analysis, statistics and machine learning for physics came together from 9 to 12 September for PHYSTAT’s Statistics meets Machine Learning workshop.

The post Data analysis in the age of AI appeared first on CERN Courier.

]]>
Experts in data analysis, statistics and machine learning for physics came together from 9 to 12 September at Imperial College London for PHYSTAT’s Statistics meets Machine Learning workshop. The goal of the meeting, which is part of the PHYSTAT series, was to discuss recent developments in machine learning (ML) and their impact on the statistical data-analysis techniques used in particle physics and astronomy.

Particle-physics experiments typically produce large amounts of highly complex data. Extracting information about the properties of fundamental physics interactions from these data is a non-trivial task. The general availability of simulation frameworks makes it relatively straightforward to model the forward process of data analysis: to go from an analytically formulated theory of nature to a sample of simulated events that describe the observation of that theory for a given particle collider and detector in minute detail. The inverse process – to infer from a set of observed data what is learned about a theory – is much harder as the predictions at the detector level are only available as “point clouds” of simulated events, rather than as the analytically formulated distributions that are needed by most statistical-inference methods.

Traditionally, statistical techniques have found a variety of ways to deal with this problem, mostly centered on simplifying the data via summary statistics that can be modelled empirically in an analytical form. A wide range of ML algorithms, ranging from neural networks to boosted decision trees trained to classify events as signal- or background-like, have been used in the past 25 years to construct such summary statistics.
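
To make the role of such summary statistics concrete, here is a minimal, self-contained sketch: toy Gaussian “events” and a scikit-learn gradient-boosted classifier are assumptions chosen purely for illustration, not a method from any specific analysis presented at the workshop. The classifier compresses each multivariate event into a single score, which can then be histogrammed and fed to a conventional binned fit.

```python
# Minimal sketch (illustrative only): a classifier output used as a
# one-dimensional summary statistic for simulated signal vs background.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 20_000
signal     = rng.normal(loc=[1.0, 0.5], scale=1.0, size=(n, 2))  # toy "signal" events
background = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(n, 2))  # toy "background" events

X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

clf = GradientBoostingClassifier(max_depth=3, n_estimators=200)
clf.fit(X, y)

# The score s(x) compresses the multivariate event into a single number;
# histograms of s(x) for data and simulation then enter the usual binned fit.
scores = clf.predict_proba(X)[:, 1]
print("mean score, signal:    ", scores[:n].mean())
print("mean score, background:", scores[n:].mean())
```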

The broader field of ML has experienced a very rapid development in recent years, moving from relatively straightforward models capable of describing a handful of observable quantities, to neural models with advanced architectures such as normalising flows, diffusion models and transformers. These boast millions to billions of parameters that are potentially capable of describing hundreds to thousands of observables – and can now extract features from the data with an order-of-magnitude better performance than traditional approaches. 

New generation

These advances are driven by newly available computation strategies that not only calculate the learned functions, but also their analytical derivatives with respect to all model parameters, greatly speeding up training times, in particular in combination with modern computing hardware with graphics processing units (GPUs) that facilitate massively parallel calculations. This new generation of ML models offers great potential for novel uses in physics data analyses, but such models have not yet found their way to the mainstream of published physics results on a large scale. Nevertheless, significant progress has been made in the particle-physics community in learning the technology needed, and many new developments using this technology were shown at the workshop.

This new generation of machine-learning models offers great potential for novel uses in physics data analyses

Many of these ML developments showcase the ability of modern ML architectures to learn multidimensional distributions from point-cloud training samples to a very good approximation, even when the number of dimensions is large, for example between 20 and 100. 

A prime use-case of such ML models is an emerging statistical analysis strategy known as simulation-based inference (SBI), where learned approximations of the probability density of signal and background over the full high-dimensional observables space are used, dispensing with the notion of summary statistics to simplify the data. Many examples were shown at the workshop, with applications ranging from particle physics to astronomy, pointing to significant improvements in sensitivity. Work is ongoing on procedures to model systematic uncertainties, and no published results in particle physics exist to date. Examples from astronomy showed that SBI can give results of comparable precision to the default Markov chain Monte Carlo approach for Bayesian computations, but with orders of magnitude faster computation times.
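
One common building block behind such approaches is the “likelihood-ratio trick”: a classifier trained to separate events simulated under two parameter hypotheses approximates their likelihood ratio. The sketch below illustrates the idea on a one-dimensional Gaussian toy where the exact answer is known; the model, parameter values and network size are assumptions for illustration only, not a published SBI analysis.

```python
# Illustrative sketch of the likelihood-ratio trick (toy example).
# A classifier separating events simulated at theta0 from events simulated at
# theta1 approximates r(x) = p(x|theta1)/p(x|theta0) via r(x) ~ s(x)/(1 - s(x)).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n = 20_000
theta0, theta1 = 0.0, 0.5                     # two hypotheses for the toy parameter
x0 = rng.normal(theta0, 1.0, size=(n, 1))     # "simulation" at theta0
x1 = rng.normal(theta1, 1.0, size=(n, 1))     # "simulation" at theta1

X = np.vstack([x0, x1])
y = np.concatenate([np.zeros(n), np.ones(n)])

clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=300)
clf.fit(X, y)

def log_likelihood_ratio(x):
    s = clf.predict_proba(x.reshape(-1, 1))[:, 1]
    return np.log(s / (1.0 - s))

# For this Gaussian toy the exact answer is known, so the approximation can be
# checked: log r(x) = (theta1 - theta0) * x - (theta1**2 - theta0**2) / 2.
x_test = np.array([-1.0, 0.0, 1.0])
print("learned:", log_likelihood_ratio(x_test))
print("exact:  ", (theta1 - theta0) * x_test - (theta1**2 - theta0**2) / 2)
```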

Beyond binning

A commonly used alternative approach to the full-fledged theory parameter inference from observed data is known as deconvolution or unfolding. Here the goal is publishing intermediate results in a form where the detector response has been taken out, but stopping short of interpreting this result in a particular theory framework. The classical approach to unfolding requires estimating a response matrix that captures the smearing effect of the detector on a particular observable, and applying the inverse of that to obtain an estimate of a theory-level distribution – however, this approach is challenging and limited in scope, as the inversion is numerically unstable, and requires a low-dimensional binning of the data. Results on several ML-based approaches were presented, which either learn the response matrix from modelling distributions outright (the generative approach) or learn classifiers that reweight simulated samples (the discriminative approach). Both approaches show very promising results that do not have the limitations on the binning and dimensionality of the distribution of the classical response-inversion approach.
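
For orientation, the sketch below shows the classical, binned picture with a standard regularised workaround: a response matrix is estimated from toy (truth, reconstructed) pairs and unfolded with a few iterations of a D'Agostini-style (iterative Bayesian) update rather than a direct, numerically unstable inversion. This is not the ML approach discussed at the workshop; the smearing model, binning and number of iterations are illustrative assumptions.

```python
# Toy sketch of binned, iterative unfolding (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(2)

# Toy truth spectrum and Gaussian detector smearing.
edges = np.linspace(0.0, 10.0, 21)                         # 20 bins
truth_vals = rng.exponential(scale=3.0, size=200_000)
reco_vals  = truth_vals + rng.normal(0.0, 0.5, size=truth_vals.size)

in_range = (truth_vals < 10) & (reco_vals > 0) & (reco_vals < 10)
t, r = truth_vals[in_range], reco_vals[in_range]

# Response matrix R[j, i] = P(reco bin j | truth bin i), estimated from the pairs.
H, _, _ = np.histogram2d(r, t, bins=[edges, edges])
R = H / H.sum(axis=0, keepdims=True)

# Pseudo-data: part of the same smeared toy sample, histogrammed at reco level.
data, _ = np.histogram(r[:50_000], bins=edges)

# Iterative (D'Agostini-style) unfolding, starting from a flat prior.
unfolded = np.full(len(edges) - 1, data.sum() / (len(edges) - 1))
for _ in range(4):
    folded = R @ unfolded                                   # fold current estimate
    unfolded = unfolded * (R.T @ (data / folded))           # multiplicative update

print("unfolded truth estimate (first five bins):", np.round(unfolded[:5]))
```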

A third domain where ML is facilitating great progress is that of anomaly searches, where an anomaly can either be a single observation that doesn’t fit the distribution (mostly in astronomy), or a collection of events that together don’t fit the distribution (mostly in particle physics). Several analyses highlighted both the power of ML models in such searches and the bounds from statistical theory: it is impossible to optimise sensitivity for single-event anomalies without knowing the outlier distribution, and unsupervised anomaly detectors require a semi-supervised statistical model to interpret ensembles of outliers.
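
As a minimal illustration of unsupervised outlier scoring (one of many possible detectors, not a method singled out at the workshop), the sketch below trains an IsolationForest on a toy sample with a small injected anomalous population. It flags candidate outliers without knowing the outlier distribution; interpreting an ensemble of such flags still requires the kind of semi-supervised statistical model noted above.

```python
# Toy sketch of unsupervised anomaly scoring with an IsolationForest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(3)

# "Background" events form the bulk; a handful of injected outliers stand in
# for an anomalous population.
bulk     = rng.normal(0.0, 1.0, size=(10_000, 2))
outliers = rng.normal(4.0, 0.3, size=(20, 2))
X = np.vstack([bulk, outliers])

detector = IsolationForest(n_estimators=200, contamination="auto", random_state=0)
detector.fit(X)

# score_samples: lower values are more anomalous.
scores = detector.score_samples(X)
most_anomalous = np.argsort(scores)[:20]
print("fraction of injected outliers among the 20 most anomalous events:",
      np.mean(most_anomalous >= 10_000))
```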

A final application of machine-learned distributions that was much discussed is data augmentation – sampling a new, larger data sample from a learned distribution. If the synthetic data is significantly larger than the training sample, its statistical power will be greater, but it will derive this statistical power from the smooth interpolation of the model, potentially generating so-called inductive bias. The validity of the assumed smoothness depends on its realism in a particular setting, for which there is no generic validation strategy. The use of a generative model amounts to a tradeoff between bias and variance.
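
A toy version of this trade-off: fit a kernel density estimate to a small training sample and oversample it. The bandwidth below is a hand-picked assumption, and the apparent gain in sample size rests entirely on the smoothness that choice imposes.

```python
# Toy sketch of data augmentation from a learned (here, KDE) model.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(4)

# Small "training" sample from a distribution the model does not know:
# a mixture of two Gaussians in one dimension.
train = np.concatenate([rng.normal(-2.0, 0.5, 500),
                        rng.normal( 1.0, 1.0, 500)]).reshape(-1, 1)

# Fit a KDE (bandwidth chosen by hand for illustration) and oversample it 100x.
kde = KernelDensity(kernel="gaussian", bandwidth=0.3).fit(train)
synthetic = kde.sample(100_000, random_state=0)

# The synthetic sample is larger, but its extra "statistical power" comes
# entirely from the smoothness of the fitted model (potential inductive bias).
print("training mean:", train.mean(), " synthetic mean:", synthetic.mean())
print("training std: ", train.std(),  " synthetic std: ", synthetic.std())
```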

Interpretable and explainable

Beyond the various novel applications of ML, there were lively discussions on the more fundamental aspects of artificial intelligence (AI), notably on the notion of and need for AI to be interpretable or explainable. Explainable AI aims to elucidate what input information was used, and its relative importance, but this goal has no unambiguous definition. The discussion on the need for explainability centres to a large extent on trust: would you trust a discovery if it is unclear what information the model used and how it was used? Can you convince peers of the validity of your result? The notion of interpretable AI goes beyond that. It is a quality that scientists often desire, as human knowledge resulting from AI-based science is generally expected to be interpretable, for example in the form of theories based on symmetries, or structures that are simple, or “low-rank”. However, interpretability has no formal criteria, which makes it an impractical requirement. Beyond practicality, there is also a fundamental point: why should nature be simple? Why should models that describe it be restricted to being interpretable? The almost philosophical nature of this question made the discussion on interpretability one of the liveliest ones in the workshop, but for now without conclusion.

Human knowledge resulting from AI-based science is generally desired to be interpretable

For the longer-term future there are several interesting developments in the pipeline. In the design and training of new neural models, two techniques were shown to have great promise. The first one is the concept of foundation models, which are very large models that are pre-trained on very large datasets to learn generic features of the data. When these pre-trained generic models are retrained to perform a specific task, they are shown to outperform purpose-trained models for that same task. The second is encoding domain knowledge in the network. Networks that have known symmetry principles encoded in the model can significantly outperform models that are generically trained on the same data.

The evaluation of systematic effects is still mostly taken care of in the statistical post-processing step. Future ML techniques may more fully integrate systematic uncertainties, for example by reducing the sensitivity to these uncertainties through adversarial training or pivoting methods. Beyond that, future methods may also integrate the currently separate step of propagating systematic uncertainties (“learning the profiling”) into the training of the procedure. A truly global end-to-end optimisation of the full analysis chain may ultimately become feasible and computationally tractable for models that provide analytical derivatives.

The post Data analysis in the age of AI appeared first on CERN Courier.

]]>
Meeting report Experts in data analysis, statistics and machine learning for physics came together from 9 to 12 September for PHYSTAT’s Statistics meets Machine Learning workshop. https://cerncourier.com/wp-content/uploads/2024/10/CCNovDec24FN_Phystat-1-1.jpg
An obligation to engage https://cerncourier.com/a/an-obligation-to-engage/ Wed, 20 Nov 2024 13:29:45 +0000 https://cern-courier.web.cern.ch/?p=111403 As the CERN & Society Foundation turns 10, founding Director-General Rolf-Dieter Heuer argues that physicists have a duty to promote curiosity and evidence-based critical thinking.

The post An obligation to engage appeared first on CERN Courier.

]]>
Science is for everyone, and everyone depends on science, so why not bring more of it to society? That was the idea behind the CERN & Society Foundation, established 10 years ago.

The longer I work in science, and the more people I talk to about science, the more I become convinced that everyone is interested in science whether they realise it or not. Many have emerged from their school education with a belief that science is hard and not for them, but they nevertheless ask the very same questions that those at the cutting edge of fundamental physics research ask, and that people have been asking since time immemorial: what is the universe made of, where did we come from and where are we going? Such curiosity is part of what it is to be human. On a more prosaic level, science and technology play an ever-growing role in modern society, and it is incumbent on all of us to understand its consequences and engage on the debate about its uses.

The power to inspire

When I tell people about CERN, more often than not their eyes light up with excitement and they want to know more. Experiences like this show that the scientific community needs to do all it can to engage with society at large in a fast-changing world. We need to bring people closer to an understanding of science, of how science works and why critical evidence-based thinking is vital in every walk of life, not only in science.

Laboratories like CERN are extraordinary places where people from all over the world come together to explore nature’s mysteries. I believe that when we come together like this, we have the power to inspire and an obligation to use this power to address the critical challenge of public engagement in science and technology. CERN has always taken this responsibility seriously. Ten years ago, it added a new string to its bow in the form of the CERN & Society Foundation. Through philanthropy, the foundation spreads CERN’s spirit of scientific curiosity.

Rolf-Dieter Heuer

The CERN & Society Foundation helps the laboratory to deepen its impact beyond the core mission of fundamental physics research. Projects supported by the foundation encourage talented young people from around the globe to follow STEM careers, catalyse innovation for the benefit of all, and inspire wide and diverse audiences. From training high-school teachers to producing medical isotopes, donors’ generosity brings research excellence to all corners of society.

The foundation’s work rests on three pillars: education and outreach, innovation and knowledge exchange, and culture and creativity. Allow me to highlight one example from each pillar that I particularly like.

One of the flagships of the education and outreach pillar is the Beamline for Schools (BL4S) competition. Launched in 2014, BL4S invites groups of high-school students from around the world to submit a proposal for an experiment at CERN. The winning teams are invited to come to CERN to carry out their experiment under expert supervision from CERN scientists. More recently, the DESY laboratory has joined the programme and also welcomes high-school groups to work on a beamline there. Project proposals have ranged from fundamental physics to projects aimed at enabling cosmic-ray tomography of the pyramids by measuring muon transmission through limestone (see “Inside pyramids, underneath glaciers“). To date, some 20,000 students have taken part in the competition, with 25 winning teams coming to CERN or DESY to carry out their experiments (see “From blackboard to beamline“).

Zenodo is a great example of the innovation and knowledge-exchange pillar. It provides a repository for free and easy access to research results, data and analysis code, thereby promoting the ideal of open science, which is at the very heart of scientific progress. Zenodo taps into CERN’s long-standing tradition and know-how in sharing and preserving scientific knowledge for the benefit of all. The scientific community can now store data in a non-commercial environment, freely available for society at large. Zenodo goes far beyond high-energy physics and played an important role during the COVID-19 pandemic.

Mutual inspiration

Our flagship culture-and-creativity initiative is the world-leading Arts at CERN programme, which recognises the creativity inherent in both the arts and the sciences, and harnesses them to generate benefits for both. Participating artists and scientists find mutual inspiration, going on to inspire audiences around the world.

“In an era where society needs science more than ever, inspiring new generations to believe in their dreams and giving them the tools and space to change the world is essential,” said one donor recently. It is encouraging to hear such sentiments, and there’s no doubt that the CERN & Society Foundation should feel satisfied with its first decade. Through the examples I have cited above, and many more that I have not mentioned, the foundation has made a tangible difference. It is, however, but one voice. Scientists and scientific organisations in prominent positions should take inspiration from the foundation: the world needs more ambassadors for science. On that note, all that remains is for me to say happy birthday, CERN & Society Foundation.

The post An obligation to engage appeared first on CERN Courier.

]]>
Opinion As the CERN & Society Foundation turns 10, founding Director-General Rolf-Dieter Heuer argues that physicists have a duty to promote curiosity and evidence-based critical thinking. https://cerncourier.com/wp-content/uploads/2024/10/CCNovDec24_VIEW_summer-1-1.jpg
Exploding misconceptions https://cerncourier.com/a/exploding-misconceptions/ Wed, 13 Nov 2024 09:47:45 +0000 https://cern-courier.web.cern.ch/?p=111451 Cosmologist Katie Mack talks to the Courier about how high-energy physics can succeed in #scicomm by throwing open the doors to academia.

The post Exploding misconceptions appeared first on CERN Courier.

]]>
Katie Mack

What role does science communication play in your academic career?

When I was a postdoc I started to realise that the science communication side of my life was really important to me. It felt like I was having a big impact – and in research, you don’t always feel like you’re having that big impact. When you’re a grad student or postdoc, you spend a lot of time dealing with rejection, feeling like you’re not making progress or you’re not good enough. I realised that with science communication, I was able to really feel like I did know something, and I was able to share that with people.

When I began to apply for faculty jobs, I realised I didn’t want to just do science writing as a nights and weekends job, I wanted it to be integrated into my career. Partially because I didn’t want to give up the opportunity to have that kind of impact, but also because I really enjoyed it. It was energising for me and helped me contextualise the work I was doing as a scientist.

How did you begin your career in science communication?

I’ve always enjoyed writing stories and poetry. At some point I figured out that I could write about science. When I went to grad school I took a class on science journalism and the professor helped me pitch some stories to magazines, and I started to do freelance science writing. Then I discovered Twitter. That was even better because I could share every little idea I had with a big audience. Between Twitter and freelance science writing, I garnered quite a large profile in science communication and that led to opportunities to speak and do more writing. At some point I was approached by agents and publishers about writing books.

Who is your audience?

When I’m not talking to other scientists, my main community is generally those who have a high-school education, but not necessarily a university education. I don’t tailor things to people who aren’t interested in science, or try to change people’s minds on whether science is a good idea. I try to help people who don’t have a science background feel empowered to learn about science. I think there are a lot of people who don’t see themselves as “science people”. I think that’s a silly concept but a lot of people conceptualise it that way. They feel like science is closed to them.

The more that science communicators can give people a moment of understanding, an insight into science, I think they can really help people get more involved in science. The best feedback I’ve ever gotten is when students have come up to me and said “I started studying physics because I followed you on Twitter and I saw that I could do this,” or they read my book and that inspired them. That’s absolutely the best thing that comes out of this. It is possible to have a big impact on individuals by doing social media and science communication – and hopefully change the situation in science itself over time.

What were your own preconceptions of academia?

I have been excited about science since I was a little kid. I saw that Stephen Hawking was called a cosmologist, so I decided I wanted to be a cosmologist too. I had this vision in my head that I would be a theoretical physicist. I thought that involved a lot of standing alone in a small room with a blackboard, writing equations and having eureka moments. That’s what was always depicted on TV: you just sit by yourself and think real hard. When I actually got into academia, I was surprised by how collaborative and social it is. That was probably the biggest difference between expectation and reality.

How do you communicate the challenges of academia, alongside the awe-inspiring discoveries and eureka moments?

I think it’s important to talk about what it’s really like to be an academic, in both good ways and bad. Most people outside of academia have no idea what we do, so it’s really valuable to share our experiences, both because it challenges stereotypes in terms of what we’re really motivated by and how we spend our time, but also because there are a lot of people who have the same impression I did: where you just sit alone in a room with a chalkboard. I believe it’s important to be clear about what you actually do in academia, so more people can see themselves happy in the job.

At the same time, there are challenges. Academia is hard and can be very isolating. My advice for early-career researchers is to have things other than science in your life. As a student you’re working on something that potentially no one else cares very much about, except maybe your supervisor. You’re going to be the world-expert on it for a while. It can be hard to go through that and not have anybody to talk to about your work. I think it’s important to acknowledge what people go through and encourage them to get support.

Theoretical physicist Katie Mack

There are of course other parts of academia that can be really challenging, like moving all the time. I went from West coast to East coast between undergrad and grad school, and then from the US to the UK, from the UK to Australia, back to the US and then to Canada. That’s a lot. It’s hard. They’re all big moves so you lose whatever local support system you had and you have to start over in a new place, make new friends and get used to a whole new government bureaucracy.

So there are a whole lot of things that are difficult about academia, and you do need to acknowledge those because a lot of them affect equity. Some of these make it more challenging to have diversity in the field, and they disproportionately affect some groups more than others. It is important to talk about these issues instead of just sweeping people under the rug.

Do you think that social media can help to diversify science and research?

Yes! I think that a large reason why people from underrepresented groups leave science is because they lack the feeling of belonging. If you get into a field and don’t feel like you belong, it’s hard to power through that. It makes it very unpleasant to be there. So I think that one of the ways social media can really help is by letting people see scientists who are not the stereotypical old white men. Talking about what being a scientist is really like, what the lifestyle is like, is really helpful for dismantling those stereotypes.

Your first book, The End of Everything, explored astrophysics but your next will popularise particle physics. Have you had to change your strategy when communicating different subjects?

This book is definitely a lot harder to write. The first one was very big and dramatic: the universe is ending! In this one, I’m really trying to get deeper into how fundamental physics works, which is a more challenging story to tell. The way I’m framing it is through “how to build a universe”. It’s about how fundamental physics connects with the structure of reality, both in terms of what we experience in our daily lives, but also the structure of the universe, and how physicists are working to understand that. I also want to highlight some of the scientists who are doing that work.

So yes, it’s much harder to find a catchy hook, but I think the subject matter and topics are things that people are curious about and have a hunger to understand. There really is a desire amongst the public to understand what the point of studying particle physics is.

Is high-energy physics succeeding when it comes to communicating with the public?

I think that there are some aspects where high-energy physics does a fantastic job. When the Higgs boson was discovered in 2012, it was all over the news and everybody was talking about it. Even though it’s a really tough concept to explain, a lot of people got some inkling of its importance.

A lot of science communication in high-energy physics relies on big discoveries, however recently there have not been that many discoveries at the level of international news. There have been many interesting anomalies in recent years, however in terms of discoveries we had the Higgs and the neutrino mass in 1998, but I’m not sure that there are many others that would really grab your attention if you’re not already invested in physics.

Part of the challenge is just the phase of discovery that particle physics is in right now. We have a model, and we’re trying to find the edges of validity of that model. We see some anomalies and then we fix them, and some might stick around. We have some ideas and theories but they might not pan out. That’s kind of the story we’re working with right now, whereas if you’re looking at astronomy, we had gravitational waves and dark energy. We get new telescopes with beautiful pictures all the time, so it’s easier to communicate and get people excited than it is in particle physics, where we’re constantly refining the model and learning new things. It’s a fantastically exciting time, but there have been no big paradigm shifts recently.

How can you keep people engaged in a subject where big discoveries aren’t constantly being made?

I think it’s hard. There are a few ways to go about it. You can talk about the really massive journey we’re on: this hugely consequential and difficult challenge we’re facing in high-energy physics. It’s a huge task of massive global effort, so you can help people feel involved in the quest to go beyond the Standard Model of particle physics.

You need to acknowledge it’s going to be a long journey before we make any big discoveries. There’s much work to be done, and we’re learning lots of amazing things along the way. We’re getting much higher precision. The process of discovery is also hugely consequential outside of high-energy physics: there are so many technological spin-offs that tie into other fields, like cosmology. Discoveries are being made between particle and cosmological physics that are really exciting.

Every little milestone is an achievement to be celebrated

We don’t know what the end of the story looks like. There aren’t a lot of big signposts along the way where we can say “we’ve made so much progress, we’re halfway there!” Highlighting the purpose of discovery, the little exciting things that we accomplish along the way such as new experimental achievements, and the people who are involved and what they’re excited about – this is how we can get around this communication challenge.

Every little milestone is an achievement to be celebrated. CERN is the biggest laboratory in the world. It’s one of humanity’s crowning achievements in terms of technology and international collaboration – I don’t think that’s an exaggeration. CERN and the International Space Station. Those two labs are examples of where a bunch of different countries, which may or may not get along, collaborate to achieve something that they can’t do alone. Seeing how everyone works together on these projects is really inspiring. If more people were able to get a glimpse of the excitement and enthusiasm around these experiments, it would make a big difference.

The post Exploding misconceptions appeared first on CERN Courier.

]]>
Opinion Cosmologist Katie Mack talks to the Courier about how high-energy physics can succeed in #scicomm by throwing open the doors to academia. https://cerncourier.com/wp-content/uploads/2024/10/CCNovDec24_INT_writing-1.jpg
Building on success, planning for the future https://cerncourier.com/a/building-on-success-planning-for-the-future/ Mon, 16 Sep 2024 13:46:34 +0000 https://preview-courier.web.cern.ch/?p=110476 The Chamonix Workshop upheld its long tradition of fostering open and collaborative discussions within CERN’s accelerator and physics communities.

The post Building on success, planning for the future appeared first on CERN Courier.

]]>
From 29 January to 1 February, the Chamonix Workshop 2024 upheld its long tradition of fostering open and collaborative discussions within CERN’s accelerator and physics communities. This year marked a significant shift with more explicit inclusion of the injector complex, acknowledging its crucial role in shaping future research endeavours. Chamonix discussions focused on three main areas: maximising the remaining years of Run 3; the High-Luminosity LHC (HL-LHC), preparations for Long Shutdown 3 and operations in Run 4; and a look further ahead to the proposed Future Circular Collider (FCC).

Immense effort

Analysing the performance of CERN’s accelerator complex, speakers noted the impressive progress to date, examined limitations in the LHC and injectors and discussed improvements for optimal performance in upcoming runs. It’s difficult to do justice to the immense technical effort made by all systems, operations and technical infrastructure teams that underpins the exploitation of the complex. Machine availability emerged as a crucial theme, recognised as critical for both maximising the potential of existing facilities and ensuring the success of the HL-LHC. Fault tracking, dedicated maintenance efforts and targeted infrastructure improvements across the complex were highlighted as key contributors to achieving and maintaining optimal uptime.

As the HL-LHC project moves into full series production, the technical challenges associated with magnets, cold powering and crab cavities are being addressed (CERN Courier January/February 2024 p37). Looking beyond Long Shutdown 3 (LS3), potential limitations are already being targeted now, with, for example, electron-cloud mitigation measures planned to be deployed in LS3. The transition to the high-luminosity era will involve a huge programme of work that requires meticulous preparation and a well-coordinated effort across the complex during LS3, which will see the deployment of the HL-LHC, a widespread consolidation effort, and other upgrades such as that planned for the ECN3 cavern at CERN’s North Area.

The vision for the next decades of these facilities is diverse, imaginative and well-motivated from a physics perspective

The breadth and depth of the physics being performed at CERN facilities is quite remarkable, and the Chamonix workshop reconfirmed the high demand from experimentalists across the board. The unique capabilities of ISOLDE, n_TOF, AD-ELENA, and the East and North Areas were recognised. The North Area, for example, provides proton, hadron, electron and ion beams for detector R&D, experiments, the CERN neutrino platform and irradiation facilities, and counts more than 2000 users. The vision for the next decades of these facilities is diverse, imaginative and well-motivated from a physics perspective. The potential for long-term exploitation and for fully leveraging the capabilities of the LHC and other facilities is considerable, demanding continued support and development.

In the longer term, CERN is exploring the potential construction of the FCC via a dedicated feasibility study that has just delivered a mid-term report – a summary of which was presented at Chamonix. The initiative is accompanied by R&D on key accelerator technologies. The physics case for FCC-ee was well made for an audience of mostly non-particle physicists, concluding that the FCC is the only proposed collider that covers each key area in the field – electroweak, QCD, flavour, Higgs and searches for phenomena beyond the Standard Model – in paradigm-shifting depth.

Environmental consciousness

Sustainability was another focus of the Chamonix workshop. Building and operating future facilities with environmental consciousness is a top priority, and full life-cycle analyses will be performed for any options to help ensure a low-carbon future.

Interesting times, lots to do. To quote former CERN Director-General Herwig Schopper from 1983: “It is therefore clear that, for some time to come, there will be interesting work to do and I doubt whether accelerator experts will find themselves without a job.”

The post Building on success, planning for the future appeared first on CERN Courier.

]]>
Meeting report The Chamonix Workshop upheld its long tradition of fostering open and collaborative discussions within CERN’s accelerator and physics communities. https://cerncourier.com/wp-content/uploads/2024/04/CCMarApr24_FN_cham24.jpg
Wonderstruck wanderings https://cerncourier.com/a/wonderstruck-wanderings/ Mon, 16 Sep 2024 09:00:01 +0000 https://preview-courier.web.cern.ch/?p=111204 The wonder and awe that we sense when we look at the starry skies is a major motivation to do science. Both Plato (Theaetetus 155d) and Aristotle (Metaphysics 982b12) wrote that philosophy starts in wonder. Plato went even further to declare that the eye’s primary purpose is none other than to see and study the […]

The post Wonderstruck wanderings appeared first on CERN Courier.

]]>
An illustration of a flea from Robert Hooke’s Micrographia

The wonder and awe that we sense when we look at the starry skies is a major motivation to do science. Both Plato (Theaetetus 155d) and Aristotle (Metaphysics 982b12) wrote that philosophy starts in wonder. Plato went even further to declare that the eye’s primary purpose is none other than to see and study the stars (Timaeus 47c). But wonder and awe also play a wider role beyond science, and are fundamental to other endeavours of human civilisation, such as religion. In Wonderstruck: How Wonder and Awe Shape the Way We Think, Helen De Cruz (Saint Louis University) traces the relationship between wonder and awe and philosophy, religion, magic and science, and the development of these concepts throughout history.

Essential emotion

De Cruz’s book is rich in content, drawing from psychology, anthropology and literature. Aptly for particle physicists, she points out that it is not only the very largest scales that fill us with awe, but also the very smallest, as for example in Robert Hooke’s Micrographia, the first book to include illustrations of insects and plants as seen through a microscope. Everyday things may be sources of wonder, according to philosopher and rabbi Abraham J Heschel, who has written on religion as a response to the awe that we feel when we look at the cosmos. Even hard-nosed economists recognise the fundamental role of wonder, she observes: Adam Smith, the famous economist who wrote The Wealth of Nations, believed that wonder is an essential emotion that underlies the pursuit of science, as it prompts people to explore the unknown and seek knowledge about the world. Although particle physics is not mentioned explicitly in the book – the closest instance is a quote from Feynman’s Lectures on Physics – the implications are clear. And while the sources quoted are mostly Western, other traditions are not ignored, with references to Chinese and Japanese culture present, among others.

Wonderstruck

The book also motivates questions that it does not address, some of which are especially interesting for fundamental physics. For example, modern human beings who live and work in cities spend most of their lives in an environment that alienates them from nature, and nature-induced awe must compete with technology-driven amazement. One can maybe glimpse that in outreach, where curiosity about technology sometimes, though not always, eclipses interest in the fundamental questions of science. While the book discusses this topic in the context of climate change – a reality that reminds us that we cannot ignore nature – there is more one can do with respect to the effects of such an attitude in motivating fundamental science.

At a time when large scientific projects, such as CERN’s proposed Future Circular Collider, are being considered, generating a lot of discussions about cost and benefit, this book reminds us that the major motivation of a new telescope or collider is to push into the frontiers of the unknown – a process that starts and finishes with wonder and awe. As such, the book is very useful reading for scientists doing fundamental research, especially those who engage with the public.

The post Wonderstruck wanderings appeared first on CERN Courier.

]]>
Review https://cerncourier.com/wp-content/uploads/2024/09/CCSepOct24_REV_wonder_feature.jpg
US and CERN sign joint statement of intent https://cerncourier.com/a/us-and-cern-sign-joint-statement-of-intent/ Fri, 05 Jul 2024 07:27:40 +0000 https://preview-courier.web.cern.ch/?p=110850 In April, CERN and the US government released a joint statement of intent concerning future planning for large research infrastructures, advanced scientific computing and open science.

The post US and CERN sign joint statement of intent appeared first on CERN Courier.

]]>
In April, CERN and the US government released a joint statement of intent concerning future planning for large research infrastructures, advanced scientific computing and open science. The statement was signed in Washington, DC by CERN Director-General Fabiola Gianotti and principal deputy US chief technology officer Deirdre Mulligan of the White House Office of Science and Technology Policy.

Acknowledging their longstanding partnership in nuclear and particle physics, CERN and the US intend to enhance collaboration in planning activities for large-scale, resource-intensive facilities. Concerning the proposed Future Circular Collider, FCC-ee, the text states: “Should the CERN Member States determine that the FCC-ee is likely to be CERN’s next world-leading research facility following the high-luminosity Large Hadron Collider, the US intends to collaborate on its construction and physics exploitation, subject to appropriate domestic approvals.” A technical and financial feasibility study for the proposed FCC is due to be completed in March 2025.

CERN and the US also intend to discuss potential collaboration on pilot projects to incorporate new analytics techniques and tools such as AI into particle-physics research at scale, and affirm their collective mission “to take swift strategic action that leads to accelerating widespread adoption of equitable open research, science and scholarship throughout the world”.

The post US and CERN sign joint statement of intent appeared first on CERN Courier.

]]>
News In April, CERN and the US government released a joint statement of intent concerning future planning for large research infrastructures, advanced scientific computing and open science. https://cerncourier.com/wp-content/uploads/2024/07/CCJulAug24_NA_US2.jpg
Accelerator sustainability in focus https://cerncourier.com/a/accelerator-sustainability-in-focus/ Fri, 19 Apr 2024 06:31:44 +0000 https://preview-courier.web.cern.ch/?p=110478 A workshop on sustainability for future accelerators took place on 25–27 September in Morioka, Japan.

The post Accelerator sustainability in focus appeared first on CERN Courier.

]]>
The world is facing a crisis of anthropogenic climate change, driven by excessive CO2 emissions during the past 150 years. In response, the United Nations has defined goals in a race towards net-zero carbon emissions. One of these goals is to ensure that all projects due to be completed by 2030 or after have net-zero carbon operation, with a reduction in embodied carbon by at least 40% compared to current practice. At the same time, the European Union (EU), Japan and other nations have decided to become carbon neutral by around 2050.

These boundary conditions put large-scale science projects under pressure to reduce CO2 emissions during construction, operation and potentially decommissioning. For context: given the current French energy mix, CERN’s annual 1.3 TWh electricity consumption (which is mostly used for accelerator operation) corresponds to roughly 50 kt CO2e global warming potential (GWP), while recent estimates for the construction of tunnels for future colliders are in the multi-100 kt CO2e GWP range.
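As a rough cross-check of these numbers, the operational figure follows directly from the electricity consumption and an assumed grid carbon intensity. The short Python sketch below is illustrative only: the value of about 40 g CO2e per kWh stands in for the French energy mix and is an assumption, not a figure taken from the workshop.

annual_consumption_twh = 1.3           # CERN's yearly electricity use, mostly for accelerator operation
grid_intensity_g_per_kwh = 40          # assumed carbon intensity of the French energy mix (illustrative)

kwh = annual_consumption_twh * 1e9     # 1 TWh = 1e9 kWh
co2e_kt = kwh * grid_intensity_g_per_kwh / 1e9   # grams -> kilotonnes (1 kt = 1e9 g)

print(f"Operational GWP: about {co2e_kt:.0f} kt CO2e per year")   # ~52 kt, consistent with "roughly 50 kt"

Doubling or halving the assumed grid intensity shifts the answer accordingly, which is one reason the local energy mix features so prominently in such assessments.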

Green realisation

To discuss potential ways forward, a Workshop on Sustainability for Future Accelerators (WSFA2023) took place on 25–27 September in Morioka, Japan within the framework of the recently started EU project EAJADE (Europe–America–Japan Accelerator Development and Exchange). Around 50 international experts discussed a slew of topics ranging from life-cycle assessments (LCAs) of accelerator technologies with carbon-reduction potential to funding initiatives towards sustainable accelerator R&D, and local initiatives aimed at the “green” realisation of future colliders. With the workshop being held in Japan, the proposed International Linear Collider (ILC) figured prominently as a reference project – attracting considerable attention from local media.

The general context of discussions was set by Beate Heinemann, DESY director for particle physics, on behalf of the European Laboratory Directors Group (LDG). The LDG recently created a working group to assess the sustainability of accelerators, with a mandate to develop guidelines and a minimum set of key indicators pertaining to the methodology and scope of reporting of sustainability aspects for future high-energy physics projects. Since LCAs are becoming the main tool to estimate GWP, a number of project representatives discussed their take on sustainability and steps towards performing LCAs. Starting with the much-cited ARUP study on linear colliders published in 2023 (edms.cern.ch/document/2917948/1), there were presentations on the ESS in Sweden, the ISIS-II neutron and muon source in the UK, the CERN sustainability forum, the Future Circular Collider, the Cool Copper Collider and other proposed colliders. Also discussed were R&D items for sustainable technologies, including CERN’s High Efficiency Klystron Project, the ZEPTO permanent-magnet project, thin film-coated SRF cavities and others.

A second big block in the workshop agenda was devoted to the “greening” of future accelerators and potential local and general construction measures towards achieving this goal. The focus was on Japanese efforts around the ILC, but numerous results can be re-interpreted in a more general way. Presentations were given on the potential of concrete to turn from a massive carbon source into a carbon sink with net negative CO2e balance (a topic with huge industrial interest), on large-scale wooden construction (e.g. for experimental halls), and on the ILC connection with the agriculture, forestry and fisheries industries to reduce CO2 emissions and offset them by increasing CO2 absorption. The focus was on building an energy recycling society by the time the ILC would become operational.

What have we learnt on our way towards sustainable large-scale research infrastructures? First, that time might be our friend: energy mixes will include increasingly large carbon-free components, making construction projects and operations more eco-friendly. Also, new and more sustainable technologies will be developed that help achieve global climate goals. Second, we as a community must consider the imprint our research leaves on the globe, tracking as many indicators as possible. The GWP can be a beginning, but there are many other factors relating, for example, to rare-earth elements, toxicity and acidity. The LCA methodology provides the accelerator community with guidelines for the planning of more sustainable large-scale projects and needs to be further developed – including end-of-life, decommissioning and recycling steps – in an appropriate manner. Last but not least, it is clear that we need to be proactive in anticipating the changes happening in the energy markets and society with respect to sustainability-driven challenges at all levels.

The post Accelerator sustainability in focus appeared first on CERN Courier.

]]>
Meeting report A workshop on sustainability for future accelerators took place on 25–27 September in Morioka, Japan. https://cerncourier.com/wp-content/uploads/2024/04/CCMarApr24_FN_field.jpg
The Many Voices of Modern Physics: Written Communication Practices of Key Discoveries https://cerncourier.com/a/the-many-voices-of-modern-physics-written-communication-practices-of-key-discoveries/ Thu, 21 Mar 2024 15:53:32 +0000 https://preview-courier.web.cern.ch/?p=110251 Joseph Harmon and Alan Gross follow the evolution of written science communication.

The post The Many Voices of Modern Physics: Written Communication Practices of Key Discoveries appeared first on CERN Courier.

]]>

This book provides a rich glimpse into written science communication throughout a century that introduced many new and abstract concepts in physics. It begins with Einstein’s 1905 paper “On the Electrodynamics of Moving Bodies”, in which he introduced special relativity. Atypically, the paper starts with a thought experiment that helps the reader to follow a complex and novel physical mechanism. Authors Harmon and Gross analyse and explain the terminology-dense text, and bring further perspective by adding comments made by other scientists and science writers of the time. They follow this analysis style throughout the book, covering science from the smallest to the largest scales and addressing the controversies surrounding atomic weapons.

The only exception to the book’s focus on written evaluations of scientific papers is the chapter “Astronomical value”, in which the authors revisit the times of great astronomers such as Galileo Galilei and the Herschel siblings William and Caroline. The authors show that, even back then, researchers needed sponsors and supporters to fund their work. Galileo, for instance, regularly presented his findings to the Medici family and fuelled fascination in his patrons so that he was able to continue his research.

While writing the book, Gross, a professor of rhetoric and communications, died unexpectedly, leaving Harmon, a science writer and editor in communications at Argonne National Laboratory, to complete the work.

Although the book is somewhat repetitive in style, readers can pick a topic of interest from the table of contents and see how scientists and communicators interacted with their audiences. While in-depth scientific knowledge is not required, the book is best targeted at readers who are familiar with the basics of physics and who want to gain new perspectives on some of the most important breakthroughs during the past century and beyond. Indeed, by casting well-known texts in a communication context, the book offers analogies and explanations that can be used by anyone involved in public engagement.

The post The Many Voices of Modern Physics: Written Communication Practices of Key Discoveries appeared first on CERN Courier.

]]>
Review Joseph Harmon and Alan Gross follow the evolution of written science communication. https://cerncourier.com/wp-content/uploads/2024/03/The-Many-Voices-of-Modern-Physics-featured.png
Leading in collaborations https://cerncourier.com/a/leading-in-collaborations/ Wed, 17 Jan 2024 09:46:25 +0000 https://preview-courier.web.cern.ch/?p=109980 Gabriel Facini describes a new programme to change the culture of leadership in large scientific collaborations.

The post Leading in collaborations appeared first on CERN Courier.

]]>
Are we at the vanguard of every facet of our field? In our quest for knowledge, physicists have charted nebulae, quantified quarks and built instruments and machines at the edge of technology. Yet, there is a frontier that remains less explored: leadership. As a field, particle physics has only just begun to navigate the complexities of guiding our brightest minds.

Large-experiment collaborations such as those at the LHC achieve remarkable feats. Indeed, social scientists have praised our ability to coordinate thousands of researchers with limited “power” while retaining individual independence. Similarly, just as we continuously optimise experiments for performance and quality, there are also opportunities to refine behaviours and practices to facilitate progress and collective success.

A voice for all

Hierarchies in any organisation can inadvertently become a barrier rather than a facilitator of open idea exchange. Often, decision-making is confined to higher levels, reducing the agency of those implementing actions and leading to disconnects in roles and responsibilities. Excellence in physics doesn’t guarantee the interpersonal skills that are essential for inspiring teams. Moreover, imposter syndrome infects us all, especially junior collaborators who may lack soft-skills training. While striving for diversity we sometimes overlook the need to embrace different personality types, which, for example, can make large meetings daunting for the less outspoken. Good leadership can help navigate these challenges, ensuring that every voice contributes to our collective progress.

Leadership is not management (using resources to get a particular job done), nor is it rank (merely a line on a CV). It is guidance and influence of others towards a shared vision – a pivotal force as essential as any tool in our research arsenal. Good leadership is a combination of strategic foresight, emotional intelligence and adaptive communication; it creates an inclusive environment where individual contributions are not commanded but empowered. These practices would improve any collaboration. In large physics experiments this type of leadership is incidental instead of being broadly acknowledged and pursued.

Luckily, leadership is a skill that can be taught and developed through training. True training is a craft and is best delivered by experts who are not just versed in theory but are also skilled practitioners. Launched in autumn 2023 based on the innovative training approach of Resilient Leaders Elements, a new course “Leading in Collaborations” is tailored specifically for our community. The three-month expert-facilitated course includes four half-day workshops and two one-hour clinics, addressing two main themes: “what I do”, which equips participants with decision-making skills to set clear goals and navigate the path to achieving them; and “who I am”, which encourages participants to channel their emotions positively and motivate both themselves and others effectively. The course confronts participants with the question “What is leadership in a large physics collaboration?” and provides a new framework of concepts. Through self-assessment, peer-feedback sessions, individualised challenges and buddy-coaching, participants are able to identify blind spots and hidden talents. A final assessment shows measurable change in each skill.

The first cohort of 20 participants, displaying a diverse mix of physics experience from various institutions and nationalities, was welcomed to the programme at University College London on 14 and 15 November 2023. More than half of the participants were women – in line with the programme’s aim to ensure that those often overshadowed are given the visibility and support to become more impactful leaders. The lead facilitator, Chris Russell, masterfully connected with the audience via his technical physics background and proceeded to build trust and impart knowledge in an open and supportive atmosphere. When discussing leadership, the initial examples given cited military and political figures; reframing led to a participant’s description of a conductor giving their orchestra space to play through an often-rehearsed tough section as an example of great leadership.

Crucial catalyst

Building on the experience of the first cohort, the aim is to offer the programme more broadly so that we can encourage common practice and change the culture of leadership in large collaborations. Given that the LHC hosts the largest collaborations in physics, the programme also hopes to find a home within CERN’s learning and development portfolio.

The Leading in Collaborations programme is a crucial catalyst in the endeavour to ensure that our precious resources are wielded with precision and purpose, and thus to amplify our collective capacity for discovery. Join the leadership revolution by being the leader you wish you had, no matter your rank. Together, we will become the cultural vanguard!

The post Leading in collaborations appeared first on CERN Courier.

]]>
Careers Gabriel Facini describes a new programme to change the culture of leadership in large scientific collaborations. https://cerncourier.com/wp-content/uploads/2024/01/CCJanFeb24_Careers_leadingcollabs.jpg
Portraits of particle physics in Ukraine https://cerncourier.com/a/portraits-of-particle-physics-in-ukraine/ Thu, 11 Jan 2024 16:47:15 +0000 https://preview-courier.web.cern.ch/?p=109925 Particle physicists in Ukraine describe how they and their institutes are recovering from the damage so far, the importance of continued global support, and how science in Ukraine can be rebuilt when the war is over.

The post Portraits of particle physics in Ukraine appeared first on CERN Courier.

]]>
Institute for Scintillation Materials, Kharkiv

Located in Kharkiv, the Institute for Scintillation Materials (ISMA) of Ukraine has both a large scientific base and technological facilities for the production of scintillation materials and detectors. It has been a member of the CMS collaboration for about 20 years, including participation in the production of scintillation tiles for the current calorimeter and as a potential manufacturer of tiles for the HGCAL upgrade. Since 2021, ISMA has also been a technical associate member of LHCb hosted by the University of Bologna, where we participate in the PLUME (probe for luminosity measurement) project. ISMA is also a member of the Crystal Clear Collaboration at CERN and, since 2019, of the 3D printed detectors (3DET) project (see p8). In addition, ISMA is a supplier of scintillation materials and detectors for projects outside CERN.

Institute for Scintillation Materials

With the outbreak of the war in Ukraine, the Institute became a home for many. In the months following March 2022, about 50 staff members lived in the basement with their families and pets. In addition, some 300 people who were living nearby moved into the Institute’s bomb shelter, where staff provided food and helped people to adapt.

At the beginning of March 2022, one of our processing areas for crystal growth was damaged due to an air raid. This was shocking not only to us, but also for our partners for whom we serve as the main supplier of products. It was necessary to make a quick and important decision: wait until the end of active hostilities and then reconstruct infrastructure and technology, or start doing something now. We realised that technological downtime would result in the loss of a market that had been developed over decades and would also make it economically impossible for us to restart production cycles with the necessary volumes. We got together with our staff, who were living on the Institute’s territory. Some people even came to besieged Kharkiv from other cities to help. Between alarms and artillery shelling, the guys were coming out of the bomb shelters to go to work. Just one month after the war started, products were already being shipped to our customers. Once temperatures started to rise above zero, we started to move the processing equipment and growth units out of the damaged processing area. Not only did we have to repair these, but we also had to clear the premises of other equipment, calculate and pour new foundations, hook up the entire infrastructure and lay the lines for services – all in a period of a few months. By May 2022, we had already started growing large crystals of up to 500 mm in diameter at the new location. Some of our partners did not even notice the delays in delivery and we were able to meet our delivery commitments for 2022 in full.

We are very grateful to our colleagues, and to our friends at CERN, who offered their help and supported us from the early days of the war. They were not only CERN staff members, but also people from other institutes and organisations who called and wrote letters every day. They even organised a special programme to welcome families who had to leave Kharkiv at that time, and helped to persuade those who did not want to leave to move to safer cities in Ukraine or in Europe, at least temporarily.

By mid-summer, ISMA resumed the production of experimental scintillator tiles for CMS

In April 2022 we started discussions on future cooperation with our colleagues at CERN. Unfortunately, it was impossible to continue any work during the first two months of the war. However, we agreed that work should not stop and that some of it could be carried out in the organisations of our partners. We collected all the materials from Kharkiv that our colleagues needed and sent everything to them. Some female colleagues, who could leave Ukraine, were also invited temporarily to continue their work abroad in these organisations. This allowed us to continue joint research programmes with our European partners. All our R&D projects were maintained either in Kharkiv or at the partner institutes abroad.

In May 2022 we were informed that ISMA, together with CNRS, Université Claude Bernard Lyon 1 and CERN, had won a project financed by the European Union’s Horizon Europe programme to develop inorganic scintillation crystals for innovative calorimeters for high-energy physics. By mid-summer, ISMA resumed the production of experimental scintillator tiles for CMS. We also continued work on developing technology for the synthesis of scintillation granules based on inorganic crystals. At the end of summer 2022, the crystals had already been shipped to our partners. Work on the 3D printing of scintillators in Kharkiv continued unabated.

Despite the war and its impact on life in Kharkiv and work at our Institute, over the past 18 months ISMA was able to contribute to all of the ongoing projects at CERN, and even expanded its capacity by transferring some work to other European institutes – strengthening our capabilities to do world-class research. The technological aspect of scintillator production has been restored and ISMA is receiving new requests to design and manufacture scintillators for international projects. We are grateful to our partners for their support and cooperation.

Andriy Boyaryntsev deputy director ISMA.

Taras Shevchenko National University of Kyiv

Our group in Kyiv cooperates with many European universities and groups. We collaborate on LHCb, on the proposed SHiP experiment at CERN, and on the International Large Detector – a general-purpose detector for an electron–positron collider, primarily the ILC. The group has many scientific contacts with IJCLab at Paris-Saclay and cooperates with ETH Zurich on the study of perovskite materials. Before the war and COVID periods, our students had many internships in various European institutes and staff travelled regularly to Europe.

University of Kyiv

In the first weeks of the war, there was a serious disruption to life and to hopes for the future. Many of the women and girls were evacuated from Kyiv to the west of Ukraine and abroad. With the help of our graduates and foreign colleagues, I sent 17 female students to various European cities for long-term internships. Many other teachers also helped some to travel to Europe.

At that time, we were really expecting a nuclear strike from a maddened neighbour. Thanks to our colleagues abroad, the registration of internships took place instantly, in just a few days. Meanwhile, the men in Kyiv were preparing for battles on the streets. I actively read how to use various types of weapons, even though I was not accepted due to my age. I was sure that I would find weapons on the streets during the fighting, and I collected equipment and materials for actions after a nuclear explosion (nuclear physics is our department specialty). Now it already looks childish, but at the beginning of March 2022 I said goodbye to my wife, who was evacuated to Europe to join her daughter, because we thought that we would never meet again.

I was not afraid: there were almost only men left in the city, and those who remained were ready to stand to the death. The general feeling of a joint struggle united us and supported our spirit. It was clear in those weeks that this was not the time for science. I did some volunteering, first buying body armour and other military equipment, then collecting money for the purchase of jeeps for the front line and prostheses for crippled soldiers. We (with the alumni of our department in Ukraine and abroad) collected for the army very quickly, raising the necessary several thousand euros in a few days. 

Since autumn 2022, we have resumed our scientific work and the connections with students

After the defeat of the Russian forces near Kyiv and Kharkiv, and especially after the return of Kherson, it became a little easier and we began to implement grants for students. This partially compensated for the decrease in real salaries and scholarships, and the high inflation of the hryvnia. Since autumn 2022, we have resumed our scientific work and the connections with students. We also have a lot of volunteer work as physicists and engineers. Many of the women and children have returned home – as has my wife. The main problem now is a more than two-fold drop in wages, taking into account inflation.

On 31 December 2022 a large Russian missile exploded between the buildings of the university. The explosion occurred at a height of several metres (the rocket had impacted a large tree), completely or partially destroying more than 500 large windows in seven buildings, including two thirds of the windows in our building.

Our small group now has acceptable working conditions. Currently, quite a lot of European and some US grants are provided to our students for remote work. However, the necessary restriction during the war period on trips abroad for boys and men of conscription age has greatly hindered both scientific work and effective teaching. There has been a rapid washout of qualified personnel from scientific groups, especially young people who have been driven to look elsewhere for acceptable wages in Ukraine or abroad. After the end of the war, it will be difficult (or even impossible in some areas) to restore an effective group composition. Obtaining scientific grants during the war can do much to slow this degradation of science in our country.

Ukrainian science has been seriously affected due to the constant bombing of buildings and scientific facilities, the large outflow of personnel (especially women) to institutes and universities abroad, the decrease in real salaries, and the blocking of international internships and scientific travel for male scientists. I am sure that, step by step, we will restore lost contacts with foreign scientific centres and rebuild the scientific and educational resources of Ukraine that have been destroyed by the Russian invasion.

Oleg Bezshyyko associate professor, Taras Shevchenko National University of Kyiv.

Odesa National University

When the war started on 24 February 2022, I was with my family in Odesa. At around 4 a.m. I saw from the window of my flat how Russia had bombed Odesa port. In that moment, it was very difficult to understand what was going on and how to act. Yet, within a week, when it became clear that this was a real war, I received invitations from people in the physics department of the Jagiellonian University in Krakow, Poland, to visit them in the capacity of a visiting professor. I drove with my family through Moldova, Romania, Hungary, Slovakia and finally arrived in Krakow, where the people from the department adopted us. The children went to school the next day. There is only a small difference in language between Polish and Ukrainian, so it was not too difficult for them to adapt. Two months later I received an invitation from a new institute near Dresden called the Center for Advanced Systems Understanding (CASUS), where I have been based ever since.

Odesa University

As a theoretical physicist, and a frequent visitor to the CERN theory department, it’s much easier for me to move than it is for those who are connected to an experiment. Many of the laboratories in Ukraine have been completely destroyed. How they manage is difficult for me to comprehend. Odesa was not occupied, so it was possible to remain there. But in winter there was no electricity or heating, and during the day there were often air alarms when residents had to go to shelters. We were also worried about our grandchildren. A few weeks after the war started, a rocket fell a couple of hundred metres from my apartment.

Prior to the invasion, I would travel to Russia for conferences, but I didn’t have any collaborations with Russian institutes. For me, working abroad is quite a normal situation – but before, it was my own choice. Now I will stay here because of the war. But I also miss Ukraine and Odesa. The question is what will happen when we win? The answer is not so simple.

Without investing money into science it is impossible to build a strong country

Without investing money into science it is impossible to build a strong country. But that requires a good level of education, and that’s not easy because many young people are abroad. Will they go back, and how? If we have a good scientific climate, I think many would like to return. But if there is no money for science, then no. The government situation is not easy. The eastern part of the country is completely destroyed. Up to now, only a small percentage of the nation’s budget goes to science. The level of education and science in Ukraine already went down in the 1990s compared to when it was part of the Soviet Union. Many good scientists went abroad and have not returned.

We are currently living in a state of stress and uncertainty. Our minds are completely occupied by the news. The situation is even worse for those who stay in Ukraine. Many young scientists would like to go to foreign institutions and many foreign universities and institutes have adopted Ukrainian people, especially female scientists. But for boys it is forbidden. The border is closed. Those who cross it illegally are probably only a small percentage. They stay mainly inside Ukraine, often to fight against Russia. I know many people who were killed, including a former astronomy student who was educated at Odesa University.

The second year of the war is ending. A difficult winter is ahead. Russia will again try to destroy infrastructure with missiles and drones, so that people have neither heat nor light, so that they lose the will to win. The situation is very difficult. I don’t know what will happen next year or the year after. I can’t imagine where my family and I will live. I really want to return to Odesa. But for this, Ukraine must win.

Oleksandr Zhuk Odesa National University; currently CASUS Germany.

Uzhhorod National University

Uzhhorod National University was established in 1945, and five years later the faculty of physics and mathematics began its work. Today, the university has cooperation with around 90 institutions worldwide. We have activities in solid-state physics, optics and laser physics, physics of electron–atom collisions and plasma, quantum theory of scattering, and astrophysics and astronomy. For the past five years our group (comprising 10 engineers, technicians, senior scientists and PhD students) has been cooperating with the ISOLDE facility at CERN. At the beginning this was a multidisciplinary project to investigate materials that have spontaneous magnetisation and polarisation. We have published several articles in this area and in particular have proposed layered van-der-Waals crystals – a promising field for applications that can be further investigated with ISOLDE.

Uzhhorod National University

Uzhhorod is located right at the western Ukrainian border with Slovakia and 20 km from the border with Hungary. While there were a few attacks from Russian forces, the situation here is relatively okay compared to other parts of Ukraine. We are able to work, although there were times when we had no electricity, so we couldn’t do any measurements or calculations. When the war started I immediately received calls from many colleagues outside Ukraine, who asked me to come to their labs. I did not expect this at all. Generally, many scientists left, especially from Kharkiv. Many of them came to Uzhhorod, others went abroad, for example to Poland, the US, the UK or France. For many who are from highly bombarded regions, this was certainly the correct decision; otherwise, they could have been killed. We keep in touch and continue our work.

After the first week of the invasion, we evaluated the situation and hoped that Kyiv would remain unoccupied. After about a month, we fully resumed work. It took some time to get back to a reality where you can concentrate. Looking back, it felt like a state of hypnosis, because the situation was so bad. Now, it’s better. I have published three papers in Physical Review since the beginning of the war. I hope we continue to receive support from European countries, the US, Canada, Australia and Japan.

Long before the invasion, I often participated in meetings and worked with Russian scientists. After the annexation of Crimea in 2014, I stopped. Many others continued to collaborate after 2014. We are academics after all, and we work in science. Maybe after the war, some peace regulation will make scientific and diplomatic co-operations possible again. To use an analogy from solid-state physics, the 2014 invasion of Crimea was a first-order transition whereas this one was a second-order transition that continues with a modulated phase.

It took some time to get back to a reality where you can concentrate

Since the invasion, we have prepared and submitted a proposal to the European Union Horizon programme. After successful evaluation at the beginning of October, together with scientists from Portugal, Spain, Denmark, Poland and from Kyiv, we have started the Piezo2D project to investigate piezoelectricity in 2D materials and their relevant device performance.

It is crucial to have Ukrainian universities participate in academic European programmes, not just formally on paper but to be actively involved. We don’t ask for any preference. We want to have the same possibilities as any other country to participate, and for our people to have the experience of being part of it.

As I am over 60 years old, I am allowed to leave the country. But younger male scientists can’t leave unless they have a permit for special services or duties. Some find special permission to study abroad as PhD students. Many others went to join the Ukrainian army. Some of us, especially physicists and chemists, are involved in special technology R&D programmes.

I’m sure that Ukraine will win the war. Then we will rebuild the economy, society and science. The latter will be especially important. Our government understands that science produces knowledge, and now is the time for it. For now, however, we must hope and work with the situation at hand. And here goes a big “thank you” from me and my colleagues to all those helping and supporting us at CERN and beyond.

Yulian Vysochanskii head of the semiconductor physics department, Uzhhorod National University.

Kharkiv Institute of Physics and Technology

Our institute, founded in 1928, has a long connection with high-energy physics and with CERN. Theorists Dmitrij Volkov and Vladimir Akulov played a crucial role in the development of supergravity and supersymmetry, for example, and for more than 20 years researchers at Kharkiv Institute of Physics and Technology (KIPT) have been actively working with the LHC experiments. In CMS, for which we contributed to the endcap hadron calorimeters, we host a Tier-2 computational cluster that is considered one of the best; in LHCb we have participated in the calorimeter system maintenance and support. In collaboration with colleagues at Bogolyubov Institute for Theoretical Physics in Kyiv, we participate in the inner tracking project for ALICE and are working on the ITS3 upgrade. We also have collaborations with CERN concerning new theoretical and experimental proposals, for instance on the interaction of half-bare particles with matter. Europe’s first electron accelerator with an energy of 2 GeV was created and launched at KIPT in 1965. Before February 2022, the institute continued to operate a number of electron accelerators of lower energies and several large installations, such as the stellarator and quasi-stationary plasma accelerator.

Kharkiv Institute of Physics and Technology

Prior to the Russian invasion, our institute had a staff of more than 2000 people. In former Soviet Union times it was three times larger, and subordinate to the ministry within which the atomic project was performed (our institute had the status of laboratory no. 1). It was not so well known at the time because we were a closed-regime facility. In 1993 our institute became the first national scientific centre of Ukraine, with the full name National Science Centre “Kharkiv Institute of Physics and Technology” (NSC-KIPT), and our scientists started to cooperate actively with CERN and other international centres. NSC-KIPT consists of institutes devoted to theoretical physics, high-energy and nuclear physics, solid-state physics, plasma physics, plasma electronics and new methods of acceleration, in addition to a number of quite large scientific complexes. A significant portion of the institute’s work centres around the Neutron Source facility (which is being created jointly with the US Department of Energy) and R&D into fuels for nuclear power plants. Based on this setup we are promoting the creation of an international centre for nuclear physics and medicine, a preliminary proposal for which has been supported by the US and the IAEA. COVID, followed by the full-scale invasion of the Russian army, has temporarily put this project on hold.

The institute has sharply increased cooperation with major international scientific centres

At the beginning of the invasion, an idea was spread quickly by Russian media that our institute was still working on the creation of nuclear weapons. It was a lie. Similar false claims were made by the Russian media about Chernobyl. On 6 March 2022 we got together with the head of the institute of safety operations for nuclear power plants in Kyiv and made a joint declaration rejecting these accusations. Since 1994, and especially lately (even during the war), the institute has been regularly inspected by the IAEA. Of course, no violations were discovered, nor any work on the creation of nuclear weapons.

Our institute is located around 30 km from the border of Russia. Since 24 February 2022, it has been repeatedly shelled and has suffered significant damage. More than 100 shells, rockets and bombs fell on its territory. At the very beginning, Russian troops started their movement to Kharkiv along the road near our institute; it was stopped by our soldiers. About one month later, Russia made a second attempt to take Kharkiv, which came within 500 m of our institute before being stopped. Outside the institute in a residential area called Piatykhatky, where many staff members live, multiple buildings were destroyed. For 40 days following the shelling of 31 March 2022, the entire area didn’t have water, electricity or phone networks. Thanks to the hard work of the staff who remained, we managed to restore everything, often while bombs were falling.

With the start of military activity, many specialists from the institute left Kharkiv and continued to work remotely. Some large installations intended for conducting physical experiments have remained operational. The institute has sharply increased cooperation with major international scientific centres such as CERN, DESY, Orsay, the Italian centres at Frascati and Ferrara, and others.

With great hope, enthusiasm and optimism we believe that it will be possible to defend the territorial integrity of Ukraine and look to reviving its economic and scientific potential.

Mykola Shulga director-general National Science Centre Kharkiv Institute of Physics and Technology.

The post Portraits of particle physics in Ukraine appeared first on CERN Courier.

]]>
Feature Particle physicists in Ukraine describe how they and their institutes are recovering from the damage so far, the importance of continued global support, and how science in Ukraine can be rebuilt when the war is over. https://cerncourier.com/wp-content/uploads/2024/01/CCJanFeb24_UKRAINE_map.jpg
Interactive exhibits: theory and practice https://cerncourier.com/a/interactive-exhibits-theory-and-practice/ Fri, 03 Nov 2023 12:53:06 +0000 https://preview-courier.web.cern.ch/?p=109524 Members of the CERN exhibition team offer top tips for designing the ideal interactive exhibit.

The post Interactive exhibits: theory and practice appeared first on CERN Courier.

]]>
Alongside a hands-on education laboratory and large auditorium, Science Gateway houses three permanent exhibitions: Discover CERN, Our Universe and Quantum World. As they come through the doors, visitors discover a rich mixture of exhibition elements: authentic objects, contemporary artworks, audiovisual content, immersive spaces – and, of course, an abundance of interactive exhibits. The latter go through many carefully considered steps to present a spot-on experience to visitors, and must meet a number of criteria (see “Criteria that a good interactive exhibit should meet” panel).

Irrespective of the topic, there is a basic recipe for making an interactive exhibit. Once the clear message the exhibit aims to convey has been identified, developers write a draft that sets up a scenario of visitor interaction and sketches how the exhibit may look. What will visitors see when they approach the exhibit? What can they do? Are there several ways in which it is possible to interact with the exhibit?

Model making

The next step is to make a prototype. Depending on the nature of the exhibit, it may be a “quick and dirty” mockup, a simple 3D model, a paper prototype, or even just a verbal description. Then, the prototype is tested with at least several members of the target audience. What do they conclude from their interaction with the exhibit and why? How enjoyable and interesting do they find it? How well does the exhibit convey its key message? Afterwards comes the design and building of the exhibit. This stage often involves a lot of technical testing – for example, when choosing the materials or trying to keep within the available budget. In addition, texts that accompany the exhibit need to be written and translated. When the (nearly) final version of the exhibit is ready, it is evaluated again with the target audience. How clear are the instructions and the gameplay? What needs to be changed in the exhibit and how exactly? The final step, if necessary, is to reiterate.

Visitors discover the phenomenon by going through the stages of the scientific method

In reality, however, the development process rarely turns out to be simply moving from one step to the next. Sometimes, the results of testing a prototype with the public show that the scenario needs to be rewritten completely, bringing developers back to where they started. In other cases, time or budgetary constraints force the team to merge some steps or even skip them entirely. Two Science Gateway exhibits illustrate the twists and turns of developing a state-of-the-art science exhibit.

Criteria that a good interactive exhibit should (or at least should try to) meet

Focus  The exhibit should aim to convey only one, clearly defined message. For example: “In the LHC, particles are accelerated with the help of an electric field.”

Edutainment  Visitors can/should have fun when interacting with the exhibit and at the same time learn something new.

Responsiveness  Visitors should immediately receive a reaction from the exhibit when they do something. Simple encouragement like “Good job!” or “Keep going!” is helpful.

Multi-sensory experience  Interaction with the exhibit should involve as many senses as possible.

Self-sufficiency  People should understand how to use the exhibit, as well as the key ideas of the exhibit, without the help of a guide.

Zero position  After visitors leave the exhibit, it should self-restore to the state where it is ready to be used by the next person.

Safety  The exhibit should be safe for everyone to use, including children.

Maintenance  The technical team needs to have easy access to the exhibit and be able to use standard components to fix it if necessary.

Physical accessibility  The exhibit must be usable by – and feel welcoming for – diverse groups of visitors, for example, wheelchair users.

Content accessibility  Target audiences need to understand the language, key messages, ideas.

Social interactions  Co-operation between visitors should be encouraged, for example through gameplay.

The starting point for the antimatter-trap exhibit (see “Trial and error” image) was a real antimatter trap – an eye-catching piece of scientific equipment that is well suited to an interactive exhibit. Following several brainstorms with antimatter scientists, a PhD student specialising in the design of interactive science-communication experiences developed the exhibit scenario further and made a paper prototype. In this version of the exhibit, visitors first had to slow an antiproton down in a decelerator, and only after that shoot the antiproton into the trap. The trap had several parameters (for example, a magnetic field that could be switched on and off) that visitors could play with before injecting the antiproton into the decelerator. Depending on how the parameters had been set, the antiproton would fly through the trap, annihilate or become captured.

We then tested this prototype with six small groups of visitors at the CERN Microcosm exhibition and six groups of CERN members of personnel who did not have a background in science or technology. Results from the testing led to many changes. For example, we learnt that many people were confused about deceleration. We therefore removed this stage from the gameplay but kept the speed of the antiproton coming into the trap as one of the parameters. A bigger problem was the fact that visitors, most of whom had heard nothing about antimatter prior to their interaction with the exhibit, still did not understand anything about it after successfully trapping the antiproton. We were faced with a dilemma: should the main message of the exhibit be about the antimatter itself, or should the exhibit still focus on how the antimatter trap works? After difficult consideration, we decided to stick with the latter, whilst ensuring the former is included elsewhere in the exhibition.

An updated version of the exhibit scenario was handed over to the multimedia company that had been contracted to develop the exhibit, and further tests were conducted with two classes of Italian middle-school students. Apart from some minor usability issues, the exhibit proved to be challenging yet engaging: students yelled proudly and happily high-fived their team members after managing to trap the antiproton. The exhibit was then improved further, eventually taking its current shape in Science Gateway’s “Back to the Big Bang” exhibition. This was also the moment to come back to the antimatter scientists who helped ensure that all the texts and drawings in the exhibit were correct.

Keep the heat out

The goal here was to come up with a hands-on exhibit that would allow Science Gateway visitors to discover what keeps the LHC superconducting magnets cold. Similar to the experience with the antimatter exhibit, we again worked side-by-side with a CERN scientist. The original plan was to recreate a real situation: place a cold source at the centre and cover it with layers of different insulation materials. Visitors would be able to open and close these layers, as well as create a vacuum. They would observe that when the cold source was shielded from the environment, less power was needed to keep it cool. This idea looked very promising on paper, especially given that the exhibit would be integrated into a full-scale mockup of the LHC.

Heated plate

However, calculations showed that the effect would only become visible after approximately half an hour. While this may not seem like a deal-breaker, in this context it certainly is: visitors at science exhibitions expect immediate feedback and typically do not spend more than a few minutes at each exhibit. Moreover, having a surface that is sufficiently colder than the environment leads to condensation, which would create difficulties for the technical maintenance of the exhibit.
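
For a feel of where the half-hour figure comes from, a lumped-capacitance estimate of the response time, τ = mc/(hA), is the kind of calculation involved. The short Python sketch below uses entirely hypothetical numbers for the thermal mass, specific heat, heat-transfer coefficient and exposed area – they are not the actual exhibit design values – and shows how such a set-up can easily have a response time of around half an hour.

# A back-of-envelope lumped-capacitance estimate (all numbers hypothetical).
mass = 2.0     # kg, thermal mass of the cooled element
c_p = 900.0    # J/(kg*K), specific heat (roughly that of aluminium)
h = 10.0       # W/(m^2*K), effective heat-transfer coefficient to the room
area = 0.1     # m^2, exposed surface area

tau = mass * c_p / (h * area)   # thermal time constant in seconds
print(f"thermal time constant ~ {tau / 60:.0f} minutes")   # ~30 minutes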

Alternative ideas were needed. To avoid reinventing the wheel, we explored which exhibits on the topic already existed in other science centres and museums. Eventually we decided to focus on a very basic idea: certain materials block heat, while others conduct it. Communicating this message required a plate heated to 45 °C (not so hot that visitors risk burning themselves, but still warm enough to see the effect), a selection of conducting and insulating materials in different shapes and thicknesses, and an infrared camera. Extensive technical prototyping and testing allowed us to determine how thick the materials should be to ensure that the effect was both visible and revealed itself quickly enough for the exhibit to be interesting. As reflective metallic surfaces produce incorrect readings on the infrared camera, all metal pieces were painted black. Finally, to keep the link between the exhibit and the actual LHC insulation, key elements such as mylar, vacuum and minimal surface contact were incorporated.

In its final form, the exhibit enables open-ended exploration, such that visitors discover the phenomenon by going through the stages of the scientific method. First, visitors face a challenge: in the instructions accompanying the exhibit, they are invited to try shielding the heated plate from the infrared camera. Then, by picking a certain type of material, visitors form a hypothesis – “maybe this piece made of copper will do the job?” – and test it by placing the material on the plate. As the copper piece quickly reaches the same temperature as the heated plate, visitors observe it with the help of the infrared camera and conclude that copper is not a good choice. This leads to a new hypothesis, another material… and so on. Visitors are free to explore the exhibit as long as they want, and we hope that for many Science Gateway visitors this open exploration will culminate in the magical “aha!” moment – the reason why we developed all these interactive exhibits in the first place.

The post Interactive exhibits: theory and practice appeared first on CERN Courier.

]]>
Feature Members of the CERN exhibition team offer top tips for designing the ideal interactive exhibit. https://cerncourier.com/wp-content/uploads/2023/10/CCNovDec23_EXHIBITION_antimatter2.jpg
We need to talk about CERN’s future https://cerncourier.com/a/we-need-to-talk-about-cerns-future/ Fri, 03 Nov 2023 12:29:52 +0000 https://preview-courier.web.cern.ch/?p=109636 Fighting for the most adequate words and pictures that give meaning to what we are doing is crucial to keep the community focused and motivated for the long march ahead, says Urs Wiedemann.

The post We need to talk about CERN’s future appeared first on CERN Courier.

]]>
In big science, long-term planning for future colliders is a careful process of consensus building. Particle physics has successfully institutionalised this discourse in the many working groups and R&D projects that contribute, for example, to the European strategy updates and the US Snowmass exercise. But long timescales and political dimensions can render these processes impersonal and uninspiring. Ultimately, a powerful vision that captures the imagination of current and future generations must go beyond consensus building; it should provide a crisp, common intellectual denominator of how we talk about what we are doing and why we are doing it.

A lack of uniqueness

For several decades, the hunt for the Higgs boson has been central to such a captivating narrative. Today, 11 years after its discovery, all other fundamental questions remain open, and questions about the precise nature of the Higgs mechanism have become newly accessible to experimentation. What the field is facing today is not a lack of long-term challenges and opportunities, but the lack of a single scientific hypothesis behind which a broad and intrinsically heterogeneous international research community could most easily be assembled.

We need to learn how to communicate this reality more effectively. Particle physics, even if no longer driven by the hypothesis of a particular particle within guaranteed experimental reach, continues to have a well-defined aim in understanding the fundamental composition of the universe. From discussions, however, I sense that many of my colleagues find it harder to develop long-term motivation in this more versatile situation. As a theorist I know that nature does not care about the words I attach to its equations. And yet, our research community is not immune to the motivational power of snappy formulations.

Urs Wiedemann

The exploration of the Higgs sector provides a two-decade-long perspective for future experimentation at the LHC and its high-luminosity upgrade (HL-LHC). However, any thorough exploration of the Brout–Englert–Higgs mechanism exceeds the capabilities of the HL-LHC and motivates a new machine. Why is it then challenging to communicate to the wider public that collecting 3 ab⁻¹ of data by the end of the HL-LHC is more than filling in details on a discovery made in 2012? How can our narrative better reflect the evolving emphasis of our research? Should we talk, for example, about the Higgs’ self-interaction as a “fifth force”? Or would this be misleading cheerleader language, given that the Higgs self-coupling, unlike the other forces in the Standard Model Lagrangian, is not gauged? Whatever the best pitch is, it deserves to be sharpened within our community and more homogeneously disseminated.

Another compelling narrative for a future collider is the growing synergy with other fields. In recent decades, space-based astrophysical observatories have started to reach a complexity and cost comparable to the LHC. In addition, there is a multitude of smaller astrophysical observatories. We should welcome the important complementarities between lab-based experimental and space-based observational approaches. In the case of dark matter, for example, there are strong generic reasons to expect that collider experiments can constrain (and ultimately establish) the microscopic nature of dark matter, and that the solution lies in experimentally uncharted territory, such as either very massive or very feebly interacting particles.

What makes the physics of the infinitesimally small exciting for the public is also what makes it difficult to communicate

What makes the physics of the infinitesimally small exciting for the public is also what makes it difficult to communicate, starting with subtle differences in the use of everyday language. For a lay audience, for instance, a “search for something” is easy to picture, and not finding the something is a failure. In physics, however, particles can reveal themselves in quantum fluctuations even if the energy needed to produce them can’t be reached. Far from being a failure, not-finding with increased precision becomes an intrinsic mark of progress. When talking to non-scientists, should we try to bring to the forefront such unique and subtle features of our search logic? Could this be a safeguard against the foes of our science who misrepresent the perspectives and consequences of our research by naively equating any unconfirmed hypothesis with failure? Or is this simply too subtle and intellectual to be heard?

Clearly, in our everyday work at CERN, getting the numbers out is the focus. But going beyond this operational attitude and fighting for the most adequate words and pictures that give meaning to what we are doing is crucial to keep the community focused and motivated for the long march ahead.

• Adapted from text originally published in the CERN Staff Association newsletter.

The post We need to talk about CERN’s future appeared first on CERN Courier.

]]>
Opinion Fighting for the most adequate words and pictures that give meaning to what we are doing is crucial to keep the community focused and motivated for the long march ahead, says Urs Wiedemann. https://cerncourier.com/wp-content/uploads/2023/11/CCNovDec23_VIEW-telepathy.jpg
PHYSTAT systematics at BIRS https://cerncourier.com/a/phystat-systematics-at-birs/ Fri, 01 Sep 2023 12:58:05 +0000 https://preview-courier.web.cern.ch/?p=109213 Systematic errors are becoming increasingly important as larger datasets reduce statistical errors in many analysis channels.

The post PHYSTAT systematics at BIRS appeared first on CERN Courier.

]]>
Ann Lee

The PHYSTAT series of seminars and workshops provides a unique meeting ground for physicists and statisticians. The latest in-person meeting, after being postponed due to COVID, covered the field of systematic errors (often encoded as nuisance parameters), which are becoming increasingly important in particle physics as larger datasets reduce statistical errors in many analysis channels. Taking place from 23 to 28 April at the Banff International Research Station (BIRS) in the Canadian Rockies, the workshop attracted 42 delegates working not only on the LHC experiments but also on neutrino physics, cosmic-ray detectors and astrophysics.

The organisers had assigned half of the time to discussions, and that time was used. Information flowed in both directions: physicists learned about the Wasserstein distance and statisticians learned about jet energy scales. The dialogue was constructive and positive – we have moved on from the “Frequentist versus Bayesian” days and now everyone is happy to use both – and the discussions continued during coffee, dinner and hikes up the nearby snow-covered mountains. 

Our understanding of traditional problems continues to grow. The “signal plus background” problem always has new features to surprise us, unfolding continues to present challenges, and it seems we always have more to learn about simple concepts like errors and significance. There were also ideas that were new to many of us. Optimal transport and the Monge problem provide a range of tools whose use is only beginning to be appreciated, while neural networks and other machine-learning techniques can be used to help find anomalies and understand uncertainties. The similarities and differences between marginalisation and profiling require exploration, and we probably need to go beyond the asymptotic formulae more often than we do in practice.
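
As a small illustration of the optimal-transport ideas mentioned here, the following Python sketch computes the one-dimensional Wasserstein (earth-mover’s) distance between two toy samples with SciPy; the samples, and the shift standing in for a systematic variation, are invented for illustration and are not taken from any workshop contribution.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
# Toy 1D samples: a nominal distribution and a slightly shifted variant,
# mimicking a systematic variation (purely illustrative numbers).
nominal = rng.normal(loc=100.0, scale=10.0, size=5000)
shifted = rng.normal(loc=102.0, scale=10.0, size=5000)

# 1D Wasserstein (earth-mover's) distance between the two empirical distributions.
print(f"Wasserstein distance: {wasserstein_distance(nominal, shifted):.2f}")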

Another “Banff challenge”, the third in a sequence, was set by Tom Junk of Fermilab. The first two had a big impact on the community and statistical practice. This time Tom provided simulated data for which contestants had to find the signal and background sizes, using samples with several systematic uncertainties – these uncertainties were unspecified, but dark hints were dropped. It’s an open competition and anyone can try for the glory of winning the challenge.

Collaborations were visibly forming during the latest PHYSTAT event, and results will be appearing in the next few months, not only in papers but in practical procedures and software that will be adopted and used in the front line of experimental research.

This and other PHYSTAT activities continue, with frequent seminars and several workshops (zoom, in-person and hybrid) in the planning stage.

The post PHYSTAT systematics at BIRS appeared first on CERN Courier.

]]>
Meeting report Systematic errors are becoming increasingly important as larger datasets reduce statistical errors in many analysis channels. https://cerncourier.com/wp-content/uploads/2023/08/CCSepOct23_FN_barlow_feature.jpg
Five sigma revisited https://cerncourier.com/a/five-sigma-revisited/ Mon, 03 Jul 2023 13:33:03 +0000 https://preview-courier.web.cern.ch/?p=108685 Louis Lyons traces the origins of the “five sigma” criterion in particle physics, and asks whether it remains a relevant marker for claiming the discovery of new physics.

The post Five sigma revisited appeared first on CERN Courier.

]]>
The standard criterion for claiming a discovery in particle physics is that the observed effect should have the equivalent of a five standard-deviation (5σ) discrepancy with already known physics, i.e. the Standard Model (SM). This means that the chance of observing such an effect or larger should be at most 3 × 10⁻⁷, assuming it is merely a statistical fluctuation, which corresponds to the probability of correctly guessing whether a coin will land heads or tails for each of 22 tosses. Statisticians claim that it is crazy to believe probability distributions so far into their tails, especially when systematic uncertainties are involved; particle physicists still hope that they provide some measure of the level of (dis)agreement between data and theory. But what is the origin of this convention, and does it remain a relevant marker for claiming the discovery of new physics?
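
The two numbers quoted above are easy to check; a minimal Python sketch using SciPy’s normal distribution is given below, purely as a numerical illustration of the quoted correspondence.

from scipy.stats import norm

# One-sided tail probability beyond 5 standard deviations of a Gaussian.
p_5sigma = norm.sf(5.0)    # about 2.9e-7
# Probability of calling 22 fair coin tosses correctly in a row.
p_coins = 0.5 ** 22        # about 2.4e-7
print(f"5 sigma one-sided p-value: {p_5sigma:.1e}")
print(f"22 correct coin calls:     {p_coins:.1e}")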

There are several reasons why the stringent 5σ rule is used in particle physics. The first is that it provides some degree of protection against falsely claiming the observation of a discrepancy with the SM. There have been numerous 3σ and 4σ effects in the past that have gone away when more data was collected. A relatively recent example was an excess of diphoton events at an invariant mass of 750 GeV, seen in both the ATLAS and CMS data of 2015 but absent in the larger data samples of 2016. 

Systematic errors provide another reason, since such effects are more difficult to assess than statistical uncertainties and may be underestimated. Thus in a systematics-dominated scenario, if our estimate is a factor of two too small, a more mundane 3σ fluctuation could incorrectly be inflated to an apparently exciting 6σ effect. A potentially more serious problem is a source of systematics that has not even been considered by the analysts, the so-called “unknown unknowns”. 

Know your p-values 

Another reason underlying the 5σ criterion is the look-elsewhere effect, which involves the “p-values” for the observed effect. These are defined as the probability of a statistical fluctuation causing a result to be as extreme as the one observed, or more so, assuming some null hypothesis. For example, if we bet on heads for each of 10 tosses of an unbiased coin and observe eight tails, it is the probability of being wrong eight, nine or 10 times (5.5%). A small p-value indicates a tension between the theory and the observation. 
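
The 5.5% figure is just a binomial tail probability; a minimal check in Python, assuming a fair coin and ten independent tosses, is:

from scipy.stats import binom

# Probability of being wrong on eight or more of 10 tosses,
# if each call is right with probability 1/2.
p_value = binom.sf(7, n=10, p=0.5)   # P(X >= 8)
print(f"p-value = {p_value:.3f}")    # about 0.055, i.e. 5.5%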

Higgs signals

Particle-physics analyses often look for peaks in mass spectra, which could be the sign of a new particle. An example is shown in the “Higgs signals” figure, which contains data from CMS used to discover the Higgs boson (ATLAS has similar data). Whereas the local p-value of an observed effect is the chance of a statistical fluctuation being at least as large as the observed one at its specific location, more relevant is a global p-value corresponding to a fluctuation anywhere in the analysis, which has a higher probability and hence reduces the significance. The local p-values corresponding to the data in “Higgs signals” are shown in the figure “p-values”. 

A non-physics example highlighting the difference between local and global p-values was provided by an archaeologist who noticed that a direction defined by two of the large stones at the Stonehenge monument pointed at a specific ancient monument in France. He calculated that the probability of this was very small, assuming that the placement of the stones was random (local p-value), and hence that this favoured the hypothesis that Stonehenge was designed to point in that way. However, the chance that one of the directions, defined by any pair of stones, was pointing at an ancient monument anywhere in the world (global p-value) is above 50%. 

Current practice for model-dependent searches in particle physics, however, is to apply the 5σ criterion to the local p-value, as was done in the search for the Higgs boson. One reason for this is that there is no unique definition of “elsewhere”; if you are a graduate student, it may be just your own analysis, while for CERN’s Director-General, “anywhere in any analysis carried out with data from CERN” may be more appropriate. Another is that model-independent searches involving machine-learning techniques are capable of being sensitive to a wide variety of possible new effects, and it is hard to estimate what their look-elsewhere factor should be. Clearly, in quoting global p-values it is essential to specify your interpretation of elsewhere. 
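
To get a feel for the size of a look-elsewhere correction, the sketch below applies the simplest possible recipe – treating the search as a hypothetical number of independent places where a fluctuation could have appeared – and shows how a local 5σ excess can shrink to roughly 4σ globally. Real analyses use more refined methods (for example dedicated toy experiments), so this is only an order-of-magnitude illustration.

from scipy.stats import norm

def global_p_value(p_local: float, n_trials: int) -> float:
    # Simplest look-elsewhere correction: n_trials independent search regions.
    return 1.0 - (1.0 - p_local) ** n_trials

p_local = norm.sf(5.0)                    # local one-sided p-value of a 5 sigma excess
p_global = global_p_value(p_local, 100)   # 100 independent regions (hypothetical)
print(f"local significance:  {norm.isf(p_local):.2f} sigma")
print(f"global significance: {norm.isf(p_global):.2f} sigma")   # about 4 sigma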

Local p-values

A fourth factor behind the 5σ rule is plausibility. The likelihood of an observation is the probability of the data, given the model. To convert this to the more interesting probability of the model, given the data, requires the Bayesian prior probability of the model. This is an example of the probability of an event A, assuming that B is true, not in general being the same as the probability of B, given A. Thus the probability of a murderer eating toast for breakfast may be 60%, but the probability of someone who eats toast for breakfast being a murderer is thankfully much smaller (about one in a million). In general, our belief in the plausibility of a model for a particular version of new physics is much smaller than for the SM, thus being an example of the old adage that “extraordinary claims require extraordinary evidence”. Since these factors vary from one analysis to another, one can argue that it is unreasonable to use the same discovery criterion everywhere. 
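
The breakfast example is simply Bayes’ theorem at work; the sketch below uses invented population numbers, chosen only to reproduce the orders of magnitude quoted in the text.

# Bayes' theorem: P(murderer | toast) = P(toast | murderer) * P(murderer) / P(toast)
p_murderer = 1e-6              # prior fraction of murderers in the population (hypothetical)
p_toast_given_murderer = 0.6   # from the text
p_toast = 0.5                  # fraction of everyone eating toast for breakfast (hypothetical)

p_murderer_given_toast = p_toast_given_murderer * p_murderer / p_toast
print(f"P(murderer | toast) ~ {p_murderer_given_toast:.1e}")   # about one in a million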

There are other relevant aspects of the discovery procedure. Searches for new physics can be just tests for consistency with the SM; or they can see which of two competing hypotheses (“just SM” or “SM plus new physics”) provides a better fit to the data. The former are known as goodness-of-fit tests and may involve χ2, Kolmogorov–Smirnov or similar tests; the latter are hypothesis tests, often using the likelihood ratio. They are sometimes referred to as model-independent and model-dependent, respectively, each having its own advantages and limitations. However, the degree of model dependence is a continuous spectrum rather than a binary choice.
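
As a toy illustration of the goodness-of-fit side of this distinction, the sketch below runs a Kolmogorov–Smirnov test of simulated data against an assumed “SM-only” model; the standard-normal model and the sample are invented purely for illustration.

import numpy as np
from scipy.stats import kstest, norm

rng = np.random.default_rng(0)
# Toy "data", generated here from the SM-only model itself (a standard normal).
data = rng.normal(size=200)

# Goodness-of-fit: Kolmogorov-Smirnov test of the data against the SM-only hypothesis.
statistic, p_value = kstest(data, norm.cdf)
print(f"KS statistic = {statistic:.3f}, p-value = {p_value:.3f}")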

It is unreasonable to regard 5.1σ as a discovery, but 4.9σ as not. Also, should we regard the result with better observed accuracy or better expected accuracy as the preferred one? Blind analyses are recommended, as they remove the possibility of the analyser adjusting selections to influence the significance of the observed effect. Some non-blind searches have such a large and indeterminate look-elsewhere effect that they can only be regarded as hints of new physics, to be confirmed by future independent data. Theory calculations also have uncertainties, due for example to parameters in the model or difficulties with numerical predictions. 

Discoveries in progress 

A useful exercise is to review a few examples that might be (or might have been) discoveries. A recent example involves the ATLAS and CMS observation of events involving four-top quarks. Apart from the similarity of the heroic work of the physicists involved, these analyses have interesting contrasts with the Higgs-boson discovery. First, the Higgs discovery involved clear mass peaks, while the four-top events simply caused an enhancement of events in the relevant region of phase space (see “Four tops” figure). Then, the four-top production is just a verification of an SM prediction and indeed it would have been more of a surprise if the measured rate had been zero. So this is just an observation of an expected process, rather than a new discovery. Indeed, both preprints use the word “observation” rather than “discovery”. Finally, although 5σ was the required criterion for discovering the Higgs boson, surely a lower level of significance would have been sufficient for the observation of four-top events. 

The output from a graph neural network

Going back further in time, an experiment in 1979 claimed to observe free quarks by measuring the electrical charge of small spheres levitated in an oscillating electric field; several gave residual charges of ±1/3 in units of the electron charge, which was regarded as a signature of single quarks. Luis Alvarez noted that the raw results required sizeable corrections and suggested that a blind analysis should be performed on future data. The net result was that no further papers were published on this work. This demonstrates the value of blind analyses.

A second historical example is precision measurements at the Large Electron Positron collider (LEP). Compared with the predictions of the SM, including the then-known particles, deviations were observed in the many measurements made by the four LEP experiments. A much better fit to the data was achieved by including corrections from the (at that time hypothesised) top quark and Higgs boson, which enabled approximate mass ranges to be derived for them. However, it is now accepted that the discoveries of the top quark and the Higgs boson were subsequently made by their direct observations at the Tevatron and at the LHC, rather than by their virtual effects at LEP.

The muon magnetic moment is a more contemporary case. This quantity has been measured and also predicted to incredible precision, but a discrepancy between the two values exists at around the 4σ level, which could be an indication of contributions from virtual new particles. The experiment essentially measures just this one quantity, so there is no look-elsewhere effect. However, even if this discrepancy persists in new data, it will be difficult to tell if it is due to the theory or experiment being wrong, or whether it requires the existence of new, virtual particles. Also, the nature of such virtual particles could remain obscure. Furthermore, a recent calculation using lattice gauge theory of the “hadronic vacuum polarisation” contribution to the predicted value of the magnetic moment brings it closer to the observed value (see “Measurement of the moment” figure). Clearly it will be worth watching how this develops. 

Our hope for the future is that the current 5σ criterion will be replaced by a more nuanced approach for what qualifies as a discovery

The so-called flavour anomalies are another topical example. The LHCb experiment has observed several anomalous results in the decays of B mesons, especially those involving transitions of a b quark to an s quark and a lepton pair. It is not yet clear whether these could be evidence for some real discrepancies with the SM prediction (i.e. evidence for new physics), or simply and more mundanely an underestimate of the systematics. The magnitude of the look-elsewhere effect is hard to estimate, so independent confirmation of the observed effects would be helpful. Indeed, the most recent result from LHCb for the R(K) parameter, published in December 2022, is much more consistent with the SM. It appears that the original result was affected by an overlooked background source. Repeated measurements by other experiments are eagerly awaited. 

A surprise last year was the new measurement of the W-boson mass (mW) by the CDF collaboration at Fermilab’s former Tevatron collider, which finished collecting data many years ago; the result disagreed with the SM prediction by 7σ. It is of course more reasonable to use the weighted average of all mW measurements, which reduces the discrepancy, but only slightly. A subsequent measurement by ATLAS disagreed with the CDF result; the CMS determination of mW is awaited with interest. 

Nuanced approach

It is worth noting that the muon g-2, flavour and mW discrepancies concern tests of the SM predictions, rather than direct observation of a new particle or its interactions. Independent confirmations of the observations and the theoretical calculations would be desirable.

Measurement of the moment

One of the big hopes for further running of the LHC is that it will result in the “discovery” of Higgs pair production. But surely there is no reason to require a 5σ discrepancy with the SM in order to make such a claim? After all, the Higgs boson is known to exist, its mass is known and there is no big surprise in observing its pair-production rate being consistent with the SM prediction. “Confirmation” would be a better word than “discovery” for this process. In fact, it would be a real discovery if the di-Higgs production rate was found to be significantly above or below the SM prediction. A similar argument could be applied to the searches for single top-quark production at hadron colliders, and decays such as H → μμ or Bs → μμ. This should not be taken to imply that LHC running can be stopped once a suitable lower level of significance is reached. Clearly there will be interest in using more data to study di-Higgs production in greater detail. 

Our hope for the future is that the current 5σ criterion will be replaced by a more nuanced approach for what qualifies as a discovery. This would include just quoting the observed and expected p-values; whether the analysis is dominated by systematic uncertainties or statistical ones; the look-elsewhere effect; whether the analysis is robust; the degree of surprise; etc. This may mean leaving it for future measurements to determine who deserves the credit for a discovery. It may need a group of respected physicists (e.g. the directors of large labs) to make decisions as to whether a given result merits being considered a discovery or needs further verification. Hopefully we will have several of these interesting decisions to make in the not-too-distant future. 

The post Five sigma revisited appeared first on CERN Courier.

]]>
Feature Louis Lyons traces the origins of the “five sigma” criterion in particle physics, and asks whether it remains a relevant marker for claiming the discovery of new physics. https://cerncourier.com/wp-content/uploads/2023/06/CCJulAug23_SIGMA_frontis.jpg
An insight into the European Spallation Source https://cerncourier.com/a/an-insight-into-the-european-spallation-source/ Wed, 21 Jun 2023 12:56:51 +0000 https://preview-courier.web.cern.ch/?p=108575 Available to watch now as Mats Lindroos, head of accelerator at ESS, explores the European Spallation Source.

The post An insight into the European Spallation Source appeared first on CERN Courier.

]]>
The European Spallation Source (ESS) is a European project with 13 member states and two host states. In this talk, Mats Lindroos will give examples of the science that will be done at ESS both in applied physics and fundamental physics. He will speak about the in-kind model, which made it possible to build this facility on a greenfield site in a country without any previous experience of much of the required technology.

Also reviewed will be the status of the project with beam on target planned for 2025 and the start of the full user programme in 2027.

Mats Lindroos has a PhD in subatomic physics from Chalmers University of Technology in Gothenburg, Sweden, and since 2014 has been an adjunct professor at Lund University. He worked at CERN from 1993 to 2009, starting as a research fellow at the ISOLDE facility and, from 1995, as a staff member in the CERN accelerator sector. Among other tasks, he has been responsible for PS Booster operation and technical coordination of the CERN ISOLDE facility. He has also been project leader of several CERN projects and has had leading roles in several EC-supported design studies for future nuclear-physics and neutrino facilities. Mats co-authored a book in 2009 on a future neutrino-beam concept, beta-beams. Since 2009 he has been head of the accelerator division and a sub-project leader at the European Spallation Source ERIC (ESS) in Lund.

The post An insight into the European Spallation Source appeared first on CERN Courier.

]]>
Webinar Available to watch now as Mats Lindroos, head of accelerator at ESS, explores the European Spallation Source. https://cerncourier.com/wp-content/uploads/2023/05/2023-07-07-webinar-image.jpg
A celebration of physics in the Balkans https://cerncourier.com/a/a-celebration-of-physics-in-the-balkans/ Fri, 03 Mar 2023 12:09:04 +0000 https://preview-courier.web.cern.ch/?p=107918 The BPU11 Congress contributed to a closer cooperation between the Balkan countries and CERN, ICTP, SISSA, the Central European Initiative and others.

The post A celebration of physics in the Balkans appeared first on CERN Courier.

]]>
The 11th General Conference of the Balkan Physical Union (BPU11 Congress) took place from 28 August to 1 September 2022 in Belgrade, with the Serbian Academy of Science and Arts as the main host. Initiated in 1991 in Thessaloniki, Greece, and open to participants globally, the series provides a platform for reviewing, disseminating and discussing novel research results in physics and related fields. 

The scientific scope of BPU11 covered the full landscape of physics via 139 lectures (12 plenary and 23 invited) and 150 poster presentations. A novel addition was five roundtables dedicated to high-energy physics (HEP), widening participation, careers in physics, quantum and new technologies, and models of studying physics in European universities with a focus on Balkan countries. The hybrid event attracted 476 participants (325 on site) from 31 countries, 159 of whom were students, and demonstrated the high level of research conducted in the Balkan states.

Roadmaps to the future

The first roundtable “HEP – roadmaps to the future” showed the strong collaboration between CERN and the Balkan states. Four out of 23 CERN Member States come from the region (Bulgaria, Greece, Serbia and Romania); two out of three Associate Member States in the pre-stage to membership are Cyprus and Slovenia; and two out of seven Associate Member States are Croatia and Turkey. A further four countries have cooperation agreements with CERN, and more than 400 CERN users come from the Balkans. 

Kicking off the HEP roundtable discussions, CERN director for research and computing Joachim Mnich presented the recently launched accelerator and detector R&D roadmaps in Europe. Paris Sphicas (CERN and the University of Athens) reported on the future of particle-physics research, during which he underlined the current challenges and opportunities. These included: dark matter (for example the search for WIMPs in the thermal parameter region, the need to check simplified models such as axial-vector and di-lepton resonances, and indirect searches); supersymmetry (the search for “holes” in the low-mass region that will exist even after the LHC); neutrinos (whether neutrinos are Majorana or Dirac particles, their mass measurement and exploration of a possible “sterile” sector); as well as a comprehensive review of the Higgs sector. 

CERN’s Emmanuel Tsesmelis, who was awarded the Balkan Physical Union charter and honorary membership in recognition of his contributions to cooperation between the Balkan states and CERN, reflected on the proposed Future Circular Collider (FCC). Describing the status of the FCC feasibility study, due to be completed by the end of 2025, he stressed that the success of the project relies on strong global participation. His presentation initiated a substantial discussion about the role of the Balkan countries, which will be continued in May 2023 at the 11th LHCP conference in Belgrade.

The roundtable devoted to quantum technologies (QTs), chaired by Enrique Sanchez of the European Physical Society (EPS), was another highlight with strong relevance to HEP. Various perspectives on the different QT sectors – computing and simulation, communication, metrology and sensing – were discussed, touching upon the impact they could have on society at large. Europe plays a leading role in quantum research, concluded the panel. However, despite increased interest in QTs, including at CERN, issues such as how to obtain appropriate funding to enhance European technological leadership, remain. Discussions highlighted the opportunities for new generations of physicists from the Balkans to help build this “second quantum revolution”. 

In addition to the roundtables, four high-level scientific satellite events took place, attracting a further 150 on-site participants: the COST Workshop on Theoretical Aspects of Quantum Gravity; the SEENET–MTP Assessment Meeting and Workshop; the COST School on Quantum Gravity Phenomenology in the Multi-Messenger Approach; and the CERN–SEENET–MTP–ICTP PhD School on Gravitation, Cosmology and Astroparticle Physics. The latter is part of a unique regional programme in HEP initiated by SEENET–MTP (Southeastern European Network in Mathematical and Theoretical Physics) and CERN in 2015, and joined by the ICTP in 2018, which has contributed to the training of more than 200 students in 12 SEENET countries. 

The BPU11 Congress, the largest event of its type in the region since the beginning of the COVID-19 pandemic, contributed to closer cooperation between the Balkan countries and CERN, ICTP, SISSA, the Central European Initiative and others. It was possible thanks to the support of the EPS, ICTP and CEI-Trieste, CERN, EPJ, as well as the Serbian ministry of science and institutions active in physics and mathematics in Serbia. In addition to the BPU11 PoS Proceedings, several articles based on invited lectures will be published in a focus issue of EPJ Plus “On Physics in the Balkans: Perspectives and Challenges”, as well as in a special issue of IJMPA.

The post A celebration of physics in the Balkans appeared first on CERN Courier.

]]>
Meeting report The BPU11 Congress contributed to a closer cooperation between the Balkan countries and CERN, ICTP, SISSA, the Central European Initiative and others. https://cerncourier.com/wp-content/uploads/2023/02/CCMarApr23_FN_BPU11.jpg
Physics is about principles, not particles https://cerncourier.com/a/physics-is-about-principles-not-particles/ Wed, 01 Mar 2023 13:35:42 +0000 https://preview-courier.web.cern.ch/?p=107923 As long as the aim is to answer nature’s outstanding mysteries, the path is worth following, says Veronica Sanz.

The post Physics is about principles, not particles appeared first on CERN Courier.

]]>
Last year marked the 10th anniversary of the discovery of the Higgs particle. Ten years is a short lapse of time when we consider the profound implications of this discovery. Breakthroughs in science mark a leap in understanding, and their ripples may extend for decades and even centuries. Take Kirchhoff’s blackbody proposal more than 150 years ago: a theoretical construction, an academic exercise that opened the path towards a quantum revolution, the implications of which we are still trying to understand today. 

Imagine now the vast network of paths opened by ideas, such as emission theory, that came to nothing despite their originality. Was pursuing these useful, or a waste of resources? Scientists would answer that the spirit of basic research is precisely to follow those paths with unknown destinations; it’s how humanity reached the level of knowledge that sustains modern life. As particle physicists, as long as the aim is to answer nature’s outstanding mysteries, the path is worth following. The Higgs-boson discovery is the latest triumph of this approach and, as for the quantum revolution, we are still working hard to make sense of it. 

Particle discoveries are milestones in the history of our field, but they signify something more profound: the realisation of a new principle in nature. Naively, it may seem that the Higgs discovery marked the end of our quest to understand the TeV scale. The opposite is true. The behaviour of the Higgs boson, in the form it was initially proposed, does not make sense at a quantum level. As a fundamental scalar, it experiences quantum effects that grow with energy, doggedly pushing its mass towards the Planck scale. The Higgs discovery solidified the idea that gauge symmetries could be hidden, spontaneously broken by the vacuum. But it did not provide an explanation of how this mechanism makes sense with a fundamental scalar sensitive to mysterious phenomena such as quantum gravity. 

Veronica Sanz

Now comes the hard part. From the plethora of ideas proposed during the past decades to make sense of the Higgs boson – supersymmetry being the most prominent – most physicists predicted that it would have an entourage of companion particles with electroweak or even strong couplings. Arguments of naturalness, that these companions should be close-by to prevent troublesome fine-tunings of nature, led to the expectation that discoveries would follow or even precede that of the Higgs. Ten years on, this wish has not been fulfilled. Instead, we are faced with a cold reality that can lead us to sway between attitudes of nihilism and hubris, especially when it comes to the question of whether particle physics has a future beyond the Higgs. Although these extremes do not apply to everyone, they are understandable reactions to viewing our field next to those with more immediate applications, or to the personal disappointment of a lifelong career devoted to ideas that were not chosen by nature. 

Such despondence is not useful. Remember that the no-lose theorem we enjoyed when planning the LHC, i.e. the certainty that we would find something new, Higgs boson or not, at the TeV scale, was an exception to the rules of basic research. Currently, there is no no-lose theorem for the LHC, or for any future collider. But this is precisely the inherent premise of any exploration worth doing. After the incredible success we have had, we need to refocus and unify our discourse. We face the uncertainty of searching in the dark, with the hope that we will initiate the path to a breakthrough, still aware of the small likelihood that this actually happens. 

The no-lose theorem we enjoyed when planning the LHC was an exception to the rules of basic research

Those hopes are shared by wider society, which understands the importance of exploring big questions. From searching for exoplanets that may support life to understanding the human mind, few people assume these paths will lead to immediate results. The challenge for our field is to work out a coherent message that can enthuse people. Without straying far from collider physics, we could notice that there is a different type of conversation going on in the search for dark matter. Here, there is no no-lose theorem either, and despite the current exclusion of most vanilla scenarios, there is excitement and cohesion, which are effectively communicated. As for our critics, they should be openly confronted and viewed as an opportunity to build stronger arguments.

We have powerful arguments to keep delving into the smallest scales, with the unknown nature of dark matter, neutrinos and the matter–antimatter asymmetry the most well-known examples. As a field, we need to renew the excitement that led us where we are, from the shock of watching alpha particles bounce back from a thin gold sheet, to building a colossus like the LHC. We should be outspoken about our ambition to know the true face of nature and the profound ideas we explore, and embrace the new path that the Higgs discovery has opened. 

The post Physics is about principles, not particles appeared first on CERN Courier.

]]>
Opinion As long as the aim is to answer nature’s outstanding mysteries, the path is worth following, says Veronica Sanz. https://cerncourier.com/wp-content/uploads/2023/02/CCMarApr23_VIEW_path.jpg
JENAS picks up the pace in Spain https://cerncourier.com/a/jenas-picks-up-the-pace-in-spain/ Tue, 08 Nov 2022 13:53:01 +0000 https://preview-courier.web.cern.ch/?p=107108 The second joint ECFA, NuPECC and APPEC symposium offered participants a comprehensive assessment of overlapping research topics.

The post JENAS picks up the pace in Spain appeared first on CERN Courier.

]]>
The second joint ECFA (European Committee for Future Accelerators), NuPECC (Nuclear Physics European Collaboration Committee) and APPEC (AstroParticle Physics European Consortium) symposium, JENAS, was held from 3 to 6 May in Madrid, Spain. Senior and junior members of the astroparticle, nuclear and particle-physics communities presented their challenges and discussed common issues with the goal of achieving a more comprehensive assessment of overlapping research topics. For many of the more than 160 participants, it was their first in-person attendance at a conference after more than two years due to the COVID-19 pandemic.

Focal point

The symposium began with the research highlights and strategies of the three research fields. A major part of this concerned the progress and plans of the six joint projects that have emerged since the first JENAS event in 2019: dark matter (iDMEu initiative); gravitational waves for fundamental physics; machine-learning optimised design of experiments; nuclear physics at the LHC; storage rings to search for charged-particle electric dipole moments; and synergies between the LHC and future electron–ion collider experiments. The discussions on the joint projects were complemented by a poster session where young scientists presented the details of many of these activities.

The goal was achieving a more comprehensive assessment of overlapping research topics

Detector R&D, software and computing, as well as the application of artificial intelligence, are important examples where large synergies between the three fields can be exploited. On detector R&D there is interest in collaborating on important research topics such as those identified in the 2021 ECFA detector R&D roadmap, in which colleagues from the astroparticle and nuclear-physics communities were involved. Likewise, the challenges of processing and handling large datasets, distributed computing, as well as developing modern analysis methods for complex data analyses involving machine learning, can be addressed together.

Overview talks and round-table discussions related to education, outreach, open science and knowledge transfer allowed participants to emphasise and exchange best practices. In addition, the first results of surveys on diversity and the recognition of individual achievements in large collaborations were presented and discussed. For the latter, a joint APPEC–ECFA–NuPECC working group has presented an aggregation of best practices already in place. A major finding is that many collaborations have already addressed this topic thoroughly. However, they are encouraged to further monitor progress and consider introducing more of the best practices that were identified.  

Synergy

One day was dedicated to presentations and closed-session discussions with representatives from both European funding agencies and the European Commission. The aim was to evaluate whether appropriate funding schemes and organisational structures can be established to better exploit the synergies between astroparticle, nuclear and particle physics, and thus enable a more efficient use of resources. The positive and constructive feedback will be taken into account when carrying out the common projects and towards the preparation of the third JENAS event, which is planned to take place in about three years’ time.

The post JENAS picks up the pace in Spain appeared first on CERN Courier.

]]>
Meeting report The second joint ECFA, NuPECC and APPEC symposium offered participants a comprehensive assessment of overlapping research topics. https://cerncourier.com/wp-content/uploads/2022/11/CCNovDec22_FN_Jenas.jpg
Your guide to becoming a CERN guide https://cerncourier.com/a/your-guide-to-becoming-a-cern-guide/ Mon, 05 Sep 2022 11:54:27 +0000 https://preview-courier.web.cern.ch/?p=106158 The most satisfying thing is witnessing people’s enthusiasm and their desire to learn more about CERN and its mission, says Bryan Pérez Tapia.

The post Your guide to becoming a CERN guide appeared first on CERN Courier.

]]>
Bryan Pérez Tapia

Do you remember the first time you heard about CERN? The first time someone told you about that magical place where bright minds from all over the world work together towards a common goal? Perhaps you saw a picture in a book, or had the chance to visit in person as a student? It is experiences like these that motivate many people to pursue a career in science, whether in particle physics or beyond.

In 2016 I had the pleasure of visiting CERN on a school trip. We toured the Synchrocyclotron and the SM18 magnet test facility. I was hooked. The tour guides talked with passion about the laboratory, the film presenting CERN’s first particle accelerator and the laboratory’s mission, and all those big magnets being tested in SM18. It was this experience that motivated me to study physics at university and to try to come back as soon as I could.

Accreditation

That chance arrived in September 2021 when I started a one-year technical studentship as editorial assistant on the Courier. From the first day I was eager to see as much as I could. During the final months of Long Shutdown 2, my supervisor and I visited the ATLAS cavern. The experience motivated me to ask one of my newly made friends, also a technical student who had recently become a tour guide, how to apply. The process was positive and efficient. After completing all the required courses from the learning hub and shadowing experienced guides, I became a certified ATLAS underground guide in November 2021 and gave my first tour soon after. I was nervous and struggled with the iris scanner when accessing the cavern, but all ended well, and further tours were scheduled. Then, in mid-December, all in-person tours were cancelled due to COVID-19 restrictions. I needn’t have worried, as CERN was fully geared up to provide virtual visits. Among my first virtual audience members were students from the high school that brought me to CERN five years earlier and from my university, Nottingham Trent in the UK. 

The most satisfying thing is people’s enthusiasm and their desire to learn more about CERN and its mission

The virtual visits were quite challenging at first. It was harder to connect with the audience than during an in-person visit. But managing these difficulties helped me to improve my communication skills and to develop self-confidence. During this period, I conducted more than 10 virtual visits for different institutes, universities, family and friends, in both English and Spanish. 

At the beginning of March 2022, CERN moved into “level yellow” and in-person visits were resumed. Although only possible for a short period, I had the chance to guide visitors underground and had the honour of guiding the last in-person visit into the ATLAS cavern on 23 March before preparations for LHC Run 3 got under way. With the ATLAS cavern then off-limits, I signed up to present at as many CERN visit points as possible. At the time of writing, I am a guide for the Synchrocyclotron, the ATLAS Visitor Centre, Antimatter Factory, Data Centre, Low Energy Ion Ring and CERN Control Centre. 

Get involved

The CERN visits service always welcomes new guides and is working towards opening new visit points. Anyone working at CERN or registered as a user can take part by signing up for visit-point training on the tour-guide website: guides.web.cern.ch. General training for new guides is also available. All you need to show CERN to the public is passion and enthusiasm, and you can sign up for as many or as few as your day job allows. Diversity is encouraged and those who are multilingual are also highly valued.

Today, visits are handled by a dedicated section in the Education, Communications and Outreach group. The number of visitors has gradually increased over recent years, with 152,000 annual visitors before the pandemic started, excluding special events such as the CERN Open Days. The profile of visitors ranges from school pupils and university students to common-interest groups such as engineers and scientists, politicians and VIPs, and people with a wide range of interests and educational levels.

The benefits of becoming a CERN guide are immense. It gives you access to areas that would otherwise not be possible, the chance to experience important events in-person and to see your work at CERN, whatever it involves, from a fresh perspective. My personal highlight was watching test collisions at 13.6 TeV before the official start of Run 3 while showing Portuguese high-school students the ATLAS control room. The most satisfying thing is people’s enthusiasm and their desire to learn more about CERN and its mission. I particularly remember how a small child asked me a question about the matter–antimatter asymmetry of the universe, and how another young visitor ran from Entrance B at the end of a tour just to tell me how much she loved the visit.

The visits service makes it as easy as possible to get involved, and exciting times for guides lie ahead with the opening of the CERN Science Gateway next year, which will enable CERN to welcome even more visitors. If a technical student based at CERN for just one year can get involved, so can you!

The post Your guide to becoming a CERN guide appeared first on CERN Courier.

]]>
Careers The most satisfying thing is witnessing people’s enthusiasm and their desire to learn more about CERN and its mission, says Bryan Pérez Tapia. https://cerncourier.com/wp-content/uploads/2022/09/CCSepOct22_Careers_feature.jpg
SESAME revives the ancient Near East https://cerncourier.com/a/sesame-revives-the-ancient-near-east/ Thu, 25 Aug 2022 08:28:26 +0000 https://preview-courier.web.cern.ch/?p=102031 Around 240 registrants in 39 countries gathered for the first SESAME Cultural Heritage Day.

The post SESAME revives the ancient Near East appeared first on CERN Courier.

]]>
The IR microscope at SESAME

The Synchrotron-light for Experimental Science and Applications in the Middle East (SESAME) is a 2.5 GeV third-generation synchrotron radiation (SR) source developed under the auspices of UNESCO and modelled after CERN. Located in Allan, Jordan, it aims to foster scientific and technological excellence as well as international cooperation amongst its members, which are currently Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, Palestine and Turkey. As a user facility, SESAME hosts visiting scientists from a wide range of disciplines, allowing them to access advanced SR techniques that link the functions and properties of samples and materials to their micro, nano and atomic structure.

The location of SESAME is known for its richness in archaeological and cultural heritage. Many important museums, collections, research institutions and universities host departments dedicated to the study of materials and tools that are inextricably linked to prehistory and human history, demanding interdisciplinary research agendas and teams. As materials science and condensed-matter physics play an increasing role in understanding and reconstructing the properties of artefacts, SESAME offers a highly versatile tool for the researchers, conservators and cultural-heritage specialists in the region.

The high photon flux, small source size and low divergence available at SR sources allow for advanced spectroscopy and imaging techniques that are well suited for studying ancient and historical materials, and which often present very complex and heterogeneous structures. SR techniques are non-destructive, and the existence of several beamlines at SR facilities means that samples can easily be transferred and reanalysed using complementary techniques.

SESAME offers a versatile tool for researchers, conservators and cultural-heritage specialists in the region

At SESAME, an infrared microspectroscopy beamline, an X-ray fluorescence and absorption spectroscopy beamline, and a powder diffraction beamline are available, while a soft X-ray beamline called “HESEB” has been designed and constructed by five Helmholtz research centres and is now being commissioned. Next year, the BEAmline for Tomography at SESAME (BEATS) will also be completed, with the construction and commissioning of a beamline for hard X-ray full-field tomography. BEATS involves the INFN, The Cyprus Institute and the European SR facilities ALBA-CELLS (Spain), DESY (Germany), ESRF (France), Elettra (Italy), PSI (Switzerland) and SOLARIS (Poland).

To explore the potential of these beamlines, the First SESAME Cultural Heritage Day took place online on 16 February with more than 240 registrants in 39 countries. After a welcome by SESAME director Khaled Toukan and president of council Rolf Heuer, Mohamed ElMorsi (Conservation Centre, National Museum of Egyptian Civilization), Marine Cotte (ESRF) and Andrea Lausi (SESAME) presented overviews of ancient Egyptian cultural heritage, heritage studies at the ESRF, and the experimental capabilities of SESAME, respectively. This was followed by several research insights obtained by studies at SESAME and other SR facilities: Maram Na’es (TU Berlin) showed the reconstruction of colour in Petra paintings; Heinz-Eberhard Mahnke and Verena Lepper (Egyptian Museum and Papyrus Collection, FU/HU Berlin and HZB) explained how to analyse ancient Elephantine papyri using X-rays and tomography; Amir Rozatian (University of Isfahan) and Fatma Marii (University of Jordan) determined the material of pottery, glass, metal and textiles from Iran and ancient glass from the Petra church; and Gonca Dardeniz Arıkan (Istanbul University) provided an overview of current research into the metallurgy of Iran and Anatolia, the origins of glassmaking, and the future of cultural heritage studies in Turkey. Palaeontology with computed tomography and bioarchaeological samples were highlighted in talks by Kudakwashe Jakata (ESRF) and Kirsi Lorentz (The Cyprus Institute).

During the following discussions, it was clear that institutions devoted to the research, preservation and restoration of materials would benefit from developing research programmes in close cooperation with SESAME. Because of the multiple applications in archaeology, palaeontology, palaeo-environmental science and cultural heritage, it will be necessary to establish a multi-disciplinary working group, which should also share its expertise on practical issues such as handling, packaging, customs paperwork, shipping and insurance. 

The post SESAME revives the ancient Near East appeared first on CERN Courier.

]]>
Meeting report Around 240 registrants in 39 countries gathered for the first SESAME Cultural Heritage Day. https://cerncourier.com/wp-content/uploads/2022/06/CCJulAug22_FN-sesame.jpg
Accelerating knowledge transfer with physics https://cerncourier.com/a/accelerating-knowledge-transfer-with-physics/ Thu, 14 Jul 2022 16:53:37 +0000 https://preview-courier.web.cern.ch/?p=102024 The African Conference on Fundamental and Applied Physics attracted more than 600 participants.

The post Accelerating knowledge transfer with physics appeared first on CERN Courier.

]]>
Countries in Africa participating in ACP2021

Science and technology are key instruments for a society’s economic growth and development. Yet Africa’s science, innovation and education have been chronically under-funded. Transferring knowledge, building research capacity and developing competencies through training and education are major priorities for Africa in the 21st century. Physics combines these priorities by extending the frontiers of knowledge and inspiring young people. It is therefore essential to make basic knowledge of emerging technologies available and accessible to all African citizens to build a steady supply of trained and competent researchers. 

In this spirit, the African School of Fundamental Physics and Applications was initiated in 2010 as a three-week biennial event. To increase networking opportunities among participants, the African Conference on Fundamental and Applied Physics (ACP) was included as a one-week extension of the school. The first edition was held in Namibia in 2018; the second, co-organised by Mohammed V University and Cadi Ayyad University in Morocco, was rebranded ACP2021 and, originally scheduled to take place in December 2021, was postponed due to COVID-19. The virtual event, held from 7 to 11 March, attracted more than 600 registrants, an order of magnitude more than the first edition. 

The ACP2021 scientific programme covered the three major physics areas of interest in Africa defined by the African Physical Society: particles and related applications; light sources and their applications; and cross-cutting fields covering accelerator physics, computing, instrumentation and detectors. The programme also included topics in quantum computing and quantum information, as well as machine learning and artificial intelligence. Furthermore, ACP2021 focused on topics related to physics education, community engagement, women in physics and early-career physicists. The agenda was stretched to accommodate different time zones and 15 parallel sessions took place.

Welcome speeches by Hassan Hbid (Cadi Ayyad University) and by Mohammed Rhachi (Mohammed V University) were followed by a plenary talk by former CERN Director-General Rolf Heuer, “Science bridging Cultures and Nations”, and an overview of the African Strategy for Fundamental and Applied Physics (ASFAP). Launched in 2021, the ASFAP aims to increase African education and research capabilities, build the foundations and frameworks to attract the participation of African physicists, and establish a culture of awareness of grassroots physics activities, in contrast to the top-down strategies initiated by governments. Shamila Nair-Bedouelle (UNESCO) conveyed a deep appreciation of and support for the ASFAP initiative, which is aligned with the agenda of the United Nations Sustainable Development Goals. A rich panel discussion followed, raising different views on physics education and research roadmaps in Africa.

A central element of the ACP2021 physics programme was the ASFAP community planning meeting, where physics and community-engagement groups discussed progress in soliciting the community input that is critical for the ASFAP report. The report will outline the direction for the next decade to encourage and strengthen higher education, capacity building and scientific research in Africa.

The motivation and enthusiasm of the ACP2021 participants were notable, and the efforts in support of research and education across Africa were encouraged. The next ACP in 2023 will be hosted by South Africa.

The post Accelerating knowledge transfer with physics appeared first on CERN Courier.

]]>
Meeting report The African Conference on Fundamental and Applied Physics attracted more than 600 participants. https://cerncourier.com/wp-content/uploads/2022/06/CCJulAug22_FN-Africa_feature.jpg
Canadian particle physics at 50 https://cerncourier.com/a/canadian-particle-physics-at-50/ Mon, 23 May 2022 07:11:45 +0000 https://preview-courier.web.cern.ch/?p=100364 The Institute of Particle Physics continues to support novel projects, while diversifying and extending Canadian particle physics.

The post Canadian particle physics at 50 appeared first on CERN Courier.

]]>
Ernest Rutherford’s pioneering work on radioactive decay at McGill University, Montreal, in the early 1900s marked the beginning of Canadian subatomic physics. By the middle of the century, research in nuclear physics and the properties of fissionable material was being conducted at the Montréal Laboratory of the National Research Council of Canada (NRC) and at Chalk River Laboratories, which later became Atomic Energy of Canada Limited, by Bruno Pontecorvo, Bernice Weldon Sargent, Ted Hincks and John Robson, and others. Many Canadian physicists were starting to participate in experiments abroad, and in the 1960s the NRC began funding university professors to work on high-energy physics experiments at US labs. When the US National Accelerator Laboratory (now Fermilab) was approved in 1966, Canadian physicists expressed their strong interest and formed the “200 GeV study group”, chaired by Hincks. Their report, published in March 1969, formed the basis of the foundation of the Institute of Particle Physics (IPP) to steer Canadian involvement at Fermilab.

IPP research scientists are the glue in the Canadian particle-physics community and enable university-based researchers to have an impact in international collaborations.

The IPP serves both as the focal point for particle-physics activities across Canada and as the point of contact for research partners in laboratories and universities worldwide. For the past 50 years, the non-profit corporation owned by institutional members has expanded Canada’s particle-physics programme. Today, it is operated by 17 institutional members, including the TRIUMF laboratory, the Sudbury Neutrino Observatory Laboratory (SNOLAB) and the Perimeter Institute for Theoretical Physics, as well as 230 individual members. In addition to projects at TRIUMF and SNOLAB, IPP members are heavily engaged in international collaborations, such as in ATLAS at CERN and T2K in Japan. Almost all individual members are university faculty or permanent scientific staff working on a diverse set of experiments and theories.

Strong collaboration

The IPP was incorporated on 10 March 1971, with institutional members appointing a board of trustees and electing a six-member scientific council. The board selects the IPP director, who is endorsed by the individual members. Incumbent Michael Roney (Victoria) is the institute’s eighth director.

A cornerstone since the 1970s has been the IPP research scientists’ programme. IPP research scientists serve as the glue in the Canadian particle-physics community and are essential in enabling university-based researchers to have an impact well above what their numbers would warrant in large international collaborations. Recruited by the IPP council via a national and competitive process, each scientist holds an appointment at a host IPP member university, enabling them to work with graduate students, hold grants and undertake long-term stays at international laboratories. This has resulted in significant leadership roles in a number of projects, including ARGUS, OPAL, ZEUS, SNO, BABAR, and more recently ATLAS, T2K, Belle II, SNO+, PICO and DUNE. So far, IPP has had 21 research scientists, 13 of whom have either retired or moved to faculty positions.

IPP projects

Promoting participation in large international particle physics experiments by Canadian university-based physicists is a core mission of the IPP. In addition to the work of the IPP research scientist programme, this is accomplished by coordinating particle-physics activities in research and society in Canada and by introducing young Canadians to opportunities in particle physics through the CERN summer student and teacher programmes. Because of the IPP’s decentralised organisation, individual university interests are parked at its door. This enables healthy, vigorous, and highly collaborative teams built from multiple Canadian universities to have a substantial impact in international collaborations.

From CHEER to SNO

The first major IPP project proposal was the Canadian High Energy Electron Ring (CHEER), an electron storage ring feeding off a straight section of the Main Ring to study high-energy electron-proton collisions. Although the 1980 proposal was not approved by the Fermilab directorate, it paved the way for many fruitful collaborations for Canadian physicists. The CHEER team was invited to join HERA at DESY in 1981, which led to a long-term Canadian-DESY collaboration also involving ARGUS, HERMES and ZEUS. Contributions to ARGUS included the construction of the vertex detector’s mechanical structure and the online data acquisition system by Toronto and York universities. For ZEUS, a collaboration between the universities of York, McGill, Toronto and Manitoba from 1987 to 1990 constructed 26 large calorimeter modules. HERA also marked a major Canadian contribution to an offshore accelerator, with a proton transfer line designed and built by TRIUMF and proton-ring radio-frequency cavities built by Atomic Energy of Canada Limited.

CHEER

Another part of the CHEER team, including Carleton University and the NRC particle physics group, joined the OPAL collaboration at CERN’s LEP collider in 1982. OPAL was the largest particle-physics project in Canada between 1989 and 2000. Canadian teams were responsible for building the detector’s central vertex-wire chamber and “zed” chambers at the outer radius of the large OPAL jet chamber, with the Montréal group building parts of the tracker data acquisition system. In 1992, researchers from TRIUMF and the universities of Victoria and British Columbia, who had constructed part of the SLD calorimeter at SLAC, joined OPAL and deployed its online processing system, while a team from Alberta developed the OPAL scintillating tile end-cap.

IPP’s precision flavour-physics programme using e⁺e⁻ colliders at the Upsilon resonance began with ARGUS and continued with the BaBar experiment at SLAC’s PEP-II collider. The BaBar drift chamber was built at TRIUMF in the late 1990s in a collaboration with McGill, Montréal, British Columbia, Victoria and US colleagues. In addition to physics positions held by the Canadian team, including three IPP research scientists, they contributed to senior BaBar management roles over the years. Canada’s flavour-physics programme continues today with the Belle II experiment at KEK.

One of the great successes in Canadian particle physics, led by Queen’s, Carleton, Laurentian and other universities, was the Sudbury Neutrino Observatory (SNO). Centred on a 1000-tonne tank of heavy water located at a depth of 2100 m in the Vale Creighton mine in Sudbury, Ontario, SNO was built to investigate the properties of neutrinos and to confront the solar neutrino puzzle. The observatory operated from 1999 to 2006, with its director Art McDonald sharing the 2015 Nobel Prize in Physics for SNO’s contributions to the discovery of neutrino oscillations. Following SNO’s tremendous success, SNOLAB expanded the facility to 5000 m² of clean and well-equipped underground space for experiments that benefit from the low-background environment. Current IPP projects operating at SNOLAB include the SNO+ neutrinoless double-beta decay experiment, and the DEAP-3600 and PICO direct dark-matter searches. Next-generation dark-matter projects, such as SuperCDMS and the Scintillating Bubble Chamber, are currently under construction.

IPP also has a strong involvement in accelerator-based neutrino projects, in particular the Tokai to Kamioka (T2K) long-baseline neutrino experiment, the new Hyper-Kamiokande experiment in Japan and DUNE in the US, the latter two of which are under construction. Canadians were among the founding members of T2K, making strong intellectual contributions to the off-axis neutrino beam concept at the heart of the project. They have led the design and construction of key elements of the ND280 near-detector, which characterises the neutrino beams, and built the optical transition radiation monitor that measures the proton beam at the highly radioactive neutrino production target. T2K’s remote handling infrastructure was designed and deployed by TRIUMF. Today, Canadian physicists have leadership roles in a wide range of T2K physics studies, including the first indications of oscillations of muon neutrinos to electron neutrinos and strong constraints on the CP-violating phase in the neutrino-oscillation matrix.

LHC and beyond

ATLAS at the LHC is currently the largest particle-physics project in Canada, with around 40 faculty participants. Canadians were among the founding members of ATLAS in 1992, making key contributions to both the design and construction of parts of the liquid argon calorimeters with additional work on the high-level trigger, pixel detector, transition radiation tracker, luminosity and radiation monitoring, and computing. They are involved in many ATLAS physics studies, and have a leading role in several upgrade projects, including the muon New Small Wheel, the liquid-argon trigger and the silicon-strip detectors for the new inner tracker (ITk) for the HL-LHC. The LHC is another accelerator complex to which TRIUMF made important contributions, including power supply systems for the Proton Synchrotron upgrades and the quadrupole magnets used for beam cleaning in the LHC ring.

ATLAS, 2002

The IPP has also had a diverse set of projects in the US, including the Large Acceptance Superconducting Solenoid (LASS) multi-particle spectrometer facility at SLAC and the fixed-target experiments E691 and E705 at Fermilab, as well as the CDF experiment at the Tevatron. Canadians were also leaders in the rare-kaon decay experiments E787 and E949 at Brookhaven, which made the first observations of the decay of a charged kaon to a pion and a neutrino–antineutrino pair, leading to an involvement in the NA62 experiment at CERN. The precision MOLLER experiment at JLab is another IPP project, and IPP is also engaged in the particle-astrophysics projects IceCube at the South Pole, the P-ONE deep ocean neutrino observatory off the Canadian west coast, and VERITAS at the Fred Lawrence Whipple Observatory.

For the past five decades, the IPP has united Canadians working on a diverse set of particle-physics projects, advocated for support by the Canadian government and funding agencies, organised long-range planning for the community, and represented Canada in international committees and steering groups. IPP also employs research scientists playing important roles in international collaborations and enabling Canadian scientific leadership. The IPP continues to support new and novel projects (see figure), while diversifying and extending the Canadian particle-physics programme.

The post Canadian particle physics at 50 appeared first on CERN Courier.

]]>
Feature The Institute of Particle Physics continues to support novel projects, while diversifying and extending Canadian particle physics. https://cerncourier.com/wp-content/uploads/2022/05/DEAP-3600.jpg
Have you got what it takes to teach? https://cerncourier.com/a/have-you-got-what-it-takes-to-teach/ Sat, 19 Feb 2022 15:18:45 +0000 https://preview-courier.web.cern.ch/?p=97796 CERN alumni who have returned to the classroom reveal teaching to be one of the hardest but most rewarding things they have ever done.

The post Have you got what it takes to teach? appeared first on CERN Courier.

]]>
ICTP Physics Without Frontiers event

Particle physicists are no strangers to outreach, be it giving public talks, writing popular books or taking part in science shows. But how many are brave enough to enter a career in teaching, arguably the most important science-communication activity of all? CERN alumni who have returned to the classroom reveal teaching to be one of the hardest but most rewarding things they have ever done. 

“I love my job,” exclaims Octavio Dominguez, who completed his PhD in 2013 studying the appearance of electron-cloud build-up in the LHC before deciding to switch to teaching. Having personally benefitted from some excellent teachers who sparked an “unquenchable curiosity”, he says, the idea of being a teacher had been on his mind ever since he was at secondary school. “The profession is definitely not exempt of challenges. Well, in fact I can say it’s the most difficult thing I’ve ever done… But if I keep doing it, it’s because the feedback from students is absolutely priceless. It’s truly amazing seeing my students evolve into the best version of themselves.”

Job satisfaction

Despite giving as many as 25 lessons per week, including presentations and practicals, and spending long hours outside school preparing materials and marking assignments, teachers cite happiness and personal satisfaction as the main rewards of the job. “I particularly enjoy seeing the enthusiasm in students’ eyes – it is something that cannot be explained with words,” says Eleni Ntomari, who was a summer student at CERN in 2006, then a PhD student and postdoc working on the CMS experiment. “From the outside, teaching might not appear difficult, but in reality it is not just a profession but a ‘project’ with no timetable and a continuation of trying to learn new things in order to become more efficient and helpful for your students.” Ntomari took advantage of every teaching opportunity that academic life offered, from being a lab instructor to becoming a CERN guide and giving talks at local schools. When a teaching opportunity in Greece arose during her postdoctoral fellowship at DESY, she took it: “I realised teaching was highly gratifying, so I decided to continue my career as a physics teacher in secondary and high schools.”

I particularly enjoy seeing the enthusiasm in students’ eyes

Eleni Ntomari

Teachers of STEM subjects are in acute demand. In the US, physics has the most severe teacher shortage, followed by mathematics and chemistry, with large surpluses of biology and earth-science teachers, according to the Cornell physics teacher education coalition. Furthermore, around two thirds of US high-school physics teachers do not have a degree in physics or physics education. The picture is similar in Europe, with a brief teacher survey carried out by the European Physical Society in 2020 revealing the overwhelming opinion that a serious problem exists: 81% of respondents believed there is a shortage of specialist teachers in their country, of whom 87% thought that physics is being taught by non-specialists.

Initiatives such as the UN International Day of Education on 24 January help to bring visibility and recognition to the profession, says Dominguez: “Education is one of the principal means to change the world for the better, but I feel that the teaching profession is frequently disregarded by many people in our society,” he says. “I’ve spent most of my career as a teacher in schools in deprived areas of the UK, and now I’m doing my second year in one of the most affluent schools in the country. This has given me a new perspective on society and has helped me understand better why some behaviour patterns appear.”

The CERN effect

The fascinating machines and thought-provoking concepts underpinning particle physics make a research background at CERN a major bonus in the classroom, explains Alexandra Galloni, a CERN summer student in 1995 who completed her PhD at the DELPHI experiment in 1998, spent a decade in IT consultancy, and is now head of science and technology at one of the UK’s top-performing secondary schools. “I milked my PhD as much as I could – I promised a visit from Brian Cox to my first school at interview, and although I didn’t pull that one off, contacts at CERN have enriched life both at school and on many of the CERN trips I inevitably ended up running. The Liverpool LHCb team have hosted incredible ‘Particle Schools’ at CERN for students and staff from many schools almost every year since then, leading to gushing feedback from all involved.”

I love the variety, the unexpected moments and the human interaction in the classroom

Alexandra Galloni

Keeping in touch with events at CERN has also led to exciting moments for the students, she adds, such as watching the Higgs-discovery announcement in 2012, applying for Beamline for Schools in 2014, taking part in the ATLAS Open Data project and participating in Zoom calls with CERN contacts about future colliders and antimatter. “The surrounding tasks to teaching can be gruelling, and I would be lying if I said I didn’t resent the never-ending to-do list and lack of being able to plan much personal time during term-time. But I love the variety, the unexpected moments and the human interaction in the classroom.”

CERN offers many professional-development programmes for teachers to keep up-to-date with developments in particle physics and related areas, as well as dedicated experiment sessions at “S’Cool LAB”, the coordination of the highly popular Beamline for Schools competition and internships for high-school students. These efforts are also underpinned by an education-research programme that has seen five PhD theses produced during the past five years as well as 67 published articles since the programme began in 2009. “We are reaching out to all our member states and beyond to enthuse the next generations of STEM professionals and contribute to their science education,” says Sascha Schmeling, who leads the CERN teacher and student programmes. “Engaging the public with fundamental research is a vital part of CERN’s mission.” 

The post Have you got what it takes to teach? appeared first on CERN Courier.

]]>
Careers CERN alumni who have returned to the classroom reveal teaching to be one of the hardest but most rewarding things they have ever done. https://cerncourier.com/wp-content/uploads/2022/03/CCMarApr22_CAREERS-frontis.jpg
A systematic approach to systematics https://cerncourier.com/a/a-systematic-approach-to-systematics/ Mon, 20 Dec 2021 15:26:46 +0000 https://preview-courier.web.cern.ch/?p=96664 The latest PHYSTAT meeting concentrated on the way systematic effects are incorporated in a range of particle-physics analyses.

The post A systematic approach to systematics appeared first on CERN Courier.

]]>
Whenever we perform an analysis of our data, whether measuring a physical quantity of interest or testing some hypothesis, it is necessary to assess the accuracy of our result. Statistical uncertainties arise from the limited accuracy with which we can measure anything, or from the natural Poisson fluctuations involved in counting independent events. They have the property that repeated measurements result in greater accuracy.

Systematic uncertainties, on the other hand, arise from many sources and may not cause a spread in results when experiments are repeated, but merely shift them away from the true value. Accumulating more data usually does not reduce the magnitude of a systematic effect. As a result, estimating systematic uncertainties typically requires much more effort than for statistical ones, and more personal judgement and skill is involved. Furthermore, statistical uncertainties between different analyses usually are independent; this often is not so for systematics.
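
To make the contrast concrete, the toy study below – an illustrative sketch written for this summary rather than material presented at the meeting, with an invented event rate and an assumed 3% bias – repeats a Poisson counting experiment with ever larger data sets: the statistical spread shrinks roughly as 1/√N, while the shift caused by the uncorrected systematic effect does not move at all.

```python
# Illustrative toy (not from the PHYSTAT talks): statistical spread vs a fixed bias.
import numpy as np

rng = np.random.default_rng(1)
true_rate = 100.0   # assumed events per unit of data
bias = 0.03         # assumed 3% uncorrected systematic shift of the measured rate

for n_units in (1, 100, 10000):
    # 2000 toy experiments, each estimating the rate from n_units of data
    toys = rng.poisson(true_rate * n_units, size=2000) / n_units
    print(f"{n_units:>6} units: statistical spread ~ {toys.std():5.2f}, "
          f"systematic shift ~ {bias * true_rate:4.1f}")
```

Multiplying the data set by 10,000 reduces the statistical spread a hundred-fold but leaves the three-unit systematic shift untouched, which is why such effects usually call for dedicated auxiliary measurements rather than simply more data.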

The November event saw the largest number of statisticians at any PHYSTAT meeting

In particle-physics analyses, many systematics are related to detector and analysis effects. Examples include trigger efficiency; jet energy scale and resolution; identification of different particle types; and the strength of backgrounds and their distributions. There are also theoretical uncertainties which, as well as affecting predicted values for comparison with measured ones, can also influence the experimental variables extracted from the data. Another systematic comes from the intensity of accelerator beams (the integrated luminosity at the LHC for example), which is likely to be correlated for the various measurements made using the same beams.

At the LHC, it is in analyses with large amounts of data where systematics are likely to be most relevant. For example, a measurement of the mass of the W boson published by the ATLAS collaboration in 2018, based on a sample of 14 million W-boson decays, had a statistical uncertainty of 7 MeV but a systematic uncertainty of 18 MeV.
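
If, for the sake of illustration, the statistical and systematic components are taken to be uncorrelated, they add in quadrature; the few lines below simply reproduce that arithmetic for the quoted numbers.

```python
# Quadrature combination of the quoted W-mass uncertainties (illustration only;
# assumes the statistical and systematic parts are uncorrelated).
stat, syst = 7.0, 18.0                         # MeV, as quoted above
total = (stat**2 + syst**2) ** 0.5
print(f"total uncertainty ~ {total:.0f} MeV")  # ~19 MeV, dominated by systematics
```

The total is driven almost entirely by the systematic term – halving the statistical uncertainty would barely change it.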

PHYSTAT-Systematics

Two big issues for systematics are how the magnitudes of the different sources are estimated, and how they are then incorporated in the analysis. The PHYSTAT-Systematics meeting concentrated on the latter, as it was thought that this was more likely to benefit from the presence of statisticians – a powerful feature of the PHYSTAT series, which started at CERN in 2000.

The 20 talks fell into three categories. The first were those devoted to analyses in different particle-physics areas: the LHC experiments; neutrino-oscillation experiments; dark-matter searches; and flavour physics. A large amount of relevant information was discussed, with interesting differences in the separate sub-fields of particle physics. For example, in dark-matter searches, upper limits sometimes are set using Yellin’s Maximum Gap method when the expected background is low, or by using Power Constrained Limits, whereas these tend not to be used in other contexts.

The second group followed themes: theoretical systematics; unfolding; mis-modelling; an appeal for experiments to publish their likelihood functions; and some of the many aspects that arise in using machine learning (where the machine-learning process itself can result in a systematic, and the increased precision of a result should not be at the expense of accuracy).

Finally, there was a series of talks and responses by statisticians. The November event saw the largest number of statisticians at any PHYSTAT meeting, and the efforts that they made to understand our intricate analyses and the statistical procedures that we use were much appreciated. It was valuable to have insights from a different viewpoint on the largely experimental talks. David van Dyk, for instance, emphasised the conceptual and practical differences between simply using the result of a subsidiary experiment’s estimate of a systematic to assess its effect on a result, and using the combined likelihood function for the main and the subsidiary measurements. Also, in response to talks about flavour physics and neutrino-oscillation experiments, attention was drawn to the growing impact in cosmology of non-parametric, likelihood-free (simulation-based likelihoods) and Bayesian methods. Likelihood-free methods came up again in response to a modelling talk based on LHC-experiment analyses, and the role of risk estimation was emphasised by statisticians. Such suggestions for alternative statistical strategies open the door to further discussions about the merits of new ideas in particular contexts.

A novel feature of this remote meeting was that the summary talks were held a week later, to give speakers Nick Wardle and Sara Algeri more time. In her presentation, Algeri, a statistician, called for improved interaction between physicists and statisticians in dealing with these interesting issues.

Overall, the meeting was a good step on the path towards having a systematic approach to systematics. Systematics is an immense topic, and it was clear that one meeting spread over four afternoons was not going to solve all the issues. Ongoing PHYSTAT activities are therefore planned, and the organisers welcome further suggestions.

The post A systematic approach to systematics appeared first on CERN Courier.

]]>
Meeting report The latest PHYSTAT meeting concentrated on the way systematic effects are incorporated in a range of particle-physics analyses. https://cerncourier.com/wp-content/uploads/2021/12/PHYSTAT-feature-image.jpg
Counting collisions precisely at CMS https://cerncourier.com/a/counting-collisions-precisely-at-cms/ Wed, 03 Nov 2021 13:06:56 +0000 https://preview-courier.web.cern.ch/?p=95700 Beyond the setting of new records, precise knowledge of the luminosity at particle colliders is vital for physics analyses.

The post Counting collisions precisely at CMS appeared first on CERN Courier.

]]>
The start of Run-2 physics

Year after year, particle physicists celebrate the luminosity records established at accelerators around the world. On 15 June 2020, for example, a new world record for the highest luminosity at a particle collider was claimed by SuperKEKB at the KEK laboratory in Tsukuba, Japan. Electron–positron collisions at the 3 km-circumference machine had reached an instantaneous luminosity of 2.22 × 10³⁴ cm⁻²s⁻¹ – surpassing the 27 km-circumference LHC’s record of 2.14 × 10³⁴ cm⁻²s⁻¹ set with proton–proton collisions in 2018. Within a year, SuperKEKB had celebrated a new record of 3.1 × 10³⁴ cm⁻²s⁻¹ (CERN Courier September/October 2021 p8).

Integrated proton–proton luminosity

Beyond the setting of new records, precise knowledge of the luminosity at particle colliders is vital for physics analyses. Luminosity is our “standard candle” in determining how many particles can be squeezed through a given space (per square centimetre) at a given time (per second); the more particles we can squeeze into a given space, the more likely they are to collide, and the quicker the experiments fill up their tapes with data. Multiplied by the cross section, the luminosity gives the rate at which physicists can expect a given process to happen, which is vital for searches for new phenomena and precision measurements alike. Luminosity milestones therefore mark the dawn of new eras, like the B-hadron or top-quark factories at SuperKEKB and LHC (see “High-energy data” figure). But what ensures we didn’t make an accidental blunder in calculating these luminosity record values?
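
As a back-of-the-envelope illustration of that bookkeeping – the cross-section value below is an approximate 13 TeV figure assumed for the example, not a number quoted in this article – multiplying a record instantaneous luminosity by the total Higgs-production cross section gives a production rate of roughly one Higgs boson per second.

```python
# Sketch: expected event rate = cross section x instantaneous luminosity.
lumi = 2.0e34            # instantaneous luminosity [cm^-2 s^-1], LHC record ballpark
sigma_higgs_pb = 55.0    # assumed total Higgs production cross section at 13 TeV [pb]
pb_to_cm2 = 1.0e-36      # 1 pb = 1e-36 cm^2

rate = sigma_higgs_pb * pb_to_cm2 * lumi                  # Higgs bosons per second
print(f"~{rate:.1f} Higgs bosons produced per second")    # about 1 Hz
```

Integrated over the roughly 140 fb⁻¹ of Run 2, the same arithmetic lands at several million Higgs bosons – provided, of course, that the luminosity itself is known accurately.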

Physics focus

Physicists working at the precision frontier need to infer with percent-or-less accuracy how many collisions are needed to reach a certain event rate. Even though we can produce particles at an unprecedented event rate at the LHC, however, their cross section is either too small (as in the case of Higgs-boson production processes) or impacted too much by theoretical uncertainty (for example in the case of Z-boson and top-quark production processes) to enable us to establish the primary event rate with a high level of confidence. The solution comes down to extracting one universal number: the absolute luminosity.

Schematic view of the CMS detector

The fundamental difference between quantum electrodynamics (QED) and chromodynamics (QCD) influences how luminosity is measured at different types of colliders. On the one hand, QED provides a straightforward path to high precision because the absolute rate of simple final states is calculable to very high accuracy. On the other, the complexity in QCD calculations shapes the luminosity determination at hadron colliders. In principle, the luminosity can be inferred by measuring the total number of interactions occurring in the experiment (i.e. the inelastic cross section) and normalising to the theoretical QCD prediction. This technique was used at the SppS and Tevatron colliders. A second technique, proposed by Simon van der Meer at the ISR (and generalised by Carlo Rubbia for the proton–antiproton case), could not be applied to such single-ring colliders. However, this van der Meer-scan method is a natural choice at the double-ring RHIC and LHC colliders, and is described in the following.

Beam-separation-dependent event rate

Absolute calibration

The LHC-experiment collaborations perform a precise luminosity inference from data (“absolute calibration”) by relating the collision rate recorded by the subdetectors to the luminosity of the beams. With the implementation of multiple collisions per bunch crossing (“pileup”) and intense collision-induced radiation, which acts as a background source, dedicated luminosity-sensitive detector systems called luminometers also had to be developed (see “Luminometers” figure). To maximise the precision of the absolute calibration, beams with large transverse dimensions and relatively low intensities are delivered by the LHC operators during a dedicated machine preparatory session, usually held once a year and lasting for several hours. During these unconventional sessions, called van der Meer beam-separation scans, the beams are carefully displaced with respect to each other in discrete steps, horizontally and vertically, while observing the collision rate in the luminometers (see “Closing in” figure). This allows the effective width and height of the two-dimensional interaction region, and thus the beam’s transverse size, to be measured. Sources of systematic uncertainty are either common to all experiments and are estimated in situ, for example residual differences between the measured beam positions and those provided by the operational settings of the LHC magnets, or depend on the scatter between luminometers. A major challenge with this technique is therefore to ensure that the obtained absolute calibration as extracted under the specialised van der Meer conditions is still valid when the LHC operates at nominal pileup (see “Stability shines” figure).
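
The arithmetic behind such a scan can be sketched in a few lines. The example below is a simplified toy with invented scan curves and ballpark beam parameters – not the CMS analysis code: the convolved widths Σx and Σy are taken from the area-to-peak ratio of the rate-versus-separation curves, the head-on luminosity of a single colliding bunch pair follows from f N1 N2 / (2π Σx Σy), and the ratio of peak rate to luminosity gives the luminometer’s visible cross section.

```python
# Toy van der Meer calibration (illustrative sketch, invented numbers).
import numpy as np

def convolved_width(sep_mm, rate_hz):
    """Sigma = integral(R dsep) / (sqrt(2*pi) * R_peak) for one scan direction."""
    area = float(np.sum(0.5 * (rate_hz[1:] + rate_hz[:-1]) * np.diff(sep_mm)))
    return area / (np.sqrt(2.0 * np.pi) * rate_hz.max())

def visible_cross_section_mb(scan_x, scan_y, n1, n2, f_rev=11245.0):
    sigma_x = convolved_width(*scan_x)                          # mm
    sigma_y = convolved_width(*scan_y)                          # mm
    peak_rate = 0.5 * (scan_x[1].max() + scan_y[1].max())       # Hz, beams head-on
    lumi = f_rev * n1 * n2 / (2.0 * np.pi * sigma_x * sigma_y)  # mm^-2 s^-1 per bunch pair
    return (peak_rate / lumi) * 1.0e25                          # 1 mm^2 = 1e25 mb

# Invented Gaussian scan curves with beam parameters typical of vdM conditions
sep = np.linspace(-0.6, 0.6, 25)                   # beam separation [mm]
rate = 5000.0 * np.exp(-0.5 * (sep / 0.14) ** 2)   # single-bunch-pair rate [Hz]
sigma_vis = visible_cross_section_mb((sep, rate), (sep, rate), n1=8.5e10, n2=8.5e10)
print(f"visible cross section ~ {sigma_vis:.0f} mb")
```

During physics running the calibration is then used in reverse: the instantaneous luminosity is the measured luminometer rate divided by the visible cross section, which is why the stability of that calibration away from the special van der Meer conditions matters so much.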

Stepwise approach

Using such a stepwise approach, the CMS collaboration obtained a total systematic uncertainty of 1.2% in the luminosity estimate (36.3 fb⁻¹) of proton–proton collisions in 2016 – one of the most precise luminosity measurements ever made at bunched-beam hadron colliders. Recently, taking into account correlations between the years 2015–2018, CMS further improved on its preliminary estimate for the proton–proton luminosity at the higher collision energy of 13 TeV. The full Run-2 data sample corresponds to a cumulative (“integrated”) luminosity of 140 fb⁻¹ with a total uncertainty of 1.6%, which is comparable to the preliminary estimate from the ATLAS experiment.

Ratio of luminosities between luminometers

In the coming years, in particular when the High-Luminosity LHC (HL-LHC) comes online, a similarly precise luminosity calibration will become increasingly important as the LHC pushes the precision frontier further. Under those conditions, which are expected to produce 3000 fb⁻¹ of proton–proton data by the end of LHC operations in the late 2030s (see “Precision frontier” figure), the impact from (at least some of) the sources of uncertainty is expected to be larger due to the high pileup. However, they can be mitigated using techniques already established in Run 2 or currently under deployment. Overall, the strategy for the HL-LHC should combine three different elements: maintenance and upgrades of existing detectors; development of new detectors; and adding dedicated readouts to other planned subdetectors for luminosity and beam-monitoring data. This will allow us to meet the tight luminosity performance target (about 1%) while maintaining a good diversity of luminometers.

Given that accurate knowledge of luminosity is a key ingredient of most physics analyses, experiments also release precision estimates for specialised data sets, for example using either proton–proton collisions at lower centre-of-mass energies or involving nuclear collisions at different per-nucleon centre-of-mass energies, as needed not only by ALICE but also by the ATLAS, CMS and LHCb experiments. On top of the van der Meer method, the LHCb collaboration uniquely employs a “beam-gas imaging” technique in which vertices of interactions between beam particles and gas nuclei in the beam vacuum are used to measure the transverse size of the beams without the need to displace them. In all cases, and despite the fact that the experiments are located at different interaction points, their luminosity-related data are used in combination with input from the LHC beam instrumentation. Close collaboration among the experiments and LHC operators is therefore a key prerequisite for precise luminosity determination.

Protons versus electrons

Contrary to the approach at hadron colliders, the operation of the SuperKEKB accelerator with electron–positron collisions allows for an even more precise luminosity determination. Using well-known QED processes, the Belle II experiment recently reported an almost unprecedented precision of 0.7% for data collected during April–July 2018. Though electrons and positrons conceptually give the SuperKEKB team a slightly easier task, its new record for the highest luminosity set at a collider is thus well established.

Expected uncertainties

SuperKEKB’s record is achieved thanks to a novel “crabbed waist” scheme, originally proposed by accelerator physicist Pantaleo Raimondi. In the coming years this will enable the luminosity of SuperKEKB to be increased by a factor of almost 30 to reach its design target of 8 × 10³⁵ cm⁻²s⁻¹. The crabbed waist scheme, which works by squeezing the vertical height of the beams at the interaction point, is also envisaged for the proposed Future Circular Collider (FCC-ee) at CERN. It differs from the “crab-crossing” technology, based on special radiofrequency cavities, which is now being implemented at CERN for the high-luminosity phase of the LHC. While the LHC has passed the luminosity crown to SuperKEKB, taken together, novel techniques and the precise evaluation of their outcome continue to push forward both the accelerator and related physics frontiers.

The post Counting collisions precisely at CMS appeared first on CERN Courier.

]]>
Feature Beyond the setting of new records, precise knowledge of the luminosity at particle colliders is vital for physics analyses. https://cerncourier.com/wp-content/uploads/2021/10/CCNovDec21_CMS_frontis.jpg
Breaking records at EPS-HEP https://cerncourier.com/a/breaking-records-at-eps-hep/ Tue, 05 Oct 2021 14:27:14 +0000 https://preview-courier.web.cern.ch/?p=95346 EPS-HEP 2021 saw breathtaking results from LHC Run 2, writes Christophe Grojean.

The post Breaking records at EPS-HEP appeared first on CERN Courier.

]]>
EPS-HEP 2021 poster

In this year’s unusual Olympic summer, high-energy physicists pushed back the frontiers of knowledge and broke many records. The first one is surely the number of registrants to the EPS-HEP conference, hosted online from 26 to 30 July by the University of Hamburg and DESY: nearly 2000 participants scrutinised more than 600 talks and 280 posters. After 18 months of the COVID pandemic, the community showed a strong desire to meet and discuss physics with international colleagues. 

200 trillion b-quarks, 40 billion electroweak bosons, 300 million top quarks and 10 million Higgs bosons

The conference offered the opportunity to hear about analyses using the full LHC Run-2 data set, which is the richest hadron-collision data sample ever recorded. The results are breathtaking. As my CERN colleague Michelangelo Mangano explained recently to summer students, “The LHC works and is more powerful than expected, the experiments work and are more precise than expected, and the Standard Model works beautifully and is more reliable than expected.” About 3000 papers have been published by the LHC collaborations in the past decade. They have established the LHC as a truly multi-messenger endeavour, not so much because of the multitude of elementary particles produced – 200 trillion b-quarks, 40 billion electroweak bosons, 300 million top quarks and 10 million Higgs bosons – but because of the diversity of scientifically independent experiments that historically would have required different detectors and facilities, built and operated by different communities. “Data first” should always remain the leitmotif of the natural sciences. 

Paula Alvarez Cartelle (Cambridge) reminded us that the LHC has revealed new states of matter, with LHCb confirming that four or even five quarks can assemble themselves into new long-lived bound states, stabilised by the presence of two charm quarks. For theorists, these new quark-molecules provide valuable input data to tune their lattice simulations and to refine their understanding of the non-perturbative dynamics of strong interactions.

Theoretical tours de force

While Run 1 was a time for inclusive measurements, a multitude of differential measurements were performed during Run 2. Paolo Azzurri (INFN Pisa) reviewed the transverse momentum distribution of the jets produced in association with electroweak gauge bosons. These offer a way to test quantum chromodynamics and electroweak predictions at the highest achievable precision through higher-order computations, resummation and matching to parton showers. The work is fuelled by remarkable theoretical tours de force reported by Jonas Lindert (Sussex) and Lorenzo Tancredi (Oxford), which build on advanced mathematical techniques, including inspiring new developments in algebraic geometry and finite-field arithmetic. We experienced a historic moment: the LHC definitively became a precision machine, with measurements reaching and even surpassing LEP’s precision. This new situation has also induced a shift towards precision measurements, model-independent interpretations and Standard Model (SM) compatibility checks, and away from model-dependent searches for new physics. Effective-field-theory analyses are therefore gaining popularity, explained Veronica Sanz (Valencia and Sussex).

We know for certain that the SM is not the ultimate theory of nature. How and when the first cracks will be revealed is the big question that motivates future collider design studies. The enduring and compelling “B anomalies” reported by LHCb could well be the revolutionary surprise that challenges our current understanding of the structure of matter. The ratios of the decay widths of B mesons, either through charged or neutral currents, b→cℓν and b→sℓ⁺ℓ⁻, could finally reveal that the electron, muon and tau lepton differ by more than just their masses.

The statistical significance of the lepton flavour anomalies is growing, reported Franz Muheim (Edinburgh and CERN), creating “cautious” excitement and stimulating the creativity of theorists like Ana Teixeira (Clermont-Ferrand), who builds new physics models with leptoquarks and heavy vectors with different couplings to the three families of leptons, to accommodate the apparent lepton-flavour-universality violations. Belle II should soon bring new additional input to the debate, said Carsten Niebuhr (DESY).

Long-awaited results

The other excitement of the year came from the long-awaited results from the muon g-2 experiment at Fermilab, presented by Alex Keshavarzi (Manchester). The spin precession frequency of a sample of 10 billion muons was measured with a precision of a few hundred parts per billion, confirming the deviation from the SM prediction observed nearly 20 years ago by the E821 experiment at Brookhaven. With the current statistics, the deviation now amounts to 4.2σ. With the dataset foreseen to increase by a factor of 20 in the coming runs, the measurement will soon become systematics limited. Gilberto Colangelo (Bern) also discussed new and improved lattice computations of the hadronic vacuum polarisation, significantly reducing the discrepancy between the theoretical prediction and the experimental measurement. The jury is still out – and the final word might come from the g-2/EDM experiment at J-PARC.

Accelerator-based experiments might not be the place to prove the SM wrong. Astrophysical and cosmological observations have already taught us that SM matter only constitutes around 5% of the stuff that the universe is made of. The traditional idea that the gap in the energy budget of the universe is filled by new TeV-scale particles that stabilise the electroweak scale under radiative corrections is fading away. And a huge range of possible dark-matter scales opens up a rich and reinvigorated experimental programme that can profit from original techniques exploiting electron and nuclear recoils caused by the scattering of dark-matter particles. A front-runner in the new dark-matter landscape is the QCD axion originally introduced to explain why strong interactions do not distinguish matter from antimatter. Babette Döbrich (CERN) discussed the challenges inherent in capturing an axion, and described the many new experiments around the globe designed to overcome them.

Progress could also come directly from theory

Progress could also come directly from theory. Juan Maldacena (IAS Princeton) recalled the remarkable breakthroughs on the black-hole information problem. The Higgs discovery in 2012 established the non-trivial vacuum structure of space–time. We are now on our way to understanding the quantum mechanics of this space–time.

Like at the Olympics, where breaking records requires a lot of work and effort by the athletes, their teams and society, the quest to understand nature relies on the enthusiasm and the determination of physicists and their funding agencies. What we have learnt so far has allowed us to formulate precise and profound questions. We now need to create opportunities to answer them and to move ahead.

One should not underestimate how quickly the landscape of physics can change, whether through confirmation of the B anomalies or the discovery of a dark-matter particle. Let’s see what awaits us at the next EPS-HEP conference in 2023 in Hamburg – in person this time!

The post Breaking records at EPS-HEP appeared first on CERN Courier.

]]>
Meeting report EPS-HEP 2021 saw breathtaking results from LHC Run 2, writes Christophe Grojean. https://cerncourier.com/wp-content/uploads/2021/10/EPS.png
African physicists begin strategy process https://cerncourier.com/a/african-physicists-begin-strategy-process/ Mon, 04 Oct 2021 19:30:24 +0000 https://preview-courier.web.cern.ch/?p=95307 A town-hall meeting initiated a broad and community-driven discussion leading to a final strategy document in two to three years’ time.

The post African physicists begin strategy process appeared first on CERN Courier.

]]>
The African Strategy for Fundamental and Applied Physics

Africa’s science, innovation, education and research infrastructures have over the years been undervalued and under-resourced. This is particularly true in physics. The African Strategy for Fundamental and Applied Physics (ASFAP) initiative aims to define the education and physics priorities that can be most impactful for Africa. The first ASFAP community town hall was held from 12 to 15 July. The event was virtual, with 147 people participating, including international speakers and members of the ASFAP community. The purpose of the meeting was to initiate a broad and community-driven discussion and action programme, leading to a final strategy document in two to three years’ time.

The first day began with an overview of the ASFAP by Simon Connell (University of Johannesburg) on behalf of the steering committee and addresses by Shamila Nair-Bedouelle (UNESCO assistant director-general for natural sciences), Sarah Mbi Enow Anyang Agbor (African Union commissioner for human resources, science and technology) and Raissa Malu (member of the Democratic Republic of Congo’s Presidential Panel to the African Union). These honoured guests encouraged delegates to establish a culture of gender balance in African physics. Later, in a dedicated forum for women in physics, Iroka Chidinma Joy (chief engineer at the National Space Research and Development Agency) noted that women are drastically underrepresented in scientific fields across the continent, and pointed out a number of cultural, religious and social barriers that prevent women from pursuing higher education. Barriers can come as early as primary education: in most cases, girls are not encouraged to take leading roles in conducting science experiments in classrooms. Improved strategies should include outreach, mentorship, dedicated funding for women, the removal of age limits for women wishing to conduct scientific research or further their education, and awards and recognition for women who excel in scientific fields. 

Community-driven

Representatives of scientific organisations such as the African Physical Society, the Network of African Science Academies and the African Academy of Science all presented messages of support for ASFAP, and delegates from other regions, including Japan, China, India, Europe, the US and Latin America, all presented their regional strategies. The consensus is that strategic planning should be a bottom-up and community-driven process, even if this means it may take two to three years to produce a final report. 

The meeting was updated on the progress of a diverse and well-established range of working groups (WGs) on accelerators, astrophysics and cosmology; computing and the fourth industrial revolution (4IR); energy needs for Africa; instrumentation and detectors; light sources; materials physics; medical physics; nuclear physics; particle physics; and community engagement (CE), which comprises physics education (PE), knowledge transfer, entrepreneurship and stakeholder and governmental-agency engagement. The WGs must also maintain dynamic communications with each other as key topics often impact multiple working groups.

Marie Clémentine Nibamureke (University of Johannesburg) highlighted the importance of the CE WG’s vision “to improve science education and research in African countries in order to position Africa as a co-leader in science research globally”. Convener Jamal Mimouni (Mentouri University) stressed that for ASFAP to establish a successful CE programme, it is crucial to reflect on challenges in teaching and learning physics in Africa – and on why students may be reluctant to choose physics as their study field. Nibamureke explained that the CE WG is seeking to appoint liaison officers between all the ASFAP working groups. Sam Ramaila (University of Johannesburg), representing the PE WG, indicated four main points the group has identified as crucial for the transformation and empowering of physics practices in Africa: strengthening teacher training; developing 21st-century skills and competences; introducing the 4IR in physics teaching and learning; and attracting and retaining students in physics programmes. Ramaila identified problem-based learning, self-directed learning and technology-enhanced learning as new educational strategies that could make a difference in Africa if applied more widely. 

On the subject of youth engagement, Mounia Laassiri (Mohammed V University) led a young-person’s forum to discuss the major issues young African physicists face in their career progression: outreach, professional development and networking will be a central focus for this new forum going forwards, she explained, and the forum aims to encourage young physics researchers to take up leadership roles. So far, there are about 40 members of the young-people’s forum. Laassiri explained that the long-term vision, which goes beyond ASFAP, is to develop into an association of young physicists affiliated to the African Physical Society.

We are now soliciting inputs for the development of the African Strategy for Fundamental and Applied Physics

The ability to generate scientific innovation and technological knowledge, and translate this into new products, is vital for a society’s economic growth and development. The ASFAP is a key step towards unlocking Africa’s potential. We are now soliciting inputs for the development of the African Strategy for Fundamental and Applied Physics. Letters of interest may be submitted by individuals, research groups, professional societies, policymakers, education officials and research institutes on anything they think is an issue, needs to be improved, or is important for fundamental or applied physics education and research in Africa.

The post African physicists begin strategy process appeared first on CERN Courier.

]]>
Meeting report A town-hall meeting initiated a broad and community-driven discussion leading to a final strategy document in two to three years’ time. https://cerncourier.com/wp-content/uploads/2021/10/ASFAP-191.png
From sea quarks to sea shanties https://cerncourier.com/a/from-sea-quarks-to-sea-shanties/ Thu, 30 Sep 2021 12:36:02 +0000 https://preview-courier.web.cern.ch/?p=94899 As TikTok dethrones Facebook as the most popular app, the Courier explores how high-energy physicists are taking advantage of the social-media phenomenon.

The post From sea quarks to sea shanties appeared first on CERN Courier.

]]>
Social media apps

After being shown the app by her mother during lockdown, ATLAS physicist Clara Nellist downloaded TikTok and created her first two “shorts” in January this year. Jumping on a TikTok trend, the first saw her sing a CERN-themed sea shanty, while the second was an informal introduction to her page as she meandered around a park near the CERN site. Together, these two videos now total almost 600,000 views. Six months later, another ATLAS physicist, James Beacham, joined the platform, also with a quick introduction video explaining his work while using the ATLAS New Small Wheels as a backdrop. The video now has over 1.7 million views. With TikTok videos giving other social-media channels a run for their money, soon more of the high-energy physics community may want to join the rising media tide.

Surfing the wave

From blogs in the early 2000s through to Twitter and YouTube today, user-generated ‘Web 2.0’ platforms have allowed scientists to discuss their work and share their excitement directly. In the case of particle physics, researchers and their labs have never been within closer reach of the public, with a tour of the Large Hadron Collider always just a few clicks away. In 2005, as blogs were mushrooming, CERN and other players in particle physics joined forces to create Quantum Diaries. As the popularity of blogs began to dwindle towards the late noughties, CERN hopped on the next wave, joining YouTube in 2007 and Twitter in 2008 – at a time when public interest in the LHC was at its peak. CERN’s Twitter account currently boasts an impressive 2.5 million followers.

While joining later than some other laboratories, Fermilab caught onto a winning formula on YouTube, with physicist Don Lincoln fronting a long-standing educational series that began in 2011 and still runs today, attracting millions of views. Most major particle-physics laboratories also have a presence on Facebook and Instagram, with CERN joining the platforms in 2011 and 2014 respectively, not to mention LinkedIn, where CERN also possesses a significant following.

Particle physics laboratories are yet to launch themselves on TikTok. But that hasn’t stopped others from creating videos about particle physics, and not always “on message”. Type ‘CERN’ into the TikTok search bar and you are met with almost a 50/50 mix of informative videos and conspiracy theories – and even then, some of the former are merely debunking the latter. Is it time for institutions to get in on the trend?

Rising to the moment

Nellist, who has 123,000 followers on TikTok after less than nine months on the site, believes that it’s the human aspect and uniqueness of her content that has caused the quick success. “I started because I wanted to humanise science – people don’t realise that normal humans work at CERN. When I started there was nobody else at CERN doing it.” Beacham also uses CERN as a way of capturing attention, as illustrated in his weekly livestreams to his 230,000 followers. “If someone is scrolling and sees someone sitting in a DUNE cryostat discussing science, they’re going to stop and check it out,” he says. Beacham sees himself as a filmmaker, rather than a “TikTok-er”, and flexes his bachelor’s degree in film studies with longer form videos that take him across the CERN campus. “There is a desire on TikTok to know about CERN,” he says.

Clara Nellist

TikTok is different to other social-media platforms in several ways, one being format incompatibility. While a single piece of media such as a video can be shared across YouTube, Twitter, Instagram, Facebook, etc., this same media would not work on TikTok. Videos can also only be a maximum of three minutes, although the majority are shorter. This encourages concise productions, with a lot of information put across in a short period of time. Arguably the biggest difference is that TikTok insists that every video is in portrait mode – creating a feeling of authenticity and an intimate environment. YouTube and Instagram are now following suit with their portrait-mode ‘YouTube Shorts’ and ‘Instagram Reels’, respectively, with CERN already using the latter to create quick and informative clips that have attracted large audiences.

Nellist and Beacham, both engaging physicists in their own right whom viewers feel they can trust, create a perfect blend for TikTok. While there are some topics that will always generate more interest, they have a core audience that consistently returns for all videos. This gives a strong sense of editorial freedom, says Nellist. “While it is important to be aware of views, I get to make what I want.”

Changing demographics

When CERN joined Twitter in 2008, says James Gillies, head of CERN communications at the time, young people were a key factor as CERN tried to maximise its digital footprint. But things have changed since then. It is estimated that there are over 1 billion active TikTok users per month, and according to data firm Statista, in the US almost 50% of them are aged 30 and under, with other reports stating that up to 32.5% of users are between the ages of 10 and 19. Statista also estimates that only 24% of today’s Twitter users are under 25 – the so-called ‘Gen-Z’ who will fund and possibly work on future colliders.

If you want to lead the conversation, you have to be part of it

James Gillies

Another reason for CERN to enter the Twitter-verse – and one that also motivated the creation of Quantum Diaries – says Gillies, was to allow CERN to take its communication into its own hands. Although Nellist and Beacham are already encouraging this discussion on TikTok, they are not official CERN communication channels. Were they to decide to stop or talk about different topics, it would be hard to find any positive high-energy physics discussions on the most popular app on the planet.

Whilst Nellist believes CERN should be joining the platform, she urges that someone “who knows about it” should be dedicated to creating the content, as it is obvious to TikTok audiences when someone doesn’t understand it. Beacham states, “humans don’t respond to ideas as much as they respond to people.” Creators have their own unique styles and personalities that the viewers enjoy. So, if a large institution were to join, how would it create this personal environment?

James Beacham

The ATLAS experiment is currently the only particle-physics experiment to be found on the platform. The content is less face-to-face and more focused on showing the detector and how it works – similar in style to a CERN Instagram story. Despite being active for a similar amount of time as Nellist and Beacham, however, the account has significantly fewer followers. Nellist, who runs the ATLAS TikTok account, thinks there is room for both personal and institutional creators on the platform, though the content should be different. Beacham agrees, stating that it should show individual scientists expressing information in an off-the-cuff way. “There is a huge opportunity to do something great with it, there are thousands of things you could do. There are amazing visuals that CERN is capable of creating that can grab a viewer’s attention.”

Keeping up

There may be some who scoff at the idea of CERN joining a platform that has a public image of creating dance crazes rather than educational content. It is easy to forget that when first established, YouTube was seen as the place for funny cat videos, while Twitter was viewed as an unnecessary platform for people to tell others what they had for breakfast. Now these two platforms are the only reason some may know CERN exists, and unfortunately, not always for the right reasons.

Social media gives physicists and laboratories the opportunity to contact and influence audiences more directly than traditional channels. The challenge is to keep up with the pace of change. It’s clearly early days for a platform that only took off in 2018. Even NASA, which has the largest number of social-media followers of any scientific institution, is yet to launch an official TikTok channel. But, says Gillies, “If you want to lead the conversation, you have to be part of it.”

The post From sea quarks to sea shanties appeared first on CERN Courier.

]]>
Feature As TikTok dethrones Facebook as the most popular app, the Courier explores how high-energy physicists are taking advantage of the social-media phenomenon. https://cerncourier.com/wp-content/uploads/2021/09/27E90F40-7BBD-4166-BB51-A52C6968AB11.jpg
On your way to Cyclotron Road? https://cerncourier.com/a/on-your-way-to-cyclotron-road/ Mon, 27 Sep 2021 13:05:07 +0000 https://preview-courier.web.cern.ch/?p=94956 Berkeley Lab’s Cyclotron Road initiative is helping science innovators to translate their ideas into high-impact technologies.

The post On your way to Cyclotron Road? appeared first on CERN Courier.

]]>
Rachel Slaybaugh

Entrepreneurial scientists and engineers take note: the next round of applications to Cyclotron Road’s two-year fellowship programme will open in the fourth quarter, offering a funded path for early-stage start-ups in “hard tech” (i.e. physical hardware rather than software) to fast-track development of their applied research innovations. Now in its sixth year, Cyclotron Road is a division of the US Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley, California) and is run in partnership with non-profit Activate, a specialist provider of entrepreneurship education and training. 

Successful applicants who navigate the rigorous merit-review process will receive $100,000 of research support for their project as well as a stipend, health insurance and access to Berkeley Lab’s world-class research facilities and scientific expertise. CERN Courier gets the elevator pitch from Rachel Slaybaugh, Cyclotron Road division director. 

Summarise your objectives for Cyclotron Road

Our mission is to empower science innovators to develop their ideas from concept to first product, positioning them for broad societal impact in the long term. We create the space for fellows to commercialise their ideas by giving them direct access to the world-leading scientists and facilities at Berkeley Lab. Crucially, we reinforce that support with a parallel curriculum of specialist entrepreneurship education from our programme partner Activate. 

What are the benefits of embedding the fellowship programme at Berkeley Lab?

Cyclotron Road is not a one-size-fits-all programme, so the benefits vary from fellow to fellow. Some of the fellows and their teams only loosely make use of Berkeley Lab services, while others will embed in a staff scientist’s lab and engage in close collaborative R&D work. The value proposition is that our fellows have access to Berkeley Lab and its resources but can choose what model works best for them. It seems to work: since 2015, Cyclotron Road fellows have collaborated with more than 70 Berkeley Lab scientists, while the organisations they’ve founded have collectively raised more than $360 million in follow-on funding. 

What do you look for in prospective Cyclotron Road fellows? 

We want smart, talented individuals with a passion to develop and grow their own early-stage hard-tech venture. Adaptability is key: Cyclotron Road fellows need to have the technical and intellectual capability to pivot their business plan if needed. As such, our fellows are collaborative team players by default, coachable and hungry to learn. They don’t need to take all the advice they’re given in the programme, but they do need to be open-minded and willing to listen to a range of viewpoints regarding technology innovation and commercial positioning. 

Explain the role of Activate in the professional development of fellows 

Activate is an essential partner in the Cyclotron Road mission. Its team handles the parallel programme of entrepreneurship education, including an onboarding bootcamp, weekly mentoring and quarterly “deep-dives” on all aspects of technology and business development. The goal is to turn today’s talented scientists and engineers into tomorrow’s technology CEOs and CTOs. Activate also has staff to curate strategic relationships for our fellows, helping start-ups connect with investors, industry partners and equipment suppliers. That’s reinforced by the opportunity to link up with the amazing companies in Cyclotron Road’s alumni network.

How does Cyclotron Road benefit Berkeley Lab?

There are several upsides. We’re bringing entrepreneurship and commercial thinking into the lab, helping Berkeley scientists build bridges with these new technology companies – and the innovators driving them. That has paybacks in terms of future funding proposals, giving our researchers a better understanding of how to position their research from an applications perspective. The knowledge transfer between Cyclotron Road fellows and Berkeley Lab scientists is very much a two-way process: while fellows progress their commercial ideas, they are often sparking new lines of enquiry among their collaborators here at Berkeley Lab. 

How are you broadening participation?

Fellows receive a yearly living stipend of $80,000 to $110,000, health insurance, a relocation stipend and a travel allowance – all of which means they’re able to focus full-time on their R&D. Our priority is to engage a diverse community of researchers – not just those individuals who already have a high net worth or access to a friends-and-family funding round. We’re building links with universities and labs outside the traditional technology hot-spots like Silicon Valley, Boston and Seattle, as well as engaging institutions that serve under-represented minorities. Worth adding that Cyclotron Road welcomes international applicants in a position to relocate to California for two years.  

Further information on the Cyclotron Road fellowship programme: https://cyclotronroad.lbl.gov/.

The post On your way to Cyclotron Road? appeared first on CERN Courier.

]]>
Careers Berkeley Lab’s Cyclotron Road initiative is helping science innovators to translate their ideas into high-impact technologies. https://cerncourier.com/wp-content/uploads/2021/09/CCUSASupp21_CYCLOTRON_feature.jpg
Forging the future of AI https://cerncourier.com/a/forging-the-future-of-ai/ Tue, 31 Aug 2021 22:00:52 +0000 https://preview-courier.web.cern.ch/?p=93653 Leaders in artificial-intelligence research spoke to the Courier about what's next for the field, and how developments may impact fundamental science.

The post Forging the future of AI appeared first on CERN Courier.

]]>
Jennifer Ngadiuba speaks to fellow Sparks! participants Michael Kagan and Bruno Giussani

Field lines arc through the air. By chance, a cosmic ray knocks an electron off a molecule. It hurtles away, crashing into other molecules and multiplying the effect. The temperature rises, liberating a new supply of electrons. A spark lights up the dark.

Vivienne Ming

The absence of causal inference in practical machine learning touches on every aspect of AI research, application, ethics and policy

Vivienne Ming is a theoretical neuroscientist and a serial AI entrepreneur

This is an excellent metaphor for the Sparks! Serendipity Forum – a new annual event at CERN designed to encourage interdisciplinary collaborations between experts on key scientific issues of the day. The first edition, which will take place from 17 to 18 September, will focus on artificial intelligence (AI). Fifty leading thinkers will explore the future of AI in topical groups, with the outcomes of their exchanges to be written up and published in the journal Machine Learning: Science and Technology. The forum reflects the growing use of machine-learning techniques in particle physics and emphasises the importance that CERN and the wider community places on collaborating with diverse technological sectors. Such interactions are essential to the long-term success of the field. 

Anima Anandkumar

AI is orders of magnitude faster than traditional numerical simulations. On the other side of the coin, simulations are being used to train AI in domains such as robotics where real data is very scarce

Anima Anandkumar is Bren professor at Caltech and director of machine learning research at NVIDIA

The likelihood of sparks flying depends on the weather. To take the temperature, CERN Courier spoke to a sample of the Sparks! participants to preview themes for the September event.

Genevieve Bell

2020 revealed unexpectedly fragile technological and socio-cultural infrastructures. How we locate our conversations and research about AI in those contexts feels as important as the research itself

Genevieve Bell is director of the School of Cybernetics at the Australian National University and vice president at Intel

Back to the future

In the 1980s, AI research was dominated by code that emulated logical reasoning. In the 1990s and 2000s, attention turned to softening its strong syllogisms into probabilistic reasoning. Huge strides forward in the past decade have rejected logical reasoning, however, instead capitalising on computing power by letting layer upon layer of artificial neurons discern the relationships inherent in vast data sets. Such “deep learning” has been transformative, fuelling innumerable innovations, from self-driving cars to searches for exotica at the LHC (see Hunting anomalies with an AI trigger). But many Sparks! participants think that the time has come to reintegrate causal logic into AI.

Stuart Russell

Geneva is the home not only of CERN but also of the UN negotiations on lethal autonomous weapons. The major powers must put the evil genie back in the bottle before it’s too late

Stuart Russell is professor of computer science at the University of California, Berkeley and coauthor of the seminal text on AI

“A purely predictive system, such as the current machine learning that we have, that lacks a notion of causality, seems to be very severely limited in its ability to simulate the way that people think,” says Nobel-prize-winning cognitive psychologist Daniel Kahneman. “Current AI is built to solve one specific task, which usually does not include reasoning about that task,” agrees AAAI president-elect Francesca Rossi. “Leveraging what we know about how people reason and behave can help build more robust, adaptable and generalisable AI – and also AI that can support humans in making better decisions.”

Tomaso Poggio

AI is converging on forms of intelligence that are useful but very likely not human-like

Tomaso Poggio is a cofounder of computational neuroscience and Eugene McDermott professor at MIT

Google’s Nyalleng Moorosi identifies another weakness of deep-learning models that are trained with imperfect data: whether AI is deciding who deserves a loan or whether an event resembles physics beyond the Standard Model, its decisions are only as good as its training. “What we call the ground truth is actually a system that is full of errors,” she says.

Nyalleng Moorosi

We always had privacy violation, we had people being blamed falsely for crimes they didn’t do, we had mis-diagnostics, we also had false news, but what AI has done is amplify all this, and make it bigger

Nyalleng Moorosi is a research software engineer at Google and a founding member of Deep Learning Indaba

Furthermore, says influential computational neuroscientist Tomaso Poggio, we don’t yet understand the statistical behaviour of deep-learning algorithms with mathematical precision. “There is a risk in trying to understand things like particle physics using tools we don’t really understand,” he explains, also citing attempts to use artificial neural networks to model organic neural networks. “It seems a very ironic situation, and something that is not very scientific.”

Daniel Kahneman

This idea of partnership, that worries me. It looks to me like a very unstable equilibrium. If the AI is good enough to help the person, then pretty soon it will not need the person

Daniel Kahneman is a renowned cognitive psychologist and a winner of the 2002 Nobel Prize in Economics

Stuart Russell, one of the world’s most respected voices on AI, echoes Poggio’s concerns, and also calls for a greater focus on controlled experimentation in AI research itself. “Instead of trying to compete between Deep Mind and OpenAI on who can do the biggest demo, let’s try to answer scientific questions,” he says. “Let’s work the way scientists work.”

Good or bad?

Though most Sparks! participants firmly believe that AI benefits humanity, ethical concerns are uppermost in their minds. From social-media algorithms to autonomous weapons, current AI overwhelmingly lacks compassion and moral reasoning, is inflexible and unaware of its fallibility, and cannot explain its decisions. Fairness, inclusivity, accountability, social cohesion, security and international law are all impacted, deepening links between the ethical responsibilities of individuals, multinational corporations and governments. “This is where I appeal to the human-rights framework,” says philosopher S Matthew Liao. “There’s a basic minimum that we need to make sure everyone has access to. If we start from there, a lot of these problems become more tractable.”

S Matthew Liao

We need to understand ethical principles, rather than just list them, because then there’s a worry that we’re just doing ethics washing – they sound good but they don’t have any bite

S Matthew Liao is a philosopher and the director of the Center for Bioethics at New York University

Far-term ethical considerations will be even more profound if AI develops human-level intelligence. When Sparks! participants were invited to put a confidence interval on when they expect human-level AI to emerge, answers ranged from [2050, 2100] at 90% confidence to [2040, ∞] at 99% confidence. Other participants said simply “in 100 years” or noted that this is “delightfully the wrong question” as it’s too human-centric. But by any estimation, talking about AI cannot wait.

Francesca Rossi

Only a multi-stakeholder and multi-disciplinary approach can build an ecosystem of trust around AI. Education, cultural change, diversity and governance are equally as important as making AI explainable, robust and transparent

Francesca Rossi co-leads the World Economic Forum Council on AI for humanity and is IBM AI ethics global leader and the president-elect of AAAI

“With Sparks!, we plan to give a nudge to serendipity in interdisciplinary science by inviting experts from a range of fields to share their knowledge, their visions and their concerns for an area of common interest, first with each other, and then with the public,” says Joachim Mnich, CERN’s director for research and computing. “For the first edition of Sparks!, we’ve chosen the theme of AI, which is as important in particle physics as it is in society at large. Sparks! is a unique experiment in interdisciplinarity, which I hope will inspire continued innovative uses of AI in high-energy physics. I invite the whole community to get involved in the public event on 18 September.”

The post Forging the future of AI appeared first on CERN Courier.

]]>
Feature Leaders in artificial-intelligence research spoke to the Courier about what's next for the field, and how developments may impact fundamental science. https://cerncourier.com/wp-content/uploads/2021/08/CCSepOct21_SPARKS_frontis.jpg
Hunting anomalies with an AI trigger https://cerncourier.com/a/hunting-anomalies-with-an-ai-trigger/ Tue, 31 Aug 2021 21:55:21 +0000 https://preview-courier.web.cern.ch/?p=93617 Jennifer Ngadiuba and Maurizio Pierini describe how ‘unsupervised’ machine learning could keep watch for signs of new physics at the LHC that have not yet been dreamt up by physicists.

The post Hunting anomalies with an AI trigger appeared first on CERN Courier.

]]>
In the 1970s, the robust mathematical framework of the Standard Model (SM) replaced data observation as the dominant starting point for scientific inquiry in particle physics. Decades-long physics programmes were put together based on its predictions. Physicists built complex and highly successful experiments at particle colliders, culminating in the discovery of the Higgs boson at the LHC in 2012.

Along this journey, particle physicists adapted their methods to deal with ever growing data volumes and rates. To handle the large amount of data generated in collisions, they had to optimise real-time selection algorithms, or triggers. The field became an early adopter of artificial intelligence (AI) techniques, especially those falling under the umbrella of “supervised” machine learning. Verifying the SM’s predictions or exposing its shortcomings became the main goal of particle physics. But with the SM now apparently complete, and supervised studies incrementally excluding favoured models of new physics, “unsupervised” learning has the potential to lead the field into the uncharted waters beyond the SM.

Blind faith

To maximise discovery potential while minimising the risk of false discovery claims, physicists design rigorous data-analysis protocols to minimise the risk of human bias. Data analysis at the LHC is blind: physicists prevent themselves from combing through data in search of surprises. Simulations and “control regions” adjacent to the data of interest are instead used to design a measurement. When the solidity of the procedure is demonstrated, an internal review process gives the analysts the green light to look at the result on the real data and produce the experimental result. 

A blind analysis is by necessity a supervised approach. The hypothesis being tested is specified upfront and tested against the null hypothesis – for example, the existence of the Higgs boson in a particular mass range versus its absence. Once spelled out, the hypothesis determines other aspects of the experimental process: how to select the data, how to separate signals from background and how to interpret the result. The analysis is supervised in the sense that humans identify what the possible signals and backgrounds are, and label examples of both for the algorithm.

Artist’s impression of an FPGA

The data flow at the LHC makes the need to specify a signal hypothesis upfront even more compelling. The LHC produces 40 million collision events every second. Each overlaps with 34 others from the same bunch crossing, on average, like many pictures superimposed on top of each other. However, the computing infrastructure of a typical experiment is designed to sustain a data flow of just 1000 events per second. To avoid being overwhelmed by the data pressure, it’s necessary to select these 1000 out of every 40 million events in a short time. But how do you decide what’s interesting? 

This is where the supervised nature of data analysis at the LHC comes into play. A set of selection rules – the trigger algorithms – are designed so that the kind of collisions predicted by the signal hypotheses being studied are present among the 1000 (see “Big data” figure). As long as you know what to look for, this strategy optimises your resources. The discovery in 2012 of the Higgs boson demonstrates this: a mission considered impossible in the 1980s was accomplished with less data and less time than anticipated by the most optimistic guesses when the LHC was being designed. Machine learning played a crucial role in this.

Machine learning

Machine learning (ML) is a branch of computer science that deals with algorithms capable of accomplishing a task without being explicitly programmed to do so. Unlike traditional algorithms, which are sets of pre-determined operations, an ML algorithm is not programmed. It is trained on data, so that it can adjust itself to maximise its chances of success, as defined by a quantitative figure of merit. 

To explain further, let’s use the example of a dataset of images of cats and dogs. We’ll label the cats as “0” and the dogs as “1”, and represent the images as a two-dimensional array of coloured pixels, each with a fraction of red, green and blue. Each dog or cat is now a stack of three two-dimensional arrays of numbers between 0 and 1 – essentially just the animal pictured in red, green and blue light. We would like to have a mathematical function converting this stack of arrays into a score ranging from 0 to 1. The larger the score, the higher the probability that the image is a dog. The smaller the score, the higher the probability that the image is a cat. An ML algorithm is a function of this kind, whose parameters are fixed by looking at a given dataset for which the correct labels are known. Through a training process, the algorithm is tuned to minimise the number of wrong answers by comparing its prediction to the labels.
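
To make this training process concrete, the sketch below builds the kind of image-scoring function described above. It is purely illustrative: the framework (Keras/TensorFlow, our choice rather than anything named in this article), the toy random “images”, the network size and the training settings are all invented.

```python
# Illustrative only: a minimal cat/dog-style binary classifier, mirroring the
# description above (RGB images in, score between 0 and 1 out).
import numpy as np
import tensorflow as tf

# Toy stand-in data: 200 "images" of 32x32 pixels with 3 colour channels,
# values in [0, 1], labelled 0 (cat) or 1 (dog).
x = np.random.rand(200, 32, 32, 3).astype("float32")
y = np.random.randint(0, 2, size=200)

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(32, 32, 3)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # score in [0, 1]
])

# Training minimises the number of wrong answers via a differentiable proxy
# (binary cross-entropy), i.e. the tuning-by-comparison described in the text.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=3, batch_size=32, verbose=0)

scores = model.predict(x[:5])  # larger score => more "dog-like"
```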

Data flow from the ATLAS and CMS experiments

Now replace the dogs with photons from the decay of a Higgs boson, and the cats with detector noise that is mistaken to be photons. Repeat the procedure, and you will obtain a photon-identification algorithm that you can use on LHC data to improve the search for Higgs bosons. This is what happened in the CMS experiment back in 2012. Thanks to the use of a special kind of ML algorithm called boosted decision trees, it was possible to maximise the accuracy of the Higgs-boson search, exploiting the rich information provided by the experiment’s electromagnetic calorimeter. The ATLAS collaboration developed a similar procedure to identify Higgs bosons decaying into a pair of tau leptons.

Photon and tau-lepton classifiers are both examples of supervised learning, and the success of the discovery of the Higgs boson was also a success story for applied ML. So far so good. But what about searching for new physics?

Typical examples of new physics such as supersymmetry, extra dimensions and the underlying structure for the Higgs boson have been extensively investigated at the LHC, with no evidence for them found in data. This has told us a great deal about what the particles predicted by these scenarios cannot look like, but what if the signal hypotheses are simply wrong, and we’re not looking for the right thing? This situation calls for “unsupervised” learning, where humans are not required to label data. As with supervised learning, this idea doesn’t originate in physics. Marketing teams use clustering algorithms based on it to identify customer segments. Banks use it to detect credit-card fraud by looking for anomalous access patterns in customers’ accounts. Similar anomaly detection techniques could be used at the LHC to single out rare events, possibly originating from new, previously undreamt of, mechanisms.

Unsupervised learning

Anomaly detection is a possible strategy for keeping watch for new physics without having to specify an exact signal. A kind of unsupervised ML, it involves ranking an unlabelled dataset from the most typical to the most atypical, using a ranking metric learned during training. One of the advantages of this approach is that the algorithm can be trained on data recorded by the experiment rather than simulations. This could, for example, be a control sample that we know to be dominated by SM processes: the algorithm will learn how to reconstruct these events “exactly” – and conversely how to rank unknown processes as atypical. As a proof of principle, this strategy has already been applied to re-discover the top quark using the first open-data release by the CMS collaboration.
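
A minimal sketch of this idea is given below, assuming an autoencoder (in Keras) trained on a background-dominated control sample; events with a large reconstruction error are ranked as atypical. The feature count, architecture and toy data are invented for illustration and are not those of any experiment's model.

```python
# Illustrative anomaly ranking: train an autoencoder on a control sample that
# is dominated by Standard Model processes, then use the per-event
# reconstruction error as the ranking metric described in the text.
import numpy as np
import tensorflow as tf

n_features = 20
control_sample = np.random.normal(size=(5000, n_features)).astype("float32")

autoencoder = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(n_features,)),
    tf.keras.layers.Dense(3, activation="relu"),   # bottleneck
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(n_features),
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(control_sample, control_sample, epochs=5, verbose=0)

def anomaly_score(events):
    """Mean squared reconstruction error per event: the ranking metric."""
    reco = autoencoder.predict(events, verbose=0)
    return np.mean((events - reco) ** 2, axis=1)

# Events drawn from an unfamiliar process reconstruct poorly and score high.
print(anomaly_score(control_sample[:5]))
```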

This approach could be used in the online data processing at the LHC and applied to the full 40 million collision events produced every second. Clustering techniques commonly used in observational astronomy could be used to highlight the recurrence of special kinds of events.

Unsupervised detection of leptoquark and neutral-scalar-boson decays

In case a new kind of process happens in an LHC collision, but is discarded by the trigger algorithms serving the traditional physics programme, an anomaly-detection algorithm could save the relevant events, storing them in a special stream of anomalous events (see “Anomaly hunting” figure). The ultimate goal of this approach would be the creation of an “anomaly catalogue” of event topologies for further studies, which could inspire novel ideas for new-physics scenarios to test using more traditional techniques. With an anomaly catalogue, we could return to the first stage of the scientific method, and recover a data-driven alternative approach to the theory-driven investigation that we have come to rely on. 

This idea comes with severe technological challenges. To apply this technique to all collision events, we would need to integrate the algorithm, typically a special kind of neural network called an autoencoder, into the very first stage of the online data selection, the level-one (L1) trigger. The L1 trigger consists of logic algorithms integrated onto custom electronic boards based on field programmable gate arrays (FPGAs) – a highly parallelisable chip that serves as a programmable emulator of electronic circuits. Any L1 trigger algorithm has to run within the order of one microsecond, and take only a fraction of the available computing resources. To run in the L1 trigger system, an anomaly detection network needs to be converted into an electronic circuit that would fulfill these constraints. This goal can be met using the “hls4ml” (high-level synthesis for ML) library – a tool designed by an international collaboration of LHC physicists that exploits automatic workflows. 

Computer-science collaboration

Recently, we collaborated with a team of researchers from Google to integrate the hls4ml library into Google’s “QKeras” – a tool for developing accurate ML models on FPGAs with a limited computing footprint. Thanks to this partnership, we developed a workflow that can design a ML model in concert with its final implementation on the experimental hardware. The resulting QKeras+hls4ml bundle is designed to allow LHC physicists to deploy anomaly-detection algorithms in the L1 trigger system. This approach could practically be deployed in L1 trigger systems before the end of LHC Run 3 – a powerful complement to the anomaly-detection techniques that are already being considered for “offline” data analysis on the traditionally triggered samples.
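
The snippet below sketches what such a workflow can look like, assuming the public QKeras and hls4ml interfaces: a small quantised network is defined with QKeras and converted by hls4ml into a high-level-synthesis project that can be turned into FPGA firmware. The bit widths, layer sizes, FPGA part number and output directory are illustrative placeholders rather than the settings of an actual level-one trigger design.

```python
# A sketch of the QKeras + hls4ml workflow: define a small quantised
# autoencoder-like network and convert it to an HLS project for an FPGA.
# (Weights are left untrained here; a real design would train the model first.)
import tensorflow as tf
import hls4ml
from qkeras import QDense, QActivation, quantized_bits

n_features = 20
model = tf.keras.Sequential([
    QDense(8, input_shape=(n_features,),
           kernel_quantizer=quantized_bits(6, 0, alpha=1),
           bias_quantizer=quantized_bits(6, 0, alpha=1)),
    QActivation("quantized_relu(6)"),
    QDense(n_features,
           kernel_quantizer=quantized_bits(6, 0, alpha=1),
           bias_quantizer=quantized_bits(6, 0, alpha=1)),
])

# hls4ml translates the network into high-level-synthesis code that can be
# compiled into an electronic circuit for the trigger FPGAs.
config = hls4ml.utils.config_from_keras_model(model, granularity="model")
hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    output_dir="hls_autoencoder_prj",   # hypothetical project directory
    part="xcvu9p-flga2104-2-e",         # example FPGA part, an assumption
)
hls_model.compile()  # builds a bit-accurate software emulation of the firmware
```

In a real design, the generated project would then be synthesised and checked against the microsecond latency and resource budgets described above.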

AI techniques could help the field break beyond the limits of human creativity in theory building

If this strategy is endorsed by the experimental collaborations, it could create a public dataset of anomalous data that could be investigated during the third LHC long shutdown, from 2025 to 2027. By studying those events, phenomenologists and theoretical physicists could formulate creative hypotheses about new-physics scenarios to test, potentially opening up new search directions for the High-Luminosity LHC.

Blind analyses minimise human bias if you know what to look for, but risk yielding diminishing returns when the theoretical picture is uncertain, as is the case in particle physics after the first 10 years of LHC physics. Unsupervised AI techniques such as anomaly detection could help the field break beyond the limits of human creativity in theory building. In the big-data environment of the LHC, they offer a powerful means to move the field back to data-driven discovery, after 50 years of theory-driven progress. To maximise their impact, they should be applied to every collision produced at the LHC. For that reason, we argue that anomaly-detection algorithms should be deployed in the L1 triggers of the LHC experiments, despite the technological challenges that must be overcome to make that happen.

The post Hunting anomalies with an AI trigger appeared first on CERN Courier.

]]>
Feature Jennifer Ngadiuba and Maurizio Pierini describe how ‘unsupervised’ machine learning could keep watch for signs of new physics at the LHC that have not yet been dreamt up by physicists. https://cerncourier.com/wp-content/uploads/2021/08/CCSepOct21_AI_frontis.jpg
What’s in the box? https://cerncourier.com/a/whats-in-the-box/ Tue, 31 Aug 2021 21:50:42 +0000 https://preview-courier.web.cern.ch/?p=93625 The LHC Olympics and Dark Machines data challenges stimulated innovation in the use of machine learning to search for new physics, write Benjamin Nachman and Melissa van Beekveld.

The post What’s in the box? appeared first on CERN Courier.

]]>
A neural network probing a black box of complex final states

The need for innovation in machine learning (ML) transcends any single experimental collaboration, and requires more in-depth work than can take place at a workshop. Data challenges, wherein simulated “black box” datasets are made public, and contestants design algorithms to analyse them, have become essential tools to spark interdisciplinary collaboration and innovation. Two have recently concluded. In both cases, contestants were challenged to use ML to figure out “what’s in the box?”

LHC Olympics

The LHC Olympics (LHCO) data challenge was launched in autumn 2019, and the results were presented at the ML4Jets and Anomaly Detection workshops in spring and summer 2020. A final report summarising the challenge was posted to arXiv earlier this year, written by around 50 authors from a variety of backgrounds in theory, the ATLAS and CMS experiments, and beyond. The name of this community effort was inspired by the first LHC Olympics that took place more than a decade ago, before the start of the LHC. In those olympics, researchers were worried about being able to categorise all of the new particles that would be discovered when the machine turned on. Since then, we have learned a great deal about nature at TeV energy scales, with no evidence yet for new particles or forces of nature. The latest LHC Olympics focused on a different challenge – being able to find new physics in the first place. We now know that new physics must be rare and not exactly like what we expected.

In order to prepare for rare and unexpected new physics, organisers Gregor Kasieczka (University of Hamburg), Benjamin Nachman (Lawrence Berkeley National Laboratory) and David Shih (Rutgers University) provided a set of black-box datasets composed mostly of Standard Model (SM) background events. Contestants were charged with identifying any anomalous events that would be a sign of new physics. These datasets focused on resonant anomaly detection, whereby the anomaly is assumed to be localised – a “bump hunt”, in effect. This is a generic feature of new physics produced from massive new particles: the reconstructed parent mass is the resonant feature. By assuming that the signal is localised, one can use regions away from the signal to estimate the background. The LHCO provided one R&D dataset with labels and three black boxes to play with: one with an anomaly decaying into two two-pronged resonances, one without an anomaly, and one with an anomaly featuring two different decay modes (a dijet decay X → qq and a trijet decay X → gY, Y → qq).  There are currently no dedicated searches for these signals in LHC data.
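
The logic of such a resonant search can be illustrated with a toy “bump hunt” in Python, shown below. The spectrum, window edges and the locally flat sideband assumption are invented for illustration; real analyses fit a smoothly falling background function to the sidebands.

```python
# Toy bump hunt: estimate the background under a putative signal window from
# the sidebands on either side, assuming the anomaly is localised in mass.
import numpy as np

rng = np.random.default_rng(0)
mass = rng.exponential(scale=500.0, size=100_000) + 1000.0   # smooth background
mass = np.append(mass, rng.normal(3500.0, 50.0, size=300))   # hidden resonance

lo, hi = 3400.0, 3600.0                                      # signal window
sb = ((mass > lo - 200) & (mass < lo)) | ((mass > hi) & (mass < hi + 200))

# Estimate the expected background in the window from the sideband density,
# assuming a locally flat spectrum (real analyses fit a falling function).
sideband_density = np.sum(sb) / 400.0        # events per GeV in the sidebands
expected_bkg = sideband_density * (hi - lo)
observed = np.sum((mass > lo) & (mass < hi))
significance = (observed - expected_bkg) / np.sqrt(expected_bkg)
print(f"observed {observed}, expected {expected_bkg:.0f}, ~{significance:.1f} sigma")
```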

No labels

About 20 algorithms were deployed on the LHCO datasets, including supervised learning, unsupervised learning, weakly supervised learning and semi-supervised learning. Supervised learning is the most widely used method across science and industry, whereby each training example has a label: “background” or “signal”. For this challenge, the data do not have labels as we do not know exactly what we are looking for, and so strategies trained with labels from a different dataset often did not work well. By contrast, unsupervised learning generally tries to identify events that are rarely or never produced by the background; weakly supervised methods use some context from data to provide noisy labels; and semi-supervised methods use some simulation information in order to have a partial set of labels. Each method has its strengths and weaknesses, and multiple approaches are usually needed to achieve a broad coverage of possible signals.

The Dark Machines data challenge focused on developing algorithms broadly sensitive to non-resonant anomalies

The best performance on the first black box in the LHCO challenge, as measured by finding and correctly characterising the anomalous signals, was by a team of cosmologists at Berkeley (George Stein, Uros Seljak and Biwei Dai) who compared the phase-space density between a sliding signal region and sidebands (see “Olympian algorithm” figure). Overall, the algorithms did well on the R&D dataset, and some also did well on the first black box, with methods that made use of likelihood ratios proving particularly effective. But no method was able to detect the anomalies in the third black box, and many teams reported a false signal for the second black box. This “placebo effect” illustrates the need for ML approaches to have an accurate estimation of the background and not just a procedure for identifying signals. The challenge for the third black box, however, required algorithms to identify multiple clusters of anomalous events rather than a single cluster. Future innovation is needed in this department.

Dark Machines

A second data challenge was launched in June 2020 within the Dark Machines initiative. Dark Machines is a research collective of physicists and data scientists who apply ML techniques to understand the nature of dark matter – as its nature is unknown, it is critical to search broadly for its anomalous signatures. The challenge was organised by Sascha Caron (Radboud University), Caterina Doglioni (University of Lund) and Maurizio Pierini (CERN), with notable contributions from Bryan Ostdiek (Harvard University) in the development of a common software infrastructure, and Melissa van Beekveld (University of Oxford) for dataset generation. In total, 39 participants arranged in 13 teams explored various unsupervised techniques, with each team submitting multiple algorithms.

The anomaly score

By contrast with LHCO, the Dark Machines data challenge focused on developing algorithms broadly sensitive to non-resonant anomalies. Good examples of non-resonant new physics include many supersymmetric models and models of dark matter – anything where “invisible” particles don’t interact with the detector. In such a situation, resonant peaks become excesses in the tails of the missing-transverse-energy distribution. Two datasets were provided: R&D datasets including a concoction of SM processes and many signal samples for contestants to develop their approaches on; and a black-box dataset mixing SM events with events from unspecified signal processes. The challenge has now formally concluded, and its outcome was posted on arXiv in May, but the black-box has not been opened to allow the community to continue to test ideas on it.

A wide variety of unsupervised methods have been deployed so far. The algorithms use diverse representations of the collider events (for example, lists of particle four-momenta, or physics quantities computed from them), and both implicit and explicit approaches for estimating the probability density of the background (for example, autoencoders and “normalising flows”). While no single method universally achieved the highest sensitivity to new-physics events, methods that mapped the background to a fixed point and looked for events that were not described well by this mapping generally did better than techniques that had a so-called dynamic embedding. A key question exposed by this challenge that will inspire future innovation is how best to tune and combine unsupervised machine-learning algorithms in a way that is model independent with respect to the new physics describing the signal.
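
As a simple stand-in for the explicit density-estimation approaches mentioned above, the sketch below models the background density with a kernel density estimate (used here instead of a normalising flow purely for brevity) and flags the lowest-density events as anomaly candidates. All numbers and shapes are invented.

```python
# Explicit density estimation for non-resonant anomaly detection: model the
# probability density of background-like events, then flag candidates that
# fall in low-density regions of the feature space.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(1)
background = rng.normal(size=(10_000, 4))              # toy SM-like events
candidates = np.vstack([rng.normal(size=(95, 4)),      # mostly background...
                        rng.normal(5.0, 1.0, (5, 4))]) # ...plus a few outliers

kde = KernelDensity(bandwidth=0.5).fit(background)
log_density = kde.score_samples(candidates)

# The events with the lowest estimated background density are the anomalies.
anomalous = np.argsort(log_density)[:5]
print("most anomalous candidate indices:", anomalous)
```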

The enthusiastic response to the LHCO and Dark Machines data challenges highlights the important future role of unsupervised ML at the LHC and elsewhere in fundamental physics. So far, just one analysis has been published – a dijet-resonance search by the ATLAS collaboration using weakly-supervised ML – but many more are underway, and these techniques are even being considered for use in the level-one triggers of LHC experiments (see Hunting anomalies with an AI trigger). And as the detection of outliers also has a large number of real-world applications, from fraud detection to industrial maintenance, fruitful cross-talk between fundamental research and industry is possible. 

The LHCO and Dark Machines data challenges are a stepping stone to an exciting experimental programme that is just beginning. 

The post What’s in the box? appeared first on CERN Courier.

]]>
Feature The LHC Olympics and Dark Machines data challenges stimulated innovation in the use of machine learning to search for new physics, write Benjamin Nachman and Melissa van Beekveld. https://cerncourier.com/wp-content/uploads/2021/08/Data-Challenges.jpg
Sustainable high-energy physics https://cerncourier.com/a/sustainable-high-energy-physics/ Sun, 18 Jul 2021 16:47:23 +0000 https://preview-courier.web.cern.ch/?p=93410 The workshop attracted more than 300 participants from 45 countries to discuss how the lessons learned in the past two years might help HEP transition to a more sustainable future.

The post Sustainable high-energy physics appeared first on CERN Courier.

]]>
SustHEP 2021

COVID-19 put the community on a steep learning curve regarding new forms of online communication and collaboration. Before the pandemic, a typical high-energy physics (HEP) researcher was expected to cross the world several times a year for conferences, collaboration meetings and detector shifts, at the cost of thousands of dollars and a sizeable carbon footprint. The online workshop Sustainable HEP — a new initiative this year — attracted more than 300 participants from 45 countries from 28 to 30 June to discuss how the lessons learned in the past two years might help HEP transition to a more sustainable future.

The first day of the workshop focused on how new forms of online interaction could change our professional travel culture. Shaun Hotchkiss (University of Auckland) stressed in a session dedicated to best-practice examples that the purpose of online meetings should not simply be to emulate traditional 20th-century in-person conferences and collaboration meetings. Instead, the community needs to rethink what virtual scientific exchange could look like in the 21st century. This might, for instance, include replacing traditional live presentations by pre-recorded talks that are pre-watched by the audience at their own convenience, leaving more precious conference time for in-depth discussions and interactions among the participants.

Social justice

The second day highlighted social-justice issues, and the potential for greater inclusivity using online formats. Alice Gathoni (British Institute in Eastern Africa) powerfully described the true meaning of online meetings to her: everyone wants to belong. It was only during the first online meetings during the pandemic that she truly felt a real sense of belonging to the global scientific community.

The third day was dedicated to existing sustainability initiatives and new technologies. Mike Seidel (PSI) presented studies on energy-recovery linacs and discussed energy-management concepts for future colliders, including daily “standby modes”. Other options include beam dynamics explicitly designed to maximise the ratio of luminosity to power, more efficient radio-frequency cavities, the use of permanent magnets, and high-temperature superconductor cables and cavities. He concluded his talk by asking thought-provoking questions such as whether the HEP community should engage with its international networks to help establish sustainable energy-supply solutions.

The workshop ended by drafting a closing statement that calls upon the HEP community to align its activities with the Paris Climate Agreement and the goal of limiting global warming to 1.5 degrees. This statement can be signed by members of the HEP community until 20 August.

The post Sustainable high-energy physics appeared first on CERN Courier.

]]>
Meeting report The workshop attracted more than 300 participants from 45 countries to discuss how the lessons learned in the past two years might help HEP transition to a more sustainable future. https://cerncourier.com/wp-content/uploads/2021/07/SustHEP.jpg
ESCAPE towards open virtual research https://cerncourier.com/a/escape-towards-open-virtual-research/ Fri, 21 May 2021 08:30:02 +0000 https://preview-courier.web.cern.ch/?p=92365 Launched in February 2019, the European Union project ESCAPE is making strides towards an open scientific analysis infrastructure for particle physics and astronomy.

The post ESCAPE towards open virtual research appeared first on CERN Courier.

]]>
Open science has become a pillar of the policies of national and international research-funding bodies. The ambition is to increase scientific value by sharing data and transferring knowledge within and across scientific communities. To this end, in 2015 the European Union (EU) launched the European Open Science Cloud (EOSC) to support research based on open-data science.

To help European research infrastructures adapt to this future, in 2019 the domains of astrophysics, nuclear and particle physics joined efforts to create an open scientific analysis infrastructure to support the principles of data “FAIRness” (Findable, Accessible, Interoperable and Reusable) through the EU Horizon 2020 project ESCAPE (European Science Cluster of Astronomy & Particle physics ESFRI research infrastructures). The ESCAPE international consortium brings together ESFRI projects (CTA, ELT, EST, FAIR, HL-LHC, KM3NeT and SKA) and other pan-European research infrastructures (RIs) and organisations (CERN, ESO, JIVE and EGO), linking them to EOSC.

Launched in February 2019, the €16M ESCAPE project recently passed its mid-point, with less than 24 months remaining to complete the work programme. Several milestones have already been achieved, with much more in store.

Swimming in data

ESCAPE has implemented the first functioning pilot ‘Data Lake’ infrastructure, which is a new model for federated computing and storage to address the exabyte-scale of data volumes expected from the next generation of RIs and experiments. The Data Lake consists of several components that work together to provide a unified namespace to users who wish to upload, download or access data. Its architecture is based on existing and proven technologies: the Rucio platform for data management; the CERN-developed File Transfer Service for data movement and transfer; and connection to heterogenous storage systems in use across scientific data centres. These components are deployed and integrated in a service that functions seamlessly regardless of which RI the data belong to.
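
For a flavour of how a user might interact with this federated namespace, the hedged sketch below uses the Rucio Python client, on which the Data Lake’s data management is built. It assumes a configured Rucio client and server; the scope, dataset name and storage-element expression are hypothetical placeholders rather than real ESCAPE identifiers.

```python
# A hedged sketch of Data Lake interaction via the Rucio Python client.
# Requires a configured Rucio installation; names below are hypothetical.
from rucio.client import Client

client = Client()

# One logical name, many physical copies: ask where the replicas of a
# (hypothetical) dataset live across the federated storage.
dids = [{"scope": "escape_testbed", "name": "sim_dataset_v1"}]
for replica in client.list_replicas(dids):
    print(replica["name"], sorted(replica.get("rses", {})))

# Declare a policy rather than individual transfers: request two copies of the
# dataset on matching storage elements and let Rucio/FTS orchestrate the moves.
client.add_replication_rule(dids=dids, copies=2,
                            rse_expression="QOS=SAFE")  # hypothetical expression
```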

ESCAPE aims to deploy an integrated open “virtual research environment”

The Data Lake is an evolution of the current Worldwide LHC Computing Grid model for the advent of HL-LHC. For the first time, thanks to ESCAPE, it is the product of a cross-domain and cross-project collaboration, where scientists from HL-LHC, SKA, CTA, FAIR and others co-develop and co-operate from the beginning. The first data orchestration tests have been successfully accomplished, and the pilot phase demonstrated a robust architecture that serves the needs and use-cases of the participant experiments and facilities. Monitoring and dashboard services have enabled user access and selection of datasets. A new data challenge also including scientific data-analysis workflows in the Data Lake is planned for later this year.

ESCAPE is also setting up a sustainable open-access repository for deployment, exposure, preservation and sharing of scientific software and services. It will house software and services for data processing and analysis, as well as test datasets of the partner ESFRI projects, and provide user-support documentation, tutorials, presentations and training.

Open software

The collaborative, open-innovation environment and training actions provided by ESCAPE have already enabled the development of original open-source software. High-performance programming methods and deep-learning approaches have been developed, benchmarked and in some cases included in the official analysis pipelines of partner RIs. Definition of data formats has been pursued as well as the harmonisation of approaches for innovative workflows. A common meta-data description of the software packages, community implementation based on an available standard (CodeMeta) and standard guidelines (including licensing) for the full software development lifecycles have been gathered to enable interoperability and re-use.
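
As an indication of what such a common metadata description can look like, the short sketch below writes a minimal CodeMeta-style record for a hypothetical analysis package. The field values are invented; a real ESCAPE catalogue entry would follow the full CodeMeta schema and the project’s own guidelines.

```python
# A minimal, illustrative CodeMeta-style record for a hypothetical package;
# field names follow the public CodeMeta 2.0 vocabulary, values are placeholders.
import json

codemeta = {
    "@context": "https://doi.org/10.5063/schema/codemeta-2.0",
    "@type": "SoftwareSourceCode",
    "name": "example-analysis-pipeline",          # hypothetical package name
    "description": "Toy analysis pipeline used to illustrate CodeMeta.",
    "license": "https://spdx.org/licenses/Apache-2.0",
    "codeRepository": "https://example.org/escape/example-analysis-pipeline",
    "programmingLanguage": "Python",
    "author": [{"@type": "Person", "givenName": "Ada", "familyName": "Example"}],
}

with open("codemeta.json", "w") as f:
    json.dump(codemeta, f, indent=2)
```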

Following the lead of the HEP Software Foundation (HSF), ESCAPE’s community-based foundation embraces a large community. Establishing a cooperative framework with the HSF will enable HSF packages to be added to the ESCAPE catalogue and the two efforts to be aligned.

From the user-access point of view, ESCAPE aims to build a prototype ‘science analysis platform’ that supports data discovery and integration, provides access to the repository, enables user-customised processing and workflows, interfaces with the underlying distributed Data Lake and links to existing infrastructures such as the Virtual Observatory. It also enables researchers’ participation in large citizen-powered research projects such as Zooniverse. Every ESFRI project customises the analysis platform for its own users on top of common lower-level services that ESCAPE is building, such as JupyterHub, a pre-defined Jupyter Notebook environment and Kubernetes deployment tooling. First prototypes are under evaluation for SKA, CTA and the Vera C. Rubin Observatory.

ESCAPE

In summary, ESCAPE aims to deploy an integrated open “virtual research environment” through its services for multi-probe data research, guaranteeing and boosting scientific results while providing a mechanism for acknowledging and rewarding researchers who commit to open science. In this respect, together with four other thematic clusters (ENVRI-Fair, EOSC-Life, PANOSC and SSHOC), ESCAPE is a partner in a new EU-funded project, ‘EOSC Future’, which aims to gather the efforts of more researchers in cross-domain open-data ‘Test Science Projects’ (TSPs). TSPs are collaborative projects, including two named Dark Matter and Extreme Universe, in which data, results and potential discoveries from a wealth of astrophysics, particle-physics and nuclear-physics experiments, combined with theoretical models and interpretations, will increase our understanding of the universe. This requires the engagement of all scientific communities, as already recommended by the 2020 update of the European Strategy for Particle Physics.

Open-data science projects

In particular, the Dark Matter TSP aims at further understanding the nature of dark matter by performing new analyses within the experiments involved, and collecting all the digital objects related to those analyses (data, metadata and software) on a broad open-science platform that will allow these analyses to be reproducible by the entire community wherever possible.

The Extreme Universe TSP, meanwhile, intends to develop a platform to enable multi-messenger/multi-probe astronomy (MMA). Many studies of transient astrophysical phenomena benefit from the combined use of multiple instruments at different wavelengths and with different probe types. Many of these are based on a trigger from one instrument generating follow-up observations from others on timescales from seconds to days. Such observations could lead to images of the strong gravitational effects expected near a black hole, for example. Extremely energetic transient phenomena such as gamma-ray bursts, active galactic nuclei and fast radio bursts are also not yet fully understood. The intention within ESCAPE is to build such a platform for MMA science in a sustainable way.

ESCAPE is also setting up a sustainable open-access repository for deployment, exposure, preservation and sharing of scientific software and services

The idea in both of these TSPs is to validate the prototype services developed by ESCAPE and to drive the uptake of its virtual research environment. At the same time, the TSPs aim to promote the innovative impact of data analysis in open science, validate the reward scheme acknowledging scientists’ participation, and demonstrate the increased scientific value of sharing data. This approach was discussed at the JENAS 2019 workshop and will be linked to two analogous joint ECFA-NuPECC-APPEC actions (iDMEu and gravitational-wave probes of fundamental physics).

Half-way through, ESCAPE is clearly proving itself as a powerful catalyst to make the world’s leading research infrastructures in particle physics and astronomy as open as possible. The next two years will see the consolidation of the cluster programme and the inclusion of further world-class RIs in astrophysics, nuclear and particle physics. Through the TSPs and further science projects, the ESCAPE community will continue to engage in building within EOSC the open-science virtual research environment of choice for European researchers. In the even longer term, ESCAPE and the other science clusters are exploring how to evolve into sustained “Platform Infrastructures” federating large domain-based RIs. The platforms would study, define and set up a series of new focus areas around which they can engage with the European Commission and national research institutes to take part in the European data strategy at large.

The post ESCAPE towards open virtual research appeared first on CERN Courier.

]]>
Feature Launched in February 2019, the European Union project ESCAPE is making strides towards an open scientific analysis infrastructure for particle physics and astronomy. https://cerncourier.com/wp-content/uploads/2021/05/ESCAPE.png
Making a difference https://cerncourier.com/a/making-a-difference/ Mon, 03 May 2021 08:59:56 +0000 https://preview-courier.web.cern.ch/?p=92163 Accelerator physicist and science communicator Suzie Sheehy discusses her work, her new book, and how to increase the appeal of a research career.

The post Making a difference appeared first on CERN Courier.

]]>
Suzie Sheehy

How did you end up as an accelerator physicist?

Somewhat accidentally, because I didn’t even know that being a researcher in physics was a thing you could be until my second year of university. It was around then that I realised that someone like me could ask questions that didn’t have answers. That hooked my interest. My first project was in nuclear physics, and it involved using a particle accelerator for an experiment. I then attended the CERN summer student programme, working on ATLAS, which was my first proper exposure to the technology of particle physics. When it came to the time to do my PhD in around 2006, I had the choice to either stay in Melbourne to do particle physics, or go to Oxford, which had a strong accelerator programme. When I learned they were designing accelerators for cancer treatment, it blew my mind! I took the leap and decided to move to the other side of the world.

What did you do as a postdoc? 

I was lucky enough to get an 1851 Royal Commission Fellowship, which allowed me to start an independent research programme. It was a bit of a baptism of fire, as I had been working on medical machines but then moved to high-intensity proton accelerators. I was looking at fixed-field alternating-gradient accelerators and their application to things like accelerator-driven reactors. After a while I found myself spending a lot of time building sophisticated simulations, and was getting a bit bored of computing. So I started a couple of collaborations with some teams in Japan – one of which was using ion traps to mimic the dynamics of particle beams at very high intensity. What I found really interesting is how beams behave at a fundamental level, and I am currently working on upgrading a small experiment called IBEX to test a new type of optics called non-linear integrable optics, which is a focus of Fermilab at the moment.

And now you’re back in the medical arena?

Yes – a few years ago I started working with people from CERN and the UK on compact medical accelerators for low- and middle-income countries. Then in 2019 I felt the pull to return to Australia to grow accelerator physics there. They have accelerators and facilities but didn’t have a strong academic accelerator community, so I am building up a group at Melbourne University that has a medical applications focus, but also looks at other areas. After 20 years of pushing for a proton therapy centre here, the first one is now being built. 

How and when did your career in science communication take off?

I was doing things like stage shows for primary-school children when I was a first-year undergraduate. I have always seen it as part of the process of being a scientist. Before my PhD I worked in a science museum and, while at Oxford, I started an outreach programme called Accelerate! that took live science shows to some 30,000 students in its first two years and is still running. From there, it sort of branched out. I did more public lectures, but also a little bit of TV, radio and some writing.

Sheehy presenting

Any advice for physicists who want to get into communication?

You need to build a portfolio, and demonstrate that you have a range of different styles, delivery modes and use language that people understand. The other thing that really helped me was working with professional organisations such as the Royal Institution in London. It does take a lot of time to do both your research and academic job well, and also do the communication well. A lot of my communication is about my research field – so luckily they enrich each other. I think my communication has the potential to have a much bigger societal impact than my research, so I am very serious about it. The first time someone pointed a video camera at me I was terrified. Now I can say what I want to say. We shouldn’t underestimate how much the public wants to hear from real working scientists, so keeping a very strong research base keeps my authenticity and credibility.

What is your work/life balance like? 

I am not a fan of the term “work/life balance” as it tends to imply that one is necessarily in conflict with the other. I think it’s important to set up a kind of work/life integration that supports well-being while allowing you to do the work you want to do. When I was invited back to Melbourne to build an accelerator group, I’d just started a new research group in Oxford. I stepped down my teaching and we agreed that I would take periods of sabbatical to spend time in Melbourne until I finished my experiment. I have been so incredibly grateful to everyone on both sides for their understanding. Key to that has been learning how other people’s expectations affect you and finding a way to filter them out and drive your own goals. Working in two completely different time zones, it would be easy to work ridiculously long days, so I have had to learn to protect my health. The hardest thing, and I think a lot of early/mid-career researchers will relate to this, is that academia is an infinite job: you will never do enough for someone to tell you that you have done enough. The pressure always feels like it’s increasing, especially when you are a post-doc or on tenure track, or in the process of establishing a new group or lab. You have to learn how to take care of your mental health and well-being so that you don’t burn out. With everything else that’s going on in the world right now, this is even more important. 

You are active in trying to raise the profile of women in physics. What does this involve on a practical level?

There has been a lot of focus for many years on getting more women into subjects like physics. My view is that whenever I meet young people they're interested already. In many countries the gender balance at undergraduate level is similar. So what's happening instead is that we are pushing women and minorities out. My focus, within my sphere of influence, is to make sure that the culture that I am perpetuating and the values that I hold within my research groups are well defined and communicated. 

I kind of pulled back from active engagement in panel sessions and things like that a number of years ago, because I realised that the most important way I can contribute is by being the best scientist that I can be. The fact that I happen to have a public profile is great in that it makes people aware that people like me exist. One of the things that has helped me the most is to build a really great community of peers of other women in physics. I think for the first seven or eight years of my career, when imposter syndrome was strong and I questioned if I fitted in, I realised that I didn’t have a single direct female colleague. With most people in my field being men, it’s likely that when choosing a speaker, for example, the first person we think of is male. Taking time to be well-networked with women in the field is incredibly important in that regard. Today, I find that creating the right environment means that people will seek out my research group because they hear it’s a nice place to be. Students today are much savvier with this stuff – they can tell toxic professors a mile away. I am trying to show them that there is a way of doing research that doesn’t involve the horrible sides to it. Research is hard enough already, so why make it harder? 

Tell us about your debut book, The Matter of Everything.

It’s published by Bloomsbury (UK/Commonwealth) and Knopf (US) and is due out in early 2022. Its subtitle is “The 12 experiments that made the modern world”, starting with the cathode-ray tube and going all the way through to the LHC and what might come next. It’s told from the perspective of an experimental physicist. What isn’t always captured in popular physics books is how science is actually done – how very human it is to feel like you’re failing in the lab. I also delve into what first interested me in accelerators, specifically the things that have emerged unexpectedly from these research areas. People think that Apple invented everything in the iPhone, but if it wasn’t for curiosity-driven physics experiments then it wouldn’t be possible. On a personal note, as I went through these stories in the field, often in the biographies and the acknowledgments, I would end up going down these rabbit holes of women whose careers were cut short because they got married and had to quit their jobs. It’s been lovely to have the opportunity to learn that these women were there, and it wasn’t just white men. 

You have to learn how to take care of your mental health and well-being so that you don’t burn out

Do you have a preference as to which collider should come next after the LHC? 

I think it should be one of the linear ones. The size of future circular colliders and the timescales involved are quite overwhelming, and you have to wonder if the politics might change throughout the project. A linear machine such as the ILC is more ready to go, if the money and will were there. But I also think there is value in the diversity of the technology. The scaling of SLAC’s linear electron machine, for example, really pushed the industrialisation of that accelerator technology – which is part of the reason why we have 3 GHz electron accelerators now in every hospital. There will be other implications of what we build, beyond physics results – even though the decisions will be made on the physics. 

What do you say to students considering a career in particle physics? 

I will answer that from the perspective of the accelerator field, which is very exciting. If you look historically, new technologies have always driven new discoveries. The accelerator field is going through an interesting “technology discovery phase”, for example with laser-driven plasma accelerators, so there will be huge changes to what we are doing in 10–15 years’ time that could blow the decisions surrounding future colliders out of the water. This happened in the 1960s, in the era of proton accelerators: suddenly there was a new technology that meant you could build much higher-energy machines with smaller magnets, and the people who took that risk were the ones who ended up pushing the field forward. I sometimes feel experimental and theoretical physicists are slightly disconnected from what’s going on in accelerator physics now. When making future decisions, people should attend accelerator conferences… it may influence their choices.

The post Making a difference appeared first on CERN Courier.

]]>
Opinion Accelerator physicist and science communicator Suzie Sheehy discusses her work, her new book, and how to increase the appeal of a research career. https://cerncourier.com/wp-content/uploads/2021/04/CCMayJun21_INT_Sheehy.jpg
LHC reinterpreters think long-term https://cerncourier.com/a/lhc-reinterpreters-think-long-term/ Wed, 28 Apr 2021 08:36:24 +0000 https://preview-courier.web.cern.ch/?p=92151 The question of making research data findable, accessible, interoperable and reusable is a burning one throughout modern science.

The post LHC reinterpreters think long-term appeared first on CERN Courier.

]]>
A Map of the Invisible

The ATLAS, CMS and LHCb collaborations perform precise measurements of Standard Model (SM) processes and direct searches for physics beyond the Standard Model (BSM) in a vast variety of channels. Despite the multitude of BSM scenarios tested this way by the experiments, these still constitute only a small subset of the possible theories and parameter combinations to which the experiments are sensitive. The (re)interpretation of the LHC results in order to fully understand their implications for new physics has become a very active field, with close theory–experiment interaction and with new computational tools and related infrastructure being developed. 

From 15 to 19 February, almost 300 theorists and experimental physicists gathered for a week-long online workshop to discuss the latest developments. The topics covered ranged from advances in public software packages for reinterpretation to the provision of detailed analysis information by the experiments, from phenomenological studies to global fits, and from long-term preservation to public data.

Open likelihoods

One of the leading questions throughout the workshop was that of public likelihoods. The statistical model of an experimental analysis provides its complete mathematical description; it is essential information for determining the compatibility of the observations with theoretical predictions. In his keynote talk “Open science needs open likelihoods’’, Harrison Prosper (Florida State University) explained why it is in our scientific interest to make the publication of full likelihoods routine and straightforward. The ATLAS collaboration has recently made an important step in this direction by releasing full likelihoods in a JSON format, which provides background estimates, changes under systematic variations, and observed data counts at the same fidelity as used in the experiment, as presented by Eric Schanet (LMU Munich). Matthew Feickert (University of Illinois) and colleagues gave a detailed tutorial on how to use these likelihoods with the pyhf python package. Two public reinterpretation tools, MadAnalysis5 presented by Jack Araz (IPPP Durham) and SModelS presented by Andre Lessa (UFABC Santo Andre) can already make use of pyhf and JSON likelihoods, and others are to follow. An alternative approach to the plain-text JSON serialisation is to encode the experimental likelihood functions in deep neural networks, as discussed by Andrea Coccaro (INFN Genova) who presented the DNNLikelihood framework. Several more contributions from CMS, LHCb and from theorists addressed the question of how to present and use likelihood information, and this will certainly stay an active topic at future workshops.  
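
As a rough illustration of the kind of workflow covered in such tutorials (a minimal sketch, not material from the workshop itself), the snippet below loads a published JSON likelihood with pyhf and computes an observed CLs value for a signal-strength hypothesis. The file name "workspace.json" is a placeholder for a workspace downloaded from HEPData, and the details will of course differ from analysis to analysis.

```python
# Minimal sketch: evaluate a published ATLAS JSON likelihood with pyhf.
# "workspace.json" is a placeholder for a file downloaded from HEPData.
import json

import pyhf

with open("workspace.json") as f:
    spec = json.load(f)

workspace = pyhf.Workspace(spec)   # full statistical model specification + observed data
model = workspace.model()          # build the HistFactory probability model
data = workspace.data(model)       # observed counts plus auxiliary measurements

# Observed CLs for a signal-strength hypothesis mu = 1,
# using the asymptotic q-tilde test statistic
cls_obs = pyhf.infer.hypotest(1.0, data, model, test_stat="qtilde")
print(f"Observed CLs for mu = 1: {float(cls_obs):.3f}")
```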

The question of making research data findable, accessible, interoperable and reusable is a burning one throughout modern science

A novelty for the Reinterpretation workshop was that the discussion was extended to experiences and best practices beyond the LHC, to see how experiments in other fields address the need for publicly released data and reusable results. This included presentations on dark-matter direct detection, the high-intensity frontier, and neutrino oscillation experiments. Supporting Prosper’s call for data reusability 40 years into the future – “for science 2061” – Eligio Lisi (INFN Bari) pointed out the challenges met in reinterpreting the 1998 Super-Kamiokande data, initially published in terms of the then-sufficient two-flavour neutrino-oscillation paradigm, in terms of contemporary three-neutrino descriptions, and beyond. On the astrophysics side, the LIGO and Virgo collaborations actively pursue an open-science programme. Here, Agata Trovato (APC Paris) presented the Gravitational Wave Open Science Center, giving details on the available data, on their format and on the tools to access them. An open-data policy also exists at the LHC, spearheaded by the CMS collaboration, and Edgar Carrera Jarrin (USF Quito) shared experiences from the first CMS open-data workshop. 
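
To make Lisi's example concrete: the 1998 atmospheric-neutrino results were published as allowed regions for the two parameters of the familiar two-flavour oscillation probability,

\[ P(\nu_\mu \to \nu_\tau) \;=\; \sin^2 2\theta \,\sin^2\!\left(\frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right), \]

a parameterisation that cannot simply be relabelled in terms of the three mixing angles, two mass-squared splittings and CP phase of today's three-neutrino framework – which is exactly why access to the underlying data matters decades later.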

The question of making research data findable, accessible, interoperable and reusable (“FAIR” in short) is a burning one throughout modern science. In a keynote talk, the head of the GO FAIR Foundation, Barend Mons, explained the FAIR Guiding Principles together with the technical and social aspects of FAIR data management and data reuse, using the example of COVID-19 disease modelling. There is much to be learned here for our field. 

The wrap-up session revolved around the question of how to implement the recommendations of the Reinterpretation workshop in a more systematic way. An important aspect here is the proper recognition, within the collaborations as well as the community at large, of the additional work required to this end. More rigorous citation of HEPData entries by theorists may help in this regard. Moreover, a “Reinterpretation: Auxiliary Material Presentation” (RAMP) seminar series will be launched to give more visibility and explicit recognition to the efforts of preparing and providing extensive material for reinterpretation. The first RAMP meetings took place on 9 and 23 April.

The post LHC reinterpreters think long-term appeared first on CERN Courier.

]]>
Meeting report The question of making research data findable, accessible, interoperable and reusable is a burning one throughout modern science. https://cerncourier.com/wp-content/uploads/2021/04/CCMayJun21_FN_map_feature.jpg
Tooling up to hunt dark matter https://cerncourier.com/a/tooling-up-to-hunt-dark-matter/ Thu, 04 Mar 2021 13:33:55 +0000 https://preview-courier.web.cern.ch/?p=91450 The TOOLS 2020 conference attracted around 200 phenomenologists and experimental physicists to work on numerical tools for dark-matter models, and more.

The post Tooling up to hunt dark matter appeared first on CERN Courier.

]]>
Bullet Cluster

The past century has seen ever stronger links forged between the physics of elementary particles and the universe at large. But the picture is mostly incomplete. For example, numerous observations indicate that 87% of the matter of the universe is dark, suggesting the existence of a new matter constituent. Given a plethora of dark-matter candidates, numerical tools are essential to advance our understanding. Fostering cooperation in the development of such software, the TOOLS 2020 conference attracted around 200 phenomenologists and experimental physicists for a week-long online workshop in November.

The viable mass range for dark matter spans 90 orders of magnitude, while the uncertainty about its interaction cross section with ordinary matter is even larger (see “Theoretical landscape” figure). Dark matter may be new particles belonging to theories beyond the Standard Model (BSM), an aggregate of new or SM particles, or very heavy objects such as primordial black holes (PBHs). On the latter subject, Jérémy Auffinger (IP2I Lyon) updated TOOLS 2020 delegates on codes for very light PBHs, noting that “BlackHawk” is the first open-source code for Hawking-radiation calculations.

Flourishing models

Weakly interacting massive particles (WIMPs) have enduring popularity as dark-matter candidates, and are amenable to search strategies ranging from colliders to astrophysical observations. In the absence of any clear detection of WIMPs at the electroweak scale, the number of models has flourished. Above the TeV scale, these include general hidden-sector models, FIMPs (feebly interacting massive particles), SIMPs (strongly interacting massive particles), super-heavy and/or composite candidates and PBHs. Below the GeV scale, besides FIMPs, candidates include the QCD axion, more generic ALPs (axion-like particles) and ultra-light bosonic candidates. ALPs are a class of models that received particular attention at TOOLS 2020, and are now being sought in fixed-target experiments across the globe.

For each dark-matter model, astroparticle physicists must compute the theoretical predictions and characteristic signatures of the model and confront those predictions with the experimental bounds to select the model parameter space that is consistent with observations. To this end, the past decade has seen the development of a huge variety of software – a trend mapped and encouraged by the TOOLS conference series, initiated by Fawzi Boudjema (LAPTh Annecy) in 1999, which has brought the community together every couple of years since.

Models connecting dark matter with collider experiments are becoming ever more optimised to the needs of users

Three continuously tested codes currently dominate generic BSM dark-matter model computations. Each allows for the computation of relic density from freeze-out and predictions for direct and indirect detection, often up to next-to-leading-order corrections. Agreement between them is kept below the per-cent level. “micrOMEGAs” is by far the most used code, and is capable of predicting observables for any generic model of WIMPs, including those with multiple dark-matter candidates. “DarkSUSY” is more oriented towards supersymmetric theories, but it can be used for generic models as the code has a very convenient modular structure. Finally, “MadDM” can compute WIMP observables for any BSM model from MeV to hundreds of TeV. As MadDM is a plugin of MadGraph, it inherits unique features such as its automatic computation of new dark-matter observables, including indirect-detection processes with an arbitrary number of final-state particles and loop-induced processes. This is essential for analysing sharp spectral features in indirect-detection gamma-ray measurements that cannot be mimicked by any known astrophysical background.

Interaction cross sections versus mass

Both micrOMEGAs and MadDM permit the user to confront theories with recast experimental likelihoods for several direct and indirect detection experiments. Jan Heisig (UCLouvain) reported that this is a work in progress, with many more experimental data sets to be included shortly. Torsten Bringmann (University of Oslo) noted that a strength of DarkSUSY is the modelling of qualitatively different production mechanisms in the early universe. Alongside the standard freeze-out mechanism, several new scenarios can arise, such as freeze-in (FIMP models, as chemical and kinetic equilibrium cannot be achieved), dark freeze-out, reannihilation and “cannibalism”, to name just a few. Freeze-in is now supported by micrOMEGAs.
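
For orientation, the relic abundance these codes compute in the standard freeze-out scenario is captured by the textbook estimate

\[ \Omega_\chi h^2 \;\approx\; \frac{3\times 10^{-27}\ \mathrm{cm^3\,s^{-1}}}{\langle\sigma v\rangle}, \]

so a thermally averaged annihilation cross section of order 3 × 10⁻²⁶ cm³ s⁻¹ reproduces the observed Ωh² ≈ 0.12; the dedicated tools replace this rough estimate with a full solution of the Boltzmann equation, including the particle content, resonances and co-annihilations of a given model.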

Models connecting dark matter with collider experiments are becoming ever more optimised to the needs of users. For example, micrOMEGAs interfaces with SModelS, which is capable of quickly applying all possible LHC-relevant supersymmetric searches. The software also includes long-lived particles, as commonly found in FIMP models. As MadDM is embedded in MadGraph, noted Benjamin Fuks (LPTHE Paris), tools such as MadAnalysis may be used to recast CMS and ATLAS searches. Celine Degrande (UCLouvain) described another nice tool, FeynRules, which produces model files in both the MadDM and micrOMEGAs formats given the Lagrangian for the BSM model, providing a very useful automated chain from the model directly to the dark-matter observables, high-energy predictions and comparisons with experimental results. Meanwhile, MadDump expands MadGraph’s predictions and detector simulations from the high-energy collider limits to fixed-target experiments such as NA62. To complete a vibrant landscape of development efforts, Tomas Gonzalo (Monash) presented the GAMBIT collaboration’s work to provide tools for global fits to generic dark-matter models.

A phenomenologist’s dream

Huge efforts are underway to develop a computational platform to study new directions in experimental searches for dark matter, and TOOLS 2020 showed that we are already very close to the phenomenologist’s dream for WIMPs. TOOLS 2020 wasn’t just about dark matter either – it also covered developments in Higgs and flavour physics, precision tests and general fitting, and other tools. Interested parties are welcome to join in the next TOOLS conference due to take place in Annecy in 2022.

The post Tooling up to hunt dark matter appeared first on CERN Courier.

]]>
Meeting report The TOOLS 2020 conference attracted around 200 phenomenologists and experimental physicists to work on numerical tools for dark-matter models, and more. https://cerncourier.com/wp-content/uploads/2021/02/CCMarApr21_FN_bulletcluster.jpg
CERN establishes COVID-19 task force https://cerncourier.com/a/cern-establishes-covid-19-task-force/ Mon, 06 Apr 2020 07:59:59 +0000 https://preview-courier.web.cern.ch/?p=87109 CERN technologies and expertise are helping in the collective global fight against COVID-19.

The post CERN establishes COVID-19 task force appeared first on CERN Courier.

]]>
The CERN-against-COVID-19 logo. Credit: CERN.

The CERN management has established a task force to collect and coordinate ideas from the global CERN community to fight the COVID-19 pandemic. Drawing on the scientific and technical expertise of some 18,000 people worldwide who have links with CERN, these initiatives range from the production of sanitiser gel to novel proposals for ventilators to help meet rising clinical demand.

CERN-against-COVID-19 was established on 27 March, followed by the launch of a dedicated website on 4 April. The group aims to draw on CERN’s many competencies and to work closely with experts in healthcare, drug development, epidemiology and emergency response to help ensure effective and well-coordinated action. The CERN management has also written directly to the director general of the World Health Organization, with which CERN has an existing collaboration agreement, to offer CERN’s support.

It’s not about going out there and doing things because we think we know best, but about offering our services and waiting to hear from the experts as to how we may be able to help

Beniamino Di Girolamo

The initiative has already attracted a large number of suggestions at various stages of development. These include three proposals by particle physicists for stripped-down ventilator designs, one of which is led by members of the LHCb collaboration. Other early suggestions range from the use of CERN’s fleet of vehicles to make deliveries in the surrounding region, to offers of computing resources and 3D printing of components for medical equipment. From 3–5 April, CERN supported a 48-hour online hackathon organised by the Swiss government to develop “functional digital or analogue prototypes” to counter the virus. Computing resources are also being used to provide distance-learning tools such as Open Up2U, coordinated by the GÉANT partnership. CERN is also producing sanitiser gel and Perspex shields, which will be distributed to gendarmeries in the Pays de Gex region.

Another platform, Science Responds, has been established by “big science” researchers in the US to facilitate interactions between COVID-19 researchers and the broader science community.

“It has been amazing to see so many varied and quality ideas,” says Beniamino Di Girolamo of CERN, who is chair of the CERN-against-COVID-19 task force. “It’s not about going out there and doing things because we think we know best, but about offering our services and waiting to hear from the experts as to how we may be able to help. This is also much wider than CERN – these initiatives are coming from everywhere.”

Proposals and ideas can be made by members of the CERN community via an online form, and questions to the task force may be submitted via email.

 

The post CERN establishes COVID-19 task force appeared first on CERN Courier.

]]>
News CERN technologies and expertise are helping in the collective global fight against COVID-19. https://cerncourier.com/wp-content/uploads/2020/04/CERNagainstCOVID19.jpg
Success in scientific management https://cerncourier.com/a/success-in-scientific-management/ Fri, 13 Mar 2020 10:59:18 +0000 https://preview-courier.web.cern.ch/?p=86542 Barry Barish speaks to the Courier about his role in turning LIGO into a Nobel Prize-winning machine.

The post Success in scientific management appeared first on CERN Courier.

]]>
Barry Barish

Your co-Nobelists in the discovery of gravitational waves, Kip Thorne and Rainer Weiss, have both recognised your special skills in the management of the LIGO collaboration. When you landed in LIGO in 1994, what was the first thing you changed?

When I arrived in LIGO, there was a lot of dysfunction and people were going after each other. So, the first difficult problem was to make LIGO smaller, not bigger, by moving people out who weren’t going to be able to contribute constructively in the longer term. Then, I started to address what I felt were the technical and management weaknesses. Along with my colleague, Gary Sanders, who had worked with me on one of the would-be detectors for the Superconducting Super Collider (SSC) before the project was cancelled, we started looking for the kind of people that were missing in technical areas.

For example, LIGO relies on very advanced lasers but I was convinced that the laser that was being planned for, a gas laser, was not the best choice because lasers were, and still are, a very fast-moving technology and solid-state lasers were more forward-looking. Coming from particle physics, I’m used to not seeing a beam with my own eyes. So I wasn’t disturbed that the most promising lasers at that time emitted light in the infrared, instead of green, and that technology had advanced to where they could be built in industry. People who worked with interferometers were used to “little optics” on lab benches where the lasers were all green and the alignment of mirrors etc was straightforward. I asked three of the most advanced groups in the world who worked on lasers of the type we needed (Hannover in Germany, Adelaide in Australia and Stanford in California) if they’d like to work together with us, and we brought these experts into LIGO to form the core of what we still have today as our laser group.

Project management for forefront science experiments is very different, and it is hard for people to do it well

This story is mirrored in many of the different technical areas in LIGO. Physics expertise and expertise in the use of interferometer techniques were in good supply in LIGO, so the main challenge was to find expertise to develop the difficult forefront technologies that we were going to depend on to reach our ambitious sensitivity goals. We also needed to strengthen the engineering and project-management areas, but that just required recruiting very good people. Later, the collaboration grew a lot, but mostly on the data-analysis side, which today makes up much of our collaboration.

According to Gary Sanders of SLAC, “efficient management of large science facilities requires experience and skills not usually found in the repertoire of research scientists”. Are you a rare exception?

Gary Sanders was a student of Sam Ting, then he went to Los Alamos where he got a lot of good experience doing project work. For myself, I learned what was needed kind of organically as my own research grew into larger and larger projects. Maybe my personality matched the problem, but I also studied the subject. I know how engineers go about building a bridge, for example, and I could pass an exam in project management. But, project management for forefront science experiments is very different, and it is hard for people to do it well. If you build a bridge, you have a boss, and he or she has three or four people who do tasks under his/her supervision, so generally the way a large project is structured is a big hierarchical organisation. Doing a physics research project is almost the opposite. For large engineering projects, once you’ve built the bridge, it’s a bridge, and you don’t change it. When you build a physics experiment, it usually doesn’t do what you want it to do. You begin with one plan and then you decide to change to another, or even while you’re building it you develop better approaches and technologies that will improve the instruments. To do research in physics, experience tells us that we need a flat, rather than vertical, organisational style. So, you can’t build a complicated, expensive ever-evolving research project using just what’s taught in the project-management books, and you can’t do what’s needed to succeed in cost, schedule, performance, etc, in the style found in a typical physics-department research group. You have to employ some sort of hybrid. Whether it’s LIGO or an LHC experiment, you need to have enough discipline to make sure things are done on time, yet you also need the flexibility and encouragement to change things for the better. In LIGO, we judiciously adapted various project-management formalities, and used them by not interfering any more than necessary with what we do in a research environment. Then, the only problem – but admittedly a big one – is to get the researchers, who don’t like any structure, to buy into this approach.

How did your SSC experience help?

It helped with the political part, not the technical part, because I came to realise how difficult the politics and things outside of a project are. I think almost anything I worked on before has been very hard, because of what it was or because of some politics in doing it, but I didn’t have enormous problems that were totally outside my control, as we had in the SSC.

How did you convince the US government to keep funding LIGO, which has been described as the most costly project in the history of the NSF?

It’s a miracle, because not only was LIGO costly, but we didn’t have much to show in terms of science for more than 20 years. We were funded in 1994, and we made the first detection more than 20 years later. I think the miracle wasn’t me, rather we were in a unique situation in the US. Our funding agency, the NSF, has a different mission than any other agency I know about. In the US, physical sciences are funded by three big agencies. One is the DOE, which has a division that does research in various areas with national labs that have their own structures and missions. The other big agency that does physical science is NASA, and they have the challenge of safety in space. The NSF gets less money than the other two agencies, but it has a mission that I would characterise by one word: science. LIGO has so far seen five different NSF directors, but all of them were prominent scientists. Having the director of the funding agency be someone who understood the potential importance of gravitational waves, maybe not in detail, helped make NSF decide both to take such a big risk on LIGO and then continue supporting it until it succeeded. The NSF leadership understands that risk-taking is integral to making big advancements in science.

What was your role in LIGO apart from management?

I concentrated more on the technical side in LIGO than on data analysis. In LIGO, the analysis challenges are more theoretical than they are in particle physics. What we have to do is compare general relativity with what happens in a real physical phenomenon that produces gravitational waves. That involves more of a mixed problem of developing numerical relativity, as well as sophisticated data-analysis pipelines. Another challenge is the huge amount of data because, unlike at CERN, there are no triggers. We just take data all the time, so sorting through it is the analysis problem. Nevertheless, I’ve always felt and still feel that the real challenge for LIGO is that we are limited by how sensitive we can make the detector, not by how well we can do the data analysis.

What are you doing now in LIGO?

Now that I can do anything I want, I am focusing on something I am interested in and that we don’t employ very much, which is artificial intelligence and machine learning (ML). In LIGO there are several problems that could adapt themselves very well to ML with recent advances. So we built a small group of people, mostly much younger than me, to do ML in LIGO. I recently started teaching at the University of California, Riverside, and have started working with young faculty in the university’s computer-science department on adapting some techniques in ML to problems in physics. In LIGO, we have a problem in the data that we call “glitches”, which appear in the data when something happens in the apparatus or in the outside world. We need to get rid of glitches, and we use a lot of human manpower to make the data clean. This is a problem that should adapt itself very well to an ML analysis.

Now that gravitational waves have joined the era of multi-messenger astronomy, what’s the most exciting thing that can happen next?

For gravitational waves, knowing what discovery you are going to make is almost impossible because it is really a totally new probe of the universe. Nevertheless, there are some known sources that we should be able to see soon, and maybe even will in the present run. So far we’ve seen two sources of gravitational waves: a collision of two black holes and a collision of two neutron stars, but we haven’t yet seen a black hole with a neutron star going around it. They’re particularly interesting scientifically because they contain information about nuclear physics of very compact objects, and because the two objects are very different in mass and that’s very difficult to calculate using numerical relativity. So it’s not just checking off another source that we found, but new areas of gravitational-wave science. Another attractive possibility is to detect a spinning neutron star, a pulsar. This is a continuous signal that is another interesting source which we hope to detect in a short time. Actually, I’m more interested in seeing unanticipated sources where we have no idea what we’re going to see, perhaps phenomena that uniquely happen in gravity alone.

The NSF leadership understands that risk-taking is integral to making big advancements

Will we ever see gravitons?

That’s a really good question because gravitons don’t exist in Einstein’s equations. But that’s not necessarily nature, that’s Einstein’s equations! The biggest problem we have in physics is that we have two fantastic theories. One describes almost anything you can imagine on a large scale, and that’s Einstein’s equations, and the other, which describes almost too well everything you find here at CERN, is the Standard Model, which is based on quantum field theory. Maybe black holes have the feature that they satisfy Einstein’s equations and at the same time conserve quantum numbers and all the things that happen in quantum physics. What we are missing is the experimental clue, whether it’s gravitons or something else that needs to be explained by both these theories. Because theory alone has not been able to bring them together, I think we need experimental information.

Do particle accelerators still have a role in this?

We never know because we don’t know the future, but our best way of understanding what limits our present understanding has been traditional particle accelerators because we have the most control over the particles we’re studying. The unique feature of particle accelerators is that of being able to measure all the parameters of particles that we want. We’ve found the Higgs boson and that’s wonderful, but now we know that the neutrinos also have mass and the Higgs boson possibly doesn’t describe that. We have three families of particles, and a whole set of other very fundamental questions that we have no handle on at all, despite the fact that we have this nice “standard” model. So is it a good reason to go to higher energy or a different kind of accelerator? Absolutely, though it’s a practical question whether it’s doable and affordable.

What’s the current status of gravitational-wave observatories?

We will continue to improve the sensitivity of LIGO and Virgo in incremental steps over the next few years, and LIGO will add a detector in India to give better global coverage. KAGRA in Japan is also expected to come online. But we can already see that next-generation interferometers will be needed to pursue the science in the future. A good design study, called the Einstein Telescope, has been developed in Europe. In the US we are also looking at next-generation detectors and have different ideas, which is healthy at this point. We are not limited by nature, but by our ability to develop the technologies to make more sensitive interferometers. The next generation of detectors will enable us to reach large red shifts and study gravitational-wave cosmology. We all look forward to exploiting this new area of physics, and I am sure important discoveries will emerge.

The post Success in scientific management appeared first on CERN Courier.

]]>
Opinion Barry Barish speaks to the Courier about his role in turning LIGO into a Nobel Prize-winning machine. https://cerncourier.com/wp-content/uploads/2020/02/CCMarApr_Interview_Barish_feature.jpg
The Higgs, supersymmetry and all that https://cerncourier.com/a/the-higgs-supersymmetry-and-all-that/ Fri, 10 Jan 2020 09:36:28 +0000 https://preview-courier.web.cern.ch/?p=86015 John Ellis reflects on 50 years at the forefront of theoretical high-energy physics - and whether the field is ripe for a paradigm shift.

The post The Higgs, supersymmetry and all that appeared first on CERN Courier.

]]>
John Ellis

What would you say were the best and the worst of times in your half-century-long career as a theorist?

The two best times, in chronological order, were the 1979 discovery of the gluon in three-jet events at DESY, which Mary Gaillard, Graham Ross and I had proposed three years earlier, and the discovery of the Higgs boson at CERN in 2012, in particular because one of the most distinctive signatures for the Higgs, its decay to two photons, was something Gaillard, Dimitri Nanopoulos and I had calculated in 1975. There was a big build up to the Higgs and it was a really emotional moment. The first of the two worst times was in 2000 with the closure of LEP, because maybe there was a glimpse of the Higgs boson. In fact, in retrospect the decision was correct because the Higgs wasn’t there. The other time was in September 2008 when there was the electrical accident in the LHC soon after it started up. No theoretical missing factor-of-two could be so tragic.

Your 1975 work on the phenomenology of the Higgs boson was the starting point for the Higgs hunt. When did you realise that the particle was more likely than not to exist?

Our paper, published in 1976, helped people think about how to look for the Higgs boson, but it didn’t move to the top of the physics agenda until after the discovery of the W and Z bosons in 1983. When we wrote the paper, things like spontaneous symmetry breaking were regarded as speculative hypotheses by the distinguished grey-haired scientists of the day. Then, in the early 1990s, precision measurements at LEP enabled us to look at the radiative corrections induced by the Higgs and they painted a consistent picture that suggested the Higgs would be relatively light (less than about 300 GeV). I was sort of morally convinced beforehand that the Higgs had to exist, but by the early 1990s it was clear that, indirectly, we had seen it. Before that there were alternative models of electroweak symmetry breaking but LEP killed most of them off.

To what extent does the Higgs boson represent a “portal” to new physics?

The Higgs boson is often presented as completing the Standard Model (SM) and solving lots of problems. Actually, it opens up a whole bunch of new ones. We know now that there is at least one particle that looks like an effective elementary scalar field. It’s an entirely new type of object that we’ve never encountered before, and every single aspect of the Higgs is problematic from a theoretical point of view. Its mass: we know that in the SM it is subject to quadratic corrections that make the hierarchy of mass scales unstable.

Every single aspect of the Higgs is problematic from a theoretical point of view

Its couplings to fermions: those are what produce the mixing of quarks, which is a complete mystery. The quartic term of the Higgs potential in the SM goes negative if you extrapolate it to high energies, the theory becomes unstable and the universe is doomed. And, in principle, you can add a constant term to the Higgs potential, which is the infamous cosmological constant that we know exists in the universe today but that is much, much smaller than would seem natural from the point of view of Higgs theory. Presumably some new physics comes in to fix these problems, and that makes the Higgs sector of the SM Lagrangian look like the obvious portal to that new physics.

In what sense do you feel an emotional connection to theory?

The Higgs discovery is testament to the power of mathematics to describe nature. People often talk about beauty as being a guide to theory, but I am always a bit sceptical about that because it depends on how you define beauty. For me, a piece of engineering can be beautiful even if it looks ugly. The LHC is a beautiful machine from that point of view, and the SM is a beautiful theoretical machine that is driven by mathematics. At the end of the day, mathematics is nothing but logic taken as far as you can.

Do you recall the moment you first encountered supersymmetry (SUSY), and what convinced you of its potential?

I guess it must have been around 1980. Of course I knew that Julius Wess and Bruno Zumino had discovered SUSY as a theoretical framework, but their motivations didn’t convince me. Then people like Luciano Maiani, Ed Witten and others pointed out that SUSY could help stabilise the hierarchy of mass scales that we find in physics, such as the electroweak, Planck and grand unification scales. For me, the first phenomenological indication that indicated SUSY could be related to reality was our realisation in 1983 that SUSY offered a great candidate for dark matter in the form of the lightest supersymmetric particle. The second was a few years later when LEP provided very precise measurements of the electroweak mixing angle, which were in perfect agreement with supersymmetric (but not non-supersymmetric) grand unified theories. The third indication was around 1991 when we calculated the mass of the lightest supersymmetric Higgs boson and got a mass up to about 130 GeV, which was being indicated by LEP as a very plausible value, and agrees with the experimental value.
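
A back-of-the-envelope version of that 1991 result: at tree level the lightest MSSM Higgs boson obeys m_h ≤ m_Z |cos 2β|, and it is the radiative correction from top and stop loops that lifts it; keeping only the leading logarithm,

\[ m_h^2 \;\lesssim\; m_Z^2\cos^2 2\beta \;+\; \frac{3\,m_t^4}{2\pi^2 v^2}\,\ln\!\frac{m_{\tilde t}^2}{m_t^2}, \qquad v \simeq 246\ \mathrm{GeV}, \]

which for stop masses around a TeV gives m_h of order 125–130 GeV – hence the remark that the measured value sits comfortably in the supersymmetric range.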

There was great excitement about SUSY ahead of the LHC start-up. In hindsight, does the non-discovery so far make the idea less likely?

Certainly it’s disappointing. And I have to face the possibility that even if SUSY is there, I might not live to meet her. But I don’t think it’s necessarily a problem for the underlying theory. There are certainly scenarios that can provide the dark matter even if the supersymmetric particles are rather heavier than we originally thought, and such models are still consistent with the mass of the Higgs boson. The information you get from unification of the couplings at high energies also doesn’t exclude SUSY particles weighing 10 TeV or so. Clearly, as the masses of the sparticles increase, you have to do more fine tuning to solve the electroweak hierarchy problem. On the other hand, the amount of fine tuning is still many, many orders of magnitude less than what you’d have to postulate without it! It’s a question of how much resistance to pain you have. That said, to my mind the LHC has actually provided three additional reasons for loving SUSY. One is the correct prediction for the Higgs mass. Another is that SUSY stabilises the electroweak vacuum (without it, SM calculations show that the vacuum is metastable). The third is that in a SUSY model, the Higgs couplings to other particles, while not exactly the same as in the SM, should be pretty close – and of course that’s consistent with what has been measured so far.

To what extent is SUSY driving considerations for the next collider?

I still think it’s a relatively clear-cut and well-motivated scenario for physics at the multi-TeV scale. But obviously its importance is less than it was in the early 1990s when we were proposing the LHC. That said, if you want a specific benchmark scenario for new physics at a future collider, SUSY would still be my go-to model, because you can calculate accurate predictions. As for new physics beyond the Higgs and more generally the precision measurements that you can make in the electroweak sector, the next topic that comes to my mind is dark matter. If dark matter is made of weakly-interacting massive particles (WIMPs), a high-energy Future Circular Collider should be able to discover it. You can look at SUSY at various different levels. One is that you just add in these new particles and make sure they have the right couplings to fix the hierarchy problem. But at a more fundamental level you can write down a Lagrangian, postulate this boson-fermion symmetry and follow the mathematics through. Then there is a deeper picture, which is to talk about additional fermionic (or quantum) dimensions of space–time. If SUSY were to be discovered, that would be one of the most profound insights into the nature of reality that we could get.

If SUSY is not a symmetry of nature, what would be the implications for attempts to go beyond the SM, e.g. quantum gravity?

We are never going to know that SUSY is not there. String theorists could probably live with very heavy SUSY particles. When I first started thinking about SUSY in the 1980s there was this motivation related to fine tuning, but there weren’t many other reasons why SUSY should show up at low energies. More arguments came later, for example, dark matter, which are nice but a matter of taste. I and my grandchildren will have passed on, humans could still be exploring physics way below the Planck scale, and string theorists could still be cool with that.

How high do the masses of the super-partners need to go before SUSY ceases to offer a compelling solution for the hierarchy problem and dark matter?

Beyond about 10 TeV it is difficult to see how it can provide the dark matter unless you change the early expansion history of the universe – which of course is quite possible, because we have no idea what the universe was doing when the temperature was above an MeV. Indeed, many of my string colleagues have been arguing that the expansion history could be rather different from the conventional adiabatic smooth expansion that people tend to use as the default. In this case supersymmetric particles could weigh 10 or even 30 TeV and still provide the dark matter. As for the hierarchy problem, obviously things get tougher to bear.

What can we infer about SUSY as a theory of fundamental particles from its recent “avatars” in lasers and condensed-matter systems?

I don’t know. It’s not really clear to me that the word “SUSY” is being used in the same sense that I would use it. Supersymmetric quantum mechanics was taken as a motivation for the laser setup (CERN Courier March/April 2019 p10), but whether the deeper mathematics of SUSY has much to do with the way this setup works I’m not sure. The case of topological condensed-matter systems is potentially a more interesting place to explore what this particular face of SUSY actually looks like, as you can study more of its properties under controlled conditions. The danger is that, when people bandy around the idea of SUSY, often they just have in mind this fermion–boson partnership. The real essence of SUSY goes beyond that and includes the couplings of these particles, and it’s not clear to me that in these effective-SUSY systems one can talk in a meaningful way about what the couplings look like.

Has the LHC new-physics no-show so far impacted what theorists work on?

In general, I think that members of the theoretical community have diversified their interests and are thinking about alternative dark-matter scenarios, and about alternative ways to stabilise the hierarchy problem. People are certainly exploring new theoretical avenues, which is very healthy and, in a way, there is much more freedom for young theorists today than there might have been in the past. Personally, I would be rather reluctant at this time to propose to a PhD student a thesis that was based solely on SUSY – the people who are hiring are quite likely to want them to be not just working on SUSY and maybe even not working on SUSY at all. I would regard that as a bit unfair, but there are always fashions in theoretical physics.

Following a long and highly successful period of theory-led research, culminating in the completion of the SM, what signposts does theory offer experimentalists from here?

I would broaden your question. In particle physics, yes, we have the SM, which over the past 50 years has been the dominant paradigm. But there is also a paradigm in cosmology and gravitation – general relativity and the idea of a big bang – initiated a century ago by Einstein. The discovery of gravitational waves almost four years ago was the “Higgs moment” for gravity, and that community now finds itself in the same fix that we do, in that they have this theory-led paradigm that doesn’t indicate where to go next.

The discovery of gravitational waves almost four years ago was the “Higgs moment” for gravity

Gravitational waves are going to tell us a lot about astrophysics, but whether they will tell us about quantum gravity is not so obvious. The Higgs boson, meanwhile, tells us that we have a theory that works fantastically well but leaves many mysteries – such as dark matter, the origin of matter, neutrino masses, cosmological inflation, etc – still standing. These are a mixture of theoretical, phenomenological and experimental problems suggesting life beyond the SM. But we don’t have any clear signposts today. The theoretical cats are wandering off in all directions, and that’s good because maybe one of the cats will find something interesting. But there is still a dialogue going on between theory and experiment, and it’s a dialogue that is maybe less of a monologue than it was during the rise of the SM and general relativity. The problems we face in going beyond the current paradigms in fundamental physics are the hardest we’ve faced yet, and we are going to need all the dialogue we can muster between theorists, experimentalists, astrophysicists and cosmologists.


The post The Higgs, supersymmetry and all that appeared first on CERN Courier.

]]>
Opinion John Ellis reflects on 50 years at the forefront of theoretical high-energy physics - and whether the field is ripe for a paradigm shift. https://cerncourier.com/wp-content/uploads/2020/01/CCJanFeb20-Interview-ellis.jpg
European astroparticle, nuclear and particle physicists join forces https://cerncourier.com/a/european-astroparticle-nuclear-and-particle-physicists-join-forces/ Fri, 29 Nov 2019 10:31:45 +0000 https://preview-courier.web.cern.ch/?p=85670 The chairs of APPEC, NuPECC and ECFA call for novel expressions of interest to tackle common challenges.

The post European astroparticle, nuclear and particle physicists join forces appeared first on CERN Courier.

]]>
JENAS logo

The first joint meeting of the European Committee for Future Accelerators (ECFA), the Nuclear Physics European Collaboration Committee (NuPECC), and the Astroparticle Physics European Consortium (APPEC) took place from 14–16 October in Orsay, France. Making progress in domains such as dark matter, neutrinos and gravitational waves increasingly requires interdisciplinary approaches to scientific and technological challenges, and the new Joint ECFA-NuPECC-APPEC Seminar (JENAS) events are designed to reinforce links between astroparticle, nuclear and particle physicists.

Jointly organised by LAL-Orsay, IPN-Orsay, CSNSM-Orsay, IRFU-Saclay and LPNHE-Paris, the inaugural JENAS meeting saw 230 junior and senior members of the three communities discuss overlapping interests. Readout electronics, silicon photomultipliers, big-data computing and artificial intelligence were just a handful of the topics discussed. For example, the technological evolution of silicon photomultipliers, which are capable of measuring single-photon light signals and can operate at low voltage and in magnetic fields, will be key both for novel calorimeters and timing detectors at the high-luminosity LHC. They will also be used in the Cherenkov Telescope Array – an observatory of more than 100 telescopes to be installed at La Palma in the northern hemisphere and in the Atacama Desert in the southern hemisphere, which will become the world’s most powerful instrument for ground-based gamma-ray astronomy.

As chairs of the three consortia, we issued a call for novel expressions of interest

Organisational synergies related to education, outreach, open science, open software and careers are also readily identified, and a diversity charter was launched by the three consortia, whereby statistics on relevant parameters will be collected at each conference and workshop in the three subfields. This will allow the communities to verify how well we embrace diversity.

As chairs of the three consortia, we issued a call for novel expressions of interest to tackle common challenges in subjects as diverse as computing and the search for dark matter. Members of the high-energy physics and related communities can submit their ideas, in particular those concerning synergies in technology, physics, organisation and applications. APPEC, ECFA and NuPECC will discuss and propose actions in advance of the next JENAS event in 2021.

The post European astroparticle, nuclear and particle physicists join forces appeared first on CERN Courier.

]]>
Meeting report The chairs of APPEC, NuPECC and ECFA call for novel expressions of interest to tackle common challenges. https://cerncourier.com/wp-content/uploads/2019/11/JENAS-191.jpg
Ghent event surveys future of the field https://cerncourier.com/a/ghent-event-surveys-future-of-the-field/ Wed, 11 Sep 2019 15:11:23 +0000 https://preview-courier.web.cern.ch/?p=84336 High-energy physics had one eye trained on the future at EPS-HEP 2019.

The post Ghent event surveys future of the field appeared first on CERN Courier.

]]>
EPS-HEP participants

Almost 750 high-energy physicists met from 10–17 July in Ghent, Belgium, for the 2019 edition of EPS-HEP. The full scope of the field was put under a microscope by more than 500 parallel and plenary talks and a vibrant poster session. The ongoing update of the European Strategy for Particle Physics (ESPP) was a strong focus, and the conference began with a session jointly organised by the European Committee for Future Accelerators to seek further input from the community ahead of the publication of the ESPP briefing book in September.

The accepted view, explained ESPP secretary Halina Abramowicz, is that an electron–positron collider should succeed the Large Hadron Collider (LHC). The question is whether to build a linear collider that is extendable to higher energies, or a circular collider whose infrastructure could later be reused for a hadron collider. DESY’s Christophe Grojean weighed up the merits of a Large Electron Positron collider (LEP)-style Z-pole run at a high-luminosity circular machine – a “tera-Z factory” – against the advantages of the polarised beams proposed at linear facilities, and questioned the value of polarisation to measurements of the Higgs boson at energies above 250 GeV. Furthermore, he said, sensitivities should be evaluated in light of the expected performance of the high-luminosity LHC (HL-LHC).

Blue skies required

Presentations on accelerator and detector challenges emphasised the importance of sharing development between competing projects: while detector technology for an electron–positron machine could begin production within about five years, proposed hadron colliders require a technological leap in both radiation hardness and readout speed. CERN’s Ariella Cattai expressed concern about excessive utilitarianism in detector development, with only 5% of R&D being blue-sky despite the historical success of this approach in developing TPCs, RICH detectors and silicon-strip detectors, among others. She also pointed out that although 80% of R&D specialists believe their work has potential social outcomes, less than a third feel adequately supported to engage in technology transfer. Delegates agreed on the need for more recognition for those who undertake this crucial work. CERN’s Graeme Stewart highlighted the similar plight of theorists developing event generators, whose work is often not adequately rewarded or supported. The field also needs to keep pace with computing developments outside the field, he said, by designing data models and code that are optimised for graphics-processing units rather than CPUs (central-processing units).

The accepted view is that an electron–positron collider should succeed the LHC

The beginning of the main EPS conference was dominated by impressive new results from ATLAS and CMS, as they begin to probe Higgs couplings to second-generation fermions, and as the experiments continue to search for new phenomena and rare processes. Several speakers noted that the LHC even has the potential to exceed LEP in precision electroweak physics: although the hadronic environment increases systematic uncertainties, deviations arising from beyond-Standard Model (SM) phenomena are expected to scale with the centre-of-mass energy squared. Giulia Zanderighi of the Max Planck Institute and Claude Duhr of CERN also highlighted the need to improve the precision of theoretical calculations if they are to match experimental precision by the end of the HL-LHC’s run, showcasing work to extend next-to-next-to-leading order (NNLO) calculations to two-to-three processes, and the latest moves to N3LO calculations.
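
The energy-squared scaling invoked above is the standard effective-field-theory argument: if heavy new physics at a scale Λ is parameterised by dimension-six operators, its leading relative effect on an observable measured at energy E grows as

\[ \frac{\delta\mathcal{O}}{\mathcal{O}} \;\sim\; c\,\frac{E^2}{\Lambda^2}, \]

with c a dimensionless Wilson coefficient, so the LHC’s higher centre-of-mass energy can partially compensate for the larger systematic uncertainties of the hadronic environment compared with LEP.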

The flavour-physics scene was updated with new SM-consistent constraints from Belle on the ratios R(D) and R(D*), somewhat lessening the suggestion of lepton-universality violation in B-meson decays. With the advent of Belle II, and the impending analysis of LHCb’s full Run 2 dataset, the flavour anomalies will surely soon be confirmed or resolved. LHCb also presented new measurements of the gamma angle of the unitarity triangle, which show a mild 2σ tension between the values obtained from B+ and Bs0 decays. Meanwhile, long-baseline neutrino-oscillation experiments provided tantalising information on leptonic CP violation, with T2K data excluding CP conservation at 2σ irrespective of the neutrino mass hierarchy, and NOvA disfavouring an inverted hierarchy of neutrino mass eigenstates at 1.9σ.

Background checks

A refrain common to both collider and non-collider searches for dark-matter candidates was the need to eliminate backgrounds. A succession of talks scaled the 90 orders of magnitude in mass that dark-matter candidates might occupy. CERN’s Kfir Blum explained that “The problem with gravity is that it doesn’t matter if you’re a neutrino or a rhinoceros – if you sit on a geodesic you’re going to move in the same way,” making it difficult to infer the nature of dark matter with cosmological arguments. Nevertheless, he reported work on the recent black-hole image from the Event Horizon Telescope, which excludes some models of ultra-light dark matter. Above this, helioscopes such as CAST continue to encroach on the parameter space of QCD axions, while more novel haloscopes cut thin swathes down to low couplings in the 20 orders of magnitude of mass explored by searches for axion-like particles. Meanwhile, searches for WIMPs are sensitive to masses just beyond this, from 1 to 1000 GeV/c². Carlos de los Heros of Uppsala University explained that experiments such as XENON1T are pushing close to the so-called neutrino floor, and advocated for the development of directional detection methods that can distinguish solar neutrinos from WIMPs, and plunge into what is rather a neutrino “swamp”.

An exciting synergy between heavy-ion physics and gravitational waves was in evidence, with the two disparate approaches both now able to probe the equation of state of nuclear matter. Particular emphasis was placed on the need to marry the successful hydrodynamical and statistical description of ion–ion collisions with that used to describe proton–proton collisions, especially in the tricky proton–ion regime. These efforts are already bearing fruit in jet modelling. On the cosmological side, speakers reflected on the enduring success of the ΛCDM model in describing the universe with just six parameters, with François Bouchet of the Institut d’Astrophysique de Paris declaring that “the magic of the cosmic microwave background is not dead”, and explaining that Planck data have ruled out several models of inflation. Interdisciplinarity was also on display in reports on multi-messenger astronomy, with particular excitement reserved for the proposed European-led Einstein Telescope gravitational-wave observatory, which Marek Kowalski of DESY reported will most likely be built in either Italy or the Netherlands, and which will boast 10 times better sensitivity than current instruments.

This year’s EPS prize ceremony rewarded the CDF and D0 collaborations for the discovery of the top quark, and the WMAP and Planck collaborations for their outstanding contributions to astroparticle physics and cosmology. Today’s challenges are arguably even greater, and the spirit of EPS-HEP 2019 was to reject a false equivalence between physics being “new” and being beyond the SM. Participants’ hunger for the technological innovation required to answer the many remaining open questions was matched by an openness to reconsider theoretical thinking on fine tuning and naturalness, and how these principles inform the further exploration of the field.

EPS-HEP 2021 will take place in Hamburg from 21–28 July.

Radio-euphoria rebooted? https://cerncourier.com/a/radio-euphoria-rebooted/ Fri, 12 Jul 2019 09:50:22 +0000 https://preview-courier.web.cern.ch?p=83677 In this book Obodovskiy shows that radio-phobia causes far greater harm to public health and economic development than the radiation itself.


Ilya Obodovskiy’s new book is the most detailed and fundamental survey of the subject of radiation safety that I have ever read.

The author assumes that while none of his readers will ever be exposed to large doses of radiation, all of them, irrespective of gender, age, financial situation, profession and habits, will be exposed to low doses throughout their lives. Therefore, he reasons, if it is not possible to get rid of radiation in small doses, it is necessary to study its effect on humans.

Obodovskiy adopts a broad approach. Addressing the problem of the narrowing of specialisations, which, he says, leads to poor mutual understanding between the different fields of science and industry, the author uses inclusive vocabulary, simultaneously quoting different units of measurement, and collecting information from atomic, molecular and nuclear physics, and biochemistry and biology. I would first, however, like to draw attention to the rather novel section ‘Quantum laws and a living cell’.

Quite a long time after the discovery of X-rays and radioactivity, the public was overwhelmed by “X-ray-mania and radio-euphoria”. But after World War II – and particularly after the Japanese fishing vessel Fukuryū-Maru was exposed to radioactive fallout from a thermonuclear test at Bikini Atoll – public fear took hold. The resulting radio-phobia determined today’s commonly negative attitudes towards radiation, radiation technologies and nuclear energy. In this book Obodovskiy shows that radio-phobia causes far greater harm to public health and economic development than the radiation itself.

The risks of ionising radiation can only be clarified experimentally. The author is quite right when he declares that medical experiments on human beings are ethically unacceptable. Nevertheless, large groups of people have received small doses. An analysis of the effect of radiation on these groups can offer basic information, and the author asserts that in most cases the results show that low-dose irradiation does not affect human health.

It is understandable that the greater part of the book, as for any textbook, is a kind of compilation; however, it does discuss several quite original issues. Here I will point out just one. To my knowledge, Obodovskiy is the first to draw attention to the fact that deep in the seas, oceans and lakes, the radiation background is two to four orders of magnitude lower than elsewhere on Earth. The author posits that one of the reasons for the substantially higher complexity and diversity of living organisms on land could be the higher levels of ionising radiation.

In the last chapter the author gives a detailed comparison of the various sources of danger that threaten people, such as accidents on transport, smoking, alcohol, drugs, fires, chemicals, terror and medical errors. Obodovskiy shows that the direct danger to human health from all nuclear applications in industry, power production, medicine and research is significantly lower than health hazards from every non-nuclear source of danger.

Kilogram joins the ranks of reproducible units https://cerncourier.com/a/kilogram-joins-the-ranks-of-reproducible-units/ Wed, 10 Jul 2019 13:58:18 +0000 https://preview-courier.web.cern.ch?p=83588 Fixing the elementary charge makes the vacuum magnetic permeability an unfixed parameter to be measured experimentally.


On 20 May, 144 years after the signing of the Metre Convention in 1875, the kilogram was given a new definition based on Planck’s constant, h. Long tied to the International Prototype of the Kilogram (IPK) – a platinum–iridium cylinder kept in Paris – the kilogram was the last SI base unit still defined by a human-made artefact rather than by fundamental constants or atomic properties.

The dimensions of h are m² kg s⁻¹. Since the second and the metre are defined in terms of a hyperfine transition in caesium-133 and the speed of light, knowledge of h allows the kilogram to be set without reference to the IPK.
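To make the logic explicit (a schematic statement of the defining relation, not the formal SI wording), fixing the numerical value of h ties the kilogram to units that are already realised via the caesium frequency and the speed of light:

```latex
% Planck's constant has dimensions of kg m^2 s^-1.  With its numerical
% value fixed exactly, and the metre and second already defined, the
% kilogram follows:
\[
  h = 6.626\,070\,15 \times 10^{-34}\ \mathrm{kg\,m^{2}\,s^{-1}}
  \quad\Longrightarrow\quad
  1\ \mathrm{kg} = \frac{h}{6.626\,070\,15 \times 10^{-34}\ \mathrm{m^{2}\,s^{-1}}}.
\]
```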

Measuring h to a suitably high precision of 10 parts per billion required decades of work by international teams across continents. In 1975 British physicist Bryan Kibble proposed a device, then known as a watt balance and now renamed the Kibble balance in his honour, which linked h to the unit of mass. A coil is placed inside a precisely calibrated magnetic field and a current is driven through it such that the electromagnetic force on the coil counterbalances the force of gravity. The experiment is then repeated thousands of times over a period of months in multiple locations. The precision required is such that the strength of the gravitational field, which varies across the laboratory, must be measured before each trial.
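In outline (a simplified sketch that omits the experimental subtleties described above), the balance is operated in two modes: a weighing mode, in which the electromagnetic force on the current-carrying coil balances the weight, and a moving mode, in which the same coil is driven through the field at a measured velocity and the induced voltage is recorded. The unknown field–geometry factor BL then cancels:

```latex
% Weighing mode: electromagnetic force balances gravity,  m g = B L I
% Moving mode:   coil moved at velocity v induces voltage, U = B L v
% Eliminating BL gives the mass in terms of electrical quantities,
% which are realised via the Josephson and quantum-Hall effects and
% are hence traceable to h:
\[
  m g = B L I, \qquad U = B L v
  \quad\Longrightarrow\quad
  m = \frac{U I}{g\,v}.
\]
```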

Once the required precision was achieved, the value of h could be fixed and the definitions inverted, removing the kilogram’s dependence on the IPK. Following several years of deliberations, the new definition was formally adopted at the 26th General Conference on Weights and Measures in November last year. The 2019 redefinition of the SI base units came into force in May, and also sees the ampere, kelvin and mole redefined by fixing the numerical values for the elementary electric charge, the Boltzmann constant and the Avogadro constant, respectively.

“The revised SI future-proofs our measurement system so that we are ready for all future technological and scientific advances such as 5G networks, quantum technologies and other innovations that we are yet to imagine,” says Richard Brown, head of metrology at the UK’s National Physical Laboratory.

But the SI changes are controversial in some quarters. While heralding the new definition of the kilogram as “huge progress”, CNRS research director Pierre Fayet warns of possible pitfalls of fixing the value of the elementary charge: the vacuum magnetic permeability (μ0) then becomes an unfixed parameter to be measured experimentally, with the electrical units becoming dependent on the fine-structure constant. “It appears to me as a conceptual weakness of the new definitions of electrical units, even if it does not have consequences for their practical use,” says Fayet.
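The dependence Fayet highlights can be written down directly: with e and h now fixed exactly, the vacuum magnetic permeability inherits the experimental uncertainty of the fine-structure constant α,

```latex
% With e and h exact in the revised SI, mu_0 follows from alpha:
\[
  \alpha = \frac{e^{2}}{4\pi\varepsilon_{0}\hbar c}
  \quad\Longrightarrow\quad
  \mu_{0} = \frac{2\alpha h}{e^{2} c},
  \qquad \varepsilon_{0} = \frac{1}{\mu_{0} c^{2}},
\]
```

so that μ0 is no longer exactly 4π × 10⁻⁷ N A⁻², but is known only to the (admittedly tiny) relative uncertainty of α.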

One way out of this, he suggests, is to embed the new SI system within a larger framework in which c = ħ = μ0 = ε0 = 1, thereby fixing the vacuum magnetic permeability and other characteristics of the vacuum (C. R. Physique 20 33). This would allow all the units to be expressed in terms of the second, with the metre and the joule identified as fixed numbers of seconds and reciprocal seconds, respectively. While likely attractive to high-energy physicists, however, Fayet accepts that it may be some time before such a proposal could be accepted.

Building scientific resilience https://cerncourier.com/a/building-scientific-resilience/ Wed, 08 May 2019 13:02:09 +0000 https://preview-courier.web.cern.ch?p=83108 New political landscapes make international organisations in science more vital than ever, argues John Womersley.


Brest-Litovsk, Utrecht, Westphalia… at first sight, intergovernmental treaties belong more to the world of Bismarck and Napoleon than that of modern science. Yet, in March this year we celebrated the signing of a new treaty establishing the world’s largest radio telescope, the Square Kilometre Array (SKA). Why use a tool of 19th-century great-power politics to organise a 21st century big-science project?

Large-science projects like SKA require multi-billion budgets and decades-long commitment. Their resources must come from many countries, and they need mutual assurance for all contributors that none will renege. The board for SKA, of which I was formerly chair, rapidly concluded that only an intergovernmental organisation could give the necessary stability. It is a very European approach, born of our need to bring together many smaller countries. But it is flexible and resilient.

Of course there are other ways to do this. A European Research Infrastructure Consortium (ERIC) is a lighter-weight, faster way to set up an intergovernmental research organisation and is the model that we have used for the European Spallation Source (ESS) in Sweden. The ERIC is part of European Union (EU) legislation and provides many of the benefits in VAT and purchasing rules that an international convention or treaty would, without a convoluted approval process. Once the UK (one of the 13 ESS member nations) withdraws from the EU, it will need legislation to recognise the status of ERICs, just as non-EU Switzerland and Norway have done.

Research facilities can also be run by organisations without any intergovernmental authority: charities, not-for-profit companies or university consortia. This may seem quick and agile, but it is risky. For example, the large US telescope projects TMT and GMT are university-led and have been able to get started, but it seems that US federal involvement will now be essential for their success.

In fact, US participation in international organisations is often an issue because it requires senate approval. The last time this happened for a science project was the ITER fusion experiment, which today is making good progress but had a rocky start. The EU is one of ITER’s seven member entities and its involvement is facilitated via EUROfusion – one of eight European intergovernmental research organisations that are members of EIROforum. Most were established decades ago, and their stable structure has helped them invest in major new facilities such as ESO’s European Extremely Large Telescope.

So international treaty-based science organisations are great for delivering big-science projects, while also promoting understanding between the science communities of different countries. In the aftermath of the Second World War that was really important, and was a founding motivation for CERN. More recently, the SESAME light source in Jordan adopted the CERN model to bring the Middle East’s scientific communities together.

Today the world faces new political challenges, and international treaties don’t do much to address the growing gap between angry, disenfranchised voters and an educated, internationally minded “elite”. We scientists often see nationalism as the problem, but the issue is more one of populism – and by being international we merely seem remote. We are used to speaking about outreach, but we also need to think seriously about “in-reach” within our own countries and regions, to engage better with groups such as Trump voters and Brexit supporters.

There’s also the risk that too much stability can become rigidity. Organisations like SKA or ESS aim to provide room for negotiation and for substantial contributions to be made in kind. They are free of commitments such as pension schemes and, in the case of SKA, membership levels are tied to the size of a country’s astronomy community and not to GDP. Were a future global project like a Future Circular Collider to be hosted at CERN, a purpose-built intergovernmental agreement would surely be the best way to manage it. CERN is the archetype of intergovernmental organisations in science, and offers great stability in the face of political upheavals such as Brexit. Its challenge today is to think outside the box.

The same applies to all big projects in physics today. Our future prosperity and ability to address major challenges depend on investments in large, cutting-edge research infrastructures. Intergovernmental organisations provide the framework for those investments to flourish.

Neutrino connoisseurs talk stats at CERN https://cerncourier.com/a/neutrino-connoisseurs-talk-stats-at-cern/ Wed, 08 May 2019 09:33:12 +0000 https://preview-courier.web.cern.ch?p=83053 PHYSTAT-nu 2019 was held at CERN from 22 to 25 January.


PHYSTAT-nu 2019 was held at CERN from 22 to 25 January. Counted among the 130 participants were LHC physicists and professional statisticians as well as neutrino physicists from across the globe. The inaugural meeting took place at CERN in 2000 and PHYSTAT has gone from strength to strength since, with meetings devoted to specific topics in data analysis in particle physics. The latest PHYSTAT-nu event is the third of the series to focus on statistical issues in neutrino experiments. The workshop focused on the statistical tools used in data analyses, rather than experimental details and results.

Modern neutrino physics is geared towards understanding the nature and mixing of the three neutrinos’ mass and flavour eigenstates. This mixing can be inferred by observing “oscillations” between flavours as neutrinos travel through space. Neutrino experiments come in many different types and scales, but they tend to have one calculation in common: whether the neutrinos are created in an accelerator, a nuclear reactor, or by any number of astrophysical sources, the number of events expected in the detector is the product of the neutrino flux and the interaction cross section. Given the ghostly nature of the neutrino, this calculation presents subtle statistical challenges. To cancel common systematics, many facilities have two or more detectors at different distances from the neutrino source. However, as was shown for the NOvA and T2K experiments, which are competing to observe CP violation using accelerator-neutrino beams, it is difficult to correlate the neutrino yields in the near and far detectors. A full cancellation of the systematic uncertainties is complicated by the different detector acceptances, possible variations in the detector technologies, and the compositions of different neutrino interaction modes. In the coming years these two experiments plan to combine their data in a global analysis to increase their discovery power – lessons can be learnt from the LHC experience.
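Schematically (suppressing detector-response, nuclear-model and flux-shape details), the calculation shared by these experiments convolves flux, oscillation probability and cross section over energy; in the two-flavour approximation the oscillation probability takes the familiar form:

```latex
% Expected number of signal events in a (far) detector:
\[
  N_{\mathrm{events}} = N_{\mathrm{targets}} \int \mathrm{d}E\,
  \Phi_{\nu}(E)\, P_{\nu_{\mu}\to\nu_{e}}(E)\, \sigma(E)\, \epsilon(E),
\]
% Two-flavour oscillation probability, with L in km, E in GeV and
% Delta m^2 in eV^2:
\[
  P_{\nu_{\mu}\to\nu_{e}}(E) \simeq \sin^{2}2\theta\,
  \sin^{2}\!\left(\frac{1.27\,\Delta m^{2} L}{E}\right).
\]
```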

The problem of modelling the interactions of neutrinos with nuclei – essentially the problem of calculating the cross section in the detector – forces researchers to face the thorny statistical challenge of producing distributions that are unadulterated by detector effects. Such “unfolding” corrects kinematic observables for the effects of detector acceptance and smearing, but correcting for these effects can inflate the uncertainties enormously. To counter this, strong “regularisation” is often applied, biasing the results towards the smooth spectra of Monte Carlo simulations. The desirability of publishing unregularised results as well as unfolded measurements was agreed by PHYSTAT-nu attendees. “Response matrices” may also be released, allowing physicists outside an experimental collaboration to smear their own models and compare them to detector-level data. Another major issue in modelling neutrino–nuclear interactions is the “unknown unknowns”. As Kevin McFarland of the University of Rochester reflected in his summary talk, it is important not to estimate your uncertainty by a survey of theory models. “It’s like trying to measure the width of a valley from the variance of the position of sheep grazing on it. That has an obvious failure mode: sheep read each other’s papers.”
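A toy numerical sketch (purely illustrative, not any experiment’s actual procedure) shows why naive unfolding by matrix inversion amplifies statistical fluctuations, and how a simple Tikhonov-style penalty biases the result towards smooth spectra, as described above:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Toy "true" spectrum in 10 bins and a smearing (response) matrix
# that migrates events between neighbouring bins.
n_bins = 10
true = 1000.0 * np.exp(-0.3 * np.arange(n_bins))
response = np.zeros((n_bins, n_bins))
for i in range(n_bins):
    for j in range(n_bins):
        response[i, j] = np.exp(-0.5 * (i - j) ** 2 / 1.2)
response /= response.sum(axis=0)          # each true bin conserves events

# Smear the truth and add Poisson fluctuations: the "detector-level" data.
measured = rng.poisson(response @ true).astype(float)

# Naive unfolding: invert the response matrix.  Statistically noisy.
unfolded_naive = np.linalg.solve(response, measured)

# Tikhonov-regularised unfolding: penalise large second differences,
# i.e. bias the result towards smooth spectra.
tau = 1.0
curvature = np.diff(np.eye(n_bins), n=2, axis=0)   # second-difference operator
lhs = response.T @ response + tau * curvature.T @ curvature
rhs = response.T @ measured
unfolded_reg = np.linalg.solve(lhs, rhs)

for label, spec in [("true", true), ("naive", unfolded_naive), ("regularised", unfolded_reg)]:
    print(f"{label:12s}", np.round(spec, 1))
```

Running the script shows the naively unfolded spectrum scattering bin-to-bin around the truth, while the regularised one is smoother but pulled towards the regularisation assumption – precisely the trade-off behind the recommendation to publish unregularised results and response matrices as well.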

An important step for current and future neutrino experiments could be to set up a statistics committee, as at the Tevatron, and, more recently, the LHC experiments. This PHYSTAT-nu workshop could be the first real step towards this exciting scenario.

The next PHYSTAT workshop will be held at Stockholm University from 31 July to 2 August on the subject of statistical issues in direct-detection dark-matter experiments.

SESAME synchrotron goes all-solar https://cerncourier.com/a/sesame-synchrotron-goes-all-solar/ Tue, 07 May 2019 15:18:23 +0000 https://preview-courier.web.cern.ch?p=82981 SESAME is now the world’s first major research infrastructure to be fully powered by renewable energy.


On 26 February, a new solar power plant powering the SESAME light source in Jordan was officially inaugurated. In addition to being the first synchrotron-light facility in the Middle East region, SESAME is now the world’s first major research infrastructure to be fully powered by renewable energy.

Electricity from the solar power plant will be supplied by an on-grid photovoltaic system constructed 30 km away, and its 6.48 MW power capacity is ample to satisfy SESAME’s needs for several years. “As in the case of all accelerators, SESAME is in dire need of energy, and as the number of its users increases so will its electricity bill,” says SESAME director Khaled Toukan. “Given the very high cost of electricity in Jordan, with this solar power plant the centre becomes sustainable.”

Energy efficiency and other environmental factors are coming under growing scrutiny at large research infrastructures worldwide. The necessary funding for the SESAME installation became available in late 2016, when the Government of Jordan agreed to allocate JD 5 million (US$7.05 million) from funds provided by the European Union (EU) to support the deployment of clean energy sources. The power plant, which uses monocrystalline solar panels, was built by the Jordanian company Kawar Energy, and the power transmitted to the grid will be credited to SESAME.

SESAME opened its beamlines to users in July 2018. Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, Palestine and Turkey are currently members of SESAME, with 16 further countries – plus CERN and the EU – listed as observers.

Assessing CERN’s impact on careers https://cerncourier.com/a/assessing-cerns-impact-on-careers/ Mon, 11 Mar 2019 16:33:36 +0000 https://preview-courier.web.cern.ch?p=13543 Results from a new survey show the impact of working at CERN on an individual’s career.


Since the advent of the Large Hadron Collider (LHC), CERN has been recognised as the world’s leading laboratory for experimental particle physics. More than 10,000 people work at CERN on a daily basis. The majority are members of universities and other institutions worldwide, and many are young students and postdocs. The experience of working at CERN therefore plays an important role in their careers, be it in high-energy physics or a different domain.

The value of education

In 2016 the CERN management appointed a study group to collect information about the careers of students who have completed their thesis studies in one of the four LHC experiments. Similar studies were carried out in the past, also including people working on the former LEP experiments, and were mainly based on questionnaires sent to the team leaders of the various collaborating institutes. The latest study collected a larger and more complete sample of up-to-date information from all the experiments, with the aim of addressing young physicists who have left the field. This allows a quantitative measurement of the value of the education and skills acquired at CERN in finding jobs in other domains, which is of prime importance to evaluate the impact and role of CERN’s culture.

Following an initial online questionnaire with 282 respondents, the results were presented to the CERN Council in December 2016. The experience demonstrated the potential to collect information from a wider population and to deepen and customise the questions. Consequently, it was decided to enlarge the study to all persons who have been or are still involved with CERN, without any particular restrictions. Two distinct communities were polled with separate questionnaires: past and current CERN users (mainly experimentalists at any stage of their career), and theorists who had collaborated with the CERN theory department. The questionnaires were open for a period of about four months and attracted 2692 and 167 participants from the experimental and theoretical communities, respectively. A total of 84 nationalities were represented, with German, Italian and US nationals making up around half, and the distribution of participants by experiment was: ATLAS (994); CMS (977); LHCb (268); ALICE (102); and “other” (87), which mainly included members of the NA62 collaboration.

The questionnaires addressed various professional and sociological aspects: age, nationality, education, domicile and working place, time spent at CERN, acquired expertise, current position, and satisfaction with the CERN environment. Additional points were specific to those who are no longer CERN users, in relation to their current situation and type of activity. The analysis revealed some interesting trends.

For experimentalists, the CERN environment and working experience is considered satisfactory or very satisfactory by 82% of participants, evenly distributed across nationalities. In 70% of cases, people who left high-energy physics did so mainly because of the long and uncertain path to a permanent position. Other reasons for leaving the field, although quoted by a lower percentage of participants, were: interest in other domains; lack of satisfaction at work; and family reasons. The majority of participants (63%) who left high-energy physics are currently working in the private sector, often in information technology, advanced technologies and finance, where they occupy a wide range of positions and responsibilities. Those in the public sector are mainly involved in academia or education.

For persons who left the field, several skills developed during their experience at CERN are considered important in their current work. The overall satisfaction of participants with their current position was high or very high for 78% of respondents, while 70% of respondents considered CERN’s impact on finding a job outside high-energy physics as positive or very positive. CERN’s services and networks, however, are not found to be very effective in helping finding a new job – a situation that is being addressed, for example, by the recently launched CERN alumni programme.

Theorists participating in the second questionnaire mainly have permanent or tenure-track positions. A large majority of them spent time at CERN’s theory department with short- or medium-term contracts, and this experience seems to improve participants’ careers when leaving CERN for a national institution. On average, about 35% of a theorist’s scientific publications originate from collaborations started at CERN, and a large fraction of theorists (96%) declared that they are satisfied or highly satisfied with their experience at CERN.

Conclusions

As with all such surveys, there is an inherent risk of bias due to the formulation of the questions and the number and type of participants. In practice, only between 20 and 30% of the targeted populations responded, depending on the addressed community, which means the results of the poll cannot be considered as representative of the whole CERN population. Nevertheless, it is clear that the impact of CERN on people’s careers is considered by a large majority of the people polled to be mostly positive, with some areas for improvement such as training and supporting the careers of those who choose to leave CERN and high-energy physics.

In the future this study could be made more significant by collecting similar information on larger samples of people, especially former CERN users. In this respect, the CERN alumni programme could help build a continuously updated database of current and former CERN users and also provide more support for people who decide to leave high-energy physics.

The final results of the survey, mostly in terms of statistical plots, together with a detailed description of the methods used to collect and analyse all the data, have been documented in a CERN Yellow Report, and will also be made available through a dedicated web page.

Inspired by software https://cerncourier.com/a/inspired-by-software/ Mon, 11 Mar 2019 15:30:28 +0000 https://preview-courier.web.cern.ch?p=13510 Being a scientist in the digital age means being a software producer and a software consumer.

High-energy code

Of all the movements to make science and technology more open, the oldest is “open source” software. It was here that the “open” ideals were first articulated, and it is from this movement that later efforts such as open-access publishing derive. While it rightly stands on this pedestal, from another point of view open-source software was simply the natural extension of academic freedom and knowledge-sharing into the digital age.

Open-source has its roots in the free software movement, which grew in the 1980s in response to monopolising corporations and restrictions on proprietary software. The underlying ideal is open collaboration: peers freely, collectively and publicly build software solutions. A second ideal is recognition, in which credit for the contributions made by individuals and organisations worldwide is openly acknowledged. A third ideal concerns rights, specifically the so-called four freedoms granted to users: to use the software for any purpose; to study the source code to understand how it works; to share and redistribute the software; and to improve the software and share the improvements with the community. Users and developers therefore contribute to a virtuous circle in which software is continuously improved and shared towards a common good, minimising vendor lock-in for users.

Today, 20 years after the term “open source” was coined, and despite initial resistance from traditional software companies, many successful open-source business models exist. These mainly involve consultancy and support services for software released under an open-source licence and extend beyond science to suppliers of everyday tools such as the WordPress platform, Firefox browser and the Android operating system. A more recent and unfortunate business model adopted by some companies is “open core”, whereby essential features are deemed premium and sold as proprietary software on top of existing open-source components.

Founding principles

Open collaboration is one of CERN’s founding principles, so it was natural to extend the principle into its software. The web’s invention brought this into sharp focus. Having experienced first-hand its potential to connect physicists around the globe, in 1993 CERN released the web software into the public domain so that developers could collaborate and improve on it (see CERN’s ultimate act of openness). The following year, CERN released the next web-server version under an open-source licence with the explicit goal of preventing private companies from turning it into proprietary software. These were crucial steps in nurturing the universal adoption of the web as a way to share digital information, and exemplars of CERN’s best practice in open-source software.

Nowadays, open-source software can be found in pretty much every corner of CERN, as in other sciences and industry. Indico and Invenio – two of the largest open-source projects developed at CERN to promote open collaboration – rely on the open-source framework Python Flask. Experimental data are stored in CERN’s Exascale Open Storage system, and most of the servers in the CERN computing centre are running on Openstack – an open-source cloud infrastructure to which CERN is an active contributor. Of course, CERN also relies heavily on open-source GNU/Linux as both a server and desktop operating system. On the accelerator and physics analysis side, it’s all about open source. From C2MON, a system at the heart of accelerator monitoring and data acquisition, to ROOT, the main data-analysis framework used to analyse experimental data, the vast majority of the software components behind the science done at CERN are released under an open-source licence.

Open hardware

The success of the open-source model for software has inspired CERN engineers to create an analogous “open hardware” licence, enabling electronics designers to collaborate and use, study, share and improve the designs of hardware components used for physics experiments. This approach has become popular in many sciences, and has become a lifeline for teaching and research in developing countries.

Being a scientist in the digital age means being a software producer and a software consumer. As a result, collaborative software-development platforms such as GitHub and GitLab have become as important to the physics department as they are to the IT department. Until recently, however, the software underlying an analysis could not easily be shared. CERN has therefore been developing research data-management tools to enable the publication of software and data, forming the basis of an open-data portal (see Preserving the legacy of particle physics). Naturally, this software itself is open source and has also been used to create the worldwide open-data service Zenodo, which is connected to GitHub to make the publication of open-source software a standard part of the research cycle.

Interestingly, as with the early days of open source, many corners of the scientific community are hesitant about open science. Some people are concerned that their software and data are not of sufficient quality or interest to be shared, or that they will be helping others to the next discovery before them. To triumph over the sceptics, open science can learn from the open-source movement, adopting standard licences, credit systems, collaborative development techniques and shared governance. In this way, it too will be able to reap the benefits of open collaboration: transparency, efficiency, perpetuity and flexibility. 

Open science: A vision for collaborative, reproducible and reusable research https://cerncourier.com/a/open-science-a-vision-for-collaborative-reproducible-and-reusable-research/ Mon, 11 Mar 2019 15:15:40 +0000 https://preview-courier.web.cern.ch?p=13500 True open science demands more than simply making data available.


The goal of practising science in such a way that others can collaborate, question and contribute – known as “open science” – long predates the web. One could even argue that it began with the first academic journal 350 years ago, which enabled scientists to share knowledge and resources to foster progress. But the web offered opportunities way beyond anything before it, quickly transforming academic publishing and giving rise to greater sharing in areas such as software. Alongside the open-source (Inspired by software), open-access (A turning point for open-access publishing) and open-data (Preserving the legacy of particle physics) movements grew the era of open science, which aims to encompass the scientific process as a whole.

Today, numerous research communities, political circles and funding bodies view open science and reproducible research as vital to accelerate future discoveries. Yet, to fully reap the benefits of open and reproducible research, it is necessary to start implementing tools to power a more profound change in the way we conduct and perceive research. This poses both sociological and technological challenges, starting from the conceptualisation of research projects, through conducting research, to how we ensure peer review and assess the results of projects and grants. New technologies have brought open science within our reach, and it is now up to scientific communities to agree on the extent to which they want to embrace this vision.

Particle physicists were among the first to embrace the open-science movement, sharing preprints and building a deep culture of using and sharing open-source software. The cost and complexity of experimental particle physics, which make complete replication of measurements unfeasible, present unique challenges in terms of open data and scientific reproducibility. It may even be considered that openness itself, in the sense of having unfettered access to data from its inception, is not particularly advantageous.

Take the existing data-management policies of the LHC collaborations: while physicists generally strive to be open in their research, the complexity of the data and analysis procedures means that data become publicly open only after a certain embargo period that is used to assess its correctness. The science is thus born “closed”. Instead of thinking about “open data” from its inception, it is more useful to speak about FAIR (findable, accessible, interoperable and reusable) data, a term coined by the FORCE11 community. The data should be FAIR throughout the scientific process, from being initially closed to being made meaningfully open later to those outside the experimental collaborations.

True open science demands more than simply making data available: it needs to concern itself with providing information on how to repeat or verify an analysis performed over given datasets, producing results that can be reused by others for comparison, confirmation or simply for deeper understanding and inspiration. This requires runnable examples of how the research was performed, accompanied by software, documentation, runnable scripts, notebooks, workflows and compute environments. It is often too late to try to document research in such detail once it has been published.

True open science demands more than simply making data available

FAIR data repositories for particle physics, the “closed” CERN Analysis Preservation portal and the “open” CERN Open Data portal emerged five years ago to address the community’s open-science needs. These digital repositories enable physicists to preserve, document, organise and share datasets, code and tools used during analyses. A flexible metadata structure helps researchers to define everything from experimental configurations to data samples, from analysis code to software libraries and environments used to analyse the data, accompanied by documentation and links to presentations and publications. The result is a standard way to describe and document an analysis for the purposes of discoverability and reproducibility.

Recent advancements in the IT industry allow us to encapsulate the compute environments where the analysis was conducted. Capturing information about how the analysis was carried out can be achieved via a set of runnable scripts, notebooks, structured workflows and “containerised” pipelines. Complementary to data repositories, a third service named REANA (reusable analyses) allows researchers to submit parameterised computational workflows to run on remote compute clouds. It can be used to reinterpret preserved analyses but also to run “active” analyses before they are published and preserved, with the underlying philosophy that physics analyses should be automated from inception so that they can be executed without manual intervention. Future reuse and reinterpretation starts with the first commit of the analysis code; altering an already-finished analysis to facilitate its eventual reuse after publication is often too late.
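As a minimal illustration of that philosophy (a generic sketch, not REANA’s actual specification or interface), an analysis written as a fully parameterised script can be re-run end-to-end – with different cuts, inputs or random seeds – without any manual intervention:

```python
import argparse
import json

import numpy as np


def run_analysis(n_events: int, cut: float, seed: int) -> dict:
    """Toy analysis: generate pseudo-data and count events passing a cut.

    Every input that could change the result is an explicit parameter,
    so the whole analysis can be repeated or reinterpreted later.
    """
    rng = np.random.default_rng(seed)
    pt = rng.exponential(scale=30.0, size=n_events)   # toy "transverse momenta"
    selected = pt > cut
    return {
        "n_events": n_events,
        "cut_GeV": cut,
        "seed": seed,
        "n_selected": int(selected.sum()),
        "efficiency": float(selected.mean()),
    }


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Toy parameterised analysis")
    parser.add_argument("--n-events", type=int, default=100_000)
    parser.add_argument("--cut", type=float, default=25.0)
    parser.add_argument("--seed", type=int, default=42)
    parser.add_argument("--output", default="results.json")
    args = parser.parse_args()

    results = run_analysis(args.n_events, args.cut, args.seed)
    with open(args.output, "w") as f:
        json.dump(results, f, indent=2)
    print(results)
```

Because every quantity that could change the result is an explicit, recorded parameter, the analysis can be reproduced, rerun on a compute cloud or reinterpreted long after publication.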

Full control

The key guiding principle of the analysis preservation and reuse framework is to leave the decision as to when a dataset or a complete analysis is shared, privately or publicly, in the hands of the researchers. This gives the experiment collaborations full control over the release procedures, and thus fully supports internal processing and review protocols before the results are published on community services, such as arXiv, HEPData and INSPIRE.

The CERN Open Data portal was launched in 2014 amid a discussion as to whether primary particle-physics data would find any use outside of the LHC collaborations. Within a few years, the first paper based on open data from the CMS experiment was published (see Preserving the legacy of particle physics).

Three decades after the web was born, science is being shared more openly than ever and particle physics is at the forefront of this movement. As we have seen, however, simple compliance with data and software openness is not enough: we also need to capture, from the start of the research process, runnable recipes, software environments, computational workflows and notebooks. The increasing demand from funding agencies and policymakers for open data-management plans, coupled with technological progress in information technology, leads us to believe that the time is ripe for this change.

Sharing research in an easily reproducible and reusable manner will facilitate knowledge transfer within and between research teams, accelerating the scientific process. This fills us with hope that three decades from now, even if future generations may not be able to run our current code on their futuristic hardware platforms, they will be at least well equipped to understand the processes behind today’s published research in sufficient detail to be able to check our results and potentially reveal something new.

A turning point for open-access publishing https://cerncourier.com/a/a-turning-point-for-open-access-publishing/ Mon, 11 Mar 2019 15:08:57 +0000 https://preview-courier.web.cern.ch?p=13495 The European Commission is embarking on an ambitious project called Plan S to make all scientific publications open access from 2020.

High-energy physics (HEP) has been at the forefront of open-access publishing, the long-sought ideal to make scientific literature freely available. An early precursor to the open-access movement in the late 1960s was the database management system SPIRES (Stanford Physics Information Retrieval System), which aggregated all available (paper-copy) preprints that were sent between different institutions. SPIRES grew to become the first database accessible through the web in 1991 and later evolved into INSPIRE-HEP, hosted and managed by CERN in collaboration with other research laboratories.

The electronic era

The birth of the web in 1989 changed the publishing scene irreversibly. Vast sums were invested to take the industry from paper to online and to digitise old content, resulting in a migration from the sale of printed copies of journals to electronic subscriptions. From 1991, helped by the early adoption by particle physicists, the self-archiving repository arXiv.org allowed rapid distribution of electronic preprints in physics and, later, mathematics, astronomy and other sciences. The first open-access journals then began to sprout up, and in the early 2000s three major international initiatives – the Budapest Open Access Initiative, the Bethesda Statement on Open Access Publishing and the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities – set about leveraging the new technology to grant universal free access to the results of scientific research.

Today, roughly one quarter of all scholarly literature in sciences and humanities is open access. In HEP, the figure is almost 90%. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3), a global partnership between libraries, national funding agencies and publishers of HEP journals, has played an important role in HEP’s success. Designed at CERN, SCOAP3 started operation in 2014 and removes subscription fees for journals and any expenses scientists might incur to publish their articles open access by paying publishers directly. Some 3000 institutions from 43 countries (figure 1) contribute financially according to their scientific output in the field, re-using funds previously spent on subscription fees for journals that are now open access.

“SCOAP3 has demonstrated how open access can increase the visibility of research and ease the dissemination of scientific results for the benefit of everyone,” says SCOAP3 operations manager Alex Kohls of CERN. “This initiative was made possible by a strong collaboration of the worldwide library community, researchers, as well as commercial and society publishers, and it can certainly serve as an inspiration for open access in other fields.”

Plan S

On 4 September 2018, a group of national funding agencies, the European Commission (EC) and the European Research Council – under the name “cOAlition S” – launched a radical initiative called Plan S. Its aim is to ensure that, by 2020, all scientific publications that result from research funded by public grants must be published in compliant open-access journals or platforms. Robert-Jan Smits, the EC’s open-access envoy and one of the architects of Plan S, cites SCOAP3 as an inspiration for the project and says that momentum for Plan S has been building for two decades. “During those years many declarations, such as the Budapest and Berlin ones, were adopted, calling for a rapid transition to full and immediate open access. Even the 28 science ministers of the European Union issued a joint statement in 2016 that open access to scientific publications should be a reality by 2020,” says Smits. “The current situation shows, however, that there is still a long way to go.”

Recently, China released position papers supporting the efforts of Plan S, which could mark a key moment for the project. But the reaction of scientists around the world has been mixed. An open letter published in September by biochemist Lynn Kamerlin of Uppsala University in Sweden, attracting more than 1600 signatures at the time of writing, argues that Plan S would strongly reduce the possibilities to publish in suitable scientific journals of high quality, possibly splitting the global scientific community into two separate systems. Another open letter, published in November by biologist Michael Eisen at University of California Berkeley with around 2000 signatures, backs the principles of Plan S and supports its commitment “to continue working with funders, universities, research institutions and other stakeholders until we have created a stable, fair, effective and open system of scholarly communication.”

Challenges ahead

High-energy physics is already aligned to the Plan S vision thanks to SCOAP3, says Salvatore Mele of CERN, who is one of SCOAP3’s architects. But for other disciplines “the road ahead is likely to be bumpy”. “Funders, libraries and publishers have cooperated through CERN to make SCOAP3 possible. As most of the tens of thousands of scholarly journals today operate on a different model, with access mostly limited to readers paying subscription fees, this vision implies systemic challenges for all players: funders, libraries, publishers and, crucially, the wider research community,” he says.

It is publishers who are likely to face the biggest impact from Plan S. However, the Open Access Scholarly Publishers Association (OASPA) – which includes, among others, the American Physical Society, IOP Publishing (which publishes CERN Courier) and The Royal Society – recently published a statement of support, claiming OASPA “would welcome the opportunity to provide guidance and recommendations for how the funding of open-access publications should be implemented within Plan S”, while emphasising that smaller publishers, scholarly societies and new publishing platforms need to be included in the decision-making process.

Responding to an EC request for Plan S feedback that was open until 8 February, however, publishers have expressed major concerns about the pace of implementation and about the consequences of Plan S for hybrid journals. In a statement on 12 February, the European Physical Society, while supportive of the Plan S rationale, wrote that “several of the governing principles proposed for its implementation are not conducive to a transition to open access that preserves the important assets of today’s scientific publication system”. In another statement, the world’s largest open-access publisher, Springer Nature, released a list of six recommendations for funding bodies worldwide to adopt in order for full open-access to become a reality, highlighting the differences between “geographic, funder and disciplinary needs”. In parallel, a group of learned societies in mathematics and science in Germany has reacted with a statement citing a “precipitous process” that infringes the freedom of science, and urged cOAlition S to “slow down and consider all stakeholders”.

Global growth

Smits thinks traditional publishers, which are a critical element in quality control and rigorous peer review in scholarly literature, should adopt a fresh look, for example by implementing more transparent metrics. “It is obvious that the big publishers that run the subscription journals and make enormous profits prefer to keep the current publishing model. Furthermore, the dream of each scientist is to publish in a so-called prestigious high-impact journal, which shows that the journal impact factor is still very present in the academic world,” says Smits. “To arrive at the necessary change in academic culture, new metrics need to be developed to assess scientific output. The big challenge for cOAlition S is to grow globally, by having more funders signing up.”

Undoubtedly we are at a turning point between the old and new publishing worlds. The EC already requires that all publications from projects receiving its funding be made open access. But Plan S goes further, proposing an outright shift in scholarly publication. It is therefore crucial to ensure a smooth shift that takes into account all the actors, says Mele. “Thanks to SCOAP3, which has so far supported the publication of more than 26,000 articles, the high-energy physics community is fortunate to meet the vision of Plan S, while retaining researcher choice of the most appropriate place to publish their results.” 

Standing out from the crowd https://cerncourier.com/a/standing-out-from-the-crowd/ Thu, 24 Jan 2019 09:00:05 +0000 https://preview-courier.web.cern.ch/?p=13105 Larger collaborations mean there are many more PhD students and postdocs, while the number of permanent jobs has not increased equivalently.

Big physics

Advances in particle physics are driven by well-defined innovations in accelerators, instrumentation, electronics, computing and data-analysis techniques. Yet our ability to innovate depends strongly on the talents of individuals, and on how we continue to attract and foster the best people. It is therefore vital that, within today’s ever-growing collaborations, individual researchers feel that their contributions are recognised adequately within the scientific community at large.

Looking back to the time before large accelerators, individual recognition was not an issue in our field. Take Rutherford’s revolutionary work on the nucleus or, more recently, Cowan and Reines’ discovery of the neutrino – there were perhaps a couple of people working in a lab, at most with a technician, yet acknowledgement was at a global scale. There was no need for project management; individual recognition was spot-on and instinctive.

As high-energy physics progressed, the needs of experiments grew. During the 1980s, experiments such as UA1 and UA2 at the Super Proton Synchrotron (SPS) involved institutions from around five to eight countries, setting in motion a “natural evolution” of individual recognition. From those experiments, in which mentoring in family-sized groups played a big role, emerged spontaneous leaders, some of whom went on to head experimental physics groups, departments and laboratories. Moving into the 1990s, project management and individual recognition became even more pertinent. In the experiments at the Large Electron–Positron collider (LEP), the number of physicists, engineers and technicians working together rose by an order of magnitude compared to the SPS days, with up to 30 participating institutions and 20 countries involved in a given experiment.

Today, with the LHC experiments providing an even bigger jump in scale, we must ask ourselves: are we making our immense scientific progress at the expense of individual recognition?

Group goals

Large collaborations have been very successful, and the discovery of the Higgs boson at the LHC had a big impact in our community. Today there are more than 5000 physicists from institutions in more than 40 countries working on the main LHC experiments, and this mammoth scale demands a change in the way we nurture individual recognition and careers. In scientific collaborations with a collective mission, group goals are placed above personal ambition. For example, many of us spend hundreds of hours in the pit or carry out computing and software tasks to make sure our experiments deliver the best data, even though some of this collective work isn’t always “visible”. However, there are increasing challenges nowadays, particularly for young scientists who need to navigate the difficulties of balancing their aspirations. Larger collaborations mean there are many more PhD students and postdocs, while the number of permanent jobs has not increased equivalently; hence we also need to prepare early-career researchers for a non-academic career.

To fully exploit the potential of large collaborations, we need to bring every single person to maximum effectiveness by motivating and stimulating individual recognition and career choices. With this in mind, in spring 2018 the European Committee for Future Accelerators (ECFA) established a working group to investigate what the community thinks about individual recognition in large collaborations. Following an initial survey addressing leaders of several CERN and CERN-recognised experiments, a community-wide survey closed on 26 October with a total of 1347 responses.

Community survey

Participants expressed opinions on several statements related to how they perceive systems of recognition in their collaboration. More than 80% of the participants are involved in LHC experiments, and researchers from most European countries were well represented. Just under half (44%) were permanent staff members at their institute, with the rest comprising around 300 PhD students and 440 postdocs or junior staff. Participants were asked to indicate their level of agreement with a list of statements related to individual recognition. Each answer was quantified and the score distributions were compared between groups of participants, for instance according to career position, experiment, collaboration size, country, age, gender and discipline. Some initial findings are listed below, while the full breakdown of results – comprising hundreds of plots – is available at https://ecfa.web.cern.ch.

Conferences: “The collaboration guidelines for speakers at conferences allow me to be creative and demonstrate my talents.” Overall, participants from the LHCb collaboration agree more with this statement compared to those from CMS and especially ATLAS. For younger participants this sentiment is more pronounced. Respondents affirmed that conference talks are an outstanding opportunity to demonstrate to the broader community their creativity and scientific insight, and are perceived to be one of the most important aspects of verifying the success of a scientist.

Publications: “For me it is important to be included as an author of all collaboration-wide papers.” Although the effect is less pronounced for participants from very large collaborations, they value being included as authors on collaboration-wide publications. The alphabetic listing of authors is also supported, and at all career stages. Participants had divided opinions when it came to alternatives.

Assigned responsibilities: “I perceive that profiles of positions with responsibility are well known outside the particle-physics community.” The further one moves from the collaboration, the harder it becomes to convey what the role of a convener involves, yet selection as a convener is perceived to be very important in assessing a scientist’s success in our field. The majority of the participating early-career researchers are neutral about, or disagree with, the statement that the process of selecting conveners is sufficiently transparent and accessible.

Technical contributions: “I perceive that my technical contributions get adequate recognition in the particle-physics community.” Hardware and software technical work is at the core of particle-physics experiments, yet it remains challenging to gain recognition for these contributions inside, and especially outside, the collaboration.

Scientific notes: “Scientific notes on analysis methods, detector and physics simulations, novel algorithms, software developments, etc, would be valuable for me as a new class of open publications to recognise individual contributions.” Although participants have very diverse opinions when it comes to making the internal collaboration notes public, they would value the opportunity to write down their novel and creative technical ideas in a new class of public notes.
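Methodologically, the comparisons behind these findings are simple: each agree–disagree answer is mapped to a numerical score and the resulting score distributions are compared between groups of participants. A minimal sketch of such a comparison in Python, with hypothetical file and column names (the actual ECFA analysis pipeline may well differ), could look like this:

```python
import pandas as pd

# Map Likert-style answers to scores (an assumed five-point scale).
score = {"strongly disagree": -2, "disagree": -1, "neutral": 0,
         "agree": 1, "strongly agree": 2}

# Hypothetical input: one row per respondent, one column per statement,
# plus demographic columns such as career stage and experiment.
df = pd.read_csv("survey_responses.csv")
df["conference_score"] = df["conference_guidelines"].str.lower().map(score)

# Compare the score distributions between groups of participants.
by_group = df.groupby("career_stage")["conference_score"]
print(by_group.describe())              # counts, means and spreads per group
print(by_group.mean().sort_values())    # which groups agree more on average
```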

Beyond disseminating the results of the survey, ECFA will reflect on how it can help to strengthen the recognition of individual achievements in large collaborations. The LHC experiments and other large collaborations have expressed openness to enter a dialogue on the topic, and will be invited by ECFA to join a pan-collaboration working group. This will help to relate observations from the survey to current practices in the collaborations, with the aim of keeping particle physics fit and healthy towards the next generation of experiments.

The post Standing out from the crowd appeared first on CERN Courier.

]]>
Careers Larger collaborations mean there are many more PhD students and postdocs, while the number of permanent jobs has not increased equivalently. https://cerncourier.com/wp-content/uploads/2019/01/CCJanFeb19_Careers-big.jpg
Theory event fuses physics and gender https://cerncourier.com/a/theory-event-fuses-physics-and-gender/ Fri, 30 Nov 2018 14:39:19 +0000 https://preview-courier.web.cern.ch?p=13350 The goal of “Gen-HET” is to improve the visibility of women in the field of high-energy theory.

The post Theory event fuses physics and gender appeared first on CERN Courier.

]]>

CERN hosted a workshop on high-energy theory and gender on 26–28 September. It was the first activity of the “Gen-HET” working group, whose goals are to improve the presence and visibility of women in the field of high-energy theory and increase awareness of gender issues.

Most of the talks in the workshop were on physics. Invited talks spanned the whole of high-energy theory, providing an opportunity for participants to learn about new results in neighbouring research areas at this interesting time for the field. Topics ranged from the anti-de-Sitter/conformal field theory (AdS/CFT) correspondence and inflationary cosmology to heavy-ion, neutrino and beyond-Standard Model physics.

Agnese Bissi (Uppsala University, Sweden) began the physics programme by reviewing the now-two-decades-old AdS/CFT correspondence, and discussing the use of conformal bootstrap methods in holography. Korinna Zapp (LIP, Lisbon, Portugal and CERN) then put three recent discoveries in heavy-ion physics into perspective: the hydrodynamic behaviour of soft particles; jet quenching; and surprising similarities between soft particle production in high-multiplicity proton–proton and heavy-ion collisions.

JiJi Fan (Brown University, USA) delved into the rich landscape of beyond-the-Standard-Model phenomenology, discussing the possibility that the Higgs is “meso-tuned” but that there are no other light scalars. Elvira Gamiz (University of Granada, Spain) reviewed key features of lattice simulations for flavour physics and noted tensions with some experimental results reaching 3σ in certain B-decay channels. The theory colloquium, by Ana Achucarro (University of Leiden, the Netherlands, and UPV-EHU Bilbao, Spain), was devoted to the topic of inflation, which still presents a major challenge to theorists.

The importance of parton distribution functions in an era of high-precision physics was the focus of a talk by Maria Ubiali (University of Cambridge, UK), who explained the state-of-the-art methods used. Reviewing key topics in cosmology and particle physics, Laura Covi (Georg-August-University Göttingen, Germany) then described how models with heavy R-parity violating supersymmetry lead to scenarios for baryogenesis and gravitino dark matter.

In neutrino physics, Silvia Pascoli (Durham University, UK) gave an authoritative overview of the experimental and theoretical status, while Tracy Slatyer (MIT, USA) did the same for dark matter, emphasising the necessity of search strategies that test many possible dark-matter models.

Alejandra Castro (University of Amsterdam, the Netherlands) talked about black-hole entropy and its fascinating connections with holography and number theory, and the final physics talk, by Eleni Vryonidou (CERN), covered Standard Model effective field theory (SMEFT), which provides a pathway to new physics above the direct energy reach of colliders.

The rest of the workshop centred on talks and discussion sessions about gender issues. The full spectrum of issues was addressed, a few examples of which are given here.

Julie Moote from University College London, UK, delivered a talk on behalf of the Aspires project in the UK, which is exploring how social identities and inequalities affect students continuing in science, while Marieke van den Brink from Radboud University Nijmegen, the Netherlands, described systematic biases that were uncovered by her group’s studies of around 1000 professorial appointments in the Netherlands. Meytal Eran-Jona from the Weizmann Institute of Science, Israel, reviewed studies about unconscious bias and its implications for women in academia, and described avenues to promote gender equality in the field.

The last day of the meeting focused on actions that physicists can take to improve diversity in their own departments. For example, Jess Wade from Imperial College London, UK, discussed UK initiatives such as the Institute of Physics Juno and Athena SWAN awards, and Yossi Nir from the Weizmann Institute gave an inspiring account of his work on increasing female participation in physics in Israel. One presentation drawing on bibliometric data in high-energy theory attracted much attention beyond the workshop, as has been widely reported elsewhere.

This first workshop on high-energy theory and gender combined great physics, mentoring and networking. The additional focus on gender gave participants the opportunity to learn about the sociological causes of gender imbalance and how universities and research institutes are addressing them.

We are very grateful to many colleagues for their support in putting together this meeting, which received help from the CERN diversity office and financial support from the CERN theory department, the Mainz “cluster of excellence” PRISMA, Italy’s National Institute for Nuclear Physics (INFN), the University of Milano-Bicocca, the ERC and the COST network.

Similar activities are planned in the future, including discussions on other scientific communities and minority groups.

The post Theory event fuses physics and gender appeared first on CERN Courier.

]]>
Meeting report The goal of “Gen-HET” is to improve the visibility of women in the field of high-energy theory. https://cerncourier.com/wp-content/uploads/2018/11/CCDec18_Faces-Gendermeet.jpg
Survey addresses recognition in large collaborations https://cerncourier.com/a/survey-addresses-recognition-in-large-collaborations/ Fri, 28 Sep 2018 13:25:40 +0000 https://preview-courier.web.cern.ch/?p=12716 The survey will be distributed widely, with a deadline for responses of 26 October.

The post Survey addresses recognition in large collaborations appeared first on CERN Courier.

]]>
Building 40

The European Committee for Future Accelerators (ECFA) has created a working group to examine the recognition of individual achievements in large scientific collaborations. Based on feedback from an initial survey of the leaders of 29 CERN-based or CERN-recognised experiments in particle, nuclear, astroparticle and astrophysics, ECFA found that the community is ready to engage in dialogue on this topic and receptive to potential recommendations.

In response, ECFA has launched a community-wide survey to verify how individual researchers perceive the systems put in place to recognise their achievements. The survey will be distributed widely, and can be found on the ECFA website (https://ecfa.web.cern.ch) with a deadline for responses by 26 October.

The results of the survey will be disseminated and discussed at the upcoming plenary ECFA meeting at CERN on 15–16 November. An open session during the morning of 15 November, also to be webcast, will be devoted to the discussion of the outcomes of the survey, and aims to gather input to be submitted to the update of the European Strategy for Particle Physics (CERN Courier April 2018 p7). During the remaining open sessions, comprehensive overviews of all major future collider projects in and beyond Europe, and related accelerator technologies, will be given.

“Visibility and promotion of young scientists is of utmost importance in science and in particular also for the large collaborations in high-energy physics,” says ECFA chairperson Jorgen D’Hondt. “On the eve of the update process of the European Strategy, it is an outstanding opportunity for ECFA to take on its responsibility for informing the community about the opportunities and challenges ahead of us. Everybody is welcome.”

The post Survey addresses recognition in large collaborations appeared first on CERN Courier.

]]>
News The survey will be distributed widely, with a deadline for responses of 26 October. https://cerncourier.com/wp-content/uploads/2018/10/CCOct18News-ecfa.jpg
The history and future of the PHYSTAT series https://cerncourier.com/a/the-history-and-future-of-the-phystat-series/ Fri, 01 Jun 2018 15:25:19 +0000 https://preview-courier.web.cern.ch?p=13362 PHYSTAT has played a role in the evolution of the way particle physicists employ statistical methods in their research.

The post The history and future of the PHYSTAT series appeared first on CERN Courier.

]]>

Most particle-physics conferences emphasise the results of physics analyses. The PHYSTAT series is different: speakers are told not to bother about the actual results, but are reminded that the main topics of interest are the statistical techniques used, the resulting uncertainty on measurements, and how systematics are incorporated. What makes good statistical practice so important is that particle-physics experiments are expensive in human effort, time and money. It is thus very worthwhile to use reliable statistical techniques to extract the maximum information from data (but no more).

Origins

Late in 1999, I had the idea of a meeting devoted solely to statistical issues, and in particular to confidence intervals and upper limits for parameters of interest. With the help of CERN’s statistics guru Fred James, a meeting was organised at CERN in January 2000 and attracted 180 participants. It was quickly followed by a similar one at Fermilab in the US, and further meetings took place at Durham (2002), SLAC (2003) and Oxford (2005). These workshops dealt with general statistical issues in particle physics, such as: multivariate methods for separating signal from background; comparisons between Bayesian and frequentist approaches; blind analyses; treatment of systematics; p-values or likelihood ratios for hypothesis testing; goodness-of-fit techniques; the “look elsewhere” effect; and how to combine results from different analyses.
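The simplest of the problems those early meetings focused on is the upper limit for a counting experiment: given n observed events and a known expected background b, the classical 95% confidence-level upper limit on a Poisson signal mean s is the value at which observing n or fewer events would have a probability of only 5%. A minimal sketch, ignoring systematic uncertainties and the refinements (such as the Feldman–Cousins construction) that the workshops themselves debated:

```python
from scipy.stats import poisson
from scipy.optimize import brentq

def upper_limit(n_obs, bkg, cl=0.95):
    """Classical upper limit on a Poisson signal mean for n_obs observed
    events and an expected background bkg (no systematic uncertainties)."""
    # Find s such that P(N <= n_obs | s + bkg) = 1 - cl.
    excluded = lambda s: poisson.cdf(n_obs, s + bkg) - (1.0 - cl)
    return brentq(excluded, 0.0, 100.0)

print(upper_limit(0, 0.0))   # ~3.0 events: the textbook zero-background result
print(upper_limit(3, 1.2))   # limit for 3 observed events over 1.2 expected background
```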

Subsequent meetings were devoted to topics in specific areas within high-energy physics. Thus, in 2007 and 2011, CERN hosted two more meetings focusing on issues relevant for data analysis at the Large Hadron Collider (LHC), and particularly on searches for new physics. At the 2011 meeting, a whole day was devoted to unfolding, that is, correcting observed data for detector smearing effects. More recently, two PHYSTAT-ν workshops took place at the Institute for Physics and Mathematics of the Universe in Japan (2016) and at Fermilab (2017). They concentrated on issues that arise in analysing data from neutrino experiments, which are now reaching exciting levels of precision. In between these events, there were two smaller workshops at the Banff International Research Station in Canada, which featured the “Banff Challenges” – in which participants were asked to decide which of many simulated data sets contained a possible signal of new physics.

The PHYSTAT workshops have largely avoided having parallel sessions so that participants have the opportunity to hear all of the talks. From the very first meetings, the atmosphere has been enhanced by the presence of statisticians; more than 50 have participated in the various meetings over the years. Most of the workshops start with a statistics lecture at an introductory level to help people with less experience in this field understand the subsequent talks and discussions. The final pair of summary talks are then traditionally given by a statistician and a particle physicist.

A key role

PHYSTAT has played a role in the evolution of the way particle physicists employ statistical methods in their research, and has also had a real influence on specific topics. For instance, at the SLAC meeting in 2003, Jerry Friedman (a SLAC statistician who was previously a particle physicist) spoke about boosted decision trees for separating signal from background; such algorithms are now very commonly used for event selection in particle physics. Another example is unfolding, which was discussed at the 2011 meeting at CERN; the Lausanne statistician Victor Panaretos spoke about theoretical aspects, and subsequently his then student Mikael Kuusela became part of the CMS experiment, and has provided much valuable input to analyses involving unfolding. PHYSTAT is also one of the factors that has helped in raising the level of respectability with which statistics is regarded by particle physicists. Thus, graduate summer schools (such as those organised by CERN) now have lecture courses on statistics, some conferences include plenary talks, and books on particle-physics methodology have chapters devoted to statistics. With the growth in size and complexity of data in this field, a thorough grounding in statistics is going to become even more important.
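Boosted decision trees of the kind Friedman described are now available off the shelf in standard machine-learning libraries. A minimal sketch of signal/background separation on toy data, using scikit-learn rather than any experiment’s actual analysis framework:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

# Toy "events": two kinematic features, with signal shifted relative to background.
n = 5000
signal = rng.normal(loc=[1.0, 0.5], scale=1.0, size=(n, 2))
background = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(n, 2))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A boosted decision tree: an ensemble of shallow trees built sequentially.
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X_train, y_train)

scores = bdt.predict_proba(X_test)[:, 1]      # per-event "signal-likeness"
print("ROC AUC:", roc_auc_score(y_test, scores))
```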

Recently, Olaf Behnke of DESY in Hamburg has taken over the organisation and planning of the PHYSTAT programme, and there are already ideas for a monthly lecture series, a further PHYSTAT-ν workshop at CERN in January 2019, a PHYSTAT-LHC meeting in autumn 2019, and possibly a workshop devoted to statistical issues in dark-matter experiments. In all probability, the future of PHYSTAT is bright.

The post The history and future of the PHYSTAT series appeared first on CERN Courier.

]]>
Meeting report PHYSTAT has played a role in the evolution of the way particle physicists employ statistical methods in their research. https://cerncourier.com/wp-content/uploads/2018/06/CCJune18_FP-phystat.jpg
Sizing up physics beyond colliders https://cerncourier.com/a/sizing-up-physics-beyond-colliders/ Mon, 15 Jan 2018 16:02:10 +0000 https://preview-courier.web.cern.ch?p=13375 The Physics Beyond Colliders (PBC) initiative, launched in 2016, explores the opportunities offered by the CERN accelerator complex and infrastructure.

The post Sizing up physics beyond colliders appeared first on CERN Courier.

]]>
Physics Beyond Colliders workshop

The Physics Beyond Colliders (PBC) initiative, launched in 2016, explores the opportunities offered by the CERN accelerator complex and infrastructure that are complementary to high-energy collider experiments and other initiatives worldwide. It takes place in an exciting and quickly developing physics landscape. To quote a contribution by theorist Jonathan Feng at the recent ICFA seminar in Ottawa: “In particle theory, this is a time of great creativity, new ideas, and best of all, new proposals for experiments and connections to other fields.”

Following a kick-off workshop in September 2016 (CERN Courier November 2016 p28), the second general PBC workshop took place at CERN on 21–22 November. With more than 230 physicists in attendance, it provided an opportunity to review the progress of the studies and to collect further ideas from the community.

During the past year, the PBC study was organised into working groups to connect experts in the various relevant fields to representatives of the projects. Two physics working groups dealing with searches for physics beyond the Standard Model (BSM) and QCD measurements address the design of the experiments and their physics motivation, while several accelerator working groups are pursuing initiatives ranging from exploratory studies to more concrete plans for possible implementation at CERN. The effort has already spawned new collaborations between different groups at CERN and with external institutes, and significant progress is already visible in many areas.

The potential performance increase for existing and new users of the upgraded HL-LHC injector chain, following the culmination of the LHC injector upgrade project (CERN Courier October 2017 p22), is being actively pursued with one key client being the SPS North Area at CERN. The interplay between potential future operation of the existing SPS fixed-target experiments (NA61, NA62, NA64, COMPASS) and the installation of new proposed detectors (NA64++, MUonE, DIRAC++, NA60++) has started to be addressed in both accelerator and physics respects. The technical study of the SPS proton beam dump facility and the optimisation of the SHiP detector for investigating the hidden sector are also advancing well.

Different options for fixed-target experiments at the LHC, for instance using gas targets or crystal extraction, are under investigation, including feasibility tests with the LHC beams. The novel use of partially stripped ions (PSI) to produce high-energy gamma rays in a so-called gamma factory (CERN Courier November 2017 p7) is also gaining traction. Having taken PSI into the SPS this year, near-term plans include the injection of partially stripped lead ions into the SPS and LHC in 2018.

The design study of a storage ring for a proton electric-dipole-moment (EDM) measurement is progressing, and new opportunities to use such a ring for relic axion searches through oscillating EDMs have been put forward. In the loop are the COSY team at Jülich who continue to break new ground with polarised deuteron experiments (CERN Courier September 2016 p27).

Last but not least are non-accelerator projects that wish to benefit from CERN’s technological expertise. One highlight is the future IAXO helioscope, proposed as a successor of the CERN CAST experiment for the search of solar axions. Recently IAXO has formed as a full collaboration and is in discussion with DESY as a potential site. IAXO and a potential precursor experiment (Baby-IAXO) benefit from CERN PBC support for the design of their magnets.

The workshop also included a session devoted to the presentation of exciting new ideas, following a call for contributions from the community. One noticeable new idea consists of the construction of a low-energy linac using CLIC technology for electron injection and acceleration in the SPS. A slow extracted SPS e-beam in the 10–20 GeV energy range would allow hidden sector searches similar to NA64 but at higher intensity, and the linac would provide unique R&D possibilities for future linear accelerators. Another highlight is the prospect of performing the first optical detection of vacuum magnetic birefringence using high-field magnets under development at CERN. New projects are also being proposed elsewhere, including a first QED measurement in the strong field regime at the DESY XFEL (LUXE project) and a search for η meson rare decays at FNAL (REDTOP experiment).

The presentations and discussions at the workshop have also shown that, beyond its support to the individual projects, the PBC study group provides a useful forum for communication between communities with similar motivations. This will be an important ingredient to optimise the scope of the future projects.

The PBC study is now at a crucial point, with deliverables due at the end of 2018 as input to the European Strategy for Particle Physics Update the following year. The PBC documents will include the results of the design studies of the accelerator working groups, with a level of detail matched to the maturity of the projects, and summaries of the physics motivation of the proposed experiments in the worldwide context by the BSM and QCD physics groups. One overview document will provide an executive summary of the overall landscape, prospects and relevant issues. It should also be emphasised that the goal of the PBC study is to gather facts on the proposed projects, not to rank them.

A follow-up plenary meeting of the PBC working groups is foreseen in mid-2018, and the main findings of the PBC study will be presented to the community in an open closeout workshop towards the end of the year.

pbc.web.cern.ch

The post Sizing up physics beyond colliders appeared first on CERN Courier.

]]>
Meeting report The Physics Beyond Colliders (PBC) initiative, launched in 2016, explores the opportunities offered by the CERN accelerator complex and infrastructure. https://cerncourier.com/wp-content/uploads/2018/01/CCfac13_01_18.jpg
The physicist’s guide to the universe https://cerncourier.com/a/the-physicists-guide-to-the-universe/ Fri, 13 Oct 2017 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/the-physicists-guide-to-the-universe/ Sometimes referred to as the bible of particle physics, during the last 60 years the Review of Particle Physics has become the number-one reference in the field.

The post The physicist’s guide to the universe appeared first on CERN Courier.

]]>

Teasing out the intricate measurements that separate the smallest components of matter in the universe often involves monumental machines and huge international scientific collaborations. So it’s important that particle physicists are on the same page – or, rather, pages – when it comes to particle-physics results. For the past 60 years, the definitive collection of particle-physics evaluations and reviews has been bound up in a weighty print volume called the Review of Particle Physics, which is published every other year. The latest (2016) edition of what is sometimes referred to as the “bible of particle physics” contains 1808 pages in the complete version published online, and features 117 review articles on topics ranging from the Higgs boson to the Big Bang, statistics and particle detectors. Its Particle Listings include evaluations of 3062 new measurements from 721 papers, in addition to 35,436 measurements from 9843 papers published in earlier editions. The staff behind it carefully evaluate data on around 8000 different quantities to provide averages, fits and best limits.

The Review is the all-time most highly cited publication in particle physics, with recent editions eventually reaching more than 6000 citations. It also has a companion 344-page booklet, a descendant of the “wallet cards” first issued in 1957, which summarises the particle data from the main Review. The PDG website (pdg.lbl.gov) features the complete content of the book as both PDF files and in an interactive version, with downloadable figures and tables, as well as educational materials. The Review continues to grow as we learn more about the basic constituents of matter, and its history reflects a field that is continuously evolving.

Berkeley beginnings

The Review of Particle Physics and its associated publications and website are the products of the international Particle Data Group (PDG), which since its beginnings has been headquartered at the University of California Radiation Laboratory, now the Lawrence Berkeley National Laboratory (Berkeley Lab), in California. More than 200 authors around the globe currently contribute to the contents of the Review, including 3.5 full-time-equivalent physicists in the PDG group at Berkeley Lab who also co-ordinate the effort.

The story began towards the end of 1957 with a paper in the Annual Review of Nuclear Science authored by the late Arthur “Art” Rosenfeld and Murray Gell-Mann. The tables of particle masses and lifetimes associated with that article, which Rosenfeld prepared with Walter Barkas in an unpublished report, “Data for Elementary-Particle Physics,” are credited with PDG’s inception. “The damn thing just grew,” Rosenfeld said of the wallet-card summary of that first report, which now fills a spiral-bound booklet.

Rosenfeld said in 1975 that the motivation for the original 1957 report was to provide particle data for early computer programs that were used to process the data from new particle-physics experiments, including bubble-chamber experiments. The following year the report was revised. The report was next revised in 1961, and during these first few years of the Review Rosenfeld and his colleagues intermittently distributed updates of the report to the particle-physics community, along with the updated wallet cards.

New discoveries in the field led to a growing need for particle-data resources, and Rosenfeld was clear that the 1963 edition should be the last attempted without the help of a computer. A separate effort by Finnish physicist Matts Roos called “Tables of Elementary Particles and Resonant States” also illustrated that it was no longer possible for a single person to compile data critically, reckoned Rosenfeld. So the two separate efforts joined forces, with five Berkeley authors and Roos publishing “Data on Elementary Particles and Resonant States” in 1964. This article, which appeared in the Reviews of Modern Physics journal, comprised 27 pages plus three wallet cards.

The group branded itself as the Particle Data Group in 1968 and published its first data booklet that year. By 1974 the report, by then called Review of Particle Properties, had grown to 200 pages and had 13 authors, several of them based in Europe. An escalation of discoveries in the field during the mid-1970s provided the cornerstones for the Standard Model of particle physics, which described the family of known and theorised particles and their properties. The heavy crush of particle data flowing into PDG during this period led the staff to implement a new media format for distributing some data and additional quality-control measures. In 1973 a microfiche with references and backup material was included in an envelope at the back of the book.

Since then, the population of particle physicists worldwide has exploded and the print version of the Review and related booklets are currently distributed to thousands of physicists. INSPIRE, an information system that tracks published materials and experiments in the field of high-energy physics, now counts more than 1100 active experiments in the field, compared to about 300 in 1975, and the number of particle physicists has also increased from about 7000 in 1975 to an estimated 20,000 today. The print book was getting so big – growing at a rate of about 10% per year – that the PDG dropped its Listings from the print edition in 2016.

“We would have had to print two volumes if we continued to include the Listings. Given that there is likely no single person who wants to read through a major fraction of the Listings, this wasn’t justified,” recalls Juerg Beringer of Berkeley Lab, who became leader of the PDG in 2016. “Looking up data from the Listings online is anyway far more convenient.” The Listings are still available on the PDG website and included in the online journal publication.

Review articles

Many sections in the Listings are accompanied by review articles that provide further information on the data presented. Other review articles summarise major topics in particle physics or cosmology. Review articles can vary from about a page to tens of pages in length, and roughly two-thirds of review articles require updates in each edition. The first PDG review article on the Higgs boson, which appeared in 1988, was two pages long. Today, five years after the Higgs was discovered, the review is about 50 pages long and is the most viewed review on the website, with more than 50,000 downloads each year. PDG even delayed publication in 2012 to accommodate the Higgs boson’s discovery, with staff scrambling to add the Higgs addendum to the Review within eight days of the discovery’s announcement on 4 July 2012.

The scope and importance of PDG has grown substantially, especially during the past 30 years (figure 1). While the size of the PDG group at Berkeley Lab has remained essentially the same, a large number of physicists worldwide were recruited to keep up with the flood of publications in particle physics and write the many PDG review articles that now cover almost every aspect of particle physics. There are now 223 authors who contribute to the review articles or Listings and each will typically write a single review article or handle one Listings section. Collaborators outside Berkeley Lab are volunteers who usually spend only a small fraction of their time on the Review, while PDG group members at Berkeley Lab typically spend half of their time working on the PDG (see image). There is also a European-based PDG “meson team” of about a dozen members, which holds meetings twice a year at CERN, while another PDG sub-group called the baryon team is responsible for the data on baryon resonances.

Michael Barnett, a Berkeley Lab physicist and previous head of PDG who held the role for 25 years, recalled his first experience of producing the Review when he joined Berkeley Lab in 1984. “It was barely 300 pages long and still put together by hand,” Barnett said. “We used 20 rolls of Scotch Tape to stick pieces together for the camera-ready copy. The section on B mesons was a single page. These days the B-meson section alone is over 120 pages.” In earlier days, the data for the publications were stored on computer punch cards. The print data back then appeared as only uppercase letters, with no mathematical symbols, because the punch cards couldn’t accommodate them. Under Barnett’s watch the design and layout became more reader-friendly. Particle categories multiplied, with properties listed in detail. Many new reviews were added to help explain the content of Listings sections.

Computing era

In the late 1980s a then-modern computing system was developed that served the PDG well for two decades. But a major upgrade eventually became inevitable, and the COMPAS group from the Institute of High Energy Physics in Protvino, Russia, which had been a PDG collaborator for many years, began working on prototypes for a new computing system. Working with COMPAS and experts from Berkeley Lab’s Computational Research Division, Beringer led the development of a new web-based computing platform that was supported by a special grant from the US Department of Energy (DOE). As a result, each collaborator can now directly add data to the PDG database rather than channelling it all through the PDG editor. This platform has made Review updates far more manageable. “The new system allows collaborators to see changes immediately, without waiting for the editor to go through thousands of e-mails with instructions on what to change,” says Piotr Zyla, who succeeded Betty Armstrong as PDG editor in 2003.

As with any large-scale, data-intensive publishing endeavour, there have been a few notable glitches. The 1994 booklet had a ruler with centimetre marks that were shrunk by the publisher so that each centimetre was actually 0.97 of a centimetre. The error was discovered too late to fix, but not too late to insert a disclaimer citing fictitious and comical explanations for why the centimetres fell a bit short: “The PDG feels it has the right to redefine anything it wants”; “The booklets were returned from the printer at 0.25 times the speed of light”; and “A theorist is in charge of the PDG.”

Barnett and his colleagues had considered publishing the Review on the internet since the early days of the World Wide Web – which, of course, was created at CERN in 1989 to more easily share research data around the world. The entire contents of the Review were available on the web in 1995 and its interactive version, pdgLive, appeared with the 2006 edition. An increasingly sophisticated PDG web presence has been influenced by membership surveys asking readers whether in the digital age a printed book is essential, useful, or altogether unnecessary. The first survey, in 2000, got about 2450 responses, half of which found the print version useful and well over a third found it essential. By 2014 the number of responses had tripled. While there was a clear trend in favour of online publications, many respondents still emphasised the importance of the printed book. As one respondent stated in the 2000 survey, “I could live without my right arm, but I don’t want to.”

“We expected older physicists to be the ones who valued the book and the younger ones, who grew up with the internet, not to care,” says Barnett. “We got it backwards. Everybody used the web, but more grad students and postdocs found the printed book essential.” Their comments told why: to those entering physics, the book was not merely a reference but an introduction to the unexplored dimensions of their field. The distribution scheme for the print publications has become fairly sophisticated to minimise shipping costs. There are now four separate distribution channels: in Switzerland, Japan, China and the US. Receiving the print materials is not automatic and recipients must specifically request each new edition. The audience is largely physicists, teachers, students and physics fans, with most mailings going out to high-energy physics centres and academic institutions.

The bulk of the funding for PDG comes from the Office of Science of the DOE and supports the co-ordination and production activities at Berkeley Lab. Japan’s Ministry of Education, Culture, Sports, Science and Technology (MEXT) contributes to these efforts via a US–Japan agreement on co-operative research and development. CERN supports the meson team, and in recent years CERN and the Institute of High Energy Physics of the Chinese Academy of Sciences have paid for most of the printing and shipping costs of books and booklets. Funding agencies in multiple countries, including INFN in Italy, MINECO in Spain, and IHEP in Russia provide travel and other support to PDG collaborators in their countries.

Until recently, the PDG group at Berkeley Lab was able to handle most of the PDG co-ordination tasks. But with the growth of PDG in recent years, combined with a challenging funding environment, even this has become increasingly difficult. Thankfully, INFN recently agreed to help Berkeley Lab in this area. A recent effort to streamline and automate many aspects of PDG’s operations is also providing necessary relief.

Contributing knowledge

The published results collected in the Listings provide best values and limits for a wide range of particle properties. The data can also be used to study how knowledge in particle physics evolves, for example by plotting the evolution of PDG best values over time (figure 2 and table).
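For straightforward cases, the recipe behind such a best value is the one documented in the Review’s introduction: an error-weighted average whose uncertainty is inflated by a scale factor when the individual measurements scatter more than their quoted errors suggest. A minimal sketch, assuming independent measurements with Gaussian errors:

```python
import numpy as np

def pdg_average(values, errors):
    """Error-weighted average with a PDG-style scale factor."""
    values, errors = np.asarray(values, float), np.asarray(errors, float)
    w = 1.0 / errors**2
    mean = np.sum(w * values) / np.sum(w)
    err = np.sum(w) ** -0.5
    # Scale factor sqrt(chi2/(N-1)), applied only when it exceeds 1.
    chi2 = np.sum(w * (values - mean) ** 2)
    scale = max(1.0, np.sqrt(chi2 / (len(values) - 1)))
    return mean, err * scale, scale

# Example: three somewhat inconsistent measurements of the same quantity.
print(pdg_average([1.00, 1.20, 1.90], [0.10, 0.10, 0.20]))
```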

Over the decades there have been occasional disputes and discrepancies for PDG staff to resolve. In one instance, discussions escalated to a threatened lawsuit over PDG’s refusal to include one researcher’s particle data in the Review’s Summary Tables. There was also a case in which the experimental measurements of the mass squared of one particle (the electron neutrino) appeared as a negative number in the data: since this is mathematically impossible, the PDG editors adjusted the error margins to account for the problem. Another unusual episode concerned claims of discoveries of pentaquarks about a decade ago and later experiments discounting those earlier claims. These ups and downs, including the latest measurements from the LHCb experiment, were covered in the reviews to keep readers up to date.

When various data are in substantial conflict, PDG sets error bars that range across the whole span of results, or, in some cases, provides no average at all. Also, about 20 years ago, the PDG instituted a new naming scheme that more logically renamed many particles. All of them stuck except for one – there was an international campaign against that name-change, so the PDG staff deferred in this one instance.

Only a subset of data collected by PDG is available in a downloadable format suitable for further processing. There is a demand for such access from researchers running Monte Carlo programs and others who want to, for example, investigate the statistical properties of the agreement between multiple measurements of the same quantity. “Making all PDG data available in a machine-readable format is a very high priority. We’ve wanted to do this for a long time as there are many uses and a lot of interest from the community. But we can barely keep up with the ongoing updates of the Review, and so the implementation of such new features takes much more time than we would like,” Beringer says.

Rosenfeld, in the conclusion of his 1975 paper assessing the work of PDG, noted challenges even then in supporting the data needs of the scientific community: “As we write this review we wonder if we have not been too modest in our requests for support…we feel that PDG is doing an effective job, but if we could spend, each year, one-fifth of the typical experiment [in those days the typical experiment cost about $3 million], it could provide broader and more timely services.”

The gradual transition from print to primarily online distribution is expected to continue, in line with the overall shift of publishers toward online publication but also, in part, because of the high cost of printing and mailing the books. Nevertheless, as long as there is continuing demand and adequate resources, PDG hopes to continue the printed book. “Producing and updating the Review of Particle Physics in modern formats will remain PDG’s core mission,” says Beringer.

Online access to PDG’s increasingly mobile-friendly web pages, or a PDG smartphone app with the complete contents of the Review, could in principle replace the PDG booklet. But especially for students, the PDG book and booklet also carry substantial symbolic value, and the booklets are often distributed in introductory particle-physics classes. “It is a landmark thing for some of the graduate students and postdocs,” Barnett remarks. “When you get your first book, and when you see your data appearing, you feel like you are a particle physicist.”

The post The physicist’s guide to the universe appeared first on CERN Courier.

]]>
Feature Sometimes referred to as the bible of particle physics, during the last 60 years the Review of Particle Physics has become the number-one reference in the field. https://cerncourier.com/wp-content/uploads/2017/10/CCpdg1_09_17.jpg
Three-year extension for open-access initiative https://cerncourier.com/a/three-year-extension-for-open-access-initiative/ https://cerncourier.com/a/three-year-extension-for-open-access-initiative/#respond Fri, 14 Oct 2016 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/three-year-extension-for-open-access-initiative/ In September, following three years of successful operation and growth, CERN announced the continuation of the global SCOAP3 open-access initiative for at least three more years. SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics) is a partnership of more than 3000 libraries, funding agencies and research organisations from 44 countries that has made […]

The post Three-year extension for open-access initiative appeared first on CERN Courier.

]]>

In September, following three years of successful operation and growth, CERN announced the continuation of the global SCOAP3 open-access initiative for at least three more years. SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics) is a partnership of more than 3000 libraries, funding agencies and research organisations from 44 countries that has made tens of thousands of high-energy physics articles publicly available at no cost to individual authors. Inspired by the collaborative model of the LHC, SCOAP3 is hosted at CERN under the oversight of international governance. It is primarily funded through the redirection of budgets previously used by libraries to purchase journal subscriptions.

Since 2014, in co-operation with 11 leading scientific publishers and learned societies, SCOAP3 has supported the transition to open access of many long-standing titles in the community. During this time, 20,000 scientists from 100 countries have benefited from the opportunity to publish more than 13,000 open-access articles free of charge.

With strong consensus of the growing SCOAP3 partnership, and supported by the increasing policy requirements for and global commitment to open access in its Member States, CERN has now signed contracts with 10 scientific publishers and learned societies for a three-year extension of the initiative. “With its success, SCOAP3 has shown that its model of global co-operation is sustainable, in the same broad and participative way we build and operate large collaborations in particle physics,” says CERN’s director for research and computing, Eckhard Elsen.

The post Three-year extension for open-access initiative appeared first on CERN Courier.

]]>
https://cerncourier.com/a/three-year-extension-for-open-access-initiative/feed/ 0 News
LHC experiments weigh up 2016 data https://cerncourier.com/a/lhc-experiments-weigh-up-2016-data/ https://cerncourier.com/a/lhc-experiments-weigh-up-2016-data/#respond Fri, 12 Aug 2016 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/lhc-experiments-weigh-up-2016-data/ The 38th International Conference on High-Energy Physics (ICHEP 2016) took place on 3–10 August in Chicago, US. Among numerous results presented, the LHC experiments released their latest analyses of 13 TeV proton–proton collision data recorded in 2016. Based on a data set of 12 fb–1, ATLAS released many new results including 50 conference notes. Highlights included new and highly precise […]

The post LHC experiments weigh up 2016 data appeared first on CERN Courier.

]]>

The 38th International Conference on High-Energy Physics (ICHEP 2016) took place on 3–10 August in Chicago, US. Among numerous results presented, the LHC experiments released their latest analyses of 13 TeV proton–proton collision data recorded in 2016.

Based on a data set of 12 fb⁻¹, ATLAS released many new results, including 50 conference notes. Highlights included new and highly precise measurements of WZ production that constrain anomalous boson couplings. ATLAS also searched in many final states for signs of direct production of supersymmetric and other new particles from beyond the Standard Model. No compelling evidence was found. In particular, the intriguing hint of a possible new state with a mass of 750 GeV decaying into photon pairs seen in the 2015 data has not reappeared. The larger data set also allowed ATLAS to “rediscover” the Higgs boson with high statistical significance.

The CMS collaboration presented more than 70 new results based on an integrated luminosity of 13 fb⁻¹, also including a rediscovery of the Higgs. In line with the findings from ATLAS, an updated search for a 750 GeV diphoton resonance by CMS did not confirm the excess observed previously, setting a limit on its cross-section of 1.5 fb at the 95% CL. Searches for supersymmetric and exotic particles also showed no significant excesses, allowing mass limits to be increased by a few hundred GeV. New massive Z′ bosons up to 4 TeV and string resonances decaying into pairs of jets up to 7.4 TeV have now been excluded, while searches for dark matter exclude mediator masses up to 2 TeV in several standard scenarios.

LHCb presented many interesting new results in the domain of flavour physics. A particular highlight was the discovery of the decay mode B⁰ → K⁺K⁻, which is the rarest B-meson decay into a hadronic final state ever observed, as well as searches for CP violation in the charm system. Another first was a measurement of the photon polarisation in radiative decays of Bs mesons, and determinations of the production cross-sections of several key processes at a collision energy of 13 TeV – some of which at first sight are at variance with current predictions.

Based on lead–lead collisions with an energy of 5 TeV per nucleon pair, the ALICE collaboration presented new measurements of the properties of the quark–gluon plasma. These included fundamental measurements of the production of quarkonium at the highest collision energy ever reached at an accelerator. ALICE also measured the viscosity of the plasma at the new energy, showing that the system still behaves almost as an ideal liquid.

• CERN Courier went to press as ICHEP 2016 got under way. A full report will appear in the next issue.

The post LHC experiments weigh up 2016 data appeared first on CERN Courier.

]]>
https://cerncourier.com/a/lhc-experiments-weigh-up-2016-data/feed/ 0 News
Data to physics https://cerncourier.com/a/data-to-physics/ https://cerncourier.com/a/data-to-physics/#respond Fri, 20 May 2016 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/data-to-physics/ At the beginning of May, the LHC declared the start of a new physics season for its experiments. The “Stable Beams” visible on the LHC Page 1 screen (see image above) is the “go ahead” for all the experiments to start taking data for physics. Since 25 March, when the LHC was switched back on after its […]

The post Data to physics appeared first on CERN Courier.

]]>

At the beginning of May, the LHC declared the start of a new physics season for its experiments. The “Stable Beams” visible on the LHC Page 1 screen (see image above) is the “go ahead” for all the experiments to start taking data for physics.

Since 25 March, when the LHC was switched back on after its winter break, the accelerator complex and experiments have been fine-tuned using low-intensity beams and pilot proton collisions, and now the LHC and the experiments are taking an abundance of data.

The short circuit that occurred at the end of April, caused by a small beech marten that had found its way onto a large, open-air electrical transformer situated above ground, resulted in a delay of only a few days in the LHC running schedule. The relevant part of the LHC stopped immediately and safely after the short circuit, and the entire machine remained in standby mode for a few days.

Now, the four largest LHC experiment collaborations, ALICE, ATLAS, CMS and LHCb, have started to collect and analyse the 2016 data (see images above). Last year, operators increased the number of proton bunches to 2244 per beam, spaced at intervals of 25 ns. These beams enabled the ATLAS and CMS collaborations to study data from about 400 million million proton–proton collisions. In 2016, operators will increase the number of particles circulating in the machine and the squeezing of the beams in the collision regions. The LHC will generate up to one billion collisions per second in the experiments.
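The headline numbers can be checked against each other with simple arithmetic: with 2244 bunches per beam and the LHC revolution frequency of about 11,245 Hz, bunches cross roughly 25 million times per second at each interaction point, so a billion collisions per second corresponds to a few tens of proton–proton interactions per bunch crossing. A back-of-the-envelope sketch (the revolution frequency is the standard LHC value; the other figures are those quoted above):

```python
# Back-of-the-envelope check of the quoted collision rate.
n_bunches = 2244           # proton bunches per beam (value quoted above)
f_rev = 11245              # LHC revolution frequency in Hz (standard value)
collisions_per_sec = 1e9   # "up to one billion collisions per second"

crossing_rate = n_bunches * f_rev            # bunch crossings per second at an interaction point
pileup = collisions_per_sec / crossing_rate  # pp interactions per crossing

print(f"crossing rate ≈ {crossing_rate / 1e6:.1f} MHz, pile-up ≈ {pileup:.0f}")
# ≈ 25.2 MHz and ≈ 40 interactions per crossing
```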

The physics run with protons will last six months. The machine will then be set up for a four-week run colliding protons with lead ions.

The post Data to physics appeared first on CERN Courier.

]]>
https://cerncourier.com/a/data-to-physics/feed/ 0 News
The LSC welcomes new experiments https://cerncourier.com/a/the-lsc-welcomes-new-experiments/ https://cerncourier.com/a/the-lsc-welcomes-new-experiments/#respond Wed, 28 Oct 2015 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/the-lsc-welcomes-new-experiments/ The Canfranc Underground Laboratory welcomes new ideas and proposals.

The post The LSC welcomes new experiments appeared first on CERN Courier.

]]>

The Canfranc Underground Laboratory (LSC) in Spain is one of four European deep-underground laboratories, together with Gran Sasso (Italy), Modane (France) and Boulby (UK). The laboratory is located at Canfranc Estación, a small town in the Spanish Pyrenees situated about 1100 m above sea level. Canfranc is known for the railway tunnel that was inaugurated in 1928 to connect Spain and France. The huge station – 240 m long – was built on the Spanish side, and still stands as proof of the history of the place, although the railway operation was stopped in 1970.

In 1985, Angel Morales and his collaborators from the University of Zaragoza started to use the abandoned underground space to carry out astroparticle-physics experiments. In the beginning, the group used two service cavities, currently called LAB780. In 1994, during the excavation of the 8 km-long road tunnel (Somport tunnel), an experimental hall of 118 m2 was built 2520 m away from the Spanish entrance. This hall, called LAB2500, was used to install a number of experiments carried out by several international collaborations. In 2006, two additional larger halls – hall A and hall B, collectively called LAB2400 – were completed and ready for use. The LSC was born.

Today, some 8400 m3 are available to experimental installations at Canfranc in the main underground site (LAB2400), and a total volume of about 10,000 m3 on a surface area of 1600 m2 is available among the different underground structures. LAB2400 has about 850 m of rock overburden with a residual cosmic muon flux of about 4 × 10⁻³ m⁻² s⁻¹. The radiogenic neutron background (< 10 MeV) and the gamma-ray flux from natural radioactivity in the rock environment at the LSC are determined to be of the order of 3.5 × 10⁻⁶ n/(cm2 s) and 2 γ/(cm2 s), respectively. The neutron flux is about 30 times less intense than on the surface. The radon level underground is kept in the order of 50–80 Bq/m3 by a ventilation system with fresh-air input of about 19,600 m3/h and 6300 m3/h for hall A and B, respectively. To reduce the natural levels of radioactivity, a new radon filtering system and a radon detector with a sensitivity of mBq/m3 will be installed in hall A in 2016, to be used by the experiments.
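To put the residual muon flux in context, the rate through a detector follows directly from the quoted figure; the detector area used below is an arbitrary illustrative value:

```python
# Rough muon rate through a detector in LAB2400, from the flux quoted above.
flux = 4e-3              # residual muon flux in muons per m^2 per second
area = 1.0               # hypothetical detector area in m^2
seconds_per_day = 86400

muons_per_day = flux * area * seconds_per_day
print(f"≈ {muons_per_day:.0f} muons per day through 1 m²")   # ≈ 350 per day
```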

The underground infrastructure also includes a clean room to support detector assembly and to maintain the high level of cleanliness required for the most important components. A low-background screening facility, equipped with seven high-purity germanium γ-spectrometers, is available to experiments that need to select components with low radioactivity for their detectors. The screening facility has recently been used by the SuperKGd collaboration to measure the radiopurity of gadolinium salts for the Super-Kamiokande gadolinium project.

A network of 18 optical fibres, each equipped with humidity and temperature sensors, is installed in the main halls to monitor the rock stability. The sensitivity of the measurement is at the micrometer level; so far, across a timescale of four years, changes of 0.02% have been measured over 10 m scale lengths.

The underground infrastructure is complemented by a modern 1800 m2 building on the surface, which houses offices, a chemistry and an electronics laboratory, a workshop and a warehouse. Currently, some 280 scientists from around the world use the laboratory’s facilities to carry out their research.

The scientific programme at the LSC focuses on searches for dark matter and neutrinoless double beta decay, but it also includes experiments on geodynamics and on life in extreme environments.

Neutrinoless double beta decay

Unlike the two-neutrino mode observed in a number of nuclear decays (ββ2ν, e.g. 136Xe → 136Ba + 2e⁻ + 2ν̄e), the neutrinoless mode of double beta decay (ββ0ν, e.g. 136Xe → 136Ba + 2e⁻) is as yet unobserved. The experimental signature of a neutrinoless double beta decay would be two electrons with total energy equal to the energy released in the nuclear transition. Observing this phenomenon would demonstrate that the neutrino is its own antiparticle, and is one of the main challenges in physics research carried out in underground laboratories. The NEXT experiment at the LSC aims to search for those experimental signatures in a high-pressure time projection chamber (TPC), using xenon enriched in 136Xe. The NEXT TPC is designed with a plane of photomultipliers on the cathode and a plane of silicon photomultipliers behind the anode. This set-up allows the collaboration to determine the energy and the topology of the event, respectively. In this way, background from natural radioactivity and from the environment can be accurately rejected. In its final configuration, NEXT will use 100 kg of 136Xe and 15 bar pressure. A demonstrator of the TPC with 10 kg of Xe, named NEW, is currently being commissioned at the LSC.
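The signature described above, two electrons whose summed energy equals the energy released in the transition, amounts to a narrow peak at the decay Q-value (about 2458 keV for 136Xe, the literature value) sitting on a smooth background. A toy sketch of such an energy-window selection, with an illustrative resolution that is not NEXT’s actual figure:

```python
import numpy as np

rng = np.random.default_rng(1)

Q_BB = 2458.0                    # keV, Q-value of 136Xe double beta decay (literature value)
sigma = 0.005 * Q_BB / 2.355     # toy energy resolution: 0.5% FWHM at the Q-value

# Toy spectra: a handful of signal events smeared around Q_BB,
# plus a flat background across a wider region of interest.
signal = rng.normal(Q_BB, sigma, size=20)
background = rng.uniform(2300.0, 2600.0, size=500)
energies = np.concatenate([signal, background])

# Select events within +-1.5 sigma of the Q-value.
in_window = np.abs(energies - Q_BB) < 1.5 * sigma
expected_bkg = 500 * (3 * sigma) / 300.0     # flat background scaled to the window width
print(f"events in the window: {in_window.sum()} (expected background ≈ {expected_bkg:.1f})")
```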

The Canfranc Laboratory also hosts R&D programmes in support of projects that will be carried out in other laboratories. An example is BiPo, a high-sensitivity facility that measures the radioactivity on thin foils for planar detectors. Currently, BiPo is performing measurements for the SuperNEMO project proposed at the Modane laboratory. SuperNEMO aims to make use of 100 kg of 82Se in thin foils to search for ββ0ν signatures. These foils must have very low contamination from other radioactive elements. In particular, the contamination must be less than 10 μBq/kg for 214Bi from the 238U decay chain, and less than 2 μBq/kg for 208Tl from the 232Th decay chain. These levels of radioactivity are too small to be measured with standard instruments. The BiPo experiment provides a technical solution to perform this very accurate measurement using a thin 82Se foil (40 mg/cm2) that is inserted between two detection modules equipped with scintillators and photomultipliers to tag 214Bi and 208Tl.

Dark matter

The direct detection of dark matter is another typical research activity of underground laboratories. At the LSC, two projects are in operation for this purpose: ANAIS and ArDM. In its final configuration, ANAIS will be an array of 20 ultrapure NaI(Tl) crystals aiming to investigate the annual modulation signature of dark-matter particles from the galactic halo. Each 12.5 kg crystal is housed in high-purity electroformed copper shielding made at the LSC chemistry laboratory. Roman lead of 10 cm thickness, plus other lead structures totalling 20 cm thickness, are installed around the crystals, together with an active muon veto and passive neutron shielding. In 2016, the ANAIS detector will be in operation with a total of 112 kg of high-purity NaI(Tl) crystals.

A different experimental approach is adopted by the ArDM detector. ArDM makes use of two tonnes of liquid argon to search for WIMP interactions in a two-phase TPC. The TPC is viewed by two arrays of 12 PMTs and can operate in single phase (liquid only) or double phase (liquid and gas). The single-phase operation mode was successfully tested up to summer 2015, and the collaboration will be starting the two-phase mode by the end of 2015.

Nuclear astrophysics

In recent decades, the scientific community has shown growing interest in measuring cross-sections of nuclear interactions taking place in stars. At the energies of interest (that is, the typical energies of particles in stellar interiors), the expected interaction rates, and hence the signals, are so small that the measurements can only be performed in underground laboratories, where background levels are greatly reduced. For this reason, a project has been proposed at the LSC: the Canfranc Underground Nuclear Astrophysics (CUNA) facility. CUNA would require a new experimental hall to host a linear accelerator and the detectors. A feasibility study has been carried out and further developments are expected in the coming years.

Geodynamics

The geodynamic facility at the LSC aims to study local and global geodynamic events. The installation consists of a broadband seismometer, an accelerometer and two laser strainmeters underground, and two GPS stations on the surface in the surroundings of the underground laboratory. This facility allows seismic events to be studied over a wide spectrum, from seismic waves to tectonic deformations. The laser interferometer consists of two orthogonal 70 m-long strainmeters. Non-linear shallow water tides have been observed with this set-up and compared with predictions. This was possible because of the excellent signal-to-noise ratio for strain data at the LSC.

Life in extreme environments

In the 1990s, it became evident that life on Earth extends into the deep subsurface and extreme environments. Underground facilities can be an ideal laboratory for scientists specialising in astrobiology, environmental microbiology or other similar disciplines. The GOLLUM project proposed at the LSC aims to study micro-organisms inhabiting rocks underground. The project plans to sample the rock throughout the length of the railway tunnel and characterize microbial communities living at different depths (metagenomics) by DNA extraction.

Currently operating mainly in the field of dark matter and the search for rare decays, the LSC has the potential to grow as a multidisciplinary underground research infrastructure. Its large infrastructure equipped with specialized facilities allows the laboratory to host a variety of experimental projects. For example, the space previously used by the ROSEBUD experiment is now available to collaborations active in the field of direct dark-matter searches or exotic phenomena using scintillating bolometers or low-temperature detectors. A hut with exceptionally low acoustic and vibrational background, equipped with a 3 × 3 × 4.8 m3 Faraday cage, is available in hall B. This is a unique piece of equipment in an underground facility that, among other things, could be used to characterize new detectors for low-mass dark-matter particles. Moreover, some 100 m2 are currently unused in hall A. New ideas and proposals are welcome, and will be evaluated by the LSC International Scientific Committee.

• For further details about the LSC, visit www.lsc-canfranc.es.

The post The LSC welcomes new experiments appeared first on CERN Courier.

Vienna hosts a high-energy particle waltz https://cerncourier.com/a/vienna-hosts-a-high-energy-particle-waltz/ https://cerncourier.com/a/vienna-hosts-a-high-energy-particle-waltz/#respond Fri, 25 Sep 2015 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/vienna-hosts-a-high-energy-particle-waltz/   The first results at a new high-energy frontier in particle physics were a major highlight for the 2015 edition of the European Physical Society Conference on High Energy Physics (EPS-HEP). The biennial conference took place at the University of Vienna on 22–29 July, only weeks after data taking at the LHC at CERN had started […]

The post Vienna hosts a high-energy particle waltz appeared first on CERN Courier.


The first results at a new high-energy frontier in particle physics were a major highlight for the 2015 edition of the European Physical Society Conference on High Energy Physics (EPS-HEP). The biennial conference took place at the University of Vienna on 22–29 July, only weeks after data taking at the LHC at CERN had started at the record centre-of-mass energy of 13 TeV. In addition to the hot news from the LHC, the 723 participants from all over the world were also able to share a variety of exciting news in different areas of particle and astroparticle physics, presented in 425 parallel talks, 194 posters and 41 plenary talks. The following report focuses on a few selected highlights, including the education and outreach session – a “first” for EPS-HEP conferences (see box below).

After more than two years of intense work during the first long shutdown, the LHC and the experiments have begun running again, ready to venture into unexplored territories and perhaps observe physics beyond the Standard Model, following the discovery of the Higgs boson in 2012. Both the accelerator teams and the LHC experimental collaborations made a huge effort to provide collisions and to gather physics data in time for EPS-HEP 2015. By mid-July, the experiments had already recorded 100 times more data than they had at around the same time after the LHC had started up at 7 TeV in 2010, and the collaborations had worked hard to be able to bring the first results using 2015 data.

Talks at the conference provided detailed information about the operation of the accelerator and expectations for the near and distant future. The ATLAS, CMS and LHCb collaborations all presented results at 13 TeV for the first time (CERN Courier September 2015 pp8–11). Measurements of the charged-particle production rate as a function of rapidity provide a first possibility to test hadronic physics models in the new energy region. Several known resonances, such as the J/ψ and the Z and W bosons, have been rediscovered at these higher energies, and the cross-section for top–antitop production has been measured and found to be consistent with the predictions of the Standard Model. The first searches for new phenomena have also been performed, but unfortunately with no sign of unexpected behaviour. In all, the early results presented at the conference were very encouraging and everyone is looking forward to more data being delivered and analysed.

At the same time, the LHC collaborations have continued to extract interesting new physics from the collider’s first long run. According to the confinement paradigm of quantum chromodynamics, the gauge theory of strong interactions, only bound states of quarks and gluons that transform trivially under the local symmetries of this description are allowed to exist in nature. It forbids free quarks and gluons, but allows bound states composed of two, three, four, five or more quarks and antiquarks. While quark–antiquark and three-quark bound states have been known since the first formulation of the basic theory some 40 years ago, it is only a year or so since unambiguous evidence for tetraquark states was first presented. Now, at EPS-HEP 2015, the LHCb collaboration reported the observation of exotic resonances in the decay products of the Λb, which can be interpreted as charmonium pentaquarks. The best fit to the data requires two pentaquark states with spin-parity JP = 3/2– and JP = 5/2+, although other assignments, and even a fit in terms of a single pentaquark state, are also possible (CERN Courier September 2015 p5).

The study of semileptonic decays of B mesons with τ leptons in the final state offers the possibility of revealing hints of “new physics”, being sensitive to non-Standard-Model particles that couple preferentially to third-generation fermions. The BaBar experiment at SLAC, the Belle experiment at KEK and the LHCb experiment at CERN have all observed an excess of events in the B-meson decays B → Dτ–ν̄τ and B → D*τ–ν̄τ. Averaging the results of the three experiments, the discrepancy with respect to the Standard Model expectation amounts to some 3.9σ.
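For readers unfamiliar with how such an average is built, the sketch below shows a minimal inverse-variance combination of independent measurements in one dimension; the numbers are placeholders chosen for illustration and are not the actual BaBar, Belle and LHCb values or the official average.

    # Inverse-variance combination of independent measurements of a ratio R,
    # and the significance of the deviation from a theory prediction (toy numbers).
    def combine(measurements):
        weights = [1.0 / sigma**2 for _, sigma in measurements]
        mean = sum(w * value for (value, _), w in zip(measurements, weights)) / sum(weights)
        return mean, (1.0 / sum(weights)) ** 0.5

    toy_measurements = [(0.33, 0.03), (0.29, 0.04), (0.34, 0.03)]   # (value, uncertainty)
    prediction = 0.252                                              # assumed SM value
    mean, err = combine(toy_measurements)
    print(f"combined value: {mean:.3f} +/- {err:.3f}")
    print(f"deviation from prediction: {(mean - prediction) / err:.1f} sigma")

The real combination is two-dimensional, because the ratios for the D and D* final states are measured together with correlations, so this is only the simplest version of the idea.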

Nonzero neutrino masses and associated phenomena such as neutrino oscillations belong to what is currently the least well-understood sector of the Standard Model. The Tokai to Kamioka (T2K) experiment, using a νμ beam generated at the Japan Proton Accelerator Research Complex (J-PARC), situated approximately 300 km east of the Super-Kamiokande detector, was the first to observe νμ to νe oscillations. It has also made a precise measurement of the angle θ23 in the Pontecorvo–Maki–Nakagawa–Sakata neutrino-mixing matrix, the leptonic counterpart of the Cabibbo–Kobayashi–Maskawa (CKM) quark-mixing matrix. However, as this value is practically independent of the relative magnitudes of the neutrino masses, it does not enable the different scenarios for the neutrino-mass hierarchy to be distinguished. A comparison of neutrino oscillations with those of antineutrinos might provide clues to the still unsolved puzzle of charge-parity violation. In this context, T2K presented an update of its earlier νμ disappearance results and three candidates for the appearance of νe.
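The θ23 measurement comes from νμ disappearance over the roughly 300 km baseline. In the two-flavour approximation (a simplification; the parameter values in the sketch are typical numbers, not the T2K fit results), the survival probability has a simple closed form:

    # Two-flavour nu_mu survival probability, the quantity behind a theta_23 measurement:
    # P(nu_mu -> nu_mu) = 1 - sin^2(2*theta_23) * sin^2(1.267 * dm2[eV^2] * L[km] / E[GeV]).
    import math

    def p_survival(e_gev, l_km=295.0, sin2_2theta23=1.0, dm2=2.5e-3):
        return 1.0 - sin2_2theta23 * math.sin(1.267 * dm2 * l_km / e_gev) ** 2

    for e in (0.4, 0.6, 1.0):        # GeV; the off-axis T2K beam peaks near 0.6 GeV
        print(f"E = {e:.1f} GeV   P(nu_mu survives) = {p_survival(e):.2f}")

The beam energy is chosen so that the far detector sits near the first oscillation maximum, where the dip in the νμ rate is deepest and the sensitivity to θ23 is greatest.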

At the flavour frontier, the LHCb collaboration reported a new exclusive measurement of the magnitude of the CKM matrix element |Vub|, while Belle revisited the CKM magnitude |Vcb|. In the case of |Vub|, based on Λb decays, there remains a tension between the values distilled from exclusive and inclusive decay channels that is still not understood. For |Vcb|, Belle presented an updated exclusive measurement that is, for the first time, completely consistent with the inclusive measurement of the same parameter.

Weak gravitational lensing provides a means to estimate the distribution of dark matter in the universe. By looking at more than a million source galaxies at a mean co-moving distance of 2.9 Gpc (about nine thousand million light-years), the Dark Energy Survey collaboration has produced an impressive map of both luminous and dark matter, exhibiting potential candidates for superclusters and (super)voids. The mass distribution deduced from this map correlates nicely with the “known”, that is, optically detected, galaxy clusters in the foreground.

More than a year ago, the BICEP2 collaboration caused some disturbance in the scientific community by claiming to have observed the imprint of primordial gravitational waves, generated during inflation, in the B-mode polarization spectrum of the cosmic-microwave background. Since then, the Planck collaboration has collected strong evidence that, upon subtraction of the impact of foreground dust, the BICEP2 data can be explained by a “boring ordinary” cosmic-microwave background (CERN Courier November 2014 p15).

Following the parallel sessions that formed the first part of the conference, Saturday afternoon was devoted to the traditional special joint session with the European Committee for Future Accelerators (ECFA). The comprehensive title for this year was “Connecting Scales: Bridging the Infinities”, with an emphasis on particle-physics topics that influence the evolution of the universe. This joint EPS-HEP/ECFA session, which was well attended, gave the audience a unique occasion to profit from broad overviews in various fields.

Prizes and more

As is traditional, the award of the latest prizes of the EPS High Energy and Particle Physics Division started the second half of the conference, which was devoted to the plenary sessions. The 2015 High Energy and Particle Physics Prize was awarded to James Bjorken “for his prediction of scaling behaviour in the structure of the proton that led to a new understanding of the strong interaction”, and to Guido Altarelli, Yuri Dokshitzer, Lev Lipatov and Giorgio Parisi “for developing a probabilistic field theory framework for the dynamics of quarks and gluons, enabling a quantitative understanding of high-energy collisions involving hadrons”. The 2015 Giuseppe and Vanna Cocconi Prize was awarded to Francis Halzen “for his visionary and leading role in the detection of very-high-energy extraterrestrial neutrinos, opening a new observational window on the universe”. The 2015 Gribov Medal (Pedro Vieira), Young Experimental Physicist Prize (Jan Fiete Grosse-Oetringhaus and Giovanni Petrucciani) and Outreach Prize (Kate Shaw) were also presented to their recipients (CERN Courier June 2015 p27).

An integral part of every conference is the social programme, which offers the local organizers the opportunity to present impressions of the city and the country where the conference is being held. Vienna is well known for classical music, and on this occasion the orchestra of the Vienna University of Technology performed Beethoven’s 7th symphony at the location where it was first performed – the Festival Hall of the Austrian Academy of Sciences. The participants were also invited by the mayor of the city of Vienna to a “Heurigen” – an Austrian wine tavern where the most recent year’s wines are served together with local food. A play called Curie_Meitner_Lamarr_indivisible presented three outstanding women pioneers of science and technology, all of whom had a connection to Vienna. A dinner in the orangery of the Schönbrunn Palace, the former imperial summer residence, provided a fitting conclusion to the social programme of this important conference for particle physics.

• EPS-HEP 2015 was jointly organized by the High Energy and Particle Physics Division of the European Physical Society, the Institute of High Energy Physics of the Austrian Academy of Sciences, the University of Vienna, the Vienna University of Technology, and the Stefan-Meyer Institute of the Austrian Academy of Sciences. For more details and the full programme, visit http://eps-hep2015.eu.

All about communication

The EPS-HEP 2015 conference made several innovations to communicate not only to the participants and particle physicists elsewhere, but also to a wider general public.

Each morning the participants were welcomed with a small newsletter containing information for the day. During the first part of the conference with only parallel sessions, the newsletter summarized the topics of all of the sessions, highlighting expected new results. The idea was to give the participants a glimpse of the topics being discussed at the parallel sessions they could not attend. For the second part of the conference with plenary presentations only, the daily newsletter also contained interviews that looked behind the scenes. The conference was accompanied online in social media, with tweets, Facebook entries and blogs highlighting selected scientific topics and social events. The tweets, in particular, attracted a large audience of people who were not able to attend the conference.

During the first week, a dedicated parallel session on education and outreach took place – the first ever at an EPS-HEP conference. The number of abstracts submitted for the session was remarkable, clearly indicating the need for exchange and discussions on this topic. The conveners chose a slightly different format from the standard parallel sessions, so that besides oral presentations on specific topics, a lively panel discussion with various contributions from the audience also took place. The session concluded with a “Science Slam” – a format in which scientists give short talks explaining the focus of their research in lively terms for the public. Extending the scope of the EPS-HEP conference towards topics concerned with education and outreach was clearly an important strength of this year’s edition.

In addition, a rich outreach programme formed an important part of the conference in Vienna; from the start, everyone involved in planning had a strong desire to take the scientific questions of the conference outside of the particle-physics community. One highlight of the programme was the public screening of the movie Particle Fever, followed by a discussion with Fabiola Gianotti, who will be the next director-general of CERN, and the producer of the movie, David Kaplan. Visual arts have become another important way to bring the general public in touch with particle physics, and several exhibitions, reflecting different aspects of particle physics from an artistic point of view, took place during the conference.


The post Vienna hosts a high-energy particle waltz appeared first on CERN Courier.

Physics and performance in LHC Run 2 https://cerncourier.com/a/physics-and-performance-in-lhc-run-2/ https://cerncourier.com/a/physics-and-performance-in-lhc-run-2/#respond Wed, 26 Aug 2015 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/physics-and-performance-in-lhc-run-2/ The year 2015 began for the ATLAS experiment with an intense phase of commissioning using cosmic-ray data and first proton–proton collisions, allowing ATLAS physicists to test the trigger and detector systems as well as to align the tracking devices. Then the collection of physics data in LHC Run 2 started in June, with proton–proton collisions at […]

The post Physics and performance in LHC Run 2 appeared first on CERN Courier.


The year 2015 began for the ATLAS experiment with an intense phase of commissioning using cosmic-ray data and first proton–proton collisions, allowing ATLAS physicists to test the trigger and detector systems as well as to align the tracking devices. Then the collection of physics data in LHC Run 2 started in June, with proton–proton collisions at a centre-of-mass energy of 13 TeV (CERN Courier July/August 2015 p25). Measurements at this new high-energy frontier were among the highlights of the many results presented by the ATLAS collaboration at EPS-HEP 2015.

An important early goal for ATLAS was to record roughly 200 million inelastic proton–proton collisions with a very low level of secondary collisions within the same event (“pile-up”). This data sample allowed ATLAS physicists to perform detailed studies of the tracking system, which features a new detector, the “Insertable B-layer” (IBL). The IBL consists of a layer of millions of tiny silicon pixels mounted in the innermost heart of ATLAS at a distance of 3.3 cm from the proton beam (CERN Courier October 2013 p28). Together with the other tracking layers of the overall detector, the IBL allows ATLAS to measure the origin of charged particles with up to two times better precision than during the previous run. Figure 1 shows the resolution achieved for the longitudinal impact parameter of reconstructed charged-particle tracks.

ATLAS exploited the early data sample at 13 TeV for important physics measurements. It allowed the collaboration to characterize inelastic proton–proton collisions in terms of charged-particle production and the structure of the “underlying event” – collision remnants that are not directly related to the colliding partons in the proton. This characterization is important for validating the simulation of the high-luminosity LHC collisions, which contain up to 40 inelastic proton–proton collisions in a given event (one event involves the crossing of two proton bunches with more than 100 billion protons each). Figure 2 shows the evolution of the charged-particle multiplicity with centre-of-mass energy.

ATLAS also measured the angular correlation among pairs of the produced charged particles, confirming the appearance of a so-called “ridge” phenomenon in events with large particle multiplicity at a centre-of-mass energy of 13 TeV. The “ridge” (figure 3) consists of long-range particle–particle correlations not predicted by any of the established theoretical models describing inelastic proton–proton collisions.

After the low-luminosity phase, the LHC operators began to increase the intensity of the beams. By the time of EPS-HEP 2015, ATLAS had recorded a total luminosity of 100 pb–1, of which up to 85 pb–1 could be exploited for physics and performance studies. ATLAS physicists measured the performance of electron, muon and τ-lepton reconstruction, the reconstruction and energy calibration of jets, and the reconstruction of “displaced” decays of long-lived particles, such as weakly decaying hadrons containing a bottom quark. The precision of the position measurements of displaced decay locations (vertices) is significantly improved by the new IBL detector.

ATLAS used these data to classify the production of J/ψ particles at 13 TeV in terms of their immediate (“prompt”) and delayed (“non-prompt”) origin. While non-prompt J/ψ production is believed to be well understood via the decay of b hadrons, prompt production continues to be mysterious in some aspects.

ATLAS also performed a first study of the production of energetic, isolated photons and a first cross-section measurement of inclusive jet production in 13 TeV proton–proton collisions. Both are correctly described by state-of-the-art theory.

The data samples at high collision energy contain copious numbers of Z and W bosons, the mediators of the weak interaction, whose leptonic decays provide a clean signature in the detector that can be exploited for calibration purposes. ATLAS has studied the kinematic properties of these bosons, also in association with jet production. Their abundance in 13 TeV proton–proton collisions is found to be consistent with the expectation from theory. ATLAS has also observed some rare di-boson (ZZ) events, which – with a hundred times more data – should allow the direct detection of Higgs bosons. Figure 4 shows a candidate ZZ event.

In higher-energy proton collisions, the rate of particle production for many heavier particles for a given luminosity increases. The heaviest known particle, the top quark – with a mass approximately 170 times that of a proton – is predominantly produced in pairs at the LHC, and the cross-section for the production of top-quark pairs is expected to increase by a factor of 3.3 at 13 TeV, compared with the 8 TeV collisions of Run 1. ATLAS has performed an early measurement of the top-pair production cross-section in the cleanest channels where one top quark decays to an electron, an electron-neutrino and a jet containing a b-hadron (“b-jet”), while the other top-quark decays to a muon, a muon-neutrino and a b-jet. The small backgrounds from other processes in this channel allow a robust measurement with small systematic uncertainties. The measured cross-section agrees with the predicted increase of a factor of 3.3. The precision of the measurement is limited by the 9% uncertainty in luminosity, which is expected to improve significantly during the year. Figure 5 shows the evolution of the top-pair production cross-section.
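To see why the luminosity uncertainty dominates, recall that a cross-section of this kind is essentially an efficiency-corrected event count divided by the integrated luminosity. The sketch below illustrates the arithmetic; every event count and efficiency in it is invented for illustration, and it is not the ATLAS measurement.

    # Turning a counting experiment into a cross-section, with a simple error budget.
    import math

    n_obs, n_bkg = 750.0, 50.0        # assumed e-mu + b-jet candidates and background
    eff = 0.0103                      # assumed branching ratio x acceptance x efficiency
    lumi = 85.0                       # pb^-1, the usable luminosity quoted in the text
    rel_lumi, rel_syst = 0.09, 0.04   # 9% luminosity; other systematics assumed smaller

    xsec = (n_obs - n_bkg) / (eff * lumi)                     # pb
    rel_stat = math.sqrt(n_obs) / (n_obs - n_bkg)
    rel_total = math.sqrt(rel_stat**2 + rel_syst**2 + rel_lumi**2)
    print(f"sigma(ttbar) ~ {xsec:.0f} pb, total uncertainty ~ {100*rel_total:.0f}%"
          f" (stat {100*rel_stat:.1f}%, syst {100*rel_syst:.0f}%, lumi {100*rel_lumi:.0f}%)")

With these toy numbers the total uncertainty is about 11%, of which the luminosity contributes by far the largest share, so improving the luminosity calibration directly sharpens the measurement.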

Although the available data sample does not yet allow a significant increase in the sensitivity to the most prominent new physics phenomena, ATLAS has exploited the data to perform important early measurements. The excellent detector performance has allowed the confirmation of theoretical expectations with 13 TeV proton–proton collision energies.

The post Physics and performance in LHC Run 2 appeared first on CERN Courier.

First results at 13 TeV and more from Run 1 https://cerncourier.com/a/first-results-at-13-tev-and-more-from-run-1/ https://cerncourier.com/a/first-results-at-13-tev-and-more-from-run-1/#respond Wed, 26 Aug 2015 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/first-results-at-13-tev-and-more-from-run-1/ The highlight of EPS-HEP 2015 for the CMS collaboration was the publication of the first physics result exploring the new territory at the LHC energy of 13 TeV: the measurement of the charged-hadron multiplicity (dN/dη), where η, the pseudorapidity, is a measure for the direction of the particle track. When protons collide at the LHC, more […]

The post First results at 13 TeV and more from Run 1 appeared first on CERN Courier.


The highlight of EPS-HEP 2015 for the CMS collaboration was the publication of the first physics result exploring the new territory at the LHC energy of 13 TeV: the measurement of the charged-hadron multiplicity (dN/dη), where η, the pseudorapidity, is a measure of the direction of a particle track relative to the beam axis. When protons collide at the LHC, more than one of their constituents (quarks or gluons) can interact with another one, so every collision produces an underlying spray of charged hadrons, such as pions and kaons, and the greater the energy, the higher the number of produced particles. Knowing precisely how many charged hadrons are created at the new collision energy is important for ensuring that the theoretical models used in the simulations employed in the physics analyses describe these underlying processes accurately. The publication from CMS at 13 TeV reports the differential multiplicity distribution for values of |η| < 2, and a measured density for central charged hadrons (with |η| < 0.5) of 5.49±0.01 (stat.)±0.17 (syst.). Figure 1 shows the differential distribution and the energy dependence of the new measurement compared with earlier data at lower energies.
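Pseudorapidity is a purely geometric quantity, so a short definition helps when reading dN/dη distributions; the angles below are arbitrary examples.

    # Pseudorapidity: eta = -ln(tan(theta/2)), where theta is the polar angle to the beam axis.
    import math

    def eta(theta_deg):
        return -math.log(math.tan(math.radians(theta_deg) / 2.0))

    for theta in (90.0, 45.0, 15.0, 5.0):
        print(f"theta = {theta:4.0f} deg   eta = {eta(theta):5.2f}")
    # The "central" region |eta| < 0.5 quoted above corresponds to polar angles
    # between roughly 62 and 118 degrees.

Particles emitted at right angles to the beam have η = 0, while particles travelling close to the beam direction have large |η|.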

CMS has, in addition, produced a full suite of performance plots covering a range of physics objects and final states, using up to 43 pb–1 of 13 TeV data. Figure 2 shows the dimuon mass spectrum obtained from multiple trigger paths, where several resonances from the ω meson to the Z boson can be seen clearly. The B physics group in CMS has studied this spectrum in detail from the J/ψ to the Υ masses, and also the decay-time distributions for events with J/ψ or B+ mesons. Dedicated performance plots were presented at the conference for various muon, electron and photon kinematic and identification variables, as well as the measured reconstruction and identification efficiencies. The reconstruction of several low-mass states, including Ks, Λ, D0, D*, B+, B0 and Bs, demonstrates the good performance of the CMS tracker. In addition, the position of the beam spot has been measured in all three dimensions. Simulations are already found to reproduce these physics-object data well at this early stage.
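A spectrum such as that in figure 2 is built by computing, for every pair of oppositely charged muons, the invariant mass from the measured track parameters. The sketch below shows the standard four-vector arithmetic; the code is generic, and the two example muons are invented so that their mass lands near the Z boson.

    # Invariant mass of a muon pair given (pT [GeV], eta, phi) for each muon.
    import math

    M_MU = 0.1057  # muon mass in GeV

    def four_vector(pt, eta, phi, m=M_MU):
        px, py, pz = pt * math.cos(phi), pt * math.sin(phi), pt * math.sinh(eta)
        return math.sqrt(px*px + py*py + pz*pz + m*m), px, py, pz

    def invariant_mass(mu1, mu2):
        e, px, py, pz = (a + b for a, b in zip(four_vector(*mu1), four_vector(*mu2)))
        return math.sqrt(max(e*e - px*px - py*py - pz*pz, 0.0))

    mu_plus, mu_minus = (45.0, 0.3, 0.1), (44.0, -0.2, 3.2)
    print(f"m(mu+ mu-) = {invariant_mass(mu_plus, mu_minus):.1f} GeV")

Histogramming this quantity over many millions of events produces the resonance peaks, from the ω meson up to the Z boson, mentioned in the text.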

The physics groups in CMS have also started to study several processes at 13 TeV in some detail. One highlight is a first look at searches in the dijet invariant-mass spectrum, which so far reaches up to approximately 5 TeV (figure 3). Results of the same analysis on Run 1 data were released only in spring, but CMS is already continuing the search at 13 TeV from where it ended at 8 TeV, thus demonstrating the collaboration’s readiness for discovery physics in the new energy regime. The TOP group has studied top–antitop (tt) events in the dilepton and lepton+jet channels, in addition to taking a first look at events consistent with the production of single top quarks.

While eagerly jumping on the new data, CMS continues to produce world-class physics results on the Run 1 data collected at 7 and 8 TeV. The collaboration has recently approved more than 30 new results, which were shown at the conference. These include searches for new physics as well as precision Standard Model measurements. The results presented include measurements of the production of W-boson pairs through the interaction of two photons, the electroweak production of a W boson accompanied by two jets, production rates for particle jets at 2.76 TeV compared with 8 TeV, as well as the production of two photons along with jets.

Discovered more than two decades ago, the top quark continues to play a vital role in physics analyses for both measurements and searches, because it is the heaviest elementary particle known so far. New CMS results with this special type of quark include measurements of the tt production rates in the fully hadronic sample, a measurement of the tt+bb process, and tt production in conjunction with a Z or W boson. In addition, searches for signs of new physics continue, most recently in the process where a top quark decays to a charm quark and a Higgs boson, t → cH, with the Higgs boson decaying to photons.

On the Higgs front itself, CMS has performed three new searches for non-Standard Model Higgs bosons containing τ leptons in the decay products, while on the supersymmetry front, analyses have looked for dark-matter candidates and other supersymmetric particles. Heavy-ion results from Run 1, using proton–proton, proton–lead and lead–lead collisions, include Υ polarization as a function of charged-particle multiplicity in proton–proton collisions, Z-boson production, jet-fragmentation functions in proton–lead collisions, and nuclear modification of Υ states in lead–lead collisions.

The post First results at 13 TeV and more from Run 1 appeared first on CERN Courier.

SESAME: a bright hope for the Middle East https://cerncourier.com/a/sesame-a-bright-hope-for-the-middle-east/ https://cerncourier.com/a/sesame-a-bright-hope-for-the-middle-east/#respond Wed, 22 Jul 2015 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/sesame-a-bright-hope-for-the-middle-east/ A 2.5-GeV, third-generation light source is under construction in the Middle-East that will arguably be the region’s first true international centre of excellence.

The post SESAME: a bright hope for the Middle East appeared first on CERN Courier.


Synchrotron-light sources have become an essential tool in many branches of medicine, biology, physics, chemistry, materials science, environmental studies and even archaeology. There are some 50 storage-ring-based synchrotron-light sources in the world, including a few in developing countries, but none in the Middle East. SESAME is a 2.5-GeV, third-generation light source under construction near Amman. When it is commissioned in 2016, it will not only be the first light source in the Middle East, but arguably also the region’s first true international centre of excellence.

The members of SESAME are currently Bahrain, Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, the Palestinian Authority and Turkey (others are being sought). Brazil, China, the European Union, France, Germany, Greece, Italy, Japan, Kuwait, Portugal, the Russian Federation, Spain, Sweden, Switzerland, the UK and the US are observers.

SESAME will foster scientific and technological capacities and excellence in the Middle East and neighbouring regions, helping to prevent or reverse the brain drain, and will build scientific links and foster better understanding and a culture of peace through collaboration between peoples with different creeds and political systems.

The origins of SESAME

The need for an international synchrotron light-source in the Middle East was recognized by the Pakistani Nobel laureate Abdus Salam, one of the fathers of the Standard Model of particle physics, more than 30 years ago. This need was also felt by the CERN-and-Middle-East-based Middle East Scientific Co-operation group (MESC), headed by Sergio Fubini. MESC’s efforts to promote regional co-operation in science, and also solidarity and peace, started in 1995 with the organization in Dahab, Egypt, of a meeting at which the Egyptian minister of higher education, Venice Gouda, and Eliezer Rabinovici of MESC and the Hebrew University in Israel – and now a delegate to the CERN and SESAME councils – took an official stand in support of Arab–Israeli co-operation.

In 1997, Herman Winick of SLAC and the late Gustav-Adolf Voss of DESY suggested building a light source in the Middle East using components of the soon-to-be decommissioned BESSY I facility in Berlin. This brilliant proposal fell on fertile ground when it was presented and pursued during workshops organized in Italy (1997) and Sweden (1998) by MESC and Tord Ekelof, of MESC and Uppsala University. At the request of Fubini and Herwig Schopper, a former director-general of CERN, the German government agreed to donate the components of BESSY I to SESAME, provided that the dismantling and transport – eventually funded by UNESCO – were taken care of by SESAME.

The plan was brought to the attention of Federico Mayor, then director-general of UNESCO, who called a meeting of delegates from the Middle East and neighbouring regions at the organization’s headquarters in Paris in June 1999. The meeting launched the project by setting up an International Interim Council with Schopper as chair. Jordan was selected to host SESAME, in a competition with five other countries from the region. It has provided the land and funded the construction of the building.

In May 2002, the Executive Board of UNESCO unanimously approved the establishment of the new centre under UNESCO’s auspices. SESAME formally came into existence in April 2004, when the permanent council was established, and ratified the appointments of Schopper as president and of the first vice-presidents, Dincer Ülkü of Turkey and Khaled Toukan of Jordan. A year later, Toukan stepped down as vice-president and became director of SESAME.

Meanwhile, the ground-breaking ceremony was held in January 2003, and construction work began the following August. Since February 2008, SESAME has been working from its own premises, which were formally opened in November 2008 in a ceremony held under the auspices of King Abdullah II of Jordan, and with the participation of Prince Ghazi Ben Mohammed of Jordan and Koïchiro Matsuura, then director-general of UNESCO. In November 2008, Schopper stepped down as president of the Council and was replaced by Chris Llewellyn Smith, who is also a former director-general of CERN. In 2014, Rabinovici and Kamal Araj of Jordan became vice-presidents, replacing Tarek Hussein of Egypt and Seyed Aghamiri of Iran.

SESAME users

As at CERN, the users of SESAME will be based in universities and research institutes in the region. They will visit the laboratory periodically to carry out experiments, generally in collaboration. The potential user-community, which is growing rapidly, already numbers some 300, and is expected eventually to grow to between 1000 and 1500. It is being fostered by a series of Users’ Meetings – the 12th, in late 2014, attracted more than 240 applications, of which only 100 could be accepted. The training programme, which is supported by the International Atomic Energy Agency, various governments and many of the world’s synchrotron laboratories, and which includes working visits to operational light sources, is already bringing significant benefits to the region.

Technical developments

In 2002, the decision was taken to build a completely new main storage ring, with an energy of 2.5 GeV – compared with the 1 GeV that would have been provided by upgrading the main BESSY I ring – while retaining refurbished elements of the BESSY I microtron to provide the first stage of acceleration and the booster synchrotron. As a result, SESAME will not only be able to probe shorter distances, but will also be a third-generation light source, i.e. one that can accommodate insertion devices – wigglers and undulators – to produce enhanced synchrotron radiation. There are light sources with higher energy and greater brightness, but SESAME’s performance (see table) will be good enough to allow users – with the right ideas – to win Nobel prizes.

Progress has not been as rapid as had been hoped, owing mainly to lack of funding, as discussed below. The collapse of the roof under an unprecedented snowfall in December 2013, when it even snowed in Cairo, has not helped. Nevertheless, despite working under the open sky throughout 2014, the SESAME team successfully commissioned the booster synchrotron in September 2014. The beam was brought to the full energy of 800 MeV, essentially without loss, and the booster is now the highest-energy accelerator in the Middle East (CERN Courier November 2014 p5).

The final design of the magnets for the main ring and for the powering scheme was carried out by CERN in collaboration with SESAME. Construction of the magnets is being managed by CERN using funds provided by the European Commission. The first of 16 cells was assembled and successfully tested at CERN at the end of March, and installation will begin later this year (CERN Courier May 2015 p6). If all goes well, commissioning of the whole facility – initially with only two of the four accelerating cavities – should begin in June next year.

The scientific programme

SESAME will nominally have four “day-one” beamlines in Phase 1a, although to speed things up and save money, it will actually start with just two. Three more beamlines will be added in Phase 1b.

One of the beamlines that will be available next year will produce photons with energies of 0.01–1 eV for infrared spectromicroscopy, which is a powerful tool for non-invasive studies of chemical components in cells, tissues and inorganic materials. A Fourier transform infrared microscope, which will be adapted to this beamline, has already been purchased. Meanwhile, 11 proposals from the region to use it with a conventional thermal infrared source have been approved. The microscope has been in use since last year, and the first results include a study of breast cancer by Fatemeh Elmi of the University of Mazandaran, Iran, with Randa Mansour and Nisreen Dahshan, who are PhD students in the Faculty of Pharmacy, University of Jordan. When SESAME is in operation, the infrared beamline will be used in biological applications, environmental studies, materials and archaeological sciences.

An X-ray absorption fine-structure and X-ray fluorescence beamline, with photon energies of 3–30 keV, will also be in operation next year. It will have potential applications in materials and environmental sciences, providing information on chemical states and local atomic structure that can be used for designing new materials and improving catalysts (e.g. for the petrochemical industries). Other applications include the non-invasive identification of the chemical composition of fossils and of valuable paintings.

It is hoped that macro-molecular crystallography and material-science beamlines, with photon energies of 4–14 keV and 3–25 keV, respectively, will be added in the next two years, once the necessary funding is available. The former will be used for structural molecular biology, aimed at elucidating the structures of proteins and other types of biological macromolecules at the atomic level, to gain insight into mechanisms of diseases to guide drug design (as used by pharmaceutical and biotech companies). The latter will use powder diffraction for studies of disordered/amorphous material on the atomic scale. The use of powder diffraction to study the evolution of nanoscale structures and materials in extreme conditions of pressure and temperature has become a core technique for developing and characterizing new smart materials.

In Phase 1b, soft X-ray (0.05–2 keV), small and wide-angle X-ray scattering (8–12 keV) and extreme-ultraviolet (10–200 eV) beamlines will be added. They will be used, respectively, for atomic, molecular and condensed-matter physics; structural molecular biology and materials sciences; and atomic and molecular physics, in a spectral range that provides a window on the behaviour of atmospheric gases, and enables characterization of the electrical and mechanical properties of materials, surfaces and interfaces.

The main challenges

The main challenge has been – and continues to be – obtaining funding. Most of the SESAME members have tiny science budgets, many are in financial difficulties, and some have faced additional problems, such as floods in Pakistan and the huge influx of refugees in Jordan. Not surprisingly, they do not find it easy to pay their contributions to the operational costs, which are rising rapidly as more staff are recruited, and will increase even faster when SESAME comes into operation and is faced with paying large electricity bills at $0.36/kWh and rising. Nevertheless, increasing budgets have been approved by the SESAME Council. As soon as the funding can be found, a solar-power plant, which would soon pay for itself and ease the burden of paying the electricity bill, will be constructed. And SESAME has always been open to new members, who are being sought primarily to share the benefits but also to share the costs.

So far, $65 million has been invested, including the value to SESAME of in-kind contributions of equipment (from Jordan, Germany, the UK, France, Italy, the US and Switzerland), cash contributions to the capital budget (from the EU, Jordan, Israel, Turkey and Italy), and manpower and other operational costs that are paid by the members (but not including important in-kind contributions of manpower, especially from CERN and the French light source, SOLEIL).

Thanks to the contributions already made and additional funding to come from Iran, Israel, Jordan and Turkey, which have each pledged voluntary contributions totalling $5 million, most of the funds that are required simply to bring SESAME into operation next year are now available. At the SESAME Council meeting in May, Egypt announced that it will also make a voluntary contribution, which will narrow the immediate funding gap. More will, however, be needed, to provide additional beamlines and a properly equipped laboratory, and additional funds are being sought from a variety of governments and philanthropic organizations.

The ongoing turbulence in the Middle East has only had two direct effects on SESAME. First, sanctions are making it impossible for Iran to pay its capital and operational contributions, which are much needed. Second, discussions of Egypt joining other members in making voluntary contributions were interrupted several times by changes in the government.

Outlook

SESAME is a working example of Arab–Israeli–Iranian–Turkish–Cypriot–Pakistani collaboration. Senior scientists and administrators from the region are working together to govern SESAME through the Council, with input from scientists from around the world through its advisory committees. Young and senior scientists from the region are collaborating in preparing the scientific programme at Users’ Meetings and workshops. And the extensive training programme of fellowships, visits and schools is already building scientific and technical capacity in the region.

According to the Italian political theorist Antonio Gramsci, there is a perpetual battle between the optimism of the will and the pessimism of the brain. Several times during its history, SESAME has faced seemingly impossible odds, and pessimists might have given up. Luckily, however, the will prevailed, and SESAME is now close to coming into operation. There are still huge challenges, but we are confident that thanks to the enthusiasm of all those involved they will be met and SESAME will fulfil its founders’ ambitious aims.

The post SESAME: a bright hope for the Middle East appeared first on CERN Courier.

Collaboration meets for the first FCC week https://cerncourier.com/a/collaboration-meets-for-the-first-fcc-week/ https://cerncourier.com/a/collaboration-meets-for-the-first-fcc-week/#respond Mon, 27 Apr 2015 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/collaboration-meets-for-the-first-fcc-week/ As many as 340 physicists, engineers, science managers and journalists gathered in Washington DC for the first annual meeting of the global Future Circular Collider (FCC) study. The FCC week covered all aspects of the study – designs of 100-km hadron and lepton colliders, infrastructures, technology R&D, experiments and physics. The meeting began with an exciting […]

The post Collaboration meets for the first FCC week appeared first on CERN Courier.


As many as 340 physicists, engineers, science managers and journalists gathered in Washington DC for the first annual meeting of the global Future Circular Collider (FCC) study. The FCC week covered all aspects of the study – designs of 100-km hadron and lepton colliders, infrastructures, technology R&D, experiments and physics.

The meeting began with an exciting presentation by US congressman Bill Foster, who recalled the history of the LHC as well as the former design studies for a Very Large Hadron Collider. A special session on Thursday was devoted to the experience with the US LHC Accelerator Research Program (LARP), to the US particle-physics strategy, and US R&D activities in high-field magnets and superconducting RF. A well-attended industrial exhibition and a complementary “industry fast-track” session were focused on Nb3Sn and high-temperature superconductor development.

James Siegrist from the US Department of Energy (DOE) pointed the way for aligning the high-field magnet R&D efforts at the four leading US magnet laboratories (Brookhaven, Fermilab, Berkeley Lab and the National High Magnetic Field Laboratory) with the goals of the FCC study. An implementation plan for joint magnet R&D will be composed in the near future. Discussions with further US institutes and universities are ongoing, and within the coming months several other DOE laboratories should join the FCC collaboration. A first US demonstrator magnet could be ready as early as 2016.

A total of 51 institutes have joined the FCC collaboration since February 2014, and the FCC study has been recognized by the European Commission (EC). Through the EuroCirCol project within the HORIZON2020 programme, the EC will fund R&D by 16 beneficiaries – including KEK in Japan – on the core components of the hadron collider. The four key themes addressed by EuroCirCol are the FCC-hh arc design (led by CEA Saclay), the interaction-region design (John Adams Institute), the cryo-beam-vacuum system (CELLS consortium), and the high-field magnet design (CERN). On the last day of the FCC week, the first meeting of the FCC International Collaboration was held. Leonid Rivkin was confirmed as chair of the board, with a mandate consistent with the production of the Conceptual Design Report, that is, to the end of 2018.

The next FCC Week will be held in Rome on 11–15 April 2016.

• The FCC Week in Washington was jointly organized by CERN and the US DOE, with support from the IEEE Council of Superconductivity. More than a third of the participants (120) came from the US. CERN (93), Germany (20), China (16), UK (16), Italy (12), France (11), Russia (11), Japan (10), Switzerland (10) and Spain (6) were also strongly represented. For further information, visit cern.ch/fccw2015.

The post Collaboration meets for the first FCC week appeared first on CERN Courier.

XFELs in the study of biological structure https://cerncourier.com/a/xfels-in-the-study-of-biological-structure/ https://cerncourier.com/a/xfels-in-the-study-of-biological-structure/#respond Mon, 23 Feb 2015 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/xfels-in-the-study-of-biological-structure/ X-ray free-electron lasers are enabling new classes of experiments

The post XFELs in the study of biological structure appeared first on CERN Courier.


The Linac Coherent Light Source (LCLS) at SLAC produced its first laser-like X-ray pulses in April 2009. The unique and potentially transformative characteristics of the LCLS beam – in particular, the short femtosecond pulse lengths and the large numbers of photons per pulse (see The LCLS XFEL below) – have created whole new fields, especially in the study of biological materials. X-ray diffraction on nanocrystals, for example, reveals 3D structures at atomic resolution, and allows pump-probe analysis of functional changes in the crystallized molecules. New modalities of X-ray solution scattering include wide-angle scattering, which provides detailed pictures from pump-probe experiments, and fluctuational solution scattering, where the X-ray pulse freezes the rotation of the molecules in the beam, resulting in a rich, 2D scattering pattern. Even the determination of the structure of single particles is possible. This article focuses on examples from crystallography and time-resolved solution scattering.

An important example from crystallography concerns the structure of protein molecules. As a reminder, protein molecules, which are encoded in our genes, are linear polymers of the 20 naturally occurring amino-acid monomers. Proteins contain hundreds or thousands of amino acids and carry out most functions within cells or organs. They catalyse chemical reactions; act as motors in a variety of contexts; control the flow of substances into and out of cells; and mediate signalling processes. Knowledge of their atomic structures lies at the heart of mechanistic understanding in modern biology.

Serial femtosecond crystallography (SFX) provides a method of studying the structure of proteins. In SFX, still X-ray photographs are obtained from a stream of nanocrystals, each crystal being illuminated by a single pulse of a few femtoseconds duration. At the LCLS, the 10¹² photons per pulse can produce observable diffraction from a protein crystal much less than 1 μm3. Critically, a 10 fs pulse will scatter from a specimen before radiation damage takes place, thereby eliminating such damage as an experimental issue. Figure 1 shows a typical SFX set-up for crystals of membrane proteins. The X-ray beam in yellow illuminates a stream of crystals, shown in the inset, being carried in a thin stream of highly viscous cubic-phase lipid (LCP). The high-pressure system that creates the jet is on the left. The rate of LCP flow is well matched to the 120 Hz arrival rate of the X-ray pulses, so not much material is wasted between shots. In the ideal case, each X-ray pulse scatters from a single crystal in the LCP flow. For soluble proteins, a jet of aqueous buffer replaces the LCP.

The angiotensin II type-1 receptor (AT1R) is found at the surface of vascular cells and serves as the principal regulator of blood pressure (figure 3). Although several AT1R blockers (ARBs) have been developed as anti-hypertensive drugs, structural knowledge of how they bind to AT1R has been lacking, owing mainly to the difficulties of growing high-quality crystals for structure determination. Using SFX at the LCLS, Vadim Cherezov and colleagues have successfully determined the room-temperature crystal structure of human AT1R in a complex with its selective receptor-blocker ZD7155 at 2.9 Å resolution (Zhang et al. 2015). The structure of the AT1R–ZD7155 complex reveals key features of AT1R and critical interactions for ZD7155 binding. Docking simulations, which predict the binding orientation of clinically used ARBs onto the AT1R structure, further elucidated both the common and distinct binding modes for these anti-hypertensive drugs. The results have provided fundamental insights into the AT1R structure-function relationship and structure-based drug design.

In solution scattering, an X-ray beam illuminates a volume of solution containing a large number of the particles of interest, creating a diffraction pattern. Because the experiment averages across many rotating molecules, the observed pattern is circularly symmetric and can be encapsulated by a radial intensity curve, I(q), where q = 4πsinθ/λ and 2θ is the scattering angle. The data are therefore essentially one-dimensional (figure 4b). The I(q) curves are quite smooth and can be well described by a modest number of parameters. They have traditionally been analysed to yield a few important physical characteristics of the scattering particle, such as its molecular mass and radius of gyration. Synchrotrons have enabled new classes of solution-scattering experiments, and the advent of XFEL sources is already providing further expansion of the methodology.
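For orientation, the momentum transfer q sets the length scale being probed, roughly d ≈ 2π/q. The short sketch below converts scattering angle and wavelength into q; the formulae are generic, and the wavelength and angles are example values only.

    # Momentum transfer in solution scattering: q = 4*pi*sin(theta)/lambda,
    # where 2*theta is the scattering angle; the probed length scale is d ~ 2*pi/q.
    import math

    def q_from_angle(two_theta_deg, wavelength_nm):
        return 4.0 * math.pi * math.sin(math.radians(two_theta_deg) / 2.0) / wavelength_nm

    wavelength = 0.15   # nm, roughly an 8 keV X-ray beam (illustrative value)
    for two_theta in (1.0, 5.0, 20.0):
        q = q_from_angle(two_theta, wavelength)          # nm^-1
        print(f"2theta = {two_theta:4.1f} deg   q = {q:5.2f} nm^-1   d ~ {2*math.pi/q:4.1f} nm")

Small scattering angles therefore carry information about the overall size and shape of the particle, while the wide-angle region approaches near-atomic length scales.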


Chasing the protein quake

An elegant example of time-resolved wide-angle scattering (WAXS) at the LCLS comes from a group led by Richard Neutze at the University of Gothenburg (Arnlund et al. 2014), which has used multi-photon absorption to trigger an extremely rapid structural perturbation in the photosynthetic reaction centre from Blastochloris viridis, a purple non-sulphur photosynthetic bacterium. The group followed the progress of this perturbation using time-resolved WAXS. Appearing with a time constant of a few picoseconds, the perturbation falls away with a 10 ps time constant and, importantly, precedes the propagation of heat through the protein.

The photosynthetic reaction centre faces unique problems of energy management. The energy of a single photon of green light is approximately equal to the activation energy for the unfolding of the protein molecule. In the photosynthetic complex, photons are absorbed by light-harvesting antennae and then rapidly funnelled to the reaction centre through specialized channels. The hypothesis is that excess energy, which may also be deposited in the protein, is dissipated before damage can be done by a process named “a protein quake”, indicating a nanoscale analogue of the spreading of waves away from the epicentre of an earthquake.

The experiments performed at the coherent X-ray imaging (CXI) station at the LCLS used micro-jet injection of solubilized protein samples. An 800 nm laser pulse of 500 fs duration illuminating the sample was calibrated so that a heating signal could be observed in the difference between the WAXS spectra with and without the laser illumination (figure 5a). The XFEL was operated to produce 40 fs pulses at 120 Hz, and illuminated and dark samples were interleaved, each at 60 Hz. The team calibrated the delay time between the laser and XFEL pulses to within 5 ps, and collected scattering patterns across a series of 41 time delays to a maximum of 100 ps. Figure 5b shows the curves indicating the difference in scattering between activated and dark molecules that were generated at each time point.

The results from this study rely on knowing the equilibrium molecular structure of the complex. Molecular-dynamics (MD) simulations and modelling play a key role in interpreting the data and developing an understanding of the “quake”. A combination of MD simulations of heat deposition and flow in a molecule and spectral decomposition of the time-resolved difference scattering curves provide a strong basis for a detailed understanding of the energy propagation in the system. Because the light pulse was tuned to the frequency of the photosystem’s antennae, cofactors (molecules within the photosynthetic complex) were instantaneously heated to a few thousand kelvin, before decaying with a half-life of about 7 ps through heat flow to the remainder of the protein. Also, principal component analysis revealed oscillations in the range q = 0.2–0.9 nm–1, corresponding to a crystallographic resolution of 31–7 nm, which are signatures of structural changes in the protein. The higher-angle scattering – corresponding to the heat motion – extends to a resolution of a few angstroms, with a time resolution extending to a picosecond. This study illustrates not only the rapid evolution of the technology and experimental prowess of the field, but brings it to bear on a problem that makes clear the biological relevance of extremely rapid dynamics.
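The principal-component analysis mentioned here amounts to factorizing the matrix of difference-scattering curves (one row per time delay, one column per q bin) into a few basis patterns and their time courses. Below is a toy version using a singular-value decomposition; the synthetic data and component shapes are invented and are not taken from the cited study.

    # SVD of a stack of time-resolved difference-scattering curves (toy data).
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 100.0, 41)[:, None]        # ps; 41 time delays, as in the text
    q = np.linspace(0.2, 2.0, 200)[None, :]         # nm^-1

    structural = np.sin(3.0 * q) / (3.0 * q)        # toy q-space signature 1
    heating = np.exp(-q**2)                         # toy q-space signature 2
    data = np.exp(-t / 7.0) * structural + (1.0 - np.exp(-t / 7.0)) * heating \
           + 0.01 * rng.normal(size=(41, 200))      # mixed components plus noise

    u, s, vt = np.linalg.svd(data, full_matrices=False)
    print("leading singular values:", np.round(s[:4], 2))
    # The first rows of vt approximate the q-space signatures; the matching
    # columns of u give their evolution in time.

In the study described above, components extracted in this spirit are interpreted with the help of the molecular-dynamics simulations, which is how the structural oscillations and the heat flow are disentangled.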

Effective single-particle imaging (SPI) would eliminate the need for crystallization, and would open new horizons in structure determination. It is an arena in which electron microscopy is making great strides, and where XFELs face great challenges. Simulations have demonstrated the real possibility of recovering structures from many thousands of weak X-ray snapshots of molecules in random orientation. However, it has become clear, as actual experiments are carried out, that there are profound difficulties with collecting high-resolution data – at present the best resolution in 2D snapshot images is about 20 nm. A recent workshop on single-particle imaging at SLAC identified a number of sources of artifacts including complex detector nonlinearities, scattering from apertures, scattering from solvent, and shot-to-shot variation in beam intensity and position. In addition, the current capability to hit a single molecule with a pulse reliably is quite limited. Serious technical progress at XFEL beamlines will be necessary before the promise of SPI at XFELs is realized fully.

Currently, the only operational XFEL facilities are at the SPring-8 Angstrom Compact free-electron LAser (SACLA) at RIKEN in Japan (CERN Courier July/August 2011 p9) and the LCLS in the US, so competition for beamtime is intense. Within the next few years, the worldwide capacity to carry out XFEL experiments will increase dramatically. In 2017, the European XFEL will come on line in Hamburg, providing a pulse rate of 27 kHz compared with the 120 Hz rate at the LCLS. At about the same time, facilities at the Paul Scherrer Institute in Switzerland and at the Pohang Accelerator Laboratory in South Korea will produce first light. In addition, the technologies for performing and analysing experiments are improving rapidly. It seems more than fair to anticipate a rapid growth in crystallography, molecular movies, and other exciting experimental methods.

The LCLS XFEL


Hard X-ray free-electron lasers (XFELs) are derived from the undulator platform commonly used in synchrotron X-ray sources around the world. In the figure, (a) shows the undulator lattice, which comprises a series of alternating pairs of magnetic north and south poles defining a gap through which electron bunches travel. The undulator at the LCLS is 60 m long, compared with about 3 m for a synchrotron device. The bunches experience an alternating force normal to the magnetic field in the gap, transforming their linear path into a low-amplitude cosine trajectory.

Because of relativistic contraction and the forward Doppler shift, the radiation that each electron emits is observed in the laboratory at a wavelength roughly equal to the spacing of the undulator magnets (a few centimetres) divided by the square of the relativistic factor γ = E/(mec²) (see the undulator equation below). Each electron interacts both with the radiation emitted by electrons preceding it in the bunch, and with the magnetic field within the undulator. Initially, the N electrons in the bunch have random phases (see figure, (b)), so that the radiated power is proportional to N.

As the bunch advances through the undulator, it breaks up into a series of microbunches of electrons separated by the wavelength of the emitted radiation. Without going into detail, this microbunching arises from a Lorentz force on the electron in the direction of propagation, generated by the interaction of the magnetic field of the emitted radiation with the (small) component of the electron velocity perpendicular to the direction of propagation (a component that the undulator field itself imposes). This force tends to push the electrons towards positions at the peak of the emitted radiation. All electrons within a single microbunch radiate coherently, and the radiation from one microbunch is also coherent with that from the next, being separated by a single wavelength. Therefore, the power in the radiated field is proportional to N².

The process of microbunching can be viewed as a resonance process, for which the following undulator equation describes the conditions for operation at wavelength λ.
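
In its standard form – with λu the undulator period, θ the angle to the electron-beam axis, B0 the peak field in the undulator gap and K the dimensionless undulator (deflection) parameter – the resonance condition reads:

```latex
\lambda = \frac{\lambda_u}{2\gamma^{2}}\left(1 + \frac{K^{2}}{2} + \gamma^{2}\theta^{2}\right),
\qquad
K = \frac{e B_{0} \lambda_u}{2\pi m_e c}
```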

The tables, above, show typical operating conditions for the CXI beamline at the LCLS. The values represent only a small subset of possible operating conditions. Note the small source size, the short pulse duration and the high photons per pulse.
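
As a rough numerical illustration of the undulator equation, with representative LCLS hard-X-ray parameters assumed here for the estimate (a 3 cm undulator period, 13.6 GeV electrons and K ≈ 3.5) rather than values read from the tables:

```python
lambda_u = 0.03      # undulator period [m] (assumed, typical LCLS value)
E_beam = 13.6e9      # electron energy [eV] (assumed)
m_e_c2 = 0.511e6     # electron rest energy [eV]
K = 3.5              # undulator parameter (assumed)

gamma = E_beam / m_e_c2
lam = lambda_u / (2 * gamma**2) * (1 + K**2 / 2)   # on-axis (theta = 0) wavelength

print(f"gamma ~ {gamma:.0f}, lambda ~ {lam * 1e10:.1f} Angstrom")   # ~1.5 Angstrom
```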

Global perspectives on major science facilities https://cerncourier.com/a/global-perspectives-on-major-science-facilities/ Wed, 22 Jan 2014 09:00:00 +0000 Physics societies provide valuable input to the planning process.

ILC digital render

Given the broad international collaborations involved in major scientific user facilities, timely formal and informal discussions among leaders of physics societies worldwide contribute to fortifying the scientific case that is needed to justify large, new enterprises. The past year, 2013, proved to be one of focused introspection and planning for major research facilities, conducted by learned societies and by government agencies in Asia, Europe and the US. All three regions developed visions for particle physics and in the US the government developed priorities and plans for a broad spectrum of scientific user facilities.

The Asia-Europe Physics Summit

In July, in Makuhari, Chiba, Japan, the third Asia-Europe Physics Summit (ASEPS3) – a collaboration between the Association of Asia Pacific Physical Societies and the European Physical Society – provided a forum for leaders in the respective physics communities to discuss strengthening the collaboration between Europe and the Asia-Pacific region (Barletta and Cifarelli 2013). These summits have three main goals: to discuss the scientific priorities and the common infrastructure that could be shared between European and Asian countries in various fields of physics research; to establish a framework to increase the level of Euro-Asia collaborations during the next 20 years; and to engage developing countries in a range of physics research. This year’s summit centred on international strategic planning for large research facilities. It also included a significant US perspective in three of the four round-table discussions.

Round Table 1 offered perspectives on the technologies that enable major research facilities, while Round Table 2 looked to the issues of policy and co-operation inherent in the next generation of large facilities. High-energy physics programmes received particular focus in the discussion, where the three regions of Asia, Europe and the US have their own road maps and strategies. This round table clearly provided a special opportunity for a number of leaders and stakeholders to exchange their views. Participants in Round Table 4 discussed training, education and public outreach – in particular the lessons learnt and challenges from large research laboratories. Although the science motivations for major user facilities differ widely, many of the underlying accelerator and detector technologies – as well as issues of policy, international co-operation and training the next generation of technical physicists and engineers – are nonetheless in common.

LHC

Because both the update to the European Strategy for Particle Physics and the Technical Design Report for the International Linear Collider (ILC) had been issued by the time of the summit, and because the Snowmass process in the US was well under way, major facilities for particle physics set a primary, although far from exclusive, context for the discussions.

The European Strategy for Particle Physics

In January, a working group of the CERN Council met in Erice to draft an updated strategy for medium and long-term particle physics. That document was remitted to the Council, which formally adopted the recommendations in a special meeting hosted by the European Commission in Brussels in May. As expected, the updated strategy emphasizes the exploitation of the LHC to its full potential across many years through a series of planned upgrades. It also explicitly supports long-term research to “continue to develop novel techniques leading to ambitious future accelerator projects on a global scale” and to “maintain a healthy base in fundamental physics research, in universities and national laboratories”. In a period in which research funding is highly constrained worldwide, these latter points are a strong cautionary note that maintaining “free energy” in national research budgets is essential for innovation.

Beyond the focus on the LHC, the strategy recommends being open to engaging in particle-physics projects outside of the European region. In particular, it welcomes the initiative from the Japanese high-energy-physics community to host the ILC in Japan and “looks forward to a proposal from Japan to discuss a possible participation”. That sentiment resonated strongly with many participants in the 2013 Community Summer Study in the US, especially in the study groups on the energy frontier and accelerator capabilities. In September, the Asia-Pacific High Energy Physics Panel and the Asian Committee for Future Accelerators issued a statement that “the International Linear Collider (ILC) is the most promising electron positron collider to achieve the objectives of next-generation physics.”

The 2013 US Community Summer Study

In the spring of 2012, the Division of Particles and Fields of the American Physical Society (APS) commissioned an independent, bottom-up study that would give voice to the aspirations of the US particle-physics community for the future of high-energy physics. The idea of such a non-governmental study was welcomed by the relevant offices of both the US Department of Energy (DOE) and the National Science Foundation (NSF). The APS study explicitly avoided prioritizing proposed projects and experiments in favour of providing a broad perspective of opportunities in particle physics that would serve as a major input to an official DOE/NSF Particle Physics Project Prioritization Panel (P5). The study was broadly structured into nine working groups along the lines of the “physics frontiers” – energy, intensity and cosmic – introduced in the 2008 P5 report and augmented with studies of particle theory, accelerator capabilities, underground laboratories, instrumentation, computing and outreach. In turn, the two conveners of each working group divided their respective studies into several sub-studies, generally with three conveners each.

Mapping of Universe

Beginning with a three-day organizational meeting in October 2012 and culminating in a nine-day session at the end of July/beginning of August 2013 – “Snowmass on the Mississippi” – the 2013 Community Summer Study involved nearly 1000 physicists from the US plus many participants from Europe and Asia. Roughly 30 small workshops were held in 2013 to prepare for the “Snowmass” session at the University of Minnesota, which was attended by several hundred physicists.

Snowmass activities connected with the energy frontier were strongly influenced by the discovery of a Higgs boson at the LHC. Not surprisingly, the scientific opportunities offered by the LHC and its series of planned upgrades received considerable attention. The study welcomed the initiative for the ILC in Japan, noting that the ILC is technically ready to proceed to construction. One idea that gained considerable momentum during the Snowmass process was the renewed interest in a very large hadron collider with an energy reach well beyond the LHC.

The conclusions of each of the nine working groups are presented in a summary report, which defines the most important questions for particle physics and identifies the most promising opportunities to address them in several strategic physics themes:

• Probe the highest possible energies and distance scales with the existing and upgraded LHC and reach for even higher precision with a lepton collider. Study the properties of the Higgs boson in full detail.
• Develop technologies for the long-term future to build multi-tera-electron-volt lepton colliders and 100 TeV hadron colliders.
• Execute a programme with the US as host that provides precision tests of the neutrino sector with an underground detector. Search for new physics in quark and lepton decays in conjunction with precision measurements of electric dipole and anomalous magnetic moments.
• Identify the particles that make up dark matter through complementary experiments deep underground, on the Earth’s surface and in space, and determine the properties of the dark sector.
• Map the evolution of the universe to reveal the origin of cosmic inflation, unravel the mystery of dark energy and determine the ultimate fate of the cosmos.

The study further identifies and recommends opportunities for investment in new enabling technologies of accelerators, instrumentation and computation. It recognizes the need for theoretical work, both in support of experimental projects and to explore unifying frameworks. It calls for new investments in physics education and identifies the need for an expanded, co-ordinated communication and outreach effort.

Summary

Although the activities of 2013 on possible perspectives and scenarios for major science facilities were neither a worldwide physics summit nor a worldwide physics study, they served to open the door for extensive engagement by physicists to build a compelling science case for major research facilities in Asia, Europe and the US. They identified ways to increase the scientific return on society’s investment and to spread the benefits of forefront physics research to developing countries.

During the meetings in 2013, it became clear that a possible future picture could be construction of the ILC in Japan and a long baseline neutrino programme in the US, while Europe exploits the LHC and prepares for the next machine at the energy frontier, which can be defined only after LHC data obtained at 14 TeV in the centre of mass have been analysed. Therefore, despite highly constrained research budgets worldwide, future prospects look bright and promising. They represent today’s challenge for the next generation(s) of scientists in a knowledge-based society.

A network for the Balkans https://cerncourier.com/a/a-network-for-the-balkans/ Wed, 20 Nov 2013 09:00:00 +0000 Bringing physicists together in Southeast Europe.

Julius Wess

From 1945 to 1990, the development of scientific educational and research capacities in physics in the Balkans followed the political and economic courses of the relevant countries. Yugoslavia and the six republics in its federation developed ties – to a greater or lesser extent – with both the East and the West, while Romania and Bulgaria became well integrated into the scientific system of the Soviet Union and the Eastern Bloc. In these countries and in the entire Balkans, the period was marked by a significant increase in the number of scientists – primarily in the field of physics – and scientific publications. There was also a substantial rise in the level of university education and scientific infrastructure, which had been lower before the Second World War or limited to a small number of exceptional yet isolated individuals or smaller institutions. Greece and Turkey were connected mainly to the US or Western Europe, while Albania was in self-imposed isolation for much of this period.

The years following 1990 brought significant changes, which were particularly dramatic and negative for the countries that were created after the break-up of Yugoslavia. The wars waged on the territory of the former Socialist Federal Republic of Yugoslavia and enormous economic problems resulted in the devastation of scientific capacities, the departure of mainly young physicists and the halting of many programmes and once-traditional scientific meetings – in particular the world-renowned “Adriatic meetings”. Less dramatic but more significant changes took place in Bulgaria, Romania and even Moldavia and the Ukraine – countries on the periphery of the Balkans but in the same neighbourhood. The number and quality of students graduating in physics, as well as financial investment in all forms of scientific educational work, plummeted. The number of researchers and PhD students, in particular, dropped so significantly in the majority of university centres that the critical mass necessary for teaching at graduate level as well as for teamwork and competitiveness was lost. The remaining young research groups and students – some only 100 km apart – had no form of communication, exchange or co-operation. European integration – if it began at all – proceeded slowly, while many previously established ties were severed.

Wess and WIGV

The origins of the Southeastern European Network in Mathematical and Theoretical Physics (SEENET-MTP) are linked to Julius Wess and his initiative “Wissenschaftler in globaler Verantwortung” (WIGV) – “Scientists in global responsibility” – launched in 1999 (Möller 2012). Wess was professor at the Ludwig Maximilian University (LMU) of Munich and director of the Max Planck Institute (MPI) for Physics in Munich. Like most people in Europe, he deplored the Yugoslav Wars of the 1990s and this eventually turned into a resolve to engage hands-on in re-establishing scientific co-operation with the scientists of former Yugoslavia during the “Triangle meeting” in Zagreb in 1999. Wess collected information about the remaining links between scientists in the new countries of the former Yugoslavia and the rest of the world, and especially between the former Yugoslav countries. He also found out about the institutional and economic situation of the universities and institutes.

The first network meeting of WIGV was organized in Maribor, Slovenia, in May 2000. It was followed by activities such as the Eighth Adriatic Meeting in Dubrovnik, Croatia, and the First German-Serbian School in Modern Mathematical Physics in Soko Banja, Serbia, in 2001. Three postdoc positions and many short-term fellowships were established in Munich, supported by the German Academic Exchange Service (DAAD), the German Research Foundation (DFG) and the Federal Ministry of Education and Research (BMBF). The biggest and, in a sense, the most important action was the Scientific Information Network for South East Europe (SINSEE/SINYU) project to establish new high-speed fibre capacity across large distances, especially for the scientific community, with SINYU covering the region of the former Yugoslavia.

Balkan Workshop 2003

Unfortunately, between the summers of 2002 and 2003 the WIGV initiative lost its momentum. Many of the financial ad-hoc instruments created for the region ended during this time. Wess also needed to pause because of serious health problems in 2003. However, between October 2000 and December 2002 the idea of a “southeastern European” rather than “Yugoslav” network in mathematical and theoretical physics emerged and evolved in discussions between Wess, myself and other colleagues who visited Munich or took part in numerous meetings supported by WIGV.

Our impression was that a critical mass of students and researchers in the region of the former Yugoslavia could not be achieved and that a larger context should be attempted – the Balkans. In addition to the former Yugoslavia, this would include Bulgaria, Greece, Romania, Turkey, etc. We hoped that this kind of approach would have a political as well as scientific dimension, alongside other benefits. Agreement was quickly reached and the name Southeastern European Network in Mathematical and Theoretical Physics (SEENET-MTP) was created. With the personal recommendations of Wess, I visited CERN, the International Centre for Theoretical Physics (ICTP), the UNESCO headquarters in Paris and the UNESCO Venice office to promote the idea. In the course of discussions, the foundations were laid for support for the future network.

The SEENET-MTP Network

The founding meeting of the network was set up as a workshop – the Balkan Workshop (BW2003) on Mathematical, Theoretical and Phenomenological Challenges Beyond the Standard Model, with Perspectives of Balkans collaboration – that was held as a satellite meeting of the Fifth General Conference of the Balkan Physical Union, in Vrnjačka Banja, Serbia, in August 2003. This made it possible to have a regional meeting, with representatives from nearly all of the relevant countries present. Unlike the First German–Serbian School and some other actions, Germany’s contribution to the budget of BW2003 was no more than a third. The organization of the workshop was not without some controversy. It was a difficult but important lesson in the writing of applications for funding, proposals for projects and their implementation. The meeting, which had excellent lecturers, ended with the ratification of a letter of intent, followed by the election of myself as co-ordinator of the Network and Wess as co-ordinator of the Scientific-Advisory Committee (SAC) for the network (Djordjević 2012).

While singling out the role of individuals might seem disproportionate, it is a pleasure to underline the role of Boyka Aneva in motivating colleagues from Sofia, Mihai Visinescu for those from Romania, Goran Senjanović of ICTP for his service as co-ordinator of the Network SAC (2008–2013) and the first and the current presidents of the Representative Committee of the SEENET-MTP Network, Radu Constantinescu of Craiova (2009–2013) and Dumitru Vulcanov of Timisoara, respectively. Starting in 2003 with 40 members and three nodes in Niš, Sofia and Bucharest, the network has grown steadily to its current size, now covering almost all of the countries in the Southeastern European region plus Ukraine. The Balkan Workshops series is an important part of the SEENET-MTP programme (see box). The most complex meeting of the network was the Balkan Summer Institute (BSI2011) with 180 participants and four associated events.

The main goals of the network and its activities and results can be summarized as follows.

To organize scientific and research activities in the region and to improve interregional collaboration through networking, the organization of scientific events and mobility programmes. The network has organized 15 scientific meetings (schools and workshops) and supported an additional 10 events. Around 1000 researchers and students have taken part in these meetings. Through UNESCO projects, followed by the ICTP project “Cosmology and Strings” PRJ-09, there have been more than 200 researcher and student exchanges in the region, about 150 seminars and 100 joint scientific papers. In co-operation with leading publishers both in the region and the rest of the world, the network has published numerous proceedings, topical journal issues and two monographs. It has also implemented 15 projects, mainly supported by UNESCO, ICTP and German foundations.

Balkan Workshop 2013

To promote the exchange of students and encourage communication between gifted pupils motivated towards natural sciences and their high schools. Three meetings and contests in the “Science and society” framework have been organized in Romania with 100 high-school pupils and undergraduate students. The network was a permanent supporter and driving force in establishing and supporting the first class for gifted high-school pupils in Niš, Serbia, and its networking with similar programmes.

To create a database as the foundation for an up-to-date overview of results obtained by different research organizations and, through this, the institutional capacity-building in physics and mathematics. The SEENET-MTP office in Niš, established in 2009, in co-operation with the University of Craiova and UNESCO Venice office, set up the project “Map of Excellence in Physics and Mathematics in SEE – the SEE MP e-Survey Project”. It has collected a full set of data on 40 leading institutions in physics and mathematics in seven Balkan countries.

BW2013: 10 years of the network

This year’s Balkan workshop – BW2013 Beyond the Standard Models – was held on 25–29 April in Vrnjačka Banja, Serbia, just like the first one. The meeting also provided an opportunity to mark 10 years of the network, which now consists of 20 institutions from 11 countries in the region and has 14 partner institutions and more than 350 individual members from around the world. It was organized by the Faculty of Science and Mathematics and SEENET-MTP office, Niš, in co-operation with the CERN Theory Group, the International School for Advanced Studies (SISSA) and ICTP, with the Physical Society Niš as local co-organizer.

The workshop offered a platform for discussions on three topics: beyond the Standard Model, everyday practice in particle physics and cosmology, and regional and interregional co-operation in science and education. The first two days were devoted to purely scientific problems, including new trends in particle and astroparticle physics: theory and phenomenology, cosmology (classical and quantum, inflation, dark matter and dark energy), quantum gravity and extra dimensions, strings, and non-commutative and non-archimedean quantum models. It was an opportunity to gather together leading experts in physics and students from the EU and Eastern Europe to discuss these topics. The third day was organized as a series of round tables on building sustainable knowledge-based societies, with a few invited lecturers and moderators from the Central European Initiative (CEI), UNESCO, the European Physical Society (EPS) etc.

Participants at the Balkan Workshop 2013

In total, 78 participants from 25 countries came to the events. Around 30 invited scientific talks, 15 panel presentations and several posters were presented. The EPS president John Dudley, EPS-CEI chair Goran Djordjević and former EPS presidents Maciej Kolwas and Norbert Kroó were among the panellists. Mario Scalet (UNESCO Venice), Fernando Quevedo (ICTP), Luis Álvarez-Gaume, Ignatios Antoniadis and John Ellis (CERN), Alexei Morozov (ITEP, Moscow), Guido Martinelli (SISSA), Radomir Žikić (Ministry of Education and Science, Serbia) and others contributed greatly to the overall discussion and decisions made towards new projects. Dejan Stojković (SUNY at Buffalo) was unable to attend but has contributed a great deal as lecturer, adviser and guest editor in many network activities. Under the aegis and with the support of the EPS, the first meeting of the EPS Committee of European Integration (EPS-CEI) took place during the workshop, and the first ad-hoc consortium for future EU projects, based on the SEENET-MTP experience, was established.

SEENET-MTP: main network meetings

• BW2003 Workshop, Vrnjačka Banja, Serbia
• BW2005 Workshop, Vrnjačka Banja, Serbia
• MMP2006 School, Sofia, Bulgaria
• BW2007 Workshop, Kladovo, Serbia
• MMP2008 School, Varna, Bulgaria
• SSSCP2009 School, Belgrade-Niš, Serbia
• EBES2010 Conference, Niš, Serbia
• QFTHS2010 School and Workshop, Calimanesti, Romania
• BSI2011 Summer Institute, Donji Milanovac, Serbia
• QFTHS2012 School and Workshop, Craiova, Romania
• BW2013 Workshop, Vrnjačka Banja, Serbia

Despite the unexpected success of the SEENET-MTP initiative, its future faces challenges: to provide a mid-term and long-term financial base through EU funds, to prove its ability to contribute to current main lines of research, to extend the meeting activities from Bulgaria, Romania and Serbia to other countries in the network, to organize a more self-connected and permanent training programme through topical one-week seminars for masters and PhD students in its nodes and, possibly in the future, joint masters or PhD programmes.

SEENET-MTP and physicists in the SEE region still need a partnership with leading institutions, organizations and individuals, primarily from Europe. In addition to LMU/MPI, the role of which was crucial in the period 2000–2009, and the long-term partners UNESCO and ICTP, the most promising supporters should be EPS, SISSA and CEI, as well as the most supportive one in the past few years – CERN and its Theory Group.

Fermilab gears up for an intense future https://cerncourier.com/a/fermilab-gears-up-for-an-intense-future/ Wed, 20 Nov 2013 09:00:00 +0000 A series of upgrades will deliver many more protons.

Satellite view of Fermilab

When a beam of protons passed through Fermilab’s Main Injector at the end of July, it marked the first operation of the accelerator complex since April 2012. The intervening long shutdown had seen significant changes to all of the accelerators to increase the proton-beam intensity that they can deliver and so maximize the scientific reach of Fermilab’s experiments. In August, acceleration of protons to 120 GeV succeeded at the first attempt – a real accomplishment after all of the upgrades that were made – and in September the Main Injector was already delivering 250 kW of proton-beam power. The goal is to reach 700 kW in the next couple of years.

With the end of the Tevatron collider programme in 2011, Fermilab increased its focus on studying neutrinos and rare subatomic processes while continuing its active role in the CMS experiment at CERN. Accelerator-based neutrino experiments, in particular, require intense proton beams. In the spring of 2012, Fermilab’s accelerator complex produced the most intense high-energy beam of neutrinos in the world, delivering a peak power of 350 kW by routinely sending 3.8 × 10¹³ protons/pulse at 120 GeV every 2.067 s to the MINOS and MINERvA neutrino experiments. It also delivered 15 kW of beam power at 8 GeV, sending 4.4 × 10¹² protons/pulse every 0.4 s to the MiniBooNE neutrino experiment.
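
These beam powers follow directly from the protons per pulse, the proton energy and the repetition period; for the NuMI beam, for example:

```latex
P = \frac{N_p \, E_p \, e}{T}
  = \frac{(3.8\times10^{13})\,(120\ \mathrm{GeV})\,(1.602\times10^{-19}\ \mathrm{J/eV})}{2.067\ \mathrm{s}}
  \approx 3.5\times10^{5}\ \mathrm{W}
```

The same expression gives about 14 kW for the 8 GeV beam sent to MiniBooNE, consistent with the quoted 15 kW.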

Higher intensities

This level of beam intensity was pushing the capabilities of the Linac, the Booster and the Main Injector. During the shutdown, Fermilab reconfigured its accelerator complex (see figure 1) and upgraded its machines to prepare them for the new NOvA, MicroBooNE and LBNE experiments, which will demand more muon neutrinos. In addition, the planned Muon g-2 and Mu2e experiments will require proton beams for muon production. With the higher beam intensities it is important to reduce beam losses, so the recent accelerator upgrades have also greatly improved beam quality and mitigated beam losses.

Proton throughput in the Booster

Before the shutdown, four machines were involved in delivering protons for neutrino production: the Cockcroft–Walton pre-accelerator, the linear accelerator, the Booster accelerator and the Main Injector. During the past 15 years, the proton requests for the Linac and Booster have gone up by more than an order of magnitude – first in support of MiniBooNE, which received beam from the Booster, and then in support of MINOS, which received beam from the Main Injector. Past upgrades to the accelerator complex ensured that those requests were met. However, during the next 10 years another factor of three is required to meet the goals of the new neutrino experiments. The latest upgrades are a major step towards meeting these goals.

For the first 40 years of the laboratory’s existence, the initial stage of the Fermilab accelerator chain was a caesium-ion source and a Cockcroft–Walton accelerator, which produced a 750 keV H⁻ beam. In August 2012, these were replaced with a new ion source, a radiofrequency quadrupole (RFQ) and an Einzel lens. The RFQ accomplishes transverse focusing, bunching and acceleration in a single compact device, significantly smaller than the room-sized Cockcroft–Walton accelerator. Now the 750 keV beam is already bunched, which improves capture in the following Linac (a drift-tube linear accelerator). The Einzel lens is used as a beam chopper: the transmission of the ions can be turned on and off by varying the voltage on the lens. Since the ion source and RFQ are a continuous-wave system, beam chopping is important to develop notches in the beam to allow for the rise times of the Booster extraction kicker. Chopping at the lowest possible energy minimizes the power loss in other areas of the complex.

Main Injector

The Booster, which receives 400 MeV H⁻ ions from the Linac, uses charge-exchange injection to strip the electrons from the ions and maximize beam current. It then accelerates the protons to 8 GeV. For the first 30 years of Booster operation, the demand for proton pulses was often less than 1 Hz and never higher than about 2 Hz. With the advent of MiniBooNE in 2002 and MINOS in 2005, demand for protons rose dramatically. As figure 2 shows, in 2003 – the first year of full MiniBooNE operation – 1.6 × 10²⁰ protons travelled through the Booster. This number was greater than the total for the previous 10 years.

Booster upgrades

A series of upgrades during the past 10 years enabled this factor of 10 increase in proton throughput. The upgrades improved both the physical infrastructure (e.g. cooling water and transformer power) and accelerator physics (aperture and orbit control).

While the Booster magnet systems resonate at 15 Hz – the maximum number of cycles per second the machine can deliver – many of the other systems have not had sufficient power or cooling to operate at this frequency. Previous upgrades have pushed the Booster’s performance to about 7.5 Hz but the goal of the current upgrades is to bring the 40-year-old Linac and Booster up to full 15 Hz operation.

Understanding the aperture, orbit, beam tune and beam losses is increasingly important as the beam frequency rises. Beam losses directly result in component activation, which makes maintenance and repair more difficult because of radiation exposure to workers. Upgrades to instrumentation (beam-position monitors and dampers), orbit control (new ramped multipole correctors) and loss control (collimation systems) have led to a decrease in total power loss of a factor of two, even with the factor of 10 increase in total beam throughput.

The injection area

Two ongoing upgrades to the RF systems continued during the recent shutdown. One concerns the replacement of the 20 RF power systems, exchanging the vacuum-tube-based modulators and power amplifiers from the 1970s with a solid-state system. This upgrade was geared towards improving reliability and reducing maintenance. The solid-state designs have been in use in the Main Injector for 15 years and have proved to be reliable. The tube-based power amplifiers were mounted on the RF cavities in the Booster tunnel, a location that exposed maintenance technicians to radiation. The new systems reduce the number of components in the tunnel, therefore reducing radiation exposure and downtime because they can be serviced without entering the accelerator tunnel. The second upgrade is a refurbishment of the cavities, with a focus on the cooling and the ferrite tuners. As operations continue, the refurbishment is done serially so that the Booster always has a minimum number of operational RF cavities. Working on these 40-plus-year-old cavities that have been activated by radiation is a labour-intensive process.

The Main Injector and Recycler

The upgrades to the Main Injector and the reconfiguration of the Recycler storage ring have been driven by the NOvA experiment, which will explore the neutrino-mass hierarchy and investigate the possibility of CP violation in the neutrino sector. With the goal of 3.6 × 10²¹ protons on target and 14 kt of detector mass, a significant region of the phase space for these parameters can be explored. For the six-year duration of the experiment, this requires the Main Injector to deliver 6 × 10²⁰ protons/year. The best previous operation was 3.25 × 10²⁰ protons/year. A doubling of the integrated number of protons is required to meet the goals of the NOvA experiment.

In 2012, just before the shutdown, the Main Injector was delivering 3.8 × 10¹³ protons every 2.067 s to the target for the Neutrinos at the Main Injector (NuMI) facility. This intensity was accomplished by injecting nine batches at 8 GeV from the Booster into the Main Injector, ramping up the Main Injector magnets while accelerating the protons to 120 GeV, sending them to the NuMI target, and ramping the magnets back down to 8 GeV levels – then repeating the process. The injection process took 8/15 of a second (0.533 s) and the ramping up and down of the magnets took 1.533 s.

The refurbished RF cavities

A key goal of the shutdown was to reduce the time of the injection process. To achieve this, Fermilab reconfigured the Recycler, which is an 8 GeV, permanent-magnet storage ring located in the same tunnel as the Main Injector. The machine has the same 3.3 km circumference as the Main Injector. During the Tevatron collider era, it was used for the storage and cooling of antiprotons, achieving a record accumulation of 5 × 10¹² antiprotons with a lifetime in excess of 1000 hours.

In future, the Recycler will be used to slip-stack protons from the Booster and transfer them into the Main Injector. By filling the Recycler with 12 batches (4.9 × 10¹³ protons) from the Booster while the Main Injector is ramping, the injection time can be cut from 0.533 s to 11 μs. Once completed, the upgrades to the magnet power and RF systems will speed up the Main Injector cycle to 1.33 s – a vast improvement compared with the 2.067 s achieved before the shutdown. When the Booster is ready to operate at 15 Hz, the total beam power on target will be 700 kW.
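
A minimal back-of-envelope sketch, using only numbers quoted in this article, shows how the single-turn transfer time and the 700 kW goal follow from the new cycle:

```python
e_joule_per_ev = 1.602e-19   # J per eV
c = 3.0e8                    # m/s

# Single-turn transfer from the Recycler: one revolution of the 3.3 km ring.
print(3.3e3 / c)             # ~1.1e-5 s, i.e. the quoted 11 microseconds

# Old cycle: 0.533 s nine-batch injection plus 1.533 s magnet ramp (~2.067 s).
old_cycle = 0.533 + 1.533
# New cycle: 12 batches (4.9e13 protons) slip-stacked in the Recycler while
# the Main Injector ramps, giving a 1.33 s cycle.
new_cycle = 1.33

power_w = 4.9e13 * 120e9 * e_joule_per_ev / new_cycle
print(power_w / 1e3)         # ~708 kW, consistent with the 700 kW goal
```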

To use the Recycler for slip-stacking required a reconfiguration of the accelerator complex. A new injection beamline from the Booster to the Recycler had to be built (figure 3), since previously the only way to get protons into the Recycler was via the Main Injector. In addition, a new extraction beamline from the Recycler to the Main Injector was needed, as the aperture of the previous line was designed for the transfer of low-emittance, low-intensity antiproton beams. New 53 MHz RF cavities for the Recycler were installed to capture the protons from the Booster, slip-stack them and then transfer them to the Main Injector. New instrumentation had to be installed and all of the devices for cooling antiproton beams – both stochastic and electron cooling systems – and for beam transfer had to be removed.

New neutrino horn

Figure 4 shows the new injection line from the Booster (figure 5) to the Recycler, together with the upgraded injection line to the Main Injector, the transfer line for the Booster Neutrino Beam programme, and the Main Injector and Recycler rings. During the shutdown, personnel removed more than 100 magnets, all of the stochastic cooling equipment, vacuum components from four transfer lines and antiproton-specific diagnostic equipment. More than 150 magnets, 4 RF cavities and about 500 m of beam pipe for the new transfer lines were installed. Approximately 300 km of cable was pulled to support upgraded beam-position monitoring systems, new vacuum installations, new kicker systems, other new instrumentation and new powered elements. Approximately 450 tonnes of material was moved in or out of the complex at the same time.

The NuMI target

To prepare for a 700 kW beam, the target station for the NuMI facility needed upgrades to handle the increased power. A new target design was developed and fabricated in collaboration with the Institute for High Energy Physics, Protvino, and the Rutherford Appleton Laboratory, UK. A new focusing horn was installed to steer higher-energy neutrinos to the NOvA experiment (figure 6). The horn features a thinner conductor to minimize ohmic heating at the increased pulsing rate. The water-cooling capacity for the target, the focusing horns and the beam absorber were also increased.

With the completion of the shutdown, commissioning of the accelerator complex is underway. Operations have begun using the Main Injector, achieving 250 kW on target for the NuMI beamline and delivering beam to the Fermilab Test Beam Facility. The reconfigured Recycler has circulated protons for the first time and work is underway towards full integration of the machine into Main Injector operations. The neutrino experiments are taking data and the SeaQuest experiment will receive proton beam soon. Intensity and beam power are increasing in all of the machines and the full 700 kW beam power in the Main Injector should be accomplished in 2015.

The EMC effect still puzzles after 30 years https://cerncourier.com/a/the-emc-effect-still-puzzles-after-30-years/ Fri, 26 Apr 2013 06:00:00 +0000 The unexpected result from 30 years ago is still in the minds of today's physicists.

EMC plot from 1982

Contrary to the stereotype, advances in science are not typically about shouting “Eureka!”. Instead, they are about results that make a researcher say, “That’s strange”. This is what happened 30 years ago when the European Muon collaboration (EMC) at CERN looked at the ratio of their data on per-nucleon deep-inelastic muon scattering off iron and compared it with that of the much smaller nucleus of deuterium.

The data were plotted as a function of Bjorken-x, which in deep-inelastic scattering is interpreted as the fraction of the nucleon’s momentum carried by the struck quark. The binding energies of nucleons in the nucleus are several orders of magnitude smaller than the momentum transfers of deep-inelastic scattering, so, naively, such a ratio should be unity except for small corrections for the Fermi motion of nucleons in the nucleus. What the EMC experiment discovered was an unexpected downwards slope to the ratio (figure 1) – as revealed in CERN Courier in November 1982 and then published in a refereed journal the following March (Aubert et al. 1983).
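
For readers less familiar with the notation, Bjorken-x and the per-nucleon EMC ratio are conventionally defined as:

```latex
x = \frac{Q^{2}}{2\,p\cdot q} = \frac{Q^{2}}{2M\nu},
\qquad
R_{\mathrm{EMC}}(x) = \frac{F_2^{A}(x)/A}{F_2^{D}(x)/2}
```

where Q² is the squared four-momentum transfer, ν the energy transfer, M the nucleon mass, and F₂ the structure function measured for a nucleus of mass number A or for the deuteron (D).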

This surprising result was confirmed by many groups, culminating with the high-precision electron- and muon-scattering data from SLAC (Gomez et al. 1994), Fermilab (Adams et al. 1995) and the New Muon collaboration (NMC) at CERN (Amaudruz et al. 1995 and Arneodo et al. 1996). Figure 2 shows representative data. The conclusions from the combined experimental evidence were that: the effect had a universal shape; was independent of the squared four-momentum transfer, Q²; increased with nuclear mass number A; and scaled with the average nuclear density.

A simple picture

The primary theoretical interpretation of the EMC effect – the region x > 0.3 – was simple: quarks in nuclei move throughout a larger confinement volume and, as the uncertainty principle implies, they carry less momentum than quarks in free nucleons. The reduction of the ratio at lower x, named the shadowing region, was attributed either to the hadronic structure of the photon or, equivalently, to the overlap in the longitudinal direction of small-x partons from different nuclei. These notions gave rise to a host of models: bound nucleons are larger than free ones; quarks in nuclei move in quark bags with 6, 9 and even up to 3A quarks, where A is the total number of nucleons. More conventional explanations, such as the influence of nuclear binding, enhancement of pion-cloud effects and a nuclear pionic field, were successful in reproducing some of the nuclear deep-inelastic scattering data.

EMC graphs

It was even possible to combine different models to produce new ones; this led to a plethora of models that reproduced the data (Geesaman et al. 1995), causing one of the authors of this article to write that “EMC means Everyone’s Model is Cool”. It is interesting to note that none of the earliest models were that concerned with the role of two-nucleon correlations, except in relation to six-quark bags.

The initial excitement was tempered as deep-inelastic scattering became better understood and the data became more precise. Some of the more extreme models were ruled out by their failure to match well known nuclear phenomenology. Moreover, inconsistency with the baryon-momentum sum rules led to the downfall of many other models. Because some of them predicted an enhanced nuclear sea, the nuclear Drell-Yan process was suggested as a way to disentangle the various possible models. In this process, a quark from a proton projectile annihilates with a nuclear antiquark to form a virtual photon, which in turn becomes a leptonic pair (Bickerstaff et al. 1984). The experiment was done and none of the existing models provided an accurate description of both sets of data – a challenge that remains to this day (Alde et al. 1984).
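
The reason the Drell-Yan process discriminates between models is that, in the leading-order parton picture, the proton–nucleus cross-section is driven by the nuclear antiquark (sea) distributions:

```latex
\frac{d\sigma^{\mathrm{DY}}}{dx_1\,dx_2} \;\propto\;
\sum_q e_q^{2}\left[\,q_p(x_1)\,\bar{q}_A(x_2) + \bar{q}_p(x_1)\,q_A(x_2)\,\right]
```

Here x1 and x2 are the parton momentum fractions in the beam proton and the target nucleus. Because the proton carries few antiquarks at large x1, the first term dominates, so the nuclear Drell-Yan yield essentially measures the antiquark distribution in the nucleus – precisely the quantity on which models of an enhanced nuclear sea disagree.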

New data

A significant shift in the experimental understanding of the EMC effect occurred when new data on ⁹Be became available (Seely et al. 2009). These data changed the experimental conclusion that the EMC effect follows the average nuclear density and instead suggested that the effect follows local nuclear density. In other words, even in deep-inelastic kinematics, ⁹Be seemed to act like two alpha particles with a single nearly free neutron, rather than like a collection of nucleons whose properties were all modified.

This led experimentalists to ask if the x > 1 scaling plateaux that have been attributed to short-range nucleon–nucleon correlations – a phenomenon that is also associated with high local densities – could be related to the EMC effect. Figure 3 shows the kinematic range of the EMC effect together with the x > 1 short-range correlation (SRC) region. While the dip at x = 1 has been shown to vary rapidly with Q², the EMC effect and the magnitude of the x > 1 plateaux are basically constant within the Q² range of the experimental data. Plotting the slope of the EMC effect, 0.3 < x < 0.7, against the magnitude of scaling x > 1 plateaux for all of the available data, as shown in figure 4, revealed a striking correlation (Weinstein et al. 2011). This phenomenological relationship has led to renewed interest in understanding how strongly correlated nucleons in the nucleus may be affecting the deep-inelastic results.

In February 2013, on nearly the 30th anniversary of the EMC publication, experimentalists and theorists came together at a special workshop at the University of Washington Institute of Nuclear Theory to review understanding of the EMC effect, discuss recent advances and plan new experimental and theoretical efforts. In particular, an entire series of EMC and SRC experiments are planned for the new 12 GeV electron beam at Jefferson Lab and analysis is underway of new Drell-Yan experimental data from Fermilab.

A new life

Although the EMC effect is now 30 years old, the recent experimental results have given new life to this old puzzle; no longer is Every Model Cool. Understanding the EMC effect implies understanding how partons behave in the nuclear medium. It thus has far-reaching consequences for not only the extraction of neutron information from nuclear targets but also for understanding effects such as the NuTeV anomaly or the excesses in the neutrino cross-sections observed by the MiniBooNE experiment.

Work for the LHC’s first long shutdown gets under way https://cerncourier.com/a/work-for-the-lhcs-first-long-shutdown-gets-under-way/ Wed, 20 Feb 2013 09:00:00 +0000 Photos highlight some of CERN’s preparations for a busy two years of maintenance and consolidation.

The LHC has been delivering data to the physics experiments since the first collisions in 2009. Now, with the first long shutdown, LS1, which started on 13 February, work begins to refurbish and consolidate aspects of the collider, together with the experiments and other accelerators in the injection chain.

LS1 was triggered by the need to consolidate the magnet interconnections so as to allow the LHC to operate at the design energy of 14 TeV in the centre-of-mass for proton–proton collisions. It has now turned into a programme involving all of the groups that have equipment in the accelerator complex, the experiments and the infrastructure systems. LS1 will see a massive programme of maintenance for the LHC and its injectors in the wake of more than three years of operation without the long winter shutdowns that were the norm in the past.

The main driving effort will be the consolidation of the 10,170 high-current splices between the superconducting magnets. As many as 1000–1500 splices will need to be redone and more than 27,000 shunts added to overcome possible problems with poor contacts between the superconducting cable and the copper stabilizer that led to the breakdown in September 2008.

The teams will start by opening up the interconnections between each of the 1695 main magnet cryostats. They will repair and consolidate around 500 interconnections at a time, in work that will gradually cover the entire 27-km circumference of the LHC. The effort on the LHC ring will also involve the exchange of 19 magnets, consolidation of the cryogenic feed boxes and installation of pressure-relief valves on the sectors that have not yet been equipped with them.

The Radiation to Electronics project (R2E) will see the protection of sensitive electronic equipment optimized by relocating the equipment or by adding shielding. Nor will work during LS1 be confined to the LHC. Major renovation work is scheduled, for example, for the Proton Synchrotron, the Super Proton Synchrotron and the LHC experiments.

Preparations for LS1 started more than three years ago, with the detailed planning of manpower and other resources. For example, Building 180 on the Meyrin site at CERN recently became a hive of activity as a training centre for the technicians who are implementing the various repairs and modifications. The pictures shown here give the flavour of this activity.

A view of the Large Magnet Facility
Plug-in modules
Welding
Workshops
The cutting tool

• More detailed articles on the work being done during LS1 will appear in the coming months. For news of the activities, watch out for articles in CERN Bulletin at http://cds.cern.ch/journal/CERNBulletin/2013/06/News%20Articles/?ln=en.

Berkeley welcomes real-time enthusiasts https://cerncourier.com/a/berkeley-welcomes-real-time-enthusiasts/ Tue, 06 Nov 2012 09:00:00 +0000

The IEEE-NPSS Real-Time Conference is devoted to the latest developments in real-time techniques in particle physics, nuclear and astrophysics, plasma physics and nuclear fusion, medical physics, space science, accelerators and general nuclear power and radiation instrumentation. Taking place every second year, it is sponsored by the Computer Application in Nuclear and Plasma Sciences technical committee of the IEEE Nuclear and Plasma Sciences Society (NPSS). This year, the 18th conference in the series, RT2012, was organized by the Lawrence Berkeley National Laboratory (LBNL) under the chair of Sergio Zimmermann and took place on 11–15 June at the Shattuck Plaza Hotel in downtown Berkeley, California.

The conference returned to the US after being held in Lisbon for RT2010 and in Beijing in 2009, when the first Asian conference of this series was held at the Institute for High-Energy Physics. RT2012 attracted 207 registrants, with a large proportion of young researchers and engineers. Following the meetings in Beijing and Lisbon, there is now a significant attendance from Asia, as well as from the fusion and medical communities, making the conference an excellent place to meet real-time specialists with diverse interests from around the world.

Presentations and posters

As in the past, the 2012 conference consisted of plenary oral sessions. This format encourages participants to look at real-time developments in sectors other than their own and greatly fosters the necessary interdisciplinary exchange of ideas in the various fields. Following a long tradition, each poster session is associated with a “mini-oral” presentation session. Presenters can opt for a two-minute talk, which helps them to emphasize the highlights of their posters. It is also an excellent educational opportunity for young participants to present and promote their work. With a mini-oral presentation still fresh in mind, delegates can then seek out the appropriate author during the following poster session, an approach that stimulates lively and intensive discussions.

The conference began as usual with an opening session with five invited speakers who surveyed hot topics from physics or innovative technical developments. First, David Schlegel of LBNL gave an introduction to the physics of learning about dark energy from the largest galaxy maps. Christopher Marshall of Lawrence Livermore National Laboratory introduced the National Ignition Facility and its integrated computer system. CERN’s Niko Neufeld gave an overview talk on the trigger and data acquisition (DAQ) at the LHC, which provided an introduction to the large number of detailed presentations that followed during the week. Henry Frisch of the University of Chicago presented news from the Large Area Photodetectors project, which aims for submillimetre and subnanosecond resolution in space and time, respectively. Last, Fermilab’s Ted Liu spoke about triggering in high-energy physics, with selected topics for young experimentalists.

The technical programme, organized by Réjean Fontaine of the University of Sherbrooke, Canada, brought together various areas of real-time computing applications and DAQ covering a range of topics in various fields. About half of the topics came from high-energy physics, the rest mainly from astrophysics and nuclear fusion, medical applications and accelerators.

Some important sessions, such as that on Data Acquisition and Intelligent Signal Processing, started with an invited introductory or review talk. Ealgoo Kim of Stanford University reviewed the trend of data-path structures for DAQ in positron-emission tomography systems, showing how the electronics and DAQ are similar to those for detectors in high-energy physics. Bruno Gonçalves of the Instituto Superior Técnico Lisbon spoke about trends in controls and DAQ in fusion devices, such as ITER, particularly towards reaching the necessary high availability. Riccardo Paoletti of the University of Siena and INFN Pisa presented the status and perspectives on fast waveform digitizers, with many examples being given in following presentations.

Rapid evolution

This year the conference saw the rapid and systematic evolution of intelligent signal processing as it moves further towards front-end signal processing at the start of the DAQ chain. This incorporates ultrafast analogue and timing converters that use the waveform analysis concept together with powerful digital signal-processing architectures, which are necessary to compress and extract data in real time in a quasi “deadtime-less” process. Read-out systems are now made of programmable devices that include hardware and software techniques and tools for programming the reconfigurable hardware, such as field-programmable gate arrays, graphic processing units (GPUs) and digital signal processors.

Participants saw the evolution of many new projects that include architectures dealing with fully real-time signal processing, digital data extraction, compression and storage at the front-end, such as the PANDA antiproton-annihilation experiment for the Facility for Antiproton and Ion Research being built at Darmstadt. For the read-out and data-collection systems, the conceptual model is based on fast data transfer, now with multigigabit parallel links from the front-end data buffers up to terabit networks with their associated hardware (routers, switches, etc.). Low-level trigger systems are becoming fully programmable and in some experiments, such as LHCb at CERN, challenging upgrades of the level-0 selection scheme are planned, with trigger processing taking place in real time at large computer farms. There is an ongoing integration of processing farms for high-level triggers and filter farms for online selection of interesting events at the LHC. Experiences with real data were reported at the conference, providing feedback on the improvement of the event selection process.

A survey of control, monitoring and test systems for small and large instruments, as well as new machines – such as the X-ray Free-Electron Laser at DESY – was presented, showing the increasing similarities and possibilities for integration with standard DAQ systems of these instruments. A new track at the conference this year dealt with upgrades of existing systems, mainly related to LHC experiments at CERN and to Belle II at KEK and the SuperB project.

The conference saw an increasing number of applications and projects using new standards and emerging technologies, such as the Advanced Telecommunications Computing Architecture (ATCA), as well as feedback on the experience and lessons learnt from successes and failures. This last topic, in particular, was new at this conference. Rather than showing only great achievements in glossy presentations, it can also be helpful to learn from other people’s difficulties, problems and even mistakes.

CANPS Prize awarded

A highlight of the Real-Time conference is the presentation of the CANPS prize, which is given to individuals who have made outstanding contributions in the application of computers in nuclear and plasma sciences. This year the award went to Christopher Parkman, now retired from CERN, for the “outstanding development and user support of modular electronics for the instrumentation in physics applications”. Special efforts were also made to stimulate student contributions and awards were given for the three best student papers, selected by a committee chaired by Michael Levine of Brookhaven National Laboratory.

Last, an industrial exhibit by a few relevant companies ran through the week (CAEN, National Instruments, Schroff, Struck, Wiener and ZNYX). There was also the traditional two-day workshop on ATCA and MicroTCA – the latest DAQ standard to follow CAMAC, Fastbus and VME, this time adopted from the telecommunications industry. This workshop with tutorials, organized by Ray Larsen and Zheqiao Geng of SLAC and Sergio Zimmermann of LBNL, took place during the weekend before the conference. Two short courses were also held that same weekend, one by Mariano Ruiz of the Technical University of Madrid on DAQ systems and one by Hemant Shukla of LBNL on data analysis with fast graphics cards (GPUs).

The 19th Real-Time Conference will take place in May 2014 in the deer park inside the city of Nara, Japan. It will be organized jointly by KEK, Osaka University and RIKEN under the chair of Masaharu Nomachi. A one-week Asian summer school on advanced techniques in electronics, trigger, DAQ and read-out systems will also be organized jointly with the conference.

• More details about the Real-Time Conference are available online. A special edition of IEEE Transactions on Nuclear Science will include all eligible contributions from the RT2012 conference, with Sascha Schmeling of CERN as senior editor.

The post Berkeley welcomes real-time enthusiasts appeared first on CERN Courier.

An important day for science https://cerncourier.com/a/viewpoint-an-important-day-for-science/ Thu, 23 Aug 2012 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-an-important-day-for-science/ CERN’s director-general, Rolf Heuer, looks beyond the results announced on 4 July to a wider significance.

On 4 July 2012, particle physics was headline news around the world thanks to a scientific success story that began over 60 years ago. It was a great day for science and a great day for humanity: a symbol of what people can achieve when countries pool their resources and work together, particularly when they do so over the long term.

This particular success story is called CERN, a European laboratory for fundamental research born from the ashes of the Second World War with support from all parties, in Europe and beyond. The headline news was the discovery of a particle consistent with the long-sought-after Higgs boson, certainly a great moment for science. In the long term, however, the legacy of 4 July may well be that CERN’s global impact endorses the model established by the organization’s founding fathers in the 1950s and shows that it still sets the standard for scientific collaboration today. CERN’s success exemplifies what people can achieve if we keep sight of the vision that those pioneers had for a community of scientists united in diversity pursuing a common goal.

CERN is a European organization, founded on principles of fairness to its members and openness to the world. Accordingly, its governance model gives a fair voice to all member states, both large and small. Its funding model allows member states to contribute according to their means. Its research model welcomes scientists from around the world who are able to contribute positively to the laboratory’s research programmes. Through these basic principles, CERN’s founding fathers established a model of stability for cross-border collaboration in Europe and for co-ordinated European engagement with the rest of the world, and laid down a blueprint for leadership in the field of particle physics. The result is that today, CERN is undisputedly the hub of a global community of scientists advancing the frontiers of knowledge. It is a shining example of what people can do together.

This fact has not been lost on other fields and over the years several European scientific organizations have emulated the CERN model. The European Space Agency (ESA) and European Southern Observatory (ESO), for example, followed CERN’s example and have also established themselves as leaders in their fields. Today, those thinking of future global science projects look to the CERN model for inspiration.

Scientific success stories like this are now more important than ever. At a time when the world is suffering the worst economic crisis in decades, people – particularly the young – need to see and appreciate the benefits of basic science and collaboration across borders. And at a time when science is increasingly estranged from a science-dependent society, it is important for good science stories to make the news and encourage people to look beyond the headlines. For these reasons, as well as the discovery itself, 4 July was an important day for science.

The post An important day for science appeared first on CERN Courier.

How the CMS collaboration orchestrates its success https://cerncourier.com/a/how-the-cms-collaboration-orchestrates-its-success/ https://cerncourier.com/a/how-the-cms-collaboration-orchestrates-its-success/#respond Tue, 27 Mar 2012 14:35:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/how-the-cms-collaboration-orchestrates-its-success/ A management model for running large collaborations.

New members of the top-level management talk to Antonella Del Rosso about the CMS model for running a large collaboration, as they prepare for the start of the LHC’s run in 2012.

Trying to uncover the deepest mysteries of the universe is no trivial task. Today, the scientific collaborations that accept the challenge are huge, complex organizational structures that have their own constitution, strict budget control and top management. CMS, one of two general-purpose experiments that study the LHC collisions, provides a good example of how this type of scientific complexity can be dealt with.

 

The CMS collaboration currently has around 4300 members, with more than 1000 new faces joining in the past three years. Together they come from some 170 institutes in 40 countries and six continents. Each institute has specific tasks to complete, which are agreed with the management leading the collaboration. “The collaboration is evolving all of the time. Every year we receive applications from five or so new institutes that wish to participate in the experiment,” says Joe Incandela of the University of California Santa Barbara and CERN, who took over as spokesperson of the CMS collaboration at the start of 2012. “The Collaboration Board has the task of considering those applications and taking a decision after following the procedures described in the CMS constitution. All of the participating institutes are committed to maintaining, operating, upgrading and exploiting the physics of the detector.”

Once they become full members of the collaboration, all institutes are represented on the Collaboration Board – the true governing body of CMS. (In practice, small institutes join together and choose a common representative.) The representatives can also vote for the spokesperson every two years. “To manage such a complex structure that must achieve very ambitious goals, the collaboration has so far always sought a spokesperson from among those people who have contributed to the experiment in some substantial way over the years and who have demonstrated some managerial and leadership qualities,” notes deputy-spokesperson Tiziano Camporesi of CERN. “We often meet film-makers or journalists who tell us that they want to feature a few people. They want to have ‘stars’ who can be the heroes of the show but we always tell them that the collaboration has literally thousands of heroes. I have often heard it said that we are like an orchestra: the conductor is important but the whole thing only works if every single musician plays well.”

Although two years may seem to be a short term, Joao Varela – who is a professor at the Instituto Superior Técnico of the Technical University of Lisbon and also deputy-spokesperson – believes that there are many positive aspects in changing the top management rather frequently. “The ‘two-years scheme’ allows CMS to grant this prestigious role to more people over time,” he says. “In this way, more institutes and cultures can be represented at such a high level. There is a sense of fairness in the honour being shared across the whole community. Moreover, each time a new person comes in, by human nature he/she is motivated to bring in new ideas.”

As good as the idea of rotating people in the top management is, the CMS collaboration is currently analysing the experience accumulated so far to see if things can be improved. “So far deputies have always been elected as spokespersons and this has ensured continuity even during the short overlap. I was myself in physics co-ordination, then deputy and finally spokesperson. Even so, I am learning many new things every day,” points out Incandela.

At CMS the spokesperson also nominates his/her deputies and many of the members of the Executive Board, which brings together project managers and activity co-ordinators. “The members of the Executive Board are responsible for most of the day-to-day co-ordination work that is a big part of what makes CMS work so well,” explains Incandela. “Each member is responsible for managing an organization with large numbers of people and a considerable budget in some cases. Historically, the different projects and activities were somewhat isolated from one another, so that members of the board didn’t really have a chance or need to follow what the other areas were doing. With the start of LHC operations in 2008 this began to change and now people focus on broader issues.” To improve communication among the members of the Executive Board, the new CMS management also decided to organize workshops. “These have turned out to be fantastic events,” says Camporesi. “At the meetings, we discuss important and broad issues openly, from what is the best way to do great physics to how to maintain high morale and attract excellent young people to the collaboration.”

To keep the whole collaboration informed about the outcomes of such strategic meetings and other developments in the experiment in general, the CMS management organizes weekly plenary meetings. “I report once a week to the whole collaboration: we typically have anywhere from 50 to 250 people attending, plus 100–200 remote connections. We are a massive organization and the weekly update is a quick and useful means of keeping everybody informed,” adds Incandela.

The scientific achievements of CMS prove not only that a large scientific collaboration is manageable but also that it is effective. In January this year a new two-year term began for the CMS collaboration, which also renewed all of the members of top management. This is a historic moment for the experiment because many potential discoveries are in the pipeline. “This is my third generation of hadron collider – I participated in the UA2 experiment at CERN’s SPS, CDF at Fermilab’s Tevatron and now CMS at the LHC. When you are proposing a new experiment and then building it, the focus is entirely on the detector,” observes Incandela. “Then, when the beam comes, attention moves rapidly to the data and physics. The collaboration is mainly interested in data and the discoveries that we hope to make. We must ensure the high performance of the detector while providing the means for extremely accurate but quick data analysis. However, although almost everything works perfectly, there are already many small things in the detector that need repairing and upgrading.”

The accelerator settings for the LHC’s 2012 run, decided at the Chamonix Workshop in February, will mean that CMS has to operate in conditions that go beyond the design target. “The detector will face tougher pile-up conditions and our teams of experts have been working hard to ensure that all of the subsystems work as expected. It looks like the detector can cope with conditions that are up to 50% higher than the design target”, confirms Camporesi. “Going beyond that could create serious issues for the experiment. We observe that the Level-1 trigger starts to be a limitation and the pixel detector starts to lose data, for instance.” CMS is already planning upgrades to improve granularity and trigger performance to cope with the projected higher luminosity beyond 2014.

Going to higher luminosity may be a big technical challenge but it does mean reducing the time to discovery. “The final word on the Higgs boson is within reach, now measurable in terms of months rather than years. And for supersymmetry, we are changing the strategy. In 2010–2011, we were essentially searching for supersymmetric partners of light quarks because they were potentially more easily accessible. This approach didn’t yield any fruit but put significant constraints on popular models. A lot of people were discouraged,” explains Varela. “However, what we have not ruled out are possible relatively light supersymmetric partners of the third-generation quarks. The third generation is a tougher thing to look for because the signal is smaller and the backgrounds can be higher. By increasing the energy of the beams to 4 TeV one gains 50–70% in pair production of supersymmetric top, for instance, while the top-pair background rises by a smaller margin. Having said this, and given the unexplored environment, it is obviously important if we discover things. But it is also important if we don’t see anything.”

There is a long road ahead because the searches will continue at higher LHC energies and luminosities after 2014, but the CMS collaboration plans to be well prepared.

The post How the CMS collaboration orchestrates its success appeared first on CERN Courier.

Events that match companies and researchers https://cerncourier.com/a/events-that-match-companies-and-researchers/ https://cerncourier.com/a/events-that-match-companies-and-researchers/#respond Wed, 25 Jan 2012 12:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/events-that-match-companies-and-researchers/ While researchers strive for better instruments and methods, industrial companies look towards the research community for turning clever ideas and working prototypes into commercial products with improved accuracy and efficacy.

Beam diagnostics, beam-profile measurements and quality-assurance methods are of the utmost importance for every accelerator or facility, and especially for radiotherapy beams. While researchers strive for better instruments and methods, industrial companies look towards the research community for turning clever ideas and working prototypes into commercial products with improved accuracy and efficacy.

To foster the transfer of technologies based on high-energy physics, the HEPTech network has launched a series of workshops called “Industry – Academia matching events”. These include summary talks from both industry and the research community, together with a poster gallery and live demonstrations, and are aimed at maximizing contact and collaboration between industrial companies and the researchers active in a particular field. The first event, on silicon photomultipliers, was held in February 2011 (p33).

The second event, on the technology and the opportunities for beam instrumentation and measurement, took place on 10–11 November. It was held at GSI Darmstadt, home to much of the early work on heavy-ion radiotherapy. Some 84 participants, including representatives from 18 industrial companies, gathered at the new GSI Conference Building. The next event is planned to take place at DESY this spring and will focus on position-sensitive silicon detectors. It will be organized by the collaboration for Advanced European Infrastructures for Detectors and Accelerators (AIDA) with the support of HEPTech.

The post Events that match companies and researchers appeared first on CERN Courier.

Strangeness and heavy flavours in Krakow https://cerncourier.com/a/strangeness-and-heavy-flavours-in-krakow/ https://cerncourier.com/a/strangeness-and-heavy-flavours-in-krakow/#respond Wed, 23 Nov 2011 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/strangeness-and-heavy-flavours-in-krakow/ The latest news on strangeness in heavy-ion collisions.

The 13th international conference on Strangeness in Quark Matter (SQM 2011) took place in Krakow on 18–24 September. Organized by the Polish Academy of Arts and Sciences (Polska Akademia Umiejętności, PAU), it attracted more than 160 participants from 20 countries. The emphasis was on new data on the production of strangeness and heavy flavours in heavy-ion and hadronic collisions, in particular the new results from the LHC at CERN and the Relativistic Heavy Ion Collider (RHIC) at Brookhaven. With the new high-quality data on identified particles, SQM 2011 in a sense supplemented the Quark Matter conference that was held in Annecy in May.

Summary talks during the first two morning sessions introduced the experimental highlights for the main heavy-ion experiments currently in operation. They included data at energies ranging from the Heavy Ion Synchrotron (SIS) at GSI (the HADES and FOPI experiments), through the Super Proton Synchrotron at CERN (NA49 and NA61) and RHIC (PHENIX and STAR), up to the LHC (ALICE, ATLAS and CMS), as well as prospects for new or future facilities, such as the Facility for Antiproton and Ion Research (FAIR) and the Nuclotron-based Ion Collider Facility (NICA).

In this report we can cover only a small selection of the impressive wealth of new results and information presented at the conference. The following highlights illustrate some of the most recent measurements in nucleus–nucleus collisions at the LHC and in the Beam Energy Scan programme at RHIC. All of them focus on results obtained in the sectors of strange and heavy quarks, which traditionally form the major part of the discussions at SQM conferences.

Experimental results

Maria Nicassio of the University and INFN Bari, for the ALICE collaboration, presented preliminary results on the production of the charged multi-strange hadrons Ξ and Ω and their antiparticles, from peripheral to the most central lead–lead collisions at the current maximum centre-of-mass energy of 2.76 TeV per equivalent nucleon–nucleon collision. The enhancement of these particle yields, normalized to the number of nucleons participating in the collision and compared with proton–proton (pp) production, was shown for the first time (figure 1). As already found in heavy-ion collisions at the SPS and RHIC, the yields at the LHC cannot be achieved in a hadronic phase only, but require a fast equilibration and a large correlation volume. For these reasons, the enhanced production of multi-strange baryons is regarded as one of the signals for the phase transition from ordinary hadronic matter to the quark–gluon plasma (QGP). It was also stressed that, although the absolute production of hyperons increases with energy from RHIC to the LHC, both in heavy-ion and in pp collisions, the relative enhancement decreases as a result of a significant increase in pp yields at the LHC.
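
Schematically, the enhancement plotted in figure 1 is the hyperon yield per participant nucleon in Pb–Pb collisions divided by the corresponding quantity in pp collisions – a sketch of the usual normalization rather than the collaboration’s exact prescription:

```latex
E = \frac{Y_{\mathrm{PbPb}} / \langle N_{\mathrm{part}} \rangle}{Y_{pp} / 2}
```

where Y denotes the per-event yield of a given hyperon species and ⟨Npart⟩ the mean number of participant nucleons in the centrality class considered.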

Using the Beam Energy Scan at RHIC, the STAR collaboration has progressed significantly in tackling the evolution of the collective effects observed in heavy-ion (Au–Au) collisions between √sNN = 7.7 GeV and 62.4 GeV, as Shusu Shi of Central China Normal University showed. While studying the excitation function of the second harmonic v2 in the azimuthal distribution of particle (π, K, p and Λ) production, the collaboration identified a significant difference in the behaviour of particles and antiparticles for 0–80% central Au–Au reactions (figure 2). The increasing deviation in v2 between particles and antiparticles, observed with decreasing √sNN, is more pronounced for baryons, such as protons and Λs, than for mesons (charged π and K). However, it must be noted that above 39 GeV the difference in v2 between particles and antiparticles remains almost constant at about 5–10% up to higher energies. The large difference between particle and antiparticle v2 at lower energies could thus be related to an increased amount of quarks transported to mid-rapidity, or it could indicate that hadronic interactions become dominant below 11.5 GeV. In the latter case, the difference in v2 could be attributed to different interaction cross-sections of particles and antiparticles in hadronic matter of high baryon density.

The ALICE collaboration also presented preliminary results on the azimuthal anisotropy (v2) of charm production in non-central lead–lead collisions at the LHC, in a talk by Chiara Bianchin of the University and INFN Padova. Such a measurement was highly anticipated, after the observation of a large suppression of charmed-meson yields in nucleus–nucleus collisions, which implies strong quenching of charm quarks in dense QGP. The study of charm anisotropy would provide insight into the degree of thermalization of the quenched charm quarks within QGP. Figure 3 shows the v2 parameter of D0 mesons, reconstructed in the K−π+ channel, as a function of transverse momentum (in red), compared with that of charged hadrons (in black). This measurement, though statistically limited, hints at a non-zero charm v2 at low momentum and bodes well for the continuation of the study with the higher-luminosity lead run in 2011.

Theoretical discussions

The conference witnessed a lively debate on theoretical issues. In the theoretical summary talk, Giorgio Torrieri of the Goethe University, Frankfurt, pointed to various differences in the interpretation of heavy-ion data (e.g. equilibrated vs non-equilibrated hadron gas; statistical vs non-statistical production in small systems). Probably everyone connected with the SQM conferences is enthusiastic about the fact that statistical models do an excellent job in describing hadron production in heavy-ion and hadronic collisions, with the key role being played by the fast strange-quark thermalization. Perhaps, this attitude just defines the SQM community. On the other hand, there exist differences in the approaches and interpretations that should be resolved if the community is to gain a better understanding of hadron production processes. The relatively low proton-to-pion production ratio measured recently by ALICE, presented by Alexander Kalweit of the Technische Universität Darmstadt, will trigger such attempts.

The analysis of the data has led to a physical picture that may be regarded as a kind of standard model of relativistic heavy-ion collisions. This model is based on the application of relativistic hydrodynamics combined with the modelling of the initial state on one side, supplemented by the kinetic simulations of freeze-out on the other side. From the theoretical point of view, it is not completely clear how strange particles may be accommodated into this picture, both at RHIC and at the LHC. The results obtained from 2+1 dissipative hydrodynamics, presented by Piotr Bozek of the Institute of Nuclear Physics, Krakow, indicate that the multi-strange particle spectra measured at the LHC cannot be simply reproduced in hydrodynamic calculations that are constructed to describe ordinary hadrons, such as pions, kaons and protons. The new LHC measurement of the elliptic flow of D0 mesons shown in figure 3 will be another important input for hydrodynamic and energy-loss models. As Christoph Blume of the University of Heidelberg indicated, the general concept that strange particles are emitted much earlier than other more abundant hadrons may be challenged in attempts to achieve a uniform description of several observables simultaneously.

Another theoretical activity presented at SQM 2011 was triggered by low-energy experiments aimed at finding the critical point of QCD (RHIC Beam Energy Scan, NA61, FAIR, NICA). This critical point marks the end of the conjectured first-order phase-transition line in the QCD phase diagram. Its position is suggested by the effective models of QCD and lattice QCD simulations. These two approaches suffer from fundamental problems but, nevertheless, deliver useful physical insights. For example, as Christian Schmidt of the Frankfurt Institute for Advanced Studies showed, the lattice QCD calculations suggest that the curvature of the chiral phase-transition line is smaller than that of the freeze-out curve. Moreover, the lattice results are in agreement with the STAR data on net-proton fluctuations. As Krzysztof Redlich of the University of Wroclaw pointed out, theoretical probability distributions of conserved charges may be compared directly with the distributions measured by STAR to probe the critical behaviour.

The last day of the meeting was the occasion for more experimental highlights, presented in the summary talk by Karel Safarik of CERN. The conference ended with a presentation by Orlando Villalobos-Baillie of the next SQM meeting, which will be held in Birmingham, UK, in 2013.

Andrzej Bialas, the founder and the leader of the high-energy physics theoretical group in Krakow, who is currently the president of PAU, was the honorary chair of the conference. The organization chairs were Wojtek Broniowski and Wojtek Florkowski of Jan Kochanowski University, Kielce, and the Institute of Nuclear Physics, Krakow.

The post Strangeness and heavy flavours in Krakow appeared first on CERN Courier.

ARIS 2011 charts the nuclear landscape https://cerncourier.com/a/aris-2011-charts-the-nuclear-landscape/ https://cerncourier.com/a/aris-2011-charts-the-nuclear-landscape/#respond Wed, 23 Nov 2011 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/aris-2011-charts-the-nuclear-landscape/ The first meeting in a new conference series.

The roots of the first conference on Advances in Radioactive Isotope Science, ARIS 2011, go back to CERN in 1964, when the then director-general Victor Weisskopf called for proposals for on-line experiments to study radioactive nuclei at the 600 MeV synchrocyclotron. Why this should be done – and how – became the subject of a conference held in Lysekil, Sweden, in 1966 and a year later experiments began at ISOLDE, CERN’s Isotope Separator On Line. Following this successful start, in 1970 CERN organized a first meeting on nuclei far from stability in Leysin, Switzerland.

Since then there have been regular conferences within the field, with more specialized meetings arising hand in hand with increasingly sophisticated technical developments (see box). Three years ago the community felt that the time was ripe to streamline the conferences by merging all of the physics into a single meeting held every three years. The result was that at the end of May this year some 300 physicists met in the beautiful medieval town of Leuven in Belgium to attend ARIS 2011. The success of the meeting, with its excellent scientific programme, indicates that this was the perfect decision.

Over the past two decades the experimental possibilities for studying exotic nuclear systems have increased dramatically thanks to impressive technical developments for the production of rare nuclear species, both at rest and as energetic beams. New sophisticated detection methods and data-acquisition techniques with on- and off-line analysis methods have also been developed. The two basic techniques now used at laboratories worldwide are the isotope separator on-line (ISOL) and in-flight production methods, with several variations.

Conference highlights

The conference heard the latest news about plans to make major improvements to existing facilities or to build new facilities, offering new research opportunities. The review of the first results from the new major in-flight facility, the Radioactive Isotope Beam Factory at the RIKEN research institute in Japan, was particularly exciting. The production of 45 new neutron-rich isotopes together with results from the Zero-Degree Spectrometer and the radioactive-ion beam separator, BigRIPS, gave a glimpse of the facility’s quality. Future installations, such as the Facility for Antiproton and Ion Research (FAIR) at GSI, SPIRAL2 at the GANIL laboratory, the High Intensity and Energy ISOLDE at CERN, the Facility for Rare Isotope Beams at Michigan State University (MSU) and the Advanced Rare Isotope Laboratory at TRIUMF were also discussed, together with the advanced plans to build EURISOL, a major new European facility complementary to FAIR.

The nuclear mass is arguably the most basic information to be gained for an isotope. Its measurement has involved various techniques, but a paradigm shift came with the development of mass spectrometers based on Penning traps and such devices are now coupled to the majority of radioactive-beam facilities. This has led to mass determinations of unprecedented precision for isotopes in all regions of the nuclear chart, making it possible in effect to walk round the mass “landscape” and scrutinize its details (figure 3).

Recent results from the ISOLTRAP mass spectrometer at CERN, which has been in operation for more than 20 years, have a precision of the order of 10⁻⁸ for the masses of isotopes with half-lives down to milliseconds. The first determinations of masses of neutron-rich francium isotopes, of which the mass of 228Fr (T1/2 = 39 s) is a notable example, were presented at ARIS 2011. The JYFLTRAP group, using the IGISOL facility at the physics department of the University of Jyväskylä (JYFL), presented masses for about 40 neutron-rich isotopes in the medium-mass region. The SHIPTRAP spectrometer at GSI has made measurements of masses towards the region of super-heavy elements; 256Lr, produced at a rate of only two atoms a minute, is the heaviest element studied so far. The TITAN spectrometer at TRIUMF has boosted precision by “breeding” isotopes to higher charge states – for example, in a new measurement of the mass of the super-allowed β-emitter 74Rb, which is relevant to the unitarity of the Cabibbo-Kobayashi-Maskawa (CKM) matrix. Results from isochronous mass-spectroscopy with the CSRe storage ring at the National Laboratory of Heavy Ion Research in Lanzhou were also presented. Ion traps are now routinely used as a key instrument for cooling and bunching the radioactive beams. This gives an improvement of several orders of magnitude in the peak-to-background ratio in laser spectroscopy experiments, or can be used prior to post-acceleration.
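
In outline – a textbook relation rather than the analysis chain of any particular experiment quoted above – a Penning trap converts a mass measurement into a frequency measurement: the cyclotron frequency of the stored ion is compared with that of a reference ion of well known mass in the same magnetic field,

```latex
\nu_c = \frac{qB}{2\pi m},
\qquad
\frac{m_{\mathrm{ion}}}{m_{\mathrm{ref}}} = \frac{q_{\mathrm{ion}}}{q_{\mathrm{ref}}} \,
\frac{\nu_c^{\,\mathrm{ref}}}{\nu_c^{\,\mathrm{ion}}}
```

which is also why charge breeding to higher charge states q, as exploited at TRIUMF, directly boosts the achievable relative precision for a given observation time.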

The determination of nuclear matter and charge radii has been important to the progress of radioactive-beam physics. The observation of shape coexistence in mercury isotopes at ISOLDE was the starting point for impressive developments, with lasers playing the key role. The most recent results from the same area of the nuclear chart are measurements of isotope shifts and charge radii for the isotope chain 191–218Po, made using the Resonance Ionization Laser Ion Source at ISOLDE. Another demonstration of the state of the art was the determination of the charge radius of 12Be using collinear laser spectroscopy based on a frequency comb together with a photon-ion coincidence technique. Electron scattering from radioactive beams will be the next step for investigating nuclear shapes; the ELISe project at GSI and SCRIT at RIKEN are examples of such plans.
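
The step from measured optical isotope shifts to charge radii relies, schematically, on the standard decomposition into a mass shift and a field shift; the atomic factors K and F come from atomic theory or calibration, and δ⟨r²⟩ is the change in the mean-square charge radius between isotopes A and A′:

```latex
\delta\nu^{A,A'} = K \, \frac{m_{A'} - m_{A}}{m_{A}\, m_{A'}} + F \, \delta\langle r^{2} \rangle^{A,A'}
```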

The determination of matter radii by transmission techniques, pioneered at Berkeley in the mid-1980s, led to the discovery of halo states in nuclei. These are well known today but the main data are limited to the lightest region of the nuclear chart. A step towards heavier cases was presented at ARIS in new data from RIKEN, where results from RIPS and BigRIPS indicate a halo state for 22C and 31Ne and maybe also for 37Mg.

The use of laser spectroscopy in measuring charge radii, nuclear spins, magnetic moments and electric quadrupole moments has been extremely successful over the years. New results from the IGISOL facility – mapping the sudden onset of deformation at N = 60 – and from the ISOLDE cooler/buncher ISCOOL – for copper and gallium isotopes – were highlighted at the conference. The two-neutron halo nucleus 11Li continues to attract interest both theoretically and experimentally, where a better determination of the ratio of the electric quadrupole moments between mass 11 and 9 was needed. Now a measurement at TRIUMF based on a β-detected nuclear-quadrupole resonance technique has yielded a value of Q(11Li)/Q(9Li) = 1.077(1). Here, the cross-fertilization between beam and detector developments has led to laser-resonant ionization becoming an essential ingredient in the production cycle of pure radioactive, sometimes isomeric, beams.

Nuclear-structure studies of exotic nuclei were the topic of many contributions at ARIS. There is progress on the theoretical side with large-scale shell model calculations in the vicinity of 78Ni leading to a unified description of neutron-rich nuclei between N = 40 and N = 50. The evolution of collectivity for the N = 40 isotopes has provided many interesting experimental results. From the strongly deformed N = Z nucleus 80Zr, collectivity is rapidly decreasing to 68Ni with a high-lying 2+ state at 2.03 MeV, suggesting a doubly magic character. Going to 64Cr, there is a new deformed region illustrated by a 2+ state at 470 keV, and research at the National Superconducting Cyclotron Laboratory at MSU has found an enhanced collectivity for 78Sr, with a quadrupole deformation parameter of β2=0.44.

Many of the talks at ARIS addressed the “island of inversion”. A recent result from REX-ISOLDE identifies an excited 0+ state in 32Mg, illustrating shape coexistence at the borders of the island. Many new results – a rotational band in 38Mg observed by BigRIPS, isotope shifts for 21–32Mg measured at ISOLDE, β-decay for chromium isotopes from Berkeley and shape-coexistence in 34Si and 44S – add to the understanding of this interesting region of the nuclear chart. A new island of inversion, indicated by data for 80–84Ga from the ALTO facility in Orsay, was also discussed.

Continuing with nuclear structure, data from GANIL and its MUST2 array on d(68Ni, p)69Ni give access to the d5/2 orbital, which is crucial for understanding shell structure and deformation in this mass region. The reaction d(34Si, p)35Si shows a density dependence of the spin-orbit splitting, pointing to a depletion of the central nuclear matter density and hence to a “bubble nucleus” – a topic also discussed in a theory talk.

The doubly magic nucleus 24O has attracted interest for a decade, from both experimental and theoretical viewpoints. At ARIS, the coupled-cluster approach was presented as an ideal compromise between computational cost and numerical accuracy in theoretical models, while the absence of bound oxygen isotopes beyond 24O up to the classically expected doubly magic nucleus 28O presents a theoretical challenge.

Experimentally, there is an impressive series of data – over a wide range of elements – from the MINIBALL array at ISOLDE. One of the highlights here was the observation of shape coexistence in the lead region. A theory talk pointed out that the nuclear energy-density functional approach, both for mean-field and beyond-mean-field applications, is an efficient tool for calculations on medium-mass and heavy nuclei.

Early experiments with radioactive beams revealed exotic decay-modes such as β-delayed particle emission. Today these processes are well understood and used as workhorses to learn about the structure of exotic nuclei. The study of β-delayed three-proton emission from 43Cr and two-proton radioactivity from 48Ni using an Optical Time Projection Chamber at MSU was also presented at ARIS. Here it is clear that in future the study of the most exotic decay modes will use active targets, such as in the Maya detector developed at GANIL and the ACTAR-TPC project being planned by GANIL together with MSU. An interesting new result concerns the observation of β-delayed fission-precursors in the Hg-Tl region, where an unexpected asymmetric fragment-distribution has been observed for the β-delayed fission of 180Tl.

Unbound nuclei, or resonance states, are sometimes dismissed as “ghosts” without any physics significance. However, developments over the past 5–10 years have provided a huge amount of data, so that most of the previously empty spots on the nuclear chart for the light elements are now filled. The production of 10He and 12,13Li from proton-knockout reactions from 11Li and 14Be, respectively, is a particularly spectacular case. The knockout of a strongly bound proton from the almost unbound nucleus 14Be results in a system that decays to 11Li plus two neutrons, with features that can only be attributed to an unbound 13Li nucleus. Many of the resonance states might be populated in transfer reactions in inverse kinematics, in which the exotic nucleus serves as the energetic beam and the light particle that would conventionally be the beam serves as the target. The HELIOS spectrometer, which will use neutron-rich beams from the CARIBU injector at the Argonne Tandem Linear Accelerator System, is a model for what might develop at many facilities in the future.

The super-heavy-element community was represented in several talks at ARIS. Having produced all elements up to Z = 118, the next step is to tackle the Z = 120 barrier, an exciting goal that could become a reality with reactions such as 54Cr+248Cm. Nuclear spectroscopy is also climbing towards ever higher mass numbers and elements, as demonstrated by data from JYFL for 254No. One exciting talk concerned the chemical identification of isotopes of element 114 (287,288Uuq), which is found to belong to group 14 (in modern notation) in the periodic table – the group that contains lead, tin, germanium, silicon and carbon.

The acquisition of data pertinent to nuclear astrophysics has grown tremendously thanks to the access to nuclei in relevant regions of the nuclear chart. Results include the study at JYFL and at the Nuclear-physics Accelerator Institute (KVI), Groningen, of the β-decay of 8B for the solar-neutrino problem, and the work at ISOLDE, JYFL and KVI on the β-decays of 12N and 12B, which are important for the production of 12C in astrophysical environments. Data from ISOLDE and JYFL on the neutron-deficient nuclei 31Ar and 23Al, relevant for explosive hydrogen burning, were also discussed, as were results from MSU relating to the hot carbon–nitrogen–oxygen cycle and the αp-, rp- and r-processes in nucleosynthesis. GANIL has results on the reaction d(60Fe,p)61Fe, which is relevant for type II supernovae, while the Radioactive Ion Beam facility in São Paulo, Brazil, has data on the p(8Li,α)5He reaction. Calculations for proton scattering on 7Be in a many-body approach, combining the resonating-group method with the ab initio no-core shell model, were also described at the conference.

Exotic nuclei can also provide information about fundamental symmetries and interactions. The painstaking collection of data over decades has provided an extremely sensitive test of the unitarity of the top row of the CKM matrix. Today there are precise data for 13 super-allowed β-emitters, which give a value of 0.99990(60) for this quantity. In this context, there are plans for measurements at Argonne, using a magneto-optical trap, of β-neutrino correlations in 6He and of the electric dipole moment of 225Ra. The high-precision set-ups – WITCH at CERN, LPC Trap at GANIL and WIRED at the Weizmann Institute – were also discussed at the conference. The claim is that this kind of experiment – the high-precision frontier – will to some extent complement the high-energy frontier in understanding the deepest secrets of nature.
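
The quantity being tested is the sum of the squared elements of the top row of the CKM matrix, with |Vud| fixed by the super-allowed β-decay data quoted above:

```latex
|V_{ud}|^{2} + |V_{us}|^{2} + |V_{ub}|^{2} = 0.99990(60)
```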

Finally, a review of the different techniques using radioactive isotopes in solid-state physics presented the current state of the art, together with some recent results. This work was pioneered at CERN and has over the years become an important ingredient at many facilities.

In summary, ARIS 2011 turned out to be a successful merger of the former ENAM and RNB conferences (see box). The talks, supported by an excellent poster show, covered the field perfectly. The talks are available on the conference website and the organizers had the excellent idea of putting the posters there too – this is “a first”, to be followed in future.

The post ARIS 2011 charts the nuclear landscape appeared first on CERN Courier.

ILC Global Design Effort publishes milestone report https://cerncourier.com/a/ilc-global-design-effort-publishes-milestone-report/ https://cerncourier.com/a/ilc-global-design-effort-publishes-milestone-report/#respond Fri, 26 Aug 2011 10:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/ilc-global-design-effort-publishes-milestone-report/ As its title suggests, the 162-page report represents the current status of the global R&D that is currently co-ordinated by the GDE.

The International Linear Collider (ILC) Global Design Effort (GDE) has released a major milestone report, The International Linear Collider: A Technical Progress Report. As its title suggests, the 162-page report represents the current status of the global R&D that is currently co-ordinated by the GDE. Coming roughly half way through the ILC Technical Design Phase, it documents the considerable progress that has been made worldwide towards a robust and technically mature design of a 500–1000 GeV electron–positron linear collider. With a stated five-year programme for the technical design phase, the GDE felt it necessary to have a significant mid-term publication milestone that would bridge the gap between the publication of the Reference Design Report (RDR) in 2007 and that of the foreseen Technical Design Report (TDR) in 2012. Because much of the R&D referred to in the report is still ongoing, it necessarily represents a snapshot of the current situation.

The focus of the progress report is on the co-ordinated worldwide “risk-mitigating” R&D that was originally identified at the time of the RDR publication. Although the report is comprehensive in covering nearly all areas of R&D, it has a strong focus on the development of the 1.3 GHz superconducting RF accelerating technology – the heart of the linear collider design. A large fraction of the total resource available has been used to develop the necessary worldwide infrastructure and expert-base in this technology, which includes research into high-gradient superconducting cavities as well as a focus on industrialization and mass-production models for this state-of-the-art technology. A further focus is on the three beam-test facilities: TTF/FLASH at DESY Hamburg, for the superconducting RF linac; the CesrTA facility at Cornell, for damping-ring electron cloud R&D; and ATF/ATF2 at KEK, for final focus optics, instrumentation and beam stabilization. Finally, the report also indicates work towards the ILC TDR baseline design and, in particular, the conventional facilities and siting activities.

The technical progress report will serve as a solid base for the production of the final report on the technical design phase R&D, which will be part of the TDR. Some 350 authors from more than 40 institutes around the globe have contributed to its successful publication. Now attention is already turning to producing the TDR – work that will formally start at the joint ILC-CLIC workshop being held in Granada in September.

• The report, which is available online at www.linearcollider.org/interim-report, is the first of two volumes; a second volume, to be released soon by the ILC Research Directorate, will focus on the ILC scientific case and on the design of the detectors associated with the collider.

The post ILC Global Design Effort publishes milestone report appeared first on CERN Courier.

Heavy ions in Annecy https://cerncourier.com/a/heavy-ions-in-annecy/ https://cerncourier.com/a/heavy-ions-in-annecy/#respond Fri, 26 Aug 2011 10:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/heavy-ions-in-annecy/ A turning point in understanding at Quark Matter 2011.

Since the early 1980s, the Quark Matter conferences have been the most important venue for showing new results in the field of high-energy heavy-ion collisions. The 22nd in the series, Quark Matter 2011, took place in Annecy on 22–29 May and attracted a record 800 participants. Scheduled originally for 2010, it had been postponed to take place six months after the start of the LHC heavy-ion programme. It was hence – after Nordkirchen in 1987 and Stony Brook in 2001 – the third Quark Matter conference to feature results from a new accelerator.

The natural focus of the conference was on the first results from the approximately 10⁸ lead–lead (Pb+Pb) collisions that each of the three experiments – ALICE, ATLAS and CMS – participating in the LHC heavy-ion programme have recorded at the current maximum centre-of-mass energy of 2.76 TeV per equivalent nucleon–nucleon collision. In addition, the latest results from the PHENIX and STAR experiments at Brookhaven’s Relativistic Heavy Ion Collider (RHIC) and its recent beam energy-scan programme featured prominently, as well as data from the Super Proton Synchrotron (SPS) experiments. The conference aimed at a synthesis in the understanding of heavy-ion data over two orders of magnitude in centre-of-mass energy.

The meeting also covered a range of theoretical highlights in heavy-ion phenomenology and field theory at finite temperature and/or density. And although, as one speaker put it, the wealth of first LHC data contributed much to the spirit that “the future is now”, there were sessions on future projects, including the programme of the approved experiment NA61/SHINE at the SPS, plans for upgrades to RHIC, experiments at the Facility for Antiproton and Ion Research under construction in Darmstadt, a plan for a heavy-ion programme at the Nuclotron-based Ion Collider facility in Dubna, as well as detailed studies for an electron–ion programme at a future electron–proton/electron–ion collider, e-RHIC, at Brookhaven, or LHeC at CERN.

Following a long-standing tradition, the conference was preceded by a “student day” featuring a set of introductory lectures catering for the particular needs of graduate students and young postdocs, who represented a third of the conference participants. The official conference inauguration was held on the morning of 22 May in the theatre at Annecy, the Centre Bonlieu, with welcome speeches from CERN’s director-general, Rolf Heuer, the director of the Institut National de Physique Nucléaire et de Physique des Particules (IN2P3), Jacques Martino, and the president of the French National Assembly, Bernard Accoyer. The same morning session featured an LHC status report by Steve Myers of CERN and a theoretical overview by Krishna Rajagopal of Massachusetts Institute of Technology.

Quark Matter 2011 also continued the tradition of scheduling summary talks from all of the major experiments in the introductory session. By the time the 800 participants walked from the Centre Bonlieu along the Lake of Annecy to the Imperial Palace business centre – the site of the afternoon parallel sessions – for a late lunch on the first day, they had listened to experimental summaries by Jurgen Schukraft for ALICE, Bolek Wyslouch for CMS, Peter Steinberg for ATLAS, Hiroshi Masui for STAR and Stefan Bathe for PHENIX. These 25-minute previews set the scene for the detailed discussions of the entire week.

This short report cannot summarize all of the interesting experimental and theoretical developments but it illustrates the breadth of the discussion with a few of the many highlights. Examples from three particular areas must therefore suffice to illustrate the richness of the new results and their implications.

The importance of flow

Heavy-ion collisions at all centre-of-mass energies have long been known to display remarkable features of collectivity. In particular, in semicentral heavy-ion collisions at ultra-relativistic energies, approximately twice as many hadrons above pT = 2 GeV are produced parallel to the reaction plane rather than orthogonal to it, giving rise to a characteristic second harmonic v2 in the azimuthal distribution of particle production. Only a month after the end of the first LHC heavy-ion run, the ALICE collaboration announced in December 2010 that this elliptic flow, v2, persists unattenuated from RHIC to LHC energies. The bulk of the up to 1600 charged hadrons produced per unit rapidity in a central Pb–Pb collision at the LHC seems to emerge from the same flow field. Moreover, the strength of this flow field at RHIC and at the LHC is consistent with predictions from fluid-dynamic simulations, in which it emerges from a partonic state of matter with negligible dissipative properties. Indeed, one of the main motivations for a detailed flow phenomenology at RHIC and at the LHC is that flow measurements constrain dissipative QCD transport coefficients that are accessible to first-principle calculations in quantum field theory, thus providing one of the most robust links between fundamental properties of hot QCD matter and heavy-ion phenomenology.
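
For reference, the flow coefficients vn referred to here are the Fourier coefficients of the azimuthal distribution of the produced particles with respect to the corresponding symmetry-plane angles Ψn; the elliptic flow v2 is the second term of this expansion (a standard definition, not specific to any one experiment):

```latex
\frac{dN}{d\varphi} \propto 1 + 2 \sum_{n \ge 1} v_n \cos\!\left[ n \left( \varphi - \Psi_n \right) \right]
```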

Quark Matter 2011 marks a revolution in the dynamical understanding of flow phenomena in heavy-ion collisions. Until recently, flow phenomenology was based on a simplified event-averaged picture according to which a finite impact parameter collision defines an almond-shaped nuclear overlap region; the collective dynamics then translates the initial spatial asymmetries of this event-averaged overlap into the azimuthal asymmetries of the measured particle-momentum spectra. As a consequence, the symmetries of measured momentum distributions were assumed to reflect the symmetries of event-averaged initial conditions. However, over the past year it has become clear – in an intense interplay of theory and experiment – that there are significant fluctuations in the sampling of the almond-shaped nuclear overlap region on an event-by-event basis. The eventwise propagation of these fluctuations to the final hadron spectra results in characteristic odd flow harmonics, v1, v3, v5, which would be forbidden by the symmetries of an event-averaged spatial distribution at mid-rapidity.

In Annecy, the three LHC experiments and the two at RHIC all showed for the first time flow analyses at mid-rapidity that were not limited to the even flow harmonics v2 and v4; in addition, they indicated sizeable values for the odd harmonics that unambiguously characterize initial-state fluctuations (figure 1). This “Annecy spectrum” of flow harmonics was the subject of two lively plenary debates. The discussion showed that there is already an understanding – both qualitatively and on the basis of first model simulations – of how the characteristic dependence on centrality of the relative size of the measured flow coefficients reflects the interplay between event-by-event initial-state fluctuations and event-averaged collective dynamics.
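
As a rough illustration of how such harmonics are estimated from the measured azimuthal angles, the sketch below uses the simplest Q-vector/event-plane construction. It is a toy example only – it ignores the event-plane resolution corrections, non-flow subtraction and multi-particle cumulant techniques that the experiments actually rely on – and the function name and numbers are invented for the example.

```python
# Toy sketch of a Q-vector / event-plane estimate of the flow harmonics
# v_1 ... v_5 from the azimuthal angles of particles in one event.
# Real analyses apply resolution corrections, non-flow subtraction and
# multi-particle cumulants; none of that is done here.
import math

def flow_harmonics(phis, n_max=5):
    results = {}
    for n in range(1, n_max + 1):
        # Q-vector components for harmonic n
        qx = sum(math.cos(n * phi) for phi in phis)
        qy = sum(math.sin(n * phi) for phi in phis)
        # Event-plane angle Psi_n estimated from the Q-vector
        psi = math.atan2(qy, qx) / n
        # Observed v_n: mean of cos[n(phi - Psi_n)] over the particles
        vn = sum(math.cos(n * (phi - psi)) for phi in phis) / len(phis)
        results[n] = vn
    return results

# Example with a few azimuthal angles (radians)
example_phis = [0.1, 0.5, 1.2, 2.9, 3.1, 4.0, 5.5, 6.0]
print(flow_harmonics(example_phis))
```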

Several participants remarked on the similarity of this picture with modern cosmology, where the mode distribution of fluctuations of the cosmic microwave background also gives access to the material properties of the physical system under study. The counterpart in heavy-ion collisions may be dubbed “vniscometry”. Indeed, since uncertainties in the initial conditions of heavy-ion collisions were the main bottleneck in using data so far for precise determinations of QCD transport coefficients such as viscosity, the measurement of flow coefficients that are linked unambiguously to fluctuations in the initial state has a strong potential to constrain further the understanding of flow phenomena and the properties of hot strongly interacting matter to which they are sensitive.

Quark Matter 2011 also featured major qualitative advances in the understanding of high-momentum transfer processes embedded in hot QCD matter. One of the most important early discoveries of the RHIC heavy-ion programme was that hadronic spectra are generically suppressed at high transverse momentum by up to a factor of 5 in the most central collisions. With the much higher rate of hard processes at the tera-electron-volt scale, the first data from ALICE and CMS have already extended knowledge of this nuclear modification of single inclusive hadron spectra up to pT = 100 GeV/c. In the range below 20 GeV/c, these data show a suppression that is slightly stronger but qualitatively consistent with the suppression observed at RHIC. Moreover, the increased accuracy of LHC data allows, for the first time, the identification of a nonvanishing dependence on transverse momentum of the suppression pattern from a factor of around 7 at pT = 6–7 GeV/c to a factor of about 2 at pT = 100 GeV/c, thus adding significant new information.
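
The suppression factors quoted here are conventionally expressed through the nuclear modification factor,

```latex
R_{AA}(p_T) \;=\; \frac{\mathrm{d}N_{AA}/\mathrm{d}p_T}{\langle N_{\mathrm{coll}}\rangle \; \mathrm{d}N_{pp}/\mathrm{d}p_T} ,
```

where ⟨Ncoll⟩ is the average number of binary nucleon–nucleon collisions in the heavy-ion event. RAA = 1 corresponds to no medium modification (as seen for the electroweak probes discussed below), while the factor-of-7 suppression quoted above corresponds to RAA ≈ 0.14.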

Another important constraint on understanding high-pT hadron production in dense QCD matter was established by the CMS collaboration with the first preliminary data on Z-boson production in heavy-ion collisions and on isolated photon production at pT up to 100 GeV/c. In contrast to all of the measured hadron spectra, the rate of these electroweakly interacting probes is unmodified in heavy-ion collisions (figure 2). The combination of these data gives strong support to models of parton energy loss in which the rate of hard partonic processes is equivalent to that in proton–proton collisions but the produced partons lose energy in the surrounding dense medium.

The next challenge in understanding high-momentum transfer processes in heavy-ion collisions is to develop a common dynamical framework for understanding the suppression patterns of single inclusive hadron spectra and the medium-induced modifications of reconstructed jets. Already in November 2010, the ATLAS and CMS collaborations reported that di-jet events in heavy-ion collisions show a strong energy asymmetry, consistent with the picture that one of the recoiling jets contains a much lower energy fraction in its jet conical catchment area as a result of medium-induced out-of-cone radiation. At Quark Matter 2011, CMS followed up on these first jet-quenching measurements by showing the first characteristics of the jet fragmentation pattern. Remarkably, these first findings are consistent with a certain self-similarity, according to which jets whose energy was degraded by the medium go on to fragment in the vacuum in a similar fashion to jets of lower energy.

This was the first Quark Matter conference in which data on the nuclear modification factor were discussed in the same session as data on reconstructed jets. All of the speakers agreed in the plenary debate that there will be much more to come. On the experimental side, advances are expected from the increased statistics of future runs, complementary analyses of the intra-jet structure and spectra for identified particles, as well as from a proton–nucleus run at the LHC, which would allow the dominant jet-quenching effect to be disentangled from possibly confounding phenomena. On the theoretical side, speakers emphasized the need to improve the existing Monte-Carlo tools for jet quenching with the aim of constraining quantitatively how properties of the hot and dense QCD matter produced in heavy-ion collisions are reflected in the modifications of hard processes.

Another highlight of the conference was provided by the first measurements of bottomonium in heavy-ion collisions, reported by the STAR collaboration for gold–gold (Au–Au) collisions at RHIC and the CMS collaboration for Pb–Pb collisions at the LHC. The charmonium and bottomonium families represent a well defined set of Bohr radii that are commensurate with the typical thermal length scales expected in dense QCD matter. On general grounds, it has long been conjectured that, depending on the temperature of the produced matter, the higher excited states of the quarkonium families should melt while the deeper-bound ground states may survive in the dense medium.

While the theoretical formulation of this picture is complicated by confounding factors related to limited understanding of the quarkonium formation process in the vacuum and possible new medium-specific formation mechanisms via coalescence, the CMS collaboration presented preliminary data on the Υ family that are in qualitative support of this idea (figure 3). In particular, CMS has established within statistical uncertainties the absence of higher excited states of the Υ family in the di-muon invariant mass spectrum, while the Υ(1S) ground state is clearly observed. The rate of this ground state is reduced by around 40% (nuclear modification factor RAA = 0.6) in comparison with the yield in proton–proton collisions, consistent with the picture that the feed-down from excited states into this 1S state is stopped in dense QCD matter. STAR also reported a comparable yield. Clearly, this field is now eagerly awaiting LHC operations at higher luminosity to gain a clearer view of the conjectured hierarchy of quarkonium suppression in heavy-ion collisions.

In addition to the scientific programme, Quark Matter 2011 was accompanied by an effort to reach out to the general public. The week before the conference, the well-known French science columnist Marie-Odile Monchicourt chaired a public debate between Michel Spiro, president of CERN Council, and Étienne Klein, director of the Laboratoire de Recherches sur les Sciences de la Matière at Saclay and professor of philosophy of science at the École Centrale de Paris, attracting an audience of around 400 from the Annecy area. During the Quark Matter conference, physicists and the general public attended a performance by actor Alain Carré and the world-famous Annecy-based pianist François-René Duchâble that merged classical music, literature and artistically transformed pictures from CERN. On another evening, the company Les Salons de Genève performed the play The Physicists, by Swiss writer Friedrich Dürrenmatt, in Annecy’s theatre. While the conference reached out successfully to the general public, participants themselves had some trouble reaching the outside world because the wireless network in the conference centre proved unreliable. However, the highlights were sufficiently numerous to reduce this to a footnote. As one senior member of the community put it during the conference dinner: “It was definitively the best conference since the invention of the internet.”

• For the full programme and videos of Quark Matter 2011, see http://qm2011.in2p3.fr.

The post Heavy ions in Annecy appeared first on CERN Courier.

]]>
https://cerncourier.com/a/heavy-ions-in-annecy/feed/ 0 Feature A turning point in understanding at Quark Matter 2011. https://cerncourier.com/wp-content/uploads/2011/08/CCann1_06_11.jpg
New European novel accelerator network formed https://cerncourier.com/a/new-european-novel-accelerator-network-formed/ https://cerncourier.com/a/new-european-novel-accelerator-network-formed/#respond Tue, 19 Jul 2011 10:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/new-european-novel-accelerator-network-formed/ The European Network for Novel Accelerators (EuroNNAc) was formally launched at a workshop held at CERN on 3–6 May as part of the EuCARD project.

The post New European novel accelerator network formed appeared first on CERN Courier.

]]>
The European Network for Novel Accelerators (EuroNNAc) was formally launched at a workshop held at CERN on 3–6 May as part of the EuCARD project. The aim was to form the network and define the work towards a significant Framework Programme 8 proposal for novel accelerator facilities in Europe.

The workshop was widely supported, with 90 participants from 51 different institutes, including 10 from outside Europe, and had high-level CERN support, with talks by Rolf Heuer, Steve Myers and Sergio Bertolucci. There were also contributions from leading experts such as Gérard Mourou of the Institut Lumière Extrême and Toshi Tajima of Ludwig-Maximilians-Universität, two senior pioneers of the field.

The field of plasma wakefield acceleration, which the new network plans to develop, is changing fast. Interesting beams of 0.3–1 GeV, with 1.5–2.5% energy spread, have now been produced in several places including France, Germany, the UK and the US, with promising reproducibility. Conventional accelerator laboratories are now interested to see if an operational accelerator can be built with these parameters. To avoid replication of work, a distributed test facility spread across many labs is envisaged for creating such a new device.

If a compact, 1 GeV test accelerator were pioneered, it could be copied for use around the world. Possible applications include tests in photon science or use as a test beam for particle detectors. This could ease the present restrictions on beam time experienced by many researchers. These developments are currently restricted to electron accelerators because such machines can be useful even when not fully reliable. Proton machines for medical purposes would, however, need to be more reliable.

In addition to the R&D aspects, the network discussed plans to create a school on Conventional to Advanced Accelerators – possibly linked to the CERN Accelerator School – and to establish a European Advanced Accelerator Conference.

The network activities will be closely co-ordinated with the TIARA and ELI projects. There is currently high funding support for laser science in Europe – about €4 billion in the next decade. EuroNNAc will help in defining the optimal way towards a compact, ultra-high-gradient linac. CERN will co-ordinate this work with help from the École Polytechnique and the University of Hamburg/DESY.

The post New European novel accelerator network formed appeared first on CERN Courier.

]]>
https://cerncourier.com/a/new-european-novel-accelerator-network-formed/feed/ 0 News The European Network for Novel Accelerators (EuroNNAc) was formally launched at a workshop held at CERN on 3–6 May as part of the EuCARD project.
LHC physics meets philosophy https://cerncourier.com/a/lhc-physics-meets-philosophy/ https://cerncourier.com/a/lhc-physics-meets-philosophy/#respond Tue, 19 Jul 2011 10:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/lhc-physics-meets-philosophy/ A report on a new interdisciplinary school held in Germany.

The post LHC physics meets philosophy appeared first on CERN Courier.

]]>
At the end of March, the first Spring School on Philosophy and Particle Physics took place in Maria in der Aue, a conference resort of the archbishopric of Cologne in the rolling hills of the area called Bergisches Land, between Cologne and Wuppertal. It was organized by the members of the Deutsche Forschungsgemeinschaft’s interdisciplinary research project, “Epistemology of the Large Hadron Collider”, which is based at the Bergische Universität Wuppertal. Part of the time was reserved for lecture series by distinguished representatives of each field, including: Wilfried Buchmüller, Gerardus ’t Hooft, Peter Jenni and Chris Quigg from physics; Jeremy Butterfield, Doreen Fraser and Paul Hoyningen-Huene from philosophy; and Helge Kragh from the history of science. The afternoons were devoted to five working groups of philosophy and physics students who discussed specific topics such as the reality of quarks and grand unification. The students then presented their results at the end of the school.

The large number of applications – more than 100 for 30 available places – from PhD students and young post-docs from all over the world demonstrated the strong interest in this interdisciplinary dialogue. There was an almost equal share of applicants from physics and philosophy. The pairing of students and lecturers from such different backgrounds made the school a great success. Almost all of the students rated it “very good” or “excellent” in their evaluations.

Theory and reality

The diverse academic backgrounds of the participants stimulated plenty of discussions during the lectures and working groups, as well as late into the night over beer. They centred on the main lecture topics: the reality of physical theories and concepts, experimental and theoretical methods in particle physics, and the history and philosophy of science.

For example, one of the working groups was concerned with the question, “Are quarks real?” Most physicists would, of course, answer “yes”. But then again, the existence of quarks is inferred in a way that is indirect and theory-laden – much more than for, say, chairs and tables. Are there different levels of reality? Or are quarks just auxiliary constructs that will be superseded by other concepts in the future, as happened with the ether in the 19th century, for example? A comprehensive picture of philosophical attitudes towards the reality content of physical theories was discussed by the philosopher Hoyningen-Huene of the University of Hannover. His lecture series also took a critical look at other aspects of the philosophy of science, focusing on the classic ideas of Karl Popper and Thomas Kuhn: What qualifies as a scientific theory? Are physical theories verifiable? Are they falsifiable? How do physical theories evolve over time?

Fraser, of the University of Waterloo, and Butterfield, of the University of Cambridge, discussed the scope and applicability of particle and field concepts in the interpretation of quantum field theory (QFT), certainly one of the most successful achievements in physics. However, Fraser pointed out that the need for renormalization in QFT, as used in particle physics, reflects a conceptual problem. On the other hand, the more rigorous algebraic QFT does not allow for an interpretation in terms of particles, at least in the traditional sense.

Another topic that has attracted the attention of philosophers in recent years concerns gauge theories and spontaneous symmetry breaking, as Holger Lyre, of Otto-von-Guericke-Universität Magdeburg, discussed in his lecture. He asked whether it is justified to speak of “spontaneous breaking of a gauge symmetry” given that gauge symmetries are unobservable, a theme that was also discussed in a working group. Again, most physicists would take the pragmatic view that it is justified as long as all physical predictions are observed. Philosophers, however, look for the aspects of gauge theories that can count as being “objectively real”.

The contrasting attitudes of physicists and philosophers were captured in a nutshell when a renowned physicist was asked whether he considers the electron to be a field or a particle, and the physicist replied: “Well, I usually think of it as a small yellow ball.” Pragmatism – motivated by a remarkably successful theoretical and experimental description of particle physics – clashed with the attempt to find unambiguous definitions for its basic theoretical constructs. It was one of the goals of the school to understand each other’s viewpoints in this context.

The physics lectures covered both experiment and theory. On the experimental side, Jenni, of CERN, and Peter Mättig, of the University of Wuppertal, discussed methods and basic assumptions that allow us to deduce the existence of new particles from electronic detector signals. As also discussed in one of the working groups, the inference from basic (raw) detector signals to claiming evidence for a theory involves a long chain of reasoning. The related philosophical question concerns the justification of the various steps and their theory-ladenness, i.e. the extent to which theoretical concepts bias experimentation, and vice versa. Closely related is the question, also addressed in the discussion, of how far the LHC experiments are equipped to find any new particle or interaction that may occur.

The theory lectures of Robert Harlander, of the University of Wuppertal, Michael Krämer, of RWTH Aachen, and Quigg, of Fermilab, focused on the driving forces for new theories beyond the Standard Model. Apart from cosmological indications – comprehensively reviewed by DESY’s Buchmüller in one of the evening sessions – there is no inherent need for such a theory. Yet, almost everyone expects the LHC to open the door to a more encompassing theory. Why are physicists not happy with the Standard Model and what are the aims and criteria of a “better” theory? One of the working groups discussed specifically the quest for unification as one of the driving forces for a more aesthetic theory.

A current, highly valued guiding principle for model building is the concept of “naturalness”. To what extent are small ratios of natural parameters acceptable, such as the size of an atom compared with the size of the universe? As Nobel laureate ’t Hooft discussed in an evening talk, again there is no direct physics contradiction in having arbitrarily small parameters. But the physicists’ attitude is that large hierarchies are crying out for an explanation. Naturalness requires that a small ratio can arise only from a slightly broken symmetry. This is the background for many models that increase the symmetry of the Standard Model to justify the smallness of the weak scale relative to the Planck scale. Another idea that ’t Hooft discussed is to invoke anthropic arguments fuelled, for example, by the discovery of the string landscape consisting of something like 10⁵⁰⁰ different vacua.

Closely related to the philosophy of science is the history of science. The development of the Standard Model was the subject of one of the working groups and was also comprehensively discussed by Kragh, of the University of Aarhus. Looking at the sometimes controversial emergence of the Standard Model revealed lessons that may well shape the future. Kragh reminded the audience that what is considered “certain” today only emerged after a long struggle against some “certain facts” of former times.

At first glance, philosophical questions may not be directly relevant for our day-to-day work as physicists. Nevertheless, communication between the two fields can be fruitful for both sides. Philosophy reminds us to retain a healthy scepticism towards concepts that appear too successful to be questioned. In return, the development of new experimental and theoretical methods and ideas may help to sharpen philosophical concepts. Looking into the history of physics may teach us how suddenly perspectives can change. Coming on the brink of the possible discovery of new physics at the LHC, the school was a great experience, prompting reflection on what we as physicists take for granted. The plan is to have another school in two years.

The post LHC physics meets philosophy appeared first on CERN Courier.

]]>
https://cerncourier.com/a/lhc-physics-meets-philosophy/feed/ 0 Feature A report on a new interdisciplinary school held in Germany. https://cerncourier.com/wp-content/uploads/2011/07/CCphi1_06_11.jpg
Second Banff Workshop debates discovery claims https://cerncourier.com/a/second-banff-workshop-debates-discovery-claims/ https://cerncourier.com/a/second-banff-workshop-debates-discovery-claims/#respond Tue, 26 Oct 2010 10:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/second-banff-workshop-debates-discovery-claims/ On 11–16 July, the Banff International Research Station in the Canadian Rockies hosted a workshop for high-energy physicists, astrophysicists and statisticians to debate statistical issues related to the significance of discovery claims. This was the second such meeting at Banff (CERN Courier November 2006 p34) and the ninth in a series of so-called “PHYSTAT” workshops and […]

The post Second Banff Workshop debates discovery claims appeared first on CERN Courier.

]]>

On 11–16 July, the Banff International Research Station in the Canadian Rockies hosted a workshop for high-energy physicists, astrophysicists and statisticians to debate statistical issues related to the significance of discovery claims. This was the second such meeting at Banff (CERN Courier November 2006 p34) and the ninth in a series of so-called “PHYSTAT” workshops and conferences that started at CERN in January 2000 (CERN Courier May 2000 p17). The latest meeting was organized by Richard Lockhart, a statistician from Simon Fraser University, together with two physicists, Louis Lyons of Imperial College and Oxford, and James Linnemann of Michigan State University.

The 39 participants, of whom 12 were statisticians, prepared for the workshop by studying a reading list compiled by the organizers and by trying their hand at three simulated search problems inspired by real data analyses in particle physics. These problems are collectively referred to as the “second Banff Challenge” and were put together by Wade Fisher of Michigan State University and Tom Junk of Fermilab.

Significant issues

Although the topic of discovery claims may seem rather specific, it intersects many difficult issues that physicists and statisticians have been struggling with over the years. Particularly prominent at the workshop were the topics of model selection, with the attendant difficulties caused by systematic uncertainties and the “look-elsewhere” effect; measurement sensitivity; and parton density function uncertainties. To bring everyone up to date on the terminology and problematics of searches, three introductory speakers surveyed the relevant aspects of their respective fields: Lyons for particle physics, Tom Loredo of Cornell University for astrophysics and Lockhart for statistics.

Bob Cousins of the University of California, Los Angeles, threw the question of significance into sharp relief by discussing a famous paradox in the statistics literature, originally noted by Harold Jeffreys and later developed by Dennis Lindley, both statisticians. The paradox demonstrates with a simple measurement example that it is possible for a frequentist significance test to reject a hypothesis, whereas a Bayesian analysis indicates evidence in favour of that hypothesis. Perhaps even more disturbing is that the frequentist and Bayesian answers scale differently with sample size (CERN Courier September 2007 p39). Although there is no clean solution to this paradox, it yields several important lessons about the pitfalls of testing hypotheses.

One of these is that the current emphasis in high-energy physics on a universal “5σ” threshold for claiming discovery is without much foundation. Indeed, the evidence provided by a measurement against a hypothesis depends on the size of the data sample. In addition, the decision to reject a hypothesis is typically affected by one’s prior belief in it. Thus one could argue, for example, that to claim observation of a phenomenon predicted by the Standard Model of elementary particles, it is not necessary to require the same level of evidence as for the discovery of new physics. Furthermore, as Roberto Trotta of Imperial College pointed out in his summary talk, the emphasis on 5σ is not practiced in other fields, in particular cosmology. For example, Einstein’s theory of gravity passed the test of Eddington’s measurement of the deflection of light by the Sun with rather weak evidence when judged by today’s standards.
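
A minimal numerical sketch of the Jeffreys–Lindley paradox – with an illustrative prior width and sample sizes chosen purely for demonstration, not taken from any analysis discussed at the workshop – makes this dependence on sample size explicit:

```python
import numpy as np
from scipy.stats import norm

def lindley_demo(z=2.5, sigma=1.0, tau=1.0, sample_sizes=(10, 1_000, 100_000, 10_000_000)):
    """Keep the observed significance z fixed while the sample size n grows.
    H0: theta = 0;  H1: theta ~ N(0, tau^2);  data: mean of n measurements."""
    for n in sample_sizes:
        xbar = z * sigma / np.sqrt(n)                         # the "observed" sample mean
        se = sigma / np.sqrt(n)                               # its standard error
        p_value = 2 * norm.sf(z)                              # two-sided frequentist p-value
        evidence_h0 = norm.pdf(xbar, 0.0, se)                 # likelihood of xbar under H0
        evidence_h1 = norm.pdf(xbar, 0.0, np.hypot(se, tau))  # marginal likelihood under H1
        b01 = evidence_h0 / evidence_h1                       # Bayes factor; > 1 favours H0
        print(f"n = {n:>10,}   p-value = {p_value:.4f}   B01 = {b01:10.1f}")

lindley_demo()
```

The two-sided p-value stays fixed as n grows, while the Bayes factor B01 increasingly favours the null hypothesis – the essence of the paradox.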

Statistician David van Dyk, of the University of California, Irvine, came back to the 5σ issue in his summary talk, wondering if we are really worried about one false discovery claim in 3.5 million tries. His answer, based on discussions during the workshop, was that physicists are more concerned about systematic errors and the “look-elsewhere” effect (i.e. the effect by which the significance of an observation decreases because one has been looking in more than one place). According to van Dyk, the 5σ criterion is a way to sweep the real problem under the rug. His recommendation: “Honest frequentist error rates, or a calibrated Bayesian procedure.”
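
For reference, the figure of one in 3.5 million is simply the reciprocal of the one-sided Gaussian tail probability at 5σ,

```latex
p_{5\sigma} \;=\; 1 - \Phi(5) \;\approx\; 2.9\times 10^{-7} \;\approx\; \frac{1}{3.5\times 10^{6}} .
```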

Many workshop participants commented on the look-elsewhere effect. Taking this effect properly into account usually requires long and difficult numerical simulations, so that techniques to simplify or speed up the latter are eagerly sought. Eilam Gross, of the Weizmann Institute of Science, presented the work that he did on this subject with his student Ofer Vitells. Using computer studies and clever guesswork, they obtained a simple formula to correct significances for the look-elsewhere effect. In his summary talk, Luc Demortier of Rockefeller University showed how this formula could be derived rigorously from results published by statistician R B Davies in 1987. Statistician Jim Berger of Duke University explained that in the Bayesian paradigm the look-elsewhere effect is handled by a multiplicity adjustment: one assigns prior probabilities to the various hypotheses or models under consideration, and then averages over these.
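
The effect itself is straightforward to demonstrate by brute force. The toy simulation below – a hypothetical scan over independent mass bins, deliberately much cruder than the Gross–Vitells treatment or any real analysis – compares the local and global p-values of a 3σ excess:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2011)
n_bins, n_toys = 40, 50_000

# Background-only pseudo-experiments: the local significance in each of the
# n_bins mass bins is approximated by an independent standard normal draw,
# and the most significant bin of each pseudo-experiment is recorded.
z_max = rng.standard_normal((n_toys, n_bins)).max(axis=1)

z_local = 3.0
p_local = norm.sf(z_local)            # p-value of a 3 sigma bump, ignoring the scan
p_global = (z_max >= z_local).mean()  # p-value once the whole scan may fluctuate
print(f"local p = {p_local:.1e}   global p = {p_global:.1e}   "
      f"trial factor ~ {p_global / p_local:.0f}")
```

With independent bins the trial factor comes out close to the number of bins; with correlated bins and floating signal shapes, as in a real search, it must be obtained from the more careful treatments discussed above.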

Likelihoods and measurement sensitivity

Systematic uncertainties, the second “worry” mentioned by van Dyk, also came under discussion several times. From a statistical point of view, these uncertainties typically appear in the form of “nuisance parameters” in the physics model, for example a detector energy scale. Glen Cowan, of Royal Holloway, University of London, described a set of procedures for searching for new physics, in which nuisance parameters are eliminated by maximizing them out of the likelihood function, thus yielding the so-called “profile likelihood”. An alternative treatment of these parameters is to elicit a prior density for them and integrate the likelihood weighted by this density; the resulting marginal likelihood was shown by Loredo to take better account of parameter uncertainties in some unusual situations.
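
The two prescriptions can be illustrated on the simplest possible example. In the sketch below – with invented event counts, not numbers from any analysis presented in Banff – the background nuisance parameter b of a single-bin counting experiment is eliminated either by profiling or by marginalizing over a prior derived from a sideband measurement:

```python
import numpy as np
from scipy.stats import poisson, gamma
from scipy.optimize import minimize_scalar
from scipy.integrate import quad

# Single-bin counting experiment: n ~ Poisson(s + b), with the background b
# constrained by a sideband measurement m ~ Poisson(tau * b).  All numbers
# are invented for illustration.
n_obs, m_obs, tau = 12, 50, 10.0

def loglike(s, b):
    return poisson.logpmf(n_obs, s + b) + poisson.logpmf(m_obs, tau * b)

def profile_loglike(s):
    """Eliminate b by maximising it out of the likelihood (profiling)."""
    res = minimize_scalar(lambda b: -loglike(s, b), bounds=(1e-6, 20.0), method="bounded")
    return -res.fun

def marginal_loglike(s):
    """Eliminate b by integrating it out against a sideband-derived prior."""
    prior = lambda b: gamma.pdf(b, a=m_obs + 1, scale=1.0 / tau)  # posterior of b from the sideband
    val, _ = quad(lambda b: np.exp(loglike(s, b)) * prior(b), 0.0, 20.0)
    return np.log(val)

for s in (0.0, 2.0, 5.0, 10.0):
    print(f"s = {s:4.1f}   profile lnL = {profile_loglike(s):8.3f}   "
          f"marginal lnL = {marginal_loglike(s):8.3f}")
```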

While the marginal likelihood is essentially a Bayesian construct, some statisticians have advocated combining a Bayesian handling of nuisance parameters with a frequentist handling of parameters of interest. Kyle Cranmer of New York University showed how this hybrid approach could be implemented in general within the framework of the RooFit/RooStats extension of CERN’s ROOT package. Unfortunately, systematic effects are not always identified at the beginning of an analysis. Henrique Araújo of Imperial College illustrated this with a search for weakly interacting massive particles that was conducted blindly until the discovery of an unforeseen systematic bias. The analysis had to be redone after taking this bias into account – and was no longer completely blind.

In searches for new physics, the opposite of claiming discovery of a new object is excluding that it was produced at a rate high enough to be detected. This can be quantified with the help of a confidence limit statement. For example, if we fail to observe a Higgs boson of given mass, we can state with a pre-specified level of confidence that its rate of production must be lower than some upper limit. Such a statement is useful to constrain theoretical models and to set the design parameters of the next search and/or the next detector. Therefore, in calculating upper limits, it is of crucial importance to take into account the finite resolution of the measuring apparatus.

How exactly to do this is far from trivial. Bill Murray of Rutherford Appleton Laboratory reviewed how the collaborations at the Large Electron–Positron collider solved this problem with a method known as CLs. He concluded that although this method works for the simplest counting experiment, it does not behave as desired in other cases. Murray recommended taking a closer look at an approach suggested by ATLAS collaborators Gross, Cowan and Cranmer, in which the calculated upper limit is replaced by a sensitivity bound whenever the latter is larger. Interestingly, van Dyk and collaborators had recently (and independently) recommended a somewhat similar approach in astrophysics.
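
For the simplest single-bin counting experiment, the CLs construction reduces to a ratio of two Poisson tail probabilities; the sketch below, with hypothetical event counts, excludes a signal strength s at 95% confidence when CLs falls below 0.05:

```python
from scipy.stats import poisson

def cls(n_obs, s, b):
    """CLs for a single-bin counting experiment, using the observed event
    count as the test statistic (a minimal sketch of the method)."""
    cl_sb = poisson.cdf(n_obs, s + b)  # prob. of n_obs or fewer events under signal+background
    cl_b = poisson.cdf(n_obs, b)       # the same under background only
    return cl_sb / cl_b

# Hypothetical numbers: 3 events observed on an expected background of 5.
n_obs, b = 3, 5.0
for s in (1.0, 3.0, 5.0, 7.0):
    tag = "excluded at 95% CL" if cls(n_obs, s, b) < 0.05 else "not excluded"
    print(f"s = {s:3.1f}   CLs = {cls(n_obs, s, b):.3f}   {tag}")
```

Dividing by CLb is what protects against excluding signals to which the experiment has no real sensitivity – the issue that the sensitivity-bound proposals mentioned above address in a different way.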

Parton density uncertainties

As Lyons pointed out in his introductory talk, parton distribution functions (PDFs) are crucial for predicting particle-production rates, and their uncertainties affect the background estimates used in significance calculations in searches for new physics. It is therefore important to understand how these uncertainties are obtained and how reliable they are. John Pumplin of Michigan State University and Robert Thorne of University College London reviewed the state of the art in PDF fits. These fits use about 35 experimental datasets, with a total of approximately 3000 data points. A typical parametrization of the PDFs involves 25 floating parameters, and the fit quality is determined by a sum of squared residuals. Although individual datasets exhibit good fit quality, they tend to be inconsistent with the rest of the datasets. As a result, the usual rule for determining parameter uncertainties (Δχ² = 1) is inappropriate, as Thorne illustrated with measurements of the production rate of W bosons.

The solution proposed by PDF fitters is to determine parameter uncertainties using a looser rule, such as Δχ² = 50. Unfortunately, there is no statistical justification for such a rule. It clearly indicates that the assumption of Gaussian statistics badly underestimates the uncertainties, but it is not yet understood whether this is the result of unreported systematic errors in the data, systematic errors in the theory or the choice of PDF parametrization.

Statistician Steffen Lauritzen of the University of Oxford proposed a random-effects model to separate the experimental variability of the individual datasets from the variance arising from systematic differences. The idea is to assume that the theory parameter is slightly different for each dataset and that all of these individual parameters are constrained to the formal parameter of the theory via some distributional assumptions (a multivariate t prior, for example). Another suggestion was to perform a “closure test”, i.e. to check to what extent one could reproduce the PDF uncertainties by repeatedly fluctuating the individual data points by their uncertainties before fitting them.
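
The closure test mentioned above is essentially a parametric bootstrap. A toy version on a straight-line fit – entirely schematic, and in no way a stand-in for a real PDF fit – shows the mechanics:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy closure test: are the quoted parameter errors reproduced when the data
# points are re-fluctuated by their stated uncertainties and refitted many
# times?  (Real PDF fits involve ~3000 points and ~25 parameters.)
x = np.linspace(0.0, 1.0, 30)
sigma = 0.1                                   # per-point uncertainty
y_obs = 0.5 + 2.0 * x + rng.normal(0.0, sigma, x.size)

def fit(y):
    # Weighted least-squares fit of y = a*x + b with known uncertainties
    return np.polyfit(x, y, deg=1, w=np.full_like(x, 1.0 / sigma), cov="unscaled")

_, cov = fit(y_obs)
refits = np.array([fit(y_obs + rng.normal(0.0, sigma, x.size))[0] for _ in range(2000)])

print("errors quoted by the fit :", np.sqrt(np.diag(cov)))
print("spread of refitted values:", refits.std(axis=0))
```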

In addition to raising issues that require further thought, the workshop provided an opportunity to discuss the potential usefulness of statistical techniques that are not well known in the physics community. Chad Schafer of Carnegie Mellon University presented an approach to constructing confidence regions and testing hypotheses that is optimal with respect to a user-defined performance criterion. This approach is based on statistical decision theory and is therefore general: it can be applied to complex models without relying on the usual asymptotic approximations. Schafer described how such an approach could help solve the Banff Challenge problems and quantify the uncertainty in estimates of the parton densities.

Harrison Prosper of Florida State University criticized the all-too frequent use of flat priors in Bayesian analyses in high-energy physics, and proposed that these priors be replaced by the so-called “reference priors” developed by statisticians José Bernardo, Jim Berger and Dongchu Sun over the past 30 years. Reference priors have several properties that should make them attractive to physicists; in particular their definition is very general, they are covariant under parameter transformations and they have good frequentist sampling behaviour. Jeff Scargle, of NASA’s Ames Research Center, dispatched some old myths about data binning and described an optimal data-segmentation algorithm known as “Bayesian blocks”, which he applied to the Banff Challenge problems. Finally, statistician Michael Woodroofe of the University of Michigan presented an importance-sampling algorithm to calculate significances under nonasymptotic conditions. This algorithm can be generalized to cases involving a look-elsewhere effect.

After the meeting, many participants expressed their enthusiasm for the workshop, which raised issues that need further research and pointed to new tools for analysing and interpreting observations. The discussions between sessions provided a welcome opportunity to deepen understanding of some topics and exchange ideas. That the meeting took place in the magical surroundings of the Banff National Park could only help its positive effect.

Further reading

The most recent PHYSTAT conference was at CERN in 2007, see http://phystat-lhc.web.cern.ch/phystat-lhc/. (Links to the earlier meetings can be found at www.physics.ox.ac.uk/phystat05/reading.htm.) Details about the 2010 Banff meeting are available at www.birs.ca/events/2010/5-day-workshops/10w5068.

The post Second Banff Workshop debates discovery claims appeared first on CERN Courier.

]]>
https://cerncourier.com/a/second-banff-workshop-debates-discovery-claims/feed/ 0 Feature
An international future for nuclear-physics research https://cerncourier.com/a/an-international-future-for-nuclear-physics-research/ https://cerncourier.com/a/an-international-future-for-nuclear-physics-research/#respond Tue, 28 Sep 2010 10:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/an-international-future-for-nuclear-physics-research/ An IUPAP working group takes a forward look.

The post An international future for nuclear-physics research appeared first on CERN Courier.

]]>
The International Union of Pure and Applied Physics (IUPAP) was established nearly 90 years ago to foster international co-operation in physics. It does this in part through the activities of a number of commissions for different areas of research, including the Commission on Nuclear Physics (C12), set up in 1960. In the mid-1990s, under Erich Vogt as chair, C12 identified the need for a coherent effort to stimulate international co-operation in nuclear physics. While it took some time for this new thrust to gain momentum, by 2003, under Shoji Nagamiya as chair, C12 established a subcommittee on International Co-operation in Nuclear Physics. This body, chaired by Anthony Thomas, then became IUPAP’s ninth official working group, WG.9, at the IUPAP General Assembly in Cape Town in October 2005. As many will be aware, the first working group, IUPAP WG.1, is the International Committee of Future Accelerators (ICFA), which was formed more than 40 years ago and plays such an important role in particle physics.

The membership of IUPAP WG.9 was chosen to constitute a broad representation of geographical regions and nations, as one would expect for a working group of IUPAP. It comprises the working group’s chair, past-chair and secretary; the chairs and past-chairs of the Nuclear Physics European Collaboration Committee (NuPECC), the Nuclear Science Advisory Committee (NSAC), the Asian Nuclear Physics Association (ANPhA) and the Latin-American Association for Nuclear Physics (ALAFNA); the chair of IUPAP C12; the directors of the large nuclear-physics facilities (up to four each from Asia, Europe and North America); and one further representative from Latin America. The working group meets every year at the same location as, and on the day prior to, the AGM of IUPAP C12 – whose members are encouraged to attend all meetings of IUPAP WG.9 as observers. Other meetings, such as the two-day Symposium on Nuclear Physics and Nuclear Physics Facilities, are held as required.

The first task of IUPAP WG.9 was to answer three specific questions:

• What constitutes nuclear physics from an international perspective?

• Which are the facilities that are used to investigate nuclear physics phenomena?

• Which are the scientific questions that these facilities are addressing?

The answers to these questions are given in IUPAP Report 41, which was published in 2007 and is posted on the IUPAP WG.9 website (IUPAP 2007). It contains entries for all nuclear-physics user facilities that agreed to submit data. The 90 entries range from smaller facilities with more restricted regional users to large nuclear-physics accelerator laboratories with a global user group. The report also has a brief review, prepared by the IUPAP WG.9 members, of the major scientific questions facing nuclear physics today, together with a summary of how these questions are being addressed by the current facilities or how they will be addressed by future and planned facilities. There is also a short account of the benefits that society has received, or is receiving, as a result of the discoveries made in nuclear physics.

In late 2005 the Office of Nuclear Physics in the US Department of Energy’s Office of Science asked the OECD Global Science Forum (GSF) to establish a GSF Working Group on Nuclear Physics. The purpose of this working group was to prepare an international “landscape” for nuclear physics for the next 10 to 15 years. In particular, it was clear that for policy makers in many countries it is essential to understand how proposals for future facilities fit within an international context. IUPAP WG.9 agreed to provide expert advice to the GSF Working Group, and the chair and secretary of WG.9 as well as the chair of IUPAP C12 served as members of the GSF Working Group.

The work of the GSF Working Group was completed in March 2008, with the final version of the report being accepted by the OECD GSF. IUPAP Report 41 provided a great deal of valuable input, with the data and analysis contained within it helping to guide the deliberations of the GSF Working Group. Copies of the final OECD GSF report, which provides a global roadmap for nuclear physics for the next decade, in a format suitable for science administrators, are available from the OECD Secretariat; it is also downloadable from the GSF website (OECD GSF 2008).

Central themes

In response to the mandate given to IUPAP WG.9 by the OECD GSF in a missive from its chair, Hermann-Friedrich Wagner, a two-day Symposium on Nuclear Physics and Nuclear Physics Facilities took place at TRIUMF on 2–3 July. The purpose of the symposium was to provide a forum where the international proponents of nuclear science could be apprised of, and discuss, the present and future plans for nuclear physics research, as well as the upgraded and new research facilities that will be required to realize these plans. This symposium was the first time that proponents of nuclear science, laboratory directors of the large nuclear physics facilities and governmental science administrators had met in an international context. The symposium is expected to be held every three years.

At the 2009 AGM of IUPAP WG.9, which was held at the Forschungszentrum Jülich in August 2009, the decision was taken to update the 90 descriptions of the nuclear-physics facilities and institutions. Following the requests for updated information, 35 replies with updated descriptions were received. These were entered into the online version of IUPAP Report 41 in January 2010. Following the International Symposium on Nuclear Physics and Nuclear Physics Facilities it became apparent that the introduction to the IUPAP Report 41 also needed updating. IUPAP WG.9 is currently reformulating the six main themes of nuclear physics today:

• Can the structure and interactions of hadrons be understood in terms of QCD?

• What is the structure of nuclear matter?

• What are the phases of nuclear matter?

• What is the role of nuclei in shaping the evolution of the universe, with the known forms of matter comprising only a meagre 5%?

• What physics is there beyond the Standard Model?

• What are the chief nuclear-physics applications serving society worldwide?

It is anticipated that these new descriptions for the roadmap for nuclear science will be entered in the online version of IUPAP Report 41 in January 2011.

The post An international future for nuclear-physics research appeared first on CERN Courier.

]]>
https://cerncourier.com/a/an-international-future-for-nuclear-physics-research/feed/ 0 Feature An IUPAP working group takes a forward look. https://cerncourier.com/wp-content/uploads/2010/09/CCiup2_08_10.jpg
IUPAP working group organizes two-day international symposium on plans for worldwide nuclear physics https://cerncourier.com/a/iupap-working-group-organizes-two-day-international-symposium-on-plans-for-worldwide-nuclear-physics/ https://cerncourier.com/a/iupap-working-group-organizes-two-day-international-symposium-on-plans-for-worldwide-nuclear-physics/#respond Tue, 24 Aug 2010 10:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/iupap-working-group-organizes-two-day-international-symposium-on-plans-for-worldwide-nuclear-physics/ A two-day Symposium on Nuclear Physics and Nuclear Physics Facilities, held at TRIUMF on 2–3 July, provided the opportunity for proponents of nuclear science across the world to learn about and discuss present and future plans for research in nuclear physics, as well as the upgraded and new research facilities that will be required to realize these plans.

The post IUPAP working group organizes two-day international symposium on plans for worldwide nuclear physics appeared first on CERN Courier.

]]>
A two-day Symposium on Nuclear Physics and Nuclear Physics Facilities, held at TRIUMF on 2–3 July, provided the opportunity for proponents of nuclear science across the world to learn about and discuss present and future plans for research in nuclear physics, as well as the upgraded and new research facilities that will be required to realize these plans.

The Working Group on International Cooperation in Nuclear Physics (WG.9) of the International Union of Pure and Applied Physics (IUPAP) organized the symposium. It was held in response to the mandate given to the group by the OECD Global Science Forum in a missive from its chair, Hermann-Friedrich Wagner, following the recent report of the OECD Global Science Forum Working Group on Nuclear Physics. Three half-day presentations were arranged by the US Nuclear Science Advisory Committee (NSAC), by the Nuclear Physics European Collaboration Committee (NuPECC) and by the Asian Nuclear Physics Association (ANPhA), which was formed about two years ago at the urging of IUPAP WG.9.

The presentations at the symposium focused on five main themes of nuclear physics today: “Can the structure and interactions of hadrons be understood in terms of QCD?”, “What is the structure of nuclear matter?”, “What are the phases of nuclear matter?”, “What is the role of nuclei in shaping the evolution of the universe, with the known forms of matter comprising only a meagre 5%?” and “What is the physics beyond the Standard Model?”

The presentations led to extensive discussions among the various representatives. On the final half day, after a synopsis of the presentations and discussions by Robert Tribble of Texas A&M University, a panel discussion took place between the three nuclear-physics groupings of NSAC, NuPECC and ANPhA. This was followed by a series of statements by science administrators from the US Department of Energy’s Office of Science (Nuclear Physics), the National Science Foundation (Nuclear Physics), the INFN Third Commission, the French research bodies IN2P3/CNRS and the CEA/Service de Physique Nucléaire, the Japan Ministry of Education, Science and Technology, the Korea Research Council and the China Institute of Atomic Energy.

For the first time, the symposium brought together nuclear-physics researchers, laboratory directors and nuclear-science administrators in an international setting. It showed a vigorous field of nuclear physics with demanding forefront challenges and large nuclear physics facilities being upgraded or coming on line presently or in the near future: CEBAF 12 GeV at Jefferson Laboratory, FRIB at Michigan State University, SPIRAL2 at GANIL, ISAC at TRIUMF, RIBF at RIKEN Nishina Center, J-PARC, FAIR at GSI, the upgraded RHIC at Brookhaven and in the more distant future EIC at Brookhaven or Jefferson Lab, ENC at FAIR, EURISOL (Europe charts future for radioactive beams) and LHeC at CERN. There are also several nuclear-physics facilities planned for China and Korea.

IUPAP WG.9 has given great encouragement to efforts aimed at strengthening co-operation in regional and international nuclear physics. At the symposium the nuclear-physics community was informed of the formation of a Latin America Nuclear Physics Association (ALAFNA) to strengthen nuclear physics in Latin America. Similar attempts may be undertaken in Africa.

• For further details about the working group, see the WG.9 website at www.iupap.org/wg/icnp.html.

The post IUPAP working group organizes two-day international symposium on plans for worldwide nuclear physics appeared first on CERN Courier.

]]>
https://cerncourier.com/a/iupap-working-group-organizes-two-day-international-symposium-on-plans-for-worldwide-nuclear-physics/feed/ 0 News A two-day Symposium on Nuclear Physics and Nuclear Physics Facilities, held at TRIUMF on 2–3 July, provided the opportunity for proponents of nuclear science across the world to learn about and discuss present and future plans for research in nuclear physics, as well as the upgraded and new research facilities that will be required to realize these plans.
Particle physics INSPIREs information retrieval https://cerncourier.com/a/particle-physics-inspires-information-retrieval/ https://cerncourier.com/a/particle-physics-inspires-information-retrieval/#respond Wed, 31 Mar 2010 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/particle-physics-inspires-information-retrieval/ A look at a new service for state-of-the-art information management.

The post Particle physics INSPIREs information retrieval appeared first on CERN Courier.

]]>
Particle physicists thrive on information. They first create information by performing experiments or elaborating theoretical conjectures. Then they convey it to their peers by writing papers that are disseminated in a preprint form long before publication. Keeping track of this information has long been the task of libraries at the larger laboratories, such as at CERN, DESY, Fermilab and SLAC, as well as being the focus of indispensable services including arXiv and those of the Particle Data Group.

It is common knowledge that the web was born at CERN, and every particle physicist knows about SPIRES, the place where they can find papers, citations and information about colleagues. However, not everyone knows that the first US web server and the first database on the web came about at SLAC with just one aim: to bring scientific information to the fingertips of particle physicists through the SPIRES platform. SPIRES was hailed as the first “killer” application of the then nascent web.

No matter how venerable, the information tools currently serving particle physicists no longer live up to expectations, and information-management tools used elsewhere in the world have been catching up with those of the high-energy physics community. The soon-to-be-released INSPIRE service will bring state-of-the-art information retrieval to the fingertips of researchers in high-energy physics once more, not only enabling more efficient searching but paving the way for modern technologies and techniques to augment the tried-and-tested tools of the trade.

Meeting demand

The INSPIRE project involves information specialists from CERN, DESY, Fermilab and SLAC working in close collaboration with arXiv, the Particle Data Group and publishers within the field of particle physics. “We separate the work such that we don’t duplicate things. Having one common corpus that everyone is working on allows us to improve remarkably the quality of the end product,” explains Tim Smith, head of the User and Document Services Group in the IT Department at CERN, which is providing the Invenio technology that lies at the core of INSPIRE.

In 2007, many providers of information in the field came together for a summit at SLAC to see how physics-information resources could be enhanced. The INSPIRE project emerged from that meeting, and the vision behind it was built on a survey launched by the four labs to evaluate the real needs of the community (Gentil-Beccot et al. 2008). A large number of physicists replied enthusiastically, even writing reams of details in the boxes that were made available to input free text. The bulk of the respondents noted that the SPIRES and arXiv services were together the dominant resources in the field. However, they pointed out that SPIRES in particular was “too slow” or “too arcane” to meet their current needs.

INSPIRE responds to this directive from the community by combining the most successful aspects of SPIRES (a joint project of DESY, Fermilab and SLAC) with the modern technology of Invenio (the CERN open-source digital-library software). “SPIRES’ underlying software was overdue for replacement, and adopting Invenio has given INSPIRE the opportunity to reproduce SPIRES’ functionality using current technology,” says Travis Brooks, manager of the SPIRES databases at SLAC. The name of the service, with the “IN” from Invenio augmenting SPIRES’ familiar name, underscores this beneficial partnership. “It reflects the fact that this is an evolution from SPIRES because the SPIRES service is very much appreciated by a large community of physicists. It is a sort of brand in the field,” says Jens Vigen, head of the Scientific Information Group at CERN.

However, INSPIRE takes its own inspiration from more than just SPIRES and Invenio. In searching for a paper, INSPIRE will not only fully understand the search syntax of SPIRES, but will also support free-text searches like those in Google. “From the replies we received to the survey, we could observe that young people prefer to just throw a text string in a field and push the search button, as happens in Google,” notes Brooks.
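
As a rough illustration (the exact grammar is best checked against the INSPIRE documentation itself), a fielded SPIRES-style query and its free-text counterpart might look like:

```
find a witten and t duality and date > 2000    # SPIRES-style fielded search
witten duality review                          # Google-style free-text search
```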

This service will facilitate the work of the large community of particle physicists. “Even more exciting is that after releasing the initial INSPIRE service, we will be releasing many new features built on top of the modern platform,” says Zaven Akopov of the DESY library. INSPIRE will enable authors and readers to help catalogue and sort material so that everyone will find the most relevant material quickly and easily. INSPIRE will also be able to store files associated with documents, including the full text of older or “orphaned” preprints. Stephen Parke, senior scientist at the Fermilab Theory Department looks forward to these enhancements: “INSPIRE will be a fabulous service to the high-energy-physics community. Not only will you be able to do faster, more flexible searching but there is a real need to archive all conference slides and the full text of PhD theses; INSPIRE is just what the community needs at this time.”

Pilot users see INSPIRE already rising to meet these expectations, as remarked on by Tony Thomas, director of the Australian Research Council Special Research Centre for the Structure of Matter: “I tried the alpha version of INSPIRE and was amazed by how rapidly it responded to even quite long and complex requests.”

The Invenio software that underlies INSPIRE is a collaborative tool developed at CERN for managing large digital libraries. It is already inspiring many other institutes around the world. In particular, the Astrophysics Data System (ADS) – the digital library run by the Harvard-Smithsonian Center for Astrophysics for NASA – recently chose Invenio as the new technology to manage its collection. “We can imagine all sorts of possible synergies here,” Brooks anticipates. “ADS is a resource very much like SPIRES, but focusing on the astronomy/astrophysics and increasingly astroparticle community, and since our two fields have begun to do a lot of interdisciplinary work the tighter collaboration between these resources will benefit both user communities.”

Invenio is also being used by many other institutes around the world and many more are considering it. “In the true spirit of CERN, Invenio is an open-source product and thus it is made available under the GNU General Public Licence,” explains Smith. “At CERN, Invenio currently manages about a million records. There aren’t that many products that can actually handle so many records,” he adds.

Invenio has at the same time broadened its scope to include all sorts of digital records, including photos, videos and recordings of presentations. It makes use of a versatile interface that makes it possible, for example, to have the site available in 20 languages. Invenio’s expandability is being exploited to the full for the INSPIRE project where a rich set of back-office tools are being developed for cataloguers. “These tools will greatly ease the manual tasks, thereby allowing us to get papers faster and more accurately into INSPIRE,” explains Heath O’Connell from the Fermilab library. “This will increase the search accuracy for users. Furthermore, with the advanced Web 2.0 features of INSPIRE, users will have a simpler, more powerful way to submit additions, corrections and updates, which will be processed almost in real time”.

Researchers in high-energy physics were once the beneficiaries of world-leading information management. Now INSPIRE, anchored by the Invenio software, aims once again to give the community a world-class solution to its information needs. The future is rich with possibilities, from interactive PDF documents to exciting new opportunities for mining this wealth of bibliographic data, enabling sophisticated analyses of citations and other information. The conclusion is easy: if you are a physicist, just let yourself be INSPIREd!

• The INSPIRE service is available at http://inspirebeta.net/.

The post Particle physics INSPIREs information retrieval appeared first on CERN Courier.

]]>
https://cerncourier.com/a/particle-physics-inspires-information-retrieval/feed/ 0 Feature A look at a new service for state-of-the-art information management. https://cerncourier.com/wp-content/uploads/2010/03/CCins1_04_10.jpg
Extreme light rises in Eastern Europe https://cerncourier.com/a/extreme-light-rises-in-eastern-europe/ https://cerncourier.com/a/extreme-light-rises-in-eastern-europe/#respond Wed, 20 Jan 2010 09:48:40 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/extreme-light-rises-in-eastern-europe/ A new international player has entered the arena of intense short-pulse coherent light technology, with the latest developments in the Extreme Light Infrastructure (ELI) European project, which was launched in November 2007 in its preparatory phase and involves nearly 40 research and academic institutions from 13 EU member states.

The post Extreme light rises in Eastern Europe appeared first on CERN Courier.

]]>
A new international player has entered the arena of intense short-pulse coherent light technology, with the latest developments in the Extreme Light Infrastructure (ELI) European project, which was launched in November 2007 in its preparatory phase and involves nearly 40 research and academic institutions from 13 EU member states. At the end of 2009, ELI decided to create a pan-European Extreme Light Facility based at several research sites. The first three sites have been selected and a decision on a fourth site, to deal with “ultrahigh peak power”, will be taken in 2012 after validation of the technology.

The field of “extreme light” is opening up a new direction in fundamental and applied research. It is currently carried out in Europe – mainly in France, Germany, Russia and the UK – as well as in China, Japan, South Korea and the US. With the new initiative, other European countries hosting the three sites for the new facility are set to take a leading role.

The site in Prague, Czech Republic, will focus on providing ultrashort-pulse beams of energetic particles (10 GeV) and radiation (up to a few mega-electron-volts) produced from compact pulsed-laser plasma accelerators with a planned overall laser peak-power reaching 50 PW. In Hungary, a site in Szeged will be dedicated to extremely fast dynamics, taking snapshots at the attosecond scale (10⁻¹⁸ s) of electron dynamics in atoms, molecules, plasmas and solids based on an optical few-femtosecond laser with an average power of several kilowatts.

The third site in Magurele, near Bucharest, Romania, will produce radiation and beam particles at energies high enough to address nuclear processes. With this facility a renaissance in the field of nuclear physics is expected. The planned laser peak-power will reach 30 PW. Intense radiation created at ELI could help to clarify the processes limiting the lifetime of nuclear power reactors, offer new avenues to control the lifetime of nuclear waste, fabricate new nuclear pharmaceutical products, and lead to laser-driven hadron therapy, and phase-contrast imaging as a medical diagnostic tool.

Completion of the fourth ELI site will afford new fundamental investigations into particle physics, nuclear physics, acceleration physics and ultrahigh-pressure physics, leading on to applications in astrophysics and cosmology. It will offer new research directions in high-energy physics relating to particle acceleration and the study of the vacuum structure and critical acceleration conditions.

ELI’s host countries have been mandated to form a pan-European Research Infrastructure Consortium (ERIC), which will be open to all European countries, and possibly others, willing to contribute to the realization of the project. A single, centralized management structure will preside over the integrated infrastructure. The host countries are to provide about 15% of the funding, while the EU is contributing the balance under its infrastructure investment programme. A total of €750 million is currently earmarked for the initial three sites.

The post Extreme light rises in Eastern Europe appeared first on CERN Courier.

]]>
https://cerncourier.com/a/extreme-light-rises-in-eastern-europe/feed/ 0 News A new international player has entered the arena of intense short-pulse coherent light technology, with the latest developments in the Extreme Light Infrastructure (ELI) European project, which was launched in November 2007 in its preparatory phase and involves nearly 40 research and academic institutions from 13 EU member states.
CERN to be reference lab for ITER’s superconductor tests https://cerncourier.com/a/cern-to-be-reference-lab-for-iters-superconductor-tests/ https://cerncourier.com/a/cern-to-be-reference-lab-for-iters-superconductor-tests/#respond Wed, 20 Jan 2010 09:43:10 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/cern-to-be-reference-lab-for-iters-superconductor-tests/ The fourth meeting of the Steering Committee of the CERN/ITER Collaboration Agreement took place at CERN on 19 November.

The post CERN to be reference lab for ITER’s superconductor tests appeared first on CERN Courier.

]]>
CCnew3_01_10

The fourth meeting of the Steering Committee of the CERN/ITER Collaboration Agreement took place at CERN on 19 November. It marked not only the end of a second year of successful collaboration between ITER and CERN on superconducting magnets and associated technologies but also the establishment of CERN as the ITER reference laboratory for superconducting strand testing for the next five years.

The implementation agreement for 2009 encompassed a variety of topics. These included expertise in stainless steel and welding, high-voltage engineering, the design of high-temperature superconductor current leads, and testing and consultancy in cryogenics and vacuum technology.

The main role of CERN as the ITER reference laboratory will be: to carry out yearly benchmarking of the acceptance test facilities at the six domestic agencies involved in superconducting strand production; to help in the training of the personnel involved in these tests around the world; and to carry out third-party inspection and expertise in case of problems during production. To this end, CERN will use the facilities that were set up for strand qualification for the LHC, but with an important modification: the upgrade of magnetic fields from 10 T to 15 T to properly test samples of niobium-tin (Nb₃Sn) superconductors.

This programme has considerable synergy with the study for high-gradient quadrupoles in Nb₃Sn that CERN is pursuing to prepare new technology for the LHC luminosity upgrade. Nb₃Sn has a superior performance to the niobium-titanium alloy employed in the LHC. However, the brittleness of Nb₃Sn and the need for high-temperature heat treatments mean that much R&D is still required. ITER will see the first large-scale use of Nb₃Sn: some 400 tonnes of the conductor will be used for the toroidal field coils and the central solenoid.

The post CERN to be reference lab for ITER’s superconductor tests appeared first on CERN Courier.

]]>
https://cerncourier.com/a/cern-to-be-reference-lab-for-iters-superconductor-tests/feed/ 0 News The fourth meeting of the Steering Committee of the CERN/ITER Collaboration Agreement took place at CERN on 19 November. https://cerncourier.com/wp-content/uploads/2010/01/CCnew3_01_10.jpg
Insight starts here at DESY https://cerncourier.com/a/insight-starts-here-at-desy/ https://cerncourier.com/a/insight-starts-here-at-desy/#respond Mon, 07 Dec 2009 08:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/insight-starts-here-at-desy/ DESY director Helmut Dosch presents the laboratory's goals.

The post Insight starts here at DESY appeared first on CERN Courier.

]]>

The fundamental questions about the origins and the future of the universe motivated me to choose physics as a course of study when I was 18 years old. My career as a scientist then led me to do research in solid-state physics and finally to investigate solid-state boundaries and nanomaterials using synchrotron radiation and neutrons. As a result, I have more or less closed a circle through my work at DESY. Here, both focuses of my research are united under one roof: particle physics with its fundamental questions, and structural research using cutting-edge light sources – both are fields that provide us with the knowledge base for technological and medical progress.

In an anniversary year, it is time not only to cast a backward glance but also to look forward at a clear objective: working together with all of the people at DESY to strengthen further the lab’s world-class international position. Now that HERA has been decommissioned, the focus for our facilities in Hamburg and Zeuthen clearly lies on the new and innovative light sources that are being realized in the Hamburg metropolitan region. “Insight starts here” is the slogan that we have chosen for DESY’s research – insight based on top-quality accelerator facilities and an important role as a partner in international projects.

With PETRA III, we have built a synchrotron-radiation source that will outperform all other competitors that use storage-ring technology. As the most brilliant light source of its kind, PETRA III will offer outstanding opportunities for experimentation. It will be of particular benefit to scientists who need strongly focused, very short-wave X-ray radiation to gain high-resolution insights at the atomic level into biological specimens or new high-performance materials. There is a tremendous demand from researchers aiming to develop new materials in the area of nanotechnology or new medicines based on molecular biology. A new interdisciplinary centre for structural systems biology is being set up in the direct vicinity of PETRA III.

This equips us perfectly to deal with the challenges of today and tomorrow. But the DESY tradition is also to keep in mind the challenges of the day after tomorrow – in other words, to build the light sources of the future. With the free-electron lasers, DESY has again assured itself a place in the world’s leading ranks when it comes to the development of a new key technology. On the basis of the superconducting TESLA technology, we have created light sources that are entering completely new territory by generating high-intensity, ultrashort, pulsed X-ray radiation with genuine laser properties. With this kind of radiation, scientists can for the first time observe processes in the nano-cosmos in real time. They can, for instance, view “live broadcasts” of the formation and dissolution of chemical bonds. That is why there is such a great demand for the FLASH free-electron laser at DESY. The expectations concerning the European X-ray laser, the European XFEL, which is now being built in the Hamburg area, are correspondingly high. DESY is playing a key role regarding this new beacon for science. Among other things, it is building the heart of the facility: the accelerator, which is approximately 2 km long.

International scope

In the fields of high-energy and astroparticle physics, DESY is facing the challenges of the future, which are becoming increasingly global; the era of national accelerator facilities is now a thing of the past. The field is dominated by internationally oriented “world machines” such as the LHC at CERN. So it is quite appropriate that the laboratory already has a long tradition of international co-operation across cultural and political boundaries. At its two locations in Hamburg and Zeuthen, DESY is involved in a number of major facilities that are no longer supported by one country alone, but are implemented as international projects. For example, DESY is participating in the experiments at the LHC and computer centres are being built on the DESY campus to monitor the data-taking and analysis. DESY is also playing a major role in the next future-oriented project in particle physics, the design study for the International Linear Collider.

DESY researchers are also active in astroparticle physics, in projects that include the neutrino telescope IceCube at the South Pole and the development work for a future gamma-ray telescope facility, the Cherenkov Telescope Array. With these two projects, the researchers are taking advantage of the fastest and most reliable messengers from the far reaches of the cosmos – high-energy neutrinos and gamma radiation – to investigate the early stages of the universe.

This broad international orientation is one element of the base that will continue to support DESY in the future. We will go on systematically developing the three main research pillars of DESY: accelerator development, photon science and particle physics. Another important element is the promotion of young scientists, an activity in which DESY engages intensely in co-operation with universities. Our goal is to be a magnet for the best and most creative brains and to co-operate with them in the future to do what we do best: ensuring that insight starts here.

The post Insight starts here at DESY appeared first on CERN Courier.

]]>
https://cerncourier.com/a/insight-starts-here-at-desy/feed/ 0 Feature DESY director Helmut Dosch presents the laboratory's goals. https://cerncourier.com/wp-content/uploads/2009/12/CCnew10_09_09.jpg
Looking beyond the LHC https://cerncourier.com/a/looking-beyond-the-lhc/ https://cerncourier.com/a/looking-beyond-the-lhc/#respond Mon, 07 Dec 2009 09:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/looking-beyond-the-lhc/ A CERN Theory Institute investigated the impact of early LHC data.

The post Looking beyond the LHC appeared first on CERN Courier.

]]>
CClhc1_09_09

The LHC at CERN is about to start the direct exploration of physics at the tera-electron-volt energy scale. Early ground-breaking discoveries may be possible, with profound implications for our understanding of the fundamental forces and constituents of the universe, and for the future of the field of particle physics as a whole. These first results at the LHC will set the agenda for further possible colliders, which will be needed to study physics at the tera-electron-volt scale in closer detail.

Once the first inverse femtobarns of experimental data from the LHC have been analysed, the worldwide particle-physics community will need to converge on a strategy for shaping the field over the years to come. Given that the size and complexity of possible accelerator experiments will require a long construction time, the decision of when and how to go ahead with a future major facility needs to be undertaken in a timely fashion. Several projects for future colliders are currently being developed and soon it may be necessary to set priorities among these options, informed by whatever the LHC reveals at the tera-electron-volt scale.

The CERN Theory Institute “From the LHC to a Future Collider” reviewed the physics goals, capabilities and possible results coming from the LHC and studied how these relate to possible future collider programmes. Participants discussed recent physics developments and the near-term capabilities of the Tevatron, the LHC and other experiments, as well as the most effective ways to prepare for providing scientific input to plans for the future direction of the field. To achieve these goals, the programme of the institute centred on a number of questions. What have we learnt from data collected up to this point? What may we expect to know about the emerging new physics during the initial phase of LHC operation? What do we need to know from the LHC to plan future accelerators? What scientific strategies will be needed to advance from the planned LHC running to a future collider facility? To answer the last two questions, the participants looked at what to expect from the LHC with a specific early luminosity, namely 10 fb⁻¹, for different scenarios for physics at the tera-electron-volt scale and investigated which strategy for future colliders would be appropriate in each of these scenarios. Figure 1 looks further ahead and indicates a possible luminosity profile for the LHC and its sensitivity to new physics scenarios to come.

Present and future

The institute’s efforts were organized into four broad working groups on signatures that might appear in the early LHC data. Their key considerations were the scientific benefits of various upgrades of the LHC compared with the feasibility and timing of other possible future colliders. Hence, the programme also included a series of presentations on present and future projects, one on each possible accelerator, followed by a talk on its strong physics points. These included the Tevatron at Fermilab, the (s)LHC, the International Linear Collider (ILC), the LHeC, the Compact Linear Collider (CLIC) concept and a muon collider.

Working Group 1, which was charged with studying scenarios for the production of a Higgs boson, assessed the implications of the detection of a state with properties that are compatible with a Higgs boson, whether Standard Model (SM)-like or not. If nature has chosen an SM-like Higgs, then ATLAS and CMS are well placed to discover it with 10 fb⁻¹ (assuming √s = 14 TeV, otherwise more luminosity may be needed) and measure its mass. However, measuring other characteristics (such as decay width, spin, CP properties, branching ratios, couplings) with an accuracy better than 20–30% would require another facility.

The ILC would provide e⁺e⁻ collisions with an energy of √s = 500 GeV (with an upgrade path to √s = 1 TeV). It would allow precise measurements of all of the quantum numbers and many couplings of the Higgs boson, in addition to precise determinations of its mass and width – thereby giving an almost complete profile of the particle. CLIC would allow e⁺e⁻ collisions at higher energies, with √s = 1–3 TeV, and if the Higgs boson is relatively light it could give access to more of the rare decay modes. CLIC could also measure the Higgs self-couplings over a large range of the Higgs mass and study directly any resonance up to 2.5 TeV in mass in WW scattering.

Working Group 2 considered scenarios in which the first 10 fb⁻¹ of LHC data fail to reveal a state with properties that are compatible with a Higgs boson. It reviewed complementary physics scenarios such as gauge boson self-couplings, longitudinal vector-boson scattering, exotic Higgs scenarios and scenarios with invisible Higgs decays. Two generic scenarios need to be considered in this context: those in which a Higgs exists but is difficult to see and those in which no Higgs exists at all. With higher LHC luminosity – for instance with the sLHC, an upgrade that gives 10 times more luminosity – it should be possible in many scenarios to determine whether or not a Higgs boson exists by improving the sensitivity to the production and decays of Higgs-like particles or vector resonances, for example, or by measuring WW scattering. The ILC would enable precision measurements of even the most difficult-to-see Higgs bosons, as would CLIC. The latter would also be good for producing heavy resonances.

Working Group 3 reviewed missing-energy signatures at the LHC, using supersymmetry as a representative model. The signals studied included events with leptons and jets, with a view to measuring the masses, spins and quantum numbers of any new particles produced. Studies of the LHC capabilities at √s = 14 TeV show that with 1 fb⁻¹ of LHC luminosity, signals of missing energy with one or more additional leptons would give sensitivity to a large range of supersymmetric mass scales. In all of the missing-energy scenarios studied, early LHC data would provide important input for the technical and theoretical requirements for future linear-collider physics. These include the detector capabilities where, for example, the resolution of mass degeneracies could require exceptionally good energy resolution for jets, running scenarios, required threshold scans and upgrade options – for a γγ collider, for instance, and/or an e⁺e⁻ collider operating in “GigaZ” mode at the Z mass. The link with dark matter was also explored in this group.

CClhc2_09_09

Working Group 4 studied examples of phenomena that do not involve a missing-energy signature, such as the production of a new Z’ boson, other leptonic resonances, the impact of new physics on observables in the flavour sector, gravity signatures at the tera-electron-volt scale and other exotic signatures of new physics. The sLHC luminosity upgrade has the capability to provide additional crucial information on new physics discovered during early LHC running, as well as to increase the search sensitivity. On the other hand, a future linear collider – with its clean environment, known initial state and polarized beams – is unparalleled in terms of its abilities to conduct ultraprecise measurements of new and SM phenomena, provided that the new-physics scale is within reach of the machine. For example, in the case of a Z’, high-precision measurements at a future linear collider would provide a mass reach that is more than 10 times higher than the centre-of-mass energy of the linear collider itself. Attention was also given to the possibility of bringing a high-energy electron beam into collision with the LHC proton beam to provide an electron–proton collider, the LHeC. Certain phenomena such as the properties of leptoquarks could be studied particularly well with such a collider; for other scenarios, such as new heavy gauge-boson scattering, the LHeC can contribute crucial information on the couplings, which are not accessible with the LHC alone.

The physics capabilities of the sLHC, the ILC and CLIC are relatively well understood but will need refinement in the light of initial LHC running. In cases where the exploration of new physics might be challenging at the early LHC, synergy with a linear collider could be beneficial. In particular, a staged approach to linear-collider energies could prove promising.

The purpose of this CERN Theory Institute was to provide the particle-physics community with some tools for setting priorities among the future options at the appropriate time. Novel results from the early LHC data will open exciting prospects for particle physics, to be continued by a new major facility. In order to seize this opportunity, the particle-physics community will need to unite behind convincing and scientifically solid motivations for such a facility. The institute provided a framework for discussions now, before the actual LHC results start to come in, on how this could be achieved. In this context, the workshop report was also mentioned and made available to the European Strategy Session of the CERN Council meetings in September 2009. We now look forward to the first multi-tera-electron-volt collisions in the LHC, as well as to the harvest of new physics that these results will provide.

• For more about the institute, see http://indico.cern.ch/conferenceDisplay.py?confId=40437. The institute summary is available at http://arxiv.org/abs/0909.3240.

The post Looking beyond the LHC appeared first on CERN Courier.

]]>
https://cerncourier.com/a/looking-beyond-the-lhc/feed/ 0 Feature A CERN Theory Institute investigated the impact of early LHC data. https://cerncourier.com/wp-content/uploads/2009/12/CClhc1_09_09.jpg
Krakow welcomes 2009 EPS-HEP conference https://cerncourier.com/a/krakow-welcomes-2009-eps-hep-conference/ https://cerncourier.com/a/krakow-welcomes-2009-eps-hep-conference/#respond Wed, 30 Sep 2009 07:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/krakow-welcomes-2009-eps-hep-conference/ A report from Europe’s major international particle physics meeting.

The post Krakow welcomes 2009 EPS-HEP conference appeared first on CERN Courier.

]]>
CCeps1_08_09

Krakow, a 13th-century merchants’ town and former capital of Poland, is now one of the country’s largest and oldest cities. The scenic city centre – a UNESCO World Heritage Site – with its fascinating history and pleasant climate, provided the perfect setting for discussing new results and future developments at the biennial European Physical Society (EPS) conference on High Energy Physics (HEP). The event was held on 16–22 July at the new conference centre of the Jagiellonian University – the Auditorium Maximum.

The conference began with 35 parallel sessions and more than 350 contributions over two and a half days. Then, as is tradition, the EPS and European Committee for Future Accelerators scheduled a joint plenary meeting for Saturday afternoon. This focused on a number of talks concerning the future of the field: with Christian Spiering of DESY on “Astroparticle physics and relations with the LHC”; CERN’s director-general, Rolf Heuer, on “The high-energy frontier”; Alain Blondel of Geneva University on “The future of accelerator-based neutrino physics”; and Tatsuya Nakada of the École Polytechnique Fédérale de Lausanne on “Super-B factories”. Each presentation was followed by a lively discussion.

Sunday provided the opportunity for several sightseeing trips in and around Krakow. Monday saw a fresh start to the week and time for another tradition: the presentation of the EPS awards. For the first time, the High Energy and Particle Physics (EPS HEPP) prize was awarded to an experimental collaboration, Gargamelle, for the observation of the weak neutral current. After the awards ceremony, Frank Wilczek, the 2004 Nobel laureate, gave a special talk on “some ideas and hopes for fundamental physics”. This provided an excellent start to three days of plenary sessions, with around 35 presentations.

The Standard Model still reigns

CCeps2_08_09

The Tevatron proton–antiproton collider continues its smooth operation. With more than 6 fb⁻¹ of integrated luminosity delivered and peak luminosities exceeding 3.5 × 10³² cm⁻²s⁻¹, the CDF and DØ experiments are steadily increasing their statistics. Both collaborations are pushing forward on the analysis of their latest data in a joint effort to confirm and enlarge the previously reported exclusion region for the Higgs mass of around 160–170 GeV. At the same time, several new ideas are emerging on how to improve the sensitivity of these experiments to more challenging Higgs decay channels. In addition to the direct search for the Higgs boson, both collaborations reported on new mass measurements of the W boson (MW = 80.399 ± 0.023 GeV) and confirmed the combined experimental result for the top quark mass (mt = 173.1 ± 1.3 GeV), pushing the error below the 1% level. These values lead to a further reduction of the preferred mass region for the Standard Model Higgs, as John Conway of the University of California Davis pointed out in his plenary presentation. Moreover, these and other precision measurements of the weak parameters (sin²θW = 0.2326 ± 0.0018 (stat) ± 0.0006 (sys), as compared with the theoretical prediction of sin²θW = 0.23149 ± 0.00013) show growing evidence that the Standard Model prefers a light Higgs, which, as Conway concluded, will make life difficult. Even for the large LHC experiments, ATLAS and CMS, this region of the window on the Higgs mass will require high statistics, combining different decay modes and sophisticated analyses.
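
To illustrate how such a comparison is typically quoted, the minimal sketch below (not from the conference; it uses only the numbers in the preceding paragraph) combines the statistical, systematic and theoretical uncertainties in quadrature and expresses the measured-minus-predicted difference for sin²θW as a pull in standard deviations.

// Illustrative only: compatibility of the quoted sin^2(theta_W) measurement
// with the Standard Model prediction, using the numbers given above.
#include <cmath>
#include <cstdio>

int main() {
    const double meas     = 0.2326;    // measured value
    const double stat     = 0.0018;    // statistical uncertainty
    const double syst     = 0.0006;    // systematic uncertainty
    const double pred     = 0.23149;   // Standard Model prediction
    const double pred_err = 0.00013;   // uncertainty on the prediction

    // Combine all uncertainties in quadrature and form the pull
    const double sigma = std::sqrt(stat*stat + syst*syst + pred_err*pred_err);
    const double pull  = (meas - pred) / sigma;

    std::printf("difference = %+.5f +/- %.5f  ->  pull = %.1f sigma\n",
                meas - pred, sigma, pull);   // about 0.6 sigma
    return 0;
}

A pull well below one standard deviation is what lies behind the statement above that the data remain consistent with the Standard Model expectation.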

A number of sophisticated statistical procedures are being developed and becoming available as complete software packages – for example, GFITTER – to simplify or fine-tune multidimensional analyses of experimental data. At the same time, there is impressive progress in calculating amplitudes for multileg processes and loops. A rather complete set of automatically derived “2 → 4 particle” cross-sections (the “Les Houches 2007 wish list”) demonstrates that higher-order corrections to important physics processes at the LHC cannot be ignored.

CCeps3_08_09

Increasing statistics at the Tevatron are also consolidating the observation of single top production, but at the same time the parameter space for new physics at or below the 1 TeV scale is becoming smaller, as Volker Büscher of Mainz explained. CDF and DØ have conducted studies that probe mass values for the charginos of supersymmetry up to 176 GeV; they find no evidence for neutralino production in their current data sample. In addition, the studies shift the possibility of quark compositeness or large extra dimensions further towards a higher energy scale.

While the latest updates on analyses of data from RHIC and the SPS were presented in the parallel sessions, Urs Wiedemann of CERN covered theoretical aspects of collective phenomena in his plenary talk. He summarized the motivation for experiments at RHIC (√sNN = 200 GeV) and the LHC (√sNN = 5500 GeV) to study the QCD properties of dense matter at the 150 MeV scale, which will be accessible at these high collision energies.

A wealth of new data is also emerging from the experimental analysis of B-physics – from both hadron colliders and e⁺e⁻ machines – ranging from analyses of rare exclusive decay modes to spectroscopy and physics related to the Cabibbo–Kobayashi–Maskawa (CKM) matrix. The results further confirm oscillations in the neutral D and Bs sectors. This is another area where the Standard Model seems not to be seriously challenged: the CKM triangle appears to remain “closed” (within experimental errors). Nevertheless, as Andrzej Buras of TU Munich pointed out in his talk on “20 goals in flavour physics for the next decade”, there are still many challenges ahead. A breakthrough could come with firm experimental evidence for flavour-changing neutral currents in excess of Standard Model predictions. Buras’s message is clear: stay focused on the many observables that are not yet well measured and the decay modes that have so far been studied only poorly or not at all; spectacular deviations from the Standard Model remain possible.

CCeps4_08_09

With a new series of experiments under construction and several experiments producing new data, neutrino physics remains an experimentally driven enterprise. The neutrino sessions were – not surprisingly – very well attended. Better mass measurements are coming within reach, be it from obtaining upper limits by measuring time shifts in neutrinos from supernovae (mν < 30 eV) or from measuring the tritium β-decay spectrum (mν < 2 eV) or mass differences from oscillations (all Δm² < 1 eV²). Because neutrinos are abundant in the universe, even a small neutrino mass will have implications in astrophysics. Dave Wark of Imperial College summarized the broad spectrum of neutrino physics experiments and their discovery potential. Anticipating the various experimental approaches and progress, he explained under which conditions the Majorana phases, for example, could be determined.
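
The supernova limit quoted above rests on a simple kinematic fact: a neutrino of mass m and energy E arrives later than light by Δt = (L/2c)(mc²/E)². The sketch below uses illustrative values for a Galactic supernova (the distance and energy are assumptions, not taken from the talks) and shows that a 30 eV neutrino would lag by several seconds over 10 kpc, which is why arrival-time spreads of a few seconds translate into mass limits of roughly this order.

// Illustrative only: time delay of a massive neutrino relative to light,
// dt = (L / 2c) * (m c^2 / E)^2, for assumed (not quoted) values.
#include <cstdio>

int main() {
    const double kpc_m = 3.0857e19;      // one kiloparsec in metres
    const double c     = 2.9979e8;       // speed of light in m/s

    const double L_kpc = 10.0;           // assumed distance of a Galactic supernova
    const double E_MeV = 10.0;           // assumed detected neutrino energy
    const double m_eV  = 30.0;           // neutrino mass at the quoted upper limit

    const double t_light = L_kpc * kpc_m / c;          // light travel time (s)
    const double ratio   = (m_eV * 1.0e-6) / E_MeV;    // m c^2 / E, both in MeV
    const double dt      = 0.5 * t_light * ratio * ratio;

    std::printf("delay for m = %.0f eV, E = %.0f MeV over %.0f kpc: %.1f s\n",
                m_eV, E_MeV, L_kpc, dt);               // roughly 4.6 s
    return 0;
}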

New frontiers – on Earth and in space

CCeps5_08_09

While the large LHC experiments are commissioning their triggers, new ideas on the future of the LHC machine are being explored. These include high-luminosity schemes and higher beam energies, which will have different implications for future upgrades of both machine and experiments. R&D on accelerators is focusing not only on higher-energy frontiers and currents, but also on more efficient beam-crossing (“crab”) scenarios.

In a worldwide effort, the International Linear Collider collaboration aims to present a Technical Design Report in 2012 for a high-energy e⁺e⁻ machine. The Compact Linear Collider Study (CLIC) based at CERN, which aims for a Conceptual Design Report at the end of 2010, investigates different approaches and may reach a higher beam energy (3 TeV vs 1 TeV). However, the physics simulations and detector designs for the two schemes face comparable challenges.

The development of “super factories” is an ongoing effort that is complementary to the high-energy machines. These facilities should provide high-statistics experiments on, for example, the neutrino, charm and bottom sectors, with the necessary infrastructure for high-precision measurements. Caterina Biscari of Frascati presented a comprehensive overview of existing machines and (possible) future accelerators, in which she compared their main parameters.

CCeps6_08_09

The conference saw substantial contributions from astroparticle physics. The Auger experiment probing the highest-energy cosmic rays (10²⁰ eV) shows growing evidence for the Greisen–Zatsepin–Kuzmin cut-off. The energy spectrum agrees well (within the 25% calibration uncertainty on the energy scale) with results from the HiRes collaboration. Active galactic nuclei are now also observed by the High Energy Stereoscopic System (HESS) and the Large Area Telescope on the Fermi Gamma-ray Space Telescope. In particular, the core of Centaurus A appears to be extremely interesting owing to the bright radio source in its centre. High-energy cosmic rays are predominantly produced by “nearby” (< 100 Mpc) sources, while there is a slight indication that the composition changes with increasing energy, towards heavier nuclei.
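
The quoted cut-off energy can be motivated with a back-of-the-envelope threshold estimate for photo-pion production on cosmic-microwave-background photons, p + γ → N + π. A minimal sketch is given below; the photon energy is an assumed typical value, not a number from the talks. The head-on threshold comes out close to 10²⁰ eV, which is why protons arriving from beyond roughly 100 Mpc are strongly attenuated.

// Illustrative only: GZK threshold for p + gamma_CMB -> N + pi,
// head-on collision, E_th = m_pi * (2 m_p + m_pi) / (4 E_gamma).
#include <cstdio>

int main() {
    const double m_p     = 0.938;     // proton mass in GeV
    const double m_pi    = 0.135;     // neutral-pion mass in GeV
    const double E_gamma = 6.3e-13;   // assumed typical CMB photon energy in GeV (~6e-4 eV)

    const double E_th = m_pi * (2.0 * m_p + m_pi) / (4.0 * E_gamma);  // in GeV

    std::printf("threshold proton energy: %.1e GeV (about %.1e eV)\n",
                E_th, E_th * 1.0e9);  // close to 1e20 eV
    return 0;
}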

PAMELA (launched in 2006), the Advanced Thin Ionization Calorimeter balloon experiment (2008), Fermi (launched in 2008) and HESS show some excesses in the e± spectrum. The interpretation of these signals remains uncertain. Are these excesses related to the nature of non-baryonic matter – that is, dark matter – or can the spectra be explained by astrophysical phenomena such as pulsars or supernova remnants? The PAMELA data have generated huge theoretical interest resulting in a multitude of dark-matter models. However, much more data are needed from both space-based experiments and ground-based searches for decaying weakly interacting massive particles. The Alpha Magnetic Spectrometer, finally scheduled to be launched in 2010, should at least provide much improved limits on the antiproton flux.

The next international Europhysics conference on high-energy physics will take place in Grenoble on 21–27 July 2011. After last year’s successful injection of proton beams into the LHC, followed by the unfortunate incident and subsequent repairs and consolidation, the starting date for high-energy collisions at the LHC is now rapidly approaching. At the meeting in Grenoble there will be lively discussions on Tevatron data – perhaps with surprises – and extensive reports, among others, on dark-matter searches. Of course, we all look forward to reports on first data analyses by the LHC experiments.

• The local organization of EPS-HEP 2009 by the Institute of Nuclear Physics PAN, Jagiellonian University, the AGH University of Science and Technology and the Polish Physical Society is acknowledged.

The post Krakow welcomes 2009 EPS-HEP conference appeared first on CERN Courier.

]]>
https://cerncourier.com/a/krakow-welcomes-2009-eps-hep-conference/feed/ 0 Feature A report from Europe’s major international particle physics meeting. https://cerncourier.com/wp-content/uploads/2009/09/CCeps1_08_09.jpg
Study group considers how to preserve data https://cerncourier.com/a/study-group-considers-how-to-preserve-data/ https://cerncourier.com/a/study-group-considers-how-to-preserve-data/#respond Wed, 29 Apr 2009 10:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/study-group-considers-how-to-preserve-data/ How can high-energy physics data best be saved for the future?

The post Study group considers how to preserve data appeared first on CERN Courier.

]]>

High-energy-physics experiments collect data over long time periods, while the associated collaborations of experimentalists exploit these data to produce their physics publications. The scientific potential of an experiment is in principle defined and exhausted within the lifetime of such collaborations. However, the continuous improvement in areas of theory, experiment and simulation – as well as the advent of new ideas or unexpected discoveries – may reveal the need to re-analyse old data. Examples of such analyses already exist and they are likely to become more frequent in the future. As experimental complexity and the associated costs continue to increase, many present-day experiments, especially those based at colliders, will provide unique data sets that are unlikely to be improved upon in the short term. The close of the current decade will see the end of data-taking at several large experiments and scientists are now confronted with the question of how to preserve the scientific heritage of this valuable pool of acquired data.

To address this specific issue in a systematic way, the Study Group on Data Preservation and Long Term Analysis in High Energy Physics formed at the end of 2008. Its aim is to clarify the objectives and the means of preserving data in high-energy physics. The collider experiments BaBar, Belle, BES-III, CLEO, CDF, D0, H1 and ZEUS, as well as the associated computing centres at SLAC, KEK, the Institute of High Energy Physics in Beijing, Fermilab and DESY, are all represented, together with CERN, in the group’s steering committee.

Digital gold mine

The group’s inaugural workshop took place on 26–28 January at DESY, Hamburg. To form a quantitative view of the data landscape in high-energy physics, each of the participating experimental collaborations presented their computing models to the workshop, including the applicability and adaptability of the models to long-term analysis. Not surprisingly, the data models are similar – reflecting the nature of colliding-beam experiments.

The data are organized by events, with increasing levels of abstraction from raw detector-level quantities to N-tuple-like data for physics analysis. They are supported by large samples of simulated Monte Carlo events. The software is organized in a similar manner, with a more conservative part for reconstruction to reflect the complexity of the hardware and a more dynamic part closer to the analysis level. Data analysis is in most cases done in C++ using the ROOT analysis environment and is mainly performed on local computing farms. Monte Carlo simulation also uses a farm-based approach but it is striking to see how popular the Grid is for the mass-production of simulated events. The amount of data that should be preserved for analysis varies between 0.5 PB and 10 PB for each experiment, which is not huge by today’s standards but nonetheless a large amount. The degree of preparation for long-term data preservation varies between experiments but it is obvious that no preparation was foreseen at an early stage of the programmes; any conservation initiatives will take place in parallel with the end of the data analysis.
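
As a concrete, if much simplified, picture of the “N-tuple-like data for physics analysis” mentioned above, the ROOT macro below writes a toy event-level n-tuple. Everything in it (file name, branch names, toy distributions) is invented for illustration and is not taken from any of the experiments’ computing models.

// write_ntuple.C -- toy example of an analysis-level n-tuple in ROOT.
// Run with:  root -l -q write_ntuple.C   (requires a ROOT installation)
#include "TFile.h"
#include "TTree.h"
#include "TRandom3.h"

void write_ntuple() {
    TFile file("toy_events.root", "RECREATE");
    TTree tree("events", "toy analysis-level n-tuple");

    float lepton_pt = 0.f;   // transverse momentum of the leading lepton (GeV)
    int   n_jets    = 0;     // number of reconstructed jets

    tree.Branch("lepton_pt", &lepton_pt, "lepton_pt/F");
    tree.Branch("n_jets",    &n_jets,    "n_jets/I");

    TRandom3 rng(12345);
    for (int i = 0; i < 1000; ++i) {     // fill 1000 toy events
        lepton_pt = rng.Exp(40.0);       // falling spectrum, purely illustrative
        n_jets    = rng.Poisson(2.5);
        tree.Fill();
    }

    tree.Write();
    file.Close();
}

Flat n-tuples of this kind are among the easier formats to carry forward over long timescales, since reading them back needs only ROOT itself rather than the full experiment software stack.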

From a long-term perspective, digital data are widely recognized as fragile objects. Speakers from a few notable computing centres – including Fabio Hernandez of the Centre de Calcul de l’Institut National de Physique Nucléaire et de Physique des Particules, Stephen Wolbers of Fermilab, Martin Gasthuber of DESY and Erik Mattias Wadenstein of the Nordic DataGrid Facility – showed that storage technology should not pose problems with respect to the amount of data under discussion. Instead, the main issue will be the communication between the experimental collaborations and the computing centres once the final analyses are complete and/or the collaborations have dispersed, where roles have not been clearly defined in the past. The current preservation model, where the data are simply saved on tapes, runs the risk that the data will disappear into cupboards while the read-out hardware may be lost, become impractical or obsolete.

On the software side, the most popular analysis framework is ROOT, the object-oriented software and library that was originally developed at CERN. This offers many possibilities for storing and documenting high-energy-physics data and has the advantage of a large existing user community and a long-term commitment for support, as CERN’s René Brun explained at the workshop. One example of software dependence is the use of inherited libraries (e.g. CERNLIB or GEANT3), and of commercial software and/or packages that are no longer officially maintained but remain crucial to most running experiments. It would be an advantageous first step towards long-term stability of any analysis framework if such vulnerabilities could be removed from the software model of the experiments. Modern techniques of software emulation, such as virtualization, may also offer promising features, as Yves Kemp of DESY explained. Exploring such solutions should be part of future investigations.

Previous experience with data from old experiments shows clearly that a complete re-analysis has only been possible when all of the ingredients could be accounted for. Siggi Bethke of the Max Planck Institute of Physics in Munich showed how a re-analysis of data from the JADE experiment (1979–1986), using refined theoretical input and a better simulation, led to a significant improvement in the determination of the strong coupling constant as a function of energy. While the usual statement is that higher-energy experiments replace older, low-energy ones, this example shows that measurements at lower energies can play a unique role in a global physical picture.

The experience at the Large Electron-Positron (LEP) collider, which Peter Igo-Kemenes, André Holzner and Matthias Schroeder of CERN described, suggested once more that the definition of the preserved data should definitely include all of the tools necessary to retrieve and understand the information so as to be able to use it for new future analyses. The general status of the LEP data is of concern, and the recovery of the information – to cross-check a signal of new physics, for example – may become impossible within a few years if no effort is made to define a consistent and clear stewardship of the data. This demonstrates that both early preparation and sufficient resources are vital in maintaining the capability to reinvestigate older data samples.

The modus operandi in high-energy physics can also profit from the rich experience accumulated in other fields. Fabio Pasian of Trieste told the workshop how the European Virtual Observatory project has developed a framework for common data storage of astrophysical measurements. More general initiatives to investigate the persistency of digital data also exist and provide useful hints as to the critical points in the organization of such projects.

There is also an increasing awareness among funding agencies regarding the preservation of scientific data, as David Corney of the UK’s Science and Technology Facilities Council, Salvatore Mele of CERN and Amber Boehnlein of the US Department of Energy described. In particular, the Alliance for Permanent Access and the EU-funded project in Framework Programme 7 on the Permanent Access to the Records of Science in Europe recently conducted a survey of the high-energy-physics community, which found that the majority of scientists strongly support the preservation of high-energy-physics data. One important aspect that was also viewed positively in the survey responses was the question of open access to the data, alongside the organizational and technical matters – an issue that deserves careful consideration. The next-generation publications database, INSPIRE, offers extended data-storage capabilities that could be used immediately to enhance public or private information related to scientific articles, including tables, macros, explanatory notes and potentially even analysis software and data, as Travis Brooks of SLAC explained.

While this first workshop compiled a great deal of information, the work to synthesize it remains to be completed and further input in many areas is still needed. In addition, the raison d’être for data preservation should be clearly and convincingly formulated, together with a viable economic model. All high-energy-physics experiments have the capability of taking some concrete action now to propose models for data preservation. A survey of technology is also important, because one of the crucial factors may indeed be the evolution of hardware. Moreover, the whole process must be supervised by well defined structures and steered by clear specifications that are endorsed by the major laboratories and computing centres. A second workshop is planned to take place at SLAC in summer 2009 with the aim of producing a preliminary report for further reference, so that the “future of the past” will become clearer in high-energy physics.

The post Study group considers how to preserve data appeared first on CERN Courier.

]]>
https://cerncourier.com/a/study-group-considers-how-to-preserve-data/feed/ 0 Feature How can high-energy physics data best be saved for the future? https://cerncourier.com/wp-content/uploads/2009/04/CCdat1_04_09.jpg
ATLAS makes a smooth changeover at the top https://cerncourier.com/a/atlas-makes-a-smooth-changeover-at-the-top/ https://cerncourier.com/a/atlas-makes-a-smooth-changeover-at-the-top/#respond Wed, 29 Apr 2009 10:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/atlas-makes-a-smooth-changeover-at-the-top/ Peter Jenni and Fabiola Gianotti reflect on the role of ATLAS spokesperson.

The post ATLAS makes a smooth changeover at the top appeared first on CERN Courier.

]]>
CCint1_04_09

If you think that it might be time to retire after more than 15 years of leading a constantly growing international collaboration and of constructing the world’s largest-volume particle detector, then Peter Jenni would disagree. Nicknamed the “father of ATLAS” by his colleagues, Jenni was there in 1992 when the ATLAS collaboration was born out of two early proto-collaborations. Initially co-spokesperson, he was spokesperson from 1995 until March 2009, when he handed over to Fabiola Gianotti. Now he looks forward to getting back to the main purpose of ATLAS: the physics.

“I am very proud to have helped the collaboration to construct ATLAS. Twenty years ago we could only imagine the experiment in our dreams and now it exists,” says Jenni. “I could lead the collaboration for so long because I was supported by very good ATLAS management teams where the right people, such as Fabiola Gianotti, Steinar Stapnes, Marzio Nessi and Markus Nordberg over the past five years, were in the right places.”

As with most particle-physics experiments, the management of one of the two largest detectors at the LHC is a challenge that changes during the lifetime of the collaboration: it starts with the design phase, continues with the R&D and the construction and ends up with the data-taking and analysis. “Over the years I tried to balance the emphasis given by the collaboration to the different aspects, that is, the hardware part (initially very strong), the data preparation, computing and software,” confirms Jenni.

Originally “only” about 800-strong, the ATLAS collaboration today has almost 3000 members from all over the world. “Keeping the groups united, inviting new groups to join the collaboration, negotiating to find the funds necessary for the construction… these have been among my key tasks during the past 15 years,” he explains. “My efforts also went into keeping groups whose technologies were not retained in the collaboration. Most of the time we managed to have everyone accept the best arguments, but unfortunately there were a few exceptions.”

With such a vast amount of experience, what does Jenni regard as the key element for managing a successful collaboration? “Talking with as many people as possible is a key factor,” he says. “ATLAS members, even the youngest ones, knew that I was available to discuss all problems or issues at any time. With the exception of the Christmas period, I have tried to reply to all e-mails within 24 hours. By the way, that is why my son thinks physics is crazy and decided to study microtechnologies instead!”

While Jenni’s functions have changed, his engagement with ATLAS definitely has not. “A significant part of my work remains the same, particularly in the relationships of ATLAS with the outside world. My main duty is to help obtain a smooth transition, which is facilitated by the fact that Fabiola was one of my two deputies – and I have enjoyed working with her before.” Indeed, having more freedom now, he can think of doing more than just sharing some management duties. “In the medium term I have the ambition to study physics with ATLAS,” he says. “I am already ‘selling’ LHC physics in many public talks but I would like to contribute some real physics myself.”

The ATLAS collaboration is clearly appreciative of its father’s dedication over the years. At the party organized in Jenni’s honour on 19 February, the Collaboration Board (CB) chairs directed by Katie McAlpine – the author and singer of the LHC rap – sang: “We’ve been CB chairs / and we’re here to affirm / Peter’s time was more an era / than just a few terms / leading ATLAS to completion / like no one else can / Of course he did it / Jenni is the man.”

The changeover

Now with the construction complete, it’s Gianotti’s turn to fill the spokesperson’s many shoes, after Jenni passed her the leadership baton in March. At the very beginning she joined LHC R&D activities and then the proto-ATLAS collaboration in 1990. “Heading such an ambitious scientific project, and a large and geographically distributed collaboration, is certainly a big honour, responsibility and challenge,” she says. “However, I have inherited a very healthy situation from Peter: the experiment has already shown that it performs well, the collaboration is united and strong, and we can continue to prepare for the first collisions without any major worry.”

Indeed, activity on ATLAS hasn’t stopped since the LHC incident on 19 September 2008. “The first single beams that circulated in the machine before the incident were very useful for studying several aspects of the experiment, such as the timing of the trigger system. After the LHC stopped, we decided to focus on some repairs to the detector and on the optimization of the software and computing infrastructure, of the data distribution chain, and of the event simulation and reconstruction,” confirms Gianotti.

An effective distribution of data to the worldwide community is a key point for the new ATLAS spokesperson because she thinks that this is the prime requisite for a motivated and successful collaboration. “The crucial challenge for me is to make sure that each single member of ATLAS can participate effectively and successfully in the adventure that this experiment represents. ATLAS has a very exciting future ahead, with many possible discoveries that will change the landscape of high-energy physics. I consider it very important that each individual in this experiment can actively participate in the data analysis, regardless of whether he or she can physically be at CERN or not. In particular, we have to make sure the younger generations are nurtured in a stimulating environment, share the excitement for the wonderful physics opportunities and are given visibility and recognition,” she explains.

While the sharing of data relies mostly on the performance of the Grid and the software and computing infrastructure put in place by the collaboration, it cannot occur without the other side of the coin – effective and open communication in real-time with all members of the collaboration. “The solution we have envisaged is a web space where ATLAS people will be able to find updated ‘on-line’ news about the machine, the experiment, the physics results, anything that is relevant to ATLAS’ life,” explains Gianotti.

Asked about the potential “competition” among many people working on the same analysis, she says: “I think it is healthy that people from different groups work on the same topic with a collaborative and constructive spirit. This will allow us to produce solid, verified and fully understood results.” Regarding the relationship with CMS, the other general-purpose LHC experiment, she says, “There is a healthy competition, but also collaboration. For instance, ATLAS and CMS have set up a common group that works on statistics tools and how to combine the information coming from both experiments.”

The excitement about the restart of the LHC is growing again at CERN and around the world, and the experiments all have their own plans and strategies. “Before undertaking the path towards discoveries, we will need to understand the performance of our detector in all details and ‘rediscover’ the Standard Model,” says Gianotti. “I believe that we will be ready to start investigating new territories when we have observed top-quark production. Indeed, final states arising from the production of top quark–antiquark pairs contain most of the interesting physics objects, from leptons to missing energy and light- and heavy-flavour jets. In addition, this process is the main background to many searches for new physics. Being able to reconstruct these events successfully, and perform our first measurements of the top production cross-section and mass, will give us a clear indication that we are ready for discoveries.”

When does Gianotti expect ATLAS to release the first results? “It all depends on the performance of the machine – and its luminosity and energy profile. If everything goes well we expect to have first results, mainly addressing the detector performance, for the winter physics conferences early in 2010; then we hope to present the first interesting physics results at the summer conferences of the same year.”

The post ATLAS makes a smooth changeover at the top appeared first on CERN Courier.

]]>
https://cerncourier.com/a/atlas-makes-a-smooth-changeover-at-the-top/feed/ 0 Feature Peter Jenni and Fabiola Gianotti reflect on the role of ATLAS spokesperson. https://cerncourier.com/wp-content/uploads/2009/04/CCint1_04_09-feature.jpg
CERN inaugurates the LHC https://cerncourier.com/a/cern-inaugurates-the-lhc/ https://cerncourier.com/a/cern-inaugurates-the-lhc/#respond Mon, 08 Dec 2008 01:15:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/cern-inaugurates-the-lhc/ The LHC inauguration ceremony officially marked the end of 24 years of conception, development, construction and assembly of the biggest and most sophisticated scientific tool in the world.

The post CERN inaugurates the LHC appeared first on CERN Courier.

]]>
CClhc1_10_08

The member-state representatives, including Swiss president Pascal Couchepin, French prime minister François Fillon and several ministers, applaud CERN’s director-general, Robert Aymar, during the opening presentation. “The LHC is a marvel of modern technology, which would not have been possible without the continuous support of our member states,” he said. “This is an opportunity for me to thank them on behalf of the world’s particle-physics community.”

CClhc2_10_08

The LHC inauguration ceremony officially marked the end of 24 years of conception, development, construction and assembly of the biggest and most sophisticated scientific tool in the world. After the LHC was proposed in 1984, it was 10 years before Council approved the project. “Its construction has taken more than 14 years and there have been many challenges, which have all been overcome,” said the LHC project leader, Lyn Evans, in his speech at the ceremony. “We are now looking forward to the start of the experimental programme, where new secrets of nature will undoubtedly be revealed.”

CClhc3_10_08

A highlight of the presentation by Lyn Evans was an excerpt from a recording made on 12 October when the Morriston Orpheus Choir from Swansea was joined by Welsh first minister Rhodri Morgan in the CERN Control Centre and blessed the LHC with song.

CClhc4_10_08

François Fillon (third from right), prime minister of France, in the LHC tunnel near the CMS experiment, together with Philippe Lebrun (right), head of the Accelerator Technology Department at CERN. In his speech later, Fillon said: “When the decision was taken to construct the LHC, I was minister for higher education and research. I fought for this project, which some regarded as an impossible dream. I believe that this dream can be realized. That was 14 years ago. Today the facility exists and it is spectacular.”

CClhc5_10_08

Director-general Robert Aymar (centre) with leaders of CERN’s two host states: Pascal Couchepin (left), president of the Swiss Confederation, and François Fillon, prime minister of the Republic of France.

CClhc6_10_08

Ján Mikolaj, deputy prime minister and minister of education of the Slovak Republic, during his presentation at the ceremony. “We know of the importance of fundamental research,” he said. “We also know that CERN is a driving force for development of new technologies.”

CClhc7_10_08

Annette Schavan, Germany’s federal minister of education and research, being greeted by Robert Aymar. One of three ministers who gave speeches during the ceremony, she noted that CERN “reflects the strength of research in a special way: the strength that lies behind people’s yearning to find out more about the origins of the universe”.

CClhc8_10_08

Ministers wait in line to take their turn as José Mariano Gago (left), Portuguese minister for science, technology and higher education, signs the commemorative electronic plate, watched by CERN’s Carlos Lourenço. During his speech, which was very well received, Gago referred to CERN as “a miraculous scientific laboratory that is a decisive attractor of talent from all over the world”.

CClhc9_10_08

This Daruma Doll was originally painted with only one eye to mark the start of the LHC project. It was presented to former CERN director-general Christopher Llewellyn Smith 13 years ago. To mark the end of the project, Toshio Yamauchi (left), senior vice-minister of education, culture, sports, science and technology in Japan, added the second eye and presented the completed doll to current director-general, Robert Aymar (right). Llewellyn Smith witnessed the ceremony together with Swiss president Pascal Couchepin. The Daruma Doll Ceremony is a Japanese tradition that symbolizes the completion of a project.

CClhc10_10_08

Carolyn Kuan conducts the Orchestre de la Suisse Romande in Origins, an audiovisual concert specially commissioned for the LHC inauguration. It featured the imagery of National Geographic photographer Frans Lanting and the music of Philip Glass, adapted from Life: A Journey Through Time, which was originally produced in California. The visual score charted the history of the universe from the Big Bang to the present day, and it included imagery from CERN’s experiments.

CClhc11_10_08

After the ceremony, guests were treated to a buffet of molecular gastronomy. Chef Ettore Bocchia collaborated with the physics and chemistry departments of Parma and Ferrara universities in Italy to create a scientific feast of Italian cuisine, which was optimized for both taste and health.
Image credit: M Struik.

CClhc12_10_08

Musical highlights of the LHCfest for CERN personnel, which followed the official inauguration, included a live performance of the “LHC rap” by AlpineKat, who was joined on stage by a very special backing dancer – none other than the LHC project leader Lyn Evans.
Image credit: M Struik.

CClhc13_10_08

A wall of fame in particle physics greeted VIPs as they entered the ceremony, with photos from the exhibition Accelerating Nobels, beginning (right) with Donald Glaser, inventor of the bubble chamber, who received the Nobel Prize in Physics in 1960. Between 2006 and 2008, photographer Volker Steger, with the help of CERN and the Lindau Meetings Foundation, photographed more than 40 Nobel laureates and invited each of them to draw their most important discoveries.
Image credit: M Struik.

Accelerating Nobels is an exhibition that centres on 19 laureates whose work is closely related to CERN and the LHC. The exhibition has been on view at CERN in the Globe of Science and Innovation, where CERN’s Nobel laureates, including Carlo Rubbia (left) and Simon van der Meer, took pride of place.

Over the past decade, industry has played an important part in developing, building and assembling the LHC, its experiments and the computing infrastructure. To thank industry for its exceptional contributions to the LHC project, CERN organized a special industry day on 20 October. More than 70 companies attended. Here Lucio Rossi, who led the LHC magnet construction, addresses the assembled participants.

Ten firms were honoured on the industry day for their fundamental contributions to the LHC machine, detectors and computing grid: Ineo GDF Suez, Air Liquide, Alstom, ASG Superconductors, ATI Wah Chang, Babcock Noell, Intel, Linde Kryotechnik, Luvata Group and Oracle. A plaque in the lobby area near CERN’s main auditorium, unveiled by the director-general during the day, commemorates their exceptional contributions.

• For a video of the highlights of the ceremony see http://cdsweb.cern.ch/record/1136012.

ASPERA names its magnificent seven https://cerncourier.com/a/aspera-names-its-magnificent-seven/ https://cerncourier.com/a/aspera-names-its-magnificent-seven/#respond Mon, 20 Oct 2008 10:53:25 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/aspera-names-its-magnificent-seven/ In the same room that hosted the first Solvay conference in 1911 at the Hotel Metropôle in Brussels, on 29–30 September the AStro Particle ERAnet (ASPERA) network presented the European strategy for astroparticle physics.

In the same room that hosted the first Solvay conference in 1911 at the Hotel Metropôle in Brussels, on 29–30 September the AStro Particle ERAnet (ASPERA) network presented the European strategy for astroparticle physics. As a result of two years of intensive coordination and brainstorming work, the document highlights seven large-scale projects that should shed light on some of the most exciting questions about the universe.

Questions surrounding dark matter, the origin of cosmic rays, violent cosmic processes and the detection of gravitational waves are among those that the "magnificent seven" of ASPERA's roadmap will address. Specifically, the projects are: CTA, a large array of Cherenkov telescopes for the detection of cosmic high-energy gamma rays; KM3NeT, a cubic-kilometre-scale neutrino telescope in the Mediterranean Sea; tonne-scale detectors for dark matter searches; a tonne-scale detector for the determination of the fundamental nature and mass of neutrinos; a megatonne-scale detector for the search for proton decay, neutrino astrophysics and the investigation of neutrino properties; a large array for the detection of charged cosmic rays; and a third-generation underground gravitational-wave antenna.

Each of these large-scale projects may cost several hundred million euros, and therefore needs to gather funds through large international collaborations that extend beyond Europe. The ASPERA conference gathered about 200 scientists and officials from funding agencies around the world. Representatives from Canada, China, the EU, India, Japan, Russia and the US agreed on the importance of defining a global strategy and coordinating efforts worldwide. In line with this approach, the Astroparticle Physics European Coordination (the initiator of the ASPERA network) is starting negotiations with the OECD Global Science Forum.

The EU will nevertheless remain a major actor: it funded ASPERA and will fund the follow-up, ASPERA2. It also supports the European Strategy Forum on Research Infrastructures (ESFRI). The ESFRI committee released a first roadmap in 2006 and is expected to release an updated roadmap at the end of 2008, including some of the projects selected by ASPERA.

CERN Council looks forward to imminent start-up of the LHC https://cerncourier.com/a/cern-council-looks-forward-to-imminent-start-up-of-the-lhc/ https://cerncourier.com/a/cern-council-looks-forward-to-imminent-start-up-of-the-lhc/#respond Tue, 08 Jul 2008 08:13:37 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/cern-council-looks-forward-to-imminent-start-up-of-the-lhc/ At its 147th meeting on 20 June, CERN Council heard news on progress towards start-up of the LHC later this summer.

At its 147th meeting on 20 June, CERN Council heard news on progress towards start-up of the LHC later this summer. In addition, the latest in a series of audits covering all aspects of safety and environmental impact was presented to Council at the meeting. It addressed the question of whether there is any danger related to the production of new particles at the LHC.

Commissioning of the 27 km LHC started in 2007 with the first cool down of one of the machine’s eight sectors. Once successfully cooled, each sector has to pass through hardware commissioning, which involves intensive electrical tests, before being handed over to the operations team. By the time of the Council meeting, five of the eight sectors were at or close to the operating temperature of 1.9 K and the remaining three were at various stages of being cooled down. Moreover, sector 5-6 had passed through all steps of the hardware commissioning and was in the hands of the operations team.

When the LHC starts up this summer, its proton beams will collide at higher energies than have ever been produced in a particle accelerator, although nature routinely produces higher energies in cosmic-ray collisions. Nevertheless, concerns about the safety of whatever might be created in such high-energy particle collisions have been addressed for many years.

The latest review of the safety of the LHC’s collisions was prepared by the LHC Safety Assessment Group (LSAG), which comprises scientists at CERN, the University of California, Santa Barbara, and the Institute for Nuclear Research of the Russian Academy of Sciences. The LSAG report updates a 2003 paper by the LHC Safety Study Group and incorporates recent experimental and observational data. It confirms and strengthens the conclusion of the 2003 report that there is no cause for concern. Whatever the LHC will do, nature has already done many times over during the lifetime of the Earth and other astronomical bodies.

The new report has been reviewed by the Scientific Policy Committee (SPC), which advises Council on scientific matters. A panel of five independent scientists, including one Nobel Laureate, reviewed and endorsed the authors’ approach of basing their arguments on irrefutable observational evidence to conclude that new particles produced at the LHC will pose no danger. The panel presented its conclusions to a meeting of the full 20 members of the SPC, who unanimously approved this conclusion, prior to the Council meeting.

• The LSAG report is accompanied by a summary in non-technical language. It is available together with other documents relating to the safety and environmental impact of the LHC at http://public.web.cern.ch/public/en/LHC/Safety-en.html.

A vision for the future of CERN https://cerncourier.com/a/viewpoint-a-vision-for-the-future-of-cern/ Wed, 16 Apr 2008 12:38:12 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-a-vision-for-the-future-of-cern/ CERN is a de facto global laboratory, with the LHC set to be the centre of particle-physics research for a decade or more, and comprises the largest scientific user community in the world.

CERN is a de facto global laboratory, with the LHC set to be the centre of particle-physics research for a decade or more, and hosts the largest scientific user community in the world. More than just a particle factory, CERN is a knowledge factory, enabling scientists to make discoveries, disseminate them and train younger generations. CERN is an example to the world of international scientific, technical and human collaboration.

The Council recognized – in the European Strategy for Particle Physics – that the next five years will be crucial, not only for CERN but for the future of particle physics in general. The start of LHC exploitation will provide a unique capability to explore new experimental vistas and an opportunity to seek support for possible new projects, such as an upgrade of the LHC itself and a successor machine to explore the LHC's breakthroughs.

Gathering the necessary support will require motivating and mobilizing the energies of all CERN stakeholders, both internal and external. Not only is it essential that the LHC be a technical success, but also that the implications of its discoveries for possible new projects be evaluated promptly and convincingly. This new physics should excite the imaginations not only of high-energy physicists, but also of the wider scientific community and the general public – even schoolchildren and politicians. Only then could the future of accelerator-based research into the fundamental nature of matter – and any major new project – be assured.

Even so, it will be essential to optimize the deployment of the resources available for particle physics at CERN and other European laboratories. Any new project will surely be global in nature, so it will also be necessary to amplify the dialogue with our prospective partners in other regions of the world.

Possible directions

The European Strategy for Particle Physics also recognizes the importance of R&D on possible future projects in the period before LHC results set the favoured direction for particle physics. The Proton Accelerators for the Future group that advises the director-general has already made an initial plan for upgrading CERN’s Proton Accelerator Complex (Garoby 2007). Following these studies, the director-general set out the R&D priorities that have been approved by CERN Council.

Another report, by the CERN advisory group on Physics Opportunities with Future Proton Accelerators (POFPA) (Blondel et al. 2006), also reviewed some of the scientific motivation for upgrading the LHC. This report discusses possible synergies of upgrades of the LHC injector chain with research programmes in fixed-target physics, neutrino physics and nuclear physics.

Beyond the LHC, there is a general consensus that the priority for the next major international project is a linear electron–positron collider. The International Linear Collider (ILC) concept is potentially very interesting if the LHC reveals new physics within its energy range. Nevertheless, even in this case, physics will eventually require higher energies, hence the need for R&D on the Compact Linear Collider (CLIC) concept, within the international framework provided by the CLIC Test Facility 3 collaboration. CLIC, however, is ambitious; significant technical hurdles remain to be overcome before its feasibility can be demonstrated.

In view of the different options for the location, energy and timescale of a future linear electron–positron collider, CERN should collaborate with partners in Europe and elsewhere on R&D for a possible next-generation neutrino project. This might be based on the “super-beam” and “beta-beam” concepts, or it might be a full-blown “neutrino factory” based on a muon storage ring. The choice between these options will depend on technical feasibility as much as new results in neutrino physics, such as measurements of – or constraints on – the third neutrino-mixing angle.

In parallel with these major projects, the European Strategy recognizes the importance of a variety of smaller-scale projects at CERN that address complementary issues in particle and nuclear physics. Many of these, such as the Antiproton Decelerator, ISOLDE, nTOF and some fixed-target experiments, are of unique global scientific interest. Such projects broaden the appeal of CERN and help train many young physicists. The POFPA report underlined the interest of several proposals for the future, in addition to those that would be made possible by upgrades of the LHC injector chain.

While the laboratory’s technical strength is the bedrock upon which any future CERN project will be built, this is likely to be even more global in nature than the LHC, with CERN becoming recognized explicitly as a “world laboratory”. Hence, CERN will need to nurture and build on its existing international partnerships with Canada, Japan, Russia and the US, while collaboration with emerging powers such as China and India should be expanded. CERN’s growing contact with other world regions such as the Middle East and Latin America will also become more important. CERN’s future plans should be discussed with its international collaborators in a spirit of partnership, in which the interests of all regions of the world are respected.

• R Garoby 2007 (http://doc.cern.ch/archive/electronic/cern/preprints/ab/ab-2007-074.pdf) and A Blondel et al. 2006 (http://arXiv.org/pdf/hep-ph/0609102).

Neuroscience explores our internal universe https://cerncourier.com/a/neuroscience-explores-our-internal-universe/ https://cerncourier.com/a/neuroscience-explores-our-internal-universe/#respond Wed, 16 Apr 2008 11:48:20 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/neuroscience-explores-our-internal-universe/ Neuroscientist Wolf Singer engages the audience at CERN during his colloquium on "The brain, an orchestra without conductor".

When physicists at CERN try to understand the basic building blocks of the universe, they build gigantic detectors – complex, intricately wired instruments that are capable of measuring and identifying hundreds of particles with extraordinary precision. In a sense, they build "brains" to analyse the particle interactions. For prominent neuroscientist Wolf Singer, director of the Max Planck Institute for Brain Research in Frankfurt, the challenge is quite the opposite. He and other researchers are trying to decode the dynamics of a mass of intricate "wiring", with as many as 10^11 neurons connected by 10^14 "wires". The brain is the most complex living system, and neuroscientists are only beginning to unravel its secrets.

Until recently, according to Singer, the technical tools available to neuroscientists were rather primitive. “Until a decade ago, most researchers in electrophysiology used handmade electrodes – either of glass tubes or microwires – to record the activity of a single element in this complex system,” explained Singer. “The responses were studied in a meticulous way and it was hoped that a greater understanding would arise of how the brain works. It was believed that a central entity was the source of our consciousness, where decisions are made and actions are initiated. We have now learned that the system isn’t built the way we thought – it is actually a highly distributed system with no central coordinator.”

Myriads of processes occur simultaneously in the brain, computing partial results. There is no place in this system where all of the partial results come together to be interpreted coherently. The fragments are all cross-connected and researchers are only now discovering the blueprint of this circuitry.

This mechanism poses some new and interesting problems that have intrigued Singer for many years. How is it possible for the partial results that are distributed in the brain to be bound together in dynamical states, even though they never meet at any physical location? Singer gives the example of looking at a barking dog. When this happens, all 30 areas of the cerebral cortex that deal with visual information are activated. Some of these areas are interested in colour, some in texture, others in motion and still others in spatial relations. All of these areas are simultaneously active, processing various signals and applying memory-based knowledge in order to perceive a coherent object. A tag is needed in this distributed system at a given moment of time so as to distinguish between the myriads of neurons activated by a particular object or situation, and those activated by simultaneous background stimuli. In 1986 Singer discovered that neurons engage in synchronized oscillatory activity. His hypothesis is that the nervous system uses synchronization to communicate.

Singer stresses that researchers in his field are closer to theorists in high-energy physics, because the tools necessary to decode the large amount of data generated by the brain’s activity do not exist yet. “This morning when I toured the ATLAS experiment, I heard how the data generated at the collision point is much richer, but physicists use filters to extract the most interesting data, which they formulate in highly educated ways,” said Singer. “The amount of data generated by the sensory organs is more than the brain could digest, so it reduces redundancy. Due to this enormous amount of data, the brain, by evolution, developed a way to filter it all. The most important information for us is based on survival, such as where food can be found or how our partners look.”

Brain function and communication

Singer began his career as a medical student at the Ludwig Maximilian University in 1962 in his hometown of Munich. He was inspired to specialize in neuroscience after attending a seminar by Paul Matussek and Otto Creutzfeldt, who discussed schizophrenia and “split brain” patients. After his postgraduate studies in psychophysics and animal behaviour at the University of Sussex, he worked on the staff of the Department of Neurophysiology at the Max Planck Institute for Psychiatry in Munich and completed his Habilitation in physiology at the Technical University of Munich. In 1981 he was appointed director at the Max Planck Institute for Brain Research in Frankfurt and in 2004 he co-founded the Frankfurt Institute for Advanced Studies.

The 20th century brought many advances in fundamental physics, including the discovery of elementary particles. During this same period, neuroscience provided greater illumination of the brain's functions. One of the most significant advances was the identification of individual nerve cells and their connections by Camillo Golgi and Santiago Ramón y Cajal, winners of the Nobel Prize for Medicine in 1906. Another important advance was the introduction of the discontinuity theory, which regards neurons as isolated cells that transmit chemical signals to each other. This understanding allowed neuroscientists to determine the way in which the brain communicates with other parts of itself and the rest of the body.

Some of the results of the first studies of the relationships between function and the different areas of the brain were made using patients injured during the First World War. Later, with the discovery of magnetic imaging to study brain function, researchers were able to turn to non-invasive methods, but there is still much more development needed. With procedures such as magnetic resonance imaging, a neurologist can find out where a signal originates; but the signal is indirect, coming from the more oxygenated areas. A magnetic field of 3 T applied to an area of a square millimetre can show which part of the brain is activated (e.g. by emotions and pain) and reveal the various networks along which the signals travel.

At the same time, neuroscientists are trying to decode the system and explain how biophysical processes can produce what is experienced in a non-material way – a matter-to-mind kind of riddle – with new entities and the creation of social realities such as sympathy and empathy. This is leading to a new branch of neuroscience, known as social neuroscience.

In other research, colleagues of Singer are studying the effects of meditation on the brain. They found that it creates a huge change in brain activity. It increases synchronization and is in fact a highly active state, which explains why it cannot be achieved by immature brains, such as in small children. Buddhist monks use their attention to focus the “inner eye” on their emotional outlet and so cleanse their platform of consciousness. In 2005 Singer attended the annual meeting for the Society of Neuroscience in Washington, DC, together with the Dalai Lama. Their meeting resulted in discussions about the synchronization of certain brain waves when the mind is highly focused or in a state of meditation.

Singer is also no stranger to controversy. His ideas about how some of the results of brain research could have an impact on legal systems caused a sensation in 2004. His theory that free will is merely an illusion is based on converging evidence from neurobiological investigations in animals and humans. He states that, from the point of view of neurobiology, the way in which someone reacts to events could not have been much different. "In everyday conditions the system is deterministic and you want your system to function reliably. The system is so complex and we are constantly learning new things," explained Singer. There are many factors that determine how free someone is in their will and thinking. Someone could have faulty wiring in the part of the brain that deals with moral actions, or perhaps does not store values properly in their brain, or could have a chemical imbalance. All of these biological factors contribute to how someone reacts in a given situation.

Singer feels strongly that the general public should be aware of what scientists are working on and that enlightenment is essential. “Science should be a cultural activity,” he said, adding that in society the people who are considered “cultured” generally are knowledgeable in art, music, languages and literature, but not well versed in mathematics and science.

In 2003 he received the Communicator Prize of the Donors' Association for the Promotion of Sciences and Humanities and the Deutsche Forschungsgemeinschaft in Germany. Communicating his passion to the young has been a challenging and yet highly rewarding experience. He works to engage society in discussions about the research in his field, providing greater transparency and comprehension. His dedication to improving communication between scientists and schools is evident in the programme that he has initiated: Building Bridges – Bringing Science into Schools. This creates a stronger dialogue between scientists, students and teachers.

• For Wolf Singer's colloquium at CERN, "The brain, an orchestra without conductor", see indico.cern.ch/conferenceDisplay.py?confId=26835

Terascale Alliance takes off in Germany https://cerncourier.com/a/terascale-alliance-takes-off-in-germany/ https://cerncourier.com/a/terascale-alliance-takes-off-in-germany/#respond Wed, 16 Apr 2008 11:14:19 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/terascale-alliance-takes-off-in-germany/ The next big advances in particle physics are expected to happen at the "terascale". The tremendous complexity and size of experiments at the LHC and the proposed International Linear Collider (ILC) challenge the way that physicists have traditionally worked in high-energy physics.

The next big advances in particle physics are expected to happen at the “terascale”. The tremendous complexity and size of experiments at the LHC and the proposed International Linear Collider (ILC) challenge the way that physicists have traditionally worked in high-energy physics. The German project Physics at the Terascale – a Helmholtz Alliance that will receive €25 m over five years from Germany’s largest organization of research centres, the Helmholtz Association – will address these challenges.

The Alliance bundles and enhances resources at 17 German universities, two Helmholtz Centres (the Forschungszentrum Karlsruhe and DESY) and at the Max Planck Institute for Physics in Munich. It focuses on the creation of a first-class research infrastructure and complements the existing funding mechanisms in Germany at local and federal level. With the help of the new project, central infrastructures are developed and are shared among all Alliance members. The Alliance will fund many of these measures for the first few years. From the beginning, a central point of the proposal has been that the long-term future of these activities is guaranteed by the universities and the research centres beyond the running period of the Alliance funds.

The Alliance supports four well defined research topics (physics analysis, Grid computing, detector science and accelerators) and a number of central “backbone” activities, such as fellowships, interim professorships, communication and management.

Close-knit infrastructure

What is new about this common infrastructure? Previously, each of these institutes developed their infrastructure and expertise for their own purposes. Now, triggered by the Alliance, different institutes share their resources. Common infrastructures are developed and are made available to all physicists in Germany working on terascale physics. For example, this means that if PhD student Jenny Boek of Wuppertal wants to develop a chip for slow controls, she can now use the infrastructure and take advantage of the expertise in chip design in Bonn.

These central infrastructures can be concrete installations – like a chip development and design laboratory, located at a specific location – or virtual ones, like the National Analysis Facility, which will help all LHC groups in Germany to participate more efficiently in the analysis of data from the LHC. Common to all of these is that these infrastructures are open to all members of the Alliance, and are initially funded through it.

An important goal of the Alliance is to organize interactions between the different experimental groups and between the experiment and theory communities on all topics of interest for physics analysis at the terascale. This includes meetings and the formation of working groups with members from all interested communities, the organization of schools and other common activities. It can also mean basic services, such as the design and maintenance of Monte Carlo generators, or include exchanges on the underlying theoretical models. In all of these studies, while the focus is initially on the LHC, the role of the ILC will also feature as a future facility of key importance in the field.

In the same spirit, Alliance funds are used to improve the Grid infrastructure significantly in Germany, to serve the global computing needs of the LHC as well as the specific requirements of German physicists to contribute to the data analysis. Funds are provided to supplement the existing Tier-2 structure in Germany by building up Tier-2s at several universities, and to support the National Analysis Facility at DESY. Additional money is provided to allow for significant contributions to improve Grid technologies with the emphasis on making the Grid easier for the general user.

The third research topic, detector development, involves plans for the future beyond the immediate data flow from the LHC. Institutes are already developing next-generation detectors for the ILC and for LHC upgrades. A Virtual Laboratory for Detector Development will provide central infrastructures to support the different groups for these projects. A number of universities and DESY are setting up infrastructures with special emphasis on chip design, irradiation facilities, test beams and engineering support. Again, although these facilities are at specific locations, they serve the whole community.

Fostering young talent

The Alliance also wants to increase the involvement of universities in accelerator research in Germany. Through a number of programmes – for example a school for students on accelerator science or lectures at universities – the Alliance tries to increase the involvement of universities in accelerator research over the long term. Rolf Heuer of DESY, one of the initiators of the project, explains the motivation: “Germany led the way to the TESLA technology collaboration and its success, and we want to stay at the forefront of accelerator development. Without it, progress in many areas of science will not be possible.”

A substantial part of the Alliance’s funding goes into the creation of more than 50 positions for young scientists and engineers all over Germany. The five Alliance Young Investigators groups and the Alliance fellowships play a special role: they are supposed to attract young physics talents from all over the world to Germany and to the terascale. Many of these positions are tenure-track, something quite rare in Germany. In addition, positions are created to support the infrastructure activities, to set up the central tasks and support the work of the Alliance. More than 250 people have already applied for the new positions over the last eight months.

A significant fraction of the accepted applications are from women. This is in accordance with the Alliance’s aim to enhance the role of women in physics. One way to attract smart and ambitious young people to the German research landscape is the dual career option – the Alliance pays half a salary for the partner to work at the same institution. So when Karina Williams, now in the final year of her particle physics phenomenology PhD at Durham University in the UK, applied for postdoctorate positions, she made sure that the places where she applied would also have a job for her partner. It worked out at Bonn University, where she and her partner start later this year. “I think it’s wonderful that schemes like this exist,” she says. “I know so many people who have either had to put up with very long-distance relationships or left the subject because their partner could not get a job nearby. When I first started applying for jobs, I was told that long-distance relationships were just part of the postdoc life.”

Centralized community

DESY plays a special role within the Alliance. It provides unique and basic infrastructures for accelerator research, as well as large-scale engineering support for detector research. This is a tradition that goes back to when DESY ran accelerators for high-energy physics. A new role for DESY is to host central services for the German physics community to support physics analysis in Germany. One of these services is the Analysis Centre, where research will focus on areas of general interest, which are often emphasised less at universities. Examples of these topics are statistical tools or parton distribution functions, where the Alliance will profit from the outstanding expertise at DESY from HERA. Of course it is not only R&D that researchers at the Analysis Centre will pursue; another purpose is to form a kind of helpdesk to answer questions and offer help in organizing topical workshops. Expanding on its role as an LHC Tier-2 centre, DESY is also setting up the National Analysis Facility, a large-scale computer installation to support the user analysis of LHC data. The first processors are already installed in DESY’s computing centre, providing fast computing power for efficient analyses by German LHC groups.

Another example of “central services” – like Alliance fellowships, equal opportunity measures or dual career options – is a “scientist replacement” programme. The goal of scientist replacement is to enable senior professors to take up roles of responsibility at the LHC experiments by sponsoring junior professors to replace them at university. Karl Jakobs is physics coordinator at ATLAS and a part-time bachelor. His home and family are in Freiburg in southern Germany, but he has had a flat in Saint Genis-Pouilly near CERN since October last year and a great deal of long-term responsibility within the experiment – something that would have been impossible less than a year ago. Now the Alliance is funding his replacement in Freiburg. In this way, German particle physicists can play leading roles in current and future experiments more easily. This may sound like a trivial thing – but all German professors are obliged both to do research and to teach, binding them to their university and only releasing them during breaks and the occasional half-year sabbatical. Jakobs’ classes are currently, until the end of the summer, being taught by Philip Bechtle from DESY. Another example is Ian Brock, scientific manager of the Alliance, whose replacement during his leave of absence from Bonn University is paid for and provided by the Alliance.

The Alliance was officially approved in May last year, funding started in July, and it is already a prominent part of the German landscape of particle physics. It had an impressive start and most of the structures of the Alliance have begun working intensively. A major event was the “kick-off” workshop at DESY in December. With 354 registered participants (many of them undergraduate, graduate and PhD students), a large part of the German high-energy physics community was there. The workshop proved a great opportunity for young particle physicists to get to know each other and exchange ideas: Terascale gives them a backbone structure that they will now fill with content.

The Alliance is already changing the way particle physics is done in Germany. The main idea is to establish cooperation among the different pillars of German research in particle physics. Expertise, which is scattered around many different places, is being combined to become more efficient. As Heuer explains: “The Alliance strengthens R&D on LHC physics in Germany, pushes for accelerator physics and prepares for the ILC. It is our hope that this helps in the worldwide effort to unravel the basic structure of matter and to understand how the universe has developed.”

Jakobs, meanwhile, is happy to benefit from the arrangement at CERN. "Everything is happening here. You cannot be physics coordinator and not be stationed at CERN. There are regular meetings, you talk to people all the time, watch their progress and coordinate to optimize." As physics coordinator he has to make sure that all ATLAS people who work on Higgs analysis and other special topics work together in a coherent way. There is a complicated sub-group structure and all simulations and data have to be perfectly understood. "The good thing is that after my job here, I will be able to return to Freiburg with a clear conscience and spend a lot of time analysing the data I helped to prepare," he explains. "Administration, teaching, funding proposals, forms and management – all that takes time at home. It is a great luxury to be able to concentrate on one thing only here: pure physics."

• For more information about Physics at the Terascale, see www.terascale.de.

• For further information about the Helmholtz research association, see www.helmholtz.de/en

IHEP and CERN collaborate well on beam-loss monitors https://cerncourier.com/a/ihep-and-cern-collaborate-well-on-beam-loss-monitors/ https://cerncourier.com/a/ihep-and-cern-collaborate-well-on-beam-loss-monitors/#respond Wed, 19 Sep 2007 11:34:37 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/ihep-and-cern-collaborate-well-on-beam-loss-monitors/ The LHC beam-loss monitoring (BLM) system is the key to protecting the machine against dangerous beam "losses" of this kind.

Beam-loss monitoring ionization chambers

The circulating beams will store an unprecedented amount of energy when the LHC is in operation. If even a small fraction of this beam deviates from the correct orbit, it may induce a quench in the superconducting magnets or even cause physical damage to system components. The LHC beam-loss monitoring (BLM) system is the key to protecting the machine against dangerous beam “losses” of this kind.

The BLM system generates a beam abort trigger when the measured rate of lost beam exceeds pre-determined safety thresholds. The lost beam particles initiate hadronic showers in the magnets; these showers are measured by monitors installed outside the cryostat around each quadrupole magnet. About 4000 BLMs – mainly ionization chambers – will be installed around the LHC ring. They are the result of a successful collaboration between CERN and the Institute for High Energy Physics (IHEP) in Protvino, Russia. CERN developed the monitors and IHEP manufactured them during the past year, using industry-produced components.
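
To illustrate the kind of threshold logic described above, the sketch below keeps running sums of the loss readings over several integration windows and flags an abort request when any window exceeds its limit. It is a schematic Python illustration only: the window lengths, threshold values and class name are invented for this example, and the real system's integration times and thresholds are set by machine-protection requirements rather than by these numbers.

# Illustrative sketch only: the integration windows and thresholds below are
# invented for demonstration and are not the real LHC machine-protection settings.
from collections import deque

# (window length in samples, abort threshold in arbitrary dose-rate units)
WINDOWS = [(1, 100.0), (8, 400.0), (64, 2000.0)]

class BeamLossMonitor:
    """Keeps running sums of loss readings and flags any window that exceeds its threshold."""

    def __init__(self, windows=WINDOWS):
        self.windows = windows
        self.buffers = [deque(maxlen=n) for n, _ in windows]

    def update(self, reading):
        """Add one loss reading; return True if a beam abort should be requested."""
        for (n, threshold), buf in zip(self.windows, self.buffers):
            buf.append(reading)
            if len(buf) == n and sum(buf) > threshold:
                return True
        return False

# A steady small loss stays below every threshold; a sudden spike requests an abort.
blm = BeamLossMonitor()
print(any(blm.update(1.0) for _ in range(100)))  # False
print(blm.update(500.0))                         # True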

Signal speed and robustness against aging were the main design criteria. The monitors are about 60 cm long with a diameter of 9 cm and a sensitive volume of 1.5 l. Each one contains 61 parallel aluminium electrode plates separated by 0.5 cm and is filled with nitrogen at 100 mbar overpressure and permanently sealed inside a stainless-steel cylinder. They operate at 1.5 kV and are equipped with a low-pass filter at the high-voltage input. The collection time of the electrons and ions is 300 ns and 80 μs, respectively.

The radiation dose on the detectors over 20 years of LHC operation is estimated at 2 × 10^8 Gy in the collimation sections and 2 × 10^4 Gy at the other locations. To avoid radiation aging, production of the chamber components included a strict ultra-high vacuum (UHV) cleaning procedure. As a result, impurity levels from thermal and radiation-induced desorption should remain in the range of parts per million. Standardized test samples analysed at CERN periodically helped to check the cleaning performance.

The team at IHEP designed and built a special UHV stand to ensure suitable conditions for building the monitors. They performed checks throughout the production phase and documented the results. The quality of the welding is a critical aspect, so the team tested all of the welds for leak tightness at several stages. They also monitored constantly the vacuum and the purity of the filling gas. It was necessary to test the components before welding, and the assembled monitors during and after production, to ensure that the leakage current of the monitors stayed below 1 pA. Overall, IHEP achieved a consistently high quality for the monitors during the whole production period and kept to the tight production schedule. Tests at CERN’s Gamma Irradiation Facility of all 4250 monitors found fewer than 1% to be outside of the strict tolerance levels.

PHYSTAT-LHC combines statistics and discovery https://cerncourier.com/a/phystat-lhc-combines-statistics-and-discovery/ https://cerncourier.com/a/phystat-lhc-combines-statistics-and-discovery/#respond Mon, 20 Aug 2007 11:52:46 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/phystat-lhc-combines-statistics-and-discovery/ With the LHC due to start running next year, the PHYSTAT-LHC workshop on Statistical Issues for LHC Physics provided a timely opportunity to discuss the statistical techniques to be used in the various LHC analyses. The meeting, held at CERN on 27–29 June, attracted more than 200 participants, almost entirely from the LHC experiments. The PHYSTAT series […]

With the LHC due to start running next year, the PHYSTAT-LHC workshop on Statistical Issues for LHC Physics provided a timely opportunity to discuss the statistical techniques to be used in the various LHC analyses. The meeting, held at CERN on 27–29 June, attracted more than 200 participants, almost entirely from the LHC experiments.

The PHYSTAT series of meetings began at CERN in January 2000 and has addressed various statistical topics that arise in the analysis of particle-physics experiments. The meetings at CERN and Fermilab in 2000 were devoted to the subject of upper limits, which are relevant when an experiment fails to observe an effect and the team attempts to quantify the maximum size the effect could have been, given that no signal was observed. By contrast, with exciting results expected at the LHC, the recent meeting at CERN focused on statistical problems associated with quantifying claims of discoveries.

The invited statisticians, though few in number, were all-important participants – a tradition started at the SLAC meeting in 2003. Sir David Cox of Oxford, Jim Berger of Duke University and the Statistical and Applied Mathematical Sciences Institute, and Nancy Reid of Toronto University all spoke at the meeting, and Radford Neal, also from Toronto, gave his talk remotely. All of these experts are veterans of previous PHYSTAT meetings and are familiar with the language of particle physics. The meeting was also an opportunity for statisticians to visit the ATLAS detector and understand better why particle physicists are so keen to extract the maximum possible information from their data – we devote much time and effort to building and running our detectors and accelerators.

The presence of statisticians greatly enhanced the meeting, not only because their talks were relevant, but also because they were available for informal discussions and patiently explained statistical techniques. They gently pointed out that some of the “discoveries” of statistical procedures by particle physicists were merely “re-inventions of the wheel” and that some of our wheels resemble “triangles”, instead of the already familiar “circular” ones.

The meeting commenced with Cox's keynote address, The Previous 50 Years of Statistics: What Might Be Relevant For Particle Physics. In particular, he discussed multiple testing and the false discovery rate – particularly relevant in general-purpose searches, where there are many possibilities for a statistical fluctuation to be confused with a discovery of some exciting new physics. Cox reminded the audience that it is more important to ask the correct question than to perform a complicated analysis, and that, when combining data, one should first check that they are not inconsistent.

One approach to searching for signs of new physics is to look for deviations from the Standard Model. This is usually quantified by calculating the p-value, which gives the probability – assuming the Standard Model is true – of finding data at least as discrepant as that observed. Thus a small p-value implies an inconsistency between the data and the Standard Model prediction. This approach is useful in looking for any sort of discrepancy. The alternative is to compare the predictions of the Standard Model with some specific alternative, such as a particular version of supersymmetry. This is a more powerful way of looking for this particular form of new physics, but is likely to be insensitive to other possibilities.
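
As a concrete illustration of this definition, the short Python sketch below computes the p-value for a one-bin Poisson counting experiment and converts it into the equivalent one-sided Gaussian significance. The event counts are invented for the example and are not taken from any analysis discussed at the workshop.

# Illustrative numbers only: a one-bin counting experiment with a known expected background.
from scipy import stats

n_observed = 25      # observed event count
b_expected = 10.0    # expected Standard Model background

# p-value: probability, assuming the background-only (Standard Model) hypothesis,
# of observing at least as many events as were actually seen.
p_value = stats.poisson.sf(n_observed - 1, b_expected)   # P(N >= n_observed | b)

# Equivalent one-sided Gaussian significance ("Z sigma").
z = stats.norm.isf(p_value)

print(f"p-value = {p_value:.2e}  ->  significance = {z:.2f} sigma")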

Luc Demortier of Rockefeller University gave an introduction to the subject of p-values, and went on to discuss ways of incorporating systematic uncertainties in their calculation. He also mentioned that, in fitting a mass spectrum to a background-only hypothesis and to a background plus a 3-parameter peak, it is a common misconception that in the absence of any signal, the difference in χ² of the two fits behaves as a χ² with 3 degrees of freedom. The talk by Kyle Cranmer of Brookhaven National Laboratory dealt with the practical issues of looking for discoveries at the LHC. He too described various methods of incorporating systematics to see whether a claim of 5 σ statistical significance really does correspond to such a low probability. The contributed talk by Jordan Tucker from the University of California, Los Angeles (UCLA) also explored this.

While the LHC is being completed, the CDF and DØ experiments are taking data and analysing it at the Fermilab collider, which currently provides the highest accelerator energies. Fermilab’s Wade Fisher summarized the experience gained from searches for new phenomena. Later, speakers from the major LHC experiments discussed their “statistical wish-lists” – topics on which they would like advice from statistics experts. Jouri Belikov of CERN spoke for the ALICE experiment, Yuehong Xie of Edinburgh University for LHCb and Eilam Gross of the Weizmann Institute for ATLAS and CMS. It was particularly pleasing to see the co-operation between the big general-purpose experiments ATLAS and CMS on this and other statistical issues. The Statistics Committees of both work in co-operation and both experiments will use the statistical tools that are being developed (see below). Furthermore, it will be desirable to avoid the situation where experiments make claims of different significance for their potential discoveries, not because their data are substantially different, but simply because they are not using comparable statistical techniques. Perhaps PHYSTAT can claim a little of the credit for encouraging this collaboration.

When making predictions for the expected rate at which any particle will be produced at the LHC, it is crucial to know the way that the momentum of a fast-moving proton is shared among its constituent quarks and gluons (known collectively as partons). This is because the fundamental interaction in which new particles are produced is between partons in the colliding protons. This information is quantified in the parton distribution functions (PDFs), which are determined from a host of particle-physics data. Robert Thorne of University College London, who has been active in determining PDFs, explained the uncertainties associated with these distributions and the effect that they have on the predictions. He stressed that other effects, such as higher-order corrections, also resulted in uncertainties in the predicted rates.
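
The statement that production rates are driven by parton–parton interactions is usually expressed through the standard leading-order factorization formula, written here in textbook notation for orientation rather than quoted from the article; f_i denote the PDFs, x_1 and x_2 the momentum fractions, μ_F and μ_R the factorization and renormalization scales, and σ̂ the partonic cross-section:

\sigma_{pp \to X} \;=\; \sum_{i,j} \int_0^1 \mathrm{d}x_1\, \mathrm{d}x_2\; f_i(x_1, \mu_F^2)\, f_j(x_2, \mu_F^2)\; \hat{\sigma}_{ij \to X}(x_1 x_2 s;\, \mu_F^2, \mu_R^2)

Any uncertainty on the f_i therefore propagates directly into every predicted rate through this convolution, which is the effect Thorne described.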

Statisticians have invested much effort on “experimental design”. A typical example might be how to plan a series of tests investigating the various factors that might affect the efficiency of some production process; the aim would be to determine which factors are the most critical and to find their optimal settings. One application for particle physicists is to decide how to set the values of parameters associated with systematic effects in Monte Carlo simulations; the aim here is to achieve the best accuracy of the estimate of systematic effects with a minimum of computing. Since a vast amount of computing time is used for these calculations, the potential savings could be very useful. The talks by statisticians Reid and Neal, and by physicist Jim Linnemann of Michigan State University, addressed this important topic. Plans are underway to set up a working group to look into this further, with the aim of producing recommendations on this issue.

Two very different approaches to statistics are provided by the Bayesian and frequentist approaches (see box). Berger gave a summary of the way in which the Bayesian method could be helpful. One particular case that he discussed was model selection, where Bayesianism provides an easy recipe for using data to choose between two or more competing theories.
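
For two fully specified hypotheses the "easy recipe" reduces to a ratio of likelihoods, the Bayes factor. The snippet below is a minimal illustration for a single Poisson count, comparing a background-only model with background plus a fixed signal; the numbers are invented and this is not an example taken from Berger's talk.

# Illustrative only: Bayes factor for two simple (parameter-free) hypotheses,
# in which case it is just the likelihood ratio of the observed count.
from scipy import stats

n_observed, b, s = 18, 10.0, 8.0

evidence_h0 = stats.poisson.pmf(n_observed, b)       # P(data | background only)
evidence_h1 = stats.poisson.pmf(n_observed, b + s)   # P(data | background + signal)

bayes_factor = evidence_h1 / evidence_h0
print(f"Bayes factor B10 = {bayes_factor:.1f}")      # values above 1 favour the signal model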

With the required software for statistical analyses becoming increasingly complicated, Wouter Verkerke from NIKHEF gave an overview of what is currently available. A more specific talk by Lorenzo Moneta of CERN described the statistical tools within the widely used ROOT package developed by CERN’s René Brun and his team. An important topic in almost any analysis in particle physics is the use of multivariate techniques for separating the wanted signal from undesirable background. There is a vast array of techniques for doing this, ranging from simple cuts via various forms of discriminant analysis to neural networks, support vector machines and so on. Two largely complementary packages are the Toolkit for Multivariate Data Analysis and StatPatternRecognition. These implement a variety of methods that facilitate comparison of performance, and were described by Frederik Tegenfeldt of Iowa State University and Ilya Narsky of California Institute of Technology, respectively. It was reassuring to see such a range of software available for general use by all experiments.

Although the theme of the meeting was the exciting discovery possibilities at the LHC, Joel Heinrich of University of Pennsylvania returned to the theme of upper limits. At the 2006 meeting in Banff, Heinrich had set up what became known as the “Banff challenge”. This consisted of providing data with which anyone could try out their favourite method for setting limits. The problem included some systematic effects, such as background uncertainties, which were constrained by subsidiary measurements. Several groups took up the challenge and provided their upper limit values to Heinrich. He then extracted performance figures for the various methods. It seemed that the “profile likelihood” did well. A talk by Paul Baines of Harvard University described the “matching prior” Bayesian approach.
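
As a flavour of what such a calculation involves, the sketch below derives a 95% confidence-level upper limit on a signal rate from the profile likelihood, for a counting experiment whose background is constrained by a subsidiary measurement. The event counts, the scale factor tau and the use of the asymptotic chi-squared criterion are illustrative assumptions, not the actual specification of the Banff challenge.

# Illustrative sketch: 95% CL upper limit on a signal rate s from the profile likelihood,
# for n observed events with background b constrained by a subsidiary measurement
# m ~ Poisson(tau * b). All numbers are invented for demonstration.
from scipy import stats, optimize

n, m, tau = 4, 12, 3.0   # main count, subsidiary count, scale factor

def nll(s, b):
    """Negative log-likelihood of the two Poisson measurements."""
    return -(stats.poisson.logpmf(n, s + b) + stats.poisson.logpmf(m, tau * b))

def profiled_nll(s):
    """Minimise the negative log-likelihood over the nuisance parameter b at fixed s."""
    return optimize.minimize_scalar(lambda b: nll(s, b),
                                    bounds=(1e-6, 100.0), method="bounded").fun

# Best fit with the signal restricted to be non-negative.
best = optimize.minimize_scalar(profiled_nll, bounds=(0.0, 50.0), method="bounded")

def q(s):
    """Profile-likelihood-ratio test statistic used for the upper limit."""
    return 2.0 * (profiled_nll(s) - best.fun)

# Asymptotically, a one-sided 95% CL limit is the s at which q crosses 2.71 (= 1.645**2).
cut = stats.chi2.ppf(0.90, df=1)
upper = optimize.brentq(lambda s: q(s) - cut, best.x, 50.0)
print(f"95% CL upper limit on s: {upper:.2f}")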

Unlike the talks that sometimes conclude meetings, the summary talk by Bob Cousins of UCLA was really a review of the talks presented at the workshop. He had put an enormous amount of work into reading through all of the available presentations and then giving his own comments on them, usefully putting the talks of this workshop in the context of those presented at earlier PHYSTAT meetings.

Overall, the quality of the invited talks at PHYSTAT-LHC was impressive. Speakers went to great lengths to make their talks readily intelligible: physicists concentrated on identifying statistical issues that need clarification; statisticians presented ideas that can lead to improved analysis. There was also plenty of vigorous discussion between sessions, leading to the feeling that meetings such as these really do lead to an enhanced understanding of statistical issues by the particle-physics community. Gross coined the word "phystatistician" for particle physicists who could explain the difference between the probability of A given that B had occurred and the probability of B given that A had occurred. When the LHC starts up in 2008, it will be time to put all of this into practice. The International Committee for PHYSTAT concluded that the workshop was successful enough that it was worth considering a further meeting at CERN in summer 2009, when real LHC data should be available.

NSF selects Homestake for deep lab site https://cerncourier.com/a/nsf-selects-homestake-for-deep-lab-site/ https://cerncourier.com/a/nsf-selects-homestake-for-deep-lab-site/#respond Mon, 20 Aug 2007 07:12:21 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/nsf-selects-homestake-for-deep-lab-site/ The US National Science Foundation (NSF) has selected a proposal to produce a technical design for a deep underground science and engineering laboratory (DUSEL) at the former Homestake gold mine near Lead, South Dakota, site of the pioneering solar-neutrino experiment by Raymond Davis.

The US National Science Foundation (NSF) has selected a proposal to produce a technical design for a deep underground science and engineering laboratory (DUSEL) at the former Homestake gold mine near Lead, South Dakota, site of the pioneering solar-neutrino experiment by Raymond Davis. A 22-member panel of external experts reviewed proposals from four teams and unanimously determined that the Homestake proposal offered the greatest potential for developing a DUSEL.

The selection of the Homestake proposal, which was submitted through the University of California (UC) at Berkeley by a team from various institutes, only provides funding for design work. The team, led by Kevin Lesko from UC Berkeley and the Lawrence Berkeley National Laboratory, could receive up to $5 million a year for up to three years. Any decision to construct and operate a DUSEL, however, will entail a sequence of approvals by the NSF and the National Science Board. Funding would ultimately have to be approved by the US Congress. If eventually built as envisioned by its supporters, a Homestake DUSEL would be the largest and deepest facility of its kind in the world.

The concept of DUSEL grew out of the need for an interdisciplinary "deep science" laboratory that would allow researchers to probe some of the most compelling mysteries in modern science, from the nature of dark matter and dark energy to the characteristics of microorganisms at great depth. Such topics can only be investigated at depths where hundreds of metres of rock can shield ultra-sensitive physics experiments from background activity, and where geoscientists, biologists and engineers can have direct access to geological structures, tectonic processes and life forms that cannot be studied fully in any other way. Several countries, including Canada, Italy and Japan, have extensive deep-science programmes, but the US has no existing facilities below a depth of 1 km. In September 2006, the NSF solicited proposals to produce technical designs for a DUSEL-dedicated site. Four teams had submitted proposals by the January 2007 deadline, each for a different location.

The review panel included outside experts from relevant science and engineering communities and from supporting fields such as human and environmental safety, underground construction and operations, large project management, and education and outreach.

Scientists from Japan, Italy, the UK and Canada also served on the panel. The review process included site visits by panellists to all four locations, with two meetings to review the information, debate and vote on which, if any, of the proposals would be recommended for funding.

The post NSF selects Homestake for deep lab site appeared first on CERN Courier.

]]>
Theory in the computer age https://cerncourier.com/a/viewpoint-theory-in-the-computer-age/ Mon, 04 Jun 2007 11:14:26 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-theory-in-the-computer-age/ Some years ago, it was customary to divide work in the exact sciences of physics, chemistry and biology into three categories: experimental, theoretical and computational.

The post Theory in the computer age appeared first on CERN Courier.

]]>
Some years ago, it was customary to divide work in the exact sciences of physics, chemistry and biology into three categories: experimental, theoretical and computational. Those of us breathing the rarified air of pure theory often considered numerical calculations and computer simulations as second-class science, in sharp contrast to our highbrow elaborate analytical work.

Nowadays, such an attitude is obsolete. Practically all theoreticians use computers as an essential everyday tool and find it hard to imagine life in science without the glow of a monitor in front of their eyes. Today an opposite sort of prejudice seems to hold sway. A referee might reject an article demonstrating the nearly forgotten fine art of rigorous theoretical thought and reasoning if the text is not also full of plots showing numerous results of computer calculations.

Sometimes it seems that the only role that remains for theoreticians – at least in nuclear physics, which I know best – is to write down a computer code, plug in numerical values, wait for the results and finally insert them into a prewritten text. However, any perception of theorists as mere data-entry drones misses the mark.

First, to write reasonable code one needs to have sound ideas about the underlying nature of physical processes. This requires clear formulation of a problem and deep thinking about possible solutions.

Second, building a model of physical phenomena means making hard choices about including only the most relevant building blocks and parameters and neglecting the rest.

Third, the computer results themselves need to be correctly interpreted, a point made by the now-famous quip of theoretical physicist Eugene Wigner. “It is nice to know that the computer understands the problem,” said Wigner when confronted with the computer-generated results of a quantum-mechanics calculation. “But I would like to understand it, too.”

We live in an era of fast microprocessors and high-speed internet connections. This means that building robust high-performance computing centres is now within reach of far more universities and laboratories. However, physics remains full of problems of sufficient complexity to tax even the most powerful computer systems. These problems, many of which are also among the most interesting in physics, require appropriate symbiosis of human and computer brains.

Consider the nuclear-shell model, which has evolved to be a powerful tool for achieving the most specific description of properties of complex nuclei. The model describes the nucleus as a self-sustaining collection of protons and neutrons moving in a mean field created by the particles’ co-operative action. On top of the mean field there is a residual interaction between the particles.

Applying the model means being immediately faced with a fundamental question: what is the most reasonable way to restrict the number of particle orbits plugged into the computer? The answer is important since information about the orbits is represented in matrices that must subsequently be diagonalized. For relatively heavy nuclei these matrices are so huge – with dimensions reaching many billions – that they are intractable even for the best computers. This is why, at least until a few years ago, the shell model was relegated to describing relatively light nuclei.

The breakthrough came by combining the blunt power of contemporary computing with the nuanced theoretical intellect of physicists. It was theorists who determined that a full solution of the shell-model problem is unnecessary and that it is sufficient to calculate detailed information for a limited number of low-lying states; theorists who came up with a statistical means to average the higher-level states by applying principles of many-body quantum chaos; and theorists who figured out how to use such averages to determine the impact on low-lying states.

Today physicists have refined techniques for truncating shell-model matrices to a tractable size, getting approximate results, and then adding the influence of the higher-energy orbits with the help of the theory of quantum chaos. The ability to apply the shell model to heavier nuclei may eventually advance efforts to understand nucleosynthesis in the cosmos, determine rates of stellar nuclear reactions, solve condensed-matter problems in the study of mesoscopic systems, and perform lattice QCD calculations in the theory of elementary particles. Eventually, that is, because many answers to basic physics questions remain beyond the ken of even the most innovative human–computer methods of inquiry.
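
The numerical heart of this programme is the diagonalisation of a large, sparse matrix for only a handful of low-lying states. The following is a minimal sketch in Python of just that step, using an artificial sparse symmetric matrix as a stand-in for a truncated shell-model Hamiltonian; the dimension, sparsity and energy scale are invented purely for illustration.

```python
import numpy as np
from scipy.sparse import diags, random as sparse_random
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(42)

# Artificial stand-in for a truncated shell-model Hamiltonian: a large,
# sparse, symmetric matrix (dimension and density chosen for illustration).
dim = 5000
offdiag = sparse_random(dim, dim, density=5e-4, format="csr", random_state=42)
hamiltonian = offdiag + offdiag.T + diags(np.sort(rng.uniform(0.0, 50.0, dim)))

# Only a few low-lying states are needed, not the full spectrum, so a
# Lanczos-type solver is used instead of a full diagonalisation.
energies, states = eigsh(hamiltonian, k=5, which="SA")  # "SA": smallest algebraic
print("five lowest eigenvalues:", np.round(energies, 3))
```

In a real shell-model code the matrix elements would encode the residual interaction in a chosen valence space, and the truncation and chaos-based corrections described above decide how far the matrix can be cut down before this step.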

So yes, one can grieve over the fading pre-eminence of theory. However, few among us would want to revert to the old days, despite our occasional annoyance with the rise of computer-centric physics and the omnipresent glow of the monitor on our desks. As for my opinion, I happen to agree with the chess grandmaster who, when recently complaining about regular defeats of even the best human players by modern chess computers, said: “Glasses spoil your eyes, crutches spoil your legs and computers your brain. But we can’t do without them.”

The post Theory in the computer age appeared first on CERN Courier.

]]>
Power and prejudice: women in physics https://cerncourier.com/a/power-and-prejudice-women-in-physics/ https://cerncourier.com/a/power-and-prejudice-women-in-physics/#respond Mon, 04 Jun 2007 10:45:55 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/power-and-prejudice-women-in-physics/ Marianne Johansen goes in search of facts about an emotive issue.

The post Power and prejudice: women in physics appeared first on CERN Courier.

]]>
Physics has always had a relatively low proportion of female students and researchers. In the EU, women make up on average 33% of PhD graduates in the physical sciences, but only 9% of professors (ECDGR 2006). At CERN the proportion is even lower, with only 6.6% of the research staff in experimental and theoretical physics being women (Schinzel 2006). The mismatch between the proportions of PhD graduates and of professors also suggests that women are less likely than men to succeed in an academic career.

Before examining the findings of various studies, it is worth asking if this low representation of women in physics is a problem – do we actually need more female physicists? In my opinion this question has to be answered from three perspectives: the perspective of society, the perspective of science and the perspective of women.

Starting from the viewpoint of society, there are several issues to consider. First, physics is a field of innovation. Many technological advancements that have a huge impact on society and everyday life come directly or indirectly from physics. Being a physicist therefore means having access to people and knowledge that set the technological agenda.

Second, in many countries research and academic positions are regarded as high-status jobs. Academic staff are often appointed to committees that fund research projects or advise governments on issues that are closely related to their field of expertise. As such, scientists influence the focus of research and the general development of society.

Finally, it is a democratic principle that power and influence should be distributed equally and proportionally among different groups in society. An EU average of 9% female physics professors does not even come close to equal representation in this field. The fact that women fund research through tax payments adds to the demand for more female scientists.

From a scientific point of view, the lack of women represents a huge waste of talent. For physics to develop further as a science, it needs more people with excellent analytical, communicational and social skills. There are also reports that departments without women suffer in many ways (Main 2005).

From the perspective of women, they will of course benefit from increased influence in society, but contributing to physics is not only about struggling for influence and power. Fundamental questions have been asked throughout history by men and women alike. Contributing to physics is to participate in a human project, driven by curiosity and wonder that seeks to understand the world around us.

What the studies find

So why do women fail to advance to the top levels in academia? Some reports state that it is because women are less likely to give priority to their career (Pinker and Spelke 2005), while others claim that women are less able than men to do science, or lack some of the abilities necessary to be successful in it. For example, one report suggests that men are on average more aggressive than women, and that this characteristic (among others) is necessary to succeed in academic work (Lawrence 2006). What these reports have in common is that they all conclude that there will never be as many women as men in academia because of innate differences between the genders, and also that these differences are the main reason for the under-representation of women.

Other reports state that women do not succeed in physics because of prejudice, discrimination and unfriendly attitudes towards them. Studies have shown that women need to be twice as productive as men to be considered equally competent (Wennerås and Wold 1997). In fact both men and women rate men’s work higher than that of women (Goldberg 1968). There is also the psychological mechanism called “stereotype threat”, which causes individuals who are made aware of the negative stereotypes connected to the social group to which they belong – such as age, gender, ethnicity and religion – to underperform in a manner consistent with the stereotype. White male engineering students will for instance perform significantly worse on tests when they are told that Asian students usually outperform them on the same tests (Steele 2004). It is important to remember that these prejudices are present in most human beings and do not necessarily arise from bad will or conscious hostility.

A survey designed to identify issues that are important to female physicists also reported on their negative experiences as a minority group owing to the male domination in the field (Ivie and Guo 2005). In this survey 80% stated that attitudes towards women in physics need to be improved, while 65% believed discrimination is a problem that needs to be dealt with. This survey also reported on positive experiences among female physicists, in particular their love for their field and the support that they have received from others.

To produce an exhaustive list of reasons for why so few women are able to reach the highest positions in academia would be a tedious endeavour with many conflicting opinions. However, if we agree that we need more women in physics, it is clear that we need to take action. In this regard it is important to recognize that some of these actions will also be beneficial to men, improving their ability to succeed in a scientific career.

In academia several things can be changed to eliminate discrimination and hostile attitudes towards women (and men):

• Transparency in selection processes for scholarships, funding and positions, i.e. making all evaluations by the selection committees public so that any discriminatory mechanism can be unveiled. This will also benefit men, since they too are subject to discrimination (Wennerås and Wold 1997).

• Investigate hostile attitudes in institutes and laboratories. Those who discriminate tend not to see how their behaviour affects their environment, and those discriminated against are usually reluctant to admit it. The Institute of Physics in London visits institutes, on invitation only, to investigate their attitudes towards women (Main 2005).

• Make the career path more predictable. Both genders suffer from the unpredictability and requirement of mobility in an academic physics career, and this can also conflict with the desire to start a family (Ivie and Guo 2005).

• Awareness of discrimination. Nobody wants to discriminate against others; the use of stereotypes and prejudice is a part of the human mind. It is therefore important to be aware of how these tendencies affect the way that we evaluate and treat others. Awareness of discriminatory procedures has brought about change. Both the US National Institutes of Health (Carnes 2006) and the Swedish Medical Research Council (Wennerås and Wold 1997) changed their routines after being made aware that their evaluation and recruitment schemes were prejudiced against women.

There is no doubt that the under-representation of women in physics is a sensitive issue. Women and men who have never experienced discrimination or bias towards their gender often feel repelled when the issue is discussed. However, I believe the numbers speak for themselves: women do not have the same opportunity to succeed in academia as men. As individuals we would like to think that we can all approach any branch of society without being met with hostility or bias, no matter what ethnic group, social class, religion or gender we might belong to. In the end most women would just like to be able to make the same mistakes, produce the same number of papers and be respected, accepted or rejected on the same terms as their male colleagues, not more, not less.

• With thanks to the ATLAS Women Group, David Milstead, Robindra Prabhu, Helene Rickhard, Josi Schinzel, Jonas Strandberg and Sara Strandberg.

The post Power and prejudice: women in physics appeared first on CERN Courier.

]]>
A new challenge for particle physics https://cerncourier.com/a/viewpoint-a-new-challenge-for-particle-physics/ Mon, 30 Apr 2007 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-a-new-challenge-for-particle-physics/ TRIUMF's new director Nigel Lockyer looks to the future of co-operation in particle physics, and Canada's role in this increasingly global adventure.

The post A new challenge for particle physics appeared first on CERN Courier.

]]>
Particle physics often describes itself, and correctly so, as having brought countries and people together that previously had been unable to co-operate with each other. In Europe, CERN was born out of a desire for co-operation. This was evident later, for example, when Russian and Chinese scientists worked well together within the US throughout the Cold War. This spirit of connection across national boundaries led to success for our science – and for us all as scientists. The strong innate desire to understand our universe transcends our differences. Our field was in many ways, or so we like to say, the first and most successful model in modern international relations. CERN embodies this co-operation.

Nowadays, however, we cannot rest on our laurels. This co-operation is happening in almost every other field of research; international facilities and multinational teams of researchers are no longer unique to particle physics. So what is the next level of co-operation for us? To some it might be obvious. We should continue to strive for a seamless global vision of science projects, and we should distribute those projects around the world so as to maximize the benefits of science in all countries, large or small, rich or poor. The ITER and LHC projects perhaps exemplify global projects: the world unites to select, design, build and operate a project. Particle physicists, as everyone knows, are considering another one, an International Linear Collider (ILC).

The Global Design Effort (GDE) for an ILC is not “flat” globally, but is a merging of regions. The world has been divided into three geographical areas: Asia, the Americas and Europe. In this mixture, Canada is an interesting case study. TRIUMF, Canada’s National Laboratory for Particle and Nuclear Physics, is located in Vancouver, on the Asia–Pacific rim, yet only a few miles north of the US border. TRIUMF, though a small laboratory, hosts more than 550 scientists, engineers, technicians, postdoctoral fellows and students, and more than 1000 active users from Canada, the US and around the world. Historically, TRIUMF and the Canadian particle-physics community have made significant intellectual contributions to the major projects – both on the accelerator side and the detector-physics side – in Europe at DESY with HERA and ZEUS, at CERN with the LHC and ATLAS, and most recently in Japan with T2K at J-PARC. Canadian particle physicists have also been active in experiments in the US, such as SLD and BaBar at SLAC, CDF and D0 at Fermilab and rare-kaon experiments at Brookhaven National Laboratory.

TRIUMF also has a world-leading internal radioactive-beam programme using the ISOL technique, familiar at CERN’s ISOLDE. TRIUMF’s nuclear physicists are collaborating with China and India and have strong ties to France (Ganil), Germany (GSI), the UK and Japan. TRIUMF is truly global, reflecting that Canada is close to Europe in culture, close to the US geographically and culturally, and is on the Asia–Pacific rim. Canada also continues to merge the culture of nuclear and particle physics, just as CERN is doing at the LHC with ALICE, ATLAS and CMS. A good example is the Sudbury Neutrino Observatory (SNO), where particle and nuclear physicists came together and did great science. SNOLAB will also merge nuclear and particle physics to pursue neutrino and dark-matter searches (see Canada looks to future of subatomic physics). TRIUMF’s infrastructure and technical resources allowed Canadian physicists to help build SNO and will be important in the future for experiments at SNOLAB.

TRIUMF is not yet fully engaged in the ILC effort. Given its history, it is obvious that it will want to participate significantly. Canadian particle physicists are big proponents of an ILC and believe that it is a great opportunity and that it has tremendous discovery potential. However, the area of TRIUMF’s involvement and with which regions it will partner is under discussion.

One fact remains: involvement in any international science project must also feed back to help the internal national programmes. Advances in accelerator technology and detector development for the LHC help the entire national science programme, including nuclear physics, life sciences and condensed matter physics. ILC and superconducting radio-frequency (SRF) development will also be important for Canada and TRIUMF’s internal programmes. The latest ILC technology will bootstrap other vanguard technical developments in each country just as we hope that the globally distributed computing for the LHC, such as TRIUMF’s Tier-1 centre, will have a similar impact.

A strong national science programme supports educational advances and is necessary for innovation and economic prosperity. We should keep this in mind as the world considers the ILC and other large projects, such as next-generation neutrino observatories or underground laboratories. TRIUMF’s and Canada’s strategy is to develop niches of national expertise while participating in exciting international science projects such as the LHC and ILC. The development of such niches is essential to the future prosperity of our field.

All of this will require strategic regional and global planning in particle and nuclear physics. Surely, we are up for this challenge!

After investing in ATLAS and LHC for many years, Canada and TRIUMF are looking forward to a decade or more of great discoveries.

The post A new challenge for particle physics appeared first on CERN Courier.

]]>
Viewpoint https://cerncourier.com/a/viewpoint-viewpoint/ Tue, 30 Jan 2007 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-viewpoint/ Six secrets of successful institutes Mike Lazaridis, co-founder of the company behind the BlackBerry, explains how he has applied business strategy to establish a world-class theoretical-physics institute.

The post Viewpoint appeared first on CERN Courier.

]]>
I started a company, Research In Motion, while I was still at the University of Waterloo in Ontario, Canada. By the late 1990s we had developed the BlackBerry handheld mobile device. As a result I found myself in a position where I could invest in an area that I am passionate about and one that could make a big difference.

Spot an opportunity. Having observed that research funding is usually thinly spread, I decided to start a theoretical-physics institute that would focus on science that is fundamental to all human progress and at which Canada can excel.

Promote scientific openness. My driving motivation for establishing the Perimeter Institute (PI), located next to the University of Waterloo, is that I feel fundamental science needs more support. What worries me is that governments all around the world seem to be listening to the same consultant. They ask scientists to do something that will benefit the economy within five years. Of course, governments are under pressure to balance budgets and be accountable – that’s reasonable. But some of that pressure is getting transferred to universities, with unfortunate results.

Science is a global enterprise based on co-operation and openness. If you say to universities that they must justify their research with patents and licences, you collapse that openness. Efforts to commercialize too early are making researchers more secretive, hampering their ability to excel, without necessarily helping business. I wanted to challenge this trend.

Concentrate on core competencies. A strategic decision we made when creating the Institute in 1999 was to focus on a couple of very specialized fields, quantum gravity and quantum foundations, because we felt these were areas where a relatively small, high-quality team could make a big difference. This is the same strategy that originally made BlackBerry a success: it focused on doing one thing – “push e-mail” – very well rather than competing on all features. So for the first few years, PI focused on recruiting top-class researchers in these two areas to ensure that research efforts were of international calibre within a relatively short period. As the Institute’s reputation builds, we are branching out a bit more.

Build a focal point. The other decision we made early on was to house the Institute in an outstanding building. Before we built it, we spent two years going around the world and talking to people in theoretical-physics institutes and theory departments at universities, asking them what works and what doesn’t. Based on this, we put together some specifications and organized a competition, where we really let the architects go wild. The result is a building with a design that has won several prizes and is internationally recognized.

Attract investment. I invested C$100 million of my own money in PI to get it started. For the longer term it was critical to get government support. Convincing government officials took a huge effort. Part of the challenge is that not many politicians understand basic science, let alone know how to value it. This means that a lot of funding is done almost entirely on your ability to explain the benefits and on their faith in you. Early on, all levels of government (local, provincial and federal) saw the benefit of PI and decided to support the Institute with a total of about C$55 million. More recently, and now that the Institute is established, a further C$50 million in public investment was warmly received.

Present your product. In the long run, you can’t rely on faith alone. So although excellent science is crucial to success that’s really only half of the story. The other half of the Institute’s activities is about outreach. For example, PI has a summer-school programme for students from all over Canada and around the world. PI also goes on tour across Canada to give classroom instruction about physics to both students and teachers.

PI also has a programme of monthly public lectures. Sometimes we’ll have scientists like Roger Penrose discuss a weighty topic; other times we’ll have debates about science with well-known historians and journalists. Waterloo has a population of only about 100,000, yet every month we fill a 550-seat lecture theatre, and there’s always a queue outside on standby. That’s how much interest you can generate in science, if you make the effort to open it up for people and make the research accessible.

And that’s success. Because ultimately, these are the people who vote for the governments which fund the research. If they don’t benefit from and believe in what we’re doing, it’s always going to be an uphill struggle. So in addition to directly helping students, teachers and members of the general public, there’s reason for balancing good science with good outreach. We have to move beyond relying on faith.

Mike Lazaridis is founder and co-CEO of Research In Motion, makers of BlackBerry handheld devices, as well as chancellor of the University of Waterloo. Additional information about PI is available at www.perimeterinstitute.ca.

The post Viewpoint appeared first on CERN Courier.

]]>
Nuclear science hits new frontiers https://cerncourier.com/a/viewpoint-nuclear-science-hits-new-frontiers/ Wed, 06 Dec 2006 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-nuclear-science-hits-new-frontiers/ C Konrad Gelbke argues that nuclear science has a bright future thanks to the possibilities being opened in particular by the exploration of rare isotopes.

The post Nuclear science hits new frontiers appeared first on CERN Courier.

]]>
Nuclear science is undergoing a renaissance as it confronts new and previously unapproachable research opportunities. One such opportunity, the study of short-lived nuclei far from stability, is emerging as a major frontier in nuclear science. Rare-isotope research is tied to astrophysics and mesoscopic science, fields in which voracious demand for new data is generating worldwide interest in high-power, next-generation accelerators.

New facilities will probe the limits of nuclear stability and determine nuclear properties in uncharted regions of nuclei with unusual proton-to-neutron ratios. The new data will challenge descriptions of nuclei that are based on data limited to nuclei near the valley of nuclear stability. These improved models of nuclei – two component, open mesoscopic systems – will increase our understanding of mesoscopic systems in fields such as chemistry, biology, nanoscience and quantum information. More directly, the models will greatly boost our understanding of the cosmos.

Today, our descriptions of stellar evolution, and especially of explosive events, such as X-ray bursts, core-collapse supernovae, gamma-ray bursts, thermonuclear (Type Ia) supernovae and novae, are limited by inadequate knowledge of important nuclear properties. We need new data for nuclei far from stability and better nuclear theories to develop accurate models of these astrophysical phenomena. Improved models, in turn, will help astrophysicists make better use of data from ground- and space-based observatories, understand the nuclear processes that produce the elements observed in the cosmos and learn about the environments in which they were formed.

We already have the first concrete evidence that nuclear structure, well established for nuclei near the line of stability, can change dramatically as we move away from the line of stability. The effective interactions far from stability – pairing, proton–neutron, spin-orbit and tensor – are different, but largely unknown. We need quantitative experimental information to refine theoretical treatments that describe these exotic isotopes.

There are several particularly promising research directions. For example, nuclei with unusual density distributions have been discovered for the lighter elements, but little is known about the properties of heavier, very neutron-rich nuclei. These heavier nuclei may have multi-neutron halo distributions with unusual cluster or molecular structures, which otherwise only occur at the surface of neutron stars. Such nuclei provide a unique opportunity to study the nucleon–nucleon interaction in nearly pure neutron matter.

Intense beams of neutron-rich isotopes will be used to synthesize transactinide nuclei that are more neutron-rich than is possible with stable beams. These nuclei are predicted to be sufficiently strongly bound and long-lived for detailed chemical study.

Energetic nucleus–nucleus collision experiments with beams of very neutron-rich and very neutron-poor isotopes will explore the asymmetry energy term in the equation of state of neutron-rich nuclear matter. This term is important in understanding the properties of neutron stars.

Nuclei are self-sustaining finite droplets of a two-component – neutron and proton – Fermi-liquid. Selectively prepared nuclei will allow us to study, on a femtoscopic scale, typical mesoscopic phenomena: self-organization and complexity arising from elementary interactions, symmetry and phase transformations, coexistence of quantum chaos and collective dynamics. The openness of loosely bound nuclei owing to strong coupling to the continuum allows us to probe general mesoscopic concepts, such as information processing and decoherence, which are key ideas in quantum computing.

The interplay of strong, electromagnetic and weak interactions determine detailed nuclear properties. Selecting nuclear systems that isolate or amplify the specific physics of interest will allow better tests of fundamental symmetries and fuel the search for new physics beyond the Standard Model.

Beyond advancing basic research questions, new accelerators should yield practical benefits for science and society. In fact, nuclear science has a long record of such applications. Technologies rooted in nuclear science – such as positron-emission tomography, the use of radioactive isotopes for treating or diagnosing disease, and more recently, the use of dedicated accelerators for treating cancer patients – have transformed medicine. Sterilization of fresh produce or surgical instruments with ionizing radiation is growing in importance. Ultra-sensitive nuclear detection, such as Rutherford backscattering, proton-induced X- and gamma-ray emission and accelerator mass spectrometry, has provided diagnostic tools for archaeology and material science.

Next-generation rare-isotope research and this tradition of applied work promise new opportunities for cross-disciplinary collaboration on national and international security, biomedicine, materials research and nuclear energy. Nuclear science is well positioned to deliver new benefits to physics and society in the coming decades.

The post Nuclear science hits new frontiers appeared first on CERN Courier.

]]>
PHYSTAT tackles the significance problem https://cerncourier.com/a/phystat-tackles-the-significance-problem/ https://cerncourier.com/a/phystat-tackles-the-significance-problem/#respond Wed, 01 Nov 2006 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/phystat-tackles-the-significance-problem/ When analysing data from particle-physics experiments, the best statistical techniques can produce a better quality result. Given that statistical computations are not expensive, while accelerators and detectors are, it is clearly worthwhile investing some effort in the former. The PHYSTAT series of conferences and workshops, which started at CERN in January 2000, has been devoted […]

The post PHYSTAT tackles the significance problem appeared first on CERN Courier.

]]>
When analysing data from particle-physics experiments, the best statistical techniques can produce a better quality result. Given that statistical computations are not expensive, while accelerators and detectors are, it is clearly worthwhile investing some effort in the former. The PHYSTAT series of conferences and workshops, which started at CERN in January 2000, has been devoted to just this topic (CERN Courier May 2000 p17). The latest workshop was at Banff in the Canadian Rockies in July and was also a culminating part of the spring 2006 programme of astrostatistics which had taken place earlier in the year at the Statistical and Applied Mathematical Sciences Institute (SAMSI) in the Research Triangle Park, North Carolina.

The initiative for the Banff workshop came from Nancy Reid, a statistician from Toronto who has delivered invited talks at the PHYSTAT conferences at SLAC and Oxford (see CERN Courier March 2004 p22 and CERN Courier January/February 2006 p35). The Banff International Research Station sponsors workshops on a variety of mathematical topics, including statistics. The setting for these meetings is the Banff Center, an island of tranquillity and vigorous intellectual activity in the town of Banff. Most of the activities at the centre are in the arts, but science and mathematics are found there too.

Thirty-three people attended the workshop, of whom 13 were statisticians, the remainder being mostly experimental particle physicists, with astrophysicists making up the balance. It concentrated on three specific topics: upper limits, in situations where there are systematic effects (nuisance parameters); assessing the significance of possible new effects, in the presence of nuisance parameters; and the separation of events that are signal from those that are caused by background, a classification process that is required in almost every statistical analysis in high-energy physics. For each of these topics there were two coordinators, a physicist and a statistician.

The three topics, of course, interact with each other. Searches for new physics will result in an upper limit when little or no effect is seen, but will need a significance calculation when a discovery is claimed. The multivariate techniques are generally used to provide the enriched subsample of data on which these searches are performed. Just as for limits or significance, nuisance parameters can be important in multivariate separation methods.

As this was a workshop, the organizers encouraged participants to be active in the weeks before the meeting. Reading material was circulated as well as some simulated data, on which participants could run computer programmes that incorporated their favourite algorithms. This enabled all participants to become familiar with the basic issues before the start of the meeting. The workshop began with introductory talks on particle physics and typical statistical analyses, and Monte Carlo simulations in high-energy physics. These primarily described for statisticians the terminology, the sort of physics issues that we try to investigate in experimental particle physics, what the statistical problems are and how we currently cope with them, and so on.

Jim Linnemann of Michigan State University publicized a new website, www.phystat.org, which provides a repository for software that is useful in statistical calculations for physics. Everyone is encouraged to contribute suitable software, which can range from packages that are suitable for general use, to the code that is specifically used in preparing a physics publication.

The convenors of the various subgroups led the remaining sessions on the first day. Very few talks were scheduled for subsequent days, specifically to leave plenty of time for discussions and for summary talks and to provide an opportunity for exploring fundamental issues.

Limits and significance

The discussion about limits ranged from Bayesian techniques, via profile likelihood to pure frequentist methods. Statisticians made the interesting suggestion that hierarchical Bayes might be a good approach for a search for new physics in several related physics channels. There was a lively discussion about the relative merits of the possible approaches, and about the relevant criteria for the comparison. After a late evening session, it was decided that data would be made available by the limits convenor, Joel Heinrich of the University of Pennsylvania, so that participants could try out their favourite methods, and Heinrich would compare the results. This work is expected to continue until November.
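
To make the object of such comparisons concrete, here is a minimal sketch of one of the approaches on the table: a profile-likelihood upper limit for a toy counting experiment in which the background is constrained by a control region. The event counts, the control-region ratio and the asymptotic threshold of 2.71 for a 95% confidence-level one-sided limit are illustrative assumptions only, and this is not the data set distributed by Heinrich.

```python
import numpy as np
from scipy.optimize import brentq, minimize_scalar
from scipy.stats import poisson

# Toy counting experiment: n_obs events in the signal region, m_obs events in a
# background-only control region that is tau times larger (numbers invented).
n_obs, m_obs, tau = 8, 12, 3.0

def nll(s, b):
    """Negative log-likelihood for signal strength s and background rate b."""
    return -(poisson.logpmf(n_obs, s + b) + poisson.logpmf(m_obs, tau * b))

def profiled_nll(s):
    """Minimise the negative log-likelihood over the nuisance parameter b."""
    return minimize_scalar(lambda b: nll(s, b), bounds=(1e-9, 100.0),
                           method="bounded").fun

# Best fit of the signal strength, restricted to physical (non-negative) values.
best = minimize_scalar(profiled_nll, bounds=(0.0, 50.0), method="bounded")

def q(s):
    """Profile-likelihood-ratio test statistic -2 ln lambda(s)."""
    return 2.0 * (profiled_nll(s) - best.fun)

# Asymptotically, q(s) = 2.71 marks the 95% CL one-sided upper limit on s.
upper_limit = brentq(lambda s: q(s) - 2.71, best.x, 50.0)
print(f"95% CL upper limit: s < {upper_limit:.2f} events")
```

A Bayesian treatment of the same toy problem would instead integrate the likelihood over a prior for the background and quote the 95% quantile of the posterior for the signal, which is precisely the kind of difference the comparison exercise is meant to expose.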

Discussions considered the significance issue within particle physics, with several other examples in astrophysics. Indeed it arises in a range of subjects in which anomalous effects are sought. Luc Demortier of Rockefeller University in New York, the physics convenor on significance, detailed eight ways in which nuisance parameters can be incorporated into these calculations, and discussed their performance. This will be a crucial issue for new particle searches at the Large Hadron Collider at CERN, where the exciting discoveries that may be made include the Higgs boson, supersymmetric particles, leptoquarks, pentaquarks, free quarks or magnetic monopoles, extra spatial dimensions, technicolour, the substructure of quarks and/or leptons, mini black holes, and so on. In all cases some of the backgrounds will be known only approximately and it will be necessary to distinguish between peaks that are merely statistical fluctuations or artefacts and those that are genuine signals of new physics.

Demortier also addressed the issues of whether it is possible to assess the significance of an interesting effect, which is obtained by physicists adjusting selection procedures while looking at the data; and why particle physics usually demands the equivalent of a 5 σ fluctuation of the background before claiming a new discovery. (The probability of obtaining such a large fluctuation by chance is less than one part in a million.)
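
As a small worked example of the convention mentioned above, the one-sided Gaussian tail probability corresponding to a 5 σ excess can be computed directly (a sketch only; the one-sided convention is the one usually quoted for discovery significances):

```python
from scipy.stats import norm

# One-sided tail probability of a 5 sigma upward fluctuation of the background
p_value = norm.sf(5.0)       # survival function: P(Z > 5), about 2.9e-7
z_value = norm.isf(p_value)  # the inverse mapping recovers Z = 5
print(f"p-value for 5 sigma: {p_value:.2e}  (Z = {z_value:.2f})")
```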

Signal or background?

The sessions on multivariate signal–background separation resulted in positive discussions between physicists and statisticians. Byron Roe of the University of Michigan explained the various techniques that are used for separating signal from background. He described how, for the MiniBooNE experiment at Fermilab, Monte Carlo studies showed that boosted decision trees yielded good separation, and coped with more than 100 input variables. An important issue concerned assessing the effect on the physical result, in this case neutrino oscillation parameters, of possible systematic effects. One of the conventional methods for doing this is to vary each possible systematic effect by one standard deviation, and to see how much this affects the result; the different sources are then combined. Roe pointed out that there is much to recommend an alternative procedure, which investigates the effect on the result of simultaneously varying all possible systematic sources at random.
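
The contrast Roe drew between the two ways of propagating systematic effects can be illustrated in a few lines. The toy dependence of the result on three nuisance parameters below is entirely invented; it only shows the mechanics of shifting each source by one standard deviation in turn versus varying all sources simultaneously at random.

```python
import numpy as np

rng = np.random.default_rng(1)

def result(nu):
    """Toy analysis result as a function of three systematic parameters,
    expressed in units of their one-sigma uncertainties (invented dependence)."""
    a, b, c = nu
    return 1.0 + 0.03 * a - 0.02 * b + 0.01 * a * c

# Conventional recipe: shift each source by +/- 1 sigma and add in quadrature.
shifts = []
for i in range(3):
    up, down = np.eye(3)[i], -np.eye(3)[i]
    shifts.append(0.5 * abs(result(up) - result(down)))
quadrature_sum = np.sqrt(np.sum(np.square(shifts)))

# Alternative: draw all sources at random simultaneously and take the spread.
samples = np.array([result(rng.standard_normal(3)) for _ in range(10_000)])

print(f"one-at-a-time, added in quadrature: {quadrature_sum:.4f}")
print(f"simultaneous random variation:      {samples.std():.4f}")
```

With the cross-term included in the toy dependence, the two numbers differ slightly, which is the kind of effect the simultaneous variation is designed to pick up.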

Radford Neal, a statistician from Toronto University, took up this theme in more detail, and also emphasized the need for any statistical procedure to be robust against possible uncertainties on its input assumptions. One of Neal’s favourite methods uses Bayesian neural nets. He also described graphical methods for showing which of the input variables were most useful in providing the separation of signal and background.

Ilya Narsky of Caltech gave a survey of the various packages that exist for performing signal–background separation, including R, WEKA, MATLAB, SAS, S+ and his own StatPatternRecognition. Narsky suggested that the criteria for judging the usefulness of such packages should include versatility, ease of implementation, documentation, speed, size and graphics capabilities. Berkeley statistician Nicolai Meinshausen gave a useful demonstration of the statistical possibilities within R.

The general discussion in this sub-group covered topics such as the identification of variables that were less useful, and whether to remove them by hand or in the programme; the optimal approach when there are several different sources of background; the treatment of categorical variables; and how to compare the different techniques. This last issue was addressed by a small group of participants working one evening using several different classifiers on a common simulated data set. Clearly there was not the time to optimize the adjustable parameters for each classification method, but it was illuminating to see how quickly a new approach could be used and comparative performance figures produced. Reinhard Schweinhorst of Michigan State University then presented the results.
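
The sort of quick comparison made that evening can be mimicked today in a few lines with generic tools. The sketch below builds a toy two-variable signal and background sample (it bears no relation to the data set actually used at the workshop) and compares a boosted-decision-tree classifier with a simple linear one via the area under the ROC curve, using the widely available scikit-learn package.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy simulated sample: two correlated variables, with the signal shifted and
# correlated differently from the background (numbers invented for illustration).
n = 5000
background = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], n)
signal = rng.multivariate_normal([1.0, 0.5], [[1.0, -0.3], [-0.3, 1.0]], n)
X = np.vstack([background, signal])
y = np.concatenate([np.zeros(n), np.ones(n)])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for name, clf in [("boosted decision trees", GradientBoostingClassifier()),
                  ("linear (logistic regression)", LogisticRegression(max_iter=1000))]:
    clf.fit(X_train, y_train)
    auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
    print(f"{name}: area under ROC curve = {auc:.3f}")
```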

As far as the workshop as a whole was concerned, it was widely agreed that it was extremely useful having statisticians present to discuss new techniques, to explain old ones and to point out where improvements could be made in analyses. It was noted, however, that while astrophysics has been successful in involving statisticians in their analyses to the extent that their names appear on experimental papers, this is usually not the case in particle physics.

Several reasons for this have been suggested. One is that statisticians enjoy analysing real data, with its interesting problems. Experimental collaborations in particle physics tend to be very jealous about their data, however, and are unwilling to share it with anyone outside the collaboration until it is too old to be interesting. This results in particle physicists asking statisticians only very general questions, which the statisticians regard as unchallenging. If we really do want better help from statisticians, we have to be prepared to be far more generous in what we are ready to share with them. A second issue might be that in other fields scientists are prepared to provide financial support to a statistics post-doc to devote his/her time and skills to helping analyse the data. In particle physics this is, at present, very unusual.

There was unanimous agreement among attendees that the Banff meeting had been stimulating and useful. The inspiring location and environment undoubtedly contributed to the dynamic interaction of participants. The sessions were the scene of vigorous and enlightening discussion, and the work continued late into the evenings, with many participants learning new techniques to take back with them to their analyses. There was real progress in understanding practical issues that are involved in the three topics discussed, and everyone agreed that it would be useful and enjoyable to return to Banff for another workshop.

Further reading

The most recent PHYSTAT conference was in Oxford (see www.physics.ox.ac.uk/phystat05/, which has links to the earlier meetings). Details about the Banff meeting are at www.pims.math.ca/birs/birspages.php?task=displayevent&event_id=06w5054.

The post PHYSTAT tackles the significance problem appeared first on CERN Courier.

]]>
Paris hosts research and innovation expo https://cerncourier.com/a/paris-hosts-research-and-innovation-expo/ https://cerncourier.com/a/paris-hosts-research-and-innovation-expo/#respond Tue, 06 Jun 2006 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/paris-hosts-research-and-innovation-expo/ From 8-11 June, the 2nd European Research and Innovation Exhibition, being held at Porte de Versailles Exhibition Centre in Paris will open its doors to the public.

The post Paris hosts research and innovation expo appeared first on CERN Courier.

]]>
From 8-11 June, the 2nd European Research and Innovation Exhibition, being held at Porte de Versailles Exhibition Centre in Paris will open its doors to the public. Aimed both at professionals in research and industry and at the general public, including university and high-school students, the exhibition brings together the major European players in research and innovation. These include CERN, the Dapnia Laboratory of the Commissariat à l’Energie Atomique at Saclay, and the Institut National de Physique nucléaire et de physique des particules (IN2P3) of the Centre national de la recherche scientifique (CNRS).

The first exhibition, held in 2005, attracted 24,000 visitors. This year, to emphasize the international nature of the event, Germany is guest of honour, with participation by SIEMENS, one of the country’s leading exponents of industrial innovation, along with the French-German Association for Sciences and Technology.

The widely varied programme of conferences and round tables allows visitors to familiarize themselves with the achievements and ambitions of research and innovation and their fundamental importance to the future of the European Community, both in science itself and in research funding and applications. A Young Scientists’ space will also give exhibitors a chance to meet high-calibre young graduates who are seeking employment.

Presentations include a talk on elementary particles by Christelle Roy from the CNRS Laboratoire de Physique Subatomique et des Technologies Associées (Subatech) in Nantes, and Michel Spiro, director of IN2P3 in Paris. The stands include an exhibit by CERN, highlighting aspects of technology transfer.

The post Paris hosts research and innovation expo appeared first on CERN Courier.

]]>
High schools focus on the extreme universe https://cerncourier.com/a/high-schools-focus-on-the-extreme-universe/ https://cerncourier.com/a/high-schools-focus-on-the-extreme-universe/#respond Tue, 02 May 2006 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/high-schools-focus-on-the-extreme-universe/ James Pinfold reports on the large number of projects that are forging a connection between research in ultra-high-energy cosmic rays and practical scientific experience in schools.

The post High schools focus on the extreme universe appeared first on CERN Courier.

]]>
On 15 October 1991 the highest-energy cosmic-ray particle ever measured struck Earth’s atmosphere tens of kilometres above the Utah Desert. Colliding with a nucleus, it lit up the night for an instant and then was gone. The Fly’s Eye detector at the Dugway Proving Grounds in Utah captured the trail of light emitted as the cascade of secondary particles created in the collision made the atmosphere fluoresce. The Fly’s Eye researchers measured the energy of the unusual ultra-high-energy cosmic-ray event – dubbed the “Oh-My-God (OMG) event” – at 320 exa-electron-volts (EeV), or 320 × 10¹⁸ eV. In SI units, the particle, probably a proton, hit the atmosphere with a total kinetic energy of about 50 J. For a microscopic particle this is a truly macroscopic energy – enough to lift a mass of 1 kg about five metres against gravity. On 3 December 1993, on the opposite side of the world, the Akeno Giant Air Shower Array (AGASA) in Japan recorded another OMG event with an energy of 200 EeV. In this case the cosmic ray was recorded using a large array of detectors on the ground to measure the extended air shower (EAS) resulting from the primary cosmic ray interacting with the atmosphere.
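
The conversion from particle-physics units to everyday ones is a one-line, back-of-the-envelope exercise (the 1 kg mass is simply an illustrative comparison):

```python
# Back-of-the-envelope conversion of the 320 EeV event into everyday units.
e_ev = 320e18                          # energy in electron-volts
e_joule = e_ev * 1.602e-19             # 1 eV = 1.602e-19 J, giving about 51 J
lift_height = e_joule / (1.0 * 9.81)   # metres a 1 kg mass could be raised
print(f"{e_joule:.0f} J, enough to lift 1 kg by about {lift_height:.0f} m")
```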

Since these first observations at least a dozen OMG events have been recorded, confirming the phenomenon and mystifying cosmic-ray physicists. It seemed that particles with energies more than about 50 EeV should not reach Earth from any plausible source in the universe more than around 100 million parsecs distant, as they should rapidly lose their energy in collisions with the 2.7 K cosmic-microwave background radiation from the Big Bang – the Greisen-Zatsepin-Kuzmin limit. While many explanations have been proposed, experiments have so far failed to decipher a clear message from these highly energetic messengers, and the existence of the OMG events has become a profound puzzle. Now a new eye on these ultra-high-energy events has come into focus, based on the great plain of the Pampa Amarilla in western Argentina. The Pierre Auger Observatory (PAO), with its unprecedented collecting power, has begun to study cosmic rays at the highest energies.

However signs of the extreme-energy universe may also come in a different guise – not as a single OMG event but rather as bursts of events of more-modest energy. On 20 January 1981, near Winnipeg, a cluster of 32 EASs – with an estimated mean energy of 3000 tera-electron-volts – was observed within 5 min (Smith et al. 1983). Only one such event would have been expected. This observation was the only one of its kind during an experiment that recorded 150,000 showers in 18 months. In the same year an Irish group reported an unusual simultaneous increase in the cosmic-ray shower rate at two recording stations 250 km apart (Fegan et al. 1983). The event, recorded in 1975, lasted 20 s and was the only one of its kind detected in three years of observation.

There have since been a few hints of such “correlated” cosmic-ray phenomena seen by some small cosmic-ray experiments dotted around the world, such as a Swiss experiment that deployed four detector systems in Basel, Bern, Geneva and Le Locle, with a total enclosed area of around 5000 km². In addition, the Baksan air-shower-array group has presented evidence from data from 1992 to 1996 for short bursts of super-high-energy gamma rays from the direction of the active galactic nucleus Markarian 501. The AGASA collaboration has also reported small-scale clustering in arrival directions, and possibly in the arrival times of these clustering events.

One mechanism that could generate correlated showers over hundreds of kilometres is the photodisintegration of high-energy cosmic-ray nuclei passing through the vicinity of the Sun, first proposed by N M Gerasimova and Georgy Zatsepin back in the 1950s. Other more recent and more exotic examples of phenomena that could give rise to large-area non-random cosmic-ray correlations include relativistic dust grains, antineutrino bursts from collapsing stellar systems, primordial black-hole evaporation and even mechanisms arising from the presence of extra dimensions.

Working together

Whichever way the high-energy universe is incarnated on Earth, the signs should be exceedingly rare, requiring large numbers of detectors deployed over vast areas to provide a reasonable signal. The detection of a single OMG particle requires dense EAS arrays and/or atmospheric fluorescence detectors, with detector spacings of the order of a kilometre, as in the PAO. Detection of cosmic-ray phenomena correlated over very large areas requires even bigger detection areas, which at present are economically feasible only with more sparse EAS arrays (on average much fewer than one detector per km²). In fact, global positioning system (GPS) technology makes it possible to perform precision timing over ultra-large areas, enabling a number of detector networks to be deployed as essentially one huge array. An example is the Large Area Air Shower array, which started taking data in the mid-1990s. It comprises around 10 compact EAS arrays spread across Japan, forming a sparse detector network with an unprecedented enclosed area of the order of 30,000 km².
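
A minimal sketch of the kind of coincidence search such GPS-synchronised networks perform is given below; the station list, event rates and the 100 μs window are hypothetical, chosen only to show the mechanics of matching time-stamped events from widely separated sites.

```python
from itertools import combinations

import numpy as np

rng = np.random.default_rng(7)

# Hypothetical GPS-tagged event times (seconds within one hour) at three sites.
stations = {name: np.sort(rng.uniform(0.0, 3600.0, size=200)) for name in "ABC"}
window = 1e-4  # 100 microsecond coincidence window (illustrative choice)

def coincidences(times_a, times_b, window):
    """Return pairs of events from two sorted time lists lying within the window."""
    pairs, j = [], 0
    for t in times_a:
        while j < len(times_b) and times_b[j] < t - window:
            j += 1                      # drop events that can no longer match
        k = j
        while k < len(times_b) and times_b[k] <= t + window:
            pairs.append((t, times_b[k]))
            k += 1
    return pairs

# With purely random, uncorrelated events, chance coincidences are very rare,
# so any excess of matched pairs would point to a correlated origin.
for a, b in combinations(stations, 2):
    print(f"{a}-{b}: {len(coincidences(stations[a], stations[b], window))} coincidences")
```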

Now, however, a new dimension to cosmic-ray research has opened up. In 1998 in Alberta, building on a proposal first presented in 1995, the first node of a new kind of sparse very-large-area network of cosmic-ray detectors began to take data. The innovative aspect of the Alberta Large-area Time-coincidence Array (ALTA) is that it is deployed in high schools. By the end of 1999 three high-school sites were operating, each communicating with the central site at the University of Alberta. In 2000 the Cosmic Ray Observatory Project (CROP), centred at the University of Nebraska, set up five schools with detectors from the decommissioned Chicago Air Shower Array. Around the same time the Washington Large-area Time-coincidence Array (WALTA) installed its first detectors.

The ALTA, CROP and WALTA projects have a distinct purpose – to forge a connection between two seemingly unrelated but equally important aims. The first is to study the extreme-energy universe by searching for large-area cosmic-ray coincidences and their sources; the second is to involve high-school students and teachers in the excitement of fundamental research. These “educational arrays”, with their serious research purpose, provide a unique educational experience, and the paradigm has spread to many other sites in North America. The detector systems are simple but effective. Following the ALTA/CROP model they use a small local array of plastic scintillators, which are read by custom-made electronics and which use GPS for precise coincidence timing with other nodes in a network of local arrays over a large area. Most of the local systems forming an array use three or more detectors, which, with a separation of the order of 10 m and a hard-wired coincidence, allow accurate pointing at each local site. Today the ALTA/CROP/WALTA arrays involve more than 60 high schools and there are three further North American educational arrays in operation: the California High School Cosmic Ray Observatory (CHICOS) and the Snowmass Area Large-scale Time-coincidence Array (SALTA) in the US, and the Victoria Time-coincidence Array (VICTA) in Canada. At least seven more North American projects are planned.
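
The pointing capability of a local station can be sketched in the same spirit: with three detectors roughly 10 m apart, the relative GPS-timed arrival of the shower front fixes its direction under a simple plane-front approximation. The layout and the simulated direction below are invented for illustration.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

# Hypothetical layout of one school station: three detectors about 10 m apart,
# all at the same height; positions in metres.
positions = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])

# Simulate a plane shower front travelling at zenith 30 deg, azimuth 45 deg.
theta, phi = np.radians(30.0), np.radians(45.0)
u_true = np.array([np.sin(theta) * np.cos(phi), np.sin(theta) * np.sin(phi)])
times = positions @ u_true / C   # arrival times relative to the first detector

# Reconstruction: the two measured time differences determine the two
# horizontal components of the front's direction of travel.
dr = positions[1:] - positions[0]
dt = times[1:] - times[0]
u_fit = np.linalg.solve(dr, C * dt)
zenith = np.degrees(np.arcsin(np.linalg.norm(u_fit)))
azimuth = np.degrees(np.arctan2(u_fit[1], u_fit[0]))
print(f"reconstructed zenith {zenith:.1f} deg, azimuth {azimuth:.1f} deg")
```

In practice the timing resolution and the finite curvature and thickness of the shower front limit how well a single small station can point, which is one reason several detectors and a hard-wired coincidence are used at each site.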

The CHICOS array is the largest ground-based array in the Northern Hemisphere. Its detectors, donated by the CYGNUS collaboration, are deployed on more than 70 high-school rooftops across 400 km² in the Los Angeles area. Each site has two 1 m² plastic scintillator detectors separated by a few metres. Local pointing at each site is not possible, nor is it required as CHICOS uses GPS pointing across multiple sites to concentrate on the search for single ultra-high-energy cosmic-ray air showers. Recently the collaboration reported their results at the 29th International Cosmic Ray Conference in Pune, India (McKeown et al. 2005).

Innovative detection techniques have also been employed in this burgeoning collaboration between researchers and high-school students and teachers in North America. A prime example is the project for Mixed Apparatus for Radar Investigation of Cosmic Rays of High Ionization (MARIACHI), based at Brookhaven National Laboratory, New York. The plan is for the experiment to detect ultra-high-energy cosmic rays using the passive bistatic radar technique, where stations continuously listen to a radio frequency that illuminates the sky above them. The ionization trails of ultra-high-energy cosmic-ray showers – as well as meteors, micro-meteors and even aeroplanes – in the field of the radio beam will reflect radio waves into the high-school-based detectors. These schools will also be equipped with conventional cosmic-ray air-shower detectors. The technique, if successful, will speed the construction of ultra-large-area cosmic-ray detectors.

The European endeavour

Across the Atlantic, schools in many European countries are also getting involved in studying the extreme-energy universe (see figure 1). In 2001 physicists from the University of Wuppertal proposed SkyView – the first European project to suggest using high-school-based cosmic-ray detectors. This ambitious project proposed an immense 5000 km² array, the size of the PAO, using thousands of universities, colleges, schools and other public buildings in the North Rhine-Westphalia area. Roughly a year later CERN entered the field with a collaborative effort to distribute cosmic-ray detectors from the terminated High Energy Gamma Ray Astronomy project to schools around Düsseldorf. A test array of 20 counters was set up at Point 4 of the tunnel for the Large Electron-Positron (LEP) collider, with the aim of studying coincidences with counters installed about 5 km away at Point 3 as part of cosmic-ray studies by the L3 experiment at LEP.


Also in 2002 the High School Project on Astrophysics Research with Cosmics (HiSPARC), initiated by physicists from the University of Nijmegen in the Netherlands, joined the European effort. HiSPARC now has five regional clusters of detectors being developed in the areas of Amsterdam, Groningen, Leiden, Nijmegen and Utrecht. Around 40 high schools are participating so far and more are joining. In March 2005 the HiSPARC array registered an event of energy 8 × 10¹⁹ eV, in the ultra-high-energy “ankle” region of the cosmic-ray energy spectrum, which was also reported at the international conference in Pune (Timmermans 2005).

The HiSPARC collaboration is also planning to use a recent and exciting development of the Low Frequency Array (LOFAR) Prototype Station (LOPES) experiment in Karlsruhe. Using a relatively simple radio antenna, LOPES detects the coherent low-frequency radio signal that accompanies the showers of secondary particles from ultra-high-energy cosmic rays. A large array of these low-frequency radio antennas, the LOFAR observatory, is already being constructed in the Netherlands. Such technology can also be exploited by high-school-based observatories around the world to expand their capability rapidly to become effective partners in the search for point sources of ultra-high-energy particles.

Elsewhere, the School Physics Project was initiated in Finland and is now under development. Also in 2002 the Stockholm Educational Air Shower Array (SEASA) was proposed to the Royal Institute of Technology in Stockholm. SEASA has two stations of cosmic-ray detectors running at the AlbaNova University Centre and the first cluster of stations for schools in the Stockholm area is now in the production stage. Meanwhile, in the Czech Republic the Technical University in Prague and the University of Opava in the province of Silesia – working closely with the ALTA collaboration – each have a detector system taking data, with a third to be deployed this summer.

A number of other European efforts are gearing up, including two that have links to the discovery of cosmic-ray air showers in 1938 by Pierre Auger, Roland Maze and Thérèse Grivet-Meyer working at the Paris Observatory. The Réseau de Lycées pour Cosmiques (RELYC) project, centred on the Collège de France/Laboratoire Astroparticule et Cosmologie in Paris, is preparing to install detectors in high schools close to where Auger and colleagues performed their ground-breaking experiments. The Roland Maze project is centred on the Cosmic Ray Laboratory of the Andrzej Soltan Institute for Nuclear Studies in Lodz, Poland, where it continues a long tradition in studies of cosmic-ray air showers initiated in partnership with Maze some 50 years ago. The plans are to deploy detectors in more than 30 local high schools. In the UK, physicists from King’s College London in collaboration with the Canadian ALTA group will place detector systems in the London area during 2006. In northern England, Preston College is continuing to work on a pilot project, initiated in 2001, to develop an affordable cosmic-ray detection system as part of the Cosmic Schools Group Proposal, involving the University of Liverpool and John Moores University in Liverpool. Finally, a project to set up cosmic-ray telescopes with GPS in 10 Portuguese high schools is underway, spearheaded by the Laboratório de Instrumentação e Física Experimental de Partículas and the engineering faculty of the Technical University in Lisbon.

While the majority of the European projects are based on plastic scintillators, the Italian Extreme Energy Events (EEE) project has opted instead for multigap resistive plate chambers (MRPCs) as their basic detector element. These allow a precise measurement of the direction and time of arrival of a cosmic ray. The aim of this project, the roots of which date back to 1996, is to have a system of MRPC telescopes distributed over a surface of 10⁶ km², for precise detection of extreme-energy events (Zichichi 1996). These chambers are similar to those that will be used in the time-of-flight detector for the ALICE experiment at CERN’s Large Hadron Collider. Three MRPC chambers form a detector “telescope” that can reconstruct the trajectories of cosmic muons in a shower. At present 23 schools from across Italy are involved in the pilot project, with around 100 others on a waiting list from the length and breadth of the Italian peninsula. More than 60 MRPCs have been built at CERN by teams of high-school students and teachers under the guidance of experts from Italian universities and the INFN.
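
To give a feel for what such a telescope measures – this is a toy sketch, not the EEE reconstruction software – the Python fragment below fits a straight line through hits in three stacked chambers and returns the muon's zenith and azimuth angles. The 0.5 m plane spacing and the hit coordinates are invented for the example.

import math

def fit_track(hits):
    """hits: list of (z, x, y) points, one per chamber, in metres.
    Returns (zenith angle, azimuth angle) in degrees from a least-squares line fit."""
    n = len(hits)
    zs = [h[0] for h in hits]
    zbar = sum(zs) / n
    szz = sum((z - zbar) ** 2 for z in zs)
    def slope(vals):
        vbar = sum(vals) / n
        return sum((z - zbar) * (v - vbar) for z, v in zip(zs, vals)) / szz
    bx = slope([h[1] for h in hits])   # dx/dz
    by = slope([h[2] for h in hits])   # dy/dz
    zenith = math.degrees(math.atan(math.hypot(bx, by)))
    azimuth = math.degrees(math.atan2(by, bx)) % 360.0
    return zenith, azimuth

# Three chambers 0.5 m apart; a muon crossing at a slight angle (invented hits).
print(fit_track([(0.0, 0.00, 0.00), (0.5, 0.05, 0.02), (1.0, 0.10, 0.04)]))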

A worldwide network


Most of the major groups in Canada and the US have formed a loose collaboration – the North American Large-area Time Coincidence Arrays (NALTA) – with more than 100 detector stations spread across North America (figure 2). The aim is to share educational resources and information. However, it is also planned to have one central access point where students and researchers can use data from all of the NALTA sites, creating in effect a single giant array. Such a combined network across North America could eventually consist of thousands of cosmic-ray detectors, with the primary research aim of studying ultra-high-energy cosmic-ray showers and correlated cosmic-ray phenomena over a very large area. Until the PAO collaboration constructs its second array in Colorado, US, the NALTA arrays, along with their European counterparts, will dominate the ground-based investigation of the extreme-energy universe in the Northern Hemisphere.

The European groups are also developing a similar collaboration, called Eurocosmics. It is clear that a natural next step is to combine the North American and European networks into a worldwide network that could contribute significantly to elucidating the extreme-energy universe. Such a network could aid and encompass other efforts throughout the world, including in developing countries where it could provide a natural bridgehead into the global scientific culture.

The post High schools focus on the extreme universe appeared first on CERN Courier.

]]>
https://cerncourier.com/a/high-schools-focus-on-the-extreme-universe/feed/ 0 Feature James Pinfold reports on the large number of projects that are forging a connection between research in ultra-high-energy cosmic rays and practical scientific experience in schools. https://cerncourier.com/wp-content/uploads/2006/05/CCEhig1_05-06.jpg
A cosmic vision for world science https://cerncourier.com/a/viewpoint-a-cosmic-vision-for-world-science/ Tue, 02 May 2006 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-a-cosmic-vision-for-world-science/ James Pinfold considers how relatively low-cost experiments to study ultra-high-energy cosmic rays could bring developing countries into frontier research.

The post A cosmic vision for world science appeared first on CERN Courier.

]]>
Many developed countries face the challenge of encouraging more young people to take up science, to ensure the future innovation that benefits society. However, there is a related and equally important challenge – to promote a scientific infrastructure that supports the academic and career ambitions of members of under-represented and economically disadvantaged groups, as well as of scientists from developing countries, and so increases their participation in scientific and technical fields worldwide.


Severe constraints on resources, which are a common feature in developing countries, mean that research there does not usually consist of designing and making equipment for a new experiment at the forefront of the field. In many schools, colleges and universities, laboratories either do not exist or are poorly equipped. Consequently, the brain drain of bright young scientists from developing to developed countries seems to be the norm, and further intellectually impoverishes the developing world. Collaborative programmes between scientists from developed and developing countries are urgently needed.

The Abdus Salam International Centre for Theoretical Physics (ICTP) in Trieste has set an international example by providing both a forum and practical support for collaboration in theoretical physics between developing and developed countries. It has also supported indigenous physics programmes in developing countries. Importantly, the director of ICTP, Katepalli Sreenivasan, plans to include experimental physics in the programme. CERN has also taken a significant step to foster a relationship with physicists from developing countries that does not require large cash contributions to CERN, but instead encourages the production of detector components at the home laboratories. This lets physicists from developing countries participate in frontier research.

The Pierre Auger Collaboration is involved in developing experimental work in Vietnam to understand the universe at the highest energies. The Vietnam Auger Training Laboratory (VATLY) at the Institute for Nuclear Science and Techniques in Hanoi was inaugurated as a training ground for future experimentalists in astroparticle physics and related areas, and an exact replica of the water Cherenkov detector used in the Pierre Auger Observatory has been installed at VATLY. More recently, the atmospheric muon spectrum was measured in Vietnam for the first time. The phenomenology of neutrino oscillation is also being studied at this laboratory. Indeed, a Vietnamese community for experimental particle physics is developing well – in 2001 a group from the Institute of Physics in Ho Chi Minh City joined the D0 collaboration at Fermilab.

In many areas of research, leading-edge science is expensive and there are few support networks for disadvantaged groups. However, cost-effective projects to investigate the nature of ultra-high-energy cosmic rays (UHECR) are already being developed for high schools and could provide an ideal vehicle for such an effort. These projects demonstrate the basic elements of research and technology, with modern detectors, fast electronics, GPS timing, computerized data acquisition and data analysis. Perhaps just as importantly, they also teach social skills such as collaborative effort, organization, long-term planning and teamwork.

Efforts to bring the developing world into such projects have already begun. For example, the collaboration behind the Mixed Apparatus for Radar Investigation of Cosmic-rays of High Ionization project has established contact with the Maseno University in Kisumu, Kenya, the University of Zambia in Lusaka and the University of Rio de Janeiro in Brazil, to investigate the hypothesis that some forms of lightning are induced by cosmic rays. The collaboration is also working with Rio de Janeiro to deploy detectors that register UHECR showers and meteors in high-school-based receivers.

These are just two examples of the diverse topics related to the “cosmic connection” between research and education in both the developed and developing world. Such topics include not only the astrophysics and particle physics of cosmic rays, but also topics in biology (e.g. the effects of natural radiation), mathematics, computer science and programming, chemistry, and environmental and Earth sciences (e.g. studying the chemistry of ozone and how that could affect the transmission of cosmic rays).

The educational paradigm created by the networks of cosmic-ray arrays in high schools is one that can be employed in many areas. In geophysics, for example, one could use distributed arrays of seismometers to study geological activity over a large area. A specific example is the project BAMBI, which promotes the construction of an amateur array of radio telescopes distributed over a large area to study the radio sky at 4 GHz and search for signs of extraterrestrial intelligence. Such large-area, national and international school-based detector networks could aid and encompass other efforts throughout the world including developing countries, where it could provide entry to the global scientific community.

The post A cosmic vision for world science appeared first on CERN Courier.

]]>
Opinion James Pinfold considers how relatively low-cost experiments to study ultra-high-energy cosmic rays could bring developing countries into frontier research. https://cerncourier.com/wp-content/uploads/2006/05/CCEmul1_05-06.jpg
Trieste seeks participants for new fourth-generation light source https://cerncourier.com/a/trieste-seeks-participants-for-new-fourth-generation-light-source/ https://cerncourier.com/a/trieste-seeks-participants-for-new-fourth-generation-light-source/#respond Tue, 28 Mar 2006 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/trieste-seeks-participants-for-new-fourth-generation-light-source/ Sincrotrone Trieste has announced a call for letters of intent to participate in developing and using a new fourth-generation light source, FERMI@Elettra, operating alongside the present ELETTRA source near Trieste.

The post Trieste seeks participants for new fourth-generation light source appeared first on CERN Courier.

]]>
Sincrotrone Trieste has announced a call for letters of intent to participate in developing and using a new fourth-generation light source, FERMI@Elettra, operating alongside the present ELETTRA source near Trieste. The FERMI@Elettra source will be added to the existing 2.0-2.4 GeV synchrotron and will be one of the first single-pass free-electron laser (FEL) facilities in the world.


FERMI@Elettra will operate in harmonic generation mode at wavelengths in the UV to soft X-ray range. It will initially have two FELs covering the wavelength ranges of 100-40 nm and 40-10 nm. The existing ELETTRA linac will be extended with a new 70 m long klystron gallery; a 65 m shielded undulator hall and a new experimental hall with eight beamlines will also be added. Support laboratories will be built at the end of the chain. The technical design study has been completed, the commissioning of the new booster is planned for summer 2007 and the two FELs are expected to be operational by the end of 2009.

ELETTRA, which is managed by the non-profit organization Sincrotrone Trieste, currently has more than 800 users a year; 86% are from European countries, working on research in physics, chemistry, earth science, and materials and life sciences. Proposals for FERMI@Elettra should be submitted before 30 April. Proponents selected by the international advisors will be involved in developing the scientific-exploitation programme (beam lines, end stations and R&D projects), to be defined by the end of 2006.

The post Trieste seeks participants for new fourth-generation light source appeared first on CERN Courier.

]]>
https://cerncourier.com/a/trieste-seeks-participants-for-new-fourth-generation-light-source/feed/ 0 News Sincrotrone Trieste has announced a call for letters of intent to participate in developing and using a new fourth-generation light source, FERMI@Elettra, operating alongside the present ELETTRA source near Trieste. https://cerncourier.com/wp-content/uploads/2006/03/CCEnew7_04-06.jpg
Auger observatory celebrates progress https://cerncourier.com/a/auger-observatory-celebrates-progress/ https://cerncourier.com/a/auger-observatory-celebrates-progress/#respond Wed, 08 Feb 2006 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/auger-observatory-celebrates-progress/ On 10 November, the Pierre Auger Observatory (PAO) began a major two-day celebration at its headquarters in Malargüe, Argentina, to mark the progress of the observatory and the presentation of the first physics results at the International Cosmic Ray Conference in the summer 2005.

The post Auger observatory celebrates progress appeared first on CERN Courier.

]]>
On 10 November, the Pierre Auger Observatory (PAO) began a major two-day celebration at its headquarters in Malargüe, Argentina, to mark the progress of the observatory and the presentation of the first physics results at the International Cosmic Ray Conference in summer 2005. One of several experiments connecting particle astrophysics and accelerator-based physics, the PAO studies extensive air showers created by primary cosmic rays with energies greater than 10¹⁸ eV. With more than 1000 of the 1600 surface detectors and 18 of the 24 fluorescence detectors currently installed and operating, the observatory will eventually cover 3000 km² of the expansive Pampa Amarilla.


Over 175 visitors from the 15 collaborating countries attended the celebration, with guests including heads of collaborating institutions, representatives from supporting funding agencies, delegates from Argentinian embassies, local and provincial authorities, plus press and media teams. On the first day, experiment heads Jim Cronin, Alan Watson and Paul Mantsch presented the history and status of the observatory to the assembled visitors in Malargüe’s Convention Center. This was followed by a ceremony on the Auger campus to unveil a commemorative monument made of glass and stone. Ceremony speakers included Malargüe’s mayor and the governor of Mendoza Province. Guests then retired to a traditional asado that featured local cuisine and entertainment by folk musicians and tango dancers. On the second day, attendees toured the vast observatory site, including surface detectors on the pampa and one of the remote fluorescence detector buildings.

As part of the celebration, the collaboration sponsored a science fair in the observatory’s Assembly Building, organized by four local science teachers for teachers and students from high schools in Mendoza Province. Twenty-nine school groups, many travelling long distances to reach Malargüe, presented research projects on topics in physics, chemistry or technology. A team of PAO physicists judged the displays and awarded prizes to the most outstanding young scientists. In March 2006, the opening of a new high school in Malargüe is anticipated, partial funding for which was secured by Cronin from the Grainger Foundation in the US.

The post Auger observatory celebrates progress appeared first on CERN Courier.

]]>
https://cerncourier.com/a/auger-observatory-celebrates-progress/feed/ 0 News On 10 November, the Pierre Auger Observatory (PAO) began a major two-day celebration at its headquarters in Malargüe, Argentina, to mark the progress of the observatory and the presentation of the first physics results at the International Cosmic Ray Conference in the summer 2005. https://cerncourier.com/wp-content/uploads/2006/02/CCEnew5_01-06-feature.jpg
PHYSTAT: making the most of statistical techniques https://cerncourier.com/a/phystat-making-the-most-of-statistical-techniques/ https://cerncourier.com/a/phystat-making-the-most-of-statistical-techniques/#respond Wed, 08 Feb 2006 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/phystat-making-the-most-of-statistical-techniques/ Statistics has always been an essential tool in experimental particle physics, and today this is truer than ever. In the early days emulsions and bubble-chamber photographs were scanned slowly by hand; now modern electronic detectors perform equivalent processing quickly and automatically. However, physicists still classify and count their events and then, eagerly or reluctantly, turn […]

The post PHYSTAT: making the most of statistical techniques appeared first on CERN Courier.

]]>
Statistics has always been an essential tool in experimental particle physics, and today this is truer than ever. In the early days emulsions and bubble-chamber photographs were scanned slowly by hand; now modern electronic detectors perform equivalent processing quickly and automatically. However, physicists still classify and count their events and then, eagerly or reluctantly, turn to statistical methods to decide whether the numbers are significant and what results are valid.

As the subject has progressed, new themes have emerged. The high numbers of events obtained by CERN’s Large Electron-Positron collider (a Z-factory), the B-factories PEP-II and KEKB at SLAC and KEK respectively, and the experiments at DESY’s HERA collider, mean that statistical errors below 1% are now common. Many areas have become dominated by systematic effects, a relatively untrodden and much less well understood field.

On the theoretical side, the high precision of theories such as quantum electrodynamics and quantum chromodynamics means that the tiny uncertainties in their predictions have to be carefully studied and understood. Supersymmetry and other “new physics” models predict signals that depend on several parameters of the theory, and when an experiment fails to see such a signal the restrictions this places on possible values for these parameters have to be worked out. When different experiments probe the same basic theory, we need to evaluate the combined implications of their results.

The science of statistics is also developing fast. The availability of large amounts of processing power opens new possibilities for evaluating statistical models and their predictions. Bayesian statistics is a rapidly growing field in which a great deal of progress has been made. Machine learning techniques, such as artificial neural networks and decision trees, are flourishing, with further applications continually being found that open up new possibilities for exploiting data.
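
Bayesian methods are often contrasted with the classical (frequentist) approach. To make the contrast concrete – as a toy example only, not a recommendation from these workshops – the following Python sketch computes 95% confidence-level upper limits on a signal in a Poisson counting experiment with a known background, once in the classical way and once from a Bayesian posterior with a flat prior. The three observed events and the background of 1.2 are invented numbers.

import math

def poisson_pmf(k, mu):
    return math.exp(-mu) * mu ** k / math.factorial(k)

def frequentist_limit(n, b, cl=0.95):
    """Classical upper limit: smallest s with P(N <= n | s + b) <= 1 - cl."""
    lo, hi = 0.0, 50.0
    for _ in range(60):                      # bisection on the signal mean s
        mid = 0.5 * (lo + hi)
        p = sum(poisson_pmf(k, mid + b) for k in range(n + 1))
        lo, hi = (mid, hi) if p > 1 - cl else (lo, mid)
    return hi

def bayesian_limit(n, b, cl=0.95, smax=50.0, steps=20000):
    """Upper limit from the posterior for s with a flat prior on s >= 0."""
    ds = smax / steps
    grid = [(i + 0.5) * ds for i in range(steps)]
    post = [poisson_pmf(n, s + b) for s in grid]   # flat prior: likelihood only
    total = sum(post)
    running = 0.0
    for s, p in zip(grid, post):
        running += p
        if running >= cl * total:
            return s
    return smax

n_obs, bkg = 3, 1.2                               # illustrative inputs
print("frequentist 95%% UL: %.2f" % frequentist_limit(n_obs, bkg))
print("Bayesian 95%% UL:    %.2f" % bayesian_limit(n_obs, bkg))

With these inputs the two limits come out numerically close, but they answer subtly different questions – precisely the kind of distinction debated at these meetings.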

Astronomers and cosmologists are also developing the power and sophistication of their statistical techniques. Telescopes are becoming larger and more powerful, and the readout from their instruments with charge-coupled detectors produces a torrent of data. Observations at different wavelengths, from gamma rays to radio waves, from ground-based observatories and satellites are combined to yield clues about the nature of distant objects, the processes that power them and other features of the universe. Details of the distribution of the cosmic microwave background will, when properly interpreted, tell us what happened in the Big Bang at energies beyond the reach of man-made accelerators.

The PHYSTAT series of conferences and workshops provide a forum in which different communities can meet and exchange ideas. A first workshop of particle physicists at CERN in 2000 was followed by one at Fermilab in 2001, and then a full conference in Durham in 2002, which benefited from the presence of statisticians as well as physicists. At SLAC in 2003, astronomers and cosmologists were included (see CERN Courier March 2004 p22). This was so successful that it was repeated at the most recent conference, “Statistical Problems in Particle Physics, Astrophysics and Cosmology”, held in Oxford in September 2005 and organized by Louis Lyons.

PHYSTAT 2005 consisted of a wide-ranging programme of parallel and plenary talks. One of the most influential statistical thinkers of the 20th century, David Cox of Oxford University, gave the opening keynote speech, in which he provided an authoritative account of the Bayesian and frequentist approaches to inference. The official programme was supplemented by intense discussions in corridors, coffee lounges and local pubs, as the participants thrashed out ideas that ranged from the philosophical abstractions of the meaning of probability to the pragmatic and technical details of different computer systems.

These techniques are being fed back into the community through the activities of the participants, many of whom are active in analysis on various experiments, through further meetings (a follow-up afternoon meeting in Manchester attracted 80 particle physicists from the UK), through the academic training programmes offered at CERN and other laboratories, and through graduate conferences and summer schools. Plans are developing for a repository of software that performs these increasingly sophisticated statistical tests. Further workshops are planned for 2006 and beyond.

Further reading

More information can be found at www.physics.ox.ac.uk/phystat05/ and at www.pa.msu.edu/people/linnemann/stat_resources.html.

The post PHYSTAT: making the most of statistical techniques appeared first on CERN Courier.

]]>
https://cerncourier.com/a/phystat-making-the-most-of-statistical-techniques/feed/ 0 Feature
ILC comes to Snowmass https://cerncourier.com/a/ilc-comes-to-snowmass/ https://cerncourier.com/a/ilc-comes-to-snowmass/#respond Fri, 25 Nov 2005 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/ilc-comes-to-snowmass/ In August nearly 700 scientists and engineers from North America, Asia and Europe got together at Snowmass in the US to advance the design of the International Linear Collider and its detectors, and to refine the physics case for this next-generation machine.

The post ILC comes to Snowmass appeared first on CERN Courier.

]]>
In August 2004 the Executive Committee of the American Linear Collider Physics Group (ALCPG), galvanized by the technology choice for the future International Linear Collider (ILC), decided to host an extended international summer workshop to further the detector designs and advance the physics arguments. Subsequently, the International Linear Collider Steering Committee (ILCSC) elected to hold their Second ILC Accelerator Workshop in conjunction with the Physics and Detector Workshop. Ed Berger of Argonne and Uriel Nauenberg of Colorado were selected to co-chair the organizing committee for this joint workshop, which was held at Snowmass, Colorado, US, for two weeks in August. ALCPG co-chairs Jim Brau of Oregon and Mark Oreglia of Chicago, along with accelerator community representatives Shekhar Mishra from Fermilab and Nan Phinney from SLAC, rounded out the committee. While hosted by the North American community, the workshops were planned with worldwide participation in all the advisory committees and in the scientific programme committees for the accelerator, detector, physics and outreach activities.


As Berger described in the opening address, the primary accelerator goals at Snowmass were to define an ILC Baseline Configuration Document – to be completed by the end of 2005 – and to identify critical R&D topics and timelines. On the detector front, the goal was to develop detector design studies with a firm understanding of the technical details and physics performance of the three major detector concepts, the required future R&D, test-beam plans, machine-detector interface issues, beamline instrumentation and cost estimates. The physics goals were to advance and sharpen ILC physics studies, including precise higher-order calculations, synergy with the physics programme of CERN’s Large Hadron Collider (LHC), connections to cosmology, and, very importantly, relationships to the detector designs. A crucial fourth goal was to facilitate and strengthen the broad participation of the scientific and engineering communities in ILC physics, detectors and accelerators, and to engage the greater public in this exciting work.

A rich new world

Over the past few years, prestigious panels in Europe (the European Committee for Future Accelerators – ECFA), Asia (the Asian Committee for Future Accelerators – ACFA) and the US (the High Energy Physics Advisory Panel – HEPAP) have reached an unprecedented consensus that the next major accelerator for world particle physics should be a 500 GeV electron-positron linear collider with the capability of extension to higher energies. This machine would be ideal for exploiting the anticipated discoveries at the LHC and would also have its own unique discovery capabilities. The ability to control the collision energy, polarize one or both beams, and measure cleanly the particles produced will allow the linear collider to zero in on the crucial features of a rich new world that Peter Zerwas of DESY described on the first day of the workshop, which might include Higgs bosons, supersymmetric particles and evidence of extra spatial dimensions.

This physics programme dictates specific requirements for the detectors and for the accelerator design. As the ILC community turns increasingly to design and engineering, there was considerable activity in the physics groups to formulate these requirements concretely. Early in the workshop, an international panel set up this spring presented a proposed list of benchmark processes to be used in optimizing the ILC detector designs. This brought a new flavour to the physics discussions – one that will continue in future work on physics at the ILC.


This influence was felt most strongly in the working groups on Higgs physics and supersymmetry. Precision electroweak data predict that the neutral Higgs boson will be observed within the initial energy reach of the ILC, which will provide a microscope to study the whole range of possible Higgs boson decays and measure coupling strengths to the percent level. To accomplish this goal, the ILC detectors must have significantly better performance in several respects than those at CERN’s Large Electron-Positron collider (LEP). In contrast with the quite specific implications of Higgs boson physics, the idea of supersymmetry encompasses various models with diverse implications. Some of the signatures of supersymmetry will be studied at the LHC, but the problem of understanding the exact nature of any new physics will be a difficult one. Through the study of a diverse set of specific parameter sets for supersymmetry, work done at Snowmass showed that the ILC experiments could address this problem robustly, and the necessary detector performances were specified.


Precision is crucial

The precision of the ILC experiments should be supported by equally precise theoretical calculations. Among those discussed at the workshop were Standard Model analyses, including higher-order contributions in quantum chromodynamics, calculations of radiative corrections to the key Higgs boson production processes, and precision calculations within models of new physics. The Supersymmetry Parameter Analysis project, presented at Snowmass, proposes a convention for the parameters for supersymmetry models from which observables can be computed to the part-per-mille level for unambiguous comparison of theory and experiment. The fourth in the series of LoopFest conferences on higher-order calculations took place during the Snowmass workshop, the highlight this year being a presentation of new twistor-space methods for computing amplitudes for emission of very large numbers of gluons and other massless particles. New calculations of the process e⁺e⁻ → tt̄h showed that higher-order corrections enhance this process by a factor of two near threshold, making it possible for the 500 GeV ILC to obtain a precise measurement of the top quark Yukawa coupling.


The capabilities of the ILC will make it possible to explore new models, which include Higgs sectors with CP violation (for which the ILC offers specific probes of quantum numbers), and models with a “warped” extra dimension, which predict anomalies in the top quark couplings that can be seen in tt̄ production just above threshold.

Many of the discussions of new physics highlighted the connections to current problems of cosmology. Supersymmetry and many other models of new physics contain particles that could make up (at least part of) the cosmic dark matter. If these models are correct, dark-matter candidates will be produced in the laboratory at the LHC. Studies at Snowmass showed how precise measurements at the ILC could be used to verify whether these particles have the properties required to account for the densities and cross-sections of astrophysical dark matter. Here all the strands of ILC physics – exotic models, precision calculations and incisive experimental capabilities – could combine to provide physical insight that can be obtained in no other way.

The accelerator design effort

In August 2004 the International Technology Recommendation Panel concluded that the ILC should be based on superconducting radio-frequency accelerating structures. This recommendation has been universally adopted as the basis for the ILC project, now being coordinated via the Global Design Effort (GDE), led by Barry Barish from Caltech. At Snowmass, the accelerator experts carried the baton from the successful launch of the ILC design effort at the first ILC workshop at KEK in Japan in November 2004. Snowmass also provided the forum for the first official meeting of the GDE. The working groups established for the first ILC workshop at KEK formed the basis of the organizing units through Snowmass. In addition, six global groups were formed to work towards a realistic reference design: Parameters, Controls & Instrumentation, Operations & Availability, Civil & Siting, Cost & Engineering, and Options.

Sources of electrons and positrons are the starting points of the accelerator chain. The successful production of intense beams of polarized electrons at the SLAC Linear Collider (SLC) between 1992 and 1998 demonstrated the best mechanism for producing electrons. When polarized laser light is fired at special cathode materials, electrons are produced with their spin vectors aligned, with polarization of up to 90% achieved in the laboratory. The ability to select the “handedness” of the beam is an incisive capability that will allow probes of the left- or right-handed nature of the couplings of new particles, such as those in supersymmetric models.
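
As a hedged illustration of why beam "handedness" is so valuable – this is not an official ILC analysis – the short Python sketch below extracts a left–right asymmetry from event counts taken with left- and right-handed electron beams, correcting for an imperfect beam polarization. The counts and the 80% polarization are invented numbers, and the uncertainty on the polarization itself is ignored.

import math

def left_right_asymmetry(n_left, n_right, polarization):
    """Measured asymmetry corrected for beam polarization P < 1.
    Returns (A_LR, statistical uncertainty)."""
    total = n_left + n_right
    a_raw = (n_left - n_right) / total
    a_lr = a_raw / polarization
    # Binomial statistics on the raw asymmetry, scaled by 1/P.
    sigma = math.sqrt((1.0 - a_raw ** 2) / total) / polarization
    return a_lr, sigma

# Hypothetical counts with left- and right-handed 80%-polarized electron beams.
a, da = left_right_asymmetry(n_left=52300, n_right=47700, polarization=0.80)
print("A_LR = %.4f +/- %.4f" % (a, da))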

As well as the positron production systems used previously, other approaches are being studied to achieve polarized beams. One involves passing the high-energy electron beam through the periodic magnetic field provided by an “undulator”, similar to those used at synchrotron light sources. The intense photon beams radiated by the undulating electrons can be converted in a thin target into electron-positron pairs. A second method involves boosting the energies of photons produced in laser beams by Compton back-scattering them from electrons, and then similarly converting the boosted photons to yield positrons. If the intermediate photons are polarized, both of these methods allow polarized positron production.

The electron and positron beams produced must be “cooled” in so-called damping rings, in which their transverse size is reduced via synchrotron radiation during several hundred circuits. A few different designs are being studied for these rings. Challenges include precise component alignment and the high degree of stability required for low emittance, while minimizing collective effects that can blow up the beams.

Most of the length of the linear collider, some 20 km or so, will be devoted to accelerating the electron and positron beams in two opposing linacs. The debate at Snowmass centred on critical issues, such as the operating choice for the accelerating voltage gradient in the superconducting niobium cavities and the choice of advanced technologies that must be used to power the cavities. The details of the shape and surface preparation of the cavities are among the issues that affect the gradient that can be supported. Larger radii of curvature of the cavity lobes are desirable to reduce peak surface electric fields that can induce breakdown. Also, advanced surface preparation techniques such as electropolishing are being refined, and cavities are being produced and tested by strong international teams at regional test facilities. Based on experience to date, a draft recommendation was reached for a mean initial operating gradient of around 31 MV per metre. Each linac would then need to be just over 10 km long to reach the initial target centre-of-mass energy of 500 GeV.
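
The arithmetic behind that length is simple enough to sketch. In the toy calculation below, the 31 MV/m figure is the draft recommendation quoted above, while the 75% "fill factor" – the fraction of each linac actually occupied by accelerating structures – is an illustrative assumption rather than an official parameter.

# Back-of-the-envelope check of the quoted linac length; a sketch only.
beam_energy_gev = 250.0          # half of the 500 GeV centre-of-mass energy
gradient_mv_per_m = 31.0         # draft-recommended mean operating gradient
fill_factor = 0.75               # assumed, not an official ILC number

active_length_km = beam_energy_gev * 1000.0 / gradient_mv_per_m / 1000.0
tunnel_length_km = active_length_km / fill_factor
print("active accelerating length: %.1f km" % active_length_km)   # ~8.1 km
print("length per linac:           %.1f km" % tunnel_length_km)   # ~10.8 km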

Similar expert attention was devoted to the modulators, klystrons and distribution systems that convert “wall-plug” power into the high-power (10 MW) millisecond-long pulses applied to the cavities. Industrial companies in Europe, Asia and the US have developed prototype klystrons for this purpose. These are in use at the TESLA Test Facility at DESY, which provides a working prototype linac system. Several innovative ideas for solid-state modulators or more compact klystrons are also being explored with industry.

Once at their final energy, the beams must be carefully focused and steered into collision. The collision point lies at the ends of the two linacs and encompasses the interaction region, including the detector(s). The working recommendation, defined at the workshop at KEK, is to consider two interaction regions, each with one detector. Many important ramifications were discussed at Snowmass. For example, the current plan calls for the beams to be brought into the interaction region with a small horizontal crossing angle of either 2 or 20 mrad. In either case the final-focus magnets must be carefully designed to be compact and stable with respect to vibrations that could be transferred to beam motion. A detailed engineering design is being prepared, which will also include beam-steering feedback systems to maintain the beams in collision and optimize the luminosity. Intermediate values for the crossing angle, such as 14 mrad, are also under study.

Of no less importance is the need to remove the spent beams safely from the interaction region and transport them to the beam dumps. As each beam carries an equivalent of several megawatts of power, the design must allow the necessary clearances, and be capable of being aborted safely in the event of equipment failure. The “machine protection” system and beam dumps remain subjects for active R&D. Many crucial diagnostic systems for measuring the beam energy, polarization and luminosity will be based in the extraction lines, and excellent progress was made in defining the locations and configurations of the necessary instrumentation.

The GDE will build on the consensus reached at Snowmass and produce an accelerator Baseline Configuration Document (BCD) by the end of 2005. As Nick Walker from DESY summarized at the end of the workshop, the BCD will define the most important layout and technology choices for the accelerator. For each subsystem a baseline technology will be specified, along with possible alternatives which, with further R&D, will offer the promise to reduce the cost, minimize the risk or further optimize the performance of the ILC. The engineering details of the baseline design will then be refined and costed. A Reference Design Report will follow at the end of 2006. This will represent a first “blueprint” for the ILC, paving the way for a subsequent effort to achieve a fully engineered technical design.

Detector concepts

The Snowmass workshop was an important opportunity for proponents of the three major detector-concept studies to work together on their detector designs. They are planning to draft detector outline documents before the next Linear Collider Workshop (LCWS06) in Bangalore in March 2006. Detector capabilities are challenged by the precision physics planned at the ILC. The environment is relatively clean, but the detector performance must be two to ten times better than at LEP and the SLAC Linear Collider. Details of tracking, vertexing, calorimetry, software algorithms and other aspects of the detectors were discussed vigorously.

The three major international detector concepts rely on a “particle flow” approach in which the energy of jets is measured by reconstructing individual particles. This technique can be much more precise than the purely calorimetric approach employed at hadron colliders like the LHC. In a typical jet, 70% of the energy consists of hadrons, which are measured with only moderate resolution in the hadron calorimeter, while 30% consists of photons, which are measured with much better precision in the electromagnetic calorimeter. Typically about 60% of the jet energy is carried by charged hadrons, which can be measured precisely with the tracking system. The hadron calorimeter is thus relied on only for the roughly 10% carried by neutral hadrons. For the particle-flow approach, it is necessary to separate the charged and neutral particles in the calorimeters, where the showers often overlap or lie very close to each other. Separation of the showers is accomplished differently in each of the detector concepts, trading off detector radius, magnetic-field strength and granularity of the calorimeter.

A specialized group worked on the development of the particle-flow algorithms. A conventional shower-reconstruction algorithm tends to combine the showers of different hadrons, but more sophisticated software should be able to separate them based on the substructure of the showers. At present the energy resolution of jets is still limited by confusion in the reconstruction, but significant progress was achieved at Snowmass, with optimism that a resolution of 30%/√E can be reached.
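
The arithmetic behind the particle-flow argument can be sketched in a few lines of Python. The energy fractions below follow the text, while the single-particle calorimeter resolutions (15%/√E electromagnetic, 55%/√E hadronic) and the assumption of a negligible tracker contribution are illustrative values, not design figures for any of the concepts.

import math

def ideal_pflow_resolution(e_jet_gev):
    """Jet energy resolution (sigma/E) if every particle is correctly assigned."""
    f_charged, f_photon, f_neutral_had = 0.60, 0.30, 0.10   # fractions from the text
    sigma2 = 0.0                                            # tracker term: negligible
    sigma2 += (0.15 ** 2) * f_photon * e_jet_gev            # assumed ECAL stochastic term
    sigma2 += (0.55 ** 2) * f_neutral_had * e_jet_gev       # assumed HCAL stochastic term
    return math.sqrt(sigma2) / e_jet_gev

for e in (45.0, 100.0, 250.0):
    print("E = %5.0f GeV: sigma/E = %.1f%%" % (e, 100 * ideal_pflow_resolution(e)))

The resulting "ideal" resolution of roughly 19%/√E is comfortably better than the 30%/√E goal quoted above, which is why confusion in assigning showers, rather than intrinsic calorimeter performance, is the limiting factor.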

In the Silicon Detector Concept (SiD) the goal is a calorimeter with the best possible granularity, consisting of a tungsten absorber and silicon detectors. To make this detector affordable, a relatively small inner calorimeter radius of 1.3 m is chosen. Shower separation and good momentum resolution are achieved with a 5 T magnetic field and very precise silicon detectors for charged particle tracking. The fast timing of the silicon tracker makes SiD a robust detector with respect to backgrounds.

The Large Detector Concept (LDC), derived from the detector described in the technical design report for TESLA, uses a somewhat larger radius of 1.7 m. It also plans a silicon-tungsten calorimeter, possibly with a somewhat coarser granularity. For charged particle tracking, a large time-projection chamber (TPC) is planned to allow efficient and redundant particle reconstruction. The larger radius is needed to achieve the required momentum resolution.

The GLD concept chooses a larger radius of 2.1 m to take advantage of a separation of showers just by distance. It uses a calorimeter with even coarser segmentation and gaseous tracking similar to the LDC. Progress at Snowmass on the GLD, LDC and SiD concepts was summarized at the end of the workshop by Yasuhiro Sugimoto of KEK, Henri Videau of Ecole Polytechnique and Harry Weerts of Argonne, respectively. A fourth concept was introduced at Snowmass, one not relying on the particle-flow approach.

A common challenge for all detector concepts is the microvertex detector. Physical processes to be studied at the ILC require tagging of bottom and charm quarks with unprecedented efficiency and purity, as well as of tau leptons. This task is complicated by backgrounds from the interacting beams and the long bunch trains during which readout is difficult. The detectors must be extremely precise, and also extremely thin, to avoid deflection of low-momentum particles and deterioration of interesting information. Several technologies are under discussion, all employing a “pixel” structure based on the excellent experience of the SLD vertex detector at SLAC, ranging from charge-coupled devices (CCDs) and complementary metal-oxide semiconductor (CMOS) sensors used in digital cameras, to technologies that improve on the ones already used for the LHC.

In the gaseous tracking groups much discussion centred on methods to increase the number of points in the TPC, compared with LEP experiments. A possibility is to use gas electron-multiplier foils, a technology that was developed at CERN for the LHC detectors. Micromesh gaseous structure detectors, or “micromegas”, are another option, pursued mainly in France. The availability of test beams is crucial for advancing detector designs. TPC tests have been performed at KEK, and a prototype of the electromagnetic calorimeter has been tested at DESY. Further tests are also planned at the test-beam facilities at CERN and Fermilab.

As befitting a workshop with both detector and accelerator experts present in force, discussion of the machine-detector interface issues played a big role. The layout of the accelerator influences detectors in many ways – for example, beam parameters determine the backgrounds, the possible crossing angle of the beams affects the layout of the forward detectors, and the position of the final focus magnets dictates the position of important detector elements. All of these parameters have to be optimized by accelerator and detector experts working in concert. At a well attended plenary “town meeting” one afternoon, several speakers debated “The Case for Two Detectors”. Issues included complementary physics capabilities, cross-checking of results, total project cost and two interaction regions versus one.

Outreach, communication and education

A special evening forum on 23 August addressed “Challenges for Realizing the ILC: Funding, Regionalism and International Collaboration”. Eight distinguished speakers, representing committees and funding agencies with direct responsibility for the ILC, shared their wisdom and perspectives: Jonathan Dorfan (chairman of the International Committee for Future Accelerators, ICFA), Fred Gilman (HEPAP), Pat Looney (formerly of the US Office of Science and Technology Policy), Robin Staffin (US Department of Energy), Michael Turner (US National Science Foundation), Shin-ichi Kurokawa (ACFA chair and incoming ILCSC chair), Roberto Petronzio (Funding Agencies for the Linear Collider) and Albrecht Wagner (incoming ICFA chair). The brief presentations were followed by animated questions and comments from many in the audience.

Educational activities played a prominent role in the Snowmass workshop. Reaching out to particle experimenters and theorists, the accelerator community ran a series of eight lunchtime accelerator tutorials. Of broader interest to the general public were a dark-matter café and quantum-universe exhibit in the Snowmass Mall, a Workshop on Dark Matter and Cosmic-Ray Showers for high-school teachers, and a cosmic-ray-shower study in the Aspen Mall. Two evening public lectures attracted many residents and tourists, with Young-Kee Kim talking on “E = mc²”, and Hitoshi Murayama on “Seeing the Invisibles”. A physics fiesta took place one Sunday in a secondary school in Carbondale, where physicists and teachers from the workshop engaged children in hands-on activities.

Communication was also on the Snowmass agenda, and the communications working group defined a strategic communication plan. During the workshop a new website, www.linearcollider.org, was launched together with ILC NewsLine, a new weekly online newsletter open to all subscribers.

• Proceedings of the Snowmass Workshop will appear on the SLAC Electronic Conference Proceedings Archive, eConf.

The post ILC comes to Snowmass appeared first on CERN Courier.

]]>
https://cerncourier.com/a/ilc-comes-to-snowmass/feed/ 0 Feature In August nearly 700 scientists and engineers from North America, Asia and Europe got together at Snowmass in the US to advance the design of the International Linear Collider and its detectors, and to refine the physics case for this next-generation machine. https://cerncourier.com/wp-content/uploads/2005/11/CCEilc1_12-05-feature.jpg
New exhibition unites people and ideas https://cerncourier.com/a/new-exhibition-unites-people-and-ideas/ https://cerncourier.com/a/new-exhibition-unites-people-and-ideas/#respond Wed, 02 Nov 2005 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/new-exhibition-unites-people-and-ideas/ The 1st European Research and Innovation Exhibition, held in Paris, attracted 24,000 visitors. Astrophysicist Jean Audouze, chairman of the exhibition's Scientific Committee, talked to Beatrice Bressan about the event's objectives and its impact.

The post New exhibition unites people and ideas appeared first on CERN Courier.

]]>
The 1st European Research and Innovation Exhibition – the Salon Européen de la Recherche et de l’Innovation – took place in Paris on 3-5 June 2005 under the patronage of Jacques Chirac, president of France. The aim of the exhibition, which is to become an annual event, is to provide a place for players from a broad sector of activities to come together, creating a crossroads where people and ideas from both the public sector and the corporate world can meet. This year, the 130 exhibitors included CERN, the Institut National de Physique Nucléaire et de Physique des Particules (IN2P3) of the Centre National de la Recherche Scientifique (CNRS), and the Dapnia laboratory of the Commissariat à l’Energie Atomique, who together presented a stand showing examples of technology transfer.


Jean Audouze, senior CNRS researcher, is the founder and chairman of the exhibition’s Scientific Committee. Consisting of scientific leaders in the world of research and innovation, this committee is responsible for the programme of events, in particular conferences and round-table discussions. Audouze himself has had a great deal of experience in communicating physics on the highest and broadest levels, as scientific adviser to the president of France (1989-1993) and as director of Paris’s well known science museum, the Palais de la Découverte (1998-2004).

How would you describe the role of research today, in the World Year of Physics?

Research is the driving force behind economic, cultural and social progress. The French government, like other European political leaders, has set a goal of devoting 3% of gross domestic product to research and development by 2010. Together, France and Europe are actively preparing for the future to meet the dynamic momentum of countries like the US, China and Japan, with whom competition is already very fierce. According to the OECD [Organization for Economic Co-operation and Development], gross domestic expenditure on research and development by member countries amounted to over $650 billion in 2001. The countries of the European Union contributed about $185 billion of this amount. France spent about $31 billion on research and development, which places it in second position in Europe and fourth worldwide, behind the US, Japan and Germany. Many researchers have started their own companies since 1999. Business incubators are playing a crucial role in the development of new companies, assisted by organizations that provide financing specifically for the creation of innovative companies. The biotechnology and nanotechnology sectors are at present leading in terms of the creation of new businesses.

Can you explain the event’s objectives?

The exhibition combined information from fundamental research with its applications. It provided an opportunity for researchers, public and private institutions, universities and the top engineering and business schools in France (les grandes écoles), industrial and commercial companies, R&D departments, incubators, financing organizations, laboratory suppliers, local governments, technology parks (technopoles), research associations and foundations to meet. They could present their activities, develop contacts to encourage professional development, discuss the establishment of new projects, start new partnerships, and negotiate financing for new businesses or research programmes.


What was the outcome of the three days?

The overall outcome is very positive. A total of around 24,000 people attended the event. In addition, the presence of many visitors at the conferences, at the round tables concerning the European research programme and the diffusion of scientific culture in Europe, and at events with the participation of the Nobel laureates in physics has shown the public’s strong interest in scientific topics.

How have politicians reacted to the measures required to maximize the value of scientific research?

The politicians have responded well to the scientists’ needs; indeed, a few programmes have received specific financing allocations. They appreciated the creative way the technological developments were presented to the public, and the debates on social impact, which raised awareness of the importance of science for everyday life.

What is the outlook for continuing the dialogue in research, education and industrial promotion?

The perspective for the future is to make this event an annual rendezvous with the participation of other European institutions and national stands.

The World Year of Physics 2005 is an international celebration of physics. Events throughout the year have been highlighting the vitality of physics and its importance in the coming millennium, and have commemorated Einstein’s pioneering contributions in 1905. How can the World Year of Physics bring the excitement and impact of physics, science and research to the public?

I am convinced that the World Year of Physics has been a success in terms of popularizing physics and in conveying enthusiasm for the subject among a large public. In each country, and especially in France, many very exciting events were set up with that goal and have attracted quite big audiences. We astrophysicists have a project to make 2009, the 400th anniversary of the use of the astronomical lens by Galileo, the World Year of Astronomy and Astrophysics.

How can worldwide collaborations and fundamental research laboratories such as CERN, CNRS and Dapnia inspire future generations of scientists?

This inspiration is induced by at least two factors: first, CERN, CNRS and Dapnia are involved in the most exciting aspects of fundamental research, e.g. the very nature of matter and the universe; second, their research programmes are planned for the coming decades: the forthcoming operation of the Large Hadron Collider at CERN and projects like VIRGO (which aims to detect gravitational waves) for CNRS and Dapnia should be very enticing for European newcomers to science.

• The CERN, IN2P3 and Dapnia stand showed examples of technology transfer and was prepared by CERN’s Technology Transfer and Communication groups. In addition, CERN’s Daniel Treille gave a talk entitled “Miroirs brisés, antimatière disparue, matière cachée: le CERN mène l’enquête” (“Broken mirrors, vanished antimatter, hidden matter: CERN investigates”).

The post New exhibition unites people and ideas appeared first on CERN Courier.

]]>
https://cerncourier.com/a/new-exhibition-unites-people-and-ideas/feed/ 0 Feature The 1st European Research and Innovation Exhibition, held in Paris, attracted 24,000 visitors. Astrophysicist Jean Audouze, chairman of the exhibition's Scientific Committee, talked to Beatrice Bressan about the event's objectives and its impact. https://cerncourier.com/wp-content/uploads/2005/11/CCEint2_11-05.jpg
Collaboration without borders https://cerncourier.com/a/viewpoint-collaboration-without-borders/ Mon, 22 Aug 2005 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-collaboration-without-borders/ Barry Barish asks how the particle-physics community can continue to foster its hallmark of fruitful international collaboration.

The post Collaboration without borders appeared first on CERN Courier.

]]>
One of the things we do well in particle physics is collaborate with each other and internationally to carry out our science. Of course, individual and small collaborations are a trademark of modern scientific research and many of us have developed lifelong colleagues and friends from different cultures through our scientific interactions. Even during times when political situations have been constraining, scientific contacts have been maintained that have helped to break down those barriers. For example, during the heat of the Cold War, personal interactions were greatly hampered, yet scientific bonds persevered and some of those connections provided crucial ongoing contacts between Western and Soviet societies.

When I was a student at the University of California, Berkeley, I did my PhD research at the “Rad Lab”, now called Lawrence Berkeley National Laboratory. I recall being immediately surprised both by the number of foreign or foreign-born scientists at the lab and by how little it seemed to matter. For me, having grown up as a local Californian boy, this was an eye-opening experience and a terrific opportunity to learn about other cultures, customs and views of the world. After a while, I pretty much took it for granted that we scientists accept and relate to each other in ways that are essentially independent of our backgrounds. However, it is worth reminding ourselves that this is not the case in most of society and that we are the exception, not the rule.

I have often wondered what it is that unifies scientists. How can we work together so easily, when cultural, political and societal barriers inhibit that for most of society? After all, hostilities between countries and cultures seem to continue as an almost accepted part of our modern existence. I won’t theorize here on what enables scientists to work together and become colleagues and friends without regard to our backgrounds. Instead, I would like to briefly explore whether the nature of how we collaborate will, or should, change.

Particle physics is increasingly focused on the programmes at our big laboratories that house large accelerators, detectors and support facilities. These laboratories have essentially come to represent a distributed set of centres for high-energy physics, from which the intellectual and technical activities emanate and where physicists go to interact with their colleagues. Fermi National Accelerator Laboratory (Fermilab), Stanford Linear Accelerator Center (SLAC), the High Energy Accelerator Research Organization (KEK) and Deutsches Elektronen-Synchrotron (DESY) are examples of national laboratories that play this role in the US, Japan and Germany. CERN is a different example of a successful regional laboratory that has provided Europe with what is arguably the leading laboratory in the world for particle physics and with a meeting place for physicists from Europe and beyond.

One essential ingredient in the success of particle physics is that the accelerator facilities at the large laboratories have been made open to experimentalists worldwide without charge. This principle was espoused by the International Committee on Future Accelerators (ICFA) and, I believe, it has been crucial to widening participation.

It is interesting to contemplate how international collaboration might evolve as we go beyond the regional concept to a global one, like the International Linear Collider (ILC). The organizational principles for building and operating the ILC are not yet defined, but the idea is to form a partnership between Asian, American and European countries. Such an arrangement is already in place for the accelerator and detector R&D efforts. The general idea is to site the ILC near an existing host laboratory, to take advantage of the support facilities. However, the project itself will be under shared management by the international stakeholders. The experiments are expected to consist of large collider detectors similar to those at present colliders, but with some technical challenges that will require significant detector R&D over the next few years.

As we plan for the ILC, we want to ensure that we create a facility that will be available to the world scientific community. What needs to be done to ensure that we maintain the strong collaborative nature of our research and how do we create a true centre for the intellectual activities of our community? What should we require of the host country to assure openness to the laboratory and its facilities? How can we best include the broad community in the decision-making that will affect the facilities that are to be built? Is it time to consider new forms of detector collaboration and/or should we contemplate making the data from the detectors available to the broader community after an initial period (as in astronomy)?

I raise such questions only as examples, not to imply that we should change the way we do business, but to encourage us to think hard about how we can create an exciting international facility that will best serve our entire community and enable productive and broad collaboration to continue in science.

The post Collaboration without borders appeared first on CERN Courier.

]]>
Opinion Barry Barish asks how the particle-physics community can continue to foster its hallmark of fruitful international collaboration. https://cerncourier.com/wp-content/uploads/2005/08/CCEvie1_09-05.jpg
SLAC reorganizes and prepares for next major breakthroughs https://cerncourier.com/a/slac-reorganizes-and-prepares-for-next-major-breakthroughs/ https://cerncourier.com/a/slac-reorganizes-and-prepares-for-next-major-breakthroughs/#respond Sun, 17 Jul 2005 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/slac-reorganizes-and-prepares-for-next-major-breakthroughs/ On 24 May, Jonathan Dorfan, director of the Stanford Linear Accelerator Center (SLAC), announced a complete reorganization of the structure and senior management of the laboratory, which Stanford University has operated for more than 40 years for the US Department of Energy.

The post SLAC reorganizes and prepares for next major breakthroughs appeared first on CERN Courier.

]]>
On 24 May, Jonathan Dorfan, director of the Stanford Linear Accelerator Center (SLAC), announced a complete reorganization of the structure and senior management of the laboratory, which Stanford University has operated for more than 40 years for the US Department of Energy. The new organizational structure is built around four divisions: Photon Science, Particle and Particle Astrophysics, Linac Coherent Light Source (LCLS) Construction, and Operations.

“One thing that is recurrent in world-class scientific research is change,” Dorfan said. “Recognizing new science goals and discovery opportunities, and adapting rapidly to exploit them efficiently, cost-effectively and safely is the mark of a great laboratory. Thanks to the support of the Department of Energy’s Office of Science and Stanford University, SLAC is ideally placed to make important breakthroughs over a wide spectrum of discovery in photon science and particle and particle astrophysics. These fields are evolving rapidly, and we are remodelling the management structure to mobilize SLAC’s exceptional staff to better serve its large user community. The new structure is adapted to allow them to get on with what they do best – making major discoveries.”

Two of the new divisions – Photon Science, and Particle and Particle Astrophysics – encompass SLAC’s major research directions. As director of the Photon Science Division, Keith Hodgson has responsibility for the Stanford Synchrotron Radiation Laboratory, the science and instrument programme for the LCLS (the world’s first X-ray free-electron laser) and the new Ultrafast Science Center. Persis Drell, director of the Particle and Particle Astrophysics Division, oversees the B-Factory (an international collaboration studying matter and antimatter), the Kavli Institute for Particle Astrophysics and Cosmology, the International Linear Collider effort, accelerator research and non-accelerator particle-physics programmes.

Construction of the $379 million LCLS, a key element in the future of accelerator-based science at SLAC, started this fiscal year. A significant part of the laboratory’s resources and manpower are being devoted to building LCLS, with completion of the project scheduled for 2009. Commissioning will begin in 2008 and science experiments are planned for 2009. John Galayda serves as director of the LCLS Construction division.

To reinforce SLAC’s administrative and operational efficiency, and to stress the importance of strong and effective line management at the laboratory, a new position of chief operating officer has been created, filled by John Cornuelle. This fourth division, Operations, has broad responsibilities for operational support and R&D efforts that are central to the science divisions. Included in Operations will be environmental safety and health, scientific computing and computing services, mechanical and electrical support departments, business services, central facilities and maintenance.

The post SLAC reorganizes and prepares for next major breakthroughs appeared first on CERN Courier.

]]>
https://cerncourier.com/a/slac-reorganizes-and-prepares-for-next-major-breakthroughs/feed/ 0 News On 24 May, Jonathan Dorfan, director of the Stanford Linear Accelerator Center (SLAC), announced a complete reorganization of the structure and senior management of the laboratory, which Stanford University has operated for more than 40 years for the US Department of Energy. https://cerncourier.com/wp-content/uploads/2005/07/CCEnew2_07-05.jpg
Celebratory year lifts off in Paris https://cerncourier.com/a/celebratory-year-lifts-off-in-paris/ https://cerncourier.com/a/celebratory-year-lifts-off-in-paris/#respond Tue, 01 Mar 2005 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/celebratory-year-lifts-off-in-paris/ More than 1000 people including eight Nobel laureates and close to 500 students from 70 countries took part in the Physics for Tomorrow conference in Paris on 13 January.

The post Celebratory year lifts off in Paris appeared first on CERN Courier.

]]>
More than 1000 people including eight Nobel laureates and close to 500 students from 70 countries took part in the Physics for Tomorrow conference in Paris on 13 January. The event took place at the headquarters of the United Nations Educational, Scientific and Cultural Organization (UNESCO). It marked the official launch of the International Year of Physics proclaimed by the UN, which aims to highlight the importance of physics and its contribution to society.

The conference was organized by UNESCO, the lead UN organization for the International Year, together with other organizations from the physics community, including the CNRS and CEA in France and CERN. CERN itself was founded under the auspices of UNESCO, which is one of the observer organizations to the CERN council, so it was appropriate that Carlo Rubbia and Georges Charpak, Nobel laureates from CERN, together with the director-general, Robert Aymar, were among the invited speakers.

During the opening ceremony, Aymar emphasized the crucial roles of physics as the driving force for innovation, as the magnet for attracting and training the most talented people, and in forging partnerships of nations. Rubbia participated in the round table on “What can physics bring to the socio-economical challenges of the 21st century?” and Charpak talked about “Teaching and education in physics”.

• This inaugurated a series of events that are taking place all over the world in 2005 to celebrate physics and emphasize its role. For further information see www.wyp2005.org.

The post Celebratory year lifts off in Paris appeared first on CERN Courier.

]]>
https://cerncourier.com/a/celebratory-year-lifts-off-in-paris/feed/ 0 News More than 1000 people including eight Nobel laureates and close to 500 students from 70 countries took part in the Physics for Tomorrow conference in Paris on 13 January. https://cerncourier.com/wp-content/uploads/2005/03/CCEnew2_03-05-feature.jpg
The shock of the known https://cerncourier.com/a/viewpoint-the-shock-of-the-known/ Thu, 03 Feb 2005 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-the-shock-of-the-known/ Simon Singh believes that the best way for scientists to interest the public may be to forget the unknown and amaze them with what they know.

The post The shock of the known appeared first on CERN Courier.

]]>
Naturally, researchers take for granted that which is known, and instead focus on the unknown. Indeed, when I was at CERN working on the UA2 experiment, everyone was obsessed with those areas of physics that were not yet understood. The public is also interested in those scientific subjects that still remain a mystery – where is the Higgs boson? Is string theory correct? What is dark matter? So when I left particle physics and became a science journalist, I continued to concentrate on unexplored territory. It was those research topics at the frontiers of knowledge and at the centre of controversy that inevitably resulted in the best stories.

However, when I sat down to write Big Bang, I decided to adopt a different approach – I wanted to celebrate how much we do know, and glory in the fact that we belong to the first generation of humans that have access to a coherent, consistent, compelling and verifiable model of the universe. The public is told so much about contentious issues, such as arguments over the existence, type and quantity of dark matter, that they probably have the impression that cosmologists know very little about the universe. In fact, I think the public would be staggered if they realized how much we do know.

The fact that the universe is expanding might seem dull to those of us within science, but to outsiders it probably sounds incredible. I suspect that the majority of the public perceive the expansion of the universe as a weird new hypothesis that will be overturned in a few years. If only they realized that the expansion of the universe was detected more than 75 years ago and has since been measured in detail and verified in a multitude of ways, then they might begin to engage with the staggering and profound implications of an expanding cosmos.

As well as spreading the gospel of our understanding of the universe, including the Big Bang model, I also wanted to show how superior models emerge in science and how they are eventually accepted, regardless of how controversial they are initially and no matter how powerful their detractors might be. Although we should be celebrating Albert Einstein in the centenary of his annus mirabilis, it is still worth noting that he vehemently opposed the Big Bang model when it was explained to him by the Belgian cosmologist (and priest) Georges Lemaître. Einstein told him, “Your calculations are correct, but your physics is abominable.” But a few years later, the observations showed that Lemaître was right, and Einstein had to concede defeat in the light of reality. The Big Bang model turned out to be basically correct and remains the best game in town.

Despite all the successes of modern cosmology and the Big Bang model, my book does feature an epilogue that explains the ways in which the model is incomplete. There are, of course, still aspects of our universe that cause bewilderment and arguments among cosmologists. For example, was there an inflationary period in the early universe, what is dark matter, what is dark energy and what is the fate of the universe? Such questions currently belong to the realm of speculation, and answering them sometimes seems impossible.

However, perhaps my book offers a note of optimism for cosmologists, because they can take heart by looking back through the history of their subject. After all, what now seems completely obvious was itself mysterious to scientists of the past. There was a time when nobody had any idea of how to measure the distances to the nebulae, but in 1923 Edwin Hubble solved the puzzle and showed that many of them were remote galaxies. He relied on the periodic variation in brightness of a type of star, known as a Cepheid variable, which he spotted in the Andromeda Nebula. The time between peaks in brightness betrays the absolute brightness of a Cepheid star and this could be compared to its apparent brightness in order to deduce its distance – and the distance to the Andromeda Nebula that it inhabited. Today, measuring the distances to galaxies is still not routine, but it is clearly no longer impossible.
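
As a rough illustration of the arithmetic behind that method (the numbers here are indicative, not Hubble’s actual calibration): the period of a Cepheid fixes its absolute magnitude M through the period–luminosity relation, and comparing M with the apparent magnitude m then gives the distance d through the distance modulus

\[ m - M = 5\log_{10}\!\left(\frac{d}{10\,\mathrm{pc}}\right) \quad\Longrightarrow\quad d = 10^{\,(m-M)/5+1}\ \mathrm{pc}. \]

A Cepheid of absolute magnitude M ≈ −4 observed at apparent magnitude m ≈ 18.3 gives d ≈ 290 kpc, close to the distance Hubble originally quoted for Andromeda; the modern value, obtained with a recalibrated period–luminosity relation, is about 780 kpc.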

Perhaps the best example of a once impossible problem that soon became trivial was discussed in 1835 by the French philosopher Auguste Comte. He had tried to identify areas of knowledge that would forever remain beyond the wit of scientific endeavour. In particular, he thought that some qualities of the stars could never be ascertained. “We see how we may determine their forms, their distances, their bulk, and their motions, but we can never know anything of their chemical or mineralogical structure.” In fact, Comte would be proved wrong within a few years of his death, as scientists began to discover which types of atom exist in the Sun.

The post The shock of the known appeared first on CERN Courier.

]]>
Opinion Simon Singh believes that the best way for scientists to interest the public may be to forget the unknown and amaze them with what they know. https://cerncourier.com/wp-content/uploads/2005/02/CCEvie1_01-05-feature.jpg
Hands across the Mediterranean https://cerncourier.com/a/hands-across-the-mediterranean/ https://cerncourier.com/a/hands-across-the-mediterranean/#respond Sun, 05 Sep 2004 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/hands-across-the-mediterranean/ Scientists from North Africa, the Middle East and Europe came together in a meeting at CERN to discuss common projects in fields varying from particle physics to water desalination. Robert Klapisch reports.

The post Hands across the Mediterranean appeared first on CERN Courier.

]]>
Back in April 2002 AFAS (the French Association for the Advancement of Science) and the “Club de Marseille” jointly convened “WorldMed 2002”, a meeting that was set up to share knowledge between the north and south regions of the Mediterranean. WorldMed’s aim was to show how concrete projects could advance co-operation between countries with different cultures, thereby providing a much-needed stimulus to the political intergovernmental process. The meeting, which was attended by 850 people, of whom 150 came from North Africa, was a huge success and several projects were begun as a result of contacts initiated among the participants. This success suggested a follow-up in the form of periodic meetings to discuss projects and seek potential synergies. For this purpose, smaller meetings, which would focus on a few selected topics and so be easier to organize and permit an even better opportunity for contacts, seemed a promising concept.

The celebration this year of CERN’s 50th anniversary provided a perfect opportunity for the laboratory, with its distinguished tradition along these lines, to initiate the series by hosting the event on 6-7 May. The chosen topics were the Large Hadron Collider (LHC), the Synchrotron-light for Experimental Science and Applications in the Middle East (SESAME) project, and computing – all of which are familiar to readers of the CERN Courier – together with two applied topics of considerable and obvious relevance: water and energy.

The conference was opened by Pascal Colombani, chairman of AFAS, who stressed the universal value of science and its ability to build bridges between peoples belonging to different cultures and religions, even in cases where they are in bitter political conflict. John Ellis from CERN introduced the first session with an overview of the LHC programme and its worldwide extent. His talk was followed by specific reports of non-member-state participation from countries in North Africa and the Middle East, with Abdeslam Hoummada from Casablanca, Hafeez Hoorani from the National Center for Physics in Islamabad, and Hessamaddin Arfaei from the Institute for Studies in Theoretical Physics and Mathematics in Teheran. The status of possible Egyptian participation was also presented by Mohammed Sherif from Cairo. A subsequent round-table discussion included CERN’s director-general, Robert Aymar, together with Ali Chamseddine of Beirut and Giora Mikenberg of Rehovoth.

The overwhelming impression was of the serious and impressive contributions these relative newcomers to the field are bringing to the building of the ATLAS and CMS detectors. In the case of Pakistan and Iran, the legacy of Abdus Salam as the first Muslim Physics Nobel laureate certainly seems to have played a role in persuading the powers-that-be to support such an apparently esoteric field of research. Another interesting aspect is the case of Morocco, where bilateral ties with the French institute IN2P3 have helped to organize and bring to a high standard a consortium of universities that is now a full member of the ATLAS collaboration.

Herwig Schopper, president of the SESAME Council, presented the UNESCO-backed programme for SESAME, a regional synchrotron light facility to be located in Jordan with statutes analogous to those of CERN. It will be based on parts donated from the BESSY I machine at Berlin, which are in the process of being upgraded to make SESAME competitive and up to international standards. The facility should be operational in 2007 and it is remarkable that in just five years a new international organization has been created. Zehra Sayers of Istanbul outlined the scientific programme and Samar Hasnain of the Daresbury Laboratory described the first generation of beam lines. Nasser Hamdam of the United Arab Emirates recounted his former work at the Advanced Light Source at the Lawrence Berkeley National Laboratory and talked about his projects for SESAME when it comes on line.

Joining in the subsequent round-table discussion were Abdeslam Hoummada, Abderrahmane Tadjeddine of LURE, Orsay, Jean-Patrick Connerade of Imperial College, London, and Eliezer Rabinovici of Jerusalem. The first example of a regional facility, SESAME will add a south–south dimension to international scientific collaboration. Indeed, as Schopper noted, UNESCO has agreed in principle that other regional scientific centres could be considered in the future – a point that generated tremendous interest in the audience.

Guy Wormser of IN2P3 and Orsay convened a session on “Fighting the digital divide”, in which Michel Spiro, director of IN2P3, first pointed out the dual importance of broadband access. As a tool, broadband would make data analysis a democratic affair, enabling researchers to do physics based on their talent rather than on their geographic location. More generally, bridging the divide can also be understood as bridging the gap between people belonging to different cultures or religions, even though some may presently be in political conflict. This is really the continuation of a 50-year-old CERN tradition.

Fabrizio Gagliardi of CERN then explained the concept of the computing Grid, stressing that it is not only very powerful but also economical. In addition to being necessary to handle the vast amounts of LHC data, it should also have obvious applications in other fields such as meteorology and genomics. Driss Benchekroun of the University of Hassan II, Casablanca, gave the view of a user from Morocco and outlined plans to update IT infrastructure within the Maghreb. These were in fact realized three weeks later when the Moroccan minister inaugurated MARWAN, a wide-area network connecting Moroccan universities among themselves and to Europe. As Dany Vandromme of the Réseau National de Télécommunications pour la Technologie, l’Enseignement et la Recherche (RENATER) explained, this was made possible because the European intra-university network GEANT had been extended to include a link to a point in each country around the Mediterranean, from Casablanca to Beirut. Lorne Levinson of Rehovot and Alberto Santoro from Rio de Janeiro then joined the round-table discussion, appropriately via an Internet videoconference.

Water desalination and reuse are of crucial interest for semi-arid countries, where there is a strong increase in population. For this discussion Miriam Balaban of the European Desalination Association and Azzedine El Midaoui of Ibn Tofail University in Kénitra, Morocco, had assembled a splendid panel of experts. Richard Morris of Glasgow, Corrado Somariva of Abu Dhabi, Valentina Lazarova of the Suez Environnement company, Michel Soulié of the Agropolis Association in Montpellier, Bruce Durham of Veolia Water in the UK, and Mohamed Safi of the École nationale d’ingénieurs in Tunis, presented all aspects of the progress in this field.

The cost of desalination, which only a decade ago was considered out of reach for non-oil-rich countries, has fallen dramatically in the past five years. It is now in the region of €0.50-0.85 per tonne for large installations and further progress can be expected. The energy necessary to pump seawater through a semi-permeable membrane is currently about 2 kWh per tonne for new installations, compared with 5 kWh per tonne for installations built in the 1990s, and is approaching the thermodynamic limit of about 0.7 kWh per tonne. The focus is now increasingly on environmental aspects such as the safe disposal of the brine and chemicals, on sound water management and on safe recycling of urban and industrial wastewater for irrigation.
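
For orientation (a back-of-the-envelope estimate, not a figure presented at the meeting), this thermodynamic limit follows from the osmotic pressure of seawater, roughly 27 bar: the minimum work needed to push one tonne (about one cubic metre) of fresh water through an ideal membrane at vanishingly small recovery is

\[ W_{\mathrm{min}} \approx \Pi\,V \approx 2.7\times10^{6}\ \mathrm{Pa} \times 1\ \mathrm{m^{3}} = 2.7\times10^{6}\ \mathrm{J} \approx 0.75\ \mathrm{kWh}, \]

consistent with the 0.7 kWh per tonne quoted above; real plants sit somewhat higher because they recover a sizeable fraction of the feed and suffer pump and membrane losses.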

For countries in the sun belt, solar energy is a tremendous resource still waiting to be exploited. Augusto Maccari of ENEA, the Italian national agency for new technologies, energy and the environment in Rome, gave a report on how to harness solar energy as high-temperature heat by using concentrating mirrors and storing the heat in a molten salt at 550 °C. This circumvents the discontinuous nature of solar energy so that electricity can be generated on a continuous basis. This development was under the leadership of Carlo Rubbia, president of ENEA, and the talk was also a preview of the inauguration of the “Archimède” pilot facility (20 MW), which took place on 19 May near Syracuse in Sicily.
• The conference was organized by AFAS with the support of CERN, IN2P3, UNESCO, France Telecom, Veolia and Suez.

The post Hands across the Mediterranean appeared first on CERN Courier.

]]>
https://cerncourier.com/a/hands-across-the-mediterranean/feed/ 0 Feature Scientists from North Africa, the Middle East and Europe came together in a meeting at CERN to discuss common projects in fields varying from particle physics to water desalination. Robert Klapisch reports. https://cerncourier.com/wp-content/uploads/2004/09/cernmed1_9-04.jpg
More than just a conference https://cerncourier.com/a/viewpoint-more-than-just-a-conference/ Mon, 07 Jun 2004 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-more-than-just-a-conference/ The European Particle Accelerator Conference, EPAC, has developed a distinctive role on the world stage, explains Christine Petit-Jean-Genaz, the EPAC conferences coordinator.

The post More than just a conference appeared first on CERN Courier.

]]>
When CERN’s Kurt Hübner and Günther Plass travelled to Rome in 1986 to join Sergio Tazzari of the Frascati Laboratory in search of a venue for the first European Particle Accelerator Conference, they set in motion the machinery that was to give the European accelerator community its own conference, 20 years after the birth of the American Particle Accelerator Conference, PAC. Two decades later, with science funding tight and justifiably under close scrutiny, it is interesting to assess the value, and also the spin-off, of this event.

In July EPAC’04 will welcome around 800 delegates from more than 30 countries to the ninth conference in the series. Sixty-five oral presentations are scheduled and more than 1000 posters will be displayed during the lively sessions. Two European accelerator prizes, first introduced in 1994, will also be awarded, one for a recent significant, original contribution to the accelerator field from a scientist in the early part of his or her career, and one for outstanding achievement in the accelerator field. This will precede a regular conference highlight, the “entertainment” session, with a talk on cosmic accelerators. The industrial exhibition and its associated session on technology transfer and industrial contacts completes the picture, demonstrating the vital communication between scientists and representatives of industry. This is a very different conference from the first one in 1988.

EPAC’88, held at the Hotel Parco dei Principi in Rome, was a victim of its own success. When the estimated 400 delegates expanded to 700 there was “controlled chaos” as closed-circuit TV had to relay the oral presentations, authors shared poster boards, a lack of air-conditioning caused delegates to flee the industrial exhibition and a lack of space meant the plenary sessions were relocated to the Aula Magna of the “La Sapienza” University of Rome. Alas, it was too late to relocate the conference dinner. Those who recall the fountains dancing in time to the strains of a string quartet in the gardens of the Villa Tuscolana will also remember the mouth-watering buffet, which was woefully insufficient and had vanished before the guests finished arriving.

However, the learning process had begun, and the venue for EPAC’90 was a purpose-built conference centre in Nice. Only one detail escaped the vigilant local organizing committee: the unique banquet venue able to cater for 800 people was outdoors, on a beautiful, unsheltered, Mediterranean beach. A week of perfect weather was marred only by the cloud that burst that particular evening. Who recalls the drenched delegates arriving following a 200 metre sprint in a tropical downpour?

During this period, Maurice Jacob, chairman of the European Physical Society (EPS), convinced EPAC’s organizers to form an EPS Interdivisional Group on Accelerators (IGA), and the successive EPS-IGA elected boards have since formed the EPAC organizing committees. A biennial, one-third turnover of the 18 members ensures continuity, while encouraging new members to introduce new ideas. To promote communication between the regional conferences, the organizing committees welcome representatives of US and Asian PACs, and the chairmen meet informally each year. The EPS-IGA has undertaken a number of initiatives such as the Student Grant Programme which, with the sponsorship of European laboratories and industry, enables young scientists to attend EPAC; around 60 will attend EPAC’04 under this scheme.

Continuity, coordination and communication characterize EPAC organization. Participation has increased steadily, with almost half the participants coming from non-European countries. Improved management techniques have streamlined the workload and contained registration fees. This was also a result of publishing the proceedings in CD-ROM and Web format, rather than expensive paper-hungry hard-copy volumes.

An unexpected spin-off of regional collaboration has been the creation of the Joint Accelerator Conferences Website (JACoW). The suggestion by Ilan Ben-Zvi of the Brookhaven National Laboratory in the mid-1990s to create a website for the publication of regional accelerator conference proceedings has developed into a flourishing international collaboration. It now extends to a whole range of conference series on accelerators – CYCLOTRONS, DIPAC, ICALEPCS, LINAC and RUPAC. The editors of all eight series, led by CERN’s John Poole, get hands-on experience in electronic publication techniques during each PAC and EPAC. Furthermore, the yearly team meetings have led to the development of a Scientific Programme Management System. This is an Oracle-based application capable of handling conference contributions from abstract submission through to proceedings production. Twenty-three sets have been published since 1996, including scanned PAC proceedings dating back to 1967. The EPAC and LINAC series plan to follow suit and scan their pre-electronic era proceedings too.

EPAC has evolved into an established, respected forum for the state of the art in accelerator technology. Delegates meet at unique venues, where the varied cultural heritage constitutes real added value. Strengthened ties with other regional and specialized conferences have enhanced international collaboration in the accelerator field, to the undoubted benefit of the community worldwide.

The post More than just a conference appeared first on CERN Courier.

]]>
Opinion The European Particle Accelerator Conference, EPAC, has developed a distinctive role on the world stage, explains Christine Petit-Jean-Genaz, the EPAC conferences coordinator. https://cerncourier.com/wp-content/uploads/2004/06/cernvie1_6-04.jpg
Can companies benefit from Big Science? https://cerncourier.com/a/can-companies-benefit-from-big-science/ https://cerncourier.com/a/can-companies-benefit-from-big-science/#respond Tue, 09 Dec 2003 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/can-companies-benefit-from-big-science/ The findings of a recent study reveal the significant and widespread impact on high-tech companies of contracts with CERN.

The post Can companies benefit from Big Science? appeared first on CERN Courier.

]]>
Several studies have indicated that there are significant returns on financial investment via “Big Science” centres. Financial multipliers ranging from 2.7 (ESA) to 3.7 (CERN) have been found, meaning that each euro invested in industry by Big Science generates roughly a threefold to fourfold return for the supplier. Moreover, laboratories such as CERN are proud of their record in technology transfer, where research developments lead to applications in other fields – for example, with particle accelerators and detectors. Less well documented, however, is the effect of the experience that technological firms gain through working in the arena of Big Science. Indeed, up to now there has been no explicit empirical study of such benefits.

We have therefore analysed the technological learning and innovation benefits derived from CERN’s procurement activity during 1997-2001. Our study was based on responses from technology-intensive companies of some financial significance. The main aim was to open up the “black box” of CERN as an environment for innovation and learning. Our findings reveal a variety of outcomes, which include technological learning, the development of new products and markets, and impact on the firm’s organization. The study also demonstrates the importance of technologically challenging projects for staff at CERN. Together, these findings imply ways in which CERN – and by implication other Big Science centres – can further boost technology transfer into spill-over benefits for industrial knowledge and enhance their contribution to industrial R&D and innovation.

The method and sample

The empirical section of the study had several parts. First, a series of case studies was carried out, to develop a theoretical framework describing influences on organizational learning in relationships between Big Science and suppliers. The theoretical framework for the study is indicated in figure 1. A wide survey of CERN’s supplier companies was carried out from autumn 2002 to March 2003. Finally, a parallel survey was carried out among CERN staff responsible for coordinating purchase projects in order to explore learning effects and collaboration outcomes at CERN.

The focus of the survey was CERN-related learning, organizational and other benefits that accrue to supplier companies by virtue of their relationships with CERN. The questionnaire was designed according to best practice with multi-item scales used to measure both predictor and outcome variables. All scales were pre-tested in test interviews, and the feedback was used to iron out any inconsistencies and potential misunderstandings.

The base population of the study consisted of 629 companies – representing about 1197 million SwFr (€768 million) in procurement – which were selected as being technology-intensive suppliers with order sizes greater than 25,000 SwFr (€16,000). This base population was selected from a total of 6806 supplier companies, which generated supplies worth 2132 million SwFr (€1368 million) during this period (see “The selection process” table). The selection procedure therefore indicates that around 10% of CERN’s suppliers are firms that are both genuinely technology-intensive and also of some financial significance; when combined, they represent more than 50% of CERN’s total procurement budget during the period under study. We received 178 valid answers to our questionnaire from 154 of these companies, most of which were still supplying high-technology products or services to CERN beyond 2001. These answers formed the basis for our statistical study.

The findings

The learning outcomes documented in the study range from technology development and product development to organizational changes. The main findings show that while benefits vary extensively among suppliers, learning and innovation benefits tend to occur together: one type of learning is positively associated with other types of learning. Technological learning stands out as the main driver. Indeed, 44% of respondents indicated that they had acquired significant technological learning through their contract with CERN.

The study revealed important signs of development of new businesses, products and services, and increased internationalization of sales and marketing operations. As many as 38% of all respondents said they had developed new products as a direct result of their contractual relationship with CERN, while some 60% of the firms had acquired new customers. Extrapolating to the base population of 629 suppliers suggests that some 500 new products have been developed, attracting around 1900 new customers, essentially from outside high-energy physics. In addition, 41% of respondents said they would have had a poorer technological performance without the benefit of the contract with CERN, and 52% of these believed they would have had poorer sales.

Virtually all CERN’s suppliers appear to derive great value from CERN as a marketing reference (figure 2). In other words, CERN provides a prestigious customer reference that firms can use for marketing purposes. As a result of their interaction with CERN, 42% of the respondents have increased their international exposure, and 17% have opened a new market.

Suppliers also find benefits in terms of organizational effects from their involvement in CERN projects, such as improvements to their manufacturing capability, quality-control systems and R&D processes. In particular, 13% of the respondents indicated that they had formed new-product R&D teams and 14% reported the creation of new business units. A final aspect of the impact on organizational capabilities concerned project management, with 60% indicating strengthened capabilities in this area.

In examining what determines the outcomes of the relationships between suppliers and CERN, we found that learning and innovation benefits appear to be regulated by the quality of the relationship. The greater the social capital – for example, the trust, personal relationships and access to CERN contact networks – built into the relationship, the greater the learning and innovation benefits. This emphasizes the benefits of a partnership-type approach for technological learning and innovation.

In particular we found the frequency of interaction with CERN during the project to be relevant, together with how widely used the technology was at the start of the project, and the number of the supplier’s technical people who frequently interacted with CERN during the project.

During the study we also interviewed physicists and engineers at CERN, to cross-check the information provided by the companies. These interviews highlighted the benefits that these interactions with industry have for CERN personnel. We observed that the mutually beneficial effect is independent of the level of in-house experience and participation in high-tech projects, confirming the importance of maintaining an active “industry–Big Science” interaction. Furthermore, the results confirmed the importance of technologically challenging projects for the high levels of knowledge acquisition and motivation of highly qualified staff at CERN.

The implications

The study has shown that, to a greater extent than foreseen, many suppliers have benefited from technological and other types of learning during the execution of contracts with CERN. The benefits suggest that for companies that are participating in highly demanding and cutting-edge technological projects, conventional procurements and stringent requirements are not the most appropriate modes of interaction, especially if learning and innovation are to be nurtured in the long term. For a Big Science organization such as CERN, some measures to facilitate true partnership, and some thought on how to sustain these benefits over projects with long development time-spans, either within the present financial and commercial rules or through other possible routes, will have to be carefully investigated. In particular, procurement policy should involve industry early on in the R&D phase of high-tech projects that will lead to significant contracts.

Lastly, note that the methodology developed is not specific to CERN. It would be interesting to determine if and how these findings from CERN compare with those of other Big Science centres.

The post Can companies benefit from Big Science? appeared first on CERN Courier.

]]>
https://cerncourier.com/a/can-companies-benefit-from-big-science/feed/ 0 Feature The findings of a recent study reveal the significant and widespread impact on high-tech companies of contracts with CERN. https://cerncourier.com/wp-content/uploads/2003/12/cernett1_12-03.jpg
International interactions at Fermilab https://cerncourier.com/a/international-interactions-at-fermilab/ https://cerncourier.com/a/international-interactions-at-fermilab/#respond Sat, 01 Nov 2003 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/international-interactions-at-fermilab/ The Lepton Photon 2003 conference, hosted by Fermilab, provided an opportunity to take stock and to look ahead to TeV-scale physics in the coming decade. John Womersley and James Gillies report.

The post International interactions at Fermilab appeared first on CERN Courier.

]]>
The big international summer conferences provide the venue for high-energy physicists to show their newest results in front of a large, influential audience. While delegates always hope for something new and exciting, the pace of research doesn’t always oblige. This year, 650 physicists attended the XXI International Symposium on Lepton and Photon Interactions at High Energies (Lepton Photon 2003) at Fermilab. While radical revisions to our view of the universe are probably not required, interesting new results were presented, and a series of excellent review talks covered the broad sweep of particle physics.

As one might expect for a conference at Fermilab, the first day was devoted largely to collider physics and electroweak-scale phenomena. The experimental state of play with results from LEP, the Tevatron and HERA was covered by a series of speakers: Patrizia Azzi of Padova on top physics, Terry Wyatt of Manchester on electroweak, Michael Schmitt of Northwestern and Emmanuelle Perez of Saclay on Higgs/SUSY and other searches, respectively, and Bob Hirosky of Virginia on QCD. The overall picture remains one of consistency with the Standard Model, but the good news is that with roughly 200 pb⁻¹ of data now recorded at CDF and DØ, the Tevatron’s Run II has entered previously unexplored territory. The experiments are opening up new areas of parameter space and potential discovery reach for new physics, both with their direct searches and through precise measurements of the properties of the top quark and the W and Z bosons. On the theory front, CERN’s Paolo Gambino talked about the status of electroweak measurements and global fits, including the muon (g-2) measurement and NuTeV’s anomalous value of sin²θW, and this led to a spirited discussion in the question and answer session after the talk. Gian Giudice, also from CERN, described theoretical predictions for new physics at colliders, and Thomas Gehrmann of Zurich covered developments in QCD. Cornell University’s Peter Lepage described recent notable progress in lattice QCD calculations, and the last two speakers, Augusto Ceccucci of CERN and Yuval Grossman of Technion, described rare K and B decays. There are recent results from experiments at Fermilab, CERN and Brookhaven, and new initiatives that will significantly extend our sensitivity to new TeV-scale physics through such windows are underway or planned. A reception and poster session wound up the day, with music by the Chicago Hot Six, led by trombonist Roy Rubinstein – otherwise known as Fermilab’s assistant director.

The second day was devoted to heavy-flavour physics and the CKM matrix. Probably the most talked-about new result of the conference was presented by Tom Browder of Hawaii, who reported on Belle’s determination of sin 2φ1 (sin 2β) from B→φKS decays. In the Standard Model, this should be the same as the value extracted from the familiar B→J/ψKS process, namely 0.74 ±0.05. In 2002, both BaBar and Belle had found negative values, but the errors were large. With more data, Belle reported a new value at the meeting of -0.96 ±0.50 (+0.09/-0.11). By itself this measurement is 3.5σ from the Standard Model, but the situation was confused by a new measurement from BaBar of the same quantity, which has moved much closer to the Standard Model and is +0.45 ±0.43 ±0.07. This puzzle will take either more data or more study before it is resolved.
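
A rough feeling for that 3.5σ figure can be had by naively combining the quoted errors in quadrature; the sketch below, which symmetrizes the systematic uncertainty and assumes Gaussian errors, reproduces the discrepancy only approximately (Belle’s published significance comes from the full likelihood).

```python
# Naive estimate of the tension between Belle's sin(2phi_1) from B -> phi K_S
# and the Standard Model expectation taken from B -> J/psi K_S.
# Illustrative only: errors are treated as symmetric and Gaussian.
from math import sqrt

sm_value, sm_error = 0.74, 0.05   # reference value from B -> J/psi K_S
belle_value = -0.96
belle_stat = 0.50
belle_syst = 0.11                 # larger side of the +0.09/-0.11 systematic

total_error = sqrt(belle_stat**2 + belle_syst**2 + sm_error**2)
n_sigma = (sm_value - belle_value) / total_error
print(f"naive discrepancy: {n_sigma:.1f} sigma")  # about 3.3 sigma with these inputs
```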

The Fermilab organizers introduced a number of innovations to the 2003 edition of the conference, one of which was designed to attract the media. Shortly after the new Belle and BaBar results were reported, the first of a series of informal media briefings brought physicists and journalists together over a sandwich lunch to discuss the physics of the previous session.

Hassan Jawahery of Maryland described progress from the B-factories towards constraining the other two angles of the unitarity triangle, and Kevin Pitts of Illinois outlined the complementary capabilities of hadron colliders, which allow access to the BS and to b-baryons. Dresden’s Klaus Schubert reported on significant progress in determining the magnitudes (as opposed to the phases) of the CKM matrix elements, using results from a broad array of experiments. The CKM matrix appears unitary at the 1.8σ level, but some important inputs are still awaited. Gerhard Buchalla of Munich explored tools to understand the QCD aspects of heavy hadron decays. Liverpool’s John Fry covered measurements of rare hadronic decays, while Mikihiko Nakao of KEK did the same for electroweak and radiative rare decays.

In the session on charm and quarkonium physics, Bruce Yabsley of Virginia Tech discussed the limits on new physics from charm decay, concluding that the results from CLEO-c are eagerly awaited. Tomasz Skwarnicki of Syracuse described a revitalized scene in heavy-quarkonium physics, thanks in part to large data samples at BES-II in Beijing, CLEO-III at Cornell and Fermilab’s E835 experiment. He pointed to solid theoretical progress and to new experimental opportunities at BaBar and Belle, as well as CLEO-c. Jussara de Miranda of the Brazilian Centre for Research in Physics asked the question: “why is charm so charming?”, concluding that it provides a powerful bridge to the parton world.

Rounding off the second day was another innovation, a special open session on the Grid, to which the public was invited. Ian Foster of Argonne and Chicago introduced the concept of Grid computing, and CERN’s Ian Bird described its application to the LHC. Bob Aiken from Cisco, Stephen Perrenod from Sun and David Martin of IBM explained the industry’s perspective, while Dan Reed from the National Center for Supercomputing Applications provided the view from an academic computer centre.

Day three began with a session on astroparticle physics, with Pennsylvania’s Licia Verde reporting the exciting results from the Wilkinson Microwave Anisotropy Probe (WMAP). The detailed cosmic microwave background images provided by WMAP bring new insights into the early universe and into the amount of baryonic matter in the universe. Polarization measurements with WMAP also give a handle on the formation of the first stars. The results point to a flat universe consisting of just 4% baryonic matter, with the first stars forming earlier than previously thought at around 200 million years.

Harvard’s Bob Kirshner focused on the universe’s invisible 95%. Supernovae observations indicate an accelerating universe, consisting of about a third matter and two-thirds dark energy. Concluding that there is a bright future for dark energy, he looked forward to a resolution of the questions surrounding the cosmological constant through forthcoming supernovae studies.

Esteban Roulet of Bariloche gave an update on very-high-energy cosmic rays, whose spectrum is described by a “knee” at around 10¹⁵ eV and an “ankle” at 10¹⁹ eV. Recent work has shown that the chemical composition becomes heavier above the knee, and that above the ankle the extragalactic component dominates. Roulet also raised the question of doing astronomy with ultra-high-energy cosmic rays, saying that although it would be like trying to do optical astronomy with a telescope at the bottom of a swimming pool, the field has promise.

Lyon’s Maryvonne de Jesus gave a comprehensive overview of the status of dark-matter experiments. So far, only the DAMA experiment at Italy’s Gran Sasso underground laboratory has reported a positive signal for WIMPs. A new detector, DAMA/LIBRA, is expected to report its first result towards the end of 2003. A range of experiments planned or in preparation in Europe and the US can exclude much, but not all, of the area allowed by the DAMA measurement. Giorgio Gratta of Stanford discussed tritium and double beta decay experiments. The tritium experiments study the endpoint of the decay spectrum from tritium decays to helium-3, an electron and an antineutrino, which is sensitive to the neutrino mass. Results from spectrometer-based experiments at Mainz (< 2.2 eV) and Troitsk (< 2.05 eV) are the most sensitive so far. In the future, KATRIN at Karlsruhe should achieve a sensitivity of around 0.25 eV.
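
Schematically (keeping only the neutrino phase-space factor and omitting Coulomb corrections), the sensitivity comes from the shape of the electron spectrum in the last few electron-volts below the endpoint E0 ≈ 18.6 keV:

\[ \frac{dN}{dE} \;\propto\; (E_0 - E)\,\sqrt{(E_0 - E)^2 - m_{\nu}^{2}c^{4}}\,, \qquad E \le E_0 - m_{\nu}c^{2}, \]

so a non-zero neutrino mass both lowers the endpoint and distorts the spectrum just below it, which is why ever larger spectrometers and stronger tritium sources are needed to improve the limits.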

In his review of accelerator neutrino experiments, Koichiro Nishikawa of Kyoto focused on the controversial LSND result, which suggests an oscillation scenario incompatible with the accepted picture. Final results from CERN’s NOMAD experiment do not entirely exclude the LSND region, and attention has now passed to MiniBooNE at Fermilab, which has so far recorded some 125,000 events. The first results are expected this autumn. Further ahead are the long-baseline projects NuMI/MINOS at Fermilab–Soudan, and the ICARUS and OPERA detectors, which will observe a neutrino beam from CERN at the Gran Sasso laboratory in Italy.

Reactor-neutrino experiments were covered by Kunio Inoue of Tohoku, who pointed out that all modern experiments are extrapolations of the 1956 Reines and Cowan original, but with vast improvements in scale and flux, along with a better understanding of reactor neutrinos. The fact that Japan’s Kamioka mine has 70 GW of reactor power at distances of 130-240 km and an existing cavern led to the KamLAND experiment, whose first results provide evidence for neutrino disappearance from a reactor source. Reactor experiments also probe θ13, the last remaining mixing angle in the neutrino sector. The current best limit, θ13 less than about 10 degrees, comes from the French CHOOZ experiment. Turning to solar neutrinos, Carleton’s Alain Bellerive gave a comprehensive historical overview from Homestake to Sudbury, but kept the audience waiting for new results from SNO.
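
The principle behind the reactor searches for θ13 mentioned above is, in the standard two-flavour approximation, the survival probability of electron antineutrinos of energy E after a baseline L,

\[ P(\bar{\nu}_e \to \bar{\nu}_e) \;\simeq\; 1 - \sin^{2}2\theta_{13}\,\sin^{2}\!\left(\frac{1.27\,\Delta m^{2}_{31}\,[\mathrm{eV^{2}}]\;L\,[\mathrm{m}]}{E\,[\mathrm{MeV}]}\right), \]

so for few-MeV reactor antineutrinos observed at a baseline of the order of a kilometre, the size of any deficit measures sin²2θ13 directly.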

In his wide-ranging theoretical review, Alexei Smirnov pointed out that the Lepton Photon conference devoted a full day and a half to discussion of the unitarity triangle for quarks, but no time to the equivalent for leptons. To redress the balance he constructed a leptonic unitarity triangle, which he used to discuss the possibility of measuring CP violation in the leptonic sector. He also pointed out that the only chance of reconciling LSND’s oscillation result with other experiments is by invoking extra, sterile, neutrinos. Deborah Harris of Fermilab then took the podium to give a detailed analysis of the beamline strategies and detector options for facilities ranging from the near-term, such as the Japanese J-PARC to Kamioka beam, to long-term projects such as the neutrino factory.

Day four was devoted to hadron structure and detector research and development. It began with a review of structure functions from deep-inelastic scattering at HERA given by Paul Newman of Birmingham. HERA shut down in 2000 for an upgrade that was designed to boost the luminosity fivefold. With the machine now running again as HERA-II, a few tens of inverse picobarns of data are expected this year, and this is sufficient to begin studies of polarization dependence. The theoretical perspective was given by Robert Thorne of Cambridge, who discussed the parton distributions that are essential for analyses at high-energy hadron colliders such as Fermilab’s Tevatron and CERN’s LHC. Toshi-Aki Shibata of the Tokyo Institute of Technology discussed measurements with polarized hadrons, both in deep-inelastic scattering and proton–proton collisions, as a probe of the spin structure of hadrons and a tool to develop QCD. After the break, Yuji Yamazaki of KEK continued the QCD theme in his discussion of diffractive processes and vector meson production. Diffractive processes have historically been described in terms of pseudo-particle exchange within Regge theory. Today they can be described at least in part by perturbative QCD, though there remains work to be done.

In his review of heavy-ion collisions, David Hardtke of Berkeley asked the question: “have we seen the quark-gluon plasma at RHIC?” He concluded that although the density and the temperature produced in RHIC gold-gold collisions is at or above the predicted phase transition, no direct evidence has been seen for excess entropy production.

Ties Behnke of DESY and SLAC had the honour of presenting the only detector R&D talk of the conference, and he focused on detectors for a future linear electron-positron collider, for which the requirements are rather different from those for hadron machines.

The final day’s programme looked forward. Veronique Boisvert of CERN reported on a meeting held during the conference by the Young Particle Physicists’ organization, and complimented the conference organizers on the visible role that young physicists had played at Lepton Photon 2003. CERN’s director-general, Luciano Maiani, reported on the status of the LHC and the steps that have been taken over the past couple of years to firm up its financial situation. Dipole magnet production remains the main item that is setting the pace, and first beam is still foreseen for 2007. Vera Luth of SLAC reported on IUPAP’s Commission on Particles and Fields, the umbrella under which the Lepton Photon and ICHEP conferences are held. She observed that the current difficulties experienced by scientists trying to obtain visas to enter the US were of great concern and had a negative impact on the attendance at the conference.

Looking further ahead, Francois Richard of Orsay described the physics motivation for a TeV-scale linear collider, including the constraints that can now be placed on SUSY models from WMAP’s cosmic microwave background measurements, assuming that cosmic dark matter actually consists of neutralinos. SLAC’s Jonathan Dorfan reported for ICFA, and Maury Tigner of Cornell for the International Linear Collider Steering Committee. An international report on the desired parameters of a linear collider is expected this autumn, and this will be followed by the setting up of a committee of “wise persons” from the Americas, Asia and Europe to make a technology recommendation by the end of 2004. In parallel, a task force will recommend how to set up an internationally federated “pre-global design group” intended to evolve into a design group for the accelerator as the project gains approval. Inter-governmental contacts have started with a pre-meeting that was held in London in June.

Ed Witten of Princeton’s Institute for Advanced Study described supersymmetry and other scenarios for new physics. Witten admitted that an anthropic principle could account for the fine tuning and hierarchies that plague the Standard Model, but he hoped that this would not turn out to be the case. The experimenters in the audience enjoyed hearing such a prominent theoretician argue that we need new results that put experiments ahead of theory again – “as is customary in science”. The closing talk, an outlook for the next 20 years, was given by Hitoshi Murayama of Berkeley. He argued that there is a convergence, at the TeV scale, of questions to do with flavours and generations (neutrino masses and mixings, CP violation, B physics), questions to do with unification and hierarchies of forces, and questions to do with the cosmos (dark matter and dark energy). The TeV scale offers the answers to many of these questions, but also forms a cloud that blocks our vision. The next decade holds the exciting promise of dispersing this cloud and giving us the first clear view of what lies ahead.

The post International interactions at Fermilab appeared first on CERN Courier.

]]>
https://cerncourier.com/a/international-interactions-at-fermilab/feed/ 0 Feature The Lepton Photon 2003 conference, hosted by Fermilab, provided an opportunity to take stock and to look ahead to TeV-scale physics in the coming decade. John Womersley and James Gillies report. https://cerncourier.com/wp-content/uploads/2003/11/cernlep1_11-03.jpg
CEBAF celebrates seven years of physics https://cerncourier.com/a/cebaf-celebrates-seven-years-of-physics/ https://cerncourier.com/a/cebaf-celebrates-seven-years-of-physics/#respond Sat, 01 Nov 2003 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/cebaf-celebrates-seven-years-of-physics/ Douglas Higinbotham reports from the Jefferson Lab symposium on results that span the boundary between nuclear-meson models and quark-gluon physics.

The post CEBAF celebrates seven years of physics appeared first on CERN Courier.

]]>
cernceb1_11-03

Jefferson Lab in Newport News, Virginia, recently celebrated the first seven years of physics with the Continuous Electron Beam Accelerator Facility, CEBAF. The unique design of this electron accelerator allows three experimental halls to be operated simultaneously, with a total beam current of 200 µA and a beam polarization of up to 80%. With this facility, a user community of more than 1000 scientists from 187 institutions in 20 countries has completed 81 nuclear-physics experiments, with substantial data taken on 23 more. From the data obtained in these experiments, more than 250 refereed journal articles have been published and 146 doctoral degrees have been awarded. In the near future more than 60 experiments are planned, and there are currently 128 PhD theses in progress.

To celebrate and review these accomplishments, while also looking toward the future, the Jefferson Lab user group board of directors organized a symposium, which was held on 11-13 June and dedicated to the memory of Nathan Isgur, Jefferson Lab’s first chief scientist. The meeting was divided into eight physics topics: nucleon form factors, few-body physics, reactions involving nuclei, strangeness production, structure functions, parity violation, deep exclusive reactions and hadron spectroscopy. Each topic was presented by one experimentalist and one theorist.

The symposium began with presentations by Donal Day of Virginia and John Ralston of Kansas on nucleon form factors, which probe the electromagnetic structure of the proton and neutron. The presentations included a discussion of the most referenced and surprising result from Jefferson Lab: that the ratio of the proton's electric and magnetic form factors falls with increasing momentum transfer, rather than remaining constant as had been expected. While theorists have proposed different models to explain this result, the basic ingredient in almost all new models is the addition of relativistic effects.

The talks continued with presentations focusing on few-body systems, such as the deuteron and 3He, by Paul Ulmer of Old Dominion University and Franz Gross from the College of William and Mary. In these experiments, the Jefferson Lab electron beam is used to knock out a proton from the few-body system or to probe it with elastic scattering. The expected yield can be calculated exactly, assuming nucleons and mesons are the underlying particles. The presentations showed that even with beam energies of up to 5.7 GeV, the electron scattering results are surprisingly well explained by the nucleon-meson models to distance scales of the order of 0.5 fm. In contrast, experiments on deuteron photodisintegration, which probe even smaller distance scales, have revealed clear evidence of the limitations of the nucleon-meson models and of the onset of quark-gluon degrees of freedom.
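As a rough guide to the distance scales quoted here, the spatial resolution probed in electron scattering is set by the four-momentum transfer Q. The relation below is an illustrative uncertainty-principle estimate added for this article, not a formula taken from the talks:

```latex
% Illustrative estimate only: the distance scale d probed at four-momentum transfer Q,
% using \hbar c \approx 0.197 GeV fm; a Q of about 0.4 GeV resolves roughly 0.5 fm.
d \;\sim\; \frac{\hbar c}{Q} \;\approx\; \frac{0.197\ \mathrm{GeV\,fm}}{Q},
\qquad Q \approx 0.4\ \mathrm{GeV} \;\Rightarrow\; d \approx 0.5\ \mathrm{fm}.
```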

For reactions involving nuclei, i.e. many-body systems such as oxygen and carbon, statistical methods in the context of the nucleon-meson picture are used to calculate the expected yields of the quasi-elastic reaction. Larry Weinstein of Old Dominion University presented a talk entitled “So where are the quarks?”, in which he showed that the nucleon-meson model describes even the highest momentum transfer Jefferson Lab data, while Misak Sargsian of Florida International presented a talk looking mostly to the future, when the quark-gluon nature of matter should become evident from experiments with a 12 GeV electron beam.

Reinhard Schumacher of Carnegie Mellon and Steve Cotanch of North Carolina State presented reactions involving strangeness production, which includes the production of particles such as kaons. They showed new Jefferson Lab data confirming the θ+ particle discovered at SPring-8 in Japan. This new particle, dubbed the pentaquark, comprises five quarks. Described as the first observed baryon resonance with more than three valence quarks, it has sparked international excitement, and a new Jefferson Lab experiment to study it further has already been approved.

cernceb2_11-03

Structure-function experiments, which provide information on the quark and gluon structure of the nucleon, were presented by Keith Griffioen of the College of William and Mary, and Wally Melnitchouk of Jefferson Lab. While Jefferson Lab’s beam energy is relatively low for this type of experiment, the high luminosity available has allowed many high-precision structure-function results to be produced. An interesting feature of the Jefferson Lab data is that if one scales the smooth deep-inelastic cross-section results from high-energy physics to the laboratory’s kinematics, the scaled results will pass through the average of the resonant peaks of the laboratory’s data. This effect, known as duality, may lead to a better understanding of how the underlying quarks and gluons link to the nucleon-meson models.
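Schematically, this quark-hadron (Bloom-Gilman) duality can be written as follows. This is a textbook statement included here for illustration, not a formula quoted in the talks: the structure function measured in the resonance region, averaged over an interval in x, reproduces the smooth deep-inelastic scaling curve.

```latex
% Schematic statement of quark-hadron (Bloom-Gilman) duality, for illustration only:
% the resonance-region structure function, averaged over x, matches the smooth
% deep-inelastic (DIS) scaling curve over the same interval.
\int_{x_{\min}}^{x_{\max}} F_2^{\mathrm{res}}(x,Q^2)\, dx
\;\approx\;
\int_{x_{\min}}^{x_{\max}} F_2^{\mathrm{DIS}}(x,Q^2)\, dx .
```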

Krishna Kumar of Massachusetts and Michael Ramsey-Musolf from the California Institute of Technology presented the parity-violation experiments, where the strange quark distributions in the proton can be extracted by measuring the extremely small asymmetry in the elastic scattering of polarized electrons from an unpolarized proton target. One series of these experiments has already been completed at Jefferson Lab and several more are planned, including the G0 and HAPPEX-II experiments scheduled for next year.
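For reference, the quantity measured in these experiments is the parity-violating asymmetry between the elastic cross-sections for right- and left-handed longitudinally polarized electrons. The definition below is the standard textbook one, added here for illustration; at these kinematics the asymmetry is of order parts per million.

```latex
% Standard definition of the parity-violating asymmetry (illustrative, not from the talk):
% sigma_R and sigma_L are the elastic cross-sections for right- and left-handed
% longitudinally polarized electrons scattering from the unpolarized proton target.
A_{PV} \;=\; \frac{\sigma_R - \sigma_L}{\sigma_R + \sigma_L}.
```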

Deep exclusive reactions – experiments done in deep-inelastic kinematics but where the detection of multiple particles allows the final state of the system to be determined – were presented by Michel Garcon of SPhN/Saclay and Andrei Belitsky of Maryland. Generalized parton distribution models, which should enable a complete description of the nucleon's quark and gluon distributions to be extracted from this type of data, were presented along with the results of the deeply virtual Compton scattering experiments at HERMES at DESY and at Jefferson Lab. The results indicate that generalized parton distributions can indeed be extracted from this type of data. Several high-precision experiments are also planned for the coming years.

cernceb3_11-03

Steve Dytman of Pittsburgh and Simon Capstick of Florida State presented the wealth of hadron spectroscopy data that is coming from Jefferson Lab. The analysis of the vast set of data produced by the laboratory on the nucleon resonances has been only partially completed, but hints of new states are already emerging and work on a full partial-wave analysis of the data is now getting underway.

Following the physics presentations, Larry Cardman, the Jefferson Lab associate director for physics, presented the long-term outlook for the laboratory. This talk focused primarily on upgrading the CEBAF to a 12 GeV electron machine and building a fourth experimental hall. The higher energy will allow Jefferson Lab to continue its mission of mapping out the transition from the low-energy region where matter can be thought of as made of nucleons and mesons, to the high-energy region that reveals the fundamental quark and gluon nature of matter.

The post CEBAF celebrates seven years of physics appeared first on CERN Courier.

]]>
https://cerncourier.com/a/cebaf-celebrates-seven-years-of-physics/feed/ 0 Feature Douglas Higinbotham reports from the Jefferson Lab symposium on results that span the boundary between nuclear-meson models and quark-gluon physics. https://cerncourier.com/wp-content/uploads/2003/11/cernceb1_11-03.jpg
Training benefits from basic research https://cerncourier.com/a/viewpoint-training-benefits-from-basic-research/ Sun, 05 Oct 2003 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/viewpoint-training-benefits-from-basic-research/ Basic research, such as particle physics, not only attracts much needed young people to science, it also provides valuable training, says Maurice Jacob.

The post Training benefits from basic research appeared first on CERN Courier.

]]>
cernvie1_10-03

In 1992-1993, the American Physical Society, the European Physical Society and the Japanese Physical Society issued common position papers that were published simultaneously in their respective bulletins. One of the papers dealt with education and expressed concern about the trend observed in all industrialized countries, where fewer students are entering university to study physics as their main subject. This is becoming an increasingly serious problem. How can we obtain new physics knowledge if the number of physicists is sharply decreasing? How can we apply it if there are fewer knowledgeable people to do so? How can we avoid a two-tier society with too few people who know enough to judge the pros and cons of new technologies? This is a real challenge for a knowledge-based economy.

It may be argued that not so many people come to physics because they choose computer science or biology instead. These subjects are also very good at providing new practical knowledge. However, it is the whole of science that is suffering from a decrease in interest. We live in a society where there is nothing better than an MBA. Why suffer through difficult physics studies, as they are claimed to be, if the outcome of the effort does not look so promising?

To investigate the decline, surveys have been made of students who do enter university to study physics, asking them: “How come you chose physics?” The usual answer is: the attraction of mysterious and fundamental objects such as black holes, the Big Bang, quarks and so on.

This gives rise to the notion of “beacon science” – new developments in science that attract young people. It is of course to be hoped, and it is in practice the case, that once young students begin to study physics seriously they discover that there are many exciting things besides black holes and quarks, such as nanotubes and high-temperature superconductivity, and they turn to these topics with enthusiasm. However, black holes and quarks still have a special role to play.

This is important because physics studies at university are not only useful for training professional physicists. Learning how to master abstract concepts in order to apply them to practical problems, and learning how to appreciate orders of magnitude and the values and limits of specific approximations, are very useful skills for many activities. Having enough people trained in that way is a prerequisite for a dynamic knowledge-based economy.

Consider particle physics. About half of the young people who receive PhDs in particle physics are working in industry within two years of acquiring their degree. The value of the wide range of training provided by such basic research should not be underestimated – it is one of the obvious short-term economic returns. Our research with large detectors needs more young people than academia can absorb. Many of them come to research for training and leave it with much-appreciated skills.

Indeed, industry does not care much about the research topics of new PhDs. What it appreciates is that people trained in particle physics have worked at the limits of knowledge and technology in large international collaborations, under severe time constraints, often becoming computer wizards in the process. The style of the research matters more than its subject. The great physicist Hendrik Casimir, who was for a long time head of research at Philips, said that: “It is so important to be confronted early in life with research of a greater depth, greater difficulty and greater beauty than one will find later during one’s career”. He also said: “I have heard statements that the role of academic research in innovation is slight. This is about the most blatant piece of nonsense it has been my fortune to stumble upon.”

The value of training through research within a large international research organization like CERN is such that some member states have agreed to also try it for young engineers and have found it very valuable. For several years Spain and Portugal have regularly sent young engineers to CERN so they could acquire valuable training. At the same time, CERN also finds this input of young people very valuable, even if they have to be trained. In 2002, for example, 23 Portuguese and 12 Spanish young engineers joined CERN, many to work on the Grid.

Research domains that make young people wonder and dream should be supported if we are to attract more people to physics and science in general. As Victor Weisskopf said: “We need basic science not only for the solution of practical problems, but also to keep alive the spirit of this great human endeavour. If our students are no longer attracted by the sheer interest and excitement of the subject, we have been delinquent in our duty as teachers.” Physics research, and particularly in the case of CERN, basic research, offers wonderful stimulation for innovation but also for the training of highly competent people for many walks of life – prerequisites for a dynamic knowledge-based economy.

• This is the second extract from the closing talk at a special workshop of Marie Curie Fellows on Research and Training in Physics and Technology, held at CERN in 2002. The first extract was in CERN Courier June p42.

The post Training benefits from basic research appeared first on CERN Courier.

]]>
Opinion Basic research, such as particle physics, not only attracts much needed young people to science, it also provides valuable training, says Maurice Jacob. https://cerncourier.com/wp-content/uploads/2003/10/cernvie1_10-03-feature.jpg
Developing countries and CERN https://cerncourier.com/a/developing-countries-and-cern/ https://cerncourier.com/a/developing-countries-and-cern/#respond Mon, 30 Jun 2003 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/developing-countries-and-cern/ John Ellis looks at what CERN can offer scientists - and the wider society - in developing countries.

The post Developing countries and CERN appeared first on CERN Courier.

]]>
cerndev1_7-03

To paraphrase Tolstoy’s introduction to Anna Karenina, every developing country is developing in its own way. It is for each developing country to define its own needs and set its own agenda. So in this context, what is, or what should be, the relationship of CERN to developing countries? In what ways do they already benefit from the work at CERN, and how might they benefit from further collaboration?

CERN’s original raison d’être was to provide a vehicle for European integration and development, whilst also enabling smaller countries to participate in cutting-edge research, and to reduce the brain drain of young European scientists to the United States. Nowadays, CERN is internationally recognized for setting the standard of excellence in a very demanding field, and serves as a beacon of European scientific culture. CERN is open to qualified scientists from anywhere in the world, and beyond its 20 European member states currently has co-operation agreements with 30 countries. Prominent among these – beyond North America and Japan – are Brazil, China, India, Iran, Mexico, Morocco, Pakistan, Russia and South Africa, and more than 1000 people from these countries are listed in the database of scientists as using CERN for their experiments.

Experimental groups from developing nations are not asked to make large cash contributions to the construction of detectors, but rather to produce components. These are valued according to European prices, and if the developing countries can produce them more cheaply using local resources, then more power to their elbows. In Russia's case, European and American funds were important in helping military institutes convert to civilian work.

cerndev4_7-03

In addition to participating in experiments, some of these countries, notably Russia and India, have also contributed to the construction of accelerators at CERN. Russia and India are now making important contributions to the Large Hadron Collider (LHC) that is being constructed at CERN, and Pakistan has also offered to contribute. Again, CERN does not require these countries to pay any money towards the construction or operation of its accelerators. Indeed, CERN pays cash for the accelerator components that Russia and India provide, which these countries use to support their own scientific activities.

What, then, are the main benefits for developing countries in collaborating with CERN? It certainly provides them with a way to participate in research at the cutting edge, just as it always has for physicists from smaller European countries. In general, these users spend limited periods at CERN, preparing experiments, taking data and meeting other scientists. Thanks to the Internet, and to CERN’s World Wide Web in particular, particle physicists were the first to make remote collaboration commonplace, and this habit has spread to many other fields beyond the sciences. It is now relatively easy for scientists working on an experiment at CERN to maintain contact with their colleagues around the world, and they can even contribute to software development, data analysis and hardware construction from their home institute. The Web has enabled Indian experimentalists to access LEP data, and their theoretician colleagues to access the latest scientific papers from around the world, all while sitting at their home desks.

CERN is now also a leading player in European Grid computing initiatives. These will benefit many other scientific fields, for which applications are already being developed. Grid projects involve writing a great deal of software and middleware, which is split up into many individual work packages. CERN is keen to share the burden of preparing the Grid with developing countries. For example, several LHC Grid work packages have been offered to India, and other countries such as Iran and Pakistan have expressed an interest and would be welcome to join. In this way, such countries can become involved in developing the technology themselves, thus avoiding the negative psychological dependency on technological “hand outs” (as in the “cargo cults” in New Guinea after the Second World War).

cerndev5_7-03

The everyday act of collaborating with colleagues in more developed nations exposes physicists from developing countries to the leading global standards in technology, research and education. Collaborating universities and research institutes are therefore provided with applicable standards of comparison and excellence, as well as training opportunities for their young scientists. These may be particularly valuable when educational values are threatened by a combination of increasing demand, insufficient resources and inefficiencies. One country where this is currently a concern is Pakistan. Its president, Pervez Musharraf, has clearly stated his interest in encouraging scientific and technological development in Pakistan, and has exhorted other Islamic countries to do likewise.

How might such “ISO 9000” educational and academic standards be transferred to the wider society? Their value is limited if only a few élite institutions in each country benefit from the international contacts and they are not available throughout the educational system. This is essentially an issue for the internal organization within the country concerned, but CERN is happy to help out. The laboratory has archives of lectures in various formats available through the Web, offering resources for remote learning.

In India, for example, the benefits of collaborating with CERN increase to the extent that physicists from smaller universities outside the main research centres are brought into particle-physics research. In South Africa there are clear priorities in human development. However, a South African experimental group has joined the ALICE collaboration and CERN has welcomed a number of South Africans to its summer student programme, as well as a participant to its high-school teacher programme.

The information technologies that CERN has available should be of benefit to wider groups in developing societies. For example, could video archiving and data-distribution systems be used to disseminate public health information? This exciting idea was proposed to CERN by Rajan Gupta from the Los Alamos National Laboratory, and Manjit Dosanjh of CERN is now developing a pilot project in collaboration with the Ecole Superieure des Beaux Arts de Genève, supported by the foundation “Project HOPE” (see “Project HOPE” box).

cerndevbox_7-03

This project will be demonstrated at the conference on The Role of Science in the Information Society (RSIS) that CERN is organizing in December 2003 as a side event of the World Summit on the Information Society (WSIS) (see “The Role of Science in the Information Society” below). Other sessions at this event will explore the potential of scientific information tools for aiding problems related to health, education, the environment, economic development and enabling technologies.

In 1946 Abdus Salam left his native Pakistan to pursue his scientific dreams in the West – dreams that were more than fulfilled with the award of the Nobel Prize for Physics in 1979. However, his dream of bridging the gap between rich and poor through science and technology remained largely unfulfilled, as Riazuddin has described. If the world can develop its information society properly, a future Salam might not have to leave his – or her – country in order to do research in fundamental physics at the highest level. Moreover, a country’s participation in research at CERN might benefit not only academics and students, but also the wider society at large.

The Role of Science in the Information Society

On 10-12 December 2003, the first phase of the World Summit on the Information Society (WSIS) will take place in Geneva. The aim is to bring together key stakeholders to discuss how best to use new information technologies, such as the Internet, for the benefit of all. The International Telecommunication Union, under the patronage of UN secretary-general Kofi Annan, is organizing WSIS, and the second phase will take place in Tunis in November 2005.

cerndev2_7-03

The “information society” was made possible by scientific advances, and many of its enabling technologies were developed to further scientific research and collaboration. For example, the World Wide Web was invented at CERN to enable scientists from different countries to work together. It has gone on to help break down barriers around the world and democratize the flow of information.

For these reasons, science has a vital role to play at WSIS. Four of the world's leading scientific organizations: CERN, the International Council for Science (ICSU), the Third World Academy of Sciences (TWAS) and UNESCO, have teamed up to organize a major conference on The Role of Science in the Information Society (RSIS), as a side event to WSIS. The conference will take advantage of CERN's location close to Geneva to play a full role at the Summit.

Through an examination of how science provides the basis for today’s information society, and of the continuing role for science, the conference will provide a model for the technological underpinning of the information society of tomorrow. Parallel sessions will examine science’s future contributions to information and communication issues in the areas of education, healthcare, environmental stewardship, economic development and enabling technologies, and the conference’s conclusions will be discussed at the UNESCO round table on science at the Summit itself.

cerndev3_7-03

ICSU, TWAS and UNESCO have a long tradition of scientific, political and cultural collaboration across boundaries. CERN produces knowledge that is freely available for the benefit of science and society as a whole – the World Wide Web was made freely available to the global community and revolutionized the world’s communications landscape. Working together, these organizations are providing a meeting place for scientists of all disciplines, policy makers and stakeholders to share and form their vision of the developing information society.

The RSIS conference will take place on 8-9 December. Its conclusions will feed into the UNESCO round table at WSIS, and it will set goals and deliverables that will be reported on at Tunis in 2005. The scientific community's commitment is long-term.

Participation at the RSIS conference will be by invitation and is limited to around 400. However, anyone who feels they have something to contribute to the debate can do so via a series of on-line forums that are accessible through the conference website. These forums will have the same themes as the parallel sessions at the conference and will be moderated by the session convenors. Their conclusions will provide valuable input to the conference itself, and as an added incentive, CERN is offering up to 10 expenses-paid invitations to the conference for those making the most valuable on-line forum contributions.

The post Developing countries and CERN appeared first on CERN Courier.

]]>
https://cerncourier.com/a/developing-countries-and-cern/feed/ 0 Feature John Ellis looks at what CERN can offer scientists - and the wider society - in developing countries. https://cerncourier.com/wp-content/uploads/2003/06/cerndev1_7-03-feature.jpg
CERN-Asia Fellows and Associates Programme https://cerncourier.com/a/cern-asia-fellows-and-associates-programme/ https://cerncourier.com/a/cern-asia-fellows-and-associates-programme/#respond Mon, 30 Jun 2003 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/cern-asia-fellows-and-associates-programme/ CERN offers three grants every year to East, Southeast and South Asia* postgraduates under the age of 33, enabling them to participate in its scientific programme in the areas of experimental and theoretical physics and accelerator technologies.

The post CERN-Asia Fellows and Associates Programme appeared first on CERN Courier.

]]>
Within the framework of the CERN-Asia Fellows and Associates Programme, CERN offers three grants every year to East, Southeast and South Asia* postgraduates under the age of 33, enabling them to participate in its scientific programme in the areas of experimental and theoretical physics and accelerator technologies. The appointment will be for one year, which might, exceptionally, be extended to two years.

Applications will be considered by the CERN Associates and Fellows Committee at its meeting on 18 November 2003. An application must consist of a completed application form, on which “CERN-Asia Programme” should be written; three separate reference letters; and a curriculum vitae including a list of scientific publications and any other information regarding the quality of the candidate. Applications, references and any other information must be provided in English only.

Application forms can be obtained from: Recruitment Service, CERN, Human Resources Division, 1211 Geneva 23, Switzerland. E-mail: Recruitment.Service@cern.ch, or fax: +41 22 767 2750. Applications should reach the Recruitment Office at CERN by 17 October 2003 at the latest.

The CERN-Asia Fellows and Associates Programme also offers a few short-term Associateship positions to scientists under 40 years of age who are on a leave of absence from their institute. These are open either to scientists who are nationals of the East, Southeast and South Asian* countries who wish to spend a fraction of the year at CERN, or to researchers at CERN who are nationals of a CERN Member State and wish to spend a fraction of the year at a Japanese laboratory.

• Candidates are accepted from: Afghanistan, Bangladesh, Bhutan, Brunei, Cambodia, China, India, Indonesia, Japan, Korea, the Laos Republic, Malaysia, the Maldives, Mongolia, Myanmar, Nepal, Pakistan, the Philippines, Singapore, Sri Lanka, Taiwan, Thailand and Vietnam.

The post CERN-Asia Fellows and Associates Programme appeared first on CERN Courier.

]]>
https://cerncourier.com/a/cern-asia-fellows-and-associates-programme/feed/ 0 Feature CERN offers three grants every year to East, Southeast and South Asia* postgraduates under the age of 33, enabling them to participate in its scientific programme in the areas of experimental and theoretical physics and accelerator technologies.
Kavli Institute inaugurated at SLAC https://cerncourier.com/a/kavli-institute-inaugurated-at-slac/ https://cerncourier.com/a/kavli-institute-inaugurated-at-slac/#respond Sat, 31 May 2003 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/kavli-institute-inaugurated-at-slac/ The new Kavli Institute for Particle Astrophysics and Cosmology has been inaugurated at SLAC.

The post Kavli Institute inaugurated at SLAC appeared first on CERN Courier.

]]>
cernnews9_6-03

The new Kavli Institute for Particle Astrophysics and Cosmology has been inaugurated at SLAC. It is named after physicist and philanthropist Fred Kavli, whose Kavli Foundation pledged $7.5 million to establish the new institute. The institute, which will focus on recent developments in astrophysics, high-energy physics and cosmology, will eventually be located in a new building at SLAC between the research office building and the auditorium, and will open its doors in 2005. At the site of the future institute, Kavli unveiled a 2 m tall, steel and glass sculpture that incorporates a piece of SLAC history in the form of the window from the 1 m (40 inch) bubble chamber.

Roger Blandford, who will become the institute’s director in October, was one of the speakers at the ceremony. He said that initially he intends to follow a roadmap that balances theory, computational astrophysics and phenomenology on one side, and experimental astrophysics and high-energy observing on the other. It will draw upon existing strengths in theoretical physics and astrophysics, gravitational physics and underground physics at Stanford. As Blandford noted, “Part of the excitement of the field is that it is impossible to predict where it will be in five years’ time and what its scientific focus will be”.

The post Kavli Institute inaugurated at SLAC appeared first on CERN Courier.

]]>
https://cerncourier.com/a/kavli-institute-inaugurated-at-slac/feed/ 0 News The new Kavli Institute for Particle Astrophysics and Cosmology has been inaugurated at SLAC. https://cerncourier.com/wp-content/uploads/2003/05/cernnews9_6-03.jpg
Surveying the status of Bulgarian particle physics https://cerncourier.com/a/surveying-the-status-of-bulgarian-particle-physics/ https://cerncourier.com/a/surveying-the-status-of-bulgarian-particle-physics/#respond Sat, 01 Mar 2003 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/surveying-the-status-of-bulgarian-particle-physics/ Bulgaria joined CERN in June 1999 as the laboratory's 20th member state. An ECFA visit took place in September 2002 to find out about particle physics in the country.

The post Surveying the status of Bulgarian particle physics appeared first on CERN Courier.

]]>
cernbul1_3-03

Last September, the European Committee for Future Accelerators (ECFA) visited Bulgaria for the first time, as part of an ECFA mission to survey at first hand the status of particle physics in CERN member states. The visit was to Sofia, beautifully situated in a valley overlooked by Mount Vitosha and the Balkan range. Sofia has a history going back thousands of years, and counts the Thracian Serdi tribe, the Romans and the Byzantines among its previous occupants.

The academician Blagovest Sendov, a renowned mathematician and vice-president of the Bulgarian parliament, welcomed the committee. He recalled his own first contact with CERN; in 1986, as chair of the Bulgarian Science Foundation, he approved a grant of SwFr3 million (€2 million) for the participation of Bulgarian scientists and engineers in the L3 experiment at CERN. Sendov explained his appreciation of CERN’s important role in the development of science and technology in the modern world, particularly in Bulgaria. While praising the laboratory’s remarkable contributions in the domain of information technology, he recalled that the very first electronic digital computer was actually invented by an American of Bulgarian origin. John Vincent Atanasoff, who lived from 1903 to 1995, received a PhD in theoretical physics and went on to collaborate with electrical engineering student Clifford Berry, building what later came to be called ABC (the Atanasoff-Berry computer).

Following the welcoming ceremony, the status of particle physics and closely related areas was presented in a number of talks by Bulgarian scientists. A key player in the scientific research sector is the Bulgarian Academy of Sciences (BAS), which was formally established in 1911, but has its roots in a society founded in 1869. Today it is an autonomous national association, and runs a number of institutes, laboratories and other independent research centres. It funds and carries out research in collaboration with universities (primarily the University of Sofia) as well as independently. Its activities are organized in 11 departments, including physical, chemical, mathematical and engineering sciences.

Jordan Stamenov, director of the Institute for Nuclear Research and Nuclear Energy (INRNE) of the BAS, gave an overview of experimental high-energy physics in Bulgaria. The study of cosmic rays began as early as the 1950s, with nuclear emulsions placed at an observatory on Mussala, the highest peak on the Balkan Peninsula (2925 m above sea level). Later, extended air showers in the energy range 10^13-10^17 eV were studied high in the Tien-Shan Mountains of Kazakhstan. The Mussala and Tien-Shan sites are still used for a variety of cosmic-ray and astroparticle-physics experiments.

Bulgarian particle physicists initially carried out their research primarily using facilities in the former Soviet Union. Bulgaria was one of the founding states of the Joint Institute for Nuclear Research (JINR) in Dubna in 1956, and has been an active partner in many experiments there. From the early 1970s, Bulgarian scientists also began participating in experiments at CERN, mainly through JINR. For example, three Bulgarian physicists and one mathematician took part in the NA4 deep-inelastic muon scattering experiment in the 1980s, as members of the JINR group in the Bologna-CERN-Dubna-Munich-Saclay (BCDMS) collaboration.

Vladimir Genchev described Bulgarian involvement with the CMS experiment in preparation for the Large Hadron Collider (LHC). Bulgarians have been involved with CMS since the beginning, initially concentrating on the software. Bulgarian physicists did the Monte Carlo simulation of the CMS hadron calorimeter, and also took part in the optimization of its design and performance. Later they oversaw the production of the calorimeter’s brass absorber plates by Bulgarian industry. Bulgarians also took on major responsibility for the production, assembly and testing of 125 so-called resistive plate chambers. This is partially funded by the Bulgarian Ministry of Education and Science. Some 27 Bulgarian physicists and engineers have been involved in these efforts. Bulgarians also participate in the ATLAS project, as part of the JINR group.

Leander Litov of the University of Sofia reviewed Bulgarian participation in fixed-target experiments at CERN, such as NA48, NA49 and HARP. The Bulgarian group in HARP, for example, includes 12 people, and there are as many students participating in the experiments. Bulgarians also participate in the COSY experiments at Germany’s Jülich laboratory, where they study collisions between protons and light ions. This work has been partially funded through a bilateral agreement between Germany and Bulgaria.

Fulfilling potential

Bulgaria is a young nation in terms of higher education. The St Kliment Ohridski University of Sofia was founded as a Higher Pedagogical School in 1888. In 1904, by a royal decree from Prince Ferdinand, grandfather of the current prime minister of Bulgaria, the school was transformed into Bulgaria’s first state university. The University of Sofia is a leading institution for the education of young scientists, as well as for fundamental and applied physics research. ECFA delegates were impressed by the high level of scientific education of Bulgarian physicists, and by the quality as well as quantity of work they perform, in spite of a lack of resources. It was felt that there is a great deal of potential in the Bulgarian particle physics community, but not enough resources to realize it.

Matters related to LHC computing and the GRID project, from a Bulgarian point of view, were presented by Vladimir Dimitrov of the Faculty of Mathematics and Informatics at Sofia University. Bulgaria will not build a Tier 1 centre; a possibility that is being discussed is to create two Tier 2 centres for the Balkan countries – one in Greece and one in Bulgaria. One piece of good news is that there will soon be a 6 Mb/s data transfer line to the BAS, with the possibility of an increase to 622 Mb/s at a relatively small cost later on. It is very likely that all Bulgarian universities and research institutes will be optically connected to one another in the near future.

Smooth mutual collaboration between CERN and the Bulgarian Ministry of Education and Science is of vital importance.

In Bulgaria, there are a number of small accelerators for industrial and medical applications. There is substantial know-how in accelerator physics, but funding is meagre. Furthermore, the facilities for medical physics and radiotherapy are inadequate given the health needs of the country. There is a strong wish to construct a neutron therapy facility, but this would mean overcoming obstacles related to the widespread fear of radiation in officialdom.

Matey Mateev, head of the Department of Theoretical Physics at the University of Sofia, reported on the status of theoretical physics in Bulgaria. Historically, almost all of the staff members in the field were trained at the Dubna, Moscow or St Petersburg theory schools. Research in theoretical particle physics is carried out at the University of Sofia, the BAS, the University of Plovdiv and the University of Shumen. The range of topics covered is broad, ranging from mathematical physics (for example conformal field theory) to topics directly applicable to experiments, such as the partonic spin content of the nucleon, or calculation of energy levels of the antiprotonic helium atom (studied experimentally by the ASACUSA collaboration at CERN). The Bulgarian theoretical particle physics community has strong ties with those in several other countries, in particular France, Germany, Italy, the UK and the US, as well as with CERN. Many theorists are grateful to Ivan Todorov for his pioneering leadership in creating a strong school of theoretical physics in Bulgaria.

Joining CERN in 1999 was a milestone for Bulgaria – it was essential not only for the development of high-energy physics in the country, but also for nuclear physics, electronics, informatics and other disciplines of importance for the future of the Bulgarian scientific community. This point was raised many times during the ECFA visit. The student representative, Stefan Piperov, also emphasized how he had been attracted to particle physics not only because of the fundamental nature of the subject, but also because of the opportunity to visit CERN. However, there was also a general feeling of discontent that promises given to Bulgarian physicists by the authorities had not been fulfilled. It was clear that the government faces a difficult economic situation. Nonetheless, it was also obvious that with more support, Bulgarian physicists, engineers and technicians could reach their full potential. To this end, smooth mutual collaboration between CERN and the Bulgarian Ministry of Education and Science is of vital importance. The existing link between advanced technology and particle physics would then stimulate Bulgarian industry and technology, and be a valuable investment in the future economic development of the country.

The post Surveying the status of Bulgarian particle physics appeared first on CERN Courier.

]]>
https://cerncourier.com/a/surveying-the-status-of-bulgarian-particle-physics/feed/ 0 Feature Bulgaria joined CERN in June 1999 as the laboratory's 20th member state. An ECFA visit took place in September 2002 to find out about particle physics in the country. https://cerncourier.com/wp-content/uploads/2003/03/cernbul1_3-03.jpg
Amsterdam hosts ICHEP conference https://cerncourier.com/a/amsterdam-hosts-ichep-conference/ https://cerncourier.com/a/amsterdam-hosts-ichep-conference/#respond Mon, 30 Sep 2002 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/amsterdam-hosts-ichep-conference/ "Brilliant work by many people has resulted in an extraordinarily profound, precise description of the physical world," concluded Frank Wilczek of MIT, in summarizing the 31st International Conference on High-Energy Physics in Amsterdam.

The post Amsterdam hosts ICHEP conference appeared first on CERN Courier.

]]>
“Brilliant work by many people has resulted in an extraordinarily profound, precise description of the physical world,” concluded Frank Wilczek of MIT, in summarizing the 31st International Conference on High-Energy Physics in Amsterdam. “Because of this we can ask, and formulate plans to answer, some truly awesome questions.” Examples of new high-precision results presented at the conference included the measurements of the mass and width of the W boson at LEP and the Tevatron; the strong coupling constant at HERA and LEP; and CP violation in B mesons from the BaBar and Belle experiments, which Yossi Nir from the Weizmann Institute described in terms of the first successful precision test of the Kobayashi-Maskawa mechanism of CP violation. A full report of the conference will appear in next month’s issue of CERN Courier.

The post Amsterdam hosts ICHEP conference appeared first on CERN Courier.

]]>
https://cerncourier.com/a/amsterdam-hosts-ichep-conference/feed/ 0 News "Brilliant work by many people has resulted in an extraordinarily profound, precise description of the physical world," concluded Frank Wilczek of MIT, in summarizing the 31st International Conference on High-Energy Physics in Amsterdam.
Quarks and Kiwis interact in New Zealand https://cerncourier.com/a/quarks-and-kiwis-interact-in-new-zealand/ https://cerncourier.com/a/quarks-and-kiwis-interact-in-new-zealand/#respond Thu, 15 Aug 2002 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/quarks-and-kiwis-interact-in-new-zealand/ New Zealand's well known natural beauty goes hand-in-hand with geographical isolation. But now a new research initiative is set to strengthen the bonds between the land of Ernest Rutherford and the international particle physics community.

The post Quarks and Kiwis interact in New Zealand appeared first on CERN Courier.

]]>
cernkiwi1_9-02

In June, two New Zealand universities formally applied to join the CMS experiment at CERN’s Large Hadron Collider (LHC). This marked the launch of an initiative to establish a New Zealand high-energy particle physics and instrumentation programme called NZ_CMS. The basis of this programme is the formation of an experimental particle physics and instrumentation research group within New Zealand that not only contributes directly to the CMS experimental programme, but does so in a way that also optimizes the benefits for New Zealand, its industry and its young researchers. The application was made on behalf of six staff from the universities of Auckland and Canterbury, and also includes several graduate students. In addition, NZ_CMS is receiving support from staff and university groups within the two universities in the fields of electrical engineering, computer science, medical imaging, nanotechnology and optics.

The CMS pixel system was identified as the area where NZ_CMS should contribute, as it provides the best match in terms of personnel, resources, and the focus on instrumentation development sought by Auckland, Canterbury, and the New Zealand government. Over the last year members of NZ_CMS have been working within the CMS pixel community, ensuring a smooth integration into CMS as well as establishing the connections and the technology transfer necessary for the continued development of the programme.

Pixel systems

NZ_CMS has benefited greatly from input from New Zealand, the CMS management and the CMS pixel group at the Paul Scherrer Institut (PSI) in Villigen, Switzerland. Realistic goals have been outlined for long-term benefits and contributions to CMS, whilst enabling NZ_CMS to establish itself within the New Zealand academic climate, as well as allowing the shorter-term goals attractive to funding agencies to be achieved. Roland Horisberger, the CMS pixel detector project leader, and the PSI group have strongly supported NZ_CMS, facilitating pixel technology transfer to New Zealand and helping to define the scope of the NZ_CMS deliverables. At present, members of NZ_CMS are working within the PSI pixel group on the pixel control systems and services. As the NZ_CMS collaboration develops, it is expected that the New Zealand-PSI connection will be strengthened, and a training-exchange programme for students, engineers and researchers will be established.

cernkiwi3_9-02

As a sign of the enthusiasm and support for NZ_CMS, the development from initial idea to application for CMS membership has taken only a year and a half, and it has already secured preliminary funding for pixel instrumentation research. The initiative was first presented at the New Zealand Institute of Physics conference in July last year, which was followed by a visit from a CMS-CERN delegation to New Zealand in January.

The week-long itinerary of the delegation, led by John Ellis (representing CERN) and Diether Blechschmidt (representing CMS), included formal meetings with the minister of research, science and technology, the Royal Society of New Zealand and the Universities of Auckland and Canterbury. The delegates also visited Industrial Research Ltd (a Crown Research Institute of some 400 staff) and participated in the 18th International Workshop on Weak Interactions and Neutrinos (WIN 2002) held at Canterbury. There was also time for a public lecture by Ellis entitled "From Rutherford to Higgs" in which he described particle interactions using vocabulary from the sport of rugby.

Following the delegation’s visit, Steve Thompson, chief executive officer of the Royal Society of New Zealand, made an official information visit to CERN. Soon afterwards initial funding was obtained from the New Zealand government and it was decided to proceed with the NZ_CMS application to join CMS. It is now hoped that concurrent with the NZ_CMS application New Zealand and CERN can negotiate and sign an agreement on co-operation. This would facilitate the development of the country’s participation in the LHC.

Current programmes

In an effort to build on its strengths and resources, NZ_CMS is endeavouring to work in conjunction with the country’s existing particle physics programmes. Current areas of research in New Zealand include heavy-ion physics at Auckland, ultra-high-energy neutrino physics at Canterbury and theoretical physics at Massey University.

cernkiwi2_9-02

In addition to the University of Auckland’s taking a leading role in the NZ_CMS pixel programme with the establishment of a pixel laboratory at its Tamaki campus, Auckland’s David Krofcheck is augmenting NZ_CMS’s contribution to CMS through reaction-plane studies for the CMS Heavy Ions programme. This follows on from work done with gold-gold collisions at the E895 experiment at Brookhaven in the US. E895 used the Alternating Gradient Synchrotron (AGS) to deliver gold beams of 2, 4, 6 and 8 GeV per nucleon to measure the excitation functions of collective nuclear matter “flow”. The NZ_CMS Heavy Ions group currently consists of three researchers and is based entirely at the University of Auckland.

As far as Canterbury is concerned, its particle physics group is participating in the Radio Ice Cherenkov Experiment (RICE) in Antarctica. RICE is a neutrino telescope at the south pole that uses radio antennas to detect the coherent emission of radio-wavelength Cherenkov radiation from the electromagnetic shower of particles produced when an ultra-high-energy electron neutrino interacts in the ice. The Canterbury group is involved in the Monte Carlo simulation of the shower and the subsequent detection of the Cherenkov pulse, and in the investigation of other possible physics sources, such as transition radiation from air showers. The absence of any neutrino events in the data analysed to date implies upper limits on the neutrino flux comparable to those from the air-shower experiments AGASA and Fly's Eye, over the neutrino energy range of around 10^7-10^12 GeV.

With research into medical imaging instrumentation, digital-signal processing and nanotechnology, Canterbury is also looking to establish instrumentation applications associated with pixel systems and pixel data visualization. These would tie in with its new HITLab NZ, the annex of the Human Interface Technology Laboratory (HITLab) at the University of Washington in Seattle. The HITLab consortium is a world leader in virtual-reality technology such as remote surgery and virtual retinal display, which scans images directly into the retina of the eye.

An additional aspect to be developed is online and offline computing, with a contribution from New Zealand now being possible thanks to the installation of the high-bandwidth transpacific Southern Cross Cable, which started operation in late 2000. The cable removes the bandwidth bottleneck between Australasia and the United States, and delivers 120 Gbit/s of fully protected capacity (the equivalent of eight full-length motion pictures every second). An upgrade in early 2003 will double capacity to 240 Gbit/s. At present, the currently available bandwidth to the US from within the universities is around 100 Mbit/s. This removal of bandwidth constraints, coupled with a developing interest in GRID research within New Zealand’s IT community, has prompted discussion of possible contributions to online and offline computing within the context of NZ_CMS.
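As a back-of-the-envelope check of the figure quoted above, the sketch below assumes roughly 2 GB per full-length film (a number not given in the article) and shows that the cable's capacity does indeed work out at about eight films per second:

```python
# Rough check of the Southern Cross Cable figures quoted in the text.
# The ~2 GB per full-length film is an assumption for illustration only.

capacity_gbit_per_s = 120            # fully protected capacity in Gbit/s (240 after the 2003 upgrade)
film_size_gbyte = 2.0                # assumed size of one full-length motion picture, in GB
film_size_gbit = film_size_gbyte * 8 # convert gigabytes to gigabits

films_per_second = capacity_gbit_per_s / film_size_gbit
print(f"about {films_per_second:.1f} films per second at 120 Gbit/s")  # ~7.5, i.e. roughly eight
```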

The third component of New Zealand’s existing particle physics programme is the theoretical physics group at Massey University, which focuses on nucleon-structure functions and deep inelastic scattering calculations. Their interest in NZ_CMS is ongoing, as experiments at the LHC are an excellent opportunity for studying quark and gluon distribution functions. Detailed knowledge of these distribution functions is needed for much of the physics that will be performed at the LHC, and the NZ_CMS programme will enable the Massey group to participate directly in a facility that should contribute significantly to this area of research.

Finally, the NZ_CMS initiative should be seen as part of the resurgence in New Zealand particle physics that looks to work in close collaboration with both the country’s established research groups and our international collaborators (PSI and CMS/CERN). This is a significant step towards New Zealand’s participation in the truly global “big science” projects associated with modern high-energy particle physics laboratories, and is based on access to research at the frontier of particle physics.

The NZ_CMS initiative is also a way to combat New Zealand’s perceived geographical isolation and the continued “brain drain” of young researchers who venture overseas for graduate and career opportunities – often never to return.

This brain drain is, of course, not a new phenomenon. One of the best documented cases is a young Kiwi (New Zealander) called Ernest Rutherford, who left the country in 1895 to work with J J Thomson at Cambridge University's Cavendish Laboratory. NZ_CMS intends to reunite quarks and Kiwis in New Zealand!

The post Quarks and Kiwis interact in New Zealand appeared first on CERN Courier.

]]>
https://cerncourier.com/a/quarks-and-kiwis-interact-in-new-zealand/feed/ 0 Feature New Zealand's well known natural beauty goes hand-in-hand with geographical isolation. But now a new research initiative is set to strengthen the bonds between the land of Ernest Rutherford and the international particle physics community. https://cerncourier.com/wp-content/uploads/2002/08/cernkiwi1_9-02-feature.jpg
Survey helps US with long-range planning https://cerncourier.com/a/survey-helps-us-with-long-range-planning/ https://cerncourier.com/a/survey-helps-us-with-long-range-planning/#respond Fri, 25 Jan 2002 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/survey-helps-us-with-long-range-planning/ A group of young US physicists recently conducted a survey to find out where high-energy physics is heading. The results reveal that growing internationalization makes physicists at isolated centres aim for hands-on access to research elsewhere. The survey also showed a need for more outreach to stimulate and maintain public interest, together with a lamentable lack of job opportunities.

The post Survey helps US with long-range planning appeared first on CERN Courier.

]]>
Last year the US particle physics community mounted one of its periodic long-range planning exercises to provide a roadmap for the subject over the coming decade. The recommendations from this study, which was conducted by the High-Energy Physics Advisory Panel (HEPAP) of the US Department of Energy and the National Science Foundation, have been published.

cernsur1_1-02

As one conduit to this review, a Young Physicists Panel (YPP) was established by Bonnie Fleming of Columbia, John Krane of Iowa State and Sam Zeller of Northwestern to provide a platform and consensus view for younger researchers. Originally the YPP hoped to provide a brief summary document. However, the survey revealed instead that most respondents did not have a single opinion to convey, making the conclusions more difficult to digest, but at the same time probably more valuable.

A diverse range of questions

To be most useful to HEPAP, the survey, entitled “The future of high-energy physics”, focused on US aspects and needs. Although initially designed for and oriented towards “young” physicists (defined as those yet to achieve a permanent position or tenure), the study was extended to all particle physicists, both inside and outside the US. There were some 1500 replies, most of which were received via the Web.

The survey began with a request for demographic and personal information – current career status, geographical origin, place of work, type of physics done and size of collaboration. The most common profiles to emerge were a North American working in North America, or a European working in Europe, on collider physics in a team of 500-1000 people. No surprises there.

The next series of questions was aimed at "balance versus focus", asking what sort of research should be carried out at the next major physics machine to be built, how many detectors it should have and what sort of physicists it should employ.

In this section the emerging picture was of a machine supporting a diverse range of physics, with at least two detectors, employing comparable numbers of theorists, phenomenologists and experimentalists. An extra question showed that it is considered important for high-energy physics laboratories to host an astrophysics effort. Some do, some are in the process of doing so, and other laboratories have yet to satisfy this demand.

A section covering "globalization" lumped together everything to do with big science being concentrated at major centres. Most respondents reported seeing their detector at least weekly, which suggests that they have easy access and would rather be near their experiment than near their supervisor. Hands-on hardware work was seen as very important and, if a research centre were situated outside the US, national or regional access via a staging post was considered the best possible alternative to being centre stage.

Answers to specific questions on outreach underlined that particle physics is not doing nearly enough to communicate with either the relevant funding agencies or the general public. Half of the replies indicated that physicists were ready to dedicate more time to this important activity. (It is our opinion at CERN Courier that while this is very commendable, it is one thing to tick a box, but quite another to deliver. Unfortunately, few physicists have the imagination and commitment to contribute significantly to outreach activities.)

Scientific interest

The most revealing part of the survey was perhaps the one that asked participants why they had been attracted to particle physics originally and why they had remained in the field. The answers reveal that intrinsic scientific interest dominated the decision, whether for newcomers or for those further down the line. Science is clearly interesting, at least to some people, and here is a possible message for outreach.

Also very prominent was the opinion that a lack of jobs could drive young people out of the field. Another visible signal indicated that not enough talented physicists are retained. The big question that faces the field now is how to rectify this.

The remaining sections of the YPP survey focused on physics issues, and these were largely mirrored in the HEPAP recommendations. However, it was clear that opinions about the siting of future machines were polarized according to geographical base (North America or Europe). While most tenured US scientists think that it is important to have a major new facility in the US, this view is not mirrored among younger scientists.

With big issues at stake, casting a wider survey net could reveal a more global consensus within the physics community on the way forward.

The post Survey helps US with long-range planning appeared first on CERN Courier.

]]>
https://cerncourier.com/a/survey-helps-us-with-long-range-planning/feed/ 0 Feature A group of young US physicists recently conducted a survey to find out where high-energy physics is heading. The results reveal that growing internationalization makes physicists at isolated centres aim for hands-on access to research elsewhere. The survey also showed a need for more outreach to stimulate and maintain public interest, together with a lamentable lack of job opportunities. https://cerncourier.com/wp-content/uploads/2002/01/cernsur1_1-02.jpg
Database lists the top-cited physics papers https://cerncourier.com/a/database-lists-the-top-cited-physics-papers/ https://cerncourier.com/a/database-lists-the-top-cited-physics-papers/#respond Tue, 04 Dec 2001 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/database-lists-the-top-cited-physics-papers/ Citation tracking can point to the most influential trends in research. Heath O'Connell and Michael Peskin analyse the chart for the year 2000 and report the hottest topics in high-energy physics.

The post Database lists the top-cited physics papers appeared first on CERN Courier.

]]>
The SPIRES-HEP database maintained by the library at the Stanford Linear Accelerator Center (SLAC) connects preprint or eprint versions to articles published in journals or conference proceedings, providing access to all phases of the publication history. The database lists virtually every high-energy physics paper published or even preprinted over the past 30 years.

In addition, most papers now have backward links to the papers that they cite and forward links to the papers citing them. These citation links provide a very effective means of searching the literature. In the past few years SPIRES-HEP has been automatically harvesting reference citations from eprints, creating a web of links that thoroughly indexes the literature.

As a by-product of this citation linkage, SPIRES-HEP can easily search out the papers most cited by publications in high-energy physics. The list of papers with the most citations in a given year provides a snapshot of the hottest topics that have engaged the attention of theorists and experimenters. For the past few years, SPIRES-HEP has posted a scientific review of the year's top-cited papers.
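In essence, each yearly "top-cited" list is a count of forward links, grouped by the cited paper and restricted to citations made during the year in question; the all-time list is the same count without the year restriction. The following is a minimal sketch of such a tally in Python, not the actual SPIRES-HEP implementation – the record format and the paper identifiers are hypothetical placeholders.

from collections import defaultdict

# Hypothetical citation records: (citing paper, cited paper, year of the citing paper).
# In SPIRES-HEP such links are harvested automatically from eprint reference lists.
citations = [
    ("hep-th/0001001", "hep-th/9711200", 2000),
    ("hep-ph/0002002", "hep-ex/9807003", 2000),
    ("hep-th/9901001", "hep-th/9711200", 1999),
]

def top_cited(links, year=None, limit=40):
    """Count the forward links received by each paper, optionally within a single year."""
    counts = defaultdict(int)
    for citing, cited, cite_year in links:
        if year is None or cite_year == year:
            counts[cited] += 1
    # Sort by citation count, highest first, and keep only the top entries.
    return sorted(counts.items(), key=lambda item: item[1], reverse=True)[:limit]

# The 2000 snapshot; calling top_cited(citations) with no year gives the all-time list.
for paper, n in top_cited(citations, year=2000):
    print(f"{paper}: {n} citations in 2000")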

We have recently posted the “top-cited” lists for 2000. These materials include a list of the papers with more than 100 citations in the past year and a list of the papers with more than 1000 citations over the history of the SPIRES-HEP database.

So what are, by this measure, the hottest topics of 2000? Table 1 lists the top 10 cited papers and the number of citations of those papers in 2000. These papers represent major areas of activity that are discussed further in the review posted at the SPIRES Web site. The top-cited reference in high-energy physics is always the Review of Particle Properties. Below this in the list, the following areas are represented. (Papers appearing in the “top 10 cited list” are referred to by a number that indicates their position on the list.)

Maldacena’s duality

A broad swath of developments in string theory and related areas of mathematical physics has resulted from Maldacena's 1997 paper (2), which proposed a relation between supergravity and superstring theories in (d+1)-dimensional anti-de Sitter space and supersymmetric Yang-Mills theories in d dimensions.

Anti-de Sitter space, the homogeneous space of constant negative curvature, has a boundary in the sense that light signals propagate to space-like infinity in finite time. Maldacena proposed that, for a gravity theory living in the interior of the space, there would be a corresponding, and equivalent, scale-invariant quantum field theory living on the boundary. Subsequently, Witten (7), and Gubser, Klebanov and Polyakov (9), gave a precise relation between correlation functions in the boundary theory and S-matrix elements for the gravity theory in the interior.
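Schematically, the prescription of papers 7 and 9 can be written in its now-standard form (quoted here in the usual textbook convention rather than verbatim from the papers) as

\[
\Big\langle \exp \int d^d x\, \phi_0(x)\, \mathcal{O}(x) \Big\rangle_{\mathrm{CFT}}
= Z_{\mathrm{grav}}\big[\, \phi\,\big|_{\partial \mathrm{AdS}} = \phi_0 \,\big],
\]

where each bulk field \(\phi\) with boundary value \(\phi_0\) acts as a source for a corresponding operator \(\mathcal{O}\) of the boundary field theory.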

These developments have led to many insights, illuminating both the properties of strongly coupled Yang-Mills theory and quantum gravity theories. It is remarkable that Maldacena’s paper has managed, in just three years, to accumulate more than 1600 citations and to vault to position 25 on the all-time citation list.

Extra space dimensions

Though string theory predicts the existence of seven extra space dimensions, these have conventionally been considered to be unobservably small and irrelevant to ordinary particle physics. However, the next three papers on the "top-cited" list involve theoretical models in which extra space dimensions play a direct role in particle physics and, in particular, explain the mass scale of the Higgs boson. Randall and Sundrum (3, 4) have proposed two different scenarios in which our four-dimensional universe is a flat three-dimensional surface (a "brane") embedded in five-dimensional anti-de Sitter space.

Arkani-Hamed, Dimopoulos and Dvali (5) have proposed a scenario in which our universe is a surface in a large, flat space-time, the size of which may approach the millimetre scale. Further consequences of this model are developed in paper 10. Both models (4 and 5) will face crucial tests at CERN's LHC collider, which may give direct experimental evidence for the presence of new space dimensions (see "Discovering new dimensions at LHC").
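The millimetre scale quoted above follows from the way gravity is diluted in the extra dimensions. In the scenario of paper 5, with n extra dimensions compactified at radius R and a fundamental gravity scale M* near a TeV, the observed Planck scale is related to the fundamental scale roughly (in the standard form, stated here for orientation rather than taken verbatim from the paper) by

\[
M_{\mathrm{Pl}}^2 \sim M_*^{\,n+2}\, R^{\,n},
\]

so that for n = 2 the compactification radius comes out close to the millimetre scale probed by short-distance gravity experiments.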

Non-commutative field theory

Many ideas about quantum gravity lead to the idea that space-time co-ordinates are non-commuting operators. Non-commutative Yang-Mills theory, which was invented by Connes, gives a simple field theory model in which consequences of the possible non-commutativity of space can be studied. Paper 6, by Seiberg and Witten, explained the connection between Connes’ model and various compactifications of string theory, launching an intense investigation into non-commutative dynamics.
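The non-commutativity in question is usually parametrized by a constant antisymmetric tensor, so that the space-time coordinates obey (the standard defining relation, added here only as a reminder)

\[
[\, x^\mu, x^\nu \,] = i\, \theta^{\mu\nu},
\]

with ordinary commuting coordinates recovered in the limit \(\theta^{\mu\nu} \to 0\).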

Neutrino physics

In experimental particle physics the most surprising development of the past few years has been the discovery by the Super-Kamiokande collaboration of atmospheric neutrino oscillations (8). This experimental result indicates the presence of neutrino mass and large mixing among the lepton generations. It has led to many speculations on the origin of flavour mixing and to a new, intense level of experimentation on neutrino properties.
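For orientation, the atmospheric data are usually interpreted in a two-flavour picture, in which the muon-neutrino deficit is governed by (the standard oscillation formula, not specific to the paper cited above)

\[
P(\nu_\mu \to \nu_\tau) = \sin^2 2\theta \, \sin^2\!\left( \frac{1.27\, \Delta m^2\, L}{E} \right),
\]

with \(\Delta m^2\) in eV\(^2\), the flight length L in km and the neutrino energy E in GeV; a non-zero effect requires both a neutrino mass-squared difference and a non-vanishing mixing angle.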

The complete list of the top 40 cited papers of 2000 and a more detailed scientific review can be found at the SLAC Web site. The site also includes a “top-cited” list for each eprint archive relevant to high-energy physics. In Table 2 the top-cited paper (exclusive of the Particle Data Group’s Review of Particle Properties) in each archive is shown.

We make no claim that the papers that we have listed here are currently the most important papers in high energy physics. Year-by-year accounting is influenced as much by fashion as by logical scientific development. Both the standard electroweak model and string theory spent many years in the cellar of the citation counts before rising to their current prominence. If you favour a trend, a model or an experiment that is not listed here, more power to you. We hope that your insights will be well represented on our lists before the end of the decade.

Table 1: Top-cited articles of 2000

1. Particle Data Group, 1998 Review of particle physics, Eur. Phys. J. C3 1-794 – 1236 citations
2. Juan Maldacena, 1998 The large N limit of superconformal field theories and supergravity, Adv. Theor. Math. Phys. 2 231-252 (hep-th/9711200) – 498 citations
3. Lisa Randall and Raman Sundrum, 1999 An alternative to compactification, Phys. Rev. Lett. 83 4690-4693 (hep-th/9906064) – 446 citations
4. Lisa Randall and Raman Sundrum, 1999 A large mass hierarchy from a small extra dimension, Phys. Rev. Lett. 83 3370-3373 (hep-ph/9905221) – 414 citations
5. Nima Arkani-Hamed, Savas Dimopoulos and Gia Dvali, 1998 The hierarchy problem and new dimensions at a millimeter, Phys. Lett. B429 263-272 (hep-ph/9803315) – 403 citations
6. Nathan Seiberg and Edward Witten, 1999 String theory and noncommutative geometry (hep-th/9908142) – 397 citations
7. Edward Witten, 1998 Anti-de Sitter space and holography, Adv. Theor. Math. Phys. 2 253-291 (hep-th/9802150) – 347 citations
8. Y Fukuda et al., 1998 Evidence for oscillation of atmospheric neutrinos, Phys. Rev. Lett. 81 1562-1567 (hep-ex/9807003) – 325 citations
9. S S Gubser et al., 1998 Gauge theory correlators from noncritical string theory, Phys. Lett. B428 105-114 (hep-th/9802109) – 316 citations
10. Ignatios Antoniadis et al., 1998 New dimensions at a millimeter to a Fermi and superstrings at a TeV, Phys. Lett. B436 257-263 (hep-ph/9804398) – 301 citations

Table 2: Top citations within each eprint archive

GR-QC: S W Hawking, 1975 Particle creation by black holes, Commun. Math. Phys. 43 199-220 – 61 citations
HEP-EX: Torbjorn Sjostrand, 1994 High-energy physics event generation with PYTHIA 5.7 and JETSET 7.4, Comput. Phys. Commun. 82 74-90 – 94 citations
HEP-LAT: Herbert Neuberger, 1998 Exactly massless quarks on the lattice, Phys. Lett. B417 141-144 (hep-lat/9707022) – 68 citations
HEP-PH: Y Fukuda et al., 1998 Evidence for oscillation of atmospheric neutrinos, Phys. Rev. Lett. 81 1562-1567 (hep-ex/9807003) – 265 citations
HEP-TH: Juan Maldacena, 1998 The large N limit of superconformal field theories and supergravity, Adv. Theor. Math. Phys. 2 231-252 (hep-th/9711200) – 465 citations
NUCL-EX: J P Bondorf et al., 1995 Statistical multifragmentation of nuclei, Phys. Rept. 257 133-221 – 16 citations
NUCL-TH: R Wiringa, V Stoks and R Schiavilla, 1995 An accurate nucleon-nucleon potential with charge independence breaking, Phys. Rev. C51 38-51 (nucl-th/9408016) – 53 citations

The post Database lists the top-cited physics papers appeared first on CERN Courier.

]]>
https://cerncourier.com/a/database-lists-the-top-cited-physics-papers/feed/ 0 Feature Citation tracking can point to the most influential trends in research. Heath O'Connell and Michael Peskin analyse the chart for the year 2000 and report the hottest topics in high-energy physics. https://cerncourier.com/wp-content/uploads/2001/12/cernpapers1_12-01-feature.jpg
History centre publishes archiving guidelines https://cerncourier.com/a/history-centre-publishes-archiving-guidelines/ https://cerncourier.com/a/history-centre-publishes-archiving-guidelines/#respond Wed, 31 Oct 2001 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/history-centre-publishes-archiving-guidelines/ According to a recently released report by the American Institute of Physics Center for the History of Physics, the documentation of collaborative scientific research needs urgent attention.

The post History centre publishes archiving guidelines appeared first on CERN Courier.

]]>
According to a recently released report by the American Institute of Physics (AIP) Center for the History of Physics, the documentation of collaborative scientific research needs urgent attention. The problems that need to be addressed range from the way in which the contributions of distinguished individuals (or records of a project conducted by one institution) are preserved, to the fact that, almost without exception, research institutions and federal science agencies fail to provide adequate support to programmes to save records of significant research.

To remedy this, the AIP History Center has issued Documenting Multi-Institutional Collaborations – the final report of its decade-long study of multi-institutional collaborations in physics and allied fields.

The main recommendations of the report are that:

* scientists and others should take special care to identify past collaborations that have made significant contributions;

* research laboratories and other centres should set up a mechanism to secure records of future significant experiments;

* institutional archives should share information.

The report makes a broad distinction between “core records” – to be saved for all collaborations – and other records to be saved for “significant collaborations”.

The post History centre publishes archiving guidelines appeared first on CERN Courier.

]]>
https://cerncourier.com/a/history-centre-publishes-archiving-guidelines/feed/ 0 News According to a recently released report by the American Institute of Physics Center for the History of Physics, the documentation of collaborative scientific research needs urgent attention.
Report urges scientists to secure their records https://cerncourier.com/a/report-urges-scientists-to-secure-their-records/ https://cerncourier.com/a/report-urges-scientists-to-secure-their-records/#respond Wed, 31 Oct 2001 00:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/report-urges-scientists-to-secure-their-records/ A new report underlines the importance of keeping scientific records to safeguard our scientific heritage. Once an experiment is complete, its responsibilities are not over.

The post Report urges scientists to secure their records appeared first on CERN Courier.

]]>
According to a recently released report by the American Institute of Physics (AIP) Center for the History of Physics, there are many problems facing the documentation of collaborative research. These range from the way in which the contributions of distinguished individuals (or records of a project conducted by one institution) are preserved, to the fact that, almost without exception, research institutions and federal science agencies fail to provide adequate support to programmes to save records of significant research.

To help to find solutions, the AIP History Center has issued Documenting Multi-Institutional Collaborations – the final report of its decade-long study of multi-institutional collaborations in physics and allied fields.

The main recommendations of the report are that:

* scientists and others should take special care to identify past collaborations that have made significant contributions;

* research laboratories and other centres should set up a mechanism to secure records of future significant experiments;

* institutional archives should share information.

The long-term study focused on high-energy physics, space science, geophysics, ground-based astronomy, materials science, medical physics, nuclear physics and an area called computer-mediated collaborations. The main goal of the project was to learn enough about these transient communities to be able to advise on how to document them.

The study was built on interviews with more than 600 scientific collaborators; numerous site visits to archives, records offices and US federal agencies; and advice from working groups of distinguished scientists, archivists, records officers, historians and sociologists. The study group gathered and analysed data on characteristics of collaborations, such as their formation, decision-making structures, communication patterns, activities and funding.

According to the report, scientists in multi-institutional collaborations are well aware that their way of doing research is unlike that of others working alone or in small groups. All too often, however, scientists fail to realize how prone to destruction the records needed to document their research are. It may appear to them that their recollections and those of their colleagues are sufficient. From the standpoint of present needs this is unfortunate; from the standpoint of the future it is disastrous, for even those imperfect recollections will die with the scientists, and later generations will never know how some of today's important scientific work was done. For particle physics, the report has some specific suggestions.

Core records

The report makes a broad distinction between “core records” — those records to be saved for all collaborations — and records to be saved for “significant collaborations”. The definitions of the former are slanted towards traditional US procedures with Department of Energy or National Science Foundation funding for experiments carried out at major US laboratories. However, these can be paraphrased unambiguously for a more global audience without too much trouble.

The additional records for significant collaborations include correspondence between the experiment spokesperson, the experiment collaboration and laboratory administrators. Records of intracollaboration meetings, collaboration groups and interinstitutional committees, together with project-management and engineering documents, are also deemed important under this heading.

The post Report urges scientists to secure their records appeared first on CERN Courier.

]]>
https://cerncourier.com/a/report-urges-scientists-to-secure-their-records/feed/ 0 Feature A new report underlines the importance of keeping scientific records to safeguard our scientific heritage. Once an experiment is complete, its responsibilities are not over. https://cerncourier.com/wp-content/uploads/2001/10/cernarch1_9-01.jpg
Language was no barrier at Budapest conference https://cerncourier.com/a/language-was-no-barrier-at-budapest-conference/ https://cerncourier.com/a/language-was-no-barrier-at-budapest-conference/#respond Mon, 01 Oct 2001 22:00:00 +0000 https://preview-courier.web.cern.ch:8888/Cern-mock-may/language-was-no-barrier-at-budapest-conference/ This year's venue for the European Physical Society's biennial Europhysics Conference on High-Energy Physics was the new campus of Eotvos University in Budapest, Hungary.

The post Language was no barrier at Budapest conference appeared first on CERN Courier.

]]>
This year’s venue for the European Physical Society’s biennial Europhysics Conference on High-Energy Physics was the new campus of Eotvos University in Budapest, Hungary. Nearly 600 registered participants from 41 countries, along with more than 100 registered “accompanying persons”, attended the scientific and social events.

As well as the traditional parallel and plenary sessions with all of the physics developments (most of which have already been reported in CERN Courier), the meeting included several innovations. One was an open session of the European Committee of Future Accelerators (ECFA) in which ECFA chairman Lorenzo Foa presented the draft of an ECFA report on the future of European accelerator-based particle physics. Another innovation came when many outsiders were attracted to talks by two leading Hungarian high-energy physicists. To avoid language difficulties, the talks were presented in parallel, one in Hungarian, and the other in English. Julius Kuti (UC San Diego and an external member of the Hungarian Academy of Sciences) spoke on the cosmic significance of particle physics and Teraflop computing. Alex Szalay (Johns Hopkins and a member of the Hungarian Academy of Sciences) gave a talk on megamaps of the universe.

The next conference in the series will be held in Aachen, Germany, in 2003.

The proceedings of the Budapest conference will be published exclusively in electronic form by JHEP.

The post Language was no barrier at Budapest conference appeared first on CERN Courier.

]]>
https://cerncourier.com/a/language-was-no-barrier-at-budapest-conference/feed/ 0 News This year's venue for the European Physical Society's biennial Europhysics Conference on High-Energy Physics was the new campus of Eotvos University in Budapest, Hungary. https://cerncourier.com/wp-content/uploads/2001/10/cernnews5-10-01.jpg